How to Create an AI Chat with Gemini in Construct 3

License

This tutorial is licensed under CC BY 4.0. Please refer to the license text if you wish to reuse, share or remix the content contained within this tutorial.

Published on 26 Aug, 2025.

Step 5: Connecting the Function to the UI

Now that our ask function is written, let's wire it up to the Construct 3 interface using event sheet scripting. We'll trigger this function when the user presses the 'Enter' key in the text input field.

// TypeScript code in a Construct 3 event sheet script block,
// run from an event that triggers when the player presses Enter in the text input.
const key = runtime.globalVars.API_KEY;
// Read the player's question from the Question text input, if the instance exists.
const question = runtime.objects.Question?.getFirstInstance()?.text ?? "";
// Only call the API when the player actually typed something.
const answer = question.trim().length > 0 ? await Gemini.ask({ key, question, runtime }) : "Missing Question!";
// Display the reply in the Answer text object.
const textAnswer = runtime.objects.Answer.getFirstInstance();
if (textAnswer) { textAnswer.text = answer; }
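
The call above uses the ask function written in the earlier steps of this tutorial. If you are adapting this snippet on its own, a minimal stand-in might look like the sketch below. The endpoint, model name and response parsing follow the public Gemini generateContent REST API rather than this tutorial's exact implementation, so treat them as assumptions; the runtime parameter is accepted but ignored here.

// Hypothetical minimal stand-in for the ask() helper from the earlier steps.
const Gemini = {
    async ask({ key, question }: { key: string; question: string; runtime?: unknown }): Promise<string> {
        // Call the Gemini generateContent REST endpoint directly with fetch.
        const url = `https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${key}`;
        const response = await fetch(url, {
            method: "POST",
            headers: { "Content-Type": "application/json" },
            body: JSON.stringify({ contents: [{ parts: [{ text: question }] }] })
        });
        if (!response.ok) return "Request failed: " + response.status;
        const data = await response.json();
        // The reply text lives in the first candidate's first content part.
        return data?.candidates?.[0]?.content?.parts?.[0]?.text ?? "No answer received.";
    }
};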
Comments

  • Good tutorial. I'm not sure how memory is normally achieved, but sending the entire chat back to the AI with each prompt does not sound good. It will use an exponentially larger amount of tokens. Not sure how it will affect the free tier, but it will cost you a fortune if you have to pay.

    • Hi, thank you for the comment!

      You're right, sending the full history increases token usage. This method is necessary because models like Gemini are stateless and need that history for context. For a real application, it's vital to manage this to control costs. Common solutions include using a "sliding window" of recent messages or summarizing the chat (a rough sketch of the sliding-window idea follows after this thread).

      From a game design perspective, I also believe it's better to guide the AI instead of allowing completely free chat. Using structured output helps maintain creative control over the game's narrative, a topic I hope to explore soon.
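
To illustrate the sliding-window approach mentioned in the reply above, the sketch below keeps only the most recent turns before building the request payload. This is not code from the tutorial; the history array, Turn type and MAX_TURNS value are hypothetical names, and only the role/parts shape of the contents array comes from the Gemini API.

// Hypothetical chat history: alternating user/model turns.
type Turn = { role: "user" | "model"; text: string };
const history: Turn[] = [];
const MAX_TURNS = 10; // keep only the last 10 turns to cap token usage

function addTurn(role: Turn["role"], text: string): void {
    history.push({ role, text });
    // Drop the oldest turns once the window is full.
    while (history.length > MAX_TURNS) history.shift();
}

// Build the "contents" array expected by the Gemini API from the trimmed window.
function buildContents(): { role: string; parts: { text: string }[] }[] {
    return history.map(turn => ({ role: turn.role, parts: [{ text: turn.text }] }));
}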