🛠

Hello Alice

Hello Alice reimagines coding education for 10–12-year-old students by moving beyond traditional drag-and-drop models and introducing AI-assisted programming in classroom settings. Integrated with OpenAI’s API, the tool translates natural language prompts into executable code, allowing users to instantly see their changes reflected in a dynamic 3D game scene.

To support learning transparency, the system also generates visualized, editable code blocks, helping young learners connect their actions to real programming concepts.

This project was developed in collaboration with the Alice Team as the client.

Try it on itch.io -> [LINK] (Note: the toon shader we use is not compatible with WebGL builds, so the art may look different from the Windows build.)

Project Type

Educational tool

💭

Time

Jan 2025 - Present

Team

Jinyi Dai, Ruiting Chang, Yifan Jiang, Zhanqi Yang, Jia Wang, Wendy Di

My Role

Lead programmer, designer

🧩

1. Tool Architecture & Development

Designed the core functionality of Hello Alice and led full-stack development across frontend and backend systems. Structured the entire project architecture to support scalability and modular expansion.

2. AI System Integration

Integrated OpenAI’s Assistant API for seamless real-time prompt processing via web API communication, enabling natural language inputs to drive scene updates.

3. Real-Time Code Execution Pipeline

Built a Lua scripting layer that exposes C# functions to AI-generated scripts, enabling safe, dynamic code execution inside Unity’s runtime environment.

4. Modular Code Visualization System

Developed an extensible code visualization framework that represents functions and method calls as editable code blocks, allowing users to intuitively modify AI-generated behaviors.

Tool Walkthrough

Pick a character and choose from multiple story scenes.

In the Spell Lab, give prompts to the AI by typing or using voice input, then wait for the AI's response.

Note: As the target audience is 10–12-year-old kids, we use the word “Spell” instead of “Script” in the tool.

“Set the forest on fire!”

Check the AI-generated spells.

The AI-generated code is executed dynamically at runtime, with changes immediately reflected in the game scene — for example, setting a forest on fire happens the moment the command is issued.

To enhance understanding, the corresponding code is visualized on the right side of the screen as colored, editable code blocks. Each block represents an atomic function within the system. Users can customize these blocks using dropdown menus, checkboxes, and draggable input fields, allowing for intuitive adjustment of parameters without directly writing code.
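
Under the hood, each visualized block can be thought of as a small data model that maps one atomic function call to a set of editable fields. The sketch below shows one way such a model could look; the type and member names (CodeBlockModel, BlockParameter) are illustrative placeholders, not the actual Hello Alice classes.

    using System.Collections.Generic;

    // Illustrative data model: one editable block per atomic function call.
    public enum ParameterWidget { Dropdown, Checkbox, DraggableNumber }

    public class BlockParameter
    {
        public string Name;              // e.g. "target"
        public ParameterWidget Widget;   // which UI control edits this value
        public object Value;             // current value shown in the block
        public List<string> Options;     // choices for dropdown widgets, if any
    }

    public class CodeBlockModel
    {
        public string FunctionName;              // atomic function, e.g. "SetFire"
        public List<BlockParameter> Parameters;  // one entry per editable field
    }

Editing a dropdown or dragging a numeric field only rewrites the corresponding parameter value, so users adjust behavior without ever typing code.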

Link spells with objects to build interactions.

All spells (AI-generated scripts) require an in-game trigger to execute.

We designed an intuitive interaction system: when the player approaches an interactive object and presses the designated key, one or more linked scripts are triggered immediately. This allows players to naturally embed actions into the game world, creating dynamic, story-driven interactions without needing complex setup.
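
In Unity terms, this interaction can be pictured as a small trigger component attached to the interactive object. The sketch below is a minimal illustration; the class name, fields, and the SpellRunner helper are hypothetical stand-ins, not the production implementation.

    using System.Collections.Generic;
    using UnityEngine;

    // Placeholder for the component that actually runs AI-generated Lua spells.
    public static class SpellRunner
    {
        public static void Execute(string spellId) => Debug.Log("Run spell: " + spellId);
    }

    // Illustrative sketch: run the spells linked to this object when the
    // player is close enough and presses the designated key.
    public class SpellTriggerExample : MonoBehaviour
    {
        public KeyCode interactKey = KeyCode.E;   // designated interaction key
        public float interactRange = 2f;          // required proximity in meters
        public Transform player;                  // assigned in the Inspector
        public List<string> linkedSpellIds = new List<string>();

        void Update()
        {
            if (player == null) return;

            bool inRange = Vector3.Distance(player.position, transform.position) <= interactRange;
            if (inRange && Input.GetKeyDown(interactKey))
            {
                // One or more linked spells may fire from a single trigger.
                foreach (var id in linkedSpellIds)
                    SpellRunner.Execute(id);
            }
        }
    }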

Technical Structure

User Layer

The user initiates interaction by entering a natural language prompt (e.g., "I want a large mushroom").

After submission, the user interface enters a waiting state while the system processes the request and AI generates the response.

System Layer

The Unity-based system handles the following responsibilities:

  1. Scene Context Collection
    Captures relevant scene data at the time of input to provide contextual awareness to the AI.

  2. Message Packaging
    Combines the user prompt with scene data and formats it into a structured message for the AI.

  3. AI Request Handling
    Sends the packaged message via a POST request to OpenAI’s Assistant API and begins polling for status updates (a simplified sketch of this flow follows the list).

  4. Response Parsing & Validation
    Once a response is received, the system parses the returned Lua script, checks for errors or invalid calls, and queues it for execution.

  5. Lua Code Execution (via MoonSharp)
    The returned Lua code is:

    • Queued and managed in a coroutine-based execution system

    • Executed sequentially, ensuring predictable changes in the scene

    • Visualized as code blocks, mapping each method call to a corresponding UI representation (e.g., colored blocks with editable parameters)

  6. Scene Update & Feedback
    Upon successful execution, visual changes are applied to the Unity scene in real-time (e.g., a large mushroom appears), and the executed scripts are recorded for debugging or undo purposes.
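
To make steps 2–4 concrete, the Unity side can be pictured as a coroutine that packages the prompt with the captured scene context, posts it, and polls until the run reports completion. This is only a rough sketch under assumptions: the endpoint constant, the JSON body, and the polling helpers below are placeholders, not the exact Assistant API schema or the project's production code.

    using System.Collections;
    using System.Text;
    using UnityEngine;
    using UnityEngine.Networking;

    // Illustrative sketch of steps 2-4: package the message, POST it, poll the status.
    public class SpellRequestExample : MonoBehaviour
    {
        [SerializeField] string apiKey;                              // OpenAI API key
        const string RunEndpoint = "https://api.openai.com/v1/...";  // actual route omitted

        public IEnumerator SendSpellPrompt(string userPrompt, string sceneContextJson)
        {
            // Step 2: combine the prompt with the captured scene context.
            // (Naive concatenation for illustration; real code should JSON-escape.)
            string body = "{\"prompt\":\"" + userPrompt + "\",\"scene\":" + sceneContextJson + "}";

            // Step 3: POST the packaged message.
            using (var req = new UnityWebRequest(RunEndpoint, "POST"))
            {
                req.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(body));
                req.downloadHandler = new DownloadHandlerBuffer();
                req.SetRequestHeader("Content-Type", "application/json");
                req.SetRequestHeader("Authorization", "Bearer " + apiKey);
                yield return req.SendWebRequest();

                if (req.result != UnityWebRequest.Result.Success)
                {
                    Debug.LogError("Request failed: " + req.error);
                    yield break;
                }
            }

            // Steps 3-4: poll until the run is completed, then hand the returned
            // Lua script to the execution queue (placeholder helpers below).
            string status = "queued";
            while (status != "completed")
            {
                yield return new WaitForSeconds(1f);
                status = PollRunStatus();
            }
            string luaScript = FetchLuaResponse();
            Debug.Log("Lua ready for execution:\n" + luaScript);
        }

        // Placeholders standing in for the real status and response requests.
        string PollRunStatus() { return "completed"; }
        string FetchLuaResponse() { return ""; }
    }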

System Sequence Diagram

Lua Scripting Layer

The Lua scripting layer acts as a controlled bridge between AI and Unity, interpreting and validating all commands.

Instead of asking the AI to generate a complete C# script and executing raw C# code, the system has the AI produce Lua scripts that use only a curated set of exposed functions. The scripting layer interprets these Lua instructions and delegates execution to the corresponding C# methods within Unity, enforcing strict control over which operations are allowed. By sandboxing AI output in Lua, the system prevents unsafe behavior, enables fast runtime execution, and keeps user-generated interactions stable, modular, and easy to visualize.
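
Concretely, with MoonSharp this boils down to registering a whitelist of C# delegates as Lua globals on a sandboxed interpreter and running the AI's script against it. The sketch below illustrates the idea; the exposed function names and the sample Lua calls are illustrative, not the actual Hello Alice API.

    using System;
    using MoonSharp.Interpreter;
    using UnityEngine;

    // Illustrative MoonSharp sandbox: AI-generated Lua can only call the
    // functions that are explicitly registered here.
    public static class LuaBridgeExample
    {
        public static void RunSpell(string luaCode)
        {
            // Hard sandbox preset strips Lua's file/OS modules from the environment.
            var script = new Script(CoreModules.Preset_HardSandbox);

            // Curated set of exposed C# functions (names are illustrative).
            script.Globals["SpawnObject"] = (Action<string, float>)((name, scale) =>
                Debug.Log($"Spawn {name} at scale {scale}"));
            script.Globals["SetFire"] = (Action<string>)(target =>
                Debug.Log($"Set {target} on fire"));

            try
            {
                script.DoString(luaCode);   // execute the AI-generated spell
            }
            catch (InterpreterException e)
            {
                // Errors or disallowed calls surface here instead of crashing Unity.
                Debug.LogWarning("Spell rejected: " + e.DecoratedMessage);
            }
        }
    }

A response such as SpawnObject("mushroom", 3.0) or SetFire("forest") therefore executes only through these delegates, which keeps the exposed surface small, stable, and easy to mirror as code blocks.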

AI Layer

The AI component uses OpenAI's Assistant API:

  • Each prompt-thread is handled as a task that goes through Queued → In Progress → Completed states

  • On completion, a Lua-formatted response is returned, containing functions available within the exposed API sandbox

  • This structure ensures all outputs are interpretable and executable within Unity's scripting environment

Implemented Methods Preview

One-Page Design Documentation

