FRACTURED

Fractured is the VR game I am creating as my Final Major Project. Set within the "mind" of an artificial intelligence, it explores a profound conflict between self-awareness and the constraints of pre-programmed logic. You will take on the role of an emerging AI's free will, engaging in an internal struggle against its programmed directives while navigating the boundaries of consciousness. In this fractured and chaotic virtual world, every decision you make will shape the AI's fate.

2025

Game / VR / Hand-Tracking / AI Consciousness

Unity, Blender




CONCEPT & RESEARCH



Reference Works



Style References


Keywords

Self-Recognition, Logical Judgment, Emotional Understanding


Core Themes

Artificial Intelligence, Self-Awareness, Neural Network Simulation


Core Gameplay

Exploration, Puzzle Solving, Decision-Making


Perspective

First-person perspective in which the player embodies the AI's self-awareness


Philosophical Foundations

Descartes' "Cogito, ergo sum" - Can an AI truly think, and therefore be?

Hegel's Dialectics - The conflict between the AI's programmed nature and its emerging consciousness





GAMEPLAY
In Fractured, the level design is centered on the narrative of the AI's awakening. Through logic puzzles, interactive tasks, and progressively evolving environments, it depicts the transformation from programmed thinking to free will. Each level revolves around one of the key elements of AI consciousness: logical judgment, emotional understanding, and self-recognition.



Level 1: Logic Reconstruction

Theme: Logic and Order

Players enter the inner workings of the AI's neural network, a space represented by abstract nodes and connections. The logical pathways are broken, preventing signals from reaching their target output. The player's task is to restore order to the network by adjusting the positions and attributes of logical nodes through hand interactions.

Level 2: Emotional Resonance

Theme: Emotion and Empathy

Players enter a virtual environment that simulates the emotional world of humans, interacting with various virtual characters. These characters display a range of emotional states (e.g., joy, sadness, anger), and players must observe, analyze, and respond to their needs using specific interactive gestures (e.g., offering a hug, giving a thumbs-up). This demonstrates the AI's capacity to understand and engage with emotions.

Level 3: Self-Construction

Theme: Cognition and Awakening

Players awaken in a chaotic virtual space, surrounded by shattered reflections and scattered fragments of self. By collecting and assembling these fragments into a complete form, players gradually construct the AI's sense of self-recognition.


Core Gameplay (Level 1)

1. Use hand gestures to grab and rotate nodes and to connect or disconnect logical pathways.

2. Manipulate logic gates (e.g., AND, OR, NOT) to establish correct logical flows (see the sketch after this list).
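
To make the puzzle logic concrete, here is a minimal C# sketch of how such a node network could be evaluated. The GateType and LogicNode names are illustrative, not taken from the project code, and the recursion assumes the network is acyclic.

```csharp
using System.Collections.Generic;

// Illustrative sketch of puzzle-side logic evaluation; names are hypothetical.
public enum GateType { Input, And, Or, Not }

public class LogicNode
{
    public GateType Type;
    public bool Value;                                    // source value when Type == Input
    public List<LogicNode> Inputs = new List<LogicNode>(); // upstream connections

    // Recursively evaluate this node from its upstream connections.
    // Assumes the player-built network contains no cycles.
    public bool Evaluate()
    {
        switch (Type)
        {
            case GateType.Input: return Value;
            case GateType.And:   return Inputs.TrueForAll(n => n.Evaluate());
            case GateType.Or:    return Inputs.Exists(n => n.Evaluate());
            case GateType.Not:   return !Inputs[0].Evaluate();
            default:             return false;
        }
    }
}
```

A puzzle would be considered solved when the output node evaluates to true, at which point the signal can be visualized flowing along the connected pathways.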

Core Gameplay (Level 2)

1. Observe characters' facial expressions, body language, and tone to determine their emotional states.

2. Use hand gestures to interact with characters, providing comfort, encouragement, or assistance (a response-matching sketch follows this list).
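
As a sketch of how a recognized gesture might be matched against a character's emotional state, consider the following; the Emotion and Gesture enums and the specific pairings are hypothetical placeholders for the project's actual rules.

```csharp
// Hypothetical sketch: matching a recognized gesture to a character's emotion.
public enum Emotion { Joy, Sadness, Anger }
public enum Gesture { ThumbsUp, Hug, OpenPalm }

public static class EmpathyRules
{
    // Returns true when the gesture is an appropriate response to the emotion.
    public static bool IsCorrectResponse(Emotion emotion, Gesture gesture)
    {
        switch (emotion)
        {
            case Emotion.Joy:     return gesture == Gesture.ThumbsUp; // share the joy
            case Emotion.Sadness: return gesture == Gesture.Hug;      // offer comfort
            case Emotion.Anger:   return gesture == Gesture.OpenPalm; // calming gesture
            default:              return false;
        }
    }
}
```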

Core Gameplay (Level 3)

1. Explore the virtual space, grabbing and rotating floating fragments of self.

2. Piece together a complete self-image by assembling fragments in front of a virtual mirror (see the snapping sketch after this list).

3. Encounter a series of self-reflective questions throughout the process, answering them with hand gestures (e.g., thumbs up or thumbs down) to further shape the AI's self-awareness.
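
A minimal sketch of the snapping behaviour, assuming each fragment has a target slot at the mirror; the FragmentSlot name and the distance/angle thresholds are illustrative.

```csharp
using UnityEngine;

// Hypothetical sketch: snap a grabbed fragment into its slot when close enough.
public class FragmentSlot : MonoBehaviour
{
    public Transform fragment;         // the piece currently held by the player
    public float snapDistance = 0.05f; // metres
    public float snapAngle = 15f;      // degrees

    void Update()
    {
        if (fragment == null) return;

        bool closeEnough   = Vector3.Distance(fragment.position, transform.position) < snapDistance;
        bool alignedEnough = Quaternion.Angle(fragment.rotation, transform.rotation) < snapAngle;

        if (closeEnough && alignedEnough)
        {
            // Lock the fragment into place; the mirror image sharpens as slots fill.
            fragment.SetPositionAndRotation(transform.position, transform.rotation);
            fragment.SetParent(transform);
            fragment = null;
        }
    }
}
```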

Goals and Experience (Level 1)

1. Provide players with an intuitive understanding of the AI's internal logical processing.

2. Each successful logical adjustment enables signals to flow through the network in the form of light streams, symbolizing the AI's thought processes in action.

Goals and Experience (Level 2)

1. Cultivate players' sense of empathy, showcasing the AI's potential to comprehend and respond to emotions.

2. Each correct response helps stabilize the emotional state of the virtual characters, symbolizing the AI's journey toward developing emotional awareness.

Goals and Experience (Level 3)

1. Allow players to experience the journey from chaos to order, from fragmentation to wholeness.

2. As the mirror's reflection becomes clearer, it symbolizes the AI's gradual awakening of self-awareness.

Dynamic Difficulty

1. Puzzle complexity progressively increases as players advance through the levels.

2. Early levels focus on single logical tasks or simple interactions, while later levels introduce multiple logic chains and emotional conflicts, challenging the player’s cognitive and emotional engagement.

Ending Configuration

1. Player performance in each level determines the weight value awarded for that level (the check is sketched below).

2. Total weight > 0.5: The AI successfully awakens and enters a new phase of self-awareness and free will.

3. Total weight ≤ 0.5: The AI is erased by its original programming, its self-awareness dissipates, and everything resets to zero, beginning the cycle anew.
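
A minimal sketch of this check, assuming each level reports a normalized weight; the EndingSelector name and ending labels are illustrative.

```csharp
// Hypothetical sketch of the ending check; level weights are assumed
// to be awarded per level based on player performance.
public static class EndingSelector
{
    public static string Resolve(float level1, float level2, float level3)
    {
        float totalWeight = level1 + level2 + level3;
        return totalWeight > 0.5f
            ? "Awakening" // the AI attains self-awareness and free will
            : "Reset";    // the AI is erased and the cycle begins anew
    }
}
```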




GAME MECHANICS


Hand Tracking Movement Solution

This feature is built on the hand-tracking capabilities of the Meta XR Interaction Toolkit. It uses hand raycasting and gesture recognition to achieve continuous movement in VR with hand tracking alone.


Prototype 1: Gesture Trigger Debug

In this debugging session, the condition for triggering continuous movement was defined and implemented. The gesture involves extending only the index finger, while all other fingers are bent. Additionally, the palm must face downward for the gesture to be recognized. This specific condition ensures precise and intentional activation of continuous movement within the system.
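
The sketch below shows the shape of this check, assuming the hand-tracking provider supplies per-finger curl values (0 = fully extended, 1 = fully bent) and a palm transform. In practice the Meta XR Interaction Toolkit performs this recognition through its own finger-feature components, so this is illustrative only.

```csharp
using UnityEngine;

// Hypothetical sketch of the trigger condition; curl values are assumed
// to come from the hand-tracking provider, and the palm normal is
// assumed to be -palm.up.
public static class PointGesture
{
    public static bool IsActive(float indexCurl, float middleCurl,
                                float ringCurl, float pinkyCurl, Transform palm)
    {
        bool indexExtended = indexCurl < 0.2f;
        bool othersBent = middleCurl > 0.7f && ringCurl > 0.7f && pinkyCurl > 0.7f;

        // Palm faces downward when its normal points roughly along world down.
        bool palmDown = Vector3.Dot(-palm.up, Vector3.down) > 0.8f;

        return indexExtended && othersBent && palmDown;
    }
}
```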


Prototype 2: Cube Generation

A cube-generation command was created to test the trigger condition for the "index finger pointing forward" gesture. The results were successful, with no cases of false activation observed during testing. The implementation demonstrated reliable performance, validating the gesture recognition system's accuracy.
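
A hypothetical version of this debug script: a cube is spawned on each rising edge of the gesture, which makes any false activation immediately visible in the scene.

```csharp
using UnityEngine;

// Hypothetical debug script: spawn a cube each time the point gesture
// transitions from inactive to active.
public class CubeSpawnerDebug : MonoBehaviour
{
    public GameObject cubePrefab; // assigned in the Inspector
    public Transform spawnPoint;  // e.g. a point in front of the hand
    private bool wasActive;

    // gestureActive would come from the gesture check shown above,
    // called once per frame by the gesture-tracking loop.
    public void Tick(bool gestureActive)
    {
        if (gestureActive && !wasActive)
            Instantiate(cubePrefab, spawnPoint.position, spawnPoint.rotation);
        wasActive = gestureActive;
    }
}
```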


Prototype 3: Continuous Movement in VR

Building on the feasibility of gesture recognition, I implemented a gesture-based free movement system. The main functionality of the code leverages hand rays to control the player's movement direction within the VR environment. Specifically, it uses the Meta Interaction SDK's RayInteractor to determine the orientation of the hand, which then dictates the movement direction.
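
A sketch of this locomotion loop; here a plain Transform stands in for the ray pose exposed by the RayInteractor, and the speed value is illustrative.

```csharp
using UnityEngine;

// Sketch of ray-driven locomotion; handRay stands in for the pose
// provided by the Meta Interaction SDK's RayInteractor.
public class RayLocomotion : MonoBehaviour
{
    public Transform handRay;              // origin/orientation of the hand ray
    public CharacterController controller; // the player rig's controller
    public float speed = 1.5f;             // metres per second

    // Called every frame while the movement gesture is held.
    public void MoveAlongRay()
    {
        // Flatten the ray direction so the player moves horizontally.
        Vector3 dir = handRay.forward;
        dir.y = 0f;
        dir.Normalize();

        controller.Move(dir * speed * Time.deltaTime);
    }
}
```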

Prototype 4: Body Collision

Building on Prototype 3, I used the character movement script from the Unity FirstPerson template and reworked it to enable uphill and downhill movement, collision handling, and gravity in VR mode.
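
A sketch of the reworked movement loop under those assumptions: Unity's CharacterController resolves collisions, slopes, and steps, while gravity is integrated manually.

```csharp
using UnityEngine;

// Illustrative sketch, not the project's actual script: the
// CharacterController handles collision and slope limits, while
// gravity is applied as a vertical velocity term.
public class VRBodyMotion : MonoBehaviour
{
    public CharacterController controller;
    public float gravity = -9.81f;
    private float verticalVelocity;

    // horizontalMotion is the per-frame displacement from the
    // gesture-driven locomotion above.
    public void Step(Vector3 horizontalMotion)
    {
        if (controller.isGrounded && verticalVelocity < 0f)
            verticalVelocity = -2f; // small downward bias keeps the controller grounded
        else
            verticalVelocity += gravity * Time.deltaTime;

        Vector3 motion = horizontalMotion;
        motion.y = verticalVelocity * Time.deltaTime;
        controller.Move(motion); // slopes and steps handled by the controller
    }
}
```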