ECHOES OF MOTION
Echoes of Motion is an immersive digital dance production created in collaboration with LCDS dancers. It integrates motion capture technology with the visual capabilities of Unreal Engine 5, blending digital technology with dance to invite the audience to explore the meaning of life through an immersive virtual experience.
2024
Immersive Dance / Interactive Installation / VR
UE5, TouchDesigner
SPECIAL THANKS
Emma Dexter-Smith - Dancer
Maddy Knight - Dancer
VR Mode Showcase
CONCEPT DESIGN
Soundtrack
I chose Ryuichi Sakamoto's "andata" for Echoes of Motion because its atmosphere harmonizes seamlessly with the emotions we aim to evoke. The delicate background sounds, reminiscent of gentle rain or a soft stream, and the serene melody create an ethereal environment that accentuates the performance’s narrative. Through its subtle beauty, the music underscores the exploration of life's transient nature and the fragility of the natural world, inviting the audience into a meditative journey where sound and movement converge to reflect the essence of existence.
Virtual Space Design
The space design is meant to mirror the essence of the music. A shaft of light breaks through the darkness, highlighting the dancer and the rhythmically flowing particles. These visual elements not only enhance the beauty of the scene but also amplify the connection between the music and the dance, offering a poetic homage to the ephemeral nature of life and its delicate state. This design enriches the theme and crafts a space that resonates with the heart and captivates the eyes.
Motion Capture
The motion capture for Echoes of Motion was a collaborative effort with dancers from the London Contemporary Dance School (LCDS), using the Vicon system in the motion capture studio at London College of Communication. The choreography was created by the LCDS dancers in response to the selected music, following discussions and creative direction.
I was responsible for refining the captured motion data and applying it to the virtual dancer character, ensuring a seamless and expressive translation of their movements into the digital space.
Character Animation
Due to technical limitations, motion capture data exported from Vicon Shogun was not directly compatible with Unreal Engine 5 characters. To address this, the raw data was first imported into Blender for refinement, where issues such as jitter and unintended motion artifacts were corrected to ensure smooth, natural movement.
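As a rough illustration of this cleanup pass, the jitter filtering can be scripted against Blender's Python API. The action name and window size below are placeholders, not the exact values used in the project; the script simply runs a moving-average filter over every F-curve of the imported take:

```python
import bpy

ACTION_NAME = "mocap_take_01"   # placeholder name for the imported Vicon action
WINDOW = 2                      # half-width of the smoothing window, in frames

action = bpy.data.actions[ACTION_NAME]
for fcurve in action.fcurves:
    keys = fcurve.keyframe_points
    values = [k.co.y for k in keys]          # snapshot before editing in place
    for i, key in enumerate(keys):
        lo = max(0, i - WINDOW)
        hi = min(len(values), i + WINDOW + 1)
        key.co.y = sum(values[lo:hi]) / (hi - lo)  # average of neighbouring frames
    fcurve.update()
```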
The character's skeleton was then bound to the motion data, creating a fully rigged animation. Finally, the refined animation was imported into Unreal Engine 5, where it was retargeted to an Unreal Engine character. This workflow ensured seamless integration of the motion capture data into the virtual environment while preserving the quality and fidelity of the original performance.
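For repeat iterations, the FBX import step can also be automated through the Unreal editor's Python API. This is a minimal sketch with placeholder file and asset paths, not the project's actual setup:

```python
import unreal

# Hypothetical paths; the target skeleton asset must already exist in the project.
options = unreal.FbxImportUI()
options.import_as_skeletal = True
options.import_animations = True
options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
options.skeleton = unreal.load_asset("/Game/Dancer/SK_Dancer_Skeleton")

task = unreal.AssetImportTask()
task.filename = "C:/MoCap/refined_dance.fbx"       # refined animation from Blender
task.destination_path = "/Game/Dancer/Animations"
task.automated = True                              # suppress the import dialog
task.options = options

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```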
Interactive Water Surface
To create an interactive water surface, I utilized Unreal Engine's official water blueprint from the demo project. After integrating it into my project, I noticed that my character couldn't interact with the water. Upon reviewing the official documentation, I learned that the water blueprint only interacts with the capsule collider of a character, which works well for movement-based gameplay. However, since my character's capsule collider remains stationary during animations, no interaction occurred.
To resolve this, I modified the blueprint nodes to allow the water surface to interact with other objects. I then attached two spherical colliders to the character's foot bones, enabling the creation of ripples as the character dances.
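The blueprint modification itself is node-based, but the underlying idea, point disturbances from the foot colliders feeding a propagating wave, can be sketched outside Unreal in a few lines of NumPy. Grid size, damping, and the disturbance position are illustrative only:

```python
import numpy as np

N = 128                      # grid resolution
prev = np.zeros((N, N))      # height field at the previous frame
curr = np.zeros((N, N))      # height field at the current frame
DAMPING = 0.99               # energy loss per frame

def disturb(x, y, strength=1.0):
    """Inject a disturbance where a foot collider meets the surface."""
    curr[y, x] += strength

def step():
    """Advance the wave by one frame (classic two-buffer water algorithm)."""
    global prev, curr
    nxt = (np.roll(curr, 1, 0) + np.roll(curr, -1, 0) +
           np.roll(curr, 1, 1) + np.roll(curr, -1, 1)) / 2.0 - prev
    prev, curr = curr, nxt * DAMPING

disturb(64, 64)              # a footfall at the centre of the pool
for _ in range(10):
    step()
print(curr[60:68, 60:68].round(3))  # ripple spreading outward from the impact
```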
Particle System
To visually represent the rhythm and emotion of the music, I created an audio-reactive particle effect that dynamically responds to the music's tempo and the dancer's movements. This effect highlights the interaction between music and dance, allowing the audience to intuitively experience their synergy.
Using UE5's Niagara system, I designed particles that react to the audio's fluctuations. Initially, due to the gentle nature of the background music, the particle flow appeared subtle. To enhance the effect, I adjusted the system's sensitivity to the music's intensity, added a metallic texture to make the particles shimmer under light, and reduced their flow speed for smoother synchronization with the music.
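Inside Unreal this mapping lives in Niagara, but the signal driving it is easy to illustrate. The hypothetical offline sketch below computes a per-frame loudness (RMS) envelope of the track, the kind of 0-1 value that scales the particles' spawn rate and speed; it assumes a mono 16-bit WAV and uses a placeholder file name:

```python
import wave
import numpy as np

FPS = 30                                    # render frame rate to sync against
with wave.open("andata.wav", "rb") as wav:  # placeholder file name, mono 16-bit
    rate = wav.getframerate()
    samples = np.frombuffer(wav.readframes(wav.getnframes()), dtype=np.int16)
samples = samples.astype(np.float32) / 32768.0

hop = rate // FPS                           # audio samples per video frame
frames = len(samples) // hop
envelope = np.array([
    np.sqrt(np.mean(samples[i * hop:(i + 1) * hop] ** 2))  # RMS per frame
    for i in range(frames)
])
envelope /= max(envelope.max(), 1e-6)       # normalise to 0..1

SENSITIVITY = 3.0                           # boost for the gentle soundtrack
spawn_scale = np.clip(envelope * SENSITIVITY, 0.0, 1.0)
```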
Interactive Installation
To transform the project into an interactive installation, I used a Kinect sensor to track the viewer's head position in real time. This position data was then streamed into Unreal Engine 5 via TouchDesigner, enabling a dynamic parallax effect.
This setup allows viewers to experience the virtual dance from different angles without the need for VR equipment. By accurately tracking the viewer's head position, the system renders view-dependent images on the display, effectively creating the illusion of watching an immersive dance performance through a real window.
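On the TouchDesigner side, the head data comes from the Kinect CHOP. The project feeds Unreal through the TouchEngine plugin (see the prototypes below), but as a simpler stand-in the same channels could be forwarded over OSC; the operator and channel names here ('kinect1', 'oscout1', 'p1/head:...') are TouchDesigner defaults and assumptions, not necessarily my exact network:

```python
# CHOP Execute DAT callback: fires whenever the Kinect channels change.
def onValueChange(channel, sampleIndex, val, prev):
    kinect = op('kinect1')
    head = [
        kinect['p1/head:tx'][0],   # left-right
        kinect['p1/head:ty'][0],   # up-down
        kinect['p1/head:tz'][0],   # forward-backward
    ]
    # Forward the head position to the render machine as an OSC message.
    op('oscout1').sendOSC('/head', head)
    return
```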
Prototype 1
To build the first prototype, I activated the TouchEngine plugin in Unreal Engine 5 and set up a blueprint to retrieve only the right hand's left-right movement data from TouchDesigner. The transmitted range was from -100 to 100, where negative values indicated movement to the left and positive values to the right. In UE5, this data was mapped to camera rotation, with a value of 100 rotating the camera 360 degrees to the right around the center.
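The mapping itself is linear. A small sketch of the blueprint logic (the clamp and scale, not the actual nodes):

```python
def hand_value_to_yaw(value: float) -> float:
    """Map the transmitted [-100, 100] range to camera yaw in degrees.

    +100 corresponds to a full 360-degree rotation to the right around
    the centre; negative values rotate left.
    """
    value = max(-100.0, min(100.0, value))  # clamp to the transmitted range
    return value / 100.0 * 360.0

assert hand_value_to_yaw(100) == 360.0
assert hand_value_to_yaw(-50) == -180.0
```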
Prototype 2
Building on Prototype 1, Prototype 2 introduces additional axes to track up-down and forward-backward movements, enhancing spatial interaction. The tracked object was switched from the right hand to the head, enabling a more intuitive parallax experience. The camera's behavior was refined to dynamically counteract head movement—when the head moves right, the camera rotates left—creating a realistic parallax effect that simulates depth and spatial perspective. A debugging node was also integrated to display real-time position values on the screen, ensuring accurate monitoring and smoother development iteration.
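The counter-movement reduces to flipping the sign of each tracked axis before applying it to the camera. A sketch with illustrative gains (the real values were tuned interactively in the blueprint):

```python
def parallax_camera(head_x: float, head_y: float, head_z: float,
                    gain_yaw: float = 0.5,
                    gain_pitch: float = 0.5,
                    gain_dolly: float = 1.0):
    """Turn a tracked head offset into an opposing camera move."""
    yaw = -head_x * gain_yaw      # head moves right -> camera rotates left
    pitch = -head_y * gain_pitch  # head moves up -> camera tilts down
    dolly = -head_z * gain_dolly  # head moves forward -> view pushes in
    return yaw, pitch, dolly
```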
Interactive Installation Showcase