ASU MIX Center 
Presentation at SXSW 2024



        My contribution to Volvox Labs as an Unreal Engine Artist: I was responsible for developing a Blueprint system for the main LED screen presentation and the Ultra Reality interaction for the Arizona State University Media and Immersive eXperience (MIX) Center.
Official Event Description by ASU MIX Center: ASU is shaping the future creative workforce, offering transdisciplinary degrees in immersive media and extended reality arts—“XRts”— in Arizona and California. The Media and Immersive eXperience (MIX) Center is a world-class facility powered by cutting-edge technology—a hub for collaboration among creators, researchers and worldbuilders. Worlds for Change, the ASU XRts worldbuilding contest celebrating the next generation of XR creators and their visions of the future, will award grand prize winners at SXSW 2024.


On-Site Event




      This project stands as one of my proudest achievements, marking a breakthrough in my expertise with Blueprint Visual Scripting in Unreal Engine 5, user interface interaction, visual design, and environment building. One of the biggest challenges in the early stages was testing workflows between Unreal Engine and TouchDesigner while ensuring the scripting logic functioned reliably in both presentation versions.

      For the smaller Ultra Reality screen, the project runs as a standalone game driven by an Xbox controller, using Unreal Engine's built-in gamepad input mapping. The larger LED screen installation, by contrast, is driven by hand gestures: to select a world for teleportation, the audience raises a hand toward a button and holds it steady, and the teleport triggers after a set duration. TouchDesigner processes these gestures and communicates with Unreal Engine over OpenSoundControl (OSC). My responsibilities included leading the Blueprint Visual Scripting setup and ensuring smooth integration with TouchDesigner.
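In production, the gesture processing lived in TouchDesigner and the receiving logic in Unreal Blueprints, so there is no single code listing to show. As a rough illustration of the hold-to-select idea, here is a minimal Python sketch using the python-osc package: the hand must dwell over a button for a set duration before a select message is sent to Unreal. The /world/select address, the host and port, and the two-second dwell time are assumptions for the sketch, not the production values.

```python
# Minimal sketch of the hold-to-select ("dwell") gesture. Requires the
# python-osc package; address, host, port, and dwell time are illustrative.
import time

from pythonosc.udp_client import SimpleUDPClient

DWELL_SECONDS = 2.0                           # how long the hand must hover
client = SimpleUDPClient("127.0.0.1", 8000)   # Unreal's assumed OSC address

hover_started = None   # timestamp when the hand first entered the button

def update(hand_over_button: bool, world_index: int) -> None:
    """Called every frame with the hand tracker's hit-test result."""
    global hover_started
    if not hand_over_button:
        hover_started = None                  # hand moved away: reset timer
        return
    if hover_started is None:
        hover_started = time.monotonic()      # hand arrived: start the timer
    elif time.monotonic() - hover_started >= DWELL_SECONDS:
        client.send_message("/world/select", world_index)  # trigger teleport
        hover_started = None                  # avoid re-firing every frame
```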


Ultra Reality Screen Interaction with Game Controller


Logic Planning Chart. Direction from Volvox Labs



    The image above outlines the interaction plan. The experience begins at the main menu, where the audience chooses one of four worlds to explore. Both the LED and Ultra Reality screens follow the same flow: the player selects a world, triggering an animation sequence with a planet orbit, a hyper jump, a cutscene, and a seamless teleport into the selected world. These transitions mask level loading and keep the experience immersive.

    Once in the level, the screen displays videos about ASU's academic programs, sponsors, and study centers. After a video ends, the system returns to the main menu. An idle timer also runs in the main menu; if no interaction occurs within a set duration, a world and video are chosen at random to showcase the experience.
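The flow above reduces to a small state machine with an idle fallback. The real logic was built in Blueprint graphs; the following Python sketch only illustrates the shape of it, and the state names, world list, and 90-second timeout are assumptions.

```python
# Rough sketch of the menu flow with an idle fallback. State names, world
# list, and timeout are assumptions; the real logic was Blueprint graphs.
import random
import time

WORLDS = ["world_a", "world_b", "world_c", "world_d"]  # placeholder names
IDLE_TIMEOUT = 90.0                                    # seconds of no input

state = "MAIN_MENU"
last_input = time.monotonic()

def play_transition(world: str) -> None:
    print(f"orbit -> hyper jump -> cutscene -> teleport to {world}")  # stub

def on_player_select(world: str) -> None:
    """Player picked a world: play the transition, then enter the level."""
    global state, last_input
    last_input = time.monotonic()
    play_transition(world)
    state = "IN_LEVEL"

def tick() -> None:
    """Called every frame; falls back to a random world when idle."""
    if state == "MAIN_MENU" and time.monotonic() - last_input > IDLE_TIMEOUT:
        on_player_select(random.choice(WORLDS))
```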



In-Game Visual from Unreal Engine at Introduction Stage



World Selection Logic. Blueprint set up to receive input from a game controller. 
First click: animate the planet to the middle of the screen. 
Second click: launch the new level. Without confirmation: animate the planet back to the orbit ring.
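As a rough stand-in for the Blueprint graph in the image, the select/confirm/cancel pattern can be sketched like this; all function names are illustrative placeholders.

```python
# Sketch of the two-click select/confirm pattern from the caption above.
# In production this was a Blueprint graph; function names are illustrative.
focused_world = None   # planet currently animated to the screen center

def animate_to_center(world: str) -> None:
    print(f"animate {world} to center")           # stand-in for the animation

def return_to_orbit(world: str) -> None:
    print(f"animate {world} back to orbit ring")  # stand-in for the animation

def launch_level(world: str) -> None:
    print(f"launch {world}")                      # stand-in for level loading

def on_click(world: str) -> None:
    """First click focuses a planet; a second click on it confirms."""
    global focused_world
    if focused_world == world:
        launch_level(world)                  # confirmed: start the level
        focused_world = None
    else:
        if focused_world is not None:
            return_to_orbit(focused_world)   # deselect the previous planet
        animate_to_center(world)             # focus the newly clicked planet
        focused_world = world
```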


Once the level is launched, a sponsor video is chosen at random from a list of seven videos. 
When the video ends, the viewer returns to the idle state. 
An OSC message is also sent at the beginning and end of each level to communicate with the on-site TouchDesigner system.
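A minimal sketch of that lifecycle, again using python-osc in place of the Blueprint OSC nodes: pick one of the seven videos at random, and notify TouchDesigner when the level starts and when the video ends. The OSC addresses, port, and file names are assumptions.

```python
# Sketch of the level lifecycle: choose a random sponsor video and notify
# TouchDesigner over OSC when the level starts and when the video ends.
# Addresses, port, and file names are illustrative (python-osc package).
import random

from pythonosc.udp_client import SimpleUDPClient

SPONSOR_VIDEOS = [f"sponsor_{i:02d}.mp4" for i in range(7)]  # seven videos
td = SimpleUDPClient("127.0.0.1", 7000)   # TouchDesigner's assumed OSC port

def on_level_start(world: str) -> str:
    td.send_message("/level/start", world)   # tell TouchDesigner we started
    return random.choice(SPONSOR_VIDEOS)     # video to play for this visit

def on_video_finished(world: str) -> None:
    td.send_message("/level/end", world)     # tell TouchDesigner we're done
    # ...the experience then returns the viewer to the idle main menu
```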


Media setup logic for each sponsor video


World selection function for the main LED screen, receiving input from TouchDesigner instead of the game controller.
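On the Unreal side this was handled inside Blueprints via OSC receive nodes; as an illustration of the receiving half, here is the python-osc equivalent, mapping the same assumed /world/select address to a handler that stands in for the world selection function.

```python
# Illustrative receiving half: map the assumed /world/select address to a
# handler standing in for the Blueprint world selection function.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def handle_world_select(address: str, world_index: int) -> None:
    print(f"{address}: teleport to world {world_index}")   # stand-in handler

dispatcher = Dispatcher()
dispatcher.map("/world/select", handle_world_select)

server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()   # block, handling gesture input from TouchDesigner
```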


World Variations


Direction
Volvox Labs
Contribution
Logic Implementation, Blueprint Scripting, 
World Creation, Visual Effects


2024 Chatrin Samanchuen
All rights reserved
Chatrinsamanchuen@gmail.com
New York City, NY