Design + Programming

Cyberworld Skies

Cyberworld Skies is an interactive art installation built with Unreal Engine, Ableton, and Max 8 that reacts to noise and sound.

The Project

Cyberworld Skies is an interactive sound and visual experience exploring the relationship and contrast between cyber aesthetics and natural forms. This is represented through the transformation of weather, including cloud cover, rain, and sun, responding in generative forms to EDM-style music and to interactive feedback from human voice and MIDI keyboard input.

Audio

The audio for the experience was developed in Ableton, with each track running separately so that it could drive a different visual element. The two interactive elements are the cloud-cover wave, which fluctuates with voice input from a microphone, and the gray sun orb, which distorts with MIDI keyboard hits. The other three elements, the pixel raindrops, the building lights, and the sky, are synced to parts of the mix such as the snare, claps, and bass. Once the sound design was complete, the tracks were run through envelopes in Max to track their frequency levels. When a level crosses a set threshold, Max outputs a reading to Unreal Engine, where our cyber city was built and where the programs combine to form the final experience.
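
As a rough illustration of what that envelope stage does (the real version is a Max patch, not Python code), the sketch below follows a track's amplitude and reports the moments it crosses a threshold; the window size, threshold, and test signal are illustrative assumptions, not project values.

    # Minimal sketch of an envelope follower with a threshold gate,
    # approximating what the Max envelope stage does for each Ableton track.
    # Window size and threshold are illustrative, not the project's settings.
    import numpy as np

    def envelope(signal, sr, window_ms=10):
        """Return a rough amplitude envelope: RMS over short windows."""
        window = max(1, int(sr * window_ms / 1000))
        padded = np.pad(signal.astype(float) ** 2, (0, -len(signal) % window))
        frames = padded.reshape(-1, window)
        return np.sqrt(frames.mean(axis=1))

    def gate(env, threshold=0.2):
        """Return the frame indices where the envelope crosses the threshold upward."""
        above = env > threshold
        return np.flatnonzero(above[1:] & ~above[:-1]) + 1

    # Example: a quiet 1-second tone with a loud burst in the middle.
    sr = 44100
    t = np.linspace(0, 1, sr, endpoint=False)
    signal = 0.05 * np.sin(2 * np.pi * 110 * t)
    signal[sr // 2 : sr // 2 + 4410] *= 10  # loud burst
    print("trigger frames:", gate(envelope(signal, sr)))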

Connecting the Visuals

The visuals were created in Unreal Engine using various 3D assets. Each Ableton track was bound to an envelope in Max, which the incoming signal from Ableton activates. Once a track's level reached the envelope's threshold, Unreal Engine rendered the result visually by lighting up a specific object.
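
To make that track-to-object binding concrete, here is a minimal routing sketch; the track names, OSC addresses, and threshold values are hypothetical stand-ins, since the real mapping lives in the Max patch and the Unreal blueprint.

    # Hypothetical routing table from Ableton track to Unreal asset.
    # Addresses and thresholds are illustrative, not the project's values.
    ROUTES = {
        "snare": {"address": "/cyberworld/rain",      "threshold": 0.30},
        "claps": {"address": "/cyberworld/buildings", "threshold": 0.25},
        "bass":  {"address": "/cyberworld/sky",       "threshold": 0.40},
        "voice": {"address": "/cyberworld/clouds",    "threshold": 0.15},
        "keys":  {"address": "/cyberworld/sun",       "threshold": 0.20},
    }

    def route(track, level):
        """Return (osc_address, value) if the track's level crosses its threshold."""
        r = ROUTES.get(track)
        if r and level >= r["threshold"]:
            return r["address"], level
        return None

    print(route("snare", 0.5))  # ('/cyberworld/rain', 0.5)
    print(route("bass", 0.1))   # None: below threshold, nothing lights up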

Max programming was used to activate assets in Unreal Engine, creating a unique and engaging art installation that responded to sound in real time. I created a Max patch that received sound data from Ableton, along with a custom control interface in Max for manipulating that data on the fly. Max objects sent OSC messages to Unreal Engine, where a blueprint received them and triggered interactions with the corresponding assets and elements. Using Unreal Engine's Blueprint visual scripting, I defined the specific interactions and behaviors to fire when each OSC message was received. The result was a dynamic and immersive experience that responded to sound and allowed viewers to actively engage with the art installation.
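
Inside Max the send is typically handled by a udpsend object, but the shape of the link can be sketched in a few lines of Python with the python-osc library; the host, port, and address names here are assumptions for illustration.

    # Sketch of the Max-to-Unreal OSC link using python-osc
    # (pip install python-osc). In the installation this is done by the
    # Max patch itself; host, port, and addresses are assumed values.
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 8000)  # Unreal's OSC listen port (assumed)

    # When an envelope crosses its threshold, send the level to the matching asset.
    client.send_message("/cyberworld/sun", 0.72)     # distort the sun orb
    client.send_message("/cyberworld/clouds", 0.41)  # push the cloud-cover wave

On the Unreal side, the engine's OSC plugin provides blueprint nodes for creating an OSC server and binding addresses to events, which is roughly how the receiving blueprint described above is wired up.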

In Real-Time 

By using interactive elements in Unreal Engine, participants were able to actively engage with the art installation in real time, creating a more immersive and engaging experience. Tracking the level and frequency of the sound picked up by the microphone and played on the MIDI keyboard adds a further layer of interactivity, letting viewers see the direct effect of their actions on the visual representation of the sound.
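
For completeness, the receiving end of that loop can be sketched the same way; in the actual piece this role is played by the Unreal blueprint, and the port and addresses below are the same illustrative assumptions as in the earlier sketch.

    # Sketch of the receiving side of the OSC loop (pip install python-osc).
    # In the installation this role is played by an Unreal blueprint; the
    # port and addresses are the same illustrative assumptions as above.
    from pythonosc.dispatcher import Dispatcher
    from pythonosc.osc_server import BlockingOSCUDPServer

    def on_sun(address, level):
        """Stand-in for the blueprint event that distorts the sun orb."""
        print(f"{address}: distort sun orb by {level:.2f}")

    def on_clouds(address, level):
        """Stand-in for the blueprint event that moves the cloud-cover wave."""
        print(f"{address}: raise cloud wave to {level:.2f}")

    dispatcher = Dispatcher()
    dispatcher.map("/cyberworld/sun", on_sun)
    dispatcher.map("/cyberworld/clouds", on_clouds)

    server = BlockingOSCUDPServer(("127.0.0.1", 8000), dispatcher)
    server.serve_forever()  # blocks; Ctrl+C to stop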

The Experience 

Overall, using Max programming to activate Unreal Engine assets was a fascinating and rewarding experience. The combination of custom control interfaces, scripting, and visual programming allowed me to create a unique and engaging art installation that responded to sound in real time. It was a great way to showcase different technologies coming together to create something immersive and interactive.

The Installation Piece

The combination of Max, Ableton, and Unreal Engine, with interactive elements driven by the sound a participant makes through the microphone or MIDI keyboard, creates a truly engaging art installation that encourages viewers to explore and interact with sound and music in a new way.