
Origin

After working with Intel on design and prototyping, I was approached to pitch an audio-reactive VR experience showcasing the potential of Intel's massive new volumetric capture stage, in collaboration with singer, songwriter, and comedian Reggie Watts. The piece also had to be ready in a short six-month window to premiere at Sundance 2019 and tour subsequent festivals. To accomplish this, I put together a small team of incredibly talented individuals who each filled their niche perfectly. Working with Ryan McGee and Sagar Patel under the LifeOrange name, we designed and scoped a vision that would highlight what Intel's technology had to offer in an emerging content market, while ensuring attendees were taken on a visually stunning musical escapade that the multifaceted personas and definitive music of Reggie Watts and co-collaborator John Tejada (together, Wajatta) would complement harmoniously.

During production, we consulted with NYC-based dance coordinator Kiira Benzing (Double Eye Studios), who directed the original live performance capture of Wajatta's song Runnin' with Intel Studios, to block out the best cuts from the live dance footage and highlight each dancer's unique emotion and style as it was envisioned in the moment.

Visual

In order to bring the most diverse and vividly personal experience possible to the audience at each point in the song, we built a range of custom shaders that worked seamlessly with the plugin Intel provided. These shaders let us highlight each dancer and push the desired emotion toward any style we chose, while still handling the large data sets in the volumetric capture footage efficiently enough to hold a solid 90 fps for the entire experience. Many early design revisions focused on a completely audio-reactive, procedurally built space, a nod to the brilliant aesthetics of Disney's Tron: Legacy, though we ultimately balanced that with a static environment the audience could tether themselves to, accounting for the wide range of people who would experience this (potentially their first time in VR).

We consulted with designer Robbie Tilton and concept artist Candi Quach to help define the look of the main stage and how it could change dynamically over the course of the performance. The experience starts in a dark, dusty record store. A few magical twists of Reggie's voice draw the audience toward the Wajatta vinyl, a poster of Reggie hanging above a single record player. As the vinyl is laid down and spins up its beat, the first glimpse of the volumetric tech appears: Reggie's photo takes form, reaches out, and pulls the player into a world where the rules are dispelled and the visual presence of Intel's volumetric capture technology takes center stage. Around you, dancers appear as you have never seen them in 3D, captured in such high quality that their emotions and movements can be experienced and even molded. It gives the audience permission to let loose and embrace a raw state of curiosity that, otherwise, only childhood would have afforded them.

A lot of work was done on both the plugin side (Intel) and the Unity side (LifeOrange) to ensure that we could read each 3D pixel of every frame into a GPU buffer, throw away unnecessary data, and manipulate the original volumetric footage based on each VR controller, the headset location, and each song stem's waveform. This let us do some interesting things, such as letting players walk up to a dancer and direct their movements, whether by shifting their dance timing or creating a motion boomerang effect. This low-level control of the data also let us redefine resolution on the fly, so we could create tiny dancers that fit in your hand or a skyscraper-sized Reggie in the background without any noticeable difference in quality.
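As a rough illustration of the idea only (the class, kernel, and property names below are hypothetical and not Intel's actual plugin API), a per-frame upload in Unity might look something like this: the decoded points land in a ComputeBuffer, and a compute shader warps them using the controller positions, the current stem amplitude, and a scale factor.

```csharp
using UnityEngine;

// Minimal sketch, not the real plugin interface: one decoded volumetric frame is
// pushed into a GPU buffer, then a compute shader warps the points using the VR
// controller positions and the current audio-stem amplitude before rendering.
public class VolumetricFrameDriver : MonoBehaviour
{
    public struct VolPoint { public Vector3 position; public Vector3 color; }

    public ComputeShader pointShader;      // hypothetical kernel named "WarpPoints"
    public Transform leftHand, rightHand;  // VR controller transforms
    public float scale = 1f;               // tiny hand-held dancer vs. skyscraper Reggie
    public float stemAmplitude;            // fed each frame by the audio analysis system

    ComputeBuffer pointBuffer;
    int kernel;

    void Start() => kernel = pointShader.FindKernel("WarpPoints");

    // Called once per decoded frame, after unneeded capture data has been stripped.
    public void UploadFrame(VolPoint[] points)
    {
        if (pointBuffer == null || pointBuffer.count < points.Length)
        {
            pointBuffer?.Release();
            pointBuffer = new ComputeBuffer(points.Length, sizeof(float) * 6);
        }
        pointBuffer.SetData(points);

        pointShader.SetBuffer(kernel, "_Points", pointBuffer);
        pointShader.SetInt("_PointCount", points.Length);
        pointShader.SetFloat("_Scale", scale);
        pointShader.SetFloat("_StemAmplitude", stemAmplitude);
        pointShader.SetVector("_LeftHand", leftHand.position);
        pointShader.SetVector("_RightHand", rightHand.position);
        pointShader.Dispatch(kernel, Mathf.CeilToInt(points.Length / 64f), 1, 1);
    }

    void OnDestroy() => pointBuffer?.Release();
}
```

Keeping the warp on the GPU is what makes on-the-fly rescaling and timing tricks essentially free: the source frames never change, only the parameters fed to the kernel do.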

To push this further, I wanted to give people the ability to see the performance from all angles, even the non-standard ones. Points were placed throughout the scenes that let players teleport to different vantage points, stand upside-down on the ceiling, or become tiny themselves. Throughout this exploration, we kept their sense of direction intact through a set of visual constants, spatial audio, and eased transitions.
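A minimal sketch of the kind of eased transition described above, assuming a standard Unity XR rig whose root transform can be moved and scaled (the component and field names are illustrative, not the project's actual code):

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch only: ease the camera rig to a teleport point, optionally
// changing player scale (e.g. shrinking to "tiny" size), so the jump never reads
// as an instant, disorienting cut.
public class EasedTeleport : MonoBehaviour
{
    public Transform rig;          // the XR camera rig root
    public float duration = 0.6f;  // assumed comfortable transition time

    public void TeleportTo(Transform target, float targetScale)
    {
        StopAllCoroutines();
        StartCoroutine(Move(target.position, target.rotation, targetScale));
    }

    IEnumerator Move(Vector3 pos, Quaternion rot, float scale)
    {
        Vector3 startPos = rig.position;
        Quaternion startRot = rig.rotation;
        float startScale = rig.localScale.x;

        for (float t = 0f; t < 1f; t += Time.deltaTime / duration)
        {
            float e = Mathf.SmoothStep(0f, 1f, t);               // ease in and out
            rig.position = Vector3.Lerp(startPos, pos, e);
            rig.rotation = Quaternion.Slerp(startRot, rot, e);   // handles upside-down ceiling points
            rig.localScale = Vector3.one * Mathf.Lerp(startScale, scale, e);
            yield return null;
        }
        rig.SetPositionAndRotation(pos, rot);
        rig.localScale = Vector3.one * scale;
    }
}
```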

This awe and joy led to a lot of interesting connections with the virtual dancers and brought players to heightened emotional states of their own. Whether dancing, crying, smiling, or shouting, people were able to forget the world around them and embrace the one we crafted.

Audio

Given Reggie's and John's talents as musicians, we knew the audio would be just as important as the visuals; the piece would only succeed if both were of the highest fidelity. We employed a custom-built FFT analyzer (Ryan) that fed into our custom GPU compute shader system (Sagar), and using the stems provided by Wajatta, we could target specific tracks, sounds, and voices to affect the visuals, giving us complete control over how we wanted the performance to unfold for the audience. Elements of every environment could be interacted with and responded in sync with the music, so the world around you was interactable at any moment you chose while still staying in rhythm with the song. Every element of the experience was driven by the music and player interaction; once players took hold of what their curiosity was capable of, they realized they were the ones ultimately directing the music video.
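As a hedged sketch of the concept (our analyzer was custom-built, so this uses Unity's built-in spectrum API purely for illustration, and the shader property name is made up), a per-stem band level might be computed and handed to the shaders like this:

```csharp
using UnityEngine;

// Illustration only: each Wajatta stem plays on its own AudioSource, a frequency
// band is sampled every frame, and the smoothed level is pushed to the GPU so
// visuals stay locked to that specific track or voice.
public class StemAnalyzer : MonoBehaviour
{
    public AudioSource stem;          // one analyzer per stem (kick, bass, Reggie's vocal, ...)
    public int bandStart = 0;         // FFT bins that define this stem's band of interest
    public int bandEnd = 32;
    public string shaderProperty = "_StemAmplitude";  // hypothetical global read by the shaders
    public float smoothing = 8f;

    readonly float[] spectrum = new float[512];
    float level;

    void Update()
    {
        stem.GetSpectrumData(spectrum, 0, FFTWindow.BlackmanHarris);

        // Sum the bins for this stem's band, then smooth so visuals don't flicker.
        float sum = 0f;
        for (int i = bandStart; i < bandEnd; i++) sum += spectrum[i];
        level = Mathf.Lerp(level, sum, Time.deltaTime * smoothing);

        Shader.SetGlobalFloat(shaderProperty, level);
    }
}
```

Driving everything from per-stem levels rather than the mixed track is what lets a single voice or drum hit own a specific visual element without the rest of the song bleeding into it.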

Hardware

Runnin’VR was built to shine on an Intel processor and a mid-range GPU, and designed to be experienced on a high-resolution VR headset without a tethered cable that would otherwise restrict movement. To free players to fully express themselves, we modified a few Vive Pros to carry a small battery, high-end headphones, and an Intel WiGig-based wireless adapter, allowing a fully dance-ready, untethered experience for the audience. For later installations, we removed its reliance on SteamVR's SDK and added support for the Oculus PC platform and OpenXR for devices like the Windows Mixed Reality headsets. Additionally, an on-rails stereoscopic 360 version of the experience that could run on the Meta Quest and HTC Vive Elite was ported by Intel via Light Sail VR.
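For what it's worth, a minimal sketch of multi-runtime startup in later Unity versions, assuming a port went through Unity's XR Plug-in Management package with OpenXR or Oculus loaders configured in project settings (not confirmed in the original write-up), could look like this:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.Management;

// Sketch: initialize whichever XR loader (OpenXR, Oculus, ...) is configured
// for this build, instead of hard-wiring the app to a single vendor SDK.
public class XRStartup : MonoBehaviour
{
    IEnumerator Start()
    {
        yield return XRGeneralSettings.Instance.Manager.InitializeLoader();
        if (XRGeneralSettings.Instance.Manager.activeLoader != null)
            XRGeneralSettings.Instance.Manager.StartSubsystems();
        else
            Debug.LogWarning("No XR loader could be initialized for this build.");
    }

    void OnDisable()
    {
        var manager = XRGeneralSettings.Instance.Manager;
        if (manager != null && manager.isInitializationComplete)
        {
            manager.StopSubsystems();
            manager.DeinitializeLoader();
        }
    }
}
```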

Debut

Runnin’VR premiered at the Sundance 2019 Film Festival in Park City, Utah, in the New Frontier pavilion, playing to an incredibly curious, energetic, and captivated audience over the snow-filled week. It went on to be showcased, and win a few awards, at other festivals as well:

Winner of Best Interactive @ SXSW - Austin 2019

Winner of Best International Experience @ Sandbox Immersive Festival - Qingdao, China 2019

Raindance Film Festival - London 2019

San Francisco Dance Festival 2019

Haifa Film Festival - Israel 2019

Clients: Intel Studios, Unity, Reggie Watts
Tools: Unity (2018.x–2021.x), Houdini, SteamVR