
Building an alternate reality in augmented reality: UCLA redefines theater with Unreal Engine

What would you do with the chance to redefine one of the most famous dystopian stories of all time? For a team of students and staff at UCLA’s Center for Research in Engineering, Media and Performance (REMAP)—which hosts a unique mix of Theater and Engineering students—the answer was simple: Get to work.

With the help of Unreal Engine, they decided to produce an original theater show set in the world of Amazon Studios’ streaming series, The Man in the High Castle—all while navigating the challenges of a global pandemic. The result was a stunning augmented reality production that is now set to change the audience experience in UCLA’s theaters for years to come. 

Setting the scene

“We first became interested in AR because it aligns with our values,” begins co-director of the production, Mira Winick. “REMAP was built to encourage collaborative innovation and creativity amongst both students and faculty. With AR, the audience could get involved too. Students, faculty, and audience members could have a unique shared experience in a non-physical space. We knew this would be a great medium for art with a message, and instantly decided to build our next project around it.”

With its narrative involving overlapping timelines and prescient socio-political themes, an original story set in the world of The Man in the High Castle seemed like a perfect direction for the AR project. Having dreamt up a unique reimagining of Philip K. Dick’s 1962 novel, REMAP set about acquiring sponsors and applied for an Epic MegaGrant.

The pitch was for an AR immersive theater performance set in the world of The Man in the High Castle, in which the audience would carry an AR application called The Grasshopper Lies Heavy, allowing them to view a parallel story unfolding in another timeline. For example, audience members could watch actors depicting the oppressive reality in which the Axis powers won World War II right in front of their eyes, then use their AR devices to glimpse a world more like our own, in which protests against fascism begin to inspire hope for a better day.

As well as enabling the team to explore AR in Unreal Engine, the MegaGrant also provided the resources to develop several Unreal Engine plugin templates so that students could work collaboratively—all without needing to be in the same place. This meant the team could start building previs and AR components across both mobile and workstation devices. Once the plugin architecture was complete, production on the project, titled A Most Favored Nation, could begin.
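
The article doesn’t publish REMAP’s actual templates, but the shape of a bare-bones Unreal Engine plugin module is standard boilerplate, which is what makes it practical to stamp out new plugins from a script. A minimal sketch, with illustrative names (FShowARModule, ShowAR) standing in for REMAP’s own:

```cpp
// ShowARModule.cpp -- the minimal shape of a UE4 plugin module.
// "FShowARModule" and "ShowAR" are illustrative names, not REMAP's code;
// the module name must match the entry in the plugin's .uplugin file.
#include "Modules/ModuleManager.h"

class FShowARModule : public IModuleInterface
{
public:
	// Called when the plugin is loaded by the engine.
	virtual void StartupModule() override
	{
		// Per-plugin setup (registering delegates, settings, etc.) goes here.
	}

	// Called when the plugin is unloaded, e.g. on shutdown.
	virtual void ShutdownModule() override
	{
		// Mirror any StartupModule() registration here.
	}
};

IMPLEMENT_MODULE(FShowARModule, ShowAR)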

Unreal time

First, a small group of UCLA students were selected to flesh out the concept alongside experienced faculty members and guests, including one of the original production designers on The Man in the High Castle series. While it’s not an academic program, REMAP’s forward-thinking projects give students the opportunity to join related courses at different points in the process, honing their skills and preparing themselves for work in their respective industries. 

For A Most Favored Nation, REMAP invited students from every theatrical discipline—directing, writing, design, stage management, acting, and Ph.D. critical studies—to take on significant roles in the production. “We had cinematographers from our film program advising on the visual look and feel; design students building a prototype of the Grasshopper AR device; and one of the writers came from the screenwriting MFA program,” adds REMAP co-founder, Jeff Burke. “We also had students involved in the software side of production, including the use of Unreal Engine.”

While REMAP had previously used Unreal Engine 2.0 for a live theater production in 2006, the team found Unreal Engine 4.24 to be far more accessible than earlier versions, meaning students with little to no Unreal Engine experience could get to know its capabilities. “When A Most Favored Nation was first conceived, it was decided that Unreal Engine would provide the necessary performance, features, and stability to incorporate AR elements,” says Burke, explaining that the production was part of a cluster of activities around Unreal Engine at UCLA, which included film previsualization courses, remote virtual performances during the pandemic, and immersive learning environments.

“Blueprints proved a fantastic way for REMAP to involve people that didn’t have a coding background in using the engine,” explains Burke while discussing his favorite Unreal Engine features on the project. “Exposure of the AR passthrough camera and occlusion masks within the materials system was also a vital part of creating the desired look-and-feel.” 
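
The article doesn’t reproduce REMAP’s Blueprint bindings, but the usual pattern for handing engine-side C++ to non-programmers is a BlueprintCallable function library. A hedged sketch of that pattern, in which the class, the function, and the material’s “OcclusionStrength” parameter are all illustrative assumptions rather than engine built-ins or REMAP’s actual code:

```cpp
// ARLookFunctionLibrary.h -- illustrative example of exposing C++ to Blueprints.
#pragma once

#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "ARLookFunctionLibrary.generated.h"

UCLASS()
class UARLookFunctionLibrary : public UBlueprintFunctionLibrary
{
	GENERATED_BODY()

public:
	// BlueprintCallable makes this function appear as a node in the Blueprint
	// graph editor; its parameters become input pins non-programmers can wire up.
	UFUNCTION(BlueprintCallable, Category = "AR Look")
	static void SetOcclusionMaskStrength(UMaterialInstanceDynamic* Material, float Strength)
	{
		if (Material)
		{
			// "OcclusionStrength" is a hypothetical scalar parameter that the
			// material graph would define for the occlusion-mask look.
			Material->SetScalarParameterValue(TEXT("OcclusionStrength"), Strength);
		}
	}
};
```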

REMAP did all its video plane compositing live, meaning the same approach could be used with live or network video and pre-recorded video planes. “The team borrowed the Composure compositor after developing their own in the past,” Burke continues. “Having a high-quality real-time keyer made several elements easier and better looking, even when time was limited.”
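
Composure’s own setup is largely editor-driven, so as a hedged approximation of the “video plane” idea in plain C++: route a media texture, which can front a live, network, or pre-recorded source, into a dynamic material instance on a plane mesh. The function name and the “VideoTex” material parameter below are assumptions for illustration, not engine built-ins:

```cpp
// Illustrative "video plane" setup: bind a media texture to a per-instance
// material on a plane mesh. Not REMAP's actual compositing code.
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "MediaTexture.h"

void SetupVideoPlane(UStaticMeshComponent* VideoPlaneMesh, UMediaTexture* SourceTexture)
{
	if (!VideoPlaneMesh || !SourceTexture)
	{
		return;
	}

	// Create a per-instance material so each plane can show a different feed.
	UMaterialInstanceDynamic* MID =
		VideoPlaneMesh->CreateAndSetMaterialInstanceDynamic(0);

	if (MID)
	{
		// Live, network, and pre-recorded sources all arrive through the same
		// UMediaTexture interface, so swapping feeds needs no material changes.
		MID->SetTextureParameterValue(TEXT("VideoTex"), SourceTexture);
	}
}
```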

Unreal Engine’s plugin architecture meant that Burke and lead developer Peter Gusev could develop reusable components, too. Not only did this approach save valuable time, it also meant that the pair could ask students to work on plugins for specific tasks like fiducial relocalization, video plane functionality, and external OSC control. With each component based on the same boilerplate code, students could generate their own plugins using a Bash script.
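
Of the plugin tasks mentioned, fiducial relocalization is the most self-contained to illustrate: once a marker is detected, aligning the device’s tracking space with the show’s content world reduces to a single transform solve. A minimal sketch under assumed conventions (names are illustrative; note that Unreal composes FTransforms left to right):

```cpp
#include "Math/Transform.h"

// Given a fiducial marker's authored pose in the show's content world and
// its detected pose in the AR device's tracking space, return the transform
// that maps tracking-space coordinates into content-world coordinates.
// (Illustrative sketch, not REMAP's actual relocalization plugin.)
FTransform ComputeTrackingToWorld(const FTransform& MarkerPoseInWorld,
                                  const FTransform& MarkerPoseInTracking)
{
	// A point M in the marker's local frame satisfies, in UE's left-to-right
	// composition order:
	//   P_tracking = M * MarkerPoseInTracking
	//   P_world    = M * MarkerPoseInWorld
	// Eliminating M gives:
	//   P_world = P_tracking * MarkerPoseInTracking.Inverse() * MarkerPoseInWorld
	return MarkerPoseInTracking.Inverse() * MarkerPoseInWorld;
}
```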

A new reality

Working with AR technology also meant the team could keep working on the show, even during the pandemic. Good sublevel organization, careful asset management, and an emphasis on plugins meant that REMAP could ensure a collaborative workflow across a remote team. When the COVID-19 pandemic swept across the globe, it did little to stop the team from building sets and designing their show, creating an entire mockup without ever setting foot in the performance space. They were even able to test actors by getting them to film themselves at home, then placing them in the virtual environment.

In the midst of the pandemic, the extensive previs process required to produce great AR also meant that the team could understand how 3D and real-life elements would interact without being able to physically access the set. As soon as the set became available, the team had only to confirm that things worked as expected rather than design from scratch.

“Previs became essential both from a social distancing point of view and an accessibility point of view,” says lighting designer, Ben Conran, who was also an Unreal Engineer on the project. “We became very reliant on showing AR through previs for directors and cast members who couldn’t be in the space or as a way of circumventing a build procedure when the time was short. When shooting our green screen footage of what would be the actor’s AR performances, the real-time previs allowed us to match and adjust the lighting to fit the tone of the AR spaces and gave the director a chance to edit and refine the performances live during the shoot.”

There was also the small matter of adapting the live performance to pandemic restrictions that prevented large gatherings. After lengthy discussions, the students decided to record each scene live-to-tape through multiple audience phones, then document the piece as an interactive video stream in which later viewers choose which audience camera to watch, giving them the same choices a live audience would have had.

All the action was performed across a large theatrical stage that had been configured as a sound stage. Each storyline had a separate physical set (in some cases two), with a corresponding AR set, allowing different timelines to be shown to audiences in the same space without everyone seeing and hearing the same thing.

The live-to-tape approach meant that scenes didn’t run concurrently, but the intent was to use standard installation and theatrical techniques to provide sound isolation, moving the audience through interlude scenes in which they could make choices about where to re-enter the story.

Showtime

In the end, the show’s audience consisted of people who knew the show by virtue of being in its COVID bubble and guests who had no knowledge of the piece but came to participate in the live-to-tape process. “One consistent takeaway was that the live experience was really appealing and very difficult to convey in recording, even as much as we tried,” adds Burke.

Despite this admission, Burke also acknowledges that the mandatory distance between the real-life and AR performances added a compelling emotional layer to the show, with the feeling of two separate worlds proving to be a nice thematic fit. Of course, delivering a great show wasn’t the team’s only goal. They also wanted to find out how integrating AR into a story would impact the creative process and audience experience. They emerged with three key findings.

First, it’s clear that AR can be effectively integrated into immersive performance and provide a new and interesting approach for engaging audiences; the technology appears close to supporting commercial experiences within a few years. Second, REMAP found that the results of the work, and many conversations with those involved in the project, validated the benefit of having the technology exist in some form within the fictional world as well as within the real world. This made it central to the drama, relevant to the characters, and more than just a gimmick.

Lastly, they found that the format generated a great deal of discussion and ideas around balancing a linear storyline that progresses in a way that’s exciting and interesting to an audience with opportunities for audience exploration.

“Building a project around Unreal Engine blurs the line between theater, gaming, film, and video, creating a multi-platform, multi-user experience,” says Burke. “It provided such a welcome mix of sophistication and accessibility that REMAP is adopting it for numerous kinds of projects in the long-term.” 

Next up, Burke has permission to adapt the Hugo Award-winning novel The City and The City as an AR and immersive theater project at UCLA. Much like A Most Favored Nation, this project will see faculty and students work together, and the plan is to build all real-life and AR elements within Unreal Engine.

“The idea that we had a single environment that offered selective visibility of different elements depending on whether you were at a pre-visualization workstation, an AR mobile device, or some other participating device meant that we could improvise new workflows and solve problems fast,” adds Burke. “We were able to work more efficiently, and our audiences were able to share a brand-new parallel world with our characters, and even make choices within that world. I can’t imagine working another way. It made for a compelling and engaging experience that I was told more than a few times felt wholly new.”
