The Virtual Arena: Immersive Theater Breaks New Ground

The Virtual Arena

Industry specialist Kevin Williams covers the application of immersive technology across the attractions and amusement landscape. In his latest Virtual Arena column, we visit the test project for a new kind of Immersive Theater, one employing the latest technology, including Magic Leap AR headsets, as it makes its debut in live performance.

Lost Origin
Image credit: Seamus Ryan

The diversity of location-based experiences is constantly growing, and we have already covered some of the related immersive presentations in the arts in this column. Recently the team behind a new project invited the media to be the first to be immersed in a new audience experience. Called the Lost Origin Experience, the endeavour has been self-styled as a boundary-breaking piece, combining performance and mixed reality to offer an “Immersive Theatre”. It is a fusion of technologies, including a partnership with Magic Leap to deploy their headsets as part of the performance, allowing the audience to interact with both the physical and digital worlds.

The experience was developed by studio Factory 42 and presented in partnership with the Almeida Theatre and Sky. The work is a UK government-funded research and development project, part of Innovate UK's push to advance immersive experiences (under the Industrial Strategy Challenge Fund's Audience of the Future initiative). Along with the unique aspects of this immersive theatre is the careful attention to detail, cemented using the full gamut of mixed reality (MR) applications. This is stated to be the first-ever large-scale visitor experience deployed with Magic Leap headsets. The whole project is a limited-time test run of the concept, operational only for a one-month window in London.

Having booked a slot, the guest is recruited as a member of an organization called “Wing 7”, which is planning to carry out an investigation codenamed “Operation Origin”, and is directed to arrive at the field base at Hoxton Docks in London. The experience starts before arriving at the secret venue, as guests receive a mysterious scene-setting video from the operations director, emailed to them in advance. Upon arrival at the field base, the guest is taken into a briefing and introduced to the team and key players. Actors set the scene with a story of dark-web auctions and secret activities, and then it is time to enter the adjoining premises and start the search for clues, and more!

Operation Origin
Image credit: Seamus Ryan

Without going into too much detail and revealing the compelling storyline, we can say that the adventure takes the group through several room settings. The experience is fundamentally broken into four key acts, though it is much more nuanced than that: the first offers an immersive puzzle section; then we move into an area of wonder and mystery; then comes the chance to wear the Magic Leap AR headsets and interact with the environment; and finally the denouement, where the group gets to decide the outcome.

As stated, the Lost Origin Experience has played with all the toys in the toybox of immersive experiences. At the front, essentially, is the use of LARPing. Live-action role-playing (LARP) has grown in popularity since the early murder-mystery experiences, and more recently with Secret Cinema-style events. As we reported in our coverage of ‘The War of the Worlds’ VR experience, the use of theatrical production to drive audience immersion and steer guests through the narrative has grown in popularity when combined with immersive entertainment. Lost Origin offers a great cast, who worked hard to drive the experience for all the guests.

Lost Origin
Image credit: Seamus Ryan

Regarding the other elements, the surprisingly clever and subtle use of projection mapping saw guests solve puzzles and then be transported into a dream-like state. Alongside the projection mapping, the use of motion tracking allows the small audience to drive the story, interacting with the narrative as it is revealed. The technology was ably supported by the live performances, with the cast masterfully steering the guests.

But it was the use of AR in this first-of-its-kind immersive theatre performance that was the main area of interest. The developers elected to use the Magic Leap One Creator Edition headsets for the performance. In the third act of the experience, the group of guests are helped to put on the systems, then navigate around a unique location, given glimpses of spirits and even transported back in time. The Magic Leap systems offered a competent AR representation, though they were limited by their performance, and it was not a seamless experience. But the developers of the AR app had managed to squeeze as much as they could out of the hardware, and it worked with the narrative presented.

Magic Leap
Image credit: Seamus Ryan

For its part, Magic Leap has pivoted from consumer-facing towards wholly commercial (enterprise) development, having even announced plans for the Magic Leap 2, a new interpretation of the headset with redesigned elements, due some time in 2022. The company has had its original hardware deployed in other pop-up attractions, most notably in AT&T flagship stores in America, running an experience based on HBO's ‘Game of Thrones’ universe. Following troubling financial conditions and an exodus of senior management, a new CEO has repositioned Magic Leap and secured new investment, hopefully allowing the company to grow once again. The team behind Lost Origin worked with Magic Leap as far back as 2018, when it was one of the only systems they were able to acquire for the research project.

Regarding the use of AR in such “Immersive Theater” and “Artainment”, several developers have attempted to harness the technology to that end. The most ambitious of these, and one of the first mainstream applications, was ‘The Unreal Garden’, which launched as part of the ill-fated Onedome facility in 2018 employing Microsoft HoloLens AR hardware. The experience proved so compelling that it has now been re-launched as a standalone experience: ‘The Unreal Garden 2.0’ has opened in San Francisco, continuing to expand the use of physical elements and digital illusion, with updated hardware (using the HoloLens 2) and new content.

Returning to London and the Lost Origin Experience: in conclusion, this was a great example of the development of immersive performance, and of the strength in bringing strangers together to experience a narrative. A mixture of immersive escape room, mixed reality experience and live-action performance, the whole thing lasted over 60 minutes and did not drag, seamlessly orchestrated. The experience will only be open for a short period, from 21st November until 4th December, and will cost £30.00 (£18.00 for 14–16-year-olds), with all bookings online.

This latest example of Immersive Theater offered a glimpse of how tech can play its part in the grand illusion, and we look forward to seeing this kind of application evolve and grow.      

The Yang and the Yin of Immersive Storytelling with Oculus’ Yelena Rachitsky

The future of VR storytelling will be immersive and interactive. Yelena Rachitsky is an executive producer of experiences at Oculus, and she's been inspired by how interactive narratives have allowed her to feel like a participant who is more engaged, more present, and more alive. The fundamental challenge of interactive narratives is how to balance giving and receiving: making choices and taking action versus receiving a narrative, being emotionally engaged, and having an embodied experience of immersion and presence. Balancing the active and passive dimensions is the underlying tension of the yang and yin of any experience.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST


The boundaries between what is a game and what is an immersive story will continue to be blurred, but Rachitsky looks at the center of gravity of an experience. Are you centered in your embodied experience and emotional engagement of a story (yin)? Or are you centered in your head of thinking about the strategy of your next action in achieving a goal in a game (yang)?

She recommends that experiential designers start with the more yin aspects of an experience, including the feeling, the colors, the space, and the visceral sensory experience of a story that you're primarily telling directly to someone's body. She's also been finding a lot of inspiration and innovation for the future of storytelling in immersive theater, where actors are able to use their body language to communicate unconsciously with the audience, and to use their bodies moving through space in order to drive specific behaviors. The Oculus-produced Wolves in the Walls used immersive theater actors from the production Then She Fell to do the motion capture, and to help tell the spatial story using the body language of an embodied character.

I had a chance to catch up with Rachitsky at Sundance this year where Oculus had five different experiences including Dispatch, Masters of the Sun, Space Explorers, Spheres, & Wolves in the Walls. Rachitsky has been key in helping to discover immersive storytellers and supporting projects that push the edge of innovation when it comes to the future of interactive storytelling. She says that the biggest open question that is driving her journey into immersive storytelling is “How can you be passive and active at the same time?”

Rachitsky says that immersive storytelling isn't about the beginning, middle, or end, but rather about cultivating an experience, and about the story that you tell yourself after you take the headset off. This matches some of the depth-psychological perspectives on immersive storytelling that John Bucher shared in his book Storytelling for Virtual Reality, where VR storytelling could be used as a technological vehicle for inner reflection and contemplation.

I suspect that the focus on embodiment and the audience's direct experience is part of a larger trend towards new forms of storytelling that transcend the Yang Archetypal Journey of Joseph Campbell's Hero's Journey, and that VR and AR are more about a more receptive Yin Archetypal Journey: one that I would say is more non-linear, cyclical, embodied, sensory, centered in your own experience, environmental, nurturing, receptive, cooperative, community-driven, worldbuilding, depth-psychological, connective, transcendent, esoteric, and alchemical.

The exact patterns and underlying structures of this more yin archetypal journey are still being explored in VR stories, but there's likely a lot of inspiration to be found in the kishōtenketsu literary structures of classic Chinese, Korean, and Japanese narratives, which focus more on conflict-free stories of cooperation, collaboration, and revealing the holistic interconnections by which the totality is greater than the sum of the individual parts.

I’ve recorded nearly 100 interviews on the future of immersive storytelling now (here’s a list of the Top 50 from 2016), and a consistent theme has been this underlying tension of giving and receiving where there is a striving for a balance of the active and passive experience. I find that the concepts of the yang and the yin from Chinese philosophy and the four elements from natural philosophy provide compelling metaphors to talk about this underlying tension.

Using metaphors from natural philosophy, the fire element (active presence) and air element (mental & social presence) are yang expressions of exerting energy outward, while the water element (emotional presence) and earth element (embodied & environmental presence) are more yin expressions of receiving energy internally. My keynote from the Immersive Technology Conference elaborates on how these play out in more yang communications mediums like video games and more yin communications mediums like film and VR.

Video games focus on outward yang expressions of making choices and taking action while film focuses on inward yin expressions of receiving an emotionally engaging story. VR introduces the body and direct embodied sensory experience, but it’s possible that this focus on embodiment and presence helps to create new expressions of yin archetypal stories that have otherwise been impossible to tell.

Most of my recent conversations about VR storytelling from Sundance 2018 & the Immersive Design Summit have been focused on this emerging yin archetypal journey of how embodiment & presence are revealing these new structures of immersive storytelling:

The concept of a “Living Story” from the Future of Storytelling's Charlie Melcher is very similar to what The VOID's Camille Cellucci calls “Story-Living,” which is about “creating spaces and worlds where people have a chance to live out their own stories within a framework that we design.” The recently released Ready Player One movie did not include some of the ‘story-living’ live-action role-playing scenes that were included within the novel, but Ernest Cline was definitely attuned to the trend towards immersive narratives when his novel came out in 2011, the same year that the Punchdrunk immersive theater production Sleep No More opened in New York City.

Whether it’s a living story or story-living, both involve becoming an active participant and character within the story that’s unfolding. AI is going to play a huge role in helping to resolve some of this tension between authorial control of the story and creating generative possibility spaces, and it’s something that I’m starting to explore in the Voices of AI podcast with interviews with AI storytelling pioneer Michael Mateas, AI social simulator designer & improv actor Ben Samuel, and AI researcher/indie game developer Kristin Siu. Oculus’ Rachitsky is looking forward to integrating more and more AI technologies within future VR storytelling experiences, and she’s even experimenting with using live actors randomly appearing within some future VR experiences that she’s working on.

I expect the underlying tension between giving and receiving, active and passive, and the yang and the yin to continue to be explored through a variety of different immersive storytelling experiences. While Ready Player One explores a typical Yang Archetypal Journey in the style of Campbell's monomyth, these types of active gaming and mental puzzle-solving experiences may look great on a film screen, but they're not always compelling VR experiences that amplify the unique affordances of immersion and presence.

I predict that immersive storytellers will continue to define and explore new storytelling structures, initially focusing on this more Yin Archetypal Journey of immersion and presence. There will continue to be a fusion of traditional storytelling techniques from cinema, but it's possible that VR stories need to completely detach from the paradigms of storytelling that tend to focus on conflict, drama, and outward journeys.

It's possible that the kishōtenketsu story structures from Eastern cultures might work well in VR, as they focus on more cooperative and conflict-free stories centered on the Gestalt of interconnectivity. If there does turn out to be a fundamental Yin Archetypal Journey structure that's different from Campbell's monomyth, then it's likely that these stories have so far been ignored and overlooked, and it's possible that the mediums of VR and AR have been needed in order to give people an embodied, direct experience of these types of stories.

Eventually we'll be able to find a perfect balance of the yang and the yin in immersive stories, but before we get there we'll likely need to focus on this Yin Archetypal Journey of immersion and presence. Once we open our minds about the optimal structures for embodied stories that center us in our own experiences, I expect a more seamless integration of live-action role play, gaming elements, social interactions, and collaborative stories.


Support Voices of VR

Music: Fatality & Summer Trip

The post The Yang and the Yin of Immersive Storytelling with Oculus’ Yelena Rachitsky appeared first on Road to VR.

Blending Immersive Theater & VR with ‘Draw Me Close’ by NFB and National Theatre

The National Theatre has created an Immersive Storytelling Studio to better understand the practices, protocols, and opportunities of how virtual and augmented reality technologies are creating new storytelling possibilities. They collaborated with the National Film Board of Canada on an immersive theater piece called Draw Me Close that premiered at the Tribeca Film Festival. It featured a one-on-one interaction with a live actor in a mixed reality environment: while wearing a virtual reality headset, you play the archetypal role of a son or daughter as your mother embraces you, draws with you, and tucks you into bed while narrating a memoir of her life. I talked with Immersive Storytelling Studio producer Johanna Nicolls about the reactions, intention, and overall development of Draw Me Close, which is their first immersive theater VR piece.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

The spatial storytelling techniques and skills that theater has been developing for hundreds of years translate really well to even more immersive 360-degree VR environments. But with live experiences like Sleep No More and Then She Fell, there's also a whole other ‘immersive theater’ movement within the theater world that is bringing new levels of embodiment, choice, and agency into authored theater performances.

No Proscenium podcast host Noah Nelson wrote up a great introductory primer of immersive theater that explores the nuanced differences between immersive theater, site-specific performances, and environmentally-staged theater. One differentiation that Nelson makes is that immersive theater has much more of an explicit experiential design that “feels more like an event you experienced than a performance that you witnessed.”

The version of Draw Me Close that I saw at Tribeca took a powerful first step in exploring how live actors sharing the same physical space within a mixed reality context provide a new dimension of emotional and embodied presence. The haptic feedback of an embodied hug from a co-present human is something that may never be fully simulated in VR, and so it illustrates a clear threshold for what can and cannot currently be done in the medium.

I also saw the Then She Fell immersive theater piece, which featured a lot of one-on-one interactions with performers, and so I think there's a profound depth of emotional presence and intimacy that you can achieve with another person without the barriers of technology. You still can't see the more subtle microexpressions of emotion or perceive the more nuanced body language cues when interacting with other humans while you're in VR, but feeling the actor touch me provided a deeper phenomenological sense that I was interacting with an actual human in real time. Directly interacting with another physically co-located person and feeling their touch closed some perceptual gaps, and took my sense of social presence beyond the levels I normally have in distributed social VR experiences.

This was also such a new type of experience that I didn't know the rules of engagement for how much I was expected to speak or interact. There weren't many prompts, invitations, or spaces made available for dialogue, and so I mostly silently received the story as each moment's actions were actively being discussed, analyzed, and contextualized by a steady stream of real-time narration. There were, however, a number of interactive actions I was invited to take, ranging from opening a window to drawing Tilt Brush-style on the floor. There was a deliberate decision to be fairly vague in casting a magic circle of rules and boundaries, since the story, the characters, and the loving embrace of a motherly hug were all designed to be a surprise. This shows the challenge of balancing explicit consent to being touched against maintaining the integrity and mystery of a story that's about to unfold.

Draw Me Close is an ambitious experiment to push the storytelling possibilities made available by a one-on-one interaction in an immersive theater piece while the audience is within virtual reality. The experience was profound enough that a number of people needed some level of decompression, and help transitioning back, after exploring some of the deeper issues that were brought up within it.

There are obvious limitations on how this type of experience could be scaled up so that showing it more widely is logistically feasible, but it's refreshing to see the NFB and the National Theatre's Immersive Storytelling Studio experiment, explore, and push the limits of what's even possible. If too much effort is focused on what's sustainable or financially viable, it could hold back deeper discoveries about the unique affordances of combining immersive theater with immersive technologies.



The post Blending Immersive Theater & VR with ‘Draw Me Close’ by NFB and National Theatre appeared first on Road to VR.

Projection-mapped Immersive Theater Shows the Future of Live AR Performances

There was an amazing projection-mapped immersive theater piece at Sundance this year by Heartcorps called Riders of the Storyboard. Trained street performers interacted with virtual projection-mapped 2D objects, and through sleight-of-hand magic broke these flat objects into the third dimension as glowing 3D props.

LISTEN TO THE VOICES OF VR PODCAST

There were 15 people packed into a small room with about half a dozen performers for a 13-minute show about 2D characters who interact with performers playing Alchemy of Light gods in the third dimension. It was an awe-inspiring performance, and the projection mapping technology provided a shared experience akin to what future augmented reality technology could provide.

Heartcorps is proving out some of the techniques with projection mapping technology that should also work really well in the future of live performance and immersive theater designed for AR glasses.

I had a chance to catch up with Heartcorps member and performer dandypunk, who talks about their process, ritual inspiration, and mixture of immersive theater and cutting-edge projection mapping. Be sure to check out the trailer and clips from their show at Sundance down below.

Here’s the Trailer for the Heartcorps Riders of the Storyboard piece that showed at Sundance New Frontier:

Final three minutes of the Riders of the Storyboard show at Sundance:



The post Projection-mapped Immersive Theater Shows the Future of Live AR Performances appeared first on Road to VR.