‘Life of Us’: An Embodied & Social Story of Human Evolution

Within premiered its first real-time rendered, interactive experience at Sundance New Frontier this year with Life of Us, the story of life on the planet as told through embodying a series of characters who are evolving into humans. The experience sits somewhere between a film and a game, and ends up feeling much like a theme park ride. There's an on-rails narrative being told, but there are also opportunities to throw objects, swim or fly around, control a fire-breathing dragon, and interact with another person who has joined you on the experience. You learn which new character you're embodying by watching the other person embody that creature with you, and the modulation of your voice also changes with each new character, deepening your sense of embodiment and presence.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with Within CTO and co-founder Aaron Koblin at Sundance to talk about their design process, overcoming the uncanny valley of voice modulation delays, how the environment is a primary feature of VR experiences, and how their background in large-scale museum installations inspires their work in virtual reality.

Koblin also talks quite a bit about finding the balance between the storytelling of a film and the interaction of a game, and how Life of Us is Within's first serious investigation into that hybrid form that VR provides. He compares this type of VR storytelling to going to a baseball game with a friend, where the experience is amplified by the stories you share with each other. It's similar to the collaborative storytelling of group explorations in VRChat, but with an environment that is a lot more opinionated in how it tells a story.

Life of Us is a compelling way to connect and get to know someone. The structure of the story is open enough to allow each individual to explore and express themselves, but it also gives a more satisfying narrative arc than a completely open world that can have a fractured story. Life of Us has a deeper message about our relationship to each other and the environment that it’s asking us to contemplate. Overall, Koblin says that our relationships with each other essentially amount to the sum total of our shared experiences, and so Within sees an opportunity to create the types of social & narrative-driven, embodied stories that we can go through to connect and express our humanity to each other.

Here’s a trailer for Life of Us:

The Life of Us experience should be released sometime in 2017, and you can find more information on Within's website (which links to all of their platform-specific apps) or at their newly launched WebVR portal at VR.With.in.


Support Voices of VR

Music: Fatality & Summer Trip

The post ‘Life of Us’: An Embodied & Social Story of Human Evolution appeared first on Road to VR.

Bringing Conversational Gameplay & Interactive Narrative to ‘Starship Commander’

Developer Human Interact announced this past week that they are collaborating with Microsoft's Cognitive Services team to power the conversational interface behind their choose-your-own-adventure interactive narrative title Starship Commander. They're using Microsoft's Custom Recognition Intelligent Service (CRIS) as the speech recognition engine, and then Microsoft's Language Understanding Intelligent Service (LUIS) to translate spoken phrases into a number of discrete intention actions that are fed back into Unreal Engine to drive the interactive narrative.
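The general shape of such a pipeline can be sketched without the actual cloud services: a speech recognizer (CRIS's role) produces text, an intent classifier (LUIS's role) collapses free-form phrases into a small set of discrete intents, and the game advances its narrative state from those intents. The intent names, keyword lists, and functions below are purely hypothetical stand-ins, not Human Interact's implementation.

```python
# Hypothetical sketch: speech text -> discrete intent -> narrative event.
# A keyword matcher stands in for the cloud NLU service; all names are
# illustrative assumptions, not the actual Starship Commander code.

INTENT_KEYWORDS = {
    "engage_engines": ["engage", "punch it"],
    "hail_ship": ["hail", "open a channel", "contact"],
    "ask_status": ["status", "report", "how are we"],
}

def classify_intent(utterance: str) -> str:
    """Map a recognized phrase to one of a small set of discrete intents."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in text for k in keywords):
            return intent
    return "unknown"  # the narrative can fall back to a clarifying line

def advance_story(state: dict, intent: str) -> dict:
    """Feed the intent back into the narrative, like events sent to the engine."""
    state = dict(state)
    state["history"] = state.get("history", []) + [intent]
    return state

state = {}
for spoken in ["Punch it!", "Open a channel to the freighter"]:
    state = advance_story(state, classify_intent(spoken))
print(state["history"])  # ['engage_engines', 'hail_ship']
```

The key design point the interview describes survives even in this toy form: the game never branches on raw text, only on a bounded set of intents, which is what makes it possible to author responses for everything the player might say.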

LISTEN TO THE VOICES OF VR PODCAST

I caught up with Human Interact founder and creative director Alexander Mejia six months ago to talk about the early stages of creating an interactive narrative using a cloud-based, machine-learning-powered natural language processing engine. We talk about the mechanics of using conversational interfaces as a gameplay element, accounting for gender, racial, and regional dialects, the funneling structure of accumulating a series of smaller decisions into a larger fork in the story, the dynamics between multiple morally ambiguous characters, and the role of a character artist who sets the bounds of the AI's personality, core belief system, and complex set of motivations.

Here’s a Trailer for Starship Commander:

Here’s Human Interact’s Developer Story as Told by Microsoft Research:




Emotional Branching VR Stories: Combining Empathy & Interactivity for Compassion Acts

Last year, Baobab Studios' Eric Darnell was skeptical about adding interactivity to virtual reality stories because he felt there was a tradeoff between empathy and interactivity. But after watching people experience their first VR short Invasion!, he saw that people were much more engaged with the story and wanted to get more involved. He came to the realization that it is possible to combine empathy and interactivity in the form of compassion acts, and so he started to construct Baobab's next VR experience Asteroids! around the idea of allowing the user to participate in an act of compassion. I had a chance to catch up with Darnell at Sundance, where we talked about his latest thoughts on storytelling in VR and explored insights from his first explorations of what he calls "emotional branching."

LISTEN TO THE VOICES OF VR PODCAST

Darnell says that one of the key ingredients of a story is "character being revealed by the choices that they make under pressure." Rather than making you the central protagonist as a video game might, Asteroids! casts you more as a sidekick who can choose whether or not to help out the main characters. This allows an authored story to be told through main characters whose arcs are ultimately independent of your actions, but your "local agency" choices still flavor your experience: there are different "emotional branches" of the story for how the main protagonists react to you based upon your decisions.
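The structure described above can be sketched as a fixed sequence of story beats where only the delivery varies: the player's accumulated compassion choices select which emotional variant of each beat plays. The beat names, variant lines, and scoring below are illustrative assumptions, not Baobab's actual implementation.

```python
# Minimal sketch of "emotional branching": the authored beats are fixed,
# but each beat has variants keyed by the characters' attitude toward the
# player. All names and thresholds here are hypothetical.

STORY_BEATS = [
    {"beat": "asteroid_storm", "variants": {
        "warm": "The robots huddle near you for safety.",
        "neutral": "The robots brace for impact on their own.",
    }},
    {"beat": "finale", "variants": {
        "warm": "The protagonists wave goodbye to you.",
        "neutral": "The protagonists depart without a glance.",
    }},
]

def attitude(compassion_score: int) -> str:
    """Collapse accumulated local-agency choices into an emotional branch."""
    return "warm" if compassion_score > 0 else "neutral"

def play_through(choices):
    """choices: True for each compassionate act the player performed."""
    score = sum(1 if c else -1 for c in choices)
    return [beat["variants"][attitude(score)] for beat in STORY_BEATS]

print(play_through([True, True, False])[0])
# The robots huddle near you for safety.
```

Note that the beat list itself never changes, which is how the authored dramatic arc stays intact while the player's choices still color every scene.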

Unpacking the nuances of these emotional branches showed me that Asteroids! was doing some of the most interesting explorations of interactive narrative at Sundance this year, and I would've completely missed them had I not had this conversation with him. We explore some of the more subtle nuances of the story, so I'd recommend holding off on this interview if you don't want too many spoilers (it should be released sometime in the first half of 2017). But Darnell is a master storyteller, and he has a lot of fascinating thoughts about how stories might work in VR that are worth sharing with the storytellers in the wider VR community.

They're also doing some interesting experiments with adding body-language mirroring behaviors, based upon social science research, into the other sidekick characters in order to create subtle cues for connecting to the characters and story. There is another dog-like robot in the experience that is in the same sidekick class as you, which you can play fetch with and interact with in subtle ways.

SEE ALSO
Oculus Story Studio Co-founder Roundtable + Top 50 VR Storytelling Interviews

Storytelling is a time-based art form that has a physical impact on our bodies, releasing chemicals including cortisol at moments of dramatic tension, oxytocin during character interactions, and dopamine at the resolution of that dramatic tension. Given these chemical reactions, Darnell believes that the classic three-act structure of a story is something that is encoded within our DNA. Storytelling has helped humans evolve, and it's part of what makes us human. He cites Kenneth Burke's saying that "stories are equipment for living." Stories help us learn about the world by watching other people make choices under pressure.

There's still a long way to go before we achieve the Holy Grail of completely plausible interactive stories that provide full global agency while preserving the integrity of a good dramatic arc. It's likely that artificial intelligence will eventually have a much larger role in accomplishing this, but Asteroids! is making some small but important steps with Darnell's sidekick insights and "emotional branching" concept. It was one of the more significant interactive narrative experiments at Sundance this year, and it showed that it's possible to combine empathy and interactivity to make a compassionate story.




Mindshow VR’s Collaborative Storytelling Platform & Celebrating VoVR’s 500th Episode

There were a number of immersive storytelling innovations at Sundance 2017 in experiences including Dear Angelica, Zero Day VR, Miyubi, and Life of Us, but Mindshow VR's collaborative storytelling platform was the most significant long-term contribution to the future of storytelling in VR. I first saw Mindshow at its public launch at VRLA, and it's still a really compelling experience to record myself playing multiple characters within a virtual space. It starts to leverage some of virtual reality's unique affordances when it comes to adding a more spatial and embodied dimension to collaboratively telling stories.

I had a chance to catch up with Visionary VR's CEO Gil Baron and Chief Creative Officer Jonnie Ross, where we talk about how Mindshow is unlocking collaborative creative expression that allows you to explore a shared imagination space within their platform. We talk about character embodiment, the magic of watching recordings of yourself within VR, and how they're working towards enabling more multiplayer and real-time improv interactions, and they announced at Sundance that they're launching Mindshow as a closed alpha.

LISTEN TO THE VOICES OF VR PODCAST

This is also episode #500 of the Voices of VR podcast, and Jonnie and Gil turn the tables on me by asking what I think the ultimate potential of VR is. My full answer to this question, which I've asked over 500 people, will be covered in my forthcoming book The Ultimate Potential of VR. But briefly, I think that VR has the power to connect us more to ourselves, to other people, and to the larger cosmos. Mindshow VR is starting to live into that potential today by providing a way to express your inner life through the embodiment of virtual characters that you can then witness, reflect upon, and share with others, and Google Earth VR shows the power of using VR to connect more to the earth as well as the wider cosmos.

If you’d like to help celebrate The Voices of VR podcast’s 500th episode, then I’d invite you to leave a review on iTunes to help spread the word, and become a donor to my Voices of VR Patreon to help support this type of independent journalism. Thanks for listening!




Creators Behind ‘Monument Valley’, ‘Wallace and Gromit’, and More Share VR Insights Ahead of Develop:VR Conference in London

The first ever Develop:VR conference debuts in London tomorrow, December 1st. We preview the event by speaking with five speakers presenting at the show.

Produced by the same company responsible for the popular Develop:Brighton conference, Develop:VR is drawing top VR creatives from across the UK to share insights learned from the rapidly evolving virtual reality landscape. Tickets to Develop:VR are still available, including free access to the Expo and Networking events (but you must register!).

Road to VR London Correspondent Jon Tustain previews the conference with a series of five interviews with speakers presenting at tomorrow's conference. Presented in audio format, the 30-minute episode can be listened to below:

Speakers Interviewed (in order)

Dr. Charles Nduka – Co-founder, Emteq
Develop:VR Presentation: VR Analytics: Is Gaze Tracking Enough?

Charles Nduka is co-founder and Chief Scientific Officer of Emteq Ltd, a company focused on measuring facial expressions and emotions. His background is as a surgeon and internationally recognised facial muscle expert. This work led him to develop a patent-pending technology for non-invasive facial expression monitoring using wearable technology.

Charles published the first review of virtual reality for surgical training in 1994 and has an extensive background in research and the evaluation of new technologies. He has won numerous research and development awards including from the Wellcome Trust, the National Institute for Health Research (NIHR), and Innovate UK. Emteq Ltd was founded in 2015 to improve lives through the development of facial sensing healthcare solutions and is applying its technology to VR.

Solomon Rogers – Founder, REWIND
Develop:VR Presentation: VR Production: From Story-telling to Story-living 

Sol founded REWIND, a VR and creative digital agency, in 2011 after growing demands for his professional work pulled him away from 15 years as a University Senior Lecturer in Digital Animation, Visual Effects & Emerging Technology.

Since then, Sol has grown REWIND into an industry-award-winning tribe of vibrant creative technologists and digital artists, focused on harnessing immersive technologies to deliver groundbreaking VR, AR, animation, DOOH, VFX, and 360-degree video projects for some of the world's largest brands, including Sony, BBC, Red Bull, Microsoft, and Lexus. The team has also been working closely as an approved content provider for Oculus (Rift), Valve (Vive), Samsung (Gear VR), and Fove, plus building release VR demos for Autodesk, AMD, and The Foundry.

Peter Pashley – Head of Development, Ustwo Games
Develop:VR Presentation: VR for Everyone: Lessons from Monument Valley in the Success of Land’s End

Peter is Head of Development at Ustwo Games. With a background in AAA console development, he has spent the last five years at Ustwo Games learning to make successful mobile games. He was tech lead on the BAFTA-award-winning Monument Valley (2014) and co-directed Gear VR hit Land’s End (2015).

Daniel Efergan – Group Creative Director of Digital, Aardman Animations
Develop:VR Presentation: Trying To Connect

Daniel Efergan is the Group Creative Director of Digital at Aardman Animations. What this actually means is he gets to spend lots of time doing fun things like making games, forming playful communities, and messing around in the murky bits between storytelling and interactivity.

With story at the heart of everything Aardman makes, they find themselves overly excited by new and interesting forms of storytelling, and VR & 360 are no exception. To date, Aardman have created the 360 story Special Delivery (2015) for Google Spotlight, and We Wait, an animated VR doc for the BBC.

Pete Short – CTO, Breaking Fourth
Develop:VR Presentation: The Future of Stories in Virtual Reality

Pete is the CTO at Breaking Fourth, a company dedicated to telling incredible stories in virtual reality. While studying for a Master of Digital Media at the University of New South Wales, he found a passion for creating emotional stories in mixed media. More recently he was the Director of Visualisations at Omnicom Media Group.

Breaking Fourth’s latest drama Ctrl has been incredibly well received by industry experts as the way forward for long form narrative VR. By focusing primarily on mobile VR, Breaking Fourth wants to make VR films accessible and easily distributable to a mainstream audience.


Road to VR is a proud media sponsor of Develop:VR 2016


Oculus Story Studio Co-founder Roundtable + Top 50 VR Storytelling Interviews

I had a chance to talk about storytelling in VR with three of the co-founders of Oculus Story Studio during Oculus Connect 3. Saschka Unseld, Maxwell Planck, and Edward Saatchi were showing off a preview of their third VR experience Dear Angelica as well as their immersive storytelling tool Quill, which enabled them to create a VR narrative experience entirely within VR.

LISTEN TO THE VOICES OF VR PODCAST

They all emphasized to me that it's still very early days in figuring out the unique affordances of virtual reality as a storytelling medium, and that Oculus Story Studio is still doing quite a bit of experimentation. They agreed that it's likely going to take a long time to figure out what narrative in VR looks like, and that it could be another generation before VR finds its true form.

While I agree that VR storytelling is still very much within a Wild West phase of development, I do believe that there have been a lot of solid lessons learned about VR as a storytelling medium that I've covered on the Voices of VR podcast. At the bottom of this post is a Top 50 list of Voices of VR interviews about storytelling in VR, broken up into the following seven categories: the language of VR storytelling, interactive storytelling, multiple perspectives and empathy in storytelling, social storytelling, world building & environmental storytelling, plausibility & presence in narrative, and audio.

One of the key discoveries that Oculus Story Studio made with Dear Angelica is that changing scale is an effective way to evoke different emotional reactions. They also discovered that stopping and scrubbing through time was a very compelling experience that allowed audience members to have more control over their pacing through an experience. And they developed a unique "Quillustration" aesthetic that is like a lucid dream trying to mimic how memory works. Perhaps having tools to create VR stories within VR will provide new narrative devices for how stories will be told in VR.
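The scrubbing mechanic can be sketched as an experience keeping its own timeline clock: normally the clock advances each frame, but once the viewer grabs the timeline, scrub input moves the clock directly and normal playback pauses. This is a purely illustrative sketch of the general technique, not Story Studio's implementation.

```python
# Hypothetical sketch of viewer-controlled time scrubbing: the experience
# owns a timeline clock; scrubbing pauses automatic playback and moves
# the clock directly, clamped to the experience's duration.

class Timeline:
    def __init__(self, duration: float):
        self.duration = duration  # total length in seconds
        self.t = 0.0              # current playback position
        self.paused = False

    def tick(self, dt: float):
        """Advance normally each frame unless the viewer has taken control."""
        if not self.paused:
            self.t = min(self.t + dt, self.duration)

    def scrub(self, delta: float):
        """Viewer drags time forward or backward, clamped to the timeline."""
        self.paused = True
        self.t = max(0.0, min(self.t + delta, self.duration))

tl = Timeline(duration=60.0)
tl.tick(5.0)    # story plays on its own: t = 5.0
tl.scrub(-3.0)  # viewer rewinds: t = 2.0, playback pauses
tl.tick(5.0)    # paused, so t stays at 2.0
print(tl.t)     # 2.0
```

The point of keeping a separate clock rather than driving animation off wall time is exactly what makes the pacing viewer-controllable: everything in the scene samples its state from `t`.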

Saschka defined the essential components of a story in VR as simply having a beginning, middle, and end, and this broadens the scope of what could be classified as a narrative within a VR experience. Edward says that it often feels like they have the “dead hand of cinema” hovering over whatever VR storytellers do within a VR experience. The target VR demographic right now is so familiar with the film and video game mediums that they are bringing a whole set of expectations that impacts how they consume and receive VR narrative experiences.

Saschka was also really cautious and skeptical about creating stories that have branching narratives with multiple endings. He interprets multiple storylines as a sign that the author may not know what he/she wants to say, and this blocks his process of cultivating a personal connection with the content creator.

We also had a wide-ranging discussion about narrative versus interactivity, and the balance between creating authored stories and the amount of control a user has within their sandbox of interactivity. Oculus Story Studio is made up of a lot of filmmaking gamers, and so they cited a number of 2D narrative games as inspiration, including The Stanley Parable, Papers, Please, Tacoma, Virginia, Gone Home, LMNO, and Façade. In the end, they imagine that VR experiences will be like the Holodeck in that it's social, it's a game, but it's a movie. We're still quite a ways away from a widespread consensus on where VR storytelling is going, and Oculus Story Studio will continue to seek that sweet spot between authored narrative and the sandbox of interactivity.

Top 50 Voices of VR Interviews on Virtual Reality Storytelling

THE LANGUAGE OF VR STORYTELLING

  • The Four Different Types of Stories in VR (292)
  • The Language of Cinematic VR with Google’s Jessica Brillhart (291)
  • Storytelling in VR: Ambiguity and Implication in 1st Person Narratives (339)
  • Pushing the Language of Cinematic VR Forward with ‘Sonar’ (296)
  • “Pearl” is an Emotionally Powerful Story about Selfless Service (415)
  • Ted Schilowitz on Bringing VR & Interactive Storytelling to Hollywood (439)
  • What Broadway Theater Can Teach VR Video Production (380)
  • Oculus Story Studio’s Quill: An Immersive Storytelling Tool (467)
  • Storytelling in Virtual & Mixed Reality with SPACES (374)
  • John Gaeta on ILMxLAB & Immersive Storytelling (294)

INTERACTIVE STORYTELLING

  • AI and the Future of Interactive Drama (293)
  • Storytelling in VR & the Tradeoffs of Empathy and Interactivity (290)
  • Using Code as a Canvas for Living Stories (411)
  • Sequenced & the Challenge of Interactive VR Narratives (396)
  • Interactive Storytelling Triggered by Gaze, Kevin Cornish (349)
  • “Luna”: A Deep Game, Narrative Puzzler about Recovering From Grief & Trauma (438)
  • Cracking the Narrative Code of VR with the Interactive Documentary Genre (407)

MULTIPLE PERSPECTIVES & EMPATHY in STORYTELLING

  • Rose Troche on the Vulnerability of a 1st-Person Perspective (286)
  • Situational Knowledges in VR Narrative: The Role of Place & Perspective (408)
  • Nonny de la Peña on Immersive Journalism, Empathy, & VR storytelling (6)
  • Building Empathy with a 360-degree Video about a Sexual Assault from Two Perspectives (242)
  • Nonny de la Pena on Empathy in VR (298)
  • Empathizing with a War-Torn Family in ‘Giant’ (342)

SOCIAL STORYTELLING

  • Group Explorations of User-Generated Worlds with VRChat (318)
  • What Dungeons & Dragons Can Teach Storytelling in VR (441)
  • Telling Stories with Improv Acting in ‘Mindshow’ (420)
  • Wizard of Oz Narratives: Puppeting Virtual Characters with Improv Acting (409)

WORLDBUILDING & ENVIRONMENTAL STORYTELLING

  • Alex McDowell on World Building in Storytelling (309)
  • Building Storyworlds with Lawnmower Man’s Brett Leonard (406)
  • Explore the Psychological Impacts of Solitary Confinement in ‘6×9’ (287)
  • Embedding a Story within a Place with ‘Obduction’ (432)
  • Denny Unger on the Future of Non-Linear Storytelling (462)
  • The Principle of Embodied Cognition as connected to the Environment (Episodes: 412, 469, 375, & 73)
  • Designing Google Earth VR: The Overview Effect & Finding Common Ground (475)
  • Walk Through a Vincent van Gogh Painting with ‘The Night Cafe’ (259)
  • Walking On a Virtual Tightrope Across the World Trade Centers (345)
  • Using Magic to Create Astonishment with The VOID (299)
  • Beyond Room-Scale: Exploring Infinite Worlds with THE VOID (284)

PLAUSIBILITY AND PRESENCE IN NARRATIVE

  • Rob Morgan on Narrative Design in VR & escaping the uncanny valley by implementing interactive social behaviors in NPCs (125)
  • ‘Rick & Morty Simulator’: Making Narratives More Plausible through Interruption (433)
  • Betty Mohler on Social Interactions in VR, Uncanny Valley Expectations, & Locomotion in VR (129)
  • Richard Skarbez on Immersion & Coherence being the two key components of Presence (130)
  • Mel Slater on VR Presence, Virtual Body Ownership, & the Time Travel Illusion (183)
  • Technolust’s Cloudstep VR Locomotion & Adding Social Behavior Scripts to NPCs (237)
  • Ross Mead on designing social behaviors & body language for virtual human avatars (56)
  • Job Simulator and the Magic of Hand Presence (315)
  • VR Time Perception Insights from Filmmaking & Cognitive Science (379) + Time Dilation (363)

AUDIO

  • Audio Objects for Narrative 360 VR with Dolby Atmos (398)
  • OSSIC & 3D Audio as the Next Frontier of Immersion (399)
  • Rod Haxton on VisiSonics’ RealSpace 3D audio licensed to Oculus & their Audio Panoramic Camera (124)



Watch: Square Enix’s ‘Project Hikari’ Melds Manga and VR

By now, there have been many attempts at storytelling in VR that have been inspired by various traditional forms of entertainment, such as film or animation, but with Project Hikari the Square Enix Advanced Technology Division is pioneering the concept of what it would be like to adapt existing manga into immersive VR experiences. I was able to try an early demo of it at Oculus Connect 3 using Oculus Touch.

While I could point out some technical flaws and glitches, overall the experience was something innovative and promising that I haven't quite seen in VR yet. The experience featured Japanese voice acting like in anime, but most importantly, the use of real hand-drawn manga panels that would float around you and change in shape and size, as well as panels that would literally suck you into them, transporting you fully into the manga with a fully modeled 3D world, not unlike the moment in Minecraft VR where you jump through the virtual living room TV into the Minecraft world.

Sometimes panels are windows into a fully 3D environment with depth

The panels, however, were not all 3D, as the experience used a combination of 3D and 2D art to achieve a hybrid of old and new techniques. When a panel was 3D, it felt like a window or portal that you were peering through, and it was all black, white, and cel-shaded to stay true to the traditional manga style.

Real artwork from the manga was presented in Project Hikari, floating in panels in front of the viewer

But the contrast between 2D and 3D was, perhaps, where one could see the potential pain points of such a translation from an old medium to a new one. The manga that the experience is based on, Tales of Wedding Rings, has a hand-drawn art style, one that would be hard to capture with conventionally modeled 3D bodies and faces. Looking at it in the headset, the characters in the 3D art indeed didn't quite feel the same as their drawn counterparts to me, even though they had a unique shading effect. There's a little dissonance when you see Satou, the main character, saying something in his animated virtual form, and then see him suddenly in his still 2D form, in a flat panel, physically right beside the panel or virtual space where he stays in his 3D form.

Parts of the experience bring the viewer inside the manga, removing the frame of panels completely

Other small bugs and distractions kept me from getting fully immersed in the story as well, like how the subtitles would sometimes intersect with things in the virtual space (granted, this is a prototype experience), though you could actually use one of the Touch controllers to position some of the panels where you wanted them in space. Some of the movement of the panels and virtual spaces was also enough to make me feel off balance at certain times (likely because much of the surrounding environment was pure white, leaving few static visual references), and while I don't usually get motion sick, I could see how someone might without further adjustments to the experience.

The black-and-white style of everything was not something I was used to, and it felt a bit odd, though not necessarily uncomfortable, to be in such a world. Unlike manga, which relies somewhat on your imagination to immerse you in the story, with VR you're directly living in it, so it would be like making yourself colorblind, rather than reading a comic where everything is more symbolically represented. Stylistically, I can still appreciate their dedication to crafting the experience of putting you into the physical space of a manga.

SEE ALSO
3 Must-see Anime about Virtual Reality That You Can Watch for Free

Despite those quirks, as a concept demonstration, Project Hikari definitely shows promise and offers a look at how future storytelling experiences that don't rely on just one form of media representation, be it fully immersive VR or 2D, might behave. It also does unique things like full black-and-white scenes and panels that morph and move around, allowing for a flexible palette of ways to convey the story. The future of manga, anime, and other Japanese content in VR is yet to be fully defined, but experiments like this one by the R&D team at Square Enix are a promising glimpse.


What Dungeons & Dragons Can Teach Us About Storytelling in VR

Dungeons & Dragons is a form of collaborative storytelling that isn't constrained by time or budget. Because it all happens within the theater of the mind, if you can imagine it, it can be constructed instantaneously within everyone's imagination. The end result is that each participant is able to express the full extent of their free will to the dungeon master, who either directly controls their fate or delegates it to a roll of the dice. It's the ultimate expression of imagination, improvisation, and storytelling, one that provides a high benchmark and design inspiration for what virtual reality and artificial intelligence can only hope to someday fully replicate within the metaverse.

Chris Perkins is a Dungeons & Dragons story designer as well as the Dungeon Master for the Acquisitions Incorporated podcast. I had a chance to talk with Chris the day after the three-hour season finale show for Acq Inc, which took place in front of a live audience of 2,500 people in the PAX West Main Theater.

LISTEN TO THE VOICES OF VR PODCAST

Chris and I talk about what DnD can teach VR storytelling, designing a DnD story within a traditional three-act structure, the expression of free will in DnD, and how to balance the participation of all of the players while enabling each of them to do something really cool. Chris sees so much of the dynamics of DnD storytelling as a social experience, and as such most of the biggest open questions for DnD are shaped more by human interactions than by technological limitations.

Some of the hardest open problems in artificial intelligence have to do with understanding stories, disambiguating pronouns, and comprehending inside jokes, cultural references, and different tones of voice. The dungeon master has to track all of these things, observing the mood and body language of all of the participants to keep them engaged while at the same time pacing each character through a series of perils. These are all sufficiently complicated that having an AI dungeon master successfully guide DnD players through a campaign could be a next-generation Turing test.

Chris says he hasn't been impressed with any of the VR experiences he's seen so far because they felt like walking through someone else's mind. With all of the DnD experiences he's had, he'd much rather walk through a VR experience of his own mind. There are creative tools like Tilt Brush and Oculus Medium, but painting or sculpting in 3D is still nowhere near as fast as the mind's instantaneous ability to construct a scene and story on the fly. Perhaps it will someday be possible if neuroscientists are able to completely decode the brain and unlock the ability to use neural activity to automatically translate our thoughts into virtual objects and full scenes within virtual reality.

Chris is fairly confident that DnD doesn’t have too much to fear from technological competitors. It’s entirely possible that technology may never be able to fully replicate the capabilities of the human mind as we visualize stories with our mind’s eye. So he’s skeptical about the capabilities of VR or AI to be able to accurately and synthetically express your own personal “theater of the mind.” But he also said that it’s inevitable that we’re going to try our hardest to do so because humans and storytelling are inseparable. As history has shown, we’re going to always be looking for new ways to reach people through the latest storytelling techniques.

Here’s the YouTube video of the PAX West 2016 Acquisitions Incorporated campaign discussed in this podcast.


Support Voices of VR

Music: Fatality & Summer Trip

The post What Dungeons & Dragons Can Teach Us About Storytelling in VR appeared first on Road to VR.

20th Century Fox’s Ted Schilowitz on Bringing VR & Interactive Storytelling to Hollywood

When Ted Schilowitz was looking for what to do after traveling the world as the first RED Camera employee, he happened upon an opportunity to serve as a futurist for 20th Century Fox, looking at how to use emerging technologies for storytelling. Over the past three years, he’s had early access to hardware from all of the major virtual and augmented reality companies, including Oculus, Valve, Sony, Google, Magic Leap, ODG, and Microsoft.

LISTEN TO THE VOICES OF VR PODCAST

He’s been exploring what’s possible with VR and AR, and he says that “the abilities of a new medium start to define the demands of a new medium.” He’s worked on a number of different VR experiments to discover how to best blend together narrative and interactivity within the context of these new “spatial mediums.”

See Also: Hands On – The Martian VR Experience is a Triumph in Motion

One of the first and most ambitious experiments was a half-hour long Martian VR experience that was one of the hottest tickets at Sundance. It integrated the D-BOX 4D effects chair and Oculus Touch controllers, and put you in the first-person perspective of many key scenes from The Martian movie.

I had a chance to catch up with Ted at VRLA where he told me the story of introducing VR and AR technologies to Hollywood studio executives and storytellers. He shares some of his favorite interactive narrative experiences ranging from Pearl to Valve’s Aperture Robot Repair to The Gallery, as well as polished interactive experiences like NVIDIA’s VR Funhouse and Valve’s The Lab.

We also talk about the balance between global and local agency in interactive narratives, what can be learned from storytelling in theme park rides, the emerging language of storytelling in VR, and what it takes to become a viable practitioner of these future technologies.



The post 20th Century Fox’s Ted Schilowitz on Bringing VR & Interactive Storytelling to Hollywood appeared first on Road to VR.

Oculus’ ‘Henry’ Becomes the First VR Film to Win an Emmy

Oculus Story Studio’s Henry, the tale of a lovable, hard-to-hug hedgehog and his search for friendship, has walked away with the first ever Emmy awarded to a virtual reality film.

I wrote recently that the traditional motion picture entertainment industry seemed to be gravitating towards immersive media, keen to explore creative and financial possibilities, and now one of the earliest VR films has itself become recognised by that industry, awarded as “Outstanding Original Interactive Program.”

If you own an Oculus Rift consumer headset, it’s unlikely you’ll have missed Henry, the first film made specifically for virtual reality to come out of Oculus Story Studio – itself set up to explore the creative possibilities VR might afford. It’s the story of the titular hedgehog with a desperate desire for friendship, but whose less-than-cuddly exterior foils his attempts to find it.


Henry’s a delightful experience, channelling as it does the charm present in many of Pixar’s trailblazing CG animated features, but the film also represents a milestone in the world of VR entertainment. Henry represents an early attempt at both extending and, in some cases, completely re-inventing the language of linear visual storytelling inherited from TV and movies, in order to cope with, and take advantage of, the ‘look anywhere’ challenges virtual reality presents. This is something that the Oculus Story Studio team explored in depth in their presentation at last year’s Oculus Connect conference. You can watch it below, and it’s highly recommended to anyone with even a passing interest in the subject.

So, after all of that pioneering work, predictably, the Oculus Story Studio team are over the moon. “When we set out to make Henry, it was a step into the unknown world of making an emotional VR movie,” says Ramiro Lopez Dau, director of Henry, “While we didn’t know what the outcome was going to be, we were excited about the possibilities. We never anticipated that one of our first projects would be given such a distinction and this recognition is not only a testament to our team’s creative and technical achievements, but also a validation for the VR storytelling community as a whole. While Henry is just one step in the long journey ahead, we hope this moment inspires storytellers to bring their ideas to this new medium and help shape the future of VR storytelling.”

Oculus Story Studio is continuing on its experimental journey into VR filmmaking. It has already released LOST, and will soon release Dear Angelica, a VR film previewed at this year’s Sundance Film Festival and with an altogether different feel to Henry.

Here’s hoping Henry‘s Emmy win inspires more of those Hollywood executives and creatives to seek out and invest in virtual reality as a narrative platform. In the meantime, Henry is available on the Oculus Store for free.

The post Oculus’ ‘Henry’ Becomes the First VR Film to Win an Emmy appeared first on Road to VR.