Grabbing Virtual Objects with the HaptX Glove (Formerly AxonVR)

Jake Rubin

The HaptX Glove shown at Sundance was one of the most convincing haptic experiences I've had in VR. While still primitive, it let me grab a virtual object and, for the first time, gave me enough haptic feedback to convince my brain that I was actually grabbing something. The glove combines exoskeletal force feedback with the company's patented microfluidic technology, and they've significantly shrunk the external box driving the experience since the demo I saw at GDC (back when they were named AxonVR), thanks to a number of technological upgrades and ditching the temperature feedback.

LISTEN TO THE VOICES OF VR PODCAST

Joe Michaels

I had a chance to talk with CEO & co-founder Jake Rubin and Chief Revenue Officer Joe Michaels at Sundance. We talked about why enterprise & military training customers are so excited about this technology, some of the haptics-inspired interactive storytelling possibilities, how they're refining the distribution of haptic resolution and fidelity to provide the optimal experience, and their collaboration with SynTouch, whose texture-data models are helping them work toward a haptic display that can simulate a wide range of textures.

SEE ALSO
Hands-on: HaptX Glove Delivers Impressively Detailed Micro-pneumatic Haptics, Force Feedback

HaptX was using a Vive Tracker puck for arm orientation, but they had to develop customized magnetic tracking to get the precision required to capture individual finger movements, and one side effect is that their technology could start to be used as an input device. HaptX's microfluidic technologies, combined with a new air valve that is 1,000x more precise, could also enable unique haptic applications such as sensory replacement or sensory substitution, or assist data visualization in much the way that sound enhances spatialization through sonification.

Photo by Road to VR

Overall, HaptX is making rapid progress with their haptics technologies, and they've crossed the threshold of being useful for a number of enterprise and military training applications. Rubin isn't convinced that VR haptics will ever fully trick the brain in a way that's totally indistinguishable from reality, but the technology is getting good enough to be used creatively in training and narrative experiences. Perhaps soon we'll see HaptX's technology in location-based entertainment created by storytellers who experienced it at Sundance this year, and I'm really looking forward to seeing how their texture haptic display evolves over the next year.


Support Voices of VR

Music: Fatality & Summer Trip

The post Grabbing Virtual Objects with the HaptX Glove (Formerly AxonVR) appeared first on Road to VR.

360 Film ‘Dinner Party’ is a Symbolic Exploration of Race in America Wrapped in an Alien Abduction Story

Laura Wexler

Dinner Party is an immersive exploration of Betty and Barney Hill's widely known 1961 alien abduction story that premiered in the Sundance New Frontier program. Rather than leaning on familiar alien tropes, writers Laura Wexler & Charlotte Stoudt use the spatial affordances of VR to present a symbolic representation of each of the Hills' experiences, highlighting how vastly different they were.

LISTEN TO THE VOICES OF VR PODCAST

Charlotte Stoudt

Betty and Barney were an interracial couple in New Hampshire, and their purported encounter with aliens was a positive peak experience for Betty, while Barney had an opposite experience that Wexler & Stoudt attribute to his life as a black man in the early 1960s. Inspired by passages of Barney's hypnosis recordings posted online, Wexler & Stoudt expanded the Hills' story into an immersive narrative at the New Frontier Story Lab, and collaborated with director Angel Manuel Soto to bring it to life as a 360 film.

Dinner Party is the pilot episode of a larger series called The Incident, which explores the aftermath of how people deal with a variety of paranormal or taboo experiences. Wexler & Stoudt are using these stories to explore themes of truth and belief such as: Who is believed in America? Who isn’t? What’s it feel like to go through an extreme experience that no one believes happened to you? And can immersive media allow you to empathize with someone’s extreme subjective experience without being held back by an objective reality that you believe is impossible?

Dinner Party is a great use of immersive storytelling, and it was one of my favorite 360 experiences at Sundance this year. It has a lot of depth and subtext beyond what's explicitly said, and I thought they really used the affordances of immersive storytelling to explore a phenomenological experience in a symbolic way. It's a fascinating exploration of radical empathy using the kind of paranormal narrative themes you might see in The X-Files or The Twilight Zone, and I look forward to seeing what other themes are explored in future episodes.

Here's a teaser for Dinner Party:


Support Voices of VR

Music: Fatality & Summer Trip

The post 360 Film ‘Dinner Party’ is a Symbolic Exploration of Race in America Wrapped in an Alien Abduction Story appeared first on Road to VR.

Inside Sundance Hit ‘SPHERES: Songs of Spacetime’ with Director Eliza McNitt

Sundance New Frontier had a solid line-up of VR experiences this year, with a number of immersive storytelling innovations including SPHERES: Songs of Spacetime, which takes you on a journey into the center of a black hole. It's a hero's journey that provides an embodied experience of the evolution of a star from birth to death, with a poetic story written and directed by Eliza McNitt, narrated by Jessica Chastain, and produced by Darren Aronofsky's Protozoa Pictures.

LISTEN TO THE VOICES OF VR PODCAST

SPHERES made news when it was acquired in a seven-figure deal, and it represents a unique collaboration between science and art. Scientific collaborators included the National Academy of Sciences and physicists who study black holes, so the VR producers had to come up with creative interpretations of mathematical descriptions of the edges of spacetime that push the frontiers of our scientific knowledge.

I had a chance to sit down with McNitt at Sundance to talk about the inspiration for the project, her journey into creative explorations of science, the challenges of depicting gravitational lensing in Unity, what's known and not known about black holes, how listening to gravitational waves for the first time inspired the sound design, and crafting an embodied hero's-journey story in collaboration with Protozoa Pictures. The acquisition deal with CityLights was secured on Kaleidoscope's funding platform; it includes the first chapter shown at Sundance as well as two additional chapters yet to be produced, and the series will be released later this year by Oculus.

Here’s a promo for SPHERES produced by Sundance:


Support Voices of VR

Music: Fatality & Summer Trip

The post Inside Sundance Hit ‘SPHERES: Songs of Spacetime’ with Director Eliza McNitt appeared first on Road to VR.

Oculus-funded VR Experience ‘SPHERES’ Sold at Sundance in “Seven-Figure Deal”

SPHERES, a three-chapter space experience from Darren Aronofsky’s Protozoa Pictures, was just bought by VR finance and distribution firm CityLights in what Variety describes as a “seven-figure deal.”

While the respective companies are remaining mum on the exact price of the acquisition, the Variety report maintains the deal was in “the low- to mid-seven figures.”

Songs Of Spacetime, the first chapter of Spheres, debuted at Sundance as one of Oculus' five funded experiences.

Written and directed by Eliza McNitt, Spheres is an experience that explores sound while taking you to the heart of a black hole. Speaking to Oculus in a recent ‘VR Visionaries’ profile, McNitt called Spheres a story about “the human connection to the cosmos,” and how we relate to the sound of the universe—gravitational waves.

“…as I dove into research and the science behind the project, I learned that the discovery of gravitational waves won the Nobel in physics, so that was a huge part of the development of this project. I wanted to capture the most cutting-edge scientific discovery, and, in fact, that was this idea of sound. The title is inspired by the ancient philosophical theory called the Music of the Spheres, that predicted that celestial bodies created a form of music—and we truly did prove that with the discovery of gravitational waves.”

Spheres is also packed with talent, with narration by Jessica Chastain (The Martian, Interstellar, The Zookeeper's Wife) and music by Kyle Dixon and Michael Stein of electronica band Survive (the Stranger Things theme song).

“We’re incredibly excited to work with Eliza and the entire team on Spheres,” CityLights co-founder Joel Newton told Variety. “The ambition and generative nature of the vision for Spheres perfectly fits with our mission to bring content to broader audiences and showcase the types of experiences only VR can deliver.”

Spheres is slated to arrive on Rift in the coming months, with a launch on other VR platforms to follow.

The post Oculus-funded VR Experience ‘SPHERES’ Sold at Sundance in “Seven-Figure Deal” appeared first on Road to VR.

Oculus Sends 5 VR Experiences to Sundance 2018

The Sundance Film Festival just kicked off in Park City, Utah, and Oculus announced in a blog post that they're debuting five experiences in the New Frontier section of Sundance, all of which they helped bring to life.

Sundance's New Frontier hosts a curated selection of works realized in the mediums of VR, AR, MR, and AI. You can check out a full list of every entry in this year's Sundance New Frontier showcase here.

Oculus will be doing a deep dive on each experience they’ve brought to Sundance, the first of which details the making of will.i.am and The Black Eyed Peas’ Masters of the Sun.

Dispatch

Written and directed by Edward Robles of Here Be Dragons, Dispatch follows a small-town police dispatcher (Martin Starr, Silicon Valley) as he faces an all-night crime spree. The experience takes you inside the dispatcher’s perspective in this episodic, audio-based miniseries. The first three episodes launched on Rift and Gear VR in November, and the finale will launch on the Oculus Store January 25 following its world premiere at Sundance.

Masters of the Sun

Launching for Gear VR, Masters of the Sun is presented by will.i.am and The Black Eyed Peas. It takes place during the ’80s when ancient and modern forces of evil started destroying black communities. Vocal talent including Rakim, Queen Latifah, KRS-One, Jason Isaacs, Slick Rick, and comics industry legend Stan Lee tell the story of mobilization and reclaiming their city, fighting back against the evils of drugs, crime, and discrimination.

Check out Oculus’ Q & A with will.i.am here.

Space Explorers

The latest project from Felix & Paul Studios, Space Explorers lets you reach new heights through the power of VR. Created in partnership with NASA, the experience follows their astronauts as they prepare to launch into space. Space Explorers is coming to Oculus in 2018.

SPHERES

The first chapter of SPHERES, called “Songs Of Spacetime,” is debuting at Sundance. SPHERES is a three-part series that transports viewers into the deepest pockets of the Universe, bringing to life future worlds and exploring oneness with the cosmos. SPHERES is created by Eliza McNitt and will launch on Rift in 2018.

Wolves in the Walls

From the team behind the Emmy Award-winning project Henry comes Wolves in the Walls, a gorgeous, interactive adaptation of Neil Gaiman and Dave McKean’s haunting work brought to life in VR. Wolves in the Walls is coming to Oculus in 2018.


The post Oculus Sends 5 VR Experiences to Sundance 2018 appeared first on Road to VR.

Hands-On: Wolves in the Walls’ Convincing Character Interactions Left Me In Awe


I’m a sucker for convincing characters in VR experiences. Some of my favorite moments in different VR apps came from the characters I interacted with and the sense of presence they afforded me. Things like eye contact, subtle movements and mannerisms, or even just quality voice acting all go a long way towards making a VR environment feel more real. These are all things that Wolves in the Walls by Fable Studio excels at.

Based on the Neil Gaiman story of the same name, Wolves in the Walls tells the tale of a cute, scared, and lonely little girl named Lucy who lives in an attic. She is certain that wolves live inside the walls, but no one will believe her. That is, until you see them too and begin to understand the bizarre, strange world she lives in.

Check out the debut trailer for Wolves in the Walls below:

At a pre-Sundance screening this week, we got to see a small slice of the experience's first chapter, and I came away completely in awe. The very start is nothing but a black void as swirling lights appear in front of me and I hear the faint murmur of a little girl's voice. Then, poof — I'm standing in an attic in front of a little girl holding a pencil. She looks up at me, remarks that she “drew me too tall,” and erases me. Back to black.

Soon, I’m back again at a shorter height — closer to her own — and she starts talking to me like I’d been there all along.

The stylized, slightly cartoonish visual style fits the tone perfectly and immediately grants the small girl a sense of believability. She wanders around the room, fiddling with things in corners and peering down at a notebook, or pictures, or even just whatever she’s holding. When she looks up to speak she makes piercing eye contact and commands my attention, just as a real person would. At one point she goes to hand me a camera, but looks away to keep doing what she was doing as I walk over to grab it with my Oculus Touch controllers.

It feels like she’d continue rambling and rummaging even if I weren’t there and that this is a world I’m becoming a part of, rather than a passive story that’s being told to me through VR.

Holding up the camera, I snap a picture of her drawing and writing in her notebook, and she startles, telling me to focus on getting proof of the wolves, not her.

After shaking the polaroid out and letting it develop, it becomes clear: there is a wolf on (or rather “in,” as it were) the wall right behind her. Creepy.

You can see how I discovered it using a magnifying glass like in the GIF below:

Just before this Lucy had been running around the room pointing out all of the different sounds she’d heard from the scratching and clawing to the howling in the distance. The lighthearted tone and visual style can’t hide the sinister, somewhat nightmarish underpinnings of this story that evokes a slow-building sense of dread.

I only got to see about 10 minutes of this VR app, but I can't wait to see more. It's easily one of my most anticipated non-game VR applications to date, and it's on display at Sundance this weekend.

Let us know what you think of it, whether you've tried it yourself or just read about it here, down in the comments below!


Hands-On – Spheres: Songs Of Spacetime Takes You Inside Two Blackholes Colliding


You know that feeling you get after watching a really, really good thought-provoking film? It’s a feeling often associated with Christopher Nolan’s work, whether it be Memento, Inception, Interstellar, or even The Prestige. You’re caught somewhere between a feeling of complete awe and utter confusion, with a dash of delight sprinkled on top.

Spheres: Songs of Spacetime, by Eliza McNitt in collaboration with Oculus Studios, Protozoa Pictures, and Kaleidoscope, is a short VR experience about the vastness of space that simulates the sights and sounds of two black holes colliding. To date, it's the first VR experience I've had that replicates a small portion of what it feels like to watch a Christopher Nolan film. I was immediately reminded of Interstellar.

To level your expectations, I want to be clear: Spheres isn't up to the standards of a Nolan movie by any means, but it certainly does a good impression. Songs of Spacetime, the first part of what is intended to be a three-part series, is all about black holes. You take on the role of one of these enormous space anomalies and interact with various stars and objects around you.

It's all accompanied by exquisite narration from Jessica Chastain (Interstellar, Crimson Peak), who lends a poignant but subtle voiceover to the experience.

Fans of Interstellar will adore the pulsating visuals, thumping audio tracks, and sheer sense of bewilderment as swirls of color, gravity, and time itself cascade around you. It's difficult to interpret the themes and events surrounding black holes and the universe in its entirety, let alone to depict them visually. The fact of the matter is that we really have no idea what two black holes colliding would look like, but it's certainly interesting to theorize about.

You know that feeling you get in VR when the sense of scale around you is just overwhelming? Those moments when you have to crane your neck upwards just to take it all in? Spheres is full of those moments and I definitely think my mouth was hanging ajar much more often than it was closed.

Spheres: Songs of Spacetime is on display this weekend at Sundance and can be found in the NF VR Experience section. The runtime is about 13 minutes.

Let us know what you think of Spheres from what you've heard so far down in the comments below!


Projection-mapped Immersive Theater Shows the Future of Live AR Performances

There was an amazing projection-mapped immersive theater piece at Sundance this year by Heartcorps called Riders of the Storyboard. Trained street performers interacted with virtual projection-mapped 2D objects, and through sleight-of-hand magic brought these flat objects into the third dimension as glowing 3D props.

LISTEN TO THE VOICES OF VR PODCAST

Fifteen people packed into a small room with about half a dozen performers for a 13-minute show about 2D characters who interact with performers playing Alchemy of Light gods in the third dimension. It was an awe-inspiring performance, and the projection-mapping technology provided a shared experience akin to what future augmented reality technology could offer.

Heartcorps is proving out projection-mapping techniques that should also work really well in the future of live performance and immersive theater designed for AR glasses.

I had a chance to catch up with Heartcorps member and performer dandypunk, who talks about the group's process, its ritual inspirations, and its mixture of immersive theater and cutting-edge projection mapping. Be sure to check out the trailer and clips from their show at Sundance down below.

Here’s the Trailer for the Heartcorps Riders of the Storyboard piece that showed at Sundance New Frontier:

Final three minutes of the Riders of the Storyboard show at Sundance:


Support Voices of VR

Music: Fatality & Summer Trip

The post Projection-mapped Immersive Theater Shows the Future of Live AR Performances appeared first on Road to VR.

YouTube VR Wants to Find the Next Billion Dollar Genre That Hasn’t Been Created Yet

Jamie Byrne

At Sundance this year, I had a chance to catch up with a couple of representatives from Google to talk about what's happening with 360 videos on the YouTube VR platform. I talked with Jamie Byrne, YouTube's Director of Global Creator & Enterprise Partnerships, as well as Julia Hamilton Trost of Google VR's Business Development & Content Partnerships.

Julia Hamilton Trost

We talked about the YouTube VR application, what they're doing to empower content creators, how they see 360 video as a gateway into higher-end VR, and the potential to add more volumetric and interactive elements to the YouTube platform in the future.

LISTEN TO THE VOICES OF VR PODCAST

Here are a number of 360 videos that were mentioned in this interview:

The Unboxing Time Machine – NES 1985

Rhomaleosaurus: Back to Life in Virtual Reality #PreviouslyOnEarth

The Dropper – A Minecraft 360° Video

Meredith Foster giving a 360 tour of her apartment

The New York Times' Daily 360 video series


Support Voices of VR

Music: Fatality & Summer Trip

The post YouTube VR Wants to Find the Next Billion Dollar Genre That Hasn’t Been Created Yet appeared first on Road to VR.

Sundance’s Shari Frilot on the Power of VR Storytelling to See Ourselves in a New Way

Shari Frilot started the New Frontier program at Sundance in 2007, and produced the festival's first VR experience in 2012 with Nonny de la Peña's Hunger in LA, using an early Rift prototype made by Oculus founder Palmer Luckey. Since 2014, Frilot has programmed around 75 VR experiences that explore storytelling, empathy, and emotional presence, but she sees the medium going beyond empathy. She says that being in VR gives us “the ability to see ourselves in a way that we could never do alone,” and that VR embodiment may allow us to overcome our unconscious biases. Speaking about embodying a number of different creatures in The Life of Us, she says, “you can watch yourself tap these primitive instinctual responses and you watch yourself go into another place of being able to socially engage with somebody” beyond the normal labels of white dude or black lesbian.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with Frilot at Sundance this year where we talked about the power of story to change someone’s reality, the role of Sundance in the modern history of consumer VR, interdisciplinary insights into storytelling from over 10 years of New Frontier, how VR could change how we see and understand our underlying value systems, and how VR could help us reconnect the body to the brain in a new way.

Here's the short documentary video Frilot references in the podcast, “Scientists Have Found a Way to Make Paraplegics Move Again”:

Here's the keynote Nonny de la Peña gave at SVVR, where she talks about Hunger in LA and some of her early pieces that premiered at Sundance:


Support Voices of VR

Music: Fatality & Summer Trip

The post Sundance’s Shari Frilot on the Power of VR Storytelling to See Ourselves in a New Way appeared first on Road to VR.