Simulating Crowds of Virtual Humans in Immersive Environments

The IEEE VR conference held a new pre-conference workshop this year on Virtual Humans and Crowds for Immersive Environments. Movies like Lord of the Rings and video games like Assassin’s Creed use this research to create convincing group behaviors with NPCs, and architects want to test their building designs to ensure they remain comfortable under different flows of people and in evacuation scenarios.

LISTEN TO THE VOICES OF VR PODCAST

There are a number of non-VR researchers who are studying how groups of people move through different situations and contexts, and virtual reality is providing new opportunities to test some of their theories within immersive environments.

At IEEE VR, I had a chance to chat with Rutgers professor Mubbasir Kapadia, who studies crowd simulation and whose latest book is called Simulating Heterogeneous Crowds with Interactive Behaviors. We talk about his research and theories on how to describe and simulate crowd behaviors, some of the entertainment and architectural applications, how VR can be used to test and verify some of these theories, and some of the big open questions that are driving his research.
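Crowd simulation models vary widely, but many reduce to per-agent steering rules. As a rough, hedged sketch of the general idea, here is a minimal social-forces-style update (illustrative only, not the heterogeneous behavior framework described in Kapadia’s book; every constant below is an assumption):

import math

# Minimal social-forces-style crowd agent: steer toward a goal while being
# repelled by nearby agents. All parameters are illustrative assumptions.
def step_agent(pos, vel, goal, neighbors, dt=0.1,
               desired_speed=1.4, repulsion=2.0, personal_space=0.5):
    # Driving force: accelerate toward the goal at the desired walking speed.
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(gx, gy) or 1e-6
    fx = desired_speed * gx / dist - vel[0]
    fy = desired_speed * gy / dist - vel[1]

    # Repulsive force from each neighbor, falling off with distance.
    for nx, ny in neighbors:
        dx, dy = pos[0] - nx, pos[1] - ny
        d = math.hypot(dx, dy) or 1e-6
        push = repulsion * math.exp(-(d - personal_space))
        fx += push * dx / d
        fy += push * dy / d

    vel = (vel[0] + fx * dt, vel[1] + fy * dt)
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt), vel

# One simulation step for an agent walking right past a nearby neighbor.
print(step_agent((0.0, 0.0), (0.0, 0.0), goal=(10.0, 0.0), neighbors=[(0.5, 0.3)]))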


Support Voices of VR

Music: Fatality & Summer Trip

The post Simulating Crowds of Virtual Humans in Immersive Environments appeared first on Road to VR.

Tracking Your Hands Using Flex Sensor Technology with Manus VR

I had a chance to try out the Manus VR hand-tracked controller on the expo floor of GDC this year and saw that there are a couple of really strong use cases for having your hands and fingers tracked in VR. You can be a lot more expressive within social VR, and in mixed reality experiences where passive haptic feedback is available, having your hands tracked can actually increase the level of embodied presence.

I had a chance to catch up with the lead designer of Manus VR, Stijn Stumpel, at GDC where we compared Manus VR to Leap Motion, talked about how the flex sensors work, the use cases where having tracked hands makes sense, their extremely polished demo called Pillow’s Willow, and where they’re going in the future.

LISTEN TO THE VOICES OF VR PODCAST

At GDC, Manus VR strapped an HTC Vive controller to the back of my wrist, and it gave a lot more consistent tracking of the location of my hands as a result; I didn’t have to worry about keeping my hands within my field of view like I do with optically tracked solutions like Leap Motion. There was some uncanniness in not being able to actually physically grab objects, which can break presence. And I also experienced a lot more than 20ms of latency in my finger movements, which was a presence breaker. But I was told that they are able to achieve much better latency performance in their lab environment.
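For context on the flex sensor approach: these sensors report finger bend as a change in resistance, which has to be calibrated and smoothed into joint angles before it can drive a hand model. Here is a hedged sketch of that kind of mapping (the constants and names are my own assumptions, not Manus VR’s actual pipeline; the smoothing step is also one place where latency can creep in):

# Illustrative mapping from a raw flex-sensor reading to a finger joint angle.
# Calibration constants are assumptions, not Manus VR's real values.
FLAT_READING = 300    # assumed ADC value with the finger fully extended
BENT_READING = 700    # assumed ADC value with the finger fully curled
MAX_CURL_DEG = 90.0   # assumed joint angle at full curl, in degrees

def reading_to_angle(raw, previous=0.0, smoothing=1.0):
    # Normalize the reading into [0, 1] between the two calibration poses.
    t = (raw - FLAT_READING) / (BENT_READING - FLAT_READING)
    t = max(0.0, min(1.0, t))
    angle = t * MAX_CURL_DEG
    # Exponential smoothing reduces jitter but adds a little latency.
    return previous + smoothing * (angle - previous)

print(reading_to_angle(520))  # 49.5 degrees with these assumed constants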

Manus VR just announced in a press release that their “gloves are being used in experiments to train NASA astronauts in mixed reality to prepare them for the International Space Station.” Here’s some footage of some of that training that they’ve released.

They also announced that Manus VR is joining the first SteamVR Tracking class being taught by Synapse on September 12th in order to create a version of their glove with the SteamVR Tracking sensors built in. So I expect the next iteration to remove the stopgap solution of attaching a SteamVR controller to the back of your arm. With more tracking on the arm, they might also be able to do much more accurate inverse kinematic tracking of your body and deliver a powerful invocation of the virtual body ownership illusion.
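As a rough illustration of why a well-tracked wrist helps: with known bone lengths, a simple two-bone solver can already recover plausible elbow and shoulder angles from the wrist position alone. The planar sketch below is my own illustration under assumed arm lengths, not Manus VR’s solver:

import math

UPPER_ARM = 0.30  # assumed upper-arm length in meters
FOREARM = 0.28    # assumed forearm length in meters

# Planar two-bone IK via the law of cosines: given shoulder and wrist
# positions, recover the shoulder direction and elbow bend. A real solver
# works in 3D with joint limits; this is only a sketch.
def solve_arm(shoulder, wrist):
    dx, dy = wrist[0] - shoulder[0], wrist[1] - shoulder[1]
    dist = max(1e-6, min(math.hypot(dx, dy), UPPER_ARM + FOREARM - 1e-6))
    cos_elbow = (UPPER_ARM**2 + FOREARM**2 - dist**2) / (2 * UPPER_ARM * FOREARM)
    elbow_bend = math.acos(max(-1.0, min(1.0, cos_elbow)))
    cos_inner = (UPPER_ARM**2 + dist**2 - FOREARM**2) / (2 * UPPER_ARM * dist)
    shoulder_angle = math.atan2(dy, dx) + math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder_angle, elbow_bend

print(solve_arm((0.0, 0.0), (0.4, 0.2)))  # angles in radians for a wrist out in front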


Support Voices of VR

Music: Fatality & Summer Trip

The post Tracking Your Hands Using Flex Sensor Technology with Manus VR appeared first on Road to VR.

‘Rick and Morty Simulator’: Making Narratives More Plausible through Interruption

When Owlchemy Labs’ Alex Schwartz saw that Rick and Morty creator Justin Roiland was a fan of their Job Simulator VR experience, he reached out and met up with Justin in Los Angeles. They came up with the idea of creating an interactive Rick and Morty Simulator VR experience that would combine the mechanics of Job Simulator with the setting of Rick’s garage.

LISTEN TO THE VOICES OF VR PODCAST

When Alex started adding narrative components to the experience, he discovered a big problem that would immediately break presence: every character and action needed to be interruptible in order to maintain the plausibility illusion. Matching expectations is the biggest challenge in creating a highly interactive VR environment, and interacting with real humans means that they should have an appropriate reaction if you try to interrupt them. One of the most complicated new systems that Owlchemy Labs had to develop was a framework that could account for all of the different types of interruptions.
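To make the idea concrete, here is a toy sketch of what an interruption-aware narrative beat might look like: each scripted beat carries its own reaction to being interrupted and a flag for whether it can resume. This is purely my illustration of the concept, not Owlchemy Labs’ actual framework, and the example content is invented:

from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class NarrativeBeat:
    name: str
    perform: Callable[[], None]            # the scripted line or action
    on_interrupt: Callable[[str], None]    # how the character reacts to an interruption
    resumable: bool = True                 # can this beat pick up where it left off?

@dataclass
class NarrativeRunner:
    beats: List[NarrativeBeat] = field(default_factory=list)
    index: int = 0

    def interrupt(self, cause: str):
        beat = self.beats[self.index]
        beat.on_interrupt(cause)           # acknowledge the player immediately
        if not beat.resumable:
            self.index += 1                # skip ahead so the story keeps moving

    def step(self):
        if self.index < len(self.beats):
            self.beats[self.index].perform()
            self.index += 1

runner = NarrativeRunner([
    NarrativeBeat("garage_monologue",
                  perform=lambda: print("Rick rants about the garage."),
                  on_interrupt=lambda cause: print(f"Rick snaps at you for {cause}.")),
])
runner.interrupt("throwing a plumbus")     # the beat reacts, then resumes
runner.step()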

The result is that Rick and Morty Simulator is one of the most advanced interactive narratives that I’ve seen so far. Their interrupt system seamlessly blends highly dynamic interaction within a narrative structure that keeps the overall experience moving forward in what ends up feeling like a complete adventure within the Rick & Morty universe. There’s still a lot of work to be done in having the characters directly respond and react to your physical presence and action directed at them, and Alex says that this is one of the biggest open problems that they’re working on.

I had a chance to catch up with Alex at PAX West where we talked about how the Rick and Morty Simulator project came about, the importance of interruptions in interactive narratives, maintaining presence within VR, their workflow for writing and collaborating with Adult Swim and Justin Roiland, and some of the open problems that they’re working to solve.

Here are some tweets that document how Alex and Justin first got together.


Support Voices of VR

Music: Fatality & Summer Trip

The post ‘Rick and Morty Simulator’: Making Narratives More Plausible through Interruption appeared first on Road to VR.

Embedding a Story Within a Place with Rand Miller, Creator of ‘Myst’, ‘Riven’, and ‘Obduction’

When Rand Miller was a kid, he played Dungeons & Dragons with his brother Robyn, going on adventures together and exploring and creating imagined worlds. They wanted to embed that same sense of wonder and awe of exploration and discovery into a videogame, and so they were inspired to create Myst (1993) together. They tend to think of Myst, Riven (1997), and their latest adventure game Obduction more as places than games, since you can’t die and you learn more about the story of the world as you solve puzzles.

LISTEN TO THE VOICES OF VR PODCAST

Since the story is embedded within the place, it’s the place that ends up telling the story. With Obduction, there are 3-4 discrete places that each have subzones, and there’s no set linear path to explore these worlds and discover each part of the story. This non-linear storytelling mechanism means that the story will unfold uniquely for each person as they make choices as to where to go and what to see.
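One hedged way to picture that structure (my own toy illustration, not Cyan’s actual design data) is a set of zones whose story fragments can be discovered in any order, with the same overall picture assembled no matter which path you take:

# Toy model of place-driven, non-linear storytelling: each zone holds story
# fragments that can be found in any order. Zone and fragment names are
# invented for illustration; they are not Obduction's actual content.
story_fragments = {
    "zone_a": {"abandoned_journal", "strange_machine"},
    "zone_b": {"mural", "locked_door_clue"},
}

discovered = set()

def explore(zone):
    # Visiting a zone reveals its fragments, regardless of visit order.
    discovered.update(story_fragments.get(zone, set()))

explore("zone_b")
explore("zone_a")
print(sorted(discovered))  # same final set, reached along a different path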

See Also: ‘Obduction’ VR Review – A spiritual successor to Myst that hits all the right buttons

I had a chance to catch up with Rand Miller at PAX West where he talked about his early inspirations from Dungeons & Dragons, their world building process for architecting a place with a story and puzzles, and some of the unique affordances and design challenges they faced making Obduction compatible with virtual reality.


Support Voices of VR

Music: Fatality & Summer Trip

The post Embedding a Story Within a Place with Rand Miller, Creator of ‘Myst’, ‘Riven’, and ‘Obduction’ appeared first on Road to VR.

A Professional Gamer’s View on VR eSports

Stephanie Harvey (aka “missharvey”) is a professional eSports gamer playing Counter-Strike: Global Offensive with the Counter Logic Gaming Red team. I had a chance to catch up with her at PAX West to talk about the future of eSports in VR, the ecosystem of announcers and observers that makes games more entertaining for spectators, her training schedule, core competencies for maintaining a competitive edge, and all of the various ingredients that have to come together in order to have a viable VR eSports ecosystem.

LISTEN TO THE VOICES OF VR PODCAST


Support Voices of VR

Music: Fatality & Summer Trip

The post A Professional Gamer’s View on VR eSports appeared first on Road to VR.

Opting Out of a Phobia with VR

Theresa Duringer has a fear of flying, and rather than treating her aviophobia with VR exposure therapy, she’s been experimenting with using VR to simply opt out of the real-life signals of the flying experience altogether. She’s found some anecdotal success in avoiding her fear-of-flying triggers just by using the transportive elements of VR. I don’t expect this approach to work for many other phobias, since being in VR requires you to be stationary and completely isolated from your immediate surroundings for an extended period of time, which happens to fit flying well.

I had a chance to catch up with Theresa at SXSW where she was giving a presentation titled “Game Design for VR Pioneers.” She’s the CEO of Temple Gates Games, where she helped design the VR titles Bazaar and Ascension VR. She talks about some of her VR design best practices as well as her personal experience of the power of VR in helping her better cope with her physiological and stress responses to flying.

LISTEN TO THE VOICES OF VR PODCAST

Here’s the recording of Theresa’s “Game Design for VR Pioneers” talk from SXSW.


Support Voices of VR

Music: Fatality & Summer Trip

The post Opting Out of a Phobia with VR appeared first on Road to VR.

Using VR to Revolutionize Sports Training with STRIVR Labs

When Derek Belch was a kicker on Stanford’s football team in 2007, he took a class with the Virtual Human Interaction Lab’s Jeremy Bailenson, where he was exposed to virtual reality technologies for the first time. Belch asked Bailenson if it was possible to use VR to train football players, but the technology wasn’t ready back then. Fast-forward six years: with Oculus Rift VR development kits readily available, Belch started a master’s thesis project with Bailenson to study how to use VR to train quarterbacks.

Their pilot program had promising results, but not enough conclusive evidence to say definitively that the VR training improved on-field performance. But the response from football players and coaches was so overwhelmingly positive that they decided to start a company called STRIVR Labs to put their research into practice. They quickly signed up Stanford, Vanderbilt, Clemson, Auburn, Arkansas, and Dartmouth as their first official partners to continue their research, and they also started working with NFL teams including the Cowboys, Cardinals, Giants, Vikings, and Jets.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with STRIVR Labs’ Chief Science Officer, Michael Casale, at the Experiential Technology and Neurogaming Conference in May. Casale was brought on by Bailenson to help advise Belch’s master’s thesis on learning transfer and category learning techniques that would optimize the learning process. He has continued this transfer learning research at STRIVR Labs, working with elite athletes at both the collegiate and professional level since the project moved from academic research into the real world.

Casale was hesitant to report specific quantitative evidence since there are a lot of proprietary metrics that they’re using internally, but he said that a lot has been reported in the mainstream press. The San Diego Tribune reported that Stanford quarterback “Hogan’s pass completion rate jumped from a 65 percent average over the first 10 games of the season, to a 76.3 percent average over the last three games – right around the time he started using the virtual reality trainer to study defenses and make decisions.”

The article emphasizes that correlation is not causation, and establishing learning transfer from VR technologies to the real world is still an open problem. But there’s a strong indicator that VR is having a huge impact when looking at the Arizona Cardinals’ record 13-3 season, with VR early adopter Carson Palmer telling ESPN that “I think it’s improved my stats. It’s improved my knowledge of our offense.” ESPN speculated that “It might not be a coincidence that Palmer had the best season of his career, throwing 4,671 yards and 35 touchdowns, and finished the season with a career high in quarterback rating (104.1) and QBR (82.1).”

Casale hinted that there’s a lot of value being gained from VR training that might not explicitly show up in the existing statistics that drive fantasy football leagues. Being able to detect an oncoming blitz and dynamically change the play before the snap is one example of a skill that can be honed within VR but not directly measured on the field. Quarterbacks can also watch themselves in the 360 footage and then work on correcting their throwing motion and footwork in the offseason.

A vital part of the training is being able to have more interactive coaching sessions where the quarterback can re-watch different defensive positions and talk about how they would change or adapt their play. Here’s some footage of a Stanford quarterback reading the defense and telling his coach what he sees.

Rather than mentally translating the X’s and O’s of a play from a 2D whiteboard, quarterbacks can prepare by watching what the field actually looks like through VR reps. Carson Palmer was learning 171 plays in 5 days using STRIVR Labs’ VR system installed in the comfort of his own home.

VR locomotion is still an unsolved open problem, and so most of STRIVR Labs’ VR training for football, basketball, baseball, and soccer is shot using a stationary 360 camera, but they’re looking to be able to move around as well. It’s likely that they would have to move to a CGI environment for that, or perhaps there will eventually be a breakthrough in volumetric digital lightfield capture. But for now, they’re focusing on training quarterbacks and goalies, watching pitches, and shooting free throws.

One big challenge facing STRIVR Labs is that their sample size of elite athletes at the collegiate and professional level is still pretty small, and so determining the optimal combination and sequence of physical and virtual reps is one of the biggest open questions that they’re still trying to answer. This could be a big part of the motivation for why they’re considering expanding and scaling into high school training as well.

As the 360 video capture process evolves and becomes more mainstream, there are not going to be many technological barriers keeping other competitors from entering the sports training space, but knowing the optimal training combinations and VR production best practices is going to help STRIVR Labs maintain their current leadership position. And just as Sabermetrics revolutionized the ability to more objectively identify impactful baseball players, I expect STRIVR Labs to come up with their own set of new objective measurements that use VR technologies to track the learning progress and performance of elite athletes. And given the success that VR early adopters have seen, we can expect that virtual reality sports training is here to stay.

The post Using VR to Revolutionize Sports Training with STRIVR Labs appeared first on Road to VR.

SMI Talks Eye Tracking VR Applications & Foveated Rendering

SensoMotoric Instruments (SMI) is a German eye tracking company that has released eye tracking kits for the Oculus DK2 & Gear VR, and most recently for the HTC Vive. At SIGGRAPH this year, Nvidia was showing a foveated rendering demo that renders at full resolution only in the sections of the scene that you are actually looking at. The difference is nearly imperceptible, and the technique could allow mobile hardware to render higher-resolution scenes, or potentially help make it more feasible to wirelessly transmit data to a desktop VR HMD.
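As a rough sketch of the underlying idea (not Nvidia’s or SMI’s actual implementation), a renderer can pick a shading-resolution scale per screen tile based on how far that tile is from the tracked gaze point; the thresholds below are illustrative assumptions:

import math

# Toy foveated-rendering heuristic: choose a shading-resolution scale per
# screen tile from its angular distance (eccentricity) to the gaze point.
def shading_scale(tile_center, gaze, pixels_per_degree=15.0):
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    eccentricity_deg = math.hypot(dx, dy) / pixels_per_degree
    if eccentricity_deg < 5.0:      # fovea: render at full resolution
        return 1.0
    elif eccentricity_deg < 15.0:   # near periphery: half resolution
        return 0.5
    else:                           # far periphery: quarter resolution
        return 0.25

print(shading_scale((960, 540), gaze=(1000, 560)))  # near the gaze point -> 1.0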

LISTEN TO THE VOICES OF VR PODCAST


At SIGGRAPH, I had a chance to talk with Walter Nistico, Head of R&D and Lead Architect Computer Vision, as well as Tom Sengelaub, Manager Solution Delivery, about SMI’s eye tracking and some of its applications: foveated rendering, medical uses such as autism research and concussion detection, marketing and analytics, and even deception detection with Converus’ EyeDetect.

See Also: FOVE Debuts Latest Design for Eye Tracking VR Headset

Researcher Hao Li told me that eye tracking is pretty essential to taking VR social presence to the next level, and so I expect that the second generation of the Oculus Rift and HTC Vive will both include eye tracking technologies. In talking with Walter and Tom at SIGGRAPH, they’re also very confident that we’ll start to see eye tracking in the next generation of VR headsets.

From SMI’s perspective, they’re hoping to license their eye tracking algorithms to the big headset manufacturers. In my interview with Tobii, they told me that they’ve also been in discussions with some of the major VR HMD manufacturers. SMI says that the hardware required for eye tracking is not a huge barrier, and so it will likely be a matter of whether the eye tracking algorithms are developed in-house or licensed from one of the big eye tracking players.

Here’s a video of NVIDIA’s foveated rendering demo shown off at SIGGRAPH:

Here’s a shadertoy fovea visualizer demo that illustrates how your fovea works (be sure to watch it in full screen).

Here’s a recent demo of using SMI eye tracking with the HTC Vive.

Here’s a demo of eye tracking of a spatial search task within VR.


Support Voices of VR

Music: Fatality & Summer Trip

The post SMI Talks Eye Tracking VR Applications & Foveated Rendering appeared first on Road to VR.

‘TheWaveVR’ Co-founder on His Rave & Interactive Concert Platform

TheWaveVR coordinated two days’ worth of DJs playing musical sets in the world’s first VR rave at VRLA. It was TheWave’s first event since announcing a $2.5 million round to build out an interactive VR music concert platform. I previously had a conversation with TheWave co-founder Aaron Lemke at the Silicon Valley Virtual Reality conference where they were showing off their musical content creation tools, but they’ve since shifted their focus to building out the social experience for watching live performances by electronic DJs.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with TheWave CEO Adam Arrigo after the first day of their VR rave at VRLA to get more information about the future direction of their music platform, their inspiration from livestreamed DJ sets, and some of the unique affordances of performing music in VR.


Support Voices of VR

Music: Fatality & Summer Trip

The post ‘TheWaveVR’ Co-founder on His Rave & Interactive Concert Platform appeared first on Road to VR.

Sketchfab: Largest Social Media Site for 3D Objects Adds VR Navigation

Originally launched in 2012, Sketchfab has grown to be the largest social media website for sharing 3D objects and scenes, with over 500,000 members and close to 1 million experiences. At SIGGRAPH, Sketchfab launched their VR browser, which lets you navigate between many different 3D objects without having to leave virtual reality. Once inside a Sketchfab experience, you can teleport around and change the scale. Sketchfab can import over 30 different 3D file formats, and is currently working with the Tilt Brush team to have direct support for exporting and sharing your 3D creations on the web.
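On the “change the scale” point, one common approach (an assumption about the general technique, not Sketchfab’s actual code) is to rescale the scene about the viewer’s head position so the world appears to grow or shrink around you instead of jumping away from the scene origin:

# Rescale a scene about the viewer's head so the point under the viewer
# stays fixed while the world grows or shrinks. Illustrative only.
def rescale_about_viewer(scene_origin, head_pos, old_scale, new_scale):
    factor = new_scale / old_scale
    new_origin = tuple(h + (o - h) * factor for o, h in zip(scene_origin, head_pos))
    return new_origin, new_scale

# Doubling the scale of a scene while standing at (1.0, 1.6, 0.0).
print(rescale_about_viewer((0.0, 0.0, 0.0), (1.0, 1.6, 0.0), 1.0, 2.0))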

I had a chance to catch up with Sketchfab co-founder Alban Denoyel at SIGGRAPH to talk about the evolution of Sketchfab over the years and their two big bets: that there are going to be more and more 3D content creators, and more and more 3D content consumers, as virtual and augmented reality technologies become more mainstream.

LISTEN TO THE VOICES OF VR PODCAST


Support Voices of VR

Music: Fatality & Summer Trip

The post Sketchfab: Largest Social Media Site for 3D Objects Adds VR Navigation appeared first on Road to VR.