Simulating Crowds of Virtual Humans in Immersive Environments

The IEEE VR conference held a new pre-conference workshop this year on Virtual Humans and Crowds for Immersive Environments. Movies like Lord of the Rings and video games like Assassin’s Creed use this research to create convincing group behaviors with NPCs, and architects want to be able to test their building designs to ensure that they are comfortable for different types of pedestrian flows and evacuation scenarios.


There are a number of non-VR researchers who are studying how groups of people move through different situations and contexts, and virtual reality is providing new opportunities to test some of their theories within immersive environments.

At IEEE VR, I had a chance to chat with Rutgers professor Mubbasir Kapadia, who studies crowd simulation and whose latest book is called Simulating Heterogeneous Crowds with Interactive Behaviors. We talk about his research and theories on how to describe and simulate crowd behaviors, some of the entertainment and architectural applications, how VR can be used to test and verify some of these theories, and some of the big open questions driving his research.
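Kapadia’s specific frameworks are the subject of the book and the conversation itself, but to give a concrete flavour of what simulating crowd behavior involves, here is a minimal Python sketch of the classic social-force model (Helbing and Molnár), a common baseline in this field. It is illustrative only, with invented parameter values, and is not Kapadia’s method: agents steer toward goals while repelling one another, which is enough to reproduce simple flow and evacuation patterns.

```python
import numpy as np

def social_force_step(pos, vel, goals, dt=0.1, desired_speed=1.3,
                      tau=0.5, strength=2.0, falloff=0.3):
    """Advance a simple social-force crowd model by one time step.

    pos, vel, goals: (N, 2) arrays of agent positions, velocities, and goals.
    Each agent relaxes toward its desired walking velocity while being
    pushed away from nearby agents by an exponentially decaying repulsion.
    """
    # Goal-seeking: steer the current velocity toward the desired velocity.
    to_goal = goals - pos
    dist = np.linalg.norm(to_goal, axis=1, keepdims=True) + 1e-9
    desired_vel = desired_speed * to_goal / dist
    force = (desired_vel - vel) / tau

    # Pairwise repulsion, decaying exponentially with inter-agent distance.
    diff = pos[:, None, :] - pos[None, :, :]           # (N, N, 2) offsets
    d = np.linalg.norm(diff, axis=2)[..., None] + 1e-9
    force += (strength * np.exp(-d / falloff) * diff / d).sum(axis=1)

    vel = vel + force * dt
    pos = pos + vel * dt
    return pos, vel
```

Pointing every agent’s goal at a single doorway, for instance, produces the congestion and arching patterns that evacuation studies look for.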




This Live Mixed Reality Solution Uses Kinect to Eliminate Green Screen

This project uses Microsoft’s Kinect depth camera to fuse the virtual and the real, producing real-time mixed reality footage with no green screen required.

Mixed reality is all the rage and has become one of the most effective methods to convey the power of immersion afforded by new virtual reality technologies. The approach was spearheaded most recently by the Fantastic Contraption team for their excellent teaser series, which used the Vive’s room-scale positional tracking in conjunction with green screen backdrops to fuse the virtual and the real.

A new experimental project has come up with an alternative method, one that does away with the requirement to drape your demo area in green sheets and instead leverages Microsoft’s Kinect depth camera to achieve a similar effect in real time. The new technique allows potential exhibitors to show off a user interacting with a virtual application while retaining any demo space design (say, a stand at a conference venue), and still produce a compelling way to visualise what makes VR so special.

“We built a quick prototype using one of HTC Vive’s controllers for camera tracking, Microsoft Kinect v2, a Kinect v2 plugin and Unity running on 2x machines,” says the team, who have demonstrated their work via the above YouTube video. “The server ran the actual VR scene and the client extracted data from the Kinect, placed the point cloud into the scene, resulting in the mixed reality feed. The depth threshold was altered dynamically based on the position of the headset.”
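The team haven’t released code, but the dynamic depth thresholding they describe is easy to sketch. Below is a hypothetical Python/NumPy illustration (the function and its parameters are mine, not the project’s): back-project the Kinect depth frame into a point cloud, then keep only points nearer than the tracked headset plus a small margin, so the room’s background drops out with no green screen.

```python
import numpy as np

def threshold_point_cloud(depth_mm, fx, fy, cx, cy, headset_z_m, margin_m=0.5):
    """Back-project a depth frame and cull points behind a dynamic threshold.

    depth_mm    : (H, W) Kinect v2 depth frame in millimetres
    fx, fy      : depth camera focal lengths in pixels
    cx, cy      : depth camera principal point in pixels
    headset_z_m : tracked headset distance along the camera axis, in metres
    margin_m    : extra depth kept behind the headset, in metres
    """
    h, w = depth_mm.shape
    z = depth_mm.astype(np.float32) / 1000.0           # millimetres -> metres
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * z / fx                              # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)

    # Dynamic threshold: anything deeper than the headset (plus a margin)
    # is treated as background and discarded.
    keep = (points[:, 2] > 0) & (points[:, 2] < headset_z_m + margin_m)
    return points[keep]
```

The surviving points would then be rendered into the Unity scene from the tracked camera’s pose, which is what yields the composited third-person feed.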


Of course, the compositing is not as precise as a professionally produced green-screen equivalent, and there will be occasional pop-in from other objects that creep into the demo space, but it’s a neat, low-cost, and potentially more practical approach to getting your VR app noticed at a venue.

However, the technique’s biggest drawback may prove to be its Achilles’ heel: the target VR application must integrate support for displaying the point cloud imagery alongside the view captured by the virtual camera. No mean feat.


Nevertheless, it’s an intriguing approach that once again reminds us how Microsoft’s gaming peripheral seems to have found a life much more productive than its original, ill-fated design purpose.

You can read all about the project in detail over at the team’s YouTube channel.


This VR Experiment Generates Beautiful Worlds From Your Biometrics

STRATA is a new virtual reality experiment from The Mill that taps into biometric data (heart rate, breathing, etc.) and produces beautiful procedurally generated worlds to match.

No Man’s Sky is the latest and possibly the most high-profile example of procedurally generated virtual environments, where in theory no world you visit is the same as the last, based on a series of algorithms and systems defined by the programmers. But what if virtual worlds were generated from and tailored to you as a person?
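The mechanics behind that first idea are compact: a fixed seed deterministically expands into a world, so only the seed ever needs to be stored. A toy Python sketch, with world features invented purely for illustration:

```python
import random

def generate_world(seed: int) -> dict:
    """Deterministically derive world features from a single seed.

    The same seed always yields the same world, and a different seed a
    different one; the worlds themselves are never stored, only the seeds.
    """
    rng = random.Random(seed)
    return {
        "terrain_height_scale": rng.uniform(0.1, 3.0),
        "vegetation_density": rng.random(),
        "palette_hue": rng.random(),
        "num_moons": rng.randint(0, 4),
    }
```

STRATA’s twist, described below, is effectively to replace the fixed seed with a live stream of your biometric state.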


Developers The Mill have come up with just such a prospect with their latest VR project, STRATA. According to press info released by The Mill, STRATA “responds to your physiological and neurological data to generate procedural audio and visuals”; the upshot is a virtual experience that’s potentially unique to anyone stepping into it. The experience gathers biometric data from the user via a series of sensors placed on viewers equipped with VR headsets. “These sensors measure EEG (brainwaves), GSR (stress levels), heart rate, and breathing (via a conductive band created by The Mill),” the press release states. “This data feeds to an app running on the HMD that generates visuals and audio.” A special lap-mounted “pillow” collects GSR and heart rate and transmits the data wirelessly to the app too.
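The Mill haven’t published how their signals map to the visuals, but the general shape of such a system is straightforward to sketch. The Python mapping below is entirely invented for illustration, using only the signal list from the press release: calmer readings produce cooler colours and gentler terrain, while higher arousal speeds up the ambient audio.

```python
import colorsys
from dataclasses import dataclass

@dataclass
class Biometrics:
    """One sample of the signals named in the press release, normalised to 0..1."""
    eeg_relaxation: float   # e.g. relative alpha-band power from the EEG
    gsr: float              # skin conductance, as a stress proxy
    heart_rate: float       # mapped from a resting-to-elevated range
    breath_rate: float      # breathing rate from the conductive band

def world_parameters(b: Biometrics) -> dict:
    """Map a biometric sample to procedural-generation parameters (illustrative)."""
    calm = (b.eeg_relaxation + (1.0 - b.gsr)) / 2.0
    arousal = (b.heart_rate + b.breath_rate) / 2.0

    # Cool blue-green hues when calm, drifting toward warm hues when not.
    hue = 0.02 + (0.55 - 0.02) * calm
    r, g, bl = colorsys.hsv_to_rgb(hue, 0.6, 0.9)

    return {
        "sky_colour": (r, g, bl),
        "terrain_roughness": 0.2 + 0.8 * arousal,    # choppier world when aroused
        "ambient_tempo_bpm": 50.0 + 70.0 * arousal,  # audio tempo tracks arousal
        "fog_density": 0.1 + 0.4 * (1.0 - calm),
    }
```

A real system would smooth and recalibrate these signals per user before mapping them, but the pipeline shape (sensors, normalisation, then parameters feeding the renderer) is the interesting part.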

As you can see from the trailer embedded above, the results are a meditative dreamscape in which aspects of both the environment and the sound field alter depending on the player’s physical and mental state.


This is all very interesting, of course, but what are the practical uses for the technology? The Mill say that STRATA represents “a radical imagining for new VR applications, biometrics as a control scheme, and a step forward in responsive immersive visuals,” and cite potential applications beyond the entertainment sphere: mental fitness training in sports and athletics, distraction therapy for patients undergoing “unpleasant” medical procedures, and “mindfulness training for stress alleviation” in meditation, anger management, and the like.

This is all theoretical of course, but we’ve already seen studies which suggest immersive media can be beneficial in psychological treatment, for example for veterans suffering from post-traumatic stress disorder. In any case, it’s an interesting area of research, and The Mill have certainly gone all out to construct the hardware and software stack needed to demonstrate it. It’ll be interesting to see if the work in STRATA manages to break out of the experimental sphere into the real world in the near future.


Research on VR Presence & Plausibility with VR Technical Achievement Award Winner Anthony Steed

One of the gold standards of a VR experience is achieving Presence, but Presence is an elusive concept to precisely define. Mel Slater is one of the leading researchers into Presence and says that it’s a combination of the “Place Illusion” and the “Plausibility Illusion.” Richard Skarbez elaborates by saying that the Place Illusion represents the degree of immersion you feel by being transported to another place, and the Plausibility Illusion is the degree to which you feel that the overall scene matches your expectations for coherence.


Anthony Steed is a professor in the Virtual Environments and Computer Graphics group in the Department of Computer Science, University College London. I had a chance to catch up with Anthony at the IEEE VR conference where he talks about doing distributed Presence research with a Gear VR, the role of plausibility in Presence, how social Presence fits into Mel’s two illusions of Presence, and some of the discussions about sharing knowledge between game developers and academics that happened at GDC and IEEE VR conferences this year.

See Also: Oculus Shares 5 Key Ingredients for Presence in Virtual Reality

Anthony studied under Mel Slater, and he was a co-author of one of the major Presence surveys, referred to as the Slater, Usoh & Steed survey, introduced in the “Depth of Presence in Virtual Environments” paper. Anthony was also the winner of the 2016 Virtual Reality Technical Achievement Award presented at the IEEE VR conference this year.

Here’s a video of the Presence experiment that Anthony conducted on the Gear VR, where he found that tapping on your body along with the music, without having your hands tracked, had a negative impact on embodiment.

Here’s the 2015 IEEE VR poster from Richard Skarbez on his Presence research into the Place Illusion and Plausibility Illusion.


