Hands-on: Intel’s Project Alloy ‘Merged Reality’ Roomscale Multiplayer Demo

At CES 2017, Intel was showing off more of its Project Alloy VR headset, including a roomscale multiplayer demo featuring ‘merged reality’, where the game environment adapts to your physical surroundings.

Project Alloy

Project Alloy is Intel’s VR headset reference design. The company expects device makers to use it as a starting point to create their own VR headsets (based on Intel chips, of course). Alloy is an all-in-one headset which does everything—game rendering, computer vision, processing, and more—directly on the headset. It’s literally an x86 computer running Windows 10 that you wear on your head.


That means the headset is completely untethered, and with inside-out positional tracking, it can track your movement without the use of external sensors.

Merged Reality Multiplayer

At CES 2017, Intel was showing off Project Alloy with a roomscale ‘merged reality’ multiplayer demo. The idea behind merged reality (AKA virtual augmented reality) is to make the virtual world account for the physical environment around you. In the demo, that meant turning the physical objects in the room into virtual objects in the game world that could be used for cover. You can see the experience in action in the video at the top of this article.

After putting on a pair of the fairly bulky prototype headsets, a colleague and I saw a virtual version of the same chair, desk, bookshelf, couch, and coffee table that were in the room we had been standing in. The room was scanned ahead of time using the sensors on the Alloy headset. We were able to physically navigate around the virtual version of the room thanks to the headset’s inside-out positional tracking.


After a few minutes of walking around and inspecting the virtual version of the room, the walls faded away to reveal a vast skybox of distant mountains and clouds. It really did feel like the walls had opened up before us and we had been transported to another realm. Before we knew it, the objects in the room had transformed into geometry that thematically matched the game world; the couch became a big rectangular metal storage bin, the desk and chair became a futuristic metal chair and computer terminal, the bookshelf turned into a door frame, and the circular coffee table turned into a futuristic metal… circular thing. The digital versions were not inch-for-inch facsimiles of the real furniture, but the assets were at least as big as the footprint of the real furniture. There was more virtual geometry added too which didn’t exist at all in the real world, like a computer monitor on a tall mount and a crane-like mechanism perched overhead.
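One plausible way such substitution could work (this is purely an illustrative sketch, not Intel’s actual pipeline) is to reduce each scanned object to a bounding box and scale a themed stand-in asset up until its footprint fully covers the real object’s. Erring on the large side, as the demo seemed to, keeps players from clipping through real furniture hidden behind a smaller virtual model:

```python
# Hypothetical sketch: substituting scanned furniture with themed assets.
# The asset catalogue and dimensions below are made up for illustration.
from dataclasses import dataclass

@dataclass
class Box:
    width: float   # x extent in metres
    depth: float   # z extent in metres
    height: float  # y extent in metres

# Hypothetical catalogue of themed stand-ins and their base dimensions.
ASSETS = {
    "metal_storage_bin": Box(2.0, 0.9, 0.8),
    "computer_terminal": Box(1.2, 0.7, 1.1),
}

def fit_asset(scanned: Box, asset: Box) -> float:
    """Uniform scale factor so the asset's box covers the scanned box
    on every axis (never smaller than the real object)."""
    return max(scanned.width / asset.width,
               scanned.depth / asset.depth,
               scanned.height / asset.height)
```

For example, a scanned couch slightly wider than the storage-bin asset would simply get a storage bin scaled up to match its width, with the other axes growing proportionally.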

Using 3DOF controllers which were parented to the location of our head, we were able to aim and fire a rifle. The shooting mechanics worked fine, but the lack of positional tracking on the controllers meant it was a simple point-and-shoot affair with no ability to actually raise the weapon to your shoulder and look down the scope to aim properly (as we’ve seen on other VR systems with more advanced VR controllers).
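The “parented to the head” arrangement is worth unpacking: a 3DOF controller reports only orientation, so the gun’s position has to be derived from the head pose plus a fixed offset. A minimal sketch of that math, with an assumed offset (this is an illustration of the general technique, not Intel’s code):

```python
# Illustrative sketch: deriving a 3DOF controller's pose from the head.
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z),
    using v' = v + 2*r x (r x v + w*v) with r = (x, y, z)."""
    w, rx, ry, rz = q
    cx = ry * v[2] - rz * v[1] + w * v[0]
    cy = rz * v[0] - rx * v[2] + w * v[1]
    cz = rx * v[1] - ry * v[0] + w * v[2]
    return (
        v[0] + 2 * (ry * cz - rz * cy),
        v[1] + 2 * (rz * cx - rx * cz),
        v[2] + 2 * (rx * cy - ry * cx),
    )

def controller_pose(head_pos, head_rot, controller_rot,
                    offset=(0.2, -0.3, 0.4)):
    """Position follows the head (offset rotated into head space);
    orientation comes from the 3DOF controller itself."""
    rotated = quat_rotate(head_rot, offset)
    pos = tuple(h + o for h, o in zip(head_pos, rotated))
    return pos, controller_rot
```

Because the position rides along with the head, aiming works, but raising the gun to your shoulder does nothing, which matches the point-and-shoot feel described above.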

Waves of flying drones came at us and were easily dispatched with one shot each. After clearing a swarm we would advance to the next wave which had a few more enemies. Thanks to the headset’s positional tracking, we could walk around the entire space as we played and duck behind the virtual/real cover. But it wasn’t exactly a high-action experience as the drones weren’t quite competent enough to make us really feel pressured into cover.

After running out of ammo, we needed to find an ammo pickup to replenish the weapon’s clip. I remember inching my way toward the pickups because I just didn’t feel quite confident in the mapping and tracking. Impressive as it was, I wasn’t able to achieve a sense of totally forgetting about the physical objects around me. Inside the demo, it felt as if the scale of the virtual environment was slightly off; when I took a step forward, it didn’t quite feel like I’d traveled the same distance in the virtual world as all my bodily sensors said I traveled in the real world. That meant taking slow, careful steps with each movement.

Just a Demo, but Promising


As a demo and a concept, it was pretty cool to see this working. But there’s still a lot of work to be done to bring this sort of experience to everyone’s home. For one, the objects in the environment were not automatically identified and replaced with virtual objects. The demo appeared to be made for this specific room size and this specific arrangement of this particular furniture. Adapting a VR game to any room and any furniture automatically will require some smart game design thinking, especially for anything beyond a wave-shooter where your couch turns from a couch into a virtual metal box for cover.

There’s also work to be done in building confidence in users so they can trust they aren’t going to bump into the real environment. The limited field of view makes knee-high objects like coffee tables and chairs a notable threat because you’re much less likely to see them out of the corner of your eye. This is compounded not only by the scale issue I described above, but also because it’s hard to tell exactly where your legs are when you can’t see them in VR (like most VR experiences, I didn’t have an accompanying virtual body beyond my head and gun). With a VR headset like the HTC Vive, the chaperone system is so competent that I can almost completely forget about the real world around me, because I know the system will alert me 100% of the time if I’m in danger of running into the physical world. That sort of “freedom to forget” is essential for immersive virtual reality.
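The core of a chaperone-style safeguard is simple in principle: continuously compare the headset’s tracked position against the known play-area bounds and warn when clearance drops below a threshold. A minimal sketch, assuming a rectangular play area centred on the origin (real systems like the Vive’s chaperone use user-drawn boundaries and fade in a visible grid rather than a boolean flag):

```python
# Minimal sketch of a chaperone-style proximity check.
def boundary_warning(pos, half_width, half_depth, threshold=0.4):
    """pos = (x, z) headset position in metres on the floor plane.
    Returns True when the headset is within `threshold` metres of
    any wall of a 2*half_width x 2*half_depth rectangular play area."""
    x, z = pos
    clearance = min(half_width - abs(x), half_depth - abs(z))
    return clearance < threshold
```

The hard part isn’t this check, but earning enough user trust in the tracking and the scan that people stop thinking about it, which is exactly what the demo hadn’t yet achieved.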

There’s also the added complication that a virtual asset may not perfectly align with the real one. This was demonstrated quite clearly by the coffee table in the middle of the room which—to the dismay of my shins—I bumped into more than once, even when I felt like I was well clear of the virtual counterpart. One of the developers running us through the experience also gave his knee a good whack on the table, at which point he figured out that the table had probably been moved after the scan. This is the sort of thing that needs to be solved if this tech is going to take off in people’s homes.

But of course that’s why all of this is just a demo, and a pretty exciting one at that. In fact, it was the first time I did VR multiplayer where both players were inhabiting the same physical space. There are kinks to work out before this sort of merged reality experience can work well in a wide range of environments, but the vision is promising and could be very compelling with the right execution.

The post Hands-on: Intel’s Project Alloy ‘Merged Reality’ Roomscale Multiplayer Demo appeared first on Road to VR.

Apple AR/VR Product to Debut in 2017, Predicts Sony’s Head of Worldwide Studios

To many, the question of an Apple AR or VR headset has become a “when” rather than an “if”. The President of Sony’s Worldwide Studios thinks 2017 is the year that Apple introduces its first immersive device.

Apple does R&D on a wide range of technologies and has been actively researching the fields of VR and AR for years, including submitting and receiving several relevant patents. And while much of Apple’s R&D doesn’t see the light of day, the company certainly excels at taking novel tech and marketing it as something that everyone can use. VR and AR are on the rise, and Apple is widely expected to jump into the immersive device space when the time is right. But exactly when that time is has been up for debate.

As for Shuhei Yoshida, President of Sony’s Worldwide Studios, 2017 is likely the year that Apple makes its first move. That’s according to Virtual Reality Pop, which queried a number of VR and AR industry insiders in a brief Q&A about their biggest predictions for the landscape in 2017.

Yoshida is a major believer in VR and has been closely involved with the creation of Sony’s PlayStation VR headset; he was the one to introduce the device (formerly called Project Morpheus) to the public for the first time at GDC 2014 (see leading photo). Since then he’s appeared numerous times to herald the headset and has carefully followed the evolution of the VR and AR landscape by attending and participating in the industry’s top conferences.


It’s doubtful that Yoshida has any specific knowledge of an Apple AR or VR device announcement, but his prediction certainly carries the wisdom of a longtime Sony veteran who is carefully considering PlayStation’s forward-looking VR strategy with regard to Apple’s possible entry into the marketplace.

Do you think Apple will come to the market with an AR/VR product this year? Let us know in the comments.


VR Interface Design Contest with $10,000 in Cash Prizes Launched by Purple Pill

Immersive content agency Purple Pill has announced a VR interface design competition and is offering $10,000 in cash prizes to those who create the best virtual reality interfaces.

From gaze-based interaction modalities to laser pointer menus to skeuomorphic knobs and buttons, today’s VR interfaces are all over the place. Even from one motion controller to the next, VR interface designs don’t agree on the best way to pick up and hold virtual objects. It’s going to take time before reaching any sort of consensus on VR interface design, but Purple Pill is hoping to spur things along.

Most of today’s VR interfaces are carryovers from screen-based interfaces

The company has announced a VR interface design contest that begins today and runs until March 15th. Entries will be judged on Usability, Design, Creativity, and Performance. The first place prize is $7,500 in cash and the second place prize is $2,500.

“The majority of interfaces we see in the current generation of VR apps are confusing and rather plain. They’re usually not much more than a floating plane with some text on it,” says Purple Pill’s Nick Kraakman. “With this competition we want to stimulate designers and developers from around the world to come up with fresh ideas about UI’s in VR and create some innovative designs that push the boundaries of this exciting medium.”


What’s the catch? Well, Purple Pill isn’t quite doing this just out of the goodness of their hearts—entries must be based on the company’s Headjack Unity API, a foundation for creating cross-platform VR apps that include 360 video.

The contest’s official rules require that each entry:

• Is created using the Headjack Template API
• Runs smoothly and without frame drops
• Is submitted during the Competition Period
• Is added to the Marketplace as a Public free template

Although not part of the official rules, Purple Pill says entries “Should have support for mobile VR.” The rules further say that entries can be submitted in the following way:

  1. Sign up for a free Headjack account on https://app.headjack.io
  2. Create a VR Template using the Headjack API found at https://headjack.io/docs
  3. Upload the template to the Headjack Template Marketplace at https://app.headjack.io/#/templates/my-templates/add as a Public free template

Starting today, participants can submit any number of entries but are only eligible to win one of the two cash prizes. Purple Pill says that the winners will be announced one month after the March 15th submission deadline.
