Microsoft Files ‘Virtual Reality Floor Mat’ Patent, Possibly Aimed at an Xbox VR Headset

A Microsoft patent application for a ‘virtual reality floor mat’ describes a special mat which defines a virtual playspace and can be used for tracking.

Spotted by Twitter user WalkingCat, a Microsoft patent application published last week describes a “virtual reality floor mat activity region,” a special mat that would both define the virtual playspace and be used for tracking.

The artwork in the patent clearly suggests a living room environment where the VR experience is powered by a game console. Microsoft has had a confusing relationship with VR on Xbox; the company announced way back in 2016 that VR support was coming to Xbox One X, but then suddenly backpedaled on that decision. It has remained mum on the subject ever since, even in the face of the next-gen Xbox ‘Project Scarlett’, which was revealed earlier this year.

SEE ALSO
Next-gen Xbox Revealed with No VR in Sight, Sony Forges Ahead with PSVR on PS5

Microsoft’s virtual reality mat patent application is a curious one. With inside-out tracking increasingly becoming the norm on VR headsets (including Microsoft’s own Windows VR headsets, which were among the first to market with the tech), it’s not immediately clear why a special mat would be necessary, but the patent application reveals some interesting possibilities.

Image courtesy Microsoft

First, the application describes a mat with active or passive markers which would make its location known to headsets. While inside-out tracked headsets need to have their playspace boundary defined, having a mat in your home would be a ‘set it and forget it’ approach to defining the playspace. Not only would this eliminate occasions where a system without such a mat might forget the boundary and require the user to reconfigure it, but the mat-defined boundary would also be accessible to any headset without per-device configuration, similar to the way that Valve’s SteamVR Tracking system can be used by any device that comes into the area without any pairing or configuration with the tracking system.

With the mat you could even drop inside-out tracking altogether; the mat could function as an outside-in tracking system in which the headset looks only for the markers on the mat and nothing else. This would reduce the compute requirements compared to inside-out tracking, which in turn could reduce the size and cost of a compatible headset and other devices. For instance, you could create a much less expensive controller which is tracked against the mat’s markers rather than relying on its own on-board compute.
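To make the idea concrete, here is a minimal sketch of how a headset camera could recover its own pose from mat markers at known positions, using the standard perspective-n-point approach (via OpenCV). The marker layout, camera intrinsics, and synthesized detections below are invented for illustration; the patent doesn’t specify an algorithm, so this should not be read as Microsoft’s implementation.

```python
import numpy as np
import cv2

# Known 3D marker positions on the mat, in the mat's own coordinate frame (meters).
mat_markers_3d = np.array([[0.0, 0.0, 0.0],
                           [1.0, 0.0, 0.0],
                           [1.0, 0.0, 1.5],
                           [0.0, 0.0, 1.5]], dtype=np.float64)

# Assumed (calibrated) pinhole intrinsics for the headset camera.
camera_matrix = np.array([[500.0, 0.0, 480.0],
                          [0.0, 500.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

# For the demo, synthesize where the markers would appear if the headset were
# about 1.6 m above the center of the mat, looking straight down at it.
true_rvec = np.array([[-np.pi / 2], [0.0], [0.0]])
true_tvec = np.array([[-0.5], [-0.75], [1.6]])
detected_2d, _ = cv2.projectPoints(mat_markers_3d, true_rvec, true_tvec,
                                   camera_matrix, dist_coeffs)

# In a real system these 2D points would come from finding the (active or
# passive) markers in the camera image; here we reuse the synthesized points.
ok, rvec, tvec = cv2.solvePnP(mat_markers_3d, detected_2d, camera_matrix, dist_coeffs)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    headset_pos = (-R.T @ tvec).ravel()  # camera position in the mat's frame
    print("Headset position relative to the mat:", headset_pos)
```

The same solve works for any device that can see the mat, which is why a shared, marker-defined coordinate system would not require per-device configuration.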

The mat would also facilitate a static and shared coordinate system which would be useful for multi-user experiences where two or more players are using a headset at the same time in the same space. This could also extend to AR devices like HoloLens, and even the tracking of non-immersive devices like smartphones or tablets, making possible various hybrid experiences.

And finally, there’s the human affordance factor. A mat on the floor makes clear to everyone else in the room where they shouldn’t stand if they don’t want to risk a haymaker to the noggin when someone is playing a VR boxing game or similar.

The patent application also considers a version of the mat which would be based on tiles which could be configured into novel shapes and expanded as needed by users.

Image courtesy Microsoft

Much of the functionality described in the patent sounds very similar to the approach of Antilatency, which has developed a tile-based outside-in tracking system with minimal compute requirements that can function well with just a single camera on the headset. The system is inherently expandable, doesn’t require per-device configuration, supports arbitrary numbers of users, and can be configured into novel shapes. The company is already selling dev kits for the system, which is aimed at enterprise and commercial use-cases.

Image courtesy Antilatency

Microsoft’s virtual reality mat patent application is attributed to inventors Julia Schwarz, Principal Software Engineer on HoloLens, and Jason Michael Ray, Software Engineer on HoloLens. The application was filed in April 2018 and published on October 3rd, 2019, though it has not yet been granted.

As ever, it’s worth pointing out that large companies file hundreds if not thousands of patents each year as a strategic exercise; there’s no telling whether or not something described in a patent will manifest in a product. Without additional evidence, patents are best interpreted as a glimpse into what a company is thinking rather than what it is doing.

The post Microsoft Files ‘Virtual Reality Floor Mat’ Patent, Possibly Aimed at an Xbox VR Headset appeared first on Road to VR.

OC6: Mark Zuckerberg Teases Finger Tracking For Keynote

On the eve of Oculus Connect 6, Facebook CEO Mark Zuckerberg teased apparent finger tracking ahead of his talk at the event’s keynote.

Zuckerberg posted a video of two virtual hands making the letters O and C and then holding up six fingers, to symbolize Facebook’s OC6 VR developer conference. We can’t think of any way to interpret this other than Zuckerberg teasing some kind of finger tracking to be announced during the keynote.

The video was accompanied by the following blurb:

Putting the final touches on my talk for Oculus Connect tomorrow. I’m excited to share our latest work in augmented and virtual reality. Tune in here at 10am PT to watch the keynote live.

As Zuckerberg states, he will be speaking during OC6’s opening keynote which begins at 10am PT, available to stream on the Oculus Facebook page, as well as in Oculus Venues and in Bigscreen on VR devices.

The tease from Zuckerberg comes soon after we reported that Facebook is in the final stages of acquiring a New York-based company called CTRL Labs. The company made an armband that enables finger tracking by reading electrical signals inside the user’s arm. Facebook also filed a patent of its own earlier this year depicting a similar concept.

Earlier today, a ‘game changing’ reveal at OC6 was also teased by some Oculus software designers. One of those designers, Eugene Krivoruchko, previously worked on software for hand-tracking at Leap Motion.

We’ll have more information for you tomorrow from the floor at OC6 where we will be live tweeting all the announcements and providing regular updates. You can read our best guess predictions for what to expect and take a look at the six sessions you don’t want to miss.

The post OC6: Mark Zuckerberg Teases Finger Tracking For Keynote appeared first on UploadVR.

OptiTrack Shows Hundreds of Simultaneously Tracked Objects in a Single VR Experience

OptiTrack at GDC last week showed off a demonstration of their enterprise/commercial tracking technology, which is capable of accurately tracking hundreds of objects simultaneously in real-time. For those building out-of-home VR applications, OptiTrack now offers a set of add-ons and a suite of tools for tracking Rift and Vive headsets, bodies, and props, including a SteamVR plugin.

At the company’s GDC booth last week, OptiTrack had one of their modular tracking volumes set up with 25 cameras tracking 140 real-world objects simultaneously. Most of the objects were custom-made giant Jenga blocks (each individually tracked), along with VR headsets, gloves, and more.

Wearing a wireless HTC Vive Pro (with an OptiTrack tracking add-on), I was able to play a complete game of Jenga purely by relying on the mediated information coming through the headset.

The demo showed the OptiTrack system’s ability to accurately track objects with only a single active marker visible, thanks to an array of cameras which can triangulate that marker’s position, plus an on-board IMU for rotation and acceleration. Robustness against occlusion comes from brute-forcing the problem with highly redundant camera coverage.
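For a sense of the underlying geometry, here is a toy sketch of how multiple calibrated cameras can pin down a single marker: each camera that sees the marker defines a ray, and the marker’s 3D position is the point that best fits all of those rays. The camera layout and numbers are made up, and this is not OptiTrack’s actual solver, just the basic triangulation that redundant coverage relies on.

```python
import numpy as np

def triangulate_marker(camera_centers, ray_directions):
    """Least-squares intersection of viewing rays.

    camera_centers: (N, 3) camera positions; ray_directions: (N, 3) vectors
    pointing from each camera toward the marker it detected."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(camera_centers, ray_directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector onto the plane perpendicular to the ray
        A += P
        b += P @ c
    # Point minimizing the summed squared distance to all rays.
    return np.linalg.solve(A, b)

# Example: three cameras around the volume all see the same marker near (1, 1, 1).
centers = np.array([[0.0, 0.0, 3.0], [4.0, 0.0, 3.0], [2.0, 4.0, 3.0]])
target = np.array([1.0, 1.0, 1.0])
rays = target - centers
print(triangulate_marker(centers, rays))  # ~[1, 1, 1]
```

With 25 cameras covering the volume, losing the marker in most views still leaves enough rays for a solid solve, which is the “brute force” the demo leaned on.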

The result was that at the booth, even with lots of attendees milling about and playing interactively with the Jenga blocks (which were largely occluded except on the sides, thanks to being stacked in towers), the system maintained high-performance tracking.

To demonstrate the tracking performance, the OptiTrack team also set up the blocks as dominoes to knock them down. While it looks outwardly like a physics simulation rendered in the computer, there are actually no physics calculations happening at all, just raw motion tracking information showing how the blocks are individually moving in the real world.

With the ability to track hundreds of objects at once, in a VR context one might imagine an entire ‘set’ of props in a single room—chairs, desks, balls, guns, swords, and other objects—all of which could be tracked in addition to multiple users in VR headsets which could cooperatively use the objects as part of an entertainment experience. OptiTrack says they haven’t found a practical limit on the number of tracked objects their system can support, even after experimenting with more than 300 at a time.

In the last few years OptiTrack has been building out its software and hardware to support out-of-home XR applications. The company now sells ‘snap-on’ faceplates for both the Rift and Vive, which augment the headsets with the markers needed for large-scale OptiTrack tracking, as well as ‘pucks’ (self-contained trackers for attaching to arbitrary objects) and an ‘active tag’ which can be used to create custom-made tracked objects (like the Jenga blocks shown at GDC).

SEE ALSO
Oculus Research Devises High-accuracy Low-cost Stylus for Writing & Drawing in VR

OptiTrack also says they’ve built an OpenVR plugin which allows their tracking to transparently replace SteamVR Tracking, enabling compatibility between OptiTrack tracking and SteamVR content, which is useful for developers building out-of-home VR content for Vive headsets.

A minimum of three OptiTrack cameras are required for tracking, the company says, with arbitrary numbers of additional cameras able to be added for larger tracking volumes and robustness against occlusion. While the hardware is no doubt expensive (with single cameras starting at $1,500), the company’s tracking platform is becoming an increasingly comprehensive and scalable solution for large-scale, high precision XR tracking.

The post OptiTrack Shows Hundreds of Simultaneously Tracked Objects in a Single VR Experience appeared first on Road to VR.

Report: New Oculus ‘Rift S’ Headset to be Revealed at GDC Next Week

According to a report by UploadVR, citing emails sent from Facebook to Oculus developers, the rumored Rift S headset will be revealed next week at GDC.

Rift S was first brought to light in a TechCrunch report last year about the leadership shakeup at Oculus and purported shifting product plans. The report said that Facebook/Oculus had decided to cancel a larger ‘Rift 2’ overhaul in favor of a more modest product refresh called the ‘Rift S’.

An UploadVR report this evening cites an email sent from Facebook to VR developers which “suggests ‘Rift S’ will be formally revealed at GDC 2019,” the annual Game Developers Conference hosted next week in San Francisco. Oculus is already confirmed as attending the event but hasn’t said anything about the potential of a new headset being revealed. We’ve reached out to Oculus for comment.

Origin of the Rift S Rumors

In the TechCrunch report last year, citing “a source familiar with the matter,” it was stated that the Rift S would likely bump the resolution of the headset and move to an inside-out tracking system which would ditch the external sensors—which the headset needs to track its position—in favor of on-board cameras which could do the same job while simplifying setup and usage of the headset. Oculus has already demonstrated its inside-out tracking technology, which it calls ‘Insight’, on its upcoming standalone VR headset, Quest.

SEE ALSO
Oculus Quest Hands-on and Tech Details

Last month, UploadVR uncovered code in the Oculus software referencing the Rift S explicitly. Rumors of the updated headset have further been stoked by sporadic availability of stock for the original Rift headset in recent weeks, as well as a quiet price drop on the headset from $400 to $350 back in January.

Original Rift Going Strong Despite Age

Image courtesy Oculus

The Rift is Oculus’ first, and so far only, PC VR headset. It was released back in 2016, and sports a 1,080 × 1,200 per-eye resolution and a ~100 degree diagonal field of view. At launch, the headset was priced at $600 and included an Xbox One controller as the primary input device, with content primarily designed for seated gameplay.

The original Oculus Rift shipped with an Xbox One controller in the box for seated gameplay. | Photo by Road to VR

It wouldn’t be until the end of 2016 that Oculus would launch the now standard ‘Touch’ motion controllers for $200, which have become the headset’s primary input device (with the Xbox One controller eventually removed entirely from the package) and pivoted the vast majority of content toward standing ‘front-facing’ gameplay with motion input.

The Rift supports ‘room-scale’ tracking, but it requires an optional third sensor and routing a USB cable from the far corner of the playspace back to the host PC.

While the default setup for the two included sensors offers front-facing gameplay, the system supports standing 360 gameplay with opposing sensors, or full ‘room-scale’ gameplay with an optional third sensor placed in a corner of the playspace. As the sensors need to plug into the host PC, adding a third sensor for a larger playspace can be a pain because it means running a cable across the room. Inside-out tracking—which does away with external sensors in favor of cameras mounted on the headset itself—would make room-scale tracking the default, while simplifying the setup and usage.

Microsoft and its hardware partners were the first to debut consumer PC VR headsets featuring inside-out tracking back at the end of 2017, with many agreeing that it improved ease of use; compared to headsets with external tracking equipment, the Windows VR headsets simply plug into the PC via a single tether and are ready for room-scale tracking. However, controller tracking is made more difficult with inside-out tracking because players can more easily block the on-headset cameras from seeing the controllers. This can pose challenges for certain games which ask the player to move their hands close to their body (or in some cases behind their back or over their shoulder).

Despite its age and lack of a successor three years later, the Rift is among the leading consumer VR headsets overall, and the most popular headset in use on Steam. Oculus has steadily cut prices on the Rift from the initial $800 price point (for the headset and Touch controllers) all the way down to $350 as of January 2019. While the Rift’s industrial design has held up well, three years on there are a handful of headsets offering higher resolutions and wider fields of view.

Rift S Expectations

Pictured: Oculus Quest. Rift S is expected to look similar to Quest, with cameras on the headset for inside-out tracking, and ‘reversed’ Touch controllers which better position the hidden IR LEDs to be seen by the headset’s on-board cameras. | Image courtesy Oculus

The Rift S isn’t expected to be a sequel to the original Rift so much as a refresh. Aside from the inside-out ‘Insight’ tracking, it’s expected that the headset’s resolution will be bumped and that newer optics will be used, possibly the same (or similar) as those used in Oculus Go or Quest, which the company has called its “best ever.”

In the resolution department, it seems likely that the Rift S will wind up with the same display as Quest (1,600 × 1,440 per eye), which would be a nice step up, and put the headset on par with the Vive Pro and Samsung Odyssey in terms of resolution. Alternatively, Oculus could try to get ahead of the competition by adopting even higher-res displays, like the 2,160 × 2,160 panels seen in the upcoming HP ‘Copper’ headset, though this would mean a move from OLED to LCD; so far Oculus and others have mostly chosen OLED displays for higher-end headsets, though there remain pros and cons to consider with regard to LCD.

SEE ALSO
Valve Psychologist to Explore Brain-Computer Interface Research at GDC

What isn’t expected to change (at least not by much) is the headset’s ~100 degree field of view. While Oculus itself has shown off the ‘Half Dome’ prototype headset with a 140 degree field of view, expanding the field of view would require more significant changes to the headset’s optics and displays, likely putting it out of scope for a ‘Rift S’ refresh. Pimax is already offering an ultra-wide FOV VR headset, but other consumer headsets remain largely in the ~100 degree FOV class. No improvement in field of view could leave many early adopters wanting, as resolution and FOV are among the most vocally requested improvements.

Eye-tracking is another feature which is up in the air for Rift S. On one hand, eye-tracking is a game-changing technology that’s expected to play a big role in the future of VR—and having eye-tracking in a real product could provide Oculus with real-world data to further hone the tech—but on the other hand the company might withhold eye-tracking until it can provide a complete package with varifocal displays, as seen in Half Dome.

– – — – –

GDC 2019 is being held next week from March 18th to 22nd in San Francisco, CA. Road to VR will be on the ground to bring you the most important news from the event. Stay tuned.

The post Report: New Oculus ‘Rift S’ Headset to be Revealed at GDC Next Week appeared first on Road to VR.

Vive Focus 6DOF Controller Dev Kit Uses Ultrasonic Tracking

HTC earlier this month revealed a 6DOF controller dev kit for the Vive Focus standalone headset. New details have emerged about the device this week.

When HTC revealed the Vive Focus 6DOF controller dev kit earlier this month, the company wasn’t ready to share details. This week at XRDC in San Francisco, the company spoke more about the dev kit and noted that between the Vive Focus and other Vive Wave powered headsets, consumers are likely to see a number of different 6DOF controller tracking technologies accompanying different headsets.

HTC’s Viveport President, Rikard Steiber, said during a presentation today that the Vive Focus 6DOF controller dev kit uses a combination of ultrasonic tracking and IMUs to track the user’s hands. Ultrasonic tracking systems use sound waves at frequencies above the range of human hearing, typically using a series of receivers to measure differences in the timing of ultrasonic pulses emitted by the tracked object in order to triangulate its position.
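As a rough illustration of that principle (not HTC’s implementation, whose details aren’t public), the sketch below converts pulse timings into distances and trilaterates the emitter’s position from a handful of receivers at assumed positions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # meters/second at room temperature

def trilaterate(receivers, times_of_flight):
    """Solve for the emitter position from per-receiver time of flight.

    receivers: (N, 3) receiver positions on the headset (meters).
    times_of_flight: (N,) pulse travel times (seconds)."""
    d = SPEED_OF_SOUND * np.asarray(times_of_flight)  # range to each receiver
    r0, d0 = receivers[0], d[0]
    # Subtracting the first range equation from the rest removes the quadratic
    # term, leaving a linear system A x = b in the unknown emitter position x.
    A = 2.0 * (receivers[1:] - r0)
    b = (np.sum(receivers[1:] ** 2, axis=1) - np.sum(r0 ** 2)
         - d[1:] ** 2 + d0 ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four receivers (placed non-coplanar here so the linear solve is unique),
# and a controller emitting from roughly 40 cm in front of the headset.
receivers = np.array([[-0.08, 0.00, 0.00],
                      [ 0.08, 0.00, 0.00],
                      [ 0.00, 0.05, 0.02],
                      [ 0.00, -0.05, 0.03]])
emitter = np.array([0.10, -0.20, 0.40])
tof = np.linalg.norm(receivers - emitter, axis=1) / SPEED_OF_SOUND
print(trilaterate(receivers, tof))  # recovers ~[0.10, -0.20, 0.40]
```

In practice the position estimate is fused with the controller’s IMU, which is what HTC describes the dev kit as doing.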

Steiber noted that the system’s tracking field of view is 180 degrees horizontally and 140 degrees vertically, and that it’s capable of “high accuracy” up to one meter from the headset.

While the Vive Focus ships with a 3DOF controller, the 6DOF controller dev kit includes two new controllers and a large module which is mounted to the headset. US developers can sign up to receive one here.

We haven’t had a chance to try the controller tracking system yet, but aren’t entirely surprised to find that it’s based on ultrasonic tracking considering that it’s among the options offered by Qualcomm’s Snapdragon VRDK, which we understand Vive Focus to be based on.

Ultrasonic tracking is not new by any means; we’ve seen VR trackers based on the technology in recent years, and the tech was employed for similar purposes long before the modern era of VR. Pico Neo was one of the first modern standalone headsets we’ve seen using ultrasonic tracking for 6DOF input, though our hands-on with the headset earlier this year didn’t inspire much confidence in the controller tracking.

SEE ALSO
Google Reveals Experimental 6DOF Controllers for Lenovo Mirage Solo

Generally speaking, the capabilities of ultrasonic tracking have been considered insufficient as a head-tracking solution for high-end VR headsets, though hand-tracking is less sensitive to latency and inaccuracy, and could prove effective with the right implementation.

Steiber made a point to say that among headsets running Vive Wave (like Vive Focus), there will likely be several different 6DOF hand tracking solutions employed, but from a developer standpoint the platform aims to work seamlessly with all of them.

While the Vive Focus is available in China as a consumer-ready product, in the US and elsewhere it’s still a developer kit only. With hand input still in flux, it seems it may remain that way for some time still.

The post Vive Focus 6DOF Controller Dev Kit Uses Ultrasonic Tracking appeared first on Road to VR.

Oculus Quest Hands-on and Tech Details

Oculus Quest was the main attraction at Oculus Connect 5 this week. Following the high-end standalone headset’s reveal, attendees of the conference got to try several demos to experience the headset’s inside-out positional head & hand tracking, including in an ‘arena-scale’ setting. We also got a handful of interesting details about the headset’s specs and capabilities.

Oculus Quest (formerly Project Santa Cruz) is officially set to launch in Spring 2019, priced at $400. While that’s twice the cost of the company’s lower-end Go headset, it could certainly be worth it for the much more immersive class of games that comes with positional tracking (which the Go lacks). But that will only be true if the inside-out tracking tech, which Oculus calls ‘Insight’, can really deliver.

Quest ‘Insight’ Tracking

Photo by Road to VR

Insight seems to be shaping up to be the best inside-out head and hand-tracking that I’ve seen to date. I say “seems” because I haven’t had a chance to test the headset’s tracking in a non-demo environment. The tracking system relies on recognizing features of the surrounding environment to determine the headset’s position in space; if you played in an empty room with shiny, perfectly lit white walls, it probably wouldn’t work at all since there wouldn’t be sufficient feature density for Insight to know what’s going on. Demo environments are often set up as best-case scenarios, and for all I know something in my house (or anyone’s house) could really throw the system off.

Oculus claims they’re tuning the headset’s tracking to be robust in a wide range of scenarios, and as long as that remains consistently true, Insight will impress many. While other inside-out tracking systems either lack positional controller tracking entirely, or require some compromise on the size of the controller tracking volume, Quest’s four cameras, mounted on the corners of the headset, cover an impressively wide range that I effectively couldn’t defeat. A simple test I’ll often do with such systems is to move my outstretched arm as far outside of my own field of view as possible, move it around (hoping it has left the view of the tracking cameras), then bring it back into the tracking volume from some other angle. Generally I’m trying to see my hand ‘pop’ into existence at that new point of entry, as the cameras pick it up and realize it wasn’t where they saw it last.

Despite my efforts, I couldn’t manage to make this happen. By the time my hand came anywhere near my own field of view, the hand was already re-acquired and placed properly (if it had even left the tracking volume at all). I would need to design a special test (using something like a positional audio source emitted from my virtual hand) to find out if I was even able to get my hand outside of the tracking volume, short of putting it directly behind my head or back.

So that means there are vanishingly few situations where tracking is going to harm your gameplay, even in situations that would normally be cited as problematic for inside-out tracking systems, like throwing a frisbee or hip-firing a gun. Two specific scenarios that I haven’t had a chance to test just yet (but believe will be important to do so) are shooting a bow and aiming down the sights of a two-handed weapon. In both scenarios, one of your hands typically ends up directly in front of, or next to, your face/headset, which could be a challenging situation for the tracking system. I’ll certainly test those situations next time I have Quest on my head.

In any case, it feels like Oculus has done a very good job with Quest and its Insight tracking system. Assuming Quest can achieve a consistently high level of robustness once it finds itself in a huge variety of rooms and lighting situations, I think the vast majority of players wouldn’t be able to reliably tell the difference between Quest’s inside-out tracking and Rift’s outside-in tracking in a blind test.

And that has big benefits beyond just getting rid of the external trackers. For one, it means the device has 360 roomscale tracking by default rather than this being dependent on how many sensors a Rift user chooses to buy and how they decide to set them up. Additionally, it means players can easily play in much larger spaces than what was previously possible with the Rift.

At Connect I played a few demos with Quest, one of which was Superhot VR. The game was demoed in a larger-than-roomscale space (about the size of a large two car garage) and I was free to walk wherever I wanted within that area. When it came to hand-tracking, I played the game exactly like I’ve played it on the Rift many times before, without noticing any issues. Being used to tethered headsets, it was also incredibly freeing to take a few steps in one direction and not see a Guardian/Chaperone boundary, then simply keep walking for many more steps before needing to think about the outside world.

Oculus took this to the extreme at Connect in an experiment they put together to show how Quest tech is capable of ‘arena-scale’ tracking. They created a large arena space, about the size of a tennis court, and put six players wearing Quests inside to play a special version of Dead and Buried. Physical cover (boxes of varying sizes and shapes) covered the area, and players could physically walk anywhere around the arena and use the cover while shooting at the other team.

A rendering of the arena set up at Oculus Connect for ‘Dead and Buried Arena’. I added a little red stick-person for scale. | Image courtesy Oculus

Through my time in this demo I didn’t see any jumping in the positional head tracking, even while I walked 10 to 15 feet at a time from one piece of cover to the next.

So, Quest is shaping up to deliver the same kind of quality positional tracking experience that most of us associate with high-end tethered headsets today, but now with more freedom and greater convenience. That’s a big deal at a $400 all-in price point.


The post Oculus Quest Hands-on and Tech Details appeared first on Road to VR.

Oculus Quest Supports “Arena-scale” Tracking and Multi-room Guardian

Oculus today offered up some details on the ‘Insight’ tracking system of their new Quest headset. The inside-out tracking goes “beyond roomscale,” according to Oculus, enabling large-scale experiences that let players physically move around large spaces. Additionally, Oculus Quest is equipped with the familiar Guardian system to keep you from bumping into your surroundings, now upgraded to support multiple rooms.

Oculus Quest uses four wide-angle cameras to not only track the Touch controllers, but also to track the environment around the user in order to understand where the headset is located in space.

This ‘inside-out’ tracking, which Oculus calls ‘Insight’, means that users don’t need to rely on external sensors or trackers, freeing them to move around large scale spaces rather than being constrained to a room-scale tracking volume. As such, Oculus says that the headset can support “arena-scale” tracking.

At Oculus Connect the company is showing a multiplayer VR FPS arena for Dead and Buried that occupies several thousand square feet, allowing players to have a VR laser tag-like experience where they are physically moving around a large space with real objects for cover, all while battling in VR. At this early stage, it sounds like this kind of large-scale tracked experience is still experimental, but something the company is hoping to expand upon going forward.

SEE ALSO
Oculus Announces Quest, The High-end Standalone Headset Starting at $400

For consumers in their home, Oculus says that Quest will support a multi-room Guardian system. Oculus Rift users will be familiar with Guardian, which allows them to trace an outline of their physical space in order to set a safe boundary for their movements in virtual reality. When approaching the edge of the boundary, the Guardian system shows itself as a virtual wall that looks like a blue grid, preventing you from running into a wall or punching your computer monitor (usually).
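As a loose illustration of how such a boundary might behave (Oculus hasn’t published Guardian’s internals, so this is purely a toy model), the sketch below fades in a warning grid as the tracked position approaches the edges of a traced 2D boundary polygon; the polygon, threshold, and positions are invented for the example.

```python
import math

def distance_to_segment(p, a, b):
    """Shortest distance from point p to segment a-b (all 2D tuples, meters)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def guardian_alpha(position, boundary, warn_distance=0.4):
    """Return 0..1 opacity for the warning grid based on proximity to the boundary."""
    edges = zip(boundary, boundary[1:] + boundary[:1])
    nearest = min(distance_to_segment(position, a, b) for a, b in edges)
    return max(0.0, min(1.0, 1.0 - nearest / warn_distance))

# A 3 m x 2.5 m traced play area; the user is standing 0.2 m from one wall.
boundary = [(0.0, 0.0), (3.0, 0.0), (3.0, 2.5), (0.0, 2.5)]
print(guardian_alpha((0.2, 1.2), boundary))  # ~0.5, grid half visible
```

A multi-room version would additionally need to recognize which saved boundary applies to the space the headset currently finds itself in, which is the part Oculus says Quest handles automatically.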

Since Oculus Quest offers ostensibly unconstrained movement via inside-out tracking, Oculus has announced a multi-room Guardian system. The company says that this will allow users to set up Guardian spaces in multiple rooms, and Quest will remember those setups and automatically identify which room the user is in to show them the right Guardian boundary.

It remains to be seen how well this will work, as it’s surely challenging to reliably identify rooms in various lighting conditions and when objects could move from one session to the next. We’ll be going hands-on with Quest soon to bring you more details on how it all works.

The post Oculus Quest Supports “Arena-scale” Tracking and Multi-room Guardian appeared first on Road to VR.

Google Demonstrates Promising Low-cost, Mobile Inside-out Controller Tracking

A number of standalone VR headsets will be hitting the market in 2018, but so far none of them offer positional (AKA 6DOF) controller input, one of the defining features of high-end tethered headsets. But we could see that change in the near future, thanks to research from Google which details a system for low-cost, mobile inside-out VR controller tracking.

The first standalone VR headsets offering inside-out positional head tracking are soon to hit the market: the Lenovo Mirage Solo (part of Google’s Daydream ecosystem), and HTC Vive Focus. But both headsets have controllers which track rotation only, meaning that hand input is limited to more abstract and less immersive movements.

Detailed in a research paper (first spotted by Dimitri Diakopoulos), Google says that the lack of 6DOF controller tracking on many standalone headsets comes down to hardware expense, computational cost, and occlusion issues. The paper, titled Egocentric 6-DoF Tracking of Small Handheld Objects, goes on to demonstrate a computer-vision based 6DOF controller tracking approach which works without active markers.

Authors Rohit Pandey, Pavel Pidlypenskyi, Shuoran Yang, and Christine Kaeser-Chen, all from Google, write, “Our key observation is that users’ hands and arms provide excellent context for where the controller is in the image, and are robust cues even when the controller itself might be occluded. To simplify the system, we use the same cameras for headset 6-DoF pose tracking on mobile HMDs as our input. In our experiments, they are a pair of stereo monochrome fisheye cameras. We do not require additional markers or hardware beyond a standard IMU based controller.”

The authors say that the method can unlock positional tracking for simple IMU-based controllers (like Daydream’s), and they believe it could one day be extended to controller-less hand-tracking as well.

SEE ALSO
Qualcomm Snapdragon 845 VRDK to Offer Ultrasonic 6DOF Controller Tracking

Inside-out controller tracking approaches like Oculus’ Santa Cruz use cameras to look for IR LED markers hidden inside the controllers, and then compare the shape of the markers to a known shape to solve for the position of the controller. Google’s approach effectively aims to infer the position of the controller by looking at the user’s arms and hands instead of glowing markers.

To do this, they captured a large dataset of images from the headset’s perspective, which show what it looks like when a user holds the controller in a certain way. Then they trained a neural network—a self-optimizing program—to look at those images and make guesses about the position of the controller. After learning from the dataset, the algorithm can use what it knows to infer the position of the controller from brand new images fed in from the headset in real time. IMU data from the controller is fused with the algorithm’s positional determination to improve accuracy.
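The paper doesn’t spell out the fusion filter, but the general idea can be sketched with a simple complementary filter: high-rate IMU data propagates the position estimate between camera frames, and each per-frame vision estimate pulls the state back toward the measurement. Everything below (class name, rates, gain) is illustrative and not Google’s actual pipeline.

```python
import numpy as np

class VisionImuFuser:
    def __init__(self, blend=0.9):
        self.pos = np.zeros(3)   # meters
        self.vel = np.zeros(3)   # meters/second
        self.blend = blend       # how much to trust the IMU-propagated state vs. vision

    def imu_update(self, accel_world, dt):
        """Propagate with gravity-compensated acceleration in the world frame (high rate)."""
        self.vel += accel_world * dt
        self.pos += self.vel * dt

    def vision_update(self, measured_pos):
        """Correct drift with the network's per-frame position estimate (camera rate)."""
        self.pos = self.blend * self.pos + (1.0 - self.blend) * measured_pos

fuser = VisionImuFuser()
for _ in range(10):                                   # ten 100 Hz IMU samples...
    fuser.imu_update(np.array([0.0, 0.0, 0.5]), dt=0.01)
fuser.vision_update(np.array([0.0, 0.0, 0.01]))       # ...then one 30 Hz vision fix
print(fuser.pos)
```

The practical point is that the neural network only needs to supply a position at camera frame rate; the IMU fills in the motion between frames.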

Image courtesy Google

A video, which has since been removed, showed the view from the headset’s camera, with a user waving what looked like a Daydream controller around in front of it. Overlaid onto the image was a symbol marking the position of the controller, which impressively managed to follow the controller as the user moved their hand, even when the controller itself was completely blocked by the user’s arm.

Image courtesy Google

To test the accuracy of their system, the authors captured the controller’s precise location using a commercial outside-in tracking system, and then compared it to the results of their computer-vision tracking system. They found a “mean average error of 33.5 millimeters in 3D keypoint prediction” (a little more than one inch). Their system runs at 30FPS on a “single mobile CPU core,” making it practical for use in mobile VR hardware, the authors say.
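For reference, a mean 3D keypoint error of this kind is typically just the average Euclidean distance between predicted and ground-truth positions; a minimal sketch (with made-up numbers, not the paper’s data):

```python
import numpy as np

def mean_keypoint_error(predicted, ground_truth):
    """Both arrays are (N, 3) positions in millimeters."""
    return float(np.mean(np.linalg.norm(predicted - ground_truth, axis=1)))

pred = np.array([[10.0, 0.0, 5.0], [120.0, 40.0, -30.0]])
gt   = np.array([[ 0.0, 0.0, 0.0], [100.0, 40.0,   0.0]])
print(mean_keypoint_error(pred, gt), "mm")  # ~23.6 mm for these toy values
```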

Image courtesy Google

And there are still improvements to be made. Interpolation between frames is suggested as a next step and could significantly speed up tracking, as the current model predicts position on a frame-by-frame basis rather than sharing information between frames, the team writes.

As for the dataset which Google used to train the algorithm, the company plans to make it publicly available, allowing other teams to train their own neural networks in an effort to improve the tracking system. The authors believe the dataset is the largest of its kind, consisting of some 547,000 stereo image pairs, labeled with the precise 6DOF position of the controller in each image. The dataset was compiled from 20 different users performing 13 different movements in various lighting conditions, they said.

– – — – –

We expect to hear more about this work, and the availability of the dataset, around Google’s annual I/O developer conference, hosted this year May 8th–10th.

The post Google Demonstrates Promising Low-cost, Mobile Inside-out Controller Tracking appeared first on Road to VR.

VRgineers to Integrate Leap Motion Hand-tracking into its Wide FOV VRHero Headset

Enterprise VR headset manufacturer VRgineers and Leap Motion, the company behind its eponymous optical hand tracker, announced they’re working together to embed Leap Motion’s tech into professional-grade VR headsets.

VRgineers is best known for their enterprise-focused wide field of view (FOV) VRHero 5K Plus headset, which boasts a 170-degree FOV and dual 2,560 × 1,440 OLED displays. The company is currently working with Leap Motion to bring the next generation of its professional VR headsets to market later this year with integrated Leap Motion technology.

“Embedding Leap Motion’s next generation hand-tracking technology directly into future VRgineers headsets will allow professional users to interact with VR applications completely naturally using their hands,” says Marek Polcak, CEO and co-founder of VRgineers. “Having this technology is especially important for us because our high-resolution headsets allow users to see 170º horizontally. Now they can move their virtual hands throughout the headset’s entire field of view without losing sight of them.”

Leap Motion controller mounted on an Oculus Rift DK2 | Image courtesy Leap Motion

Leap Motion’s hand tracker, aka the ‘Leap Motion Controller’, is a small USB device that can be mounted on VR headsets via a plastic mounting kit. The Controller has been available since July 2013, although it received a major software upgrade in 2016 that refined its made-for-VR hand-tracking engine.

The VRHero platform, VRgineers says in a statement, is presently used by automotive designers at companies such as BMW, Audi, and Volkswagen for design validation and evaluation, allowing them to accelerate the development of new prototypes. Adding hand-tracking, which inherently doesn’t require new users to learn control schemes, might make those professional use cases more approachable to the uninitiated.

The post VRgineers to Integrate Leap Motion Hand-tracking into its Wide FOV VRHero Headset appeared first on Road to VR.