You Can Now Set Up Oculus Quest’s Guardian With Controller-Free Hand Tracking

Facebook seems to be steadily turning the Oculus Quest standalone headset into a more advanced platform with each subsequent update to its system software.

Facebook says the latest update, version 16 of the Oculus Quest system software rolling out this week, lets you set up “your roomscale Guardian boundaries without controllers by using hand tracking instead.” That’s a pretty remarkable addition given that, since Quest started shipping in May 2019, it has already featured the slickest approach to room setup in VR. You put on the headset and then use the controllers to paint the outline of your available space on the floor. The system is intuitive, quick, and gets you into a VR experience much faster than, say, mounting cameras or lasers to your walls and then tracing your space before putting the headset on.
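Whether the outline is painted with a controller or a tracked hand, the resulting Guardian boundary is essentially a user-drawn polygon on the floor. As a rough illustration of the idea (our own sketch, not Oculus’ actual implementation), a standard ray-casting point-in-polygon test is all that’s needed to decide whether a headset or hand has left the painted play space:

```python
def inside_boundary(point, boundary):
    """Ray-casting point-in-polygon test.

    point:    (x, z) floor position of the headset or a hand.
    boundary: list of (x, z) vertices the user painted on the floor.
    Returns True if the point lies inside the painted outline.
    """
    x, z = point
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, z1 = boundary[i]
        x2, z2 = boundary[(i + 1) % n]
        # Count crossings of a horizontal ray cast from the point.
        if (z1 > z) != (z2 > z):
            x_cross = x1 + (z - z1) * (x2 - x1) / (z2 - z1)
            if x < x_cross:
                inside = not inside
    return inside

# A 2m x 2m square play space traced on the floor.
square = [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
```

An odd number of edge crossings means the point is inside the outline; this works for any simple (non-self-intersecting) shape the user traces, not just rectangles.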

With the addition of experimental controller-free hand tracking to the Quest in December, though, Facebook is building out a future ecosystem for VR (and AR) hardware that doesn’t even need controllers to operate. Some games are expected to be updated for compatibility with hand tracking, and the addition of Guardian setup without controllers is just one more piece of the puzzle.

Earlier this week an UploadVR analysis of driver code found by a developer suggested future VR headsets from Facebook might feature cameras which sample their surroundings at higher rates than the current Quest. That might result in more robust hand tracking in future systems. At the time of this writing, the hand tracking in Quest is still labeled “experimental” and your hands can seem to jitter or shake quite a bit while using the system. If you have a Quest you can download some hand tracking demos on SideQuest now, and we hope to see some improvements to the tracking quality before apps start adding support from the Oculus Store.

The v16 update also adds an AR-like view of your surrounding room as an Oculus Home option as well as the ability to open links in VR that are shared on Facebook.

The post You Can Now Set Up Oculus Quest’s Guardian With Controller-Free Hand Tracking appeared first on UploadVR.

Unplugged: Air Guitar Is Like Guitar Hero And Rock Band But Powered By Oculus Quest Hand Tracking

A video teaser popped up for a VR game on Oculus Quest that uses controller-free hand tracking for a familiar musical experience similar to Rock Band and Guitar Hero.

And it already carries the perfect name — Unplugged: Air Guitar.

The game is a side project by Ricardo Acosta, who works at Microsoft on the Maquette art app team; Unplugged is a personal project he’s been developing at night on his own time.

The project is still early in development and there are some major obstacles to overcome before it sees the light of day. There are, of course, questions surrounding copyright and which songs people will be able to play in the finished game. The developer is thinking of partnering with small bands who want to promote their music in the game.

Right now Oculus Quest hand tracking is a cool experiment but far from perfect. Musical instruments in particular test the quality of the hand tracking system, and experienced musicians suggest it is pretty limiting. Doing things like playing a physical piano with Quest hand tracking, for example, can only go so far with the current implementation. Nonetheless, there are glimmers of hope in ideas pairing music with VR hand tracking. One of our writers, Harry Baker, recently synced his physical piano to a SideQuest app and used it to learn the Tetris theme in VR.

Acosta says he’s thinking of taking Unplugged: Air Guitar to Kickstarter to crowdfund further development. The current idea is an evolution of an idea he explored several years ago with Vive wands and Oculus Touch controllers. You can see the earlier “Rock The Stage” idea in the video below:

As of this writing, Quest hand tracking software is still firmly in the experimental category. Facebook is only accepting apps to its Oculus Store that use the Touch controllers and meet its high bar for quality and market viability. That should change at some point, though, with some games expected to get new features that’ll make them compatible with the controller-free tracking mode. When that happens, we sure hope to see an app like Unplugged available on Quest.

We’ll bring you the latest updates about Unplugged: Air Guitar as we follow the game’s development. You can follow the game’s Twitter account, UpluggedVR, for updates as well, and let us know in the comments below whether you want to see something like this on Quest.

The post Unplugged: Air Guitar Is Like Guitar Hero And Rock Band But Powered By Oculus Quest Hand Tracking appeared first on UploadVR.

Manus Polygon Mixes Full VR Body And Hand-Tracking For Multiplayer Use

Manus, the VR company that offers enterprise-level hand-tracking solutions, this week revealed a new full-body tracking solution to complement its existing offerings.

Called Manus Polygon, the system works using Vive Tracker pucks. The original Manus solutions already use two of these sensors attached to the backs of your hands to track their positions in virtual space. The sensors utilize the same SteamVR base-stations that track the HTC Vive and Valve Index headsets.

Manus Polygon GIF

But Polygon adds another Tracker to your waist and two more on your feet. We’ve seen full body tracking of this sort in plenty of other applications before (HTC itself was keen to highlight the use case). But it’s the addition of hand-tracking to the mix that makes this solution stand out.

In theory, this could take Manus a step beyond some competing VR body tracking systems, which only sometimes offer finger-tracking. We haven’t tried Polygon itself, but we have tried Manus’ gloves, which are ready to slip on and calibrate, and that ease of use could be key. Crucially, Manus says Polygon comes with an easy calibration system that users can run themselves, and it ships with multiplayer support.

We took a look at the Manus VR gloves in a recent episode of our VR Culture Show. You can see it in action below.

Polygon will be arriving for enterprise customers this June. A price hasn’t been announced but, considering the Manus gloves themselves start at €2,990, we wouldn’t hold our breath for a more consumer-friendly option.

You can find out more about Manus Polygon here.

The post Manus Polygon Mixes Full VR Body And Hand-Tracking For Multiplayer Use appeared first on UploadVR.

Watch: Antilatency’s 10-Person, Full-Body Tracking System

A new video from Antilatency shows a system that uses full-body tracking in VR on 10 users simultaneously, allowing them to interact with virtual representations of each other. The system positionally tracks each person’s feet, head and hands.

The system uses Antilatency’s ‘Bracer’ and ‘Tag’ tracking devices, which are small radio sockets that can be added onto existing HMDs to provide additional tracking capability. At CES 2019, these devices were used to turn the Oculus Go, a 3DoF headset, into a 6DoF headset with increased tracking capabilities and multi-user support.

In January, Antilatency expanded support for these custom tracking peripherals to include the Oculus Quest, providing new tracking options for location-based VR experiences using the mobile headset.

In this new video, Antilatency uses two Bracers on each user’s hands and two Tags on their feet to provide a total of five tracking points, including the positional data of the headset itself.

The video shows an experience where 10 users are all interacting at once, with five points of tracking each, allowing for a deeper sense of immersion and realism for the users’ VR avatars. Antilatency says the session used Pico G2 headsets, with the trackers using a proprietary low-latency radio protocol. To avoid confusion and interference, each user gets their own radio channel in the 2.4GHz range for communication between the tracking peripherals and the headset.

After processing the user’s tracking data locally, each headset then shares this information with all of the other headsets across a 5GHz WiFi network to keep every user in sync. A PC was added to the system to capture the demonstration video and visualise the whole experience, but is not otherwise needed.
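The share-and-sync step described above (each headset broadcasting its locally computed pose to the others over the LAN) can be sketched in a few lines. This is our own illustration, not Antilatency’s actual protocol; the packet format and field names are invented, and for demonstration we send over the loopback interface rather than a shared WiFi network:

```python
import json
import socket

def make_pose_packet(user_id, points):
    """Serialize one user's five tracking points (head, two hands,
    two feet) into a small JSON datagram for the other headsets."""
    return json.dumps({"user": user_id, "points": points}).encode()

def parse_pose_packet(data):
    """Decode a datagram received from another headset."""
    msg = json.loads(data.decode())
    return msg["user"], msg["points"]

# Loopback demonstration of the send/receive path; a real system
# would broadcast on the shared 5GHz WiFi network instead.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))            # let the OS pick a free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

pose = [[0.0, 1.7, 0.0]] * 5         # head, hands, feet as (x, y, z)
tx.sendto(make_pose_packet(3, pose), rx.getsockname())
user, points = parse_pose_packet(rx.recv(4096))
tx.close()
rx.close()
```

UDP is the natural fit here: stale pose packets are worthless, so there is no point retransmitting a lost one, and each headset can simply render the most recent packet it has from every peer.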

You can read more about Antilatency’s tracking peripherals here. 

The post Watch: Antilatency’s 10-Person, Full-Body Tracking System appeared first on UploadVR.

Half-Life Alyx Footage, Nreal Hand-Tracking & Win Eleven Table Tennis VR – VRecap

New Half-Life: Alyx footage, new hand-tracking tech, new VRecap – let’s go!

Did you check out the new Half-Life: Alyx footage released this week? We got a good look at gameplay, including enemies, weapons, and incredible physics, from opening doors to interactions with nearly every object in sight. The thing that got me the most was the frequent health injections administered straight into your hands.

Facebook has hinted it’s looking into creating a middle-ground platform for games that can’t quite make it onto the Oculus Store. It looks like it could be a good deal for developers to take advantage of if they’re not initially eligible to get on Quest, and it makes us wonder if we’re going to start seeing Early Access titles on the Oculus Store.

And finally, Nreal only just announced new hand-tracking support for the Nreal Light today. We tried out the XR headset back at CES, which you can read about right here.

This week’s competition gives you a chance to win one of our five Eleven Table Tennis VR codes! Enter via the Gleam below – best of luck!

GIVEAWAY: Win A Free Copy Of Eleven Table Tennis VR On Oculus Quest!

This week saw more than three big stories and you can check them out here:

Alright, that’s it from us! Make sure to check out our Twitter and Facebook over the weekend for additional VR content if you haven’t had enough this week. Ciao!

The post Half-Life Alyx Footage, Nreal Hand-Tracking & Win Eleven Table Tennis VR – VRecap appeared first on UploadVR.

Nreal AR Glasses To Get Controller-Free Hand Tracking Soon

Nreal Light AR glasses will get controller-free hand tracking in a near-future software update, thanks to a partnership with Qualcomm-backed Clay AIR.

Nreal Light

Nreal is a China-based company founded in 2017 with the goal of delivering lightweight consumer AR glasses before the major tech companies. Its first product is called Nreal Light. Instead of having on-board processing, the Nreal Light glasses are tethered to either a recent high-end Android phone or an Nreal compute pack.

Specifically, the company claims the glasses will work with any Android phone which uses the Snapdragon 855 processor. That should include the Samsung Galaxy S10, Google Pixel 4, OnePlus 7, Galaxy Note 10, and more.

Nreal opened preorders back in November for a “developer kit” which includes the glasses and compute pack, for around $1200. The glasses alone are expected to be priced around $500 to consumers.

Nreal planned to ship the Nreal Light in 2019 but delayed it until spring 2020. However, production was recently halted due to the novel coronavirus.

Controller-Free Hand Tracking

Nreal’s controller-free hand tracking is powered by California-based optical hand tracking company Clay AIR, which is backed by Qualcomm.

Developers can use the SDK to display the user’s hands as full hands, separate fingers, a bounding box, cursor or customized skins. The SDK will report when the user makes gestures such as pinch, point, grab, swipe and zoom.
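To make the gesture-reporting idea concrete, here is a minimal dispatcher sketch built around the gestures the SDK is said to report. The class, event names, and callback shape are entirely our invention for illustration, not the actual Clay AIR or Nreal API:

```python
# Hypothetical gesture dispatcher modeled on the gestures the SDK
# reportedly exposes (pinch, point, grab, swipe, zoom). Names and
# signatures here are illustrative only.

GESTURES = {"pinch", "point", "grab", "swipe", "zoom"}

class GestureRouter:
    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        """Register a callback for one of the supported gestures."""
        if gesture not in GESTURES:
            raise ValueError(f"unknown gesture: {gesture}")
        self._handlers.setdefault(gesture, []).append(handler)

    def dispatch(self, gesture, hand):
        """Invoke every handler registered for a reported gesture."""
        for handler in self._handlers.get(gesture, []):
            handler(hand)

router = GestureRouter()
selected = []
router.on("pinch", lambda hand: selected.append(hand))
router.dispatch("pinch", "right")   # simulate the SDK reporting a pinch
```

The point of routing gestures through a single dispatcher rather than polling hand poses directly is that an app can swap input backends (controllers, hand tracking, or both) without touching its interaction logic.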

This should make for a significantly improved user experience over Nreal’s previous default input method: your smartphone used as a 3DoF rotational laser pointer (similar to an Oculus Go controller). You can still opt for 6DoF controllers via Finch if you want, for an extra $200.

The Nreal Light is not the first AR headset to get hand tracking; it’s a feature present in both Microsoft’s HoloLens 2 and the Magic Leap One. But it is the first to do so without a depth sensor, just as Oculus Quest was the first VR device to do this. Real-time hand tracking without hardware-level depth information is significantly more difficult, requiring state-of-the-art machine learning algorithms.

The kind of AR Nreal is trying to build is extremely hard to do well. AR used out in the world faces a larger set of problems than VR used inside the house: a range of lighting conditions and an ever-changing environment present significant hurdles for both AR displays and tracking, and our last few demos of Nreal were not without issue.

The fact that this hand tracking is provided by a third party (Clay AIR) also raises the possibility that other manufacturers, including potential future Oculus Quest competitors, could integrate this same technology.

We’ve seen the interesting ways developers have been experimenting with controller-free hand tracking in the VR space on Quest. If Nreal is able to sort out its production problems and get this hardware out to a wider audience, we’ll keep a close eye on what possibilities it opens for developers in AR.

The post Nreal AR Glasses To Get Controller-Free Hand Tracking Soon appeared first on UploadVR.

You Can Grab HandSpace On SideQuest With Your Tiny Finger Hands

Your fingers are fitted with tiny hands in the new SideQuest app HandSpace.

The hands can also become floppy, long, small, big, wrong, or kaleidoscopic. The new controller-free hand tracking app from Daniel Beauchamp, aka viral sensation @pushmatrix on Twitter, offers perhaps the most mind-bending use yet of Facebook’s experimental Oculus Quest hand tracking.

We’ve covered a number of Beauchamp’s experiments as the VR/AR development lead at Shopify rapidly prototypes new ideas on the standalone Quest. Each new concept draws massive attention on Twitter, where he first publishes videos showing the ideas. There was finger walking, an interesting gesture concept that would enable simulated movement by “walking” your fingers from one hand across your other hand’s palm. And there was the detachable hand you could throw across the room that would then “walk” to a destination like the Addams Family’s Thing. In recent days we’ve seen a yo-yo, fingerboarding and more.

HandSpace is the first piece of Quest hand-tracking software from Beauchamp that you can actually download. Like most controller-free hand tracking software on SideQuest, it is just a small experiment to play around with. You simply clap your hands to switch between different hand styles, including one that affixes all 10 fingertips with tiny versions of your matching hand pose.
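Using a clap as the mode-switch gesture is a neat fit for hand tracking, since it is hard to trigger by accident. Detecting one from tracked hand positions can be as simple as watching the palm-to-palm distance cross a threshold; here is a sketch (the threshold is our guess, not HandSpace’s actual value):

```python
def detect_claps(left_palms, right_palms, threshold=0.08):
    """Count claps from per-frame palm positions (in metres).

    A clap is registered each time the palm-to-palm distance drops
    below the threshold after having been above it (edge-triggered,
    so holding the hands together counts only once).
    """
    claps = 0
    together = False
    for (lx, ly, lz), (rx, ry, rz) in zip(left_palms, right_palms):
        dist = ((lx - rx) ** 2 + (ly - ry) ** 2 + (lz - rz) ** 2) ** 0.5
        if dist < threshold and not together:
            claps += 1          # hands just came together
            together = True
        elif dist >= threshold:
            together = False    # hands apart again; re-arm
    return claps

# Four frames: apart, together, apart, together -> two claps.
left = [(-0.2, 1.0, 0.0), (0.0, 1.0, 0.0), (-0.2, 1.0, 0.0), (0.0, 1.0, 0.0)]
right = [(0.2, 1.0, 0.0), (0.0, 1.0, 0.0), (0.2, 1.0, 0.0), (0.0, 1.0, 0.0)]
claps = detect_claps(left, right)
```

The edge-triggering is the important design choice: without the `together` flag, every frame the hands stay touching would register as another clap and cycle the hand style uncontrollably.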

For those of you with an Oculus Quest, here is the link to the software on SideQuest, and instructions for sideloading content onto the standalone headset from a PC are here. For those of you without the headset, or afraid of the strange finger hands, here’s a video I captured showing how it works:

And here are videos of some of Beauchamp’s previous experiments:


The post You Can Grab HandSpace On SideQuest With Your Tiny Finger Hands appeared first on UploadVR.

Pimax to Make Ultraleap Hand Tracking Available Across Entire Headset Range

Ultraleap’s hand tracking technology has found its way into enterprise virtual reality (VR) headsets like VRgineers’ XTAL and the Varjo VR-2 Pro, but for consumers it has been a case of tacking on the Leap Motion controller. Thanks to a new collaboration with Pimax, that could soon be a thing of the past.

Pimax - Ultraleap Module

The two companies have confirmed that all of Pimax’s product range will be able to use a new module which neatly connects to the underside of the headset. That means not only will Pimax’s latest 8K X and 8K Plus flagship headsets be compatible but even the entry-level Artisan will be.

Featuring Ultraleap’s latest hand tracking technology, the module includes a stereoscopic IR camera that creates an interaction zone of up to 100cm (40″) in range, extending from the device in a 160×160° field of view (FoV). It will be a plug-and-play solution working in conjunction with Ultraleap’s software platform.

“We’ve been working with the Ultraleap team for some time now and we are so excited to be able to officially bring this amazing technology to the gaming and immersive computing community,” said Kevin Henderson, COO of Pimax, in a statement. “As the most advanced hand tracking available, we can’t wait to see our supporters play around with this addition and finally have the power of natural interaction for all of their VR experiences. Final production arrangements are well underway and we expect to make these modules available to the Pimax community in Q2 2020.”

“Using our hands, as we do in the physical world, is a fundamental element needed as we interact with the digital world. What’s so important about this agreement is that it means even more people will be able to experience the magic of hand tracking,” adds Steve Cliffe, CEO of Ultraleap.

With the $169.95 USD Ultraleap hand tracking module slated to be available in the next few months, you can head to its listing on Pimax’s website to sign up for notifications.

Pimax unveiled its latest 8K resolution headsets late last year before showcasing them at CES 2020 last month. VRFocus will continue its coverage of Pimax, reporting back with its latest hardware updates.

More Amazing Oculus Quest Hand-Tracking Demos Show Yo-Yos & Nightmares

Yet more amazing Oculus Quest hand-tracking demos have surfaced online over the past week.

Last week we covered a bunch of new demos from developer Daniel Beauchamp, who showcased Quest hand-tracking used for finger skating and more. Since then, Beauchamp revealed two more impressive videos. The first is for this VR yo-yo demo:

As you can see the string of the yo-yo is attached to the user’s virtual finger, allowing them to perform realistic tricks and get tangled up. Even without the weight of a yo-yo in-hand, this looks like it would be more intuitive than doing the same actions with a Touch controller.

Next up is this decidedly freakier demo, in which the tips of each finger are replaced with… smaller hands mirroring the same movements.

It reminds us of a scene from the Dr. Strange movie. Extending all of the fingers creates a strange and somewhat disgusting pattern. Honestly, we’re kind of glad we haven’t seen this one in VR ourselves; we’re not sure we’d be able to sleep afterwards.

We’re not just talking new demos from Beauchamp this time around, though. Holoception creator Dennys Kuhnert also revealed a pretty incredible demo this week, which looks a little like a Quest version of the Aperture Hand Lab demo for the Valve Index controllers.

It’s a pretty amazing bit of work; by mirroring the user’s arm movements and adding in physics, the user is able to shake hands with their virtual doppelganger and even play a game of rock, paper, scissors.

Facebook still hasn’t opened up hand-tracking for release in apps in the Oculus Store, but there are plenty of interesting games that utilize the feature over on SideQuest. We’ll keep covering any cool Quest hand-tracking demos we find, so check back!

The post More Amazing Oculus Quest Hand-Tracking Demos Show Yo-Yos & Nightmares appeared first on UploadVR.

Oculus ‘Designing for Hands’ Document Introduces Best Practices for Quest Hand-tracking

Hand-tracking on Quest rolled out as an experimental feature in late 2019, but Oculus is letting it gestate before accepting third-party apps with hand-tracking. In the meantime, the company has published fresh developer documentation establishing best practices for working within the limitations of Quest hand-tracking.

Hand-tracking brings many benefits to Quest, especially ease-of-use. And while Oculus’ first stab at the feature is reasonably solid, there are still limitations around accuracy, latency, pose detection, and tracking coverage. To help developers best work within the limitations of the system, a new section of the Oculus developer documentation called ‘Designing for Hands‘ offers up practical advice and considerations.


“In these guidelines, you’ll find interactions, components, and best practices we’ve validated through researching, testing, and designing with hands. We also included the principles that guided our process,” the documentation says. “This information is by no means exhaustive, but should provide a good starting point so you can build on what we’ve learned so far. We hope this helps you design experiences that push the boundaries of what hands can do in virtual reality.”

The document notes the challenges that come with the territory, and reminds developers to “remember that hands aren’t controllers.”

There are some complications that come up when designing experiences for hands. Thanks to sci-fi movies and TV shows, people have exaggerated expectations of what hands can do in VR. But even expecting your virtual hands to work the same way your real hands do is currently unrealistic for a few reasons.

  • There are inherent technological limitations, like limited tracking volume and issues with occlusion
  • Virtual objects don’t provide the tactile feedback that we rely on when interacting with real-life objects
  • Choosing hand gestures that activate the system without accidental triggers can be difficult, since hands form all sorts of poses throughout the course of regular conversation

You can find solutions we found for some of these challenges in our Best Practices section.

[…]

It’s very tempting to simply adapt existing interactions from input devices like the Touch Controller, and apply them to hand tracking. But that process will limit you to already-charted territory, and may lead to interactions that would feel better with controllers while missing out on the benefits of hands.

Instead, focus on the unique strengths of hands as an input and be aware of the specific limitations of the current technology to find new hands-native interactions. For example, one question we asked was how to provide feedback in the absence of tactility. The answer led to a new selection method, which then opened up the capability for all-new 3D components.

It’s still early days, and there’s still so much to figure out. We hope the solutions you find guide all of us toward incredible new possibilities.

The ‘Interactions‘ section of the document offers some of the most practical advice for how developers should consider allowing users to interact with the virtual world using hand-tracking.

A clear distinction is made between Absolute and Relative interactions; the former meaning objects directly touched by the user and controlled 1:1, with the latter being about how to control objects at a distance in discrete ways, like rotating an object around one axis.
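The Absolute/Relative distinction can be made concrete with a couple of toy functions. This is our own illustration of the two mapping styles, not sample code from the Oculus documentation; the function names and the 90-degrees-per-metre gain are invented:

```python
def absolute_grab(hand_pos):
    """Absolute interaction: a directly touched object tracks the
    hand 1:1, so its position simply IS the hand position."""
    return hand_pos

def relative_rotate(angle, prev_hand_x, hand_x, gain=90.0):
    """Relative interaction: horizontal hand movement spins a
    distant object around a single axis. Only the *change* in hand
    position matters, scaled so 1m of travel = `gain` degrees."""
    return angle + (hand_x - prev_hand_x) * gain

# Moving the hand 0.5m to the right turns the far-away object 45 degrees.
angle = 0.0
angle = relative_rotate(angle, 0.0, 0.5)
```

The relative mapping is what makes distant manipulation tolerant of tracking noise: constraining the interaction to one axis and one scaled parameter means small jitters in the other axes never reach the object.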

The ‘User Interface Components‘ section makes specific suggestions about how things like buttons and menus should work, and how they should be sized to complement the accuracy of Quest’s hand-tracking. There’s also some examples shown of more complex interface modules, like toggle switches, radial selectors, and scrolling lists.

Oculus says it isn’t yet accepting hand-tracking applications onto Quest. In the future it plans to graduate hand-tracking from an experiment to a full-fledged feature, and when it does it will open the door to apps which use the feature. The company hasn’t given any indication as to when that will happen, but we’d expect some time in 2020.


As for hand-tracking on Rift S: Oculus has only announced the feature for Quest and has not yet committed to bringing it to Rift S.

The post Oculus ‘Designing for Hands’ Document Introduces Best Practices for Quest Hand-tracking appeared first on Road to VR.