Vacation Simulator Developer Teases Quest Hand Tracking Support

A new teaser video suggests that hand tracking support could be added to Vacation Simulator on Oculus Quest.

Devin Reimer, CEO of Google-owned developer Owlchemy Labs, posted a video to Twitter showing a scene from Vacation Simulator, except with the player using hand tracking to make peace and thumbs-up signs. In a reply, Reimer noted that this was definitely not Valve Index hand tracking at work. While there’s no indication of a release date, this looks like a pretty strong indication that Vacation Simulator on Quest and Quest 2 might receive an update adding hand tracking as an input method.

We’ll of course need to see details on exactly how much of the game is transferable to controller-free hand tracking. Owlchemy’s interaction design is so central to its games that any loss in the stability of gesture recognition would likely be instantly noticeable. Not many games have added hand tracking support since the feature’s launch, with Richie’s Plank Experience, Waltz of the Wizard, and Virtual Desktop among the exceptions.

Both of Owlchemy’s ‘Simulator’ games are now available on Oculus Quest, and both were recently enhanced for Oculus Quest 2. Improvements for the two titles include visual upgrades, the removal of foveated rendering, and support for 90Hz on Quest 2, which is rolling out now as part of the v23 update.

Vacation Simulator also recently received the free Back to Job expansion, which adds several mechanics from the original Job Simulator into Vacation Simulator.

Job Simulator and Vacation Simulator are available on most VR platforms. You can read our review of the latter here.

Facebook Researchers Help Typists Match Speed Without Physical Keyboard

Researchers at Facebook developed a predictive motion model for marker-based hand tracking that could enable some typists to match their speed and accuracy with a physical keyboard while only tapping their fingers on a flat surface.

Here are some of the most notable bits from the blog post Facebook published today:

To support touch typing without a physical keyboard — and without the benefit of haptic feedback from individual keys — the team had to make sense of erratic typing patterns. They adopted statistical decoding techniques from automatic speech recognition, and where speech recognition uses an acoustic model to predict phonemes from audio frames they instead use a motion model to predict keystrokes from hand motion. This, along with a language model, predicts what people intended to type despite ambiguous hand motion. Using this new method, typists averaged 73 words per minute with a 2.4% uncorrected error rate using their hands, a flat surface, and nothing else, achieving similar speed and accuracy to the same typist on a physical keyboard.
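The approach described above maps closely onto standard beam-search decoding from speech recognition: at each keystroke event, scores from a motion model are combined with scores from a language model to keep the most plausible text hypotheses. Here is a minimal, self-contained sketch of that idea in Python. The per-keystroke probabilities and the toy word-list "language model" are entirely made up for illustration and are not Facebook's actual models:

```python
import math

# Toy motion-model output: for each keystroke event, a distribution
# over candidate keys inferred from finger trajectories.
# (Hypothetical numbers for illustration only.)
motion_scores = [
    {"c": 0.6, "v": 0.4},
    {"a": 0.5, "s": 0.5},
    {"t": 0.7, "r": 0.3},
]

def lm_score(prefix, ch):
    """Toy character language model: favor characters that keep the
    prefix on track toward a known word. A real system would use a
    full statistical LM, as in speech recognition."""
    common_words = {"cat", "car", "vat"}
    return 0.9 if any(w.startswith(prefix + ch) for w in common_words) else 0.1

def decode(frames, beam_width=4):
    """Beam search combining motion-model and LM log-probabilities."""
    beams = [("", 0.0)]  # (decoded text, cumulative log-prob)
    for frame in frames:
        candidates = []
        for prefix, logp in beams:
            for ch, p_motion in frame.items():
                score = logp + math.log(p_motion) + math.log(lm_score(prefix, ch))
                candidates.append((prefix + ch, score))
        # Keep only the highest-scoring hypotheses.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

print(decode(motion_scores))  # prints "cat"
```

Even though the motion model alone is ambiguous at each step, the language model resolves the ambiguity toward a plausible word, which is the mechanism the researchers credit for tolerating erratic typing.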

This surprising result led researchers to investigate why hand tracking was more effective than other physical methods, like tapping on a tablet. The team discovered hand tracking is uniquely good at isolating individual fingers and their trajectories as they reach for keys — information that is missing from capacitive sensing on tablets and smartphones today.

The video below shows a person typing with markers on their hands, tracked by external cameras:

I asked a Facebook representative if Quest 2’s XR2 processor might enable any specific features for hand tracking, and if the marker-based research here might transfer to a markerless system as seen with Quest’s hand tracking.

“This marker-based hand tracking system is still purely in the research phase and not currently on our product roadmap — it’s one of several text input research methods FRL Research is exploring, including the EMG-based approach Michael Abrash showed last month at Facebook Connect,” a spokesperson wrote in an email. “For Quest 2, our current focus for text input is tracked keyboard support we introduced at Connect as part of Infinite Office.”

We found multiple text input methods in the Oculus Browser on Quest 2 prior to a launch software update, including a mode that let you tap out letters on your phone and send them to your Quest, as well as voice dictation. Facebook frequently rolls out experimental features for its standalone VR headsets as opt-in betas, improving performance before finalizing them more broadly; Oculus Link and hand tracking itself are the most significant examples. Right now, if you connect a Bluetooth keyboard and don’t use the tracked controllers, Quest gets confused about whether you are typing on the physical keyboard or pinching to select letters on a floating virtual keyboard with hand tracking. That’s likely to change with the forthcoming Infinite Office update and tracked keyboard support.

So while typing on any flat surface is just one of several text entry methods Facebook is exploring for AR and VR headsets, the company did note the research was “a major step towards FRL Research’s goal of using hand-tracking to let you type 70 WPM on any surface.”

Facebook Develops Hand Tracking Method to Let You Touch Type Without Needing a Keyboard

Some of the most basic questions surrounding AR/VR tech aren’t entirely solved yet, like making text input a comfortable and familiar experience. Facebook Reality Labs (FRL) today revealed new research into hand tracking which aims to bring touch typing to AR/VR users, all without the need for a physical keyboard.

There’s already basic hand tracking on Quest which lets you navigate system UI, browse the web, and play supported games like Waltz of the Wizard: Extended Edition (2019) without the need for Touch controllers, instead letting you reach out with your own two hands to cast spells and manipulate objects.

As interesting and useful as those use cases may be, we’re still very much in the infancy of hand tracking and its potential uses for virtual reality. Using your fingers as glorified laser pointers on a virtual keyboard reveals just how much of a gap remains in natural VR input methods. On that note, Facebook researchers have been trying to extend hand tracking to even more useful applications, and their most recent effort is aimed at solving one of the most frustrating things for VR/AR headset users to this day: text input.

VR keyboards haven’t evolved beyond this | Image courtesy Virtual Desktop

Facebook today revealed that its FRL researchers used a motion model to predict what people intended to type despite the erratic motion of typing on a flat surface. The company says its tech can isolate individual fingers and their trajectories as they reach for keys, information that simply doesn’t exist on touchscreen devices like smartphones and tablets.

“This new approach uses hand motion from a marker-based hand tracking system as input and decodes the motion directly into the text they intended to type,” FRL says. “While still early in the research phase, this exploration illustrates the potential of hand tracking for productivity scenarios, like faster typing on any surface.”

One of the biggest barriers to overcome was “erratic” typing patterns. Without the benefit of haptic feedback, researchers looked to other predictive fields in AI to tackle the issue of guessing where fingers would logically go next. FRL says its researchers borrowed statistical decoding techniques from automatic speech recognition, essentially predicting keystrokes from hand motion the way an acoustic model predicts phonemes from audio frames. That’s the short of it anyway.

“This, along with a language model, predicts what people intended to type despite ambiguous hand motion. Using this new method, typists averaged 73 words per minute with a 2.4% uncorrected error rate using their hands, a flat surface, and nothing else, achieving similar speed and accuracy to the same typist on a physical keyboard,” the researchers say.

With its insights into hand tracking, Facebook is undoubtedly preparing for the next generation of AR headsets—the ‘always on’ sort of standalone AR headsets that you might wear in the car, at work, at home and only take off when it’s time to recharge. Using Quest 2 as a test bed for AR interactions sounds like a logical step, and although the company hasn’t said as much, we’re hoping to see even more cool hand tracking tech pushed out for experimental use on the new, more powerful standalone VR headset.

The post Facebook Develops Hand Tracking Method to Let You Touch Type Without Needing a Keyboard appeared first on Road to VR.

Qualcomm Adds Ultraleap’s Hand Tracking To Latest XR Headset Reference Design

Ultraleap and Qualcomm signed a multi-year agreement that will see Ultraleap’s hand tracking technology integrated into the Snapdragon XR2 5G reference design.

The XR reference design for Qualcomm’s chips exists to show what’s possible with the processor and its integrated technology. Qualcomm processors have been used in the Oculus Quest and the now-discontinued Oculus Go, and are likely to be the go-to option for most manufacturers looking to produce a standalone, mobile VR headset.

Here’s an excerpt from Ultraleap on the announcement:

Ultraleap’s fifth generation hand tracking platform, known as Gemini, will be pre-integrated and optimised on the standalone, untethered Snapdragon XR2 5G reference design, signalling a significant step change for the XR space. The Gemini platform delivers the fastest, most accurate and most robust hand tracking and will provide the most open and accessible platform for developers.

The inclusion of Ultraleap’s Gemini hand tracking platform into the reference design for the new Qualcomm chips could provide would-be Quest competitors with a viable method to integrate hand tracking technology into a similar standalone headset experience.

Hand tracking offers a route for interaction with VR content that could conceivably unlock more natural social interactions between VR headset users. Last year, Oculus Quest launched native hand tracking support, which developers can adopt in their own apps. Ultraleap’s hand tracking technology — formerly known as Leap Motion — has been used for years by VR developers exploring the cutting edge of interaction design.

“The compatibility of our technology with the Snapdragon XR2 5G Platform will make the process of designing hand tracking within a very wide variety of products as simple as pick and place,” said Steve Cliffe, CEO of Ultraleap, in a prepared statement.

The Line Wins Emmy Award For Innovation In Interactive Programming

Brazilian VR developer ARVORE won a juried Emmy Award for Outstanding Innovation in Interactive Programming for its immersive VR narrative The Line, which released earlier this year.

The Line is available on Oculus Quest and remains one of the best immersive narrative experiences available on the headset. It’s only about 15 minutes in length and supports hand tracking. The story, set in a scale model of 1940s São Paulo, follows Pedro, a miniature doll and newspaper delivery man. He runs the same route around the model every day, leaving a flower outside the house of Rosa, the girl he loves, each time. It’s a charming little tale with a beautiful amount of detail in the models.

“I am at a loss for words on how to describe what it is like for all of the hard work that our amazing team put into ‘The Line’ to be recognized with such a prestigious award as a Primetime Emmy,” said Ricardo Justus, CEO of ARVORE, in a prepared statement. “To receive this recognition, particularly in an innovation category, is truly an incredible achievement for us, which validates our dedication and our vision for the future of immersive technologies as a storytelling medium.”

The Line’s Emmy award is part of a juried category, which means entrants are screened and the winner selected by a panel of professionals. It is also the first Emmy award won by a Brazilian company.

The full Emmy Awards ceremony is set to take place later this month, where we expect to find out who won in the two other categories featuring VR nominees.

The Line is available on Oculus Quest for $4.99.

Facebook Comments On Horizon Hand-Tracking, Releasing On Other Headsets

Last week we got the chance to briefly test out Facebook Horizon and chat with key members of the development team about a wide range of topics. You can read our hands-on impressions and some details about privacy concerns here.

During that interview though, we dove into several other topics as well such as possible hand-tracking support in Horizon and even the potential for release on other headsets, officially, later down the line.

Specifically, this is what Ari Grant, Horizon’s Product Management Director, had to say on the topic of hand-tracking support in Facebook Horizon:

“We’re really excited about all of the upcoming platform features, hardware changes, where the entire industry is going and to include as much as we can in Horizon to give people deep presence, to really feel there with people and can connect.

And so really do want to look to incorporate as many of these as possible. We don’t have specific dates to announce that any of the features yet, but in general, trying to really optimize toward friendship and community, helping people, foster connections, the things that help boost those things are going to be a priority.

It is really, really important that this is a place where people can connect. So really the features that help do so are definitely going to be important for us to, to build much more near term.”

Shifting gears a bit, we proceeded to ask what the plan was to incentivize an early influx of games and developer talent inside of Horizon itself, beyond just the community of players/makers.

This is what Meaghan Fitzgerald, Facebook Reality Labs’ Head of Experiences for Product Marketing, had to say:

“That’s an interesting question. We’re not working with any of the game studios, first-party, in-house. I think they probably would bring an interesting perspective, but we’re kind of working on different things right now. But I think we’re, we’re interested to see how people with a range of skills jump in and use the world creation tools.

And I think that it’s been interesting to see how someone with a little bit more of a coding or a world creation background is able to create like much more interactive scaled experiences, multiplayer games, and then somebody with no coding experience at all can make something that’s just a really pleasant environment to hang out in.

So we’re really excited for people with a lot of like background in game development to come in and stress, test the tools with us and give us that feedback. But also recognize that there is also a place for that next level of game development to make the VR ecosystem a lot broader.”

Now, since Horizon is made by Facebook, it’s natural to assume it would only officially work on actual Oculus headsets. And that is in fact the plan. We’ve got full details on the Facebook account required to log in and play Oculus VR content in years to come.

In the meantime, I had to ask: Will Horizon ever come to other platforms and headsets like PSVR or all of the Steam VR PC devices? This was Grant’s response:

“Currently we’re focused on making Horizon a really awesome product on the Rift platform and Quest, and we have nothing else to share about this only about other platforms…

We think the creation tools in VR are really great. One of the things about, you know, a lot of creation tools is creating with a mouse and keyboard is a lot to learn. You go to learn these 3D animation software, learn how to control cameras and a lot of nuance stuff. And in VR, you can just place it with your hands and build very directly.

And so making that fun, easy and accessible is, our current focus and don’t have anything to share about other platforms yet. The one thing I will say is we are trying to build communities and connect folks, and that is our ultimate north star is building community is connecting to your friends. And so using that as our north star, whatever we can do to help connect people will be our primary motivator, recognizing that not everyone’s in VR, but still currently focused on VR initially to build the community.”

Hopefully you enjoyed checking out the interview and will consider reading our hands-on preview for more details.

Qualcomm Signs “Multi-year” Deal to Bring Ultraleap Hand-tracking to XR2 Headsets

Qualcomm and Ultraleap today announced a “multi-year co-operation agreement” that will bring Ultraleap’s controllerless hand-tracking tech (formerly of Leap Motion) to XR headsets based on the Snapdragon XR2 chipset. Ultraleap claims to have the “fastest, most accurate, and most robust hand tracking.”

Snapdragon XR2 is Qualcomm’s latest made-for-XR chip which the company has touted as being the ideal foundation for standalone XR headsets.

The leading standalone VR headset, Oculus Quest, has been increasingly focusing on controllerless hand-tracking as a means of input for the device. Other major headset makers, like Microsoft with its HoloLens 2, have also homed in on hand-tracking as a key input method. As industry leaders coalesce around hand-tracking, it becomes increasingly important for competing devices to offer similar functionality.

But hand-tracking isn’t a ‘solved’ problem, making it a challenge for organizations that don’t have the resources of Facebook and Microsoft to work out their own hand-tracking solution.

Over the years Qualcomm has been working to reduce the barrier to entry for making a standalone XR headset by offering ready-made technologies—like inside-out tracking—alongside its chips. Now the company is announcing that its XR2 chip will be optimized for Ultraleap hand-tracking out of the box.

While Qualcomm and Ultraleap have previously worked together on this front, the Ultraleap hand-tracking solution offered through Qualcomm was tied to Ultraleap’s hand-tracking hardware. The new announcement means that Ultraleap’s hand-tracking software is being offered independent of its hardware. This makes it a more flexible and cost-effective solution, with the hand-tracking software ostensibly making use of a headset’s existing inside-out tracking cameras rather than requiring additional cameras just for hand-tracking; this also frees up two of XR2’s seven supported camera slots for other uses like eye-tracking, mouth tracking, and more.

Qualcomm and Ultraleap say the hand-tracking tech will be “pre-integrated” and “optimized” for XR2. It isn’t clear if this simply means that Ultraleap hand-tracking will be available as a service in the XR2 software stack, or if XR2 will include special hardware to accelerate Ultraleap hand-tracking, making it more power and resource efficient.


Despite being a years-long leader in hand-tracking technology, Ultraleap (formerly Leap Motion) hasn’t managed to get its solution to catch on widely in the XR space. Now that hand-tracking is seeing greater emphasis from leading companies, Ultraleap’s camera-agnostic solution on XR2 could be the moment where the company’s hand-tracking tech begins to find significant traction.

The post Qualcomm Signs “Multi-year” Deal to Bring Ultraleap Hand-tracking to XR2 Headsets appeared first on Road to VR.

Oculus Browser Adds Experimental WebXR Hand Tracking Support For Oculus Quest

Oculus Browser has rolled out experimental hand tracking API support with WebXR, and some demos are already available to try out.

In a tweet last week, Oculus Browser Product Manager Jacob Rossi announced that WebXR in Oculus Browser now supports the hand tracking API for Oculus Quest. This allows developers to create WebXR experiences where users only have to use their hands while in VR, and can forgo Touch controllers in a similar manner to select games and apps available on the Oculus Store.

Oculus Browser already supported hand tracking as an input method while browsing the web; however, this update adds support for developers who want to integrate hand tracking into a proper WebXR experience in the browser. Some developers have already created proof-of-concept experiences, such as this demo developed by Marlon Lückert, which is available to try online now if you have an Oculus Quest.

As you can see from the video, this implementation is pretty basic but is also just meant to show off that the support exists and works.

If you want to try the demo out for yourself, just head over to webxr-handtracking.vercel.app on your Quest browser. However, you are going to need to perform a few one-time setup steps, as outlined when you visit the address. It essentially boils down to going to chrome://flags/ on your Quest browser and enabling WebXR experiences with joints tracking and disabling WebXR experiences with hands tracking. You’ve also got to make sure that automatic switching between hands and controllers is enabled in your Quest settings.

This is just the beginning of hand tracking implementation in WebXR — we expect to see a lot more uses of the feature in the future.

The post Oculus Browser Adds Experimental WebXR Hand Tracking Support For Oculus Quest appeared first on UploadVR.

Oculus Browser Gets Experimental Hand-tracking Support on Quest

A newly released update for the Oculus Browser on Quest includes full-finger hand-tracking support for WebXR projects.

The feature was quietly released in the recent 10.2 build of Oculus Browser, which now comes with a hand-tracking API and timewarp layer support, both of which are considered ‘experimental’ at this time.

The 8.0 version of Oculus Browser released back in February was the first to include initial hand-tracking support for WebXR projects, although it was focused on controller emulation and not true five-finger tracking.


Oculus Browser Product Manager Jacob Rossi announced the news via Twitter, providing some examples via a WebXR test page and layer page which demonstrates hand-tracking and timewarp layer support respectively.

Check out a demo in the familiar WebXR solar system scene featuring the new hand-tracking support:

Hand-tracking came to Oculus Quest as an experimental feature back in late 2019. The intuitive control scheme has since come out of beta and is now available for all Quest owners to use system-wide, letting you control the basic functions of Quest without the need of Touch controllers.

Moreover, there are an increasing number of games and cool experiments that make use of Quest’s native hand-tracking abilities. We’re hoping to see more soon using the full five-fingered hand-tracking support on Quest.

As for timewarp support: just as with non-WebXR apps, timewarp allows the headset to reproject past frames when the scene can’t maintain the native refresh rate, which in Quest’s case is 72Hz.

The post Oculus Browser Gets Experimental Hand-tracking Support on Quest appeared first on Road to VR.

Dev Tool Uses Quest Hand-tracking to Quickly Model Realistic Hand Poses for VR

Grabbing objects in VR may be one of the medium’s most fundamental interactions, but making it work well isn’t as easy as you might think. Developers often need to spend time hand-animating hand models so that they appear to hold each object in a realistic way. Developer Luca Mefisto has built a smart tool which uses Quest’s hand-tracking to enable developers to motion-capture hand poses, making the whole ordeal quicker, more realistic, and ultimately more immersive for players.

Update (July 20th, 2020): Developer Luca Mefisto has released the first version of his HandPosing tool which uses Oculus Quest hand-tracking to quickly author realistic hand-poses for virtual reality interactions. The tool is available on GitHub.

“This is a work in progress, and things are subject to change. I hope it serves others either as a useful tool or at least as a starting point for their grabbing-interaction implementations,” Mefisto writes.

The original article, which outlines the benefits and functions of the HandPosing tool, continues below.

Original Article (July 7th, 2020): Some VR games employ various methods of ‘dynamic’ animation to create realistic hand poses when players grab objects in VR (Lone Echo, for instance). Generally that’s proprietary tech, which means any developers wanting to do the same would need to build a similar system from scratch (not an easy task).

Rather than do that, some games cut out the hand-posing problem entirely by simply making your virtual hands disappear when you grab objects (Vacation Simulator, for instance).

Developers that want to keep the player’s hands visible need to create hand-poses manually so that when you grab an object, your virtual hand grips the object in a realistic way. It’s not that this is a difficult task per se, but it can be immensely time consuming.

At minimum you need one custom hand-pose for every uniquely shaped object in a given game. Even then, consider how many different ways players might want to hold a single object… even if you cut out unlikely poses, you still may need four or five poses for a single object to cover the most obvious grips. If there’s 100 uniquely shaped objects in a game, that could mean animating 400 or 500 hand-poses.


VR developer Luca Mefisto wants to make this whole process quicker and easier—allowing developers to make more realistic poses in less time. He’s building a tool which smartly leverages Quest’s hand-tracking feature to let developers take a ‘snapshot’ of their own hand gripped around virtual objects.

The tool then allows developers to define valid positions for the pose, allowing the hand to snap realistically to the nearest valid position on the object.
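The snap-to-nearest-pose step can be sketched in a few lines. This is a hypothetical simplification (the actual HandPosing tool works with full skeletal hand poses inside Unity, not bare positions), but it shows the core idea of choosing among authored grab points:

```python
import math

def nearest_grab_pose(hand_pos, grab_points):
    """Pick the authored grab point closest to the tracked hand.
    hand_pos and each grab point's position are (x, y, z) tuples
    in the object's local space."""
    return min(grab_points, key=lambda gp: math.dist(hand_pos, gp["position"]))

# Hypothetical authored poses for a pair of scissors: one grip on the
# blades end, one on the handles end.
scissor_poses = [
    {"name": "blades", "position": (0.0, 0.0, 0.1)},
    {"name": "handles", "position": (0.0, 0.0, -0.1)},
]

# A hand reaching near the handles end snaps to the "handles" pose.
best = nearest_grab_pose((0.02, 0.0, -0.08), scissor_poses)
print(best["name"])  # prints "handles"
```

A production version would compare full hand orientation as well as position, and blend the tracked hand toward the authored pose rather than teleporting it, but the selection step is essentially this nearest-candidate search.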

Objects can also have multiple poses and grabbing points to cover different ways of grabbing the same object (like the scissors below).

Though the tool requires Quest’s hand-tracking for creating the poses, Mefisto says the hand-pose tool will work for games that employ hand-tracking or controllers.

Though the tool is so far unnamed, the developer plans to release it as an open source project to the VR development community. You can follow Mefisto on Twitter to see updates on the tool’s development and keep an eye out for its release.

The post Dev Tool Uses Quest Hand-tracking to Quickly Model Realistic Hand Poses for VR appeared first on Road to VR.