Vision Pro and Quest 3 Hand-tracking Latency Compared

Vision Pro is built entirely around hand-tracking, while Quest 3 is controller-first but also supports hand-tracking as an alternate input option for some content. But which has better hand-tracking? You might be surprised at the answer.

Vision Pro Hand-tracking Latency

With no support for motion controllers, Vision Pro’s only motion-based input is hand-tracking. The core input system combines hands with eyes to control the entire interface.

Prior to the launch of the headset we spotted some footage that allowed us to gauge the hand-tracking latency at somewhere between 100 and 200ms, but that’s a pretty big window. Now we’ve run our own test and more precisely found Vision Pro’s hand-tracking latency to be about 128ms on visionOS beta v1.1.1.

Here’s how we measured it. Using a screen capture from the headset that shows both the passthrough hand and the virtual hand, we can count how many frames elapse between when the passthrough hand moves and when the virtual hand moves. We used Apple’s Persona system for hand rendering to eliminate any additional latency that could be introduced by Unity.

After sampling a handful of tests (pun intended), we found this to be about 3.5 frames. At the capture rate of 30 FPS, that’s 116.7ms. Then we add to that Vision Pro’s known passthrough latency of about 11ms, for the final result of 127.7ms of photon to hand-tracking latency.
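As a sanity check, the arithmetic above can be written out in a few lines of Python; the function name is our own, and the inputs are simply the figures measured above:

```python
def motion_to_photon_latency_ms(frame_delta, capture_fps, passthrough_ms):
    """Estimate total hand-tracking latency from a frame-counting test:
    frames elapsed between passthrough and virtual hand movement,
    converted to milliseconds, plus the known passthrough latency."""
    return frame_delta / capture_fps * 1000 + passthrough_ms

# Vision Pro: ~3.5 frames at a 30 FPS capture, ~11ms passthrough latency
print(round(motion_to_photon_latency_ms(3.5, 30, 11), 1))  # → 127.7
```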

We also tested how long between a passthrough tap and a virtual input (to see if full skeletal hand-tracking is slower than simple tap detection), but we didn’t find any significant difference in latency. We also tested in different lighting conditions and found no significant difference.

Quest 3 Hand-tracking Latency

How does that compare to Quest 3, a headset which isn’t solely controlled by the hands? Using a similar test, we found Quest 3’s hand-tracking latency to be around 70ms on Quest OS v63. That’s a substantial improvement over Vision Pro, though actually using the headset would make you think Quest 3’s hand-tracking latency is even lower. It turns out some of the latency is masked.

Here’s how we found out. Using a 240Hz through-the-lens capture, we did the same kind of motion test as we did with Vision Pro to find out how long between the motion of the passthrough hand and the virtual hand. That came out to 31.3ms. Combined with Quest 3’s known passthrough latency of about 39ms that makes Quest 3’s photon to hand-tracking latency about 70.3ms.

When using Quest 3, hand-tracking feels even snappier than that result suggests, so what gives?

Because Quest 3’s passthrough latency (about 39ms) is roughly three-and-a-half times that of Vision Pro (about 11ms), more of the total latency is hidden: the time between seeing your hand move and seeing your virtual hand move appears to be just 31.3ms (compared to 116.7ms on Vision Pro).
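In other words, the perceived delay is the total motion-to-photon latency minus the passthrough latency, because the passthrough image of your hand is itself delayed. A quick sketch of that relationship (the function name is ours), using the figures measured above:

```python
def perceived_hand_lag_ms(total_latency_ms, passthrough_ms):
    """The visible gap between the passthrough hand moving and the
    virtual hand moving: total latency minus passthrough latency."""
    return total_latency_ms - passthrough_ms

print(round(perceived_hand_lag_ms(127.7, 11), 1))  # Vision Pro → 116.7
print(round(perceived_hand_lag_ms(70.3, 39), 1))   # Quest 3    → 31.3
```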

– – — – –

An important point here: latency and accuracy of hand-tracking are two different things. In many cases, they may even have an inverse relationship. If you optimize your hand-tracking algorithm for speed, you may give up some accuracy. And if you optimize it for accuracy, you may give up some speed. As of now we don’t have a good measure of hand-tracking accuracy for either headset, outside of a gut feeling.

The post Vision Pro and Quest 3 Hand-tracking Latency Compared appeared first on Road to VR.

Meta Shows Off Quest Hand-tracking Improvements, Claims “almost as responsive as controllers”

Apple is going all-in with hand-tracking for its upcoming Vision Pro, but can hand-tracking really replace proper VR controllers? Meta, Apple’s main competitor in the space, isn’t going so far as to make that particular claim, however the company says Quest’s recent hand-tracking improvements are now “almost as responsive as controllers” thanks to its recent v56 software update.

First announced in late July, the company’s Hands 2.2 tracking update introduces what Meta says is better hand responsiveness, along with a few new experimental features that we’ll probably see in Quest 3.

Now rolling out to Quest headsets, Meta says users should notice hand-tracking latency reduced “up to 40%” in regular use, and “up to 75%” during fast movement. Meta says those dramatic gains in fast movement latency are thanks to the introduction of a new Fast Motion Mode (FMM) for more frenetic games, like fitness and rhythm games that require you to punch incoming objects.

Here’s a look at controllers relative to the new Hands 2.2 release:

v56 is also rolling out to Quest Pro, which includes two new experimental features: simultaneous hands and controllers (Multimodal) tracking, and controller-driven hand pose (Capsense Hands).

Multimodal tracking is said to unlock a number of mixed input styles, including hand+controller gameplay, instant transitions between hands and controllers, and improved social presence when using one controller. It’s only available on Quest Pro for experimentation, although Meta plans to add support for additional devices later.

On the other hand, Capsense Hands lets developers show a natural hand model visualization on top of, or instead of, a user’s controller. Check out both in action in the video below:

“Hand Tracking gives your app’s users the ability to engage with their VR or mixed reality environment in a way that is natural and intuitive,” Meta says in a developer blogpost. “These interactions deepen the immersive experience and help people feel more connected to what’s going on around them in-headset. Hands can also provide a faster on ramp for users new to VR. By integrating Hand Tracking in your existing apps, you can give your users more flexibility to tailor their experience and find what works best for them—and thanks to Hands 2.2, you can feel confident knowing your app’s users will enjoy a great experience regardless of whether they play with hands or controllers.”

Meta says we should hear more about its hand-tracking upgrades in the near future, as the company is hosting its annual Meta Connect developer conference on September 27th. That event ought to include an info dump (and a likely release date) for the upcoming Quest 3, which is bringing much of the functionality of Quest Pro to the consumer price point of $500.

Meta’s New Demo App is Like ‘Beat Saber’ for Hand-tracking

Meta’s hand-tracking has improved by leaps and bounds since we first saw it on the original Quest in 2019, but as Apple serves up stiff competition with its upcoming Vision Pro mixed reality headset, Meta tossed out a new hand-tracking demo that shows off the benefits of its latest software update.

As a part of Quest’s v56 software update, Meta unveiled Hand Tracking 2.2, which the company says focuses on hand responsiveness. If Meta wants to beat Apple at its own game—Vision Pro’s input relies primarily on hand-tracking—it really needs to make hand-tracking as reliable as possible.

Specifically, Meta says in a blogpost that its Hand Tracking 2.2 update reduces hand-tracking latency by 40% “in typical usage” and “up to 75% during fast movement.”

To show off its latest hand-tracking improvements, Meta released a first-party app called Move Fast, which is pretty similar to the company’s own block-slashing rhythm game Beat Saber, albeit tasking you with chopping, punching, and blocking incoming objects.

With only four songs to play through however, Move Fast isn’t meant to be a full game, as Meta says it’s more for demonstration purposes so developers can see how the company’s Interaction SDK can now handle fast-action fitness types of apps.

To try it out for yourself, download Move Fast for free on Quest App Lab, which supports both Quest 2 and Quest Pro. Meanwhile, check out a brief clip of the demo in action below:

New Leap Motion 2 Brings High-end Hand-tracking to Standalone Headsets

10 years after the launch of Leap Motion—which garnered praise for offering some of the best hand-tracking in the industry—the company has announced a next-generation version of the device which now supports standalone XR headsets in addition to Windows and MacOS.

Years before the modern era of VR, Leap Motion set out to build a hand-tracking module that it hoped would revolutionize human-computer interaction. Launched initially in 2013, the device was praised for its impressive hand-tracking, but failed to find a killer use-case when used as an accessory for PCs. But as the VR spark began anew a few years later, Leap Motion’s hand-tracking started to look like a perfect input method for interacting with immersive content.

Between then and now the company pivoted heavily into the VR space, but didn’t manage to find its way into any major headsets until well after the launch of first-gen VR headsets like Oculus Rift and HTC Vive (though that didn’t stop developers from attaching the Leap Motion module and experimenting with hand-tracking). Over the years the company kept honing its hand-tracking tech, improving the software stack so that hand-tracking with the first-generation module got better over time.

First generation Leap Motion | Image courtesy Leap Motion

(It should be noted that Leap Motion was once both the name of the device and the company itself; Leap Motion merged with another company to form Ultraleap back in 2019.)

More recently the company has built newer versions of its hand-tracking module—including integrations with headsets from the likes of Varjo and Lynx—but never sold that newer hardware as a standalone tracking module that anyone could buy. Until now.

Leap Motion 2 is the first new standalone hand-tracking module from the company since the original, and it’s already available for pre-order, priced at $140, and expected to ship this Summer.

Purportedly built for “XR, desktop use, holographic displays, and Vtubing,” Ultraleap says the Leap Motion 2 is its “most flexible camera ever” thanks to support for Windows, MacOS, and standalone Android headsets with Qualcomm’s XR2 chip.

Image courtesy Ultraleap

From a specs standpoint, the company says the new tracker has “higher resolution cameras, increased field-of-view, and 25% lower power consumption, all in a 30% smaller package for optimum placement and convenience.”

Ultraleap says that Leap Motion 2 will give developers an easy way to experiment with high-quality hand-tracking by adding it to headsets like Varjo Aero, Pico Neo 3 Pro, and Lenovo’s ThinkReality VRX. The company also plans to sell a mount for the device to be attached to XR headsets, as it did with the original device.

Image courtesy Ultraleap

And with the launch of this next-gen hand-tracking module, Ultraleap says it’s moving on from the original Leap Motion tracker.

“Existing customers [using the first Leap Motion module] may continue to access the latest compatible software including the soon-to-be-released Gemini for macOS. Support will also continue to be provided. Future versions of the software will not deliver any performance improvements to the original Leap Motion Controller device,” the company says.

Ultraleap said it has sold more than 1 million Leap Motion trackers to date, with some 350,000 developers having built apps and experiences using the company’s hand-tracking tech.

Open-Source Project Bringing Hand Tracking To Valve Index And Reverb G2

An open-source project is bringing controller-free hand tracking to PC VR headsets.

The open-source Linux-based OpenXR platform Monado just added hand tracking. Hand tracking is a built-in feature on standalone headsets like Quest, Pico 4, and Vive XR Elite, but isn’t currently natively available on SteamVR except through third-party attachments such as Ultraleap.

The new feature fully supports Valve Index and has “degraded quality” support for Oculus Rift S and WMR headsets like HP Reverb G2 – though that should be fixed “soon”.

Collabora, the group developing Monado, claims the feature can track fast hand movements and is usable for drawing, typing, and UI interaction in specialized apps. It’s mainly intended to be used with your hands separated, with “limited” support for hand-over-hand interactions.

Monado also supports inside-out headset positional tracking on Linux, allowing Valve Index to be used without base stations.

Of course, almost all PC VR owners use their headsets through Windows, not Linux. Collabora says a Windows SteamVR driver for its hand tracking tech should arrive “in the coming weeks”, alongside improvements to stability and jitter. There are no announced plans for the headset positional tracking to come to Windows, though.

Whether the driver becomes popular enough to encourage developers whose games already support hand tracking on standalone headsets to also support it on PC is a very different question, but for specialized applications this could still prove a very useful feature.

Meta Interaction SDK Gets Hand Tracking Teleport Gesture With Demo Game

Meta added a hand tracking teleportation system to its Interaction SDK.

The Interaction SDK is a Unity framework providing high quality common hand interactions for controllers and hand tracking. It includes direct object grabbing, distance grabbing, interactable UI elements, gesture detection, and more. This means developers don’t have to reinvent the wheel, and users don’t have to relearn interactions between apps using the SDK.

The next version of Interaction SDK adds gestures and visualization for teleportation and snap turning when using controller-free hand tracking. Gesture-based locomotion systems like this are necessary for adding hand tracking to apps and games where you explore a virtual world.

To point to where you teleport you turn your hand to the side and extend your index finger and thumb while closing your other fingers to your palm. To perform the teleport, just pinch your index finger to your thumb. It’s somewhat similar to the pinch “click” used in the Quest system interface, but with your hand rotated.

Some hand tracking apps such as Waltz Of The Wizard already implement their own teleportation gesture, but Interaction SDK should let any developer add it without needing to build their own.

You can try out Meta’s hand tracking teleportation system in the First Hand demo on App Lab. It showcases many Interaction SDK features, and now has a Chapter 2 to show locomotion too.

Quest v50 Lets You Direct Touch The Interface With Your Fingers

The Meta Quest v50 update adds an experimental feature called Direct Touch.

Interacting with the Quest system interface currently requires pointing and clicking. With controllers, this “click” is done with the trigger or A/X button, while hand tracking requires pinching your index finger to your thumb. But the quality of controller-free hand tracking on Quest is still far from perfect. The jitter and inaccuracy present in anything less than ideal conditions can make pointing and pinching a frustrating experience.

Direct Touch will let you simply reach out and touch the interface, tapping and swiping as you would with a smartphone or tablet. It works with both controllers and hand tracking.

We haven’t yet tried v50, but we’ll be curious to see how this feels given that unlike a real touchscreen there’s no physical surface for your fingers to press against. Further, keeping your hands held up to even a real vertical screen for an extended period is tiring, leading to what’s termed “gorilla arm” syndrome. However, the main use of the Quest system interface is just to quickly change settings or launch apps, so this may not be a problem for most people.

If it’s implemented well Direct Touch should also make entering text while using hand tracking much more practical, as this is currently an incredibly slow and frustrating experience.

Direct Touch is an experimental feature, so once you get v50 you’ll need to enable it in the Experimental tab of the settings.

The Story of Unplugged: Bringing Air Guitar To Life In VR

When it comes to hand tracking games on Quest, nothing really comes close to Unplugged.

Developed by Anotherway and published by Vertigo Games in late 2021, Unplugged is an air guitar game, inspired by Guitar Hero and many others, that lets you shred in VR with a virtual guitar and your real hands.

As I’ve said elsewhere, Unplugged leverages Quest’s hand tracking technology to breathe life into the imaginary act of air guitar. In doing so, it takes hand tracking to a whole new conceptual and technological level, surpassing everything else available on Quest.

“From the very beginning, our obsession was to understand how the technology is limited and try to polish that stuff,” says studio director and Unplugged creator Ricardo Acosta. “That was the very first thing. Not the graphics, not even the gameplay.”

After speaking with Acosta in our virtual studio (full video interview embedded above), it’s clear that creating a polished and tangible experience was always the goal. “I think that hand tracking is here for good,” he tells me. “I wanted to create something that worked for real. It wasn’t just another demo.”

Such strong commitment to this new form of input is a big call, especially for Acosta, who spent years as a hand tracking skeptic while working on the HoloLens team at Microsoft. “When I was at Microsoft, I was like an advocate for controllers,” he says with a laugh. “At Microsoft, they are all about hand tracking, but I was like, ‘No guys, we need controllers. Controllers are great.’ And now I’m saying the exact opposite thing.”

“On the first version of the HoloLens … you have hand tracking, but just like the blob. It’s just the hand, not the fingers.” Without full and reliable finger tracking, Acosta came away disappointed and skeptical. “With the HoloLens 2, it was a bit better, but the lag between your movement and the hand was very big, for a lot of technical reasons.”

Even so, Unplugged was first conceptualized in 2015 — well before modern VR headsets offered hand tracking functionality. “I remember being in a concert in Prague and I was just like doing air guitar,” he recalls. “And at some point I was like, oh, this is an interaction that could work in VR.”

“As soon as I went back home, I prototyped something … and it totally worked. It was like, oh, this is good. This is something that we could actually turn into a game.” The original idea developed into something akin to Rock Band but for VR, using controllers and the first Vive headsets and Oculus SDKs. Acosta said he quit his job at Microsoft to work on the prototype, titled Rock the Stage, over the course of four months.

“I think that it was pretty good,” he says of the Rock the Stage prototype, of which videos still exist online.  “The best thing it was that it made you feel like you were there.” But Acosta soon ran into a road bump — music games, and particularly the associated licensing, are complicated work. “You need a lot of money. You need a team of people handling all that music licensing. And I didn’t have all that back in the day. So I decided, at some point, to go back to my job.”

After continuing in Microsoft’s VR/AR division for another few years, Acosta revisited the concept in 2020 while bored at home during the pandemic. “Oculus [had] just released the hand tracking system [for Quest] and suddenly it came to me like, ‘Oh my god, I could actually rescue that…prototype and try [to see] if it works using hand tracking.'”

Even in the early stages, hand tracking felt like a turning point for the previously controller-only experience. “It worked so well. . .Back in the day with the controllers was nice, but with hand tracking was exactly what it should be.” Acosta adapted his original prototype into something new, ditching controllers for something much more freeing and immersive. “When I put [hand tracking] on the prototype, it wasn’t perfect, but it was good enough for me to start polishing the experience. I knew that with a bit of work and a few algorithms on top of the hand tracking, I could make it work.”

Acosta created a video showcasing the new prototype game and posted it to social media. It soon exploded and attracted a lot of interest, especially from publishers offering funding. After discussing options with a few different publishers, Acosta signed with Vertigo Games. “They offered the best deal. And also they were bigger, and they really had a super nice vision about what the game should be.”

“At first I was a bit scared about it, because it was a super big project. We didn’t have a company together. It was complicated.” What started as a one-man show had to turn into a burgeoning team. Acosta’s wife joined as a project manager and they were then joined by a few others to make up the small studio now known as Anotherway.

“We are six people now, which is not a lot,” he says. “Very recently, we had the opportunity to grow a little bit, but we decided to stay small. I’ve been working in Microsoft for most of my career. That is a very big company and it’s amazing, but I really like working with just a very small amount of people. It’s a very creative environment.”

Working alongside Vertigo, Unplugged quickly developed into a project with bigger ambitions than Acosta had ever imagined. “I’m very conservative in terms of adding features, because I know that anything you add to a project, it will create a lot of problems, a lot of bugs, a lot of things.”

“They pushed for more staff. They wanted more music, they wanted more venues, they wanted more quality on the game and they’ve been always pushing for that. And I think that, in general, the game would have been way smaller without Vertigo,” he says.

In particular, working with Vertigo opened up opportunities when it came to the proposed tracklist. “In the very beginning we were just going for small bands. And then when we signed up with Vertigo they were like ‘No, like indie bands are cool and we will have a few. But we need famous bands.’ And we were like, oh, but that’s going to be super complicated.”

Vertigo sent Anotherway a Spotify playlist and asked them to add any songs they might want in the game. “And we were like ‘Wait, whatever music?'” It was a big mental shift.

The Offspring’s The Kids Aren’t Alright was the first major song that Vertigo and Anotherway secured the rights to. “We were just like jumping, like, ‘Oh my god, we made it.'” The final selection included some massive artists — The Clash, T. Rex, Weezer and Steel Panther, to name a few. “[Music licensing] is a very time-consuming process, and I knew that. So not even in my wildest dreams I would have dreamed about having Weezer or Tenacious D, The Offspring, or Ozzy…”

The inclusion of Tenacious D’s Roadie is particularly special to Acosta — not only is the band one of his favorites, but he had used the song all the way back in 2015 in the very first prototype. However, the song almost didn’t make it into the final game at all.

Vertigo and Anotherway initially struggled to make contact with Tenacious D to secure the rights. However, Vertigo had a trick up its sleeve — Guitar Hero legend Mark Henderson had been brought on board to assist with the game. “He was like, ‘Guys, leave it up to me. I’ll make it happen.’ So somehow he contacted the manager of Tenacious D and started talking to them.”

With Henderson’s help the rights to the song were secured. But another problem emerged — with a PEGI 12 rating, Roadie’s explicit and frequent F-bombs weren’t going to cut it. “So at another point we were like, ‘Okay, we have the song now, but we cannot use it because we are PEGI 12, so we have to take it out from the list.'”

Acosta made his peace with leaving the song off the tracklist but, in his words, “maybe the stars were in a particular position that night.” Henderson was able to get Tenacious D back into the studio to re-record a clean version of Roadie, specifically for Unplugged, excluding all the swearing.

“It was insane,” says Acosta. “Knowing that my favorite band re-recorded a song just for the game. It’s insane. It’s just amazing. And a lot of people have complained about the fact that it’s a different version of the song, without the swearing. But I’m so proud of that. To me, it’s even better because it’s our song.”

With a solid tracklist secured, Acosta and the team at Anotherway set to work on creating an unforgettable and reliable hand tracking experience. “I am a UX designer, so for me, the main important thing on anything is user experience. If the experience is not good, the whole game won’t work, or the whole experience will be shit, and we didn’t want that.”

As a result, the gameplay itself was adapted and designed to work with, not against, hand tracking. Even tiny changes made a big difference — the size of the guitar in Unplugged, for example, is a bit smaller than a regular, real-life guitar, which helps keep your hands in view of the cameras.

“In the beginning, with hand tracking 1.0, we had to be very aware of your movements,” he explains. “We had to create the mapping [for] the music charts in a way that is always aware of the limitations of the technology.”

That meant that at launch, the mapping in Unplugged didn’t always completely follow the music, leading some players to complain that the music and the notes didn’t always line up. “And we knew why, but we couldn’t do anything about it, because the hand tracking was very limited and you couldn’t move your hand that quickly,” he said.

Nonetheless, Acosta remains proud of the experience offered at launch. “In the first version, it was absolutely playable. Obviously it wasn’t perfect, but it was playable. And I think that we proved that you can actually create a hand tracking game that uses hand tracking very intensively.”

Fast-forward to a few months after launch, and the release of Meta’s Hand Tracking 2.0 software offered huge gains for Unplugged. Not only was the technology more reliable than ever, it was so good that Anotherway went back and re-mapped the entire tracklist for increased accuracy and challenge. “We want the game to be fully accessible for everyone, obviously. But I think that for 98% of people, the game works very well.”

Nonetheless, Anotherway are still implementing algorithms and workarounds to account for error and improve the experience — the latest being an AI system. “We’re using deep learning in order to see where your hands should be or what’s your pose or what’s your intentions. We made all that stuff so [that] when there is a problem with the hand tracking, there is another layer trying to help and trying to make the experience as smooth as possible.”

There’s more to come too. In the short term, Anotherway just released a new DLC pack — featuring songs by metal band Pantera — and are working on an upcoming accessibility update adding new features and “another thing” that is top secret but will be “really big.”

In terms of song selection, there’s definitely more on the way. “We are working to add more music all the time. We want to add free music [as well], not just DLC. Also, I want to add more indie music because I think that there is a lot of really good indie music out there.”

But what about the long term? What does the next year or more look like for Unplugged? “I cannot talk too much about it because Vertigo will kill me,” Acosta says with a laugh. “But our plans are very big. Unplugged is going to become bigger, at least in terms of features…”

“I would be very excited about Unplugged if I knew what’s going to happen. Probably like in a year, Unplugged will be very different. It will have way more stuff. That’s it. That’s all I can say.”

For a game that has already pioneered a new technology on a cutting edge piece of hardware, there could be a lot of interesting developments in Anotherway’s future.

“Unplugged is going to move forward,” Acosta said. “That is for sure. We are not staying still.”


Unplugged is available on Quest headsets and hand tracking-enabled PC VR headsets on Steam. You can read our full and updated 2022 review of the game here.

Four Quest 2 Apps Making Use of Hand-tracking 2.0 Improvements

Alongside the announcement that Quest 2 is getting upgraded controllerless hand-tracking which improves robustness, Meta revealed that it had given select developers access to the tech ahead of time, some of which used it to directly improve their apps.

The new and improved Quest 2 hand-tracking is said to have begun rolling out this week. And some of the first developers to get access to the tech have shared their thoughts on the improvements and in some cases have even updated their apps to take advantage of the new system.

Unplugged: Air Guitar

Perhaps the most clear example of the improved Quest 2 hand-tracking leading to a better experience is with Unplugged: Air Guitar, a VR rhythm game which uses hand-tracking to give players a ‘real’ air guitar. The studio behind the game, Anotherway, said the improvements to hand-tracking were so significant that it was inspired to re-map some of its songs with added complexity, thanks to the improvements in tracking fast moving hands.

“The update to hand-tracking is a very big deal for us. Unplugged is a game that intensively uses hand-tracking for an authentic sense of air guitar gameplay. There are a lot of fast hand movements and rapid chord changes among others. Since the beginning, we wanted to make the players feel like they were playing actual air guitar. Even though we managed to achieve very solid and accurate gameplay using the older version of hand-tracking, we had to put some limitations on our gameplay. This needed to be done in order to provide a smooth experience that’s not interrupted by any issues that such a new technology might have from time to time,” the studio said in Meta’s announcement of the update to Quest 2’s hand-tracking. “With the latest update, hand-tracking is so accurate and responsive that we could include all the perks we couldn’t before: fast changes of finger positions, plus an increased and more realistic number of notes that makes the songs feel way more authentic.”

Cubism

While Unplugged: Air Guitar benefited greatly from the new system’s ability to more reliably track fast moving hands, VR puzzle game Cubism saw the most benefit from the increased robustness of hand-tracking when hands cross or cover each other.

“This update to hand-tracking is a big step forward in tracking quality and makes playing Cubism with hands feel a lot more stable and consistent. Previously, Cubism’s hand tracking relied on smoothing the hand data input to produce a stable input method. Furthermore, players needed to be taught not to cross their hands since this negatively affected tracking,” the game’s developer shared in the announcement. “This is all improved with the latest hand-tracking—which is consistent enough for me to turn off hand smoothing by default.”

See Also: The Design Behind ‘Cubism’s’ Hand-tracking

Hand Physics Lab

Hand Physics Lab, another title that makes exclusive use of Quest 2 hand-tracking, also benefited from the improvements in robustness, especially given the way the game encourages using both hands near each other at the same time.

“This update to hand-tracking is a big step forward for natural and intuitive interactions with hands. One of the biggest challenges when building a hands-first application is the reliability and accuracy of hand tracking provided by the system,” the developer said. “Hand Physics Lab was designed to highlight what was possible at the time and to challenge the user to play with the limitations of the technology. With this big improvement, we hope more people will discover what hand tracking has to offer to immersive experiences.”

Liteboxer

VR fitness game Liteboxer is all about—you guessed it—boxing. Naturally that means hands flying quickly through the air… something the previous Quest 2 hand-tracking system wasn’t great at. With the improved system, Meta says Quest 2 is much better at keeping track of fast moving hands including situations where not all the fingers can be seen (like when making a fist!).

“The update to hand-tracking has definitely improved our ability to deliver a flawless workout experience for Liteboxer Users. Our workouts require a lot of quick punches to be thrown, and it’s imperative for hand-tracking to keep up with the rigorous pace,” said the developer. “We’re really happy with this latest update and excited about the overall direction hand-tracking is headed on the Quest platform.”

– – — – –

Hand-tracking on Quest has been reasonably functional from the start, but latency and reliability issues have left something to be desired. With the hand-tracking 2.0 update on Quest 2, it looks like Meta is taking a strong step forward to make hand-tracking more useful overall, which will benefit all hand-tracking apps whether or not they make specific changes to take advantage of it.

The hand-tracking update for Quest 2 is included with the v39 software which is slowly rolling out to headsets now.

The post Four Quest 2 Apps Making Use of Hand-tracking 2.0 Improvements appeared first on Road to VR.

Quest 2 Hand Tracking 2.0 Handles Fast Movements, Occlusion, And Touching (Update)

Developers are beginning to roll out major hand tracking improvements to their Quest apps as v39 from Meta enables Hand Tracking 2.0. The first apps with the improved feature include Cubism and Unplugged: Air Guitar, with Hand Physics Lab releasing the feature as a beta update that’s likely to roll out more broadly this week.

Quest 2’s new Hand Tracking 2.0 mode brings some dramatic improvements to using your hands without controllers. Meta says its researchers and engineers “developed a new method of applying deep learning to better understand hand poses when the device’s cameras can’t see the full hand or when the hand is moving quickly”, describing the result as “a step-function improvement in tracking continuity”.

The new mode can apparently handle your hands moving quickly, one hand covering the other, and even your hands touching – scenarios which previously caused the tracking to temporarily break. This should make Hand Tracking much more practical to use, and enable new actions like clapping and counting on fingers.

For now Hand Tracking 2.0 is an optional developer side upgrade, so you’ll need to wait for apps to release updates to support it to see any of these improvements. However, later this year Hand Tracking 2.0 will become the default.
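The opt-in is made in the app’s Android manifest. A sketch of the relevant entries, based on Meta’s developer documentation at the time (attribute names and values should be verified against current docs before use):

```xml
<!-- In AndroidManifest.xml: the permission and feature declarations go
     under <manifest>; the version selector goes inside <application>. -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<uses-feature android:name="oculus.software.handtracking" android:required="false" />

<!-- Inside <application>: opt the app into the 2.0 tracking runtime. -->
<meta-data android:name="com.oculus.handtracking.version" android:value="V2.0" />
```

Because the switch lives in each app’s manifest rather than in system settings, users see the improvement only after a given app ships an updated build.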

The new mode supersedes the High Frequency Hand Tracking mode released last year. That mode was also a developer choice and also improved fast hand movements—as well as reducing latency—but it came with the tradeoffs of slightly increased hand jitter and a cap on the maximum CPU and GPU clock speeds available to apps. Meta claims Hand Tracking 2.0 delivers the fast-movement benefit of High Frequency and more, but runs at the default frequency, so it doesn’t have those tradeoffs.

Here’s what the developers who had early access to Hand Tracking 2.0 say, via Meta:

“Previously, Cubism’s hand tracking relied on smoothing the hand data input to produce a stable input method. Furthermore, players needed to be taught not to cross their hands since this negatively affected tracking. This is all improved with the latest hand tracking – which is consistent enough for me to turn off hand smoothing by default.” – Cubism developer Thomas Van Bouwel

“This update to hand tracking is a big step forward for natural and intuitive interactions with hands. With this big improvement, we hope more people will discover what hand tracking has to offer to immersive experiences.” – Hand Physics Lab

“Even though we managed to achieve very solid and accurate gameplay using the older version of hand tracking, we had to put some limitations on our gameplay. This needed to be done in order to provide a smooth experience that’s not interrupted by any issues that such a new technology might have from time to time. With the latest update, hand tracking is so accurate and responsive that we could include all the perks we couldn’t before: fast changes of finger positions, plus an increased – and more realistic – number of notes that makes the songs feel way more authentic.” – Unplugged: Air Guitar

“Our workouts require a lot of quick punches to be thrown, and it’s imperative for hand-tracking to keep up with the rigorous pace. We’re really happy with this latest update and excited about the overall direction hand-tracking is headed on the Quest platform.” – Liteboxer

This article was originally published April 19 but the publish date was changed on May 2 with the addition of a new first paragraph to note the first apps getting this update.