The Metaverse Gets More Tactile With Meta Quest’s 2.0 Hand Tracking Upgrade

During Meta’s Connect conference in October 2021, the company talked about its Presence Platform and the SDKs (Insight, Interaction, Voice) being made available to virtual reality (VR) developers. Continuing to build towards ever more immersive experiences, it has today released a major upgrade to Meta Quest’s Hand Tracking capabilities for creators to make use of.

Meta Quest - Hand Tracking

Meta has upgraded the Hand Tracking API to improve performance, tracking continuity, and gesture recognition. As you can see from the imagery and videos below, the Meta Quest can now follow fast and overlapping hand movements rather than fading out the occluded hand, opening up a whole new range of gestures such as clapping or passing an item from one hand to the other.

These are physically simple, natural interactions, and supporting them adds a touch more presence to videogames and metaverse experiences. Additionally, the update improves familiar hand tracking actions like pinch, grab and poke recognition, making them far more seamless and less prone to misread gestures.

“In the metaverse, your hands will move as naturally as they do in the physical world. Our Presence Platform’s hand tracking tech keeps improving to support more gestures, tracking continuity, and fast movement. Developers can now integrate more responsive hands into apps. For example, the guitar app Unplugged will support faster riffs and more notes to make playing in virtual reality more realistic,” says Mark Zuckerberg in a blog post.

Liteboxer

“This update to hand tracking is a big step forward in tracking quality and makes playing Cubism with hands feel a lot more stable and consistent. Previously, Cubism’s hand tracking relied on smoothing the hand data input to produce a stable input method,” says Cubism’s Thomas Van Bouwel. “Furthermore, players needed to be taught not to cross their hands since this negatively affected tracking. This is all improved with the latest hand tracking—which is consistent enough for me to turn off hand smoothing by default.”

There’s nothing Meta Quest users need to do to get the Hand Tracking 2.0 upgrade; it’s all handled on the developer side, with creators simply needing to opt in to use the new functionality. Developers of apps that already utilise hand tracking, such as Cubism, Unplugged, Hand Physics Lab and Liteboxer, got an early look.

From the look of it, Meta’s Hand Tracking 2.0 brings it closer in line with Ultraleap’s fifth-generation Gemini software, which also improved on the company’s occlusion tracking.

As further enhancements are made to Meta Quest’s software, gmw3 will keep you updated.

“Major improvements” Coming to Quest 2 Hand-tracking with ‘2.0’ Upgrade

Meta today announced “major improvements” coming to Quest 2’s controllerless hand-tracking capability. The ‘re-architected computer vision and machine learning approach’ is said to specifically improve reliability for overlapping or fast-moving hands and specific gestures. The SDK and OS update enabling these improved capabilities will begin rolling out today.

Meta first introduced controllerless hand-tracking to the original Quest back in late 2019, where it remained an ‘experimental’ feature until mid-2020, when the company began allowing developers to use the new capability in their apps.

Since then we’ve seen a handful of games incorporate hand-tracking into their apps and even the launch of some games that exclusively rely on hand-tracking, like Hand Physics Lab (2021) and Unplugged: Air Guitar (2021).

Now, a little less than a year later, Meta says it’s bringing “major improvements” to Quest 2’s hand-tracking capability (the company confirmed the original Quest will not receive these improvements).

The improvements come thanks to a ‘re-architected computer vision and machine learning approach’ which improves the robustness of hand-tracking in key ways.

With the 1.0 version of hand-tracking on Quest 2, the system had particular trouble recognizing the user’s hands when they obstructed or touched each other and when moving quickly. From the user’s point of view, their virtual hands would disappear momentarily during these lost tracking moments and then reappear once the system detected them again.

With the 2.0 version of hand-tracking on Quest 2, Meta says the system will handle those obstructed and fast-moving scenarios much better, leading to fewer instances of disappearing hands. The company calls it a “step-function improvement in tracking continuity.”

The update is also said to improve gesture recognition in the hand-tracking system. Gesture recognition looks for specific hand-poses which the system detects as unique and can therefore be used as inputs. For instance, pinching is one such gesture and it’s employed to allow users to ‘click’ on elements in the Quest interface.
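To illustrate how a gesture like pinch-to-click can work in principle, here is a toy sketch. This is not Meta’s actual pipeline — the thresholds, hysteresis band and fingertip inputs are all assumptions for the example — but it shows the basic idea: a gesture fires when the thumb and index fingertips come within a small distance, with a wider release threshold so the “click” doesn’t flicker when the distance hovers near the trigger point.

```python
from dataclasses import dataclass
import math


@dataclass
class PinchDetector:
    """Toy pinch recogniser: fingertip distance with hysteresis."""
    press_threshold: float = 0.02    # metres: this close counts as a pinch
    release_threshold: float = 0.035  # must separate this far to release
    pinching: bool = False

    def update(self, thumb_tip, index_tip):
        """thumb_tip / index_tip: (x, y, z) positions in metres."""
        d = math.dist(thumb_tip, index_tip)
        if not self.pinching and d < self.press_threshold:
            self.pinching = True    # edge: pinch started ("click down")
        elif self.pinching and d > self.release_threshold:
            self.pinching = False   # edge: pinch ended ("click up")
        return self.pinching
```

The gap between the press and release thresholds is the part that makes the gesture feel stable: a distance of 3cm is still “pinching” once the gesture has started, but not enough to start one.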

In the demo below, a ‘grab’ gesture is used to hold the virtual object, and the improvement in robustness for clapping is demonstrated as well.


Meta says the hand-tracking 2.0 update on Quest 2 will begin rolling out via an SDK and OS update starting today. The company says developers who have already built hand-tracking into their apps won’t need to change any API calls in order to use the upgraded system, though it won’t be automatically enabled. The company says developers can reference “upcoming documentation” for enabling it in their apps.

The move should bring Quest 2’s hand-tracking a step closer to Ultraleap, which has maintained some of the best hand-tracking in the industry to date, though it isn’t clear yet how the two systems will stack up.

The post “Major improvements” Coming to Quest 2 Hand-tracking with ‘2.0’ Upgrade appeared first on Road to VR.

Accessorise Your Pico Neo 3 With Ultraleap’s New Hand Tracking Addon

One of the definitive leaders in hand tracking technology is Ultraleap, with its tech integrated into devices such as Varjo’s headsets or available as a third-party accessory. It’s the latter that Ultraleap is announcing today, bringing hand tracking to Pico Interactive’s Neo 3 Pro and Pro Eye headsets.

Pico Neo 3 Pro with Ultraleap Hand Tracking
Image credit: Pico Interactive

As you can see in the image above, the setup consists of an Ultraleap Stereo IR 170 camera inside a bespoke mount, with a power cable running to the Pico Neo 3’s USB-C socket. The setup will then run Ultraleap’s fifth-generation hand tracking software Gemini, with Unity and Unreal platforms supported for developers.

It seems the Ultraleap Hand Tracking Accessory won’t be sold as an individual unit for current Neo 3 Pro and Pro Eye owners to add on. Instead, it’ll be sold as a bundle with one of the aforementioned headsets (with Gemini pre-installed) through select retailers, available now in early access for developers and enterprise customers. An official launch will then take place this summer, with prices yet to be revealed.

“VR for training is on the cusp of mainstream adoption and we truly believe hand tracking plays an important part in tipping it over the edge. We’re already seeing significant wins from customers who have deployed VR training programmes or LBE experiences with hand tracking,” said Matt Tullis, VP, XR at Ultraleap in a statement. “This first phase of the Pico relationship will mean more developers and organisations will be able to test, pilot and refine their applications to unlock the true power of VR now and deploy at scale in a few months.”

Pico Neo 3 Pro Eye

“We’re very excited to bring Ultraleap hand tracking to our latest VR headsets through this accessory. When applications need the highest performing hand tracking for complex interactions or challenging environments, Ultraleap’s hand tracking really is world-class. We can’t wait to see what developers and organisations will create from this joint effort,” adds Leland Hedges, GM for Pico Interactive Europe.

Hand tracking has been gaining ground of late, featuring in devices like the HTC Vive Focus 3, whilst the upcoming Lynx-R1 utilises hand tracking (Ultraleap’s again) as its default input method. And, of course, let’s not forget Meta Quest 2, which supports hand tracking out of the box, with titles like Cubism, Vacation Simulator and Clash of Chefs VR all adding hand tracking updates.

gmw3 will continue its coverage of hand tracking as further announcements are made.

Gleechi VirtualGrasp SDK Offers Dynamic Hand Interactions For VR Developers

Gleechi’s VirtualGrasp software development kit is now available through the company’s early access program, offering auto-generated and dynamic hand interactions and grasp positions for VR developers.

We first reported on Gleechi’s technology back in early 2020 when the company released footage of its VirtualGrasp SDK in action. The SDK automates the programming of hand interactions in VR, allowing easy grasping of, and interaction with, any 3D mesh object.

Two years on, the SDK is now available through Gleechi’s early access program, which you can apply for here. The SDK supports all major VR platforms, provided as a plug-in that can be integrated into existing applications, with support for Unity and Unreal, for both controller and hand tracking interactions.

Given the timing of the release, you might ask what the difference is between Meta’s new Interaction SDK and Gleechi’s VirtualGrasp SDK. The key difference is that Meta’s technology uses set positions for grasps and interactions – if you pick up an object, it snaps to pre-determined grasp positions that are manually assigned by the developer.

On the other hand (pun intended), the Gleechi SDK is designed as a dynamic system that can generate natural poses and interactions between hands and objects automatically, using the 3D mesh of the objects. This means there should be much less manual input and assignment needed from the developer, and allows for much more immersive interactions that can appear more natural than pre-set positions.

Gleechi VirtualGrasp

You can see an example of how the two SDKs differ in the graphic above, created from screenshots taken from a demonstration video provided by Gleechi. On the left, the interaction uses the Meta SDK: the mug relies on grab poses manually authored by the developer, in this case set so the user will always grab the mug by the handle. Multiple grab poses are possible with the Meta SDK, but each has to be set up by the developer.

In the middle and the right, you can see how Gleechi’s SDK allows the user to dynamically grab the mug from any angle or position. A natural grab pose is applied to the object depending on the position of the hand, without the developer having to set up the poses manually. It is done automatically by the SDK, using the 3D mesh of the object.

Gleechi also noted that its SDK supports manual grasp positions as well. Developers can use the dynamic grasp system to find a position they’d like to set as a static grasp and then lock it in. For example, a developer could use VirtualGrasp’s dynamic system to pick the mug up from the top, as pictured above, and then set that as the preferred position for the object. The mug will then always snap to that pose when picked up, as opposed to dynamically from any position. This allows you to set static hand grip poses for some objects, while still using the automatic dynamic poses for others.
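The distinction between the two approaches can be sketched in a few lines. This is emphatically not Gleechi’s algorithm (which generates full natural hand poses, not just contact points) — it is a minimal illustration, with made-up data structures, of the underlying idea: a pre-set system snaps to the nearest hand-authored pose, while a dynamic system derives the grasp from the object’s mesh and the hand’s actual approach.

```python
import math


def preset_grasp(hand_pos, preset_poses):
    """Pre-set style: snap to the nearest manually authored pose.
    preset_poses: list of dicts with a 'position' (x, y, z) tuple."""
    return min(preset_poses, key=lambda p: math.dist(hand_pos, p["position"]))


def dynamic_grasp(hand_pos, mesh_vertices):
    """Dynamic style: grab wherever the hand is, using the mesh itself.
    Here the contact point is simply the nearest mesh vertex to the palm."""
    contact = min(mesh_vertices, key=lambda v: math.dist(hand_pos, v))
    # Orient the grip along the palm-to-contact direction (very simplified).
    approach = tuple(c - h for c, h in zip(contact, hand_pos))
    return {"position": contact, "approach": approach}
```

In the pre-set case the result is always one of the authored poses, however the hand approaches; in the dynamic case the grasp follows the hand, which is also how the “lock in a dynamic pose as a static one” workflow described above would work — run the dynamic solver once, then store its output as a preset.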

We were able to try a demo application using the Gleechi SDK on a Meta Quest headset and can confirm that the dynamic poses and interactions work as described and shown above. We were able to approach several objects from any angle or position and the SDK would apply an appropriate grasp position that felt and looked much more natural than most other interactions with pre-set poses and grasp positions.

If you’re interested in learning more, you can check out the VirtualGrasp site or head over to the Gleechi documentation page for a rundown of the SDK’s capabilities.

Hands-On: Quest v37 Features Vastly Improve The Headset’s User Experience

The v37 update for the Quest system software was announced a few weeks ago and is now slowly rolling out to headsets across the world. We’ve gone hands-on with the new update and found it marks a great improvement in user experience and general usability on the system.

Recently, I’ve made it clear that Meta needs to step up its game when it comes to the design of the system UI and user experience on its headsets, particularly with Cambria around the corner and Apple, the leader in intuitive design, looming large over the industry and about to make an entrance. The Quest menu and user experience have been unnecessarily convoluted and unintuitive for a while now, but I’m happy to say the v37 update makes good ground on improvements.

The video above runs through the three biggest features in v37 — the new tablet-to-desktop UI system, tracked keyboard support for Apple’s Magic Keyboard and the new hand tracking menu.

We hope these latest changes form a stable, universal groundwork that will be kept largely consistent for a while. My recent editorial suggested that solid design principles and gradual improvements in Apple’s iOS and iPadOS help foster a familiarity that is important for engaging and connecting with casual users who aren’t part of the wider technology scene. This will also be key on Quest and future Meta headsets as the company’s audience becomes more mainstream.

The tablet-to-desktop system is a fantastic change and — somewhat ironically — includes gestures and actions reminiscent of Apple’s return-to-home and swipe-up-for-multitasking gestures on iOS and iPadOS. The new hand tracking menu is also a huge improvement, adding functionality that’s been largely missing since hand tracking became a default option on the system. Likewise, it’s nice to see support for Apple’s Magic Keyboard, which should give more people an easy way to try out the feature and start working more in VR. With the Logitech K830 seemingly perpetually out of stock, it’s good to have the option of a keyboard model that ships with all iMacs and is readily available across most retailers.

Are you enjoying the v37 update and do you like the new features? Let us know what you think in the comments below.

Quest v37 Update Adds Hand-Tracking Quick Menu, Apple Magic Keyboard Support & More

The full version of Meta Quest v37 is rolling out from today with a handful of new features to talk about.

Firstly, there’s support for a brand new keyboard – Apple’s Magic Keyboard. Once v37 is installed you’ll be able to enable support for the device in Quest’s experimental features settings, meaning your headset will track the keyboard’s position and show it to you inside VR. This is the first new keyboard to be added to Quest since the feature debuted with the Logitech K830 last year. We just launched a guide on the best keyboards for Quest, so expect an update to that soon.

Quest v37 Update Released

Meta Quest v37 update

Elsewhere, v37 introduces link sharing between your Oculus app and headset. Simply share a website link with your headset via the app and it will take you to it when you next put your Quest on.

Perhaps more interestingly, there’s a new Quick Action Menu for hand tracking. Previously you could pinch your thumb and index finger to effectively press the Oculus button and access the same menu you would with Touch controllers. Now this action brings up a quick menu specific to hand tracking that includes options to take screenshots and activate voice commands.

Finally, the update allows you to change display modes for 2D panels. The ‘tablet mode’ is essentially the same as the existing setup, with a larger window closer to you. But ‘desktop mode’ allows you to move the display further away, allowing you to more easily multitask. The Explore section of the menu has also been updated.

Not bad for the first set of updates for the year, then. As always, the update will roll out gradually so, if you don’t have it right now, give it a little time. Previously, we reported that the Public Test Channel version of the v37 update included a hidden peek at the Horizon Home multiplayer feature, though that doesn’t appear to be materializing just yet.

 

Ultraleap Gemini Impressions: Better Hand Tracking Is Coming

If the only controller-free hand tracking you’ve used is Quest 2, you may not have the best opinion of it.

Sure it’s cool to see your actual hands in VR for the first time. But try to navigate Quest’s menu system or grab objects in apps like Hand Physics Lab and you’ll soon realize the tracking quality leaves a lot to be desired.

At CES 2022 I tried Ultraleap’s fifth generation technology, which the company calls Gemini. If you’ve been following VR since before the release of consumer headsets you’ll be familiar with Leap Motion, a startup that launched a desktop hand tracking accessory in 2014 that could be mounted to the front of the Oculus Developer Kit 2. It brought hand tracking to VR before most users even had access to motion controllers.

Leap Motion had a small range of demo applications, and it was even supported in the social platform AltspaceVR. But when the HTC Vive & Oculus Touch controllers launched in 2016 it quickly faded from relevance, despite a major algorithm quality update called ‘Orion’. By 2019 even AltspaceVR dropped support for Leap Motion – developers prioritize hardware users actually own, and even though Leap Motion was relatively inexpensive (around $80), it still faced the classic chicken-and-egg problem of input accessories.

Leap Motion was acquired in 2019 by UK-based haptic technology firm Ultrahaptics, with the merged company becoming Ultraleap. A year later, Facebook shipped a software update to Oculus Quest adding hand tracking. Quest’s hand tracking leverages the onboard greyscale fisheye cameras (which can also see into the infrared spectrum). It does work, and millions of people have used it, but if your hands are at the wrong angle or too close together it quickly breaks down. Trying Ultraleap showed me just how much better hand tracking can be.

VR-quality hand tracking requires sensor overlap – the algorithm compares the perspective of each camera to determine the relative position of each hand. Quest’s cameras have a wide field of view, but since they’re positioned at the edges of the headset pointing outwards the hand tracking area is relatively small. That’s why it only works properly with your hands directly in front of you, and why you can notice the system re-establishing tracking as you bring your hands back into your view.
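The value of overlap comes down to triangulation: when both cameras see the same fingertip, its 3D position can be recovered by comparing the two viewing rays. The following minimal sketch (not any vendor’s implementation — camera origins and directions are assumed inputs) finds the midpoint of the closest approach between two rays, which is the standard least-squares answer when the rays don’t intersect exactly due to noise.

```python
import numpy as np


def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Each ray: a camera origin plus a unit direction toward the fingertip.
    Returns the 3D point midway between the two rays' closest points."""
    oa, ob = np.asarray(origin_a, float), np.asarray(origin_b, float)
    a, b = np.asarray(dir_a, float), np.asarray(dir_b, float)
    # Solve for ray parameters s, t minimising |(oa + s*a) - (ob + t*b)|^2
    m = np.array([[a @ a, -(a @ b)],
                  [a @ b, -(b @ b)]])
    rhs = np.array([(ob - oa) @ a,
                    (ob - oa) @ b])
    s, t = np.linalg.solve(m, rhs)
    return ((oa + s * a) + (ob + t * b)) / 2
```

A point is only recoverable where both camera frusta cover it — which is exactly why forward-facing cameras with near-total overlap give a larger usable hand tracking volume than edge-mounted cameras pointing outwards.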

Ultraleap’s latest Gemini hand tracking tech is the successor to Orion, and Ultraleap claims it was re-written from the ground up. It was announced in 2020 alongside a multi-year agreement with Qualcomm to integrate and optimize it for the Snapdragon XR2 chip. 95% of the algorithm runs on the XR2’s DSP (Digital Signal Processor), freeing up the CPU for the actual VR & AR applications.

I tried Gemini as an attachment to the Vive Focus 3 virtual reality headset and as an integrated part of the upcoming Lynx R1 mixed reality headset – both use the XR2 chip. The two infrared fisheye cameras face forward, not to the sides, giving almost total overlap across a 170-degree field of view. That’s wider than either headset’s lenses (and those of almost every headset in existence).

The result was that just like when using motion controllers, I could stretch my arms out naturally and no longer had to worry about going out of the tracking range.

The hardware, like Leap Motion before it, also has active infrared emitters to illuminate your hands for a better view. And further, the sensors are sampled at 90 frames per second. The combination of these specs and the Gemini algorithm meant the virtual hands seemed to match my own precisely and with no perceptible latency. Gone was the lag and glitching of Quest’s hand tracking – Gemini felt generations ahead. I could even interlock my fingers. Only by almost entirely occluding one hand did tracking start to fail.

And speaking of occlusion, one of the most impressive aspects of the demo was the re-acquisition time. I purposely occluded one of my hands by passing it under a table, but by the time it was visible again tracking had almost instantly resumed.

I have always believed controller-free hand tracking will play an important role in mainstream virtual reality. Some people object to this on the basis of the lack of haptics, a view I can’t really argue with. But others are skeptical on the grounds of a perceived lack of precision. Gemini proves that’s really just the tracking quality of Quest, not a fundamental limit of the technology.

Gemini will be pre-integrated in the Lynx R1 headset, which starts at around $500 for Kickstarter backers. Simply put, there has never been hand tracking of this quality at that price. I’m excited to see what developers will do with it, and for future devices leveraging this tech. Meta has some catching up to do.

First Teaser Trailer Arrives for Finger Gun on Meta Quest

Finger Gun

Using your fingers to pretend to shoot guns, complete with a few “pew-pew” sound effects, always seemed like a natural pastime as a kid, so it’s no wonder that experience inspired the latest videogame announcement for Meta Quest. Miru Studio has teased the first details of its upcoming shooter Finger Gun, which, you guessed it, uses hand tracking.

Finger Gun

Finger Gun is going to be a wave-based shooter with a wild west theme, the main draw being that it’ll use Quest’s hand tracking so you can simply point your finger at an enemy and shoot. Now, as everyone knows, shooting a finger gun means sticking your thumb up and flicking it down to “shoot”, almost as if you’re cocking the hammer on a pistol. While Miru Studio hasn’t confirmed whether that’s the input here, the very brief trailer does show the character’s thumbs flicking a lever on the side of each weapon, so it could well work that way.

The developer has also revealed that you’ll be fighting lots of flying robots with pistols and what look like mini Gatling guns, so there will be a selection of weapons to choose from. It’s currently unknown whether Finger Gun will feature 360-degree environments or a 180-degree style similar to Space Pirate Trainer, and whether any reloading will be required. It could just be an all-out shooting fest.

“As kids, we spent hours playing finger guns and imagining different worlds and enemies to be taken down. We saw the opportunity to bring this feeling back with Quest hand tracking, and that’s precisely what we have done,” said Miru Studio in a statement.

Finger Gun

This will be the first virtual reality (VR) title from Miru Studio, an indie team based in Spain. Finger Gun will join a very select group of titles on Meta Quest that solely employ hand tracking as their control method. Whilst videogames such as Cubism and Clash of Chefs VR have added the option of hand tracking in post-launch updates, only a few VR experiences focus on it entirely; one of the biggest to arrive recently was Anotherway’s Unplugged, where you can jam to classic rock tunes in an air guitar fashion.

As Miru Studio release further details for Finger Gun, VRFocus will keep you updated.

Vive Focus 3’s Latest Update Improves Hand Tracking Feature

HTC Vive Focus 3

HTC Vive launched its latest all-in-one (AIO) virtual reality (VR) headset, the Vive Focus 3, back in June, and since then has been introducing new features whilst improving others. The latest update enhances the hand tracking capabilities of the device, making it more accurate and stable in the process.

Vive Focus 3 hand tracking

Vive Focus 3 comes with its own controllers as standard with the hand tracking only introduced after the official launch. In this week’s free firmware update (v3.0.999.284), users should find that hand tracking now feels more natural, keeping up with quick movements while actions like pinching are more accurate when interacting with virtual objects.

HTC Vive’s hand tracking engine uses a 26-point skeletal hand modelling system to track all your individual finger movements, now used right from the room setup process thanks to the update. You can simply pop the controllers down and the Vive Focus 3 will automatically detect your hands.

Developers working on Vive Focus 3 compatible projects can integrate six predefined hand gestures for easy accessibility, ideal considering the headset is aimed at the enterprise end of the market. This sector tends to lean towards training and development use cases as well as design, all of which can benefit from hand tracking.

HTC Vive Focus 3

Retailing for £1,272, the Vive Focus 3 is based around the Qualcomm Snapdragon XR2 Platform – like so many AIO headsets – sporting dual 2.5K displays (2448 x 2448 pixels per eye), a 90Hz refresh rate, a 120-degree field of view (FoV), an adjustable IPD range from 57mm to 72mm and a rear-mounted battery for even weight distribution.

It’s been quite the hardware year for HTC Vive. Alongside the Vive Focus 3, there’s the new Vive Pro 2 for PC VR gaming and then there’s the Vive Flow. On a slightly different tangent to Vive’s other offerings, the Flow is a lightweight, smartphone-connectable device for media consumption, with a strong focus on mental health.

As HTC Vive continues to improve its hardware lineup, VRFocus will keep you updated.

Manus Launches its Free Motion Capture Software Polygon

Manus Polygon

Manus specialises in building enterprise-level data gloves with precision finger tracking and haptic feedback for a range of use cases including virtual reality (VR). The company is moving beyond pure hardware solutions today by releasing Manus Polygon, motion capture software that’s SteamVR compatible and free to download.

Manus Polygon

Designed as an entry point for developers looking for a simple motion capture solution without the expense, Polygon Free enables live streaming of body data into Unity or Unreal Engine. When it comes to tracker support, Polygon can be used with any SteamVR compatible device, from the Vive controllers for a basic setup to Manus’ own SteamVR Pro Trackers or the Vive Trackers. And, of course, the software is compatible with the company’s own Prime X series gloves.

For a basic motion tracking setup beyond merely using controllers, developers need enough trackers to cover six points: hands, feet, waist and head. With a VR headset worn, that means five extra trackers are required. Polygon can support more though, adding further trackers to the upper arms to finesse that digital avatar movement.

“At Manus, we believe in a future where content creation in the virtual world becomes as integrated as video is currently. Convincing full-body motion capture will play a large part in the adoption and creation of the metaverse,” says Bart Loosman, CEO at Manus, in a statement. “With this release, we invite developers and content creators to dive into full-body motion capture and explore the opportunities this offers for VR, animation, digital avatars, virtual production, and the coming metaverse.”

Manus Polygon

Manus Polygon Free provides all the software functionality developers might need to get started, with Polygon Pro and Polygon Elite offering further professional features. Polygon Pro adds recording and editing tools within Manus Core, as well as FBX exporting, timesync and genlock. Pro users also get the Manus Strap Set to attach SteamVR compatible trackers. Taking that a step further, Polygon Elite includes the Pro bundle, a perpetual license, Manus SteamVR Pro Trackers and a charging station.

The Manus SteamVR Pro Trackers were announced earlier this year with pre-orders taken for them individually. Currently, on the Manus website they only seem to come in a 6-pack retailing for €1,999, available Q4 2021. By comparison, six Vive Trackers would set you back €834.

For continued updates from Manus, keep reading VRFocus.