Case Study: The Design Behind ‘Cubism’s’ Hand-tracking

Hand tracking first became available on the Oculus Quest back in late 2019. Out of enthusiasm for this new input method, I published a demo of Cubism to SideQuest with experimental hand tracking support only a few days later. Needless to say, this initial demo had several flaws and didn’t really take the limitations of the technology into account, which is why I initially decided to omit hand tracking support from the full release of Cubism on the Oculus Store. It took more development, leaning on lessons learned from the work of fellow developers, to build something I was happy to release in the recent Cubism hand-tracking update. Here’s an inside look at the design process.

Guest Article by Thomas Van Bouwel

Thomas is a Belgian-Brazilian VR developer currently based in Brussels. Although his original background is in architecture, his current work in VR spans from indie games like Cubism to enterprise software for architects and engineers like Resolve.

This update builds on lessons learned from many other games and developers who have been exploring hand tracking over the last year (The Curious Tale of the Stolen Pets, Vacation Simulator, Luca Mefisto, Dennys Kuhnert, and several others).

In this article I’d like to share some things I’ve learned when tackling the challenges specific to Cubism’s hand interactions.

Optimizing for Precise Interactions

Cubism’s interactions revolve around placing small, irregular puzzle pieces into a puzzle grid. This meant the main requirement for hand tracking input was precision, both when picking up and placing pieces onto the grid and when picking pieces back out of a completed puzzle. This informed most of the design decisions regarding hand input.

Ghost Hands

I decided early on to not make the hands physics-based, but instead let them pass through pieces until one is actively grabbed.

This avoids clumsily pushing the floating puzzle pieces away when you try to grab them in mid-air. More importantly, it makes plucking pieces from the middle of a full puzzle easier, since you can simply stick your fingers in and grab a piece instead of figuring out how to physically pry it out.
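One common way to implement this is collision filtering: hand/piece contacts are excluded from the physics solver’s collision response but still reported by overlap queries, so grab detection keeps working. Here is a minimal, engine-agnostic sketch of that idea; the layer names and pair tables are my own assumptions, not Cubism’s actual physics setup.

```python
# Sketch of "ghost hands" via collision filtering (assumed layer names).
# The solver never resolves hand/piece contacts, so fingers pass through
# pieces, while overlap queries still report them so grabs can be detected.

def _key(a: str, b: str) -> tuple[str, str]:
    return tuple(sorted((a, b)))

SOLVER_PAIRS = {_key("piece", "piece"), _key("piece", "grid")}
QUERY_ONLY_PAIRS = {_key("hand", "piece"), _key("hand", "grid")}

def solver_resolves(a: str, b: str) -> bool:
    return _key(a, b) in SOLVER_PAIRS

def overlap_reported(a: str, b: str) -> bool:
    return _key(a, b) in (SOLVER_PAIRS | QUERY_ONLY_PAIRS)

if __name__ == "__main__":
    assert not solver_resolves("hand", "piece")  # fingers never push pieces
    assert overlap_reported("hand", "piece")     # but grabs are still detected
```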

Signaled by their transparency, hands are not physical, making it easier to pick out pieces from the middle of a puzzle.

Contact Grabbing

There are several approaches to detecting a user’s intent to grab and release objects, like focusing on finger pinches or total finger joint rotation while checking a general interaction zone in the palm of the hand.

For Cubism’s small and irregular puzzle pieces, however, the approach that best handled the precision requirements was a contact-based one, where a piece is grabbed as soon as the thumb and index finger intersect the same piece and are brought together over a small distance, without requiring a full pinch.

Similar to the approach in The Curious Tale of the Stolen Pets, the fingers are locked in place as soon as a grab starts, to help give the impression of a more stable-looking grab. The piece is parented to the root of the hand (the wrist) while grabbed. Since this seems to be the most stable tracked joint, it helps produce a steadier grip and guarantees the piece stays aligned with the locked fingers.

Piece is grabbed when thumb and index intersect it and are brought together slightly. The rotations of index and thumb are then locked in place to help give the impression of a stable grab.

As soon as a piece is grabbed, the distance between thumb and index is saved, and a release margin is calculated based on that distance. Once thumb and index move apart beyond that margin, the piece is released.
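As a rough sketch of that grab/release logic (all names and threshold values here are illustrative assumptions, not Cubism’s actual implementation):

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

def dist(a: Vec3, b: Vec3) -> float:
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Assumed tuning values, in meters.
CLOSE_BY = 0.008       # fingers must close this much after first contact
MARGIN_FACTOR = 1.5    # release margin relative to the saved grab distance

@dataclass
class Grab:
    piece_id: int
    grab_distance: float     # thumb-index gap saved when the grab started
    release_margin: float    # gap beyond which the piece is released

def try_start_grab(thumb: Vec3, index: Vec3, piece_id: int | None,
                   gap_at_first_contact: float) -> Grab | None:
    """Start a grab once both fingertips intersect the same piece and
    have closed a small extra distance (no full pinch required)."""
    if piece_id is None:
        return None
    gap = dist(thumb, index)
    if gap > gap_at_first_contact - CLOSE_BY:
        return None  # fingers haven't squeezed the piece yet
    return Grab(piece_id, gap, gap * MARGIN_FACTOR)

def should_release(grab: Grab, thumb: Vec3, index: Vec3) -> bool:
    """Release once the fingertips move apart beyond the saved margin."""
    return dist(thumb, index) > grab.release_margin
```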

Several safeguards try to prevent unintentional releases: we don’t check for release when tracking confidence is below a certain threshold, and after tracking confidence is regained, we wait several frames before checking for release again. Fingers are also required to stay beyond the release margin for several frames before the piece is actually released.
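Here is a minimal sketch of how such safeguards might be structured; the confidence threshold and frame counts are invented placeholders, as the real ones would be tuned by hand:

```python
from dataclasses import dataclass

MIN_CONFIDENCE = 0.5      # assumed tracking-confidence threshold
REGAIN_GRACE_FRAMES = 5   # assumed: frames to wait after confidence returns
RELEASE_HOLD_FRAMES = 3   # assumed: frames fingers must stay beyond margin

@dataclass
class ReleaseFilter:
    grace_left: int = 0
    frames_beyond_margin: int = 0

    def update(self, confidence: float, beyond_margin: bool) -> bool:
        """Called once per frame; returns True only when the piece
        should actually be released."""
        if confidence < MIN_CONFIDENCE:
            # finger data is unreliable: don't check for release at all
            self.grace_left = REGAIN_GRACE_FRAMES
            self.frames_beyond_margin = 0
            return False
        if self.grace_left > 0:
            # confidence just came back: wait a few frames first
            self.grace_left -= 1
            return False
        self.frames_beyond_margin = (self.frames_beyond_margin + 1
                                     if beyond_margin else 0)
        return self.frames_beyond_margin >= RELEASE_HOLD_FRAMES
```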

Debug visualization: during a grab, the initial grab distance between fingertips is saved (outer red circle). The piece is released when the real positions of the fingertips move beyond a certain margin (blue circle).

There is also a system in place similar to Vacation Simulator’s overgrab method. Due to the lack of haptic feedback when grabbing a piece, it’s not uncommon for fingers to drift closer to one another during a grab. If they close beyond a certain threshold, the release margins are adjusted to make releasing the piece easier.
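A sketch of what that adjustment could look like (both constants are assumptions, and the margin factor mirrors the grab sketch above):

```python
MARGIN_FACTOR = 1.5       # assumed, as in the grab sketch above
OVERGRAB_FRACTION = 0.6   # assumed: fingers have closed well past the grab gap

def adjust_release_margin(grab_distance: float, current_gap: float,
                          release_margin: float) -> float:
    """If fingers drift together during a grab (no haptics to stop them),
    re-base the release margin on the tighter grip so letting go
    doesn't feel sticky."""
    if current_gap < grab_distance * OVERGRAB_FRACTION:
        return min(release_margin, current_gap * MARGIN_FACTOR)
    return release_margin
```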

Try it yourself: to see these debug visualizations in-game, go to ‘Settings > Hand Tracking > Debug visualizations’ and turn on ‘Interactions widgets’.

Debug visualization: if fingers drift toward each other during a grab beyond a certain threshold (inner red circle), the release margins are readjusted to make releasing the piece feel less “sticky”.

One limit to this approach is that it makes supporting grabs with fingers other than the index a bit harder. An earlier implementation also allowed grabbing between middle finger and thumb, but this often led to false positives when grabbing pieces out of a full puzzle grid, since it was hard to tell which finger the player intended to grab a specific piece with.

This would not have been an issue if grabbing revolved around full finger pinches, since that results in a clearer binary input from which to determine user intent (at the cost of a less natural-feeling grab pose).

Midpoint Check

Besides checking which piece the index and thumb are intersecting, an additional check happens at the midpoint between the index and thumb fingertips.

Whatever piece this midpoint hovers over will be prioritized for grabbing, which helps avoid false positives when a player tries to grab a piece in a full grid.

In the example below, if the player intends to grab the green piece by its right edge, they would unintentionally grab the yellow piece if we didn’t do this midpoint check.
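In sketch form (the helper names are hypothetical; each `*_piece` argument is the piece that point currently intersects, or `None`):

```python
Vec3 = tuple[float, float, float]

def midpoint(a: Vec3, b: Vec3) -> Vec3:
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def pick_grab_target(thumb_piece: int | None, index_piece: int | None,
                     midpoint_piece: int | None) -> int | None:
    """The piece under the thumb-index midpoint wins; otherwise fall
    back to a piece both fingertips agree on."""
    if midpoint_piece is not None:
        return midpoint_piece
    if thumb_piece is not None and thumb_piece == index_piece:
        return thumb_piece
    return None
```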

Left: thumb, index & midpoint between fingertips are in yellow → grab yellow. Right: thumb & index are in yellow, midpoint is in green → grab green

Grabbing the Puzzle

Grabbing the puzzle works similarly to grabbing puzzle pieces, except it is initiated by performing a full pinch within the grab zone around the puzzle.

The size of this zone is dynamically increased when switching from controllers to hands. This makes it a bit easier to grab, and helps reduce the likelihood of accidentally grabbing a piece in the grid instead of the grid itself.
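A small sketch of that rule; the radii and pinch threshold are invented values:

```python
GRAB_ZONE_RADIUS = {"controllers": 0.04, "hands": 0.08}  # assumed, meters
FULL_PINCH = 0.95                                        # assumed threshold

def can_grab_puzzle(pinch_strength: float, distance_to_grid: float,
                    input_mode: str) -> bool:
    """The puzzle itself needs a deliberate full pinch inside a grab
    zone whose size depends on the active input method."""
    return (pinch_strength >= FULL_PINCH
            and distance_to_grid <= GRAB_ZONE_RADIUS[input_mode])
```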

The grab zone around the puzzle expands when switching from controllers to hands, making it easier to grab. Although it requires a full pinch, grabbing the puzzle works similarly to grabbing puzzle pieces.

Dynamic Hand Smoothing

The hand tracking data provided by the Oculus Quest can still have a bit of jitter to it, even when tracking confidence is high. This can affect gameplay too, since jitter is much more noticeable when holding the puzzle grid or a long puzzle piece by its edge, making precise placement of pieces on the grid harder.

Smoothing the tracking data can go a long way to produce more stable looking grabs, but needs to be done in moderation since too much smoothing will result in a “laggy” feeling to the hands. To balance this, hand smoothing in Cubism is dynamically adjusted depending on whether your hand is holding something or not.
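A minimal sketch of frame-rate-independent exponential smoothing with a state-dependent time constant; the constants are assumptions, not Cubism’s tuned values:

```python
import math

FREE_TAU = 0.04   # assumed time constant (s) while the hand is empty
HELD_TAU = 0.12   # assumed: heavier smoothing while holding something

def smooth_position(prev: float, raw: float, dt: float, holding: bool) -> float:
    """Exponential smoothing of one coordinate, independent of frame rate.
    More smoothing while holding trades a little latency for stability."""
    tau = HELD_TAU if holding else FREE_TAU
    alpha = 1.0 - math.exp(-dt / tau)
    return prev + alpha * (raw - prev)
```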

Try it yourself: to see the impact of hand smoothing, try turning it off under ‘Settings > Hand Tracking > Hand smoothing’.

Increasing the smoothing of hand positions while holding objects helps produce a more stable grip, making precise placement on the grid a bit easier.

Pressing Buttons

One thing I noticed with Cubism’s original hand tracking demo was that most people tried pressing the buttons even though that was not supported at the time. Therefore, one of my goals with this new version of hand tracking was to make the buttons actually pushable.

Buttons can be hovered over when a raycast from the index fingertip hits a collider at the back of the button. If the index finger then intersects with the collider, a press is registered. If the index intersects the collider without first hovering over it, no press is registered. This helps prevent false positives when the finger moves from bottom to top.

There are a few more checks in place to prevent false positives: the raycast is disabled when the finger is not facing the button, or when the player is not looking at their finger when pressing.
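Here is a sketch combining both ideas: the hover-before-press rule as a small state machine, plus the two angle gates. All names and angle limits are assumptions for illustration.

```python
import math

Vec3 = tuple[float, float, float]

def _angle_deg(a: Vec3, b: Vec3) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    mag = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / mag))))

MAX_FINGER_ANGLE = 60.0  # assumed: finger must roughly face the button
MAX_GAZE_ANGLE = 45.0    # assumed: player must roughly look at the finger

def raycast_enabled(finger_dir: Vec3, button_inward_normal: Vec3,
                    gaze_dir: Vec3, head_to_finger: Vec3) -> bool:
    """Disable the hover raycast when the finger isn't facing the button
    or the player isn't looking at their finger."""
    return (_angle_deg(finger_dir, button_inward_normal) < MAX_FINGER_ANGLE
            and _angle_deg(gaze_dir, head_to_finger) < MAX_GAZE_ANGLE)

def next_button_state(state: str, hovering: bool, intersecting: bool) -> str:
    """Presses must pass through 'hover' first; touching the collider
    straight from 'idle' (e.g. a finger sweeping up from below) is ignored."""
    if state == "idle":
        return "hover" if hovering else "idle"
    if state == "hover":
        if intersecting:
            return "press"
        return "hover" if hovering else "idle"
    # state == "press": hold until the finger leaves the collider
    return "press" if intersecting else ("hover" if hovering else "idle")
```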

Try it yourself: to see this debug visualization in-game, go to ‘Settings > Hand Tracking > Debug visualizations’ and turn on ‘Interactions widgets’.

Debug visualization: a raycast from the index tip checks whether the finger is hovering over a button. To help prevent false positives, interaction is disabled when the finger is not facing the button, or when the player is not looking at their finger.

Guiding Interactions

One of the main challenges of building any interaction for hand tracking is that, in contrast to buttons on a controller which are either pushed or not pushed, there are many different ways people may try to approach an interaction with their hands while expecting the same outcome.

Playtesting with a diverse set of people can help you learn how they approach the interactions presented to them, and can help refine the interaction cues that guide them to the expected gestures. Playtesting can also reveal outliers you may want to catch by adding some interaction redundancy.

Interaction Cues

There are several cues while grabbing a piece. When a user first hovers over a piece, their index and thumb take on the color of that piece, both to indicate it can be grabbed, and to signal which fingers can grab it (inspired by previous work by Luca Mefisto, Barrett Fox, and Martin Schubert). The piece is also highlighted to indicate it can be grabbed.

Several cues also indicate when the grab is successful: the fingertips become solid, the highlights on the piece flash, and a short audio cue is played.

Various cues both on the hand and the puzzle piece guide and confirm the grab interaction.

Buttons have several cues to help indicate that they can be pushed. Much like with puzzle pieces, the index fingertip is highlighted in white once you hover over a button, indicating which finger can interact. As with controllers, buttons extend outward when hovered, but this time the extended button can actually be pressed: once the index touches it, it follows the finger until it is fully pressed down, at which point an audio cue confirms the click.

A subtle drop shadow on the button surface indicates the position and distance of the index fingertip relative to the button, and helps guide the press interaction.

Various cues guide interactions with buttons: buttons extend outward when hovered, the index fingertip is highlighted, a drop shadow shows where the tip will interact, and the button follows the finger when pushed.

Interaction Redundancy

Since some people may approach interactions in unintended ways, it can be good to account for this where possible by adding some redundancy to the ways people can use their hands to interact. Interaction cues can still guide them to the intended interaction, but redundancy can help them avoid getting unnecessarily stuck.

When it comes to grabbing pieces, a few playtesters would at first try to grab pieces by making a fist instead of using their fingertips. By having the colliders cover the entire finger instead of just the fingertip, a decent number of these fist grabs are still registered.

I should note this approach still needs some improvement, since it can also produce unintended grabs when a lot of pieces are floating around the play area. A better approach in the future might be to also check total finger rotation to account for fist grabs, as sketched below.
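Purely as a hypothetical sketch of that idea (the curl threshold is invented):

```python
FIST_CURL_DEGREES = 180.0  # assumed: total bend across a finger's joints

def looks_like_fist(joint_bend_degrees: dict[str, list[float]]) -> bool:
    """joint_bend_degrees maps finger name -> bend angle at each joint.
    Treat the hand as a fist when all four non-thumb fingers are
    strongly curled."""
    fingers = [sum(bends) for name, bends in joint_bend_degrees.items()
               if name != "thumb"]
    return len(fingers) == 4 and all(total > FIST_CURL_DEGREES
                                     for total in fingers)
```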

Though grabbing is designed around fingertips, colliders on index and thumb cover the entire finger to help catch different forms of grabbing.

With buttons, there were a few playtesters who would try pinching them instead of pushing them. In part, this seemed to occur because they had just learned to pinch buttons in the Oculus home screen right before launching the game.

For this reason, buttons can also be clicked by pinching once they are hovered, and hopefully cues like the highlighted index and drop shadow will eventually guide them to pressing the buttons instead.

Pinching while hovering over buttons also registers as a click.

The first button players encounter when using hands also explicitly states “Push to Start”, to help transition people from pinching to pushing after coming from the Oculus Home menu.

Teaching Limitations

Although the quality of Quest’s hand tracking has improved over the last year, it still has its limitations — and a player’s awareness of these limitations can have a big impact on how good they perceive their experience to be.

Cubism implements a few ways of teaching players about the current limitations of hand tracking on Quest.

When the player first switches to hand tracking (either at launch or mid-game), a modal informs them of some best practices, like playing in a well-lit space and avoiding crossing hands.

When a user switches to hand tracking, a modal informs them about limitations and best-practices. The “Push to Start” instruction helps teach new users that buttons can be naturally pushed in this game.

It is important to acknowledge that most people are likely to immediately dismiss modals like this or quickly forget their guidelines, so signaling why things are going wrong during the experience is also important.

In Cubism, hands turn red to signal that tracking has been lost. In some playtests, people would keep one hand on their lap and play with the other, and be puzzled why their lap hand appeared frozen. To help in cases like this, if tracking loss persists, a message is displayed on the hand clearly stating why it is frozen. If tracking is lost specifically because the player is crossing their hands, the message changes to tell them not to do that.
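As a sketch, this escalating feedback can be driven by a simple timer; the state names and duration are placeholders:

```python
MESSAGE_AFTER = 2.0  # assumed: explain only if the loss persists (seconds)

def hand_feedback(seconds_since_loss: float | None, hands_crossed: bool) -> str:
    """Returns which feedback to show for one hand this frame."""
    if seconds_since_loss is None:         # tracking is fine
        return "normal"
    if seconds_since_loss < MESSAGE_AFTER:
        return "red_tint"                  # immediate, subtle warning
    return "msg_uncross_hands" if hands_crossed else "msg_tracking_lost"
```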

Left: hands turn red when tracking is first lost. Middle: when tracking loss persists, a message informs the player about what is going on. Right: if tracking is lost due to occluded hands, this is also indicated.

For more seasoned players, or players who prefer playing with one hand, this feature can be replaced in the settings by having hands fade out when they lose tracking instead, more closely resembling the behavior in the Oculus home menu.

The red hands and warning messages can be replaced in the settings by fading hands.

Future Work

Hand tracking on Quest still has its limitations, and though Cubism’s support for it is already in its second version, there is still plenty of room for improvement.

Regardless, I’m excited to start exploring and supporting these new input methods. In the short term, I think they can help make experiences like this more accessible and easier to share with new VR users.

Mixed reality footage captured on an iPhone with Fabio Dela Antonio’s app Reality Mixer gives an idea of what it may be like to play Cubism on an AR headset in the future.

In the long term, there seems to be a good chance that hand tracking will be the go-to input for future standalone AR devices, so hopefully this update can be a first small step towards an AR version of Cubism.


If you enjoyed this look at the hand-tracking design in Cubism, be sure to check out Thomas’ prior Guest Article which overviews the design of the broader game.


Hand Physics Launches April 1st for Oculus Quest


Developer Dennys Kuhnert and Holonautic will release their application Hand Physics for the Oculus Quest and Oculus Quest 2 in the official Oculus Store on April 1st. The app has already been very popular on SideQuest over the past few months.


As the name suggests, Hand Physics is an application that lets you experiment with your own hands in virtual reality. You are presented with smaller and larger tasks, such as painting eggs or using telekinesis. There will also be support for the Oculus Touch controllers – presumably a requirement from Oculus for release in the official store.

The game has already celebrated some success in its pre-release phase. Hand Physics was previously available in beta via SideQuest, where it was downloaded over 185,000 times and received a user rating of [4.6/5].

We are excited for the release of the full version and will of course share our impressions on YouTube soon.

Hand Physics will be released exclusively for the Oculus Quest and Oculus Quest 2, which you currently cannot buy in Germany. However, you can simply order one via Amazon France and receive it within a few days. You can find our long-term review of the Oculus Quest 2 here.

(Source: Road to VR)


Hand-tracking Game ‘Hand Physics’ for Quest Releasing April 1st, Trailer Here

Both Oculus Quest and Quest 2 have the ability to track your hands, but precious few games have integrated the tech. Now, indie developer Dennys Kuhnert and Holonautic are getting ready to release a virtual smorgasbord of hand-tracking-based puzzles and tasks.

Hand Physics is launching April 1st, bringing its Touch-less game to the official Oculus Store. Ok, that’s not entirely true; it also supports Touch controllers, but it was primarily created to make use of Quest’s innate optical hand-tracking.

In the game you’re tasked with running through various whacky objectives, including painting eggs with your fingers, building cube towers, using magnets and telekinesis, shaking hands with your clone, petting a virtual cat—all of it timed and scored.

In its pre-release stage, the game has already celebrated its fair share of success. The game has been previously available in beta via SideQuest where it was downloaded over 185,000 times, garnering a user review of [4.6/5].

Dennys Kuhnert, co-founder and CTO of Holonautic, was previously involved in creating the studio’s Early Access physics-based game Holoception (2019), which lets you take either a first or third-person view inside a world full of stick figure violence reminiscent of Flash games and videos from the early ’00s.

We’re hoping to get a look at the full version before it’s released on Quest next week, so make sure to check back soon. You can currently wishlist the game here.


New Cubism Update Is An Excellent Showcase For Hand Tracking On Quest

Cubism’s latest update adds hand tracking support to the full game on Quest and Quest 2 and it works incredibly well.

Cubism, which launched midway through last year, is a fantastic puzzle game available on PC VR and Quest. The concept is simple — fit the pieces into the 3D wireframe with no gaps and nothing sticking out — but it’s one that strikes a perfect balance between being easy to understand and challenging to solve.

Before its full launch, a short demo of Cubism was available to sideload with basic hand tracking support. However, when it arrived on the Oculus Store, the game only supported Touch controllers. While they worked well enough, it sometimes felt a bit clunky to position the controllers in the right orientation when placing a piece.

Now, Cubism offers a more natural solution with hand tracking for the entire game and the implementation is better than ever.

As with all hand tracking on Quest headsets, it’s not perfect. However, Cubism has designed its hand tracking support to be as user-friendly as possible. An introductory dialogue box educates you on how to get the best performance — playing in a well lit space, not crossing your hands over each other, etc. — and from there it’s very simple. Grab pieces and place them in the desired position.

What sets Cubism apart is the intelligent design safeguards in place to account for unavoidable technological shortcomings. The tips of your fingers will light up when grabbing a block to indicate which specific fingers are being tracked as part of the ‘grab’ action. This adds a lot of clarity, as it lets you know exactly which fingers to release when you want to place a block accurately.

Likewise, your hand will go red when it’s occluded by the other or when it’s in a position that’s difficult to track accurately. It’s a subtle feature, but one that makes it incredibly clear when you need to adjust something to improve the experience.

It’s these little additions that make Cubism’s hand tracking feel more refined than it would otherwise. It works within the limitations of the current technology, not in spite of them.

There are still bumps in the road — sometimes a piece will get ‘stuck’ on your hand after releasing, and sometimes it’s hard to twist your hand to the right angle without encountering tracking hiccups. However, it’s definitely one of the more natural hand tracking experiences on the Quest — it feels like a perfect fit for the game and not just a gimmick.

The hand tracking update is available now for Cubism on Quest. Hand tracking is not supported for PC VR, so the Rift version of the game will only support Touch controllers.

You can read our full Cubism review for more details.

Watch: Low-Fi Getting Ultraleap Hand-Tracking Support For Vehicles

You’ll soon be able to interact with Low-Fi’s futuristic cockpits using just your hands thanks to support for Ultraleap.

Mark Schramm, one of the developers on the anticipated upcoming VR project, just revealed that support for Ultraleap hand-tracking would be arriving in the early access build of the game soon. Take a look at Schramm’s tweet below to see the support in action; it allows you to pilot Low-Fi’s Blade Runner-style hovercars using just your hands (provided you have one of the company’s hand-tracking sensors, that is).

The clip shows the player using their hands to interact with the vehicle’s dashboard, changing the radio station, turning on lights and activating waypoints. Schramm confirmed to us that hand-tracking support will be limited to driving vehicles for now, though it will allow you to just set aside your controllers when you climb into the cockpit. Figuring out how it might work elsewhere would be a challenge, as hand-tracking means no control sticks for movement and nothing in your hand when you grab items.

It’s another cool incremental addition to the upcoming sci-fi sandbox. Low-Fi’s aim isn’t to be a ‘game’ so much as a simulator that lets players do as they choose in its futuristic world. You can get early access to the experience on Itch.io and it’s coming soon to SteamVR and, at some point, PS5 VR too.

Will you be checking out Ultraleap hand-tracking support in Low-Fi? Let us know in the comments below!

Go Hands-on With Oculus Quest App Lab Games First Steps & Tiny Castles


Once an experimental feature, Oculus Quest’s hand tracking was expected to achieve great things, yet that hasn’t really come to pass, with only a handful of titles utilising it. Oculus is trying to encourage more developers to give the feature a go by releasing two of its own projects, First Steps With Hand Tracking and Tiny Castles.

First Steps With Hand Tracking

First Steps should sound familiar to most Oculus Quest users as the app came supplied with the original headset back in 2019. It helped introduce owners to the features of the device, mainly how to use the Oculus Touch controllers and what they were capable of. First Steps With Hand Tracking essentially takes that experience and swaps in hand tracking.

Free to download, this is an App Lab title so don’t expect the same level of polish as other Oculus games. First Steps With Hand Tracking even notes in its description that it’s more developer-oriented, saying: “If you’re a developer, you should check out how hand tracking can replace your Touch Controller experience.” Either way, it’s still nice to see more content encouraging the feature.

Most gamers will be more interested in Tiny Castles, another in-house project this time built specifically for hand tracking. Described as an ‘action puzzle game’, the gameplay involves you playing a god freeing and protecting your followers from an evil god and its minions. It’s really more of a test bench to showcase what types of gameplay work well with hand tracking.


So don’t expect a massive array of levels and challenges to face. One of the more interesting aspects – especially if you’re a developer – is the Playground Mode. This area offers you the chance to test each specific mechanic such as grabbing an object or punching an obelisk.

While you might have played with the hand tracking feature in the Oculus Quest menus, actual implementation by third-party studios has been lacking. Worth checking out are The Curious Tale of the Stolen Pets, Waltz of the Wizard: Extended Edition, Vacation Simulator and The Line for good examples. As the roster of hand tracking titles for Oculus Quest (hopefully) grows, VRFocus will keep you updated.

Free New Facebook Demos Showcase Oculus Quest Hand-Tracking To Devs

Facebook posted two Oculus App Lab experiences aimed at developers, including a new version of its First Steps showcase using hand-tracking tech.

Both First Steps and the other title, Tiny Castles, are free to download via the service but shouldn’t be thought of as fully fleshed-out gaming experiences. First Steps itself is an extension of the introductory app that users experience when they boot up their Oculus Quest headset for the first time. The current iteration of the app focuses on controller interactions, but this edition puts Quest’s hand-tracking features front and center.

The actual content of the app remains the same, offering a range of experiences and items you can interact with. You can play with toy rockets, hold a ping pong paddle, shoot different types of guns and more. Those interactions have all been reconfigured to work without the need for a controller.

The App Lab description for the experience says it showcases “how hand tracking can replace your Touch Controller experience”. But it also emphasizes that this is an experimental version of the app for developers, which explains why it isn’t releasing on the Oculus Store proper at this point in time.

The other release, Tiny Castles, is an all-new experience with a number of new interactions. It casts players as a god that must save their followers from an evil entity, using a variety of mechanics built around hand-tracking. Those include grabbing objects from afar and tossing them onto a battlefield, picking up tiny characters with a pinch, or shooting fire from your fingers with a certain gesture.

Again, this is more showcase than it is game, and Facebook says it “lacks real challenge and difficulty balancing” as a result. Both apps were developed by Facebook’s internal ‘Strike Team’.

App Lab is primarily designed for third-party developers to publish titles on Quest without having to be approved for the Oculus Store and without users having to sideload content. But First Steps and Tiny Castles aren’t the only apps Facebook itself has launched on the service – at launch last month it also had a showcase demonstration of its SparkAR player.

Ultraleap’s New ‘Gemini’ Software Overhaul Drastically Improves Two-handed Interactions

Ultraleap, the company behind the Leap Motion hand-tracking controller, has released a Developer Preview of its hand-tracking engine Gemini. By many accounts, Ultraleap’s latest software overhaul dramatically increases the ability of the company’s camera modules to do more precise and stable two-handed interactions.

Gemini is now available in Developer Preview for Windows 10, and is designed to work with all existing Leap Motion controllers as well as Ultraleap’s more recent Stereo IR 170 camera module.

In comparison to Orion (V4), which was released in June 2018, its Gemini (V5) engine is said to offer better smoothness, pose fidelity, and robustness. It also improves hand initialization, and brings “significantly better performance with two-hand interactions,” Ultraleap says.

As seen in the gif below, the solidity of Gemini (V5) is pretty astounding. Not only are both hands more accurately tracked, but occlusion appears to be much less of an issue too, as fingers interlock and move in front of each other with comparative ease.

Ultraleap is set to integrate Gemini into a number of XR headsets, including Varjo VR-3 and XR-3 headsets, and the Qualcomm Snapdragon XR2 5G reference design, which makes use of Ultraleap hardware.

Antony Vitillo of XR publication Skarred Ghost went hands-on with Gemini using his first-generation Leap Motion tracker. To him, the software overhaul represents “the best hands-tracking system I’ve seen until now on all headsets for what concerns the interactions between two hands.”

“What really surprised me is the stability of two hands interactions. For the first time, I’ve been able to make the fingers of my two hands cross and interweave [together], and the tracking kept working reliably.”

Granted, Vitillo’s five-year-old Leap Motion does present somewhat of a roadblock due to its comparatively small field of view; however, Ultraleap says that with its updated IR 170 camera module, “hands will almost certainly be tracked before they come into your sight.”

In practice, Ultraleap hopes its new software will let developers create hand-tracking-focused applications in preparation for the next wave of AR and VR headsets to make more prominent use of the technology. Facebook’s Oculus Quest standalone notably includes hand-tracking for use within its system UI and a handful of applications; however, it hasn’t become a standard input method yet.


Ultraleap Gemini Hand Tracking Improves Two-Handed Interactions

Ultraleap has shared a developer preview of Gemini, the fifth generation of its hand tracking software, claiming improved performance with two-handed interactions.

Ultraleap says that it rewrote its tracking engine “from the ground up” for Gemini, which will allow increased flexibility and compatibility with different types of hardware and platforms. The Gemini software will be integrated in the Varjo VR-3 and XR-3 headsets. Back in September last year, Qualcomm announced that Ultraleap’s Gemini software would also be implemented into the Qualcomm XR2 reference design.

A common problem with hand tracking technology is decreased performance when both hands are placed in close proximity to each other. Having both hands interact and touch each other increases the complexity of the hand tracking and will often give unreliable results or partial tracking for one or both hands. A good example of this is the hand tracking found in the Oculus Quest and Quest 2 — while often reliable, it easily gets confused when both hands interact with each other.

Ultraleap claims to offer “significantly better performance with two-hand interactions” with its Gemini software — a claim that it backed up with an accompanying video, embedded in the tweet above.

It also claims that Gemini will offer “even better smoothness, pose fidelity, and robustness” along with “improved hand initialization.”

You can read more about the Gemini Developer Preview and sign up to test the preview release here. The preview release will only be available on Windows 10, but Ultraleap says support for additional platforms will be added in later releases.

Ultraleap’s Fifth-Gen Hand Tracking Software Improves Two-handed Interactions


Hand tracking is moving more and more into mainstream virtual reality (VR), whether that’s in consumer headsets like Oculus Quest 2 or Varjo’s high-end enterprise devices. The latter employs Ultraleap’s technology, with the hand tracking specialist announcing a developer preview is available for version 5 of its Gemini software.

Ultraleap Gemini improvements show both hands can be used together. It enables natural interaction with virtual objects. Image credit: Ultraleap.

One of the main problems with software-based hand tracking solutions, compared with actual gloves like HaptX or SenseGlove, is two-handed interactions. Natural interactions like holding hands or one hand going behind the other are difficult to portray due to occlusion, where the sensors can no longer see individual fingers or the entire hand. To maintain natural immersion, so that tracking isn’t lost or a hand doesn’t suddenly disappear, Ultraleap has improved this important aspect with Gemini v5.

It may only be in a developer preview form at the moment – a full release will come later in the year – but the above GIF showcases the improvements made over the previous edition, Orion. The full hand and fingers are tracked and maintained no matter how they interact.

Gemini’s preview features include:

  • Even better smoothness, pose fidelity, and robustness (likely to be most apparent on desktop mode)
  • Improved hand initialization
  • Significantly better performance with two-hand interactions
  • New Screentop modes (to be mounted above an interactive screen) in addition to HMD and Desktop mode
Combined with Stereo IR 170’s wider FoV and Gemini’s improved hand initialization, hands will almost certainly be tracked before coming into view.

While Gemini works with both Ultraleap camera modules, the Leap Motion Controller and the Stereo IR 170, the latter’s wider field of view (FoV) means that hands can be tracked sooner, even before they come into a user’s line of sight. The Leap Motion Controller has been available for several years now and can be used on a desk or mounted onto a VR headset. The Stereo IR 170 (in Camera Module and Evaluation Kit form) is primarily designed for integration and development needs.

Ultraleap tech might already be used by Varjo and Pimax but it’s the integration with Qualcomm’s Snapdragon XR2 5G reference design which could see more consumers gain access. The XR2 platform is going to lay the groundwork for plenty of devices over the next couple of years, making hand tracking even more prominent. For further Ultraleap updates, keep reading VRFocus.