Leading Hand-tracking Company Ultraleap Raises $82 Million Investment

Ultraleap, a leading company focused on hand-tracking interfaces, this week announced it has secured a £60 million (~$82 million) Series D investment, with the goal of expanding its hand-tracking and mid-air haptic tech in the XR space and beyond.

Formerly known as Ultrahaptics, Ultraleap was formed after the UK-based haptics company acquired leading hand-tracking company Leap Motion back in 2019. The new name reflected the merger’s unique combination: mid-air ultrasonic haptics now underpinned by some of the best hand-tracking tech in the industry.

The Series D round includes participation from new investors Tencent, British Patient Capital, and CMB International, alongside existing investors Mayfair Equity Partners and IP Group plc.

“With this investment round, Ultraleap will continue to bring Gemini to different operating systems and increase their investment in tooling to enable developers to build more applications using the best interface—your hands. Ultraleap will also continue to invest in R&D to drive their machine-learning-based hand tracking even further ahead,” the company said in its investment announcement.

Ultraleap is betting that hand-tracking will be the primary input for XR and the metaverse. Last month the company released Gemini, the latest revision of its hand-tracking software.

While the company has been trying to get its tech into the XR space for many years now, it has yet to find significant traction. Though Ultraleap hand-tracking can be found on a few headsets like those from Varjo and Pimax, leading devices implementing hand-tracking—like Quest 2, HoloLens, and Magic Leap—are using their own solutions, as far as we know.

However, with a growing number of XR devices on the market and the steady march toward consumer-friendly AR glasses, the company seems poised to find the right fit eventually.

Ultraleap is also looking to find a home for its tech outside of the XR realm. The company has long been angling its tech in the automotive space as an in-car interface, as well as the out-of-home space in areas like exhibits, marketing installations, and touchless self-service kiosks.

The post Leading Hand-tracking Company Ultraleap Raises $82 Million Investment appeared first on Road to VR.

Ultraleap Hand-tracking Update Delivers Improved Two-handed Interactions

The latest version of Ultraleap’s hand-tracking tech is finally available today on Windows for use with the Leap Motion Controller accessory and promises to improve two-handed interactions, speed, and robustness. The release includes a demo experience showcasing how hand-tracking can be used as a primary input for a standalone XR device.

Ultraleap today publicly released ‘Gemini’, the company’s fifth-generation hand-tracking software which was initially made available in a developer preview earlier this year. The improved hand-tracking software has already been deployed to headsets like Varjo’s and been made available for devices based on Qualcomm’s Snapdragon XR2, and now it can be downloaded on Windows to be used with the company’s existing Leap Motion Controller accessory, which can be mounted to VR headsets. Support for macOS and Linux is expected further down the road.

While the Leap Motion Controller is by now quite old, the company has continued to refine the software that underlies it, improving on what is already recognized as some of the best hand-tracking tech available in the industry. More recently, Ultraleap has released improved hand-tracking modules with a wider field-of-view and other improvements, though these aren’t available as a standalone accessory.

Image courtesy Ultraleap

With the Gemini update, Ultraleap says it has improved two-handed interactions, initialization speed, and the robustness of its hand-tracking. Alongside the Windows release of Gemini, the company is also making available an ‘XR Launcher’ demo experience which shows how the hand-tracking tech can be used for a fully functional XR interface.

The post Ultraleap Hand-tracking Update Delivers Improved Two-handed Interactions appeared first on Road to VR.

Ultraleap’s New ‘Gemini’ Software Overhaul Drastically Improves Two-handed Interactions

Ultraleap, the company behind the Leap Motion hand-tracking controller, has released a Developer Preview of its hand-tracking engine Gemini. By many accounts, Ultraleap’s latest software overhaul dramatically improves the precision and stability of two-handed interactions captured by the company’s camera modules.

Gemini is now available in Developer Preview for Windows 10, and is designed to work with all existing Leap Motion Controllers as well as Ultraleap’s more recent Stereo IR 170 camera module.

Compared to Orion (V4), which was released in June 2018, the Gemini (V5) engine is said to offer better smoothness, pose fidelity, and robustness. It also improves hand initialization, and brings “significantly better performance with two-hand interactions,” Ultraleap says.

As seen in the gif below, the solidity of Gemini (V5) is pretty astounding. Not only are both hands more accurately tracked, but occlusion appears to be much less of an issue too, as fingers interlock and move in front of each other with comparative ease.

Ultraleap is set to integrate Gemini into a number of XR headsets, including Varjo’s VR-3 and XR-3, as well as the Qualcomm Snapdragon XR2 5G reference design, which makes use of Ultraleap hardware.

Antony Vitillo of XR publication Skarred Ghost went hands-on with Gemini using his first-generation Leap Motion tracker. To him, the software overhaul represents “the best hands-tracking system I’ve seen until now on all headsets for what concerns the interactions between two hands.”

“What really surprised me is the stability of two hands interactions. For the first time, I’ve been able to make the fingers of my two hands cross and interweave [together], and the tracking kept working reliably.”

Granted, Vitillo’s five year-old Leap Motion does present somewhat of a roadblock due to its comparatively small field of view, however Ultraleap says with its updated IR 170 camera module that “hands will almost certainly be tracked before they come into your sight.”

In practice, Ultraleap hopes its new software will let developers create hand-tracking-focused applications in preparation for the next wave of AR and VR headsets to make more prominent use of the technology. Facebook’s Oculus Quest standalone notably includes hand-tracking for use within its system UI and a handful of applications, however it hasn’t become a standard input method yet.

The post Ultraleap’s New ‘Gemini’ Software Overhaul Drastically Improves Two-handed Interactions appeared first on Road to VR.

Ultraleap’s Fifth-Gen Hand Tracking Software Improves Two-handed Interactions


Hand tracking is moving more and more into mainstream virtual reality (VR), whether that’s in consumer headsets like Oculus Quest 2 or Varjo’s high-end enterprise devices. The latter employs Ultraleap’s technology, with the hand tracking specialist announcing a developer preview is available for version 5 of its Gemini software.

Ultraleap Gemini’s improvements allow both hands to be used together, enabling natural interaction with virtual objects. Image credit: Ultraleap.

One of the main problems with camera-based hand tracking solutions, compared to actual gloves like HaptX or SenseGlove, is two-handed interaction. Natural interactions like holding hands or one hand passing behind the other are difficult to portray due to occlusion, where the sensors can no longer see individual fingers or the entire hand. To maintain immersion, so that tracking isn’t lost and a hand doesn’t suddenly disappear, Ultraleap has improved this important aspect with Gemini v5.

It may only be in a developer preview form at the moment – a full release will come later in the year – but the above GIF showcases the improvements made over the previous edition, Orion. The full hand and fingers are tracked and maintained no matter how they interact.

Gemini’s preview features include:

  • Even better smoothness, pose fidelity, and robustness (likely to be most apparent on desktop mode)
  • Improved hand initialization
  • Significantly better performance with two-hand interactions
  • New Screentop mode (to be mounted above an interactive screen) in addition to HMD and Desktop modes
Combined with Stereo IR 170’s wider FoV and Gemini’s improved hand initialization, hands will almost certainly be tracked before coming into view.

While Gemini works with both Ultraleap camera modules, the Leap Motion Controller and the Stereo IR 170, the latter’s wider field of view (FoV) means that hands can be tracked sooner, even before they come into a user’s line of sight. The Leap Motion Controller has been available for several years now and can be used on a desk or mounted onto a VR headset. The Stereo IR 170 (available in Camera Module and Evaluation Kit form) is primarily designed for integration and development needs.

Ultraleap tech might already be used by Varjo and Pimax, but it’s the integration with Qualcomm’s Snapdragon XR2 5G reference design that could see more consumers gain access. The XR2 platform is going to lay the groundwork for plenty of devices over the next couple of years, making hand tracking even more prominent. For further Ultraleap updates, keep reading VRFocus.

Watch: Half-Life: Alyx Gets Up And Running In Project North Star AR Headset

Half-Life: Alyx’s latest impressive feat? Getting up and running inside Leap Motion’s Project North Star open-source AR headset.

The project was recently shown off by Bryan Chris Brown, who serves as Community Manager for North Star. He showed Valve’s beloved VR hit running inside the headset. North Star is an open-source device, meaning anyone can iterate on its design, and its lenses already feature a wide field of view. In the video, a camera moves from the real world before peering behind the lens to reveal the world of City 17 stretching out ahead.

Half-Life: Alyx AR (Sort Of)

It’s hard to tell from the video but the image is translucent; look at when the lenses cover a doorway on the bottom right of the screen towards the end of the video. Brown noted that, if you were to play in the dark, you’d see essentially a full VR image, or you could cover the back of the lenses with a black matte material to get the same effect too.

Currently there’s also a gesture system implemented for hand-control – Leap Motion’s specialty – but that’s not shown off in the video and apparently needs a lot more work. This wouldn’t be the best way to experience Alyx itself, but it’s certainly a fascinating project. It’s also one of the first times we’ve seen North Star in action ever since Ultrahaptics bought Leap Motion last year.

Don’t hold your breath for an AR version of Half-Life: Alyx just yet, then. But if you’re interested in seeing what else Valve’s latest can do, you should definitely check out the influx of mods for the game.

Qualcomm Signs “Multi-year” Deal to Bring Ultraleap Hand-tracking to XR2 Headsets

Qualcomm and Ultraleap today announced a “multi-year co-operation agreement” that will bring Ultraleap’s controllerless hand-tracking tech (formerly of Leap Motion) to XR headsets based on the Snapdragon XR2 chipset. Ultraleap claims to have the “fastest, most accurate, and most robust hand tracking.”

Snapdragon XR2 is Qualcomm’s latest made-for-XR chip which the company has touted as being the ideal foundation for standalone XR headsets.

The leading standalone VR headset, Oculus Quest, has been increasingly focusing on controllerless hand-tracking as a means of input for the device. Other major headset makers, like Microsoft with its HoloLens 2, have also homed in on hand-tracking as a key input method. As industry leaders coalesce around hand-tracking, it becomes increasingly important for competing devices to offer similar functionality.

But hand-tracking isn’t a ‘solved’ problem, making it a challenge for organizations that don’t have the resources of Facebook and Microsoft to work out their own hand-tracking solution.

Over the years Qualcomm has been working to reduce the barrier to entry to making a standalone XR headset by offering ready-made technologies—like inside-out tracking—alongside its chips. Now the company is announcing that its XR2 chip will be optimized for Ultraleap hand-tracking out of the box.

While Qualcomm and Ultraleap have previously worked together on this front, the Ultraleap hand-tracking solution offered through Qualcomm was tied to Ultraleap’s hand-tracking hardware. The new announcement means that Ultraleap’s hand-tracking software is being offered independent of its hardware. This makes it a more flexible and cost-effective solution, with the hand-tracking software ostensibly making use of a headset’s existing inside-out tracking cameras, rather than requiring additional cameras just for hand-tracking; this also frees up two of XR2’s seven supported camera slots for other uses like eye-tracking, mouth tracking, and more.

Qualcomm and Ultraleap say the hand-tracking tech will be “pre-integrated” and “optimized” for XR2. It isn’t clear if this simply means that Ultraleap hand-tracking will be available as a service in the XR2 software stack, or if XR2 will include special hardware to accelerate Ultraleap hand-tracking, making it more power and resource efficient.

Despite being a years-long leader in hand-tracking technology, Ultraleap (formerly Leap Motion) hasn’t managed to get its solution to catch on widely in the XR space. Now that hand-tracking is seeing greater emphasis from leading companies, Ultraleap’s camera-agnostic solution on XR2 could be the moment where the company’s hand-tracking tech begins to find significant traction.

The post Qualcomm Signs “Multi-year” Deal to Bring Ultraleap Hand-tracking to XR2 Headsets appeared first on Road to VR.

CR Deck Mk.1 Is An Open Source AR Headset Based On Project North Star With Ultraleap Hand-Tracking

Today AR headset manufacturer Combine Reality revealed the CR Deck Mk.1, an open source AR headset based on Project North Star that utilizes Ultraleap hand-tracking. A Kickstarter campaign is coming soon.

Details are scarce so far, but Combine Reality unveiled images and short video clips of the new AR headset on Twitter. Utilizing the open source Project North Star program and Ultraleap’s hand-tracking, it aims to deliver an accessible development kit for AR developers that’s “easily remixable with off-the-shelf components & modules” according to the announcement tweet.

The official Combine Reality website describes it as:

An open-source, community-driven AR hardware platform with Unity and SteamVR integration, built around the world’s most advanced optical hand-tracking technology. Featuring brilliant 1440x1600px per eye displays at up to 120Hz.


Combine Reality also showed colorized teaser images of a CR Deck Mk.2 prototype with an embedded Intel RealSense SLAM module; these are sketches rather than actual renders. Reportedly it’ll be included in some capacity in the upcoming Kickstarter campaign as well.

That’s everything we know right now. For more specs and details on the construction of the headset, check out this development blog.

If you want to learn more, you can sign up for a newsletter that will let you know once the Kickstarter campaign goes live. The campaign appears to be for an “injection molded version of the Project North Star headset” that will bypass the need for 3D printing. They’ve also got details on how you can build your own Project North Star headset using parts from Smart Prototyping.

The post CR Deck Mk.1 Is An Open Source AR Headset Based On Project North Star With Ultraleap Hand-Tracking appeared first on UploadVR.

Oculus Quest Controller-Free Hand Tracking SDK Now Available

The SDK for controller-free Hand Tracking on Oculus Quest is now available, allowing developers to start integrating the feature into their apps.

Controller-free hand tracking for Quest was first announced in September at Oculus Connect 6, and shipped last week as an experimental feature. It uses the four cameras on the Oculus Quest and advanced computer vision algorithms, powered by machine learning, to track your hands and fingers.

SDK stands for Software Development Kit; simply put, it’s the code and resources developers need in order to add the feature to their apps. Because the SDK hadn’t been released until now, only first-party apps supported the feature.

If you’re a developer wondering about the specifics: the Oculus Mobile SDK now has an API to return a skeletal model or full mesh for the user’s hands, along with a confidence value. With the Oculus Unity Integration you can integrate the feature with a simple prefab. Both the native API and Unity API have calls for detecting pinching and returning the location the user’s pinch is pointing at.


Don’t expect to see Quest store apps updating over the next few weeks, however. Facebook told us to expect the first third-party app updates in early 2020. But integrating a new kind of input often takes weeks or months anyway, depending on the app.

There’s a chance, however, we may Quest apps with Hand Tracking available on SideQuest, the unofficial third party store for Quest which works by automating sideloading.

Interestingly, the SDK documentation warns developers about the privacy implications of having access to user hand data:

Data Usage Disclaimer: Enabling support for Hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.

As VR advances, it will be able to track more and more uniquely identifiable biometric data. The SDK documentation mentions that to use Hand Tracking the developer has to declare it as a permission in the manifest, which may mean that Facebook intends to give the user an option to accept the permission for each app.
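Based on the documentation's mention of a manifest declaration, the entry would look something like the following. The exact permission and feature strings here reflect our understanding of the Oculus Mobile SDK and should be verified against the current docs:

```xml
<!-- Requests access to hand tracking data -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<!-- Marks hand tracking as optional so the app still installs without it -->
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
```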

Controller-free Hand Tracking promises to increase the convenience of non-gaming VR and open it up to new audiences uncomfortable or unfamiliar with gaming controllers. Now that the SDK is released, we know of at least five apps which will start working on adding the feature, but there will almost certainly be more.

What apps do you hope add support for Hand Tracking next year? Let us know in the comments below!

The post Oculus Quest Controller-Free Hand Tracking SDK Now Available appeared first on UploadVR.

Varjo Expands Enterprise Headset Lineup With VR-2 & VR-2 Pro

There are several virtual reality (VR) headsets competing in the higher-end enterprise sector, such as the HTC Vive Pro Eye, VRgineers’ XTAL and Varjo’s VR-1. The latter is a £6,000 head-mounted display (HMD) which launched back in February, and today the company has announced two new additions to its product lineup, the VR-2 and VR-2 Pro.


The two new state-of-the-art devices improve on the previous model in a number of ways, adding Varjo’s next-generation Bionic Display which has better peripheral vision and colour consistency than before. Varjo’s 20/20 Eye Tracker technology has also been upgraded offering faster and more accurate calibration performance, giving applications access to precise eye data.

The Bionic Display still features two 1920×1080 low persistence micro-OLEDs and two 1440×1600 low persistence AMOLEDs allowing for the high definition the headset is known for. Both the VR-2 and the VR-2 Pro include support for SteamVR content as well as support for the OpenVR development platform. Additionally, the Varjo VR-2 Pro also comes with integrated Ultraleap (formerly Leap Motion) hand tracking technology.

“We have seen first-hand what the power of human-eye resolution in VR can offer in terms of expanding the realm of applications for the enterprise,” said Urho Konttori, Co-Founder and Chief Product Officer of Varjo in a statement. “Today we bring the Resolution Revolution overnight to nearly all industrial applications, unlocking the next level of professional VR. With VR-2 and VR-2 Pro, professionals can benefit from the industry’s highest visual fidelity and the most precise eye-tracking joining forces with the leading hand tracking technology.”

Image Credit: Varjo – Comparison – Left HTC Vive Pro, Middle Varjo with SteamVR support (Ultra-high res at 40 PPD), Right Native Varjo SDK (Human-eye res at 60 PPD).

“One of the main goals of SteamVR is to support a diverse ecosystem of hardware and software. Varjo is taking a unique approach to building high-end industrial VR products, and we are excited that users of Varjo will be able to take advantage of the wide array of SteamVR content for professionals,” said Joe Ludwig from Valve.

As you may expect, the Varjo VR-2 and VR-2 Pro require some decent processing power as these system requirements detail:

  • Processor: Intel Core i7-8700 or AMD Ryzen 7 2700 (recommended); Intel Core i7-6700 (minimum)
  • Graphics card: NVIDIA Quadro RTX 6000 (recommended); NVIDIA GeForce GTX 1080 or NVIDIA Quadro P6000 (minimum)
  • Storage space: 2GB
  • Video output: 2 x DisplayPort 1.2 / 2 x Mini DisplayPort 1.2
  • USB port: 1 x USB-A 3.0
  • Operating system: Windows 10 (64-bit)

Available today, pricing for the Varjo VR-2 starts from $4,995 USD while the VR-2 Pro starts from $5,995. When purchasing either device you’ll also need to add on Varjo’s software and support services, starting at $795. Both headsets will be on demonstration at AWE EU 2019 later this week. Varjo now sells four enterprise-grade devices, the fourth being the XR-1. For further Varjo updates, keep reading VRFocus.

Ultrahaptics Relaunches as ‘Ultraleap’ After Leap Motion Acquisition

Leap Motion, the optical hand-tracking firm, was acquired by Bristol, UK-based haptics company Ultrahaptics earlier this year. Now, Ultrahaptics is relaunching under a new name created to reflect its shared heritage: Ultraleap.

Before the acquisition of Leap Motion and the subsequent rebranding, Ultrahaptics was best known for its mid-air haptic technology which uses ultrasound to project tactile sensations onto users’ hands.

Leap Motion, known for its eponymous optical hand-tracking module and underlying software, was acquired by the company for a reported $30 million back in May. Prior to its acquisition, Leap Motion created an open-source AR headset, Project North Star.

Image courtesy Leap Motion


According to a press statement provided to Road to VR, both the Ultrahaptics and Leap Motion names will continue to be maintained as trademarks for existing products; however, all new software and hardware launches will fall under the Ultraleap name.

“Rebranding isn’t a decision we’ve taken lightly. We’re immensely proud of what our companies have achieved,” explains Ultraleap CEO Steve Cliffe. “We’re also very excited for what’s to come. Our new name and brand reflects our ambitions in this new world, now and for the future.”

The company was, and still is, focused on using its technology across a variety of industries such as automotive, advertising, AR/VR, and simulation & training.

Notably, Ultraleap has licensed its technology to high-profile experiences such as The Void’s Star Wars: Secrets of the Empire at Disney, and its tech has been showcased in concept cars developed by Harman and Bosch.

The news of the rebranding was first reported by Business Leader.

Thanks to Antony Vitillo of VR/AR blog Skarred Ghost for pointing us to the news.

The post Ultrahaptics Relaunches as ‘Ultraleap’ After Leap Motion Acquisition appeared first on Road to VR.