Today, Leap Motion – the motion tracking technology specialist – has announced a $50 million USD funding round, led by clients advised by J.P. Morgan Asset Management.
Founded in 2010, Leap Motion makes tracking technology that can be used to make virtual reality (VR) and augmented reality (AR) experiences more immersive by allowing users to actually grab virtual objects without the need for data gloves. This new funding round will allow Leap Motion to push forward with its global expansion plans, which include opening an office in Shanghai, China. The company also plans to expand its commercial and enterprise reach, developing applications in education, healthcare, and industrial training simulations.
“Natural input through full hand tracking is inseparable and fundamental to the future of VR/AR, and Leap Motion is a principal driver of its widespread adoption,” said Michael Buckwald, CEO and Co-founder of Leap Motion in a statement. “In much the same way as the touchscreen sparked the mobile revolution, Leap Motion is playing a transformative role in the development of human interface technology for VR/AR. As a result, the industry as a whole is on the verge of a similar moment of exponential growth.”
As part of the funding round, Lawrence Unrein, Global Head of J.P. Morgan Asset Management’s Private Equity Group, has joined Leap Motion’s board of directors.
“Virtual and augmented reality technologies are evolving at a rapid pace, drawing significant attention from investors,” adds Unrein. “Leap Motion’s suite of hardware and software designed for headsets falls squarely into this category, offering natural input through hand and finger tracking that can be integrated into any headset.”
Over the years Leap Motion has partnered with several tech companies, the most recent being Qualcomm. That partnership saw Leap Motion bring its interface to the Qualcomm Snapdragon 835 mobile platform and Qualcomm’s new VR development kit (VRDK).
VRFocus will continue its coverage of Leap Motion, reporting back with the latest updates.
First demonstrated to theme park executives at the Asian Attractions Expo in Singapore last month, Holovis’ R3ex (Rideable Realtime Robot Experiences) project combines virtual reality with four/six-axis robot arm amusement rides. The technology claims to offer the rider genuine interactivity and ‘complete agency’ over the experience, including real-time control of the motion profile.
Theme parks have been eager to add virtual reality rides to their list of attractions, but the majority of experiences are repurposed existing roller coasters, essentially offering the same ride but with a virtual environment to look at. Some have begun to offer a degree of interactivity based on where you look, but there’s only so much you can achieve with a fixed track. Being able to influence the ride in a meaningful way sounds much more compelling, and that’s what Holovis hopes to deliver with R3ex.
The debut system in the video above uses a two-seat, six-axis robot arm from KUKA, one of the leading suppliers of industrial robots. This ‘new generation’ KUKA Coaster was revealed at the end of 2015, but KUKA has been active in the entertainment sector for well over a decade; their ‘robocoaster’ arms have been in service at Legoland parks since 2003. Holovis is keen to point out that this is more than repurposing an existing ride.
“Anyone could put a VR headset on the end of a robotic arm, but it is our proprietary Interact software that makes this experience successful”, says Stuart Hetherington, CEO of Holovis. “The real-time VR world needs to synchronise perfectly with the motion and gesture interactivity, so we’ve developed specialist software to achieve this. The movements that our solution can achieve coupled with the immersive VR world take experiences of this nature to a new dimension, previously only dreamed about in science fiction.”
image courtesy Holovis
The demonstration depicts a futuristic flying taxi ride through Singapore. Holovis’ press release describes some of its interactivity achieved via hand tracking – “riders could explore things in the cab, such as change the radio station, scroll through the news and pick up elements that fall from the glove box as it opens during the first inversion”. It seems that in this case, the interactivity was largely limited to ‘secondary’ cockpit controls, with the motion following a selection of scripted sequences based on decisions made by the riders, but Holovis assures me that “it is possible to have direct control of the motion profile of the robot”. I asked how this might work with multiple people, and the team explained that this depends on how the particular story works. “People can be given complete control of the robot but you might not want someone else controlling your whole motion experience for the whole (ride), so the control can be switched as part of the storyline between riders so you all got a turn”.
The final version, available from November, will seat four riders at once to cater for high-capacity throughput, so it will be interesting to see how the controls can be shared in a creative and entertaining way. Of course, the beauty of real-time VR rendering combined with a multi-axis robot arm means that its potential applications are almost endless and can be improved over time; initially designed for the attractions industry, the same software can be used to safely simulate high-risk scenarios for training purposes. Holovis is developing such a simulation with a partner in Malaysia to be revealed later this year “as part of a much larger immersive training and simulation facility”. The first entertainment installation is also expected at the end of the year in Dubai.
image courtesy Holovis
With real agency, potential skill elements, and the need to engage four riders, you’d expect a longer seat time than a typical blink-and-you-miss-it roller coaster. Holovis says that they’re aiming at a ride/game time of between 3 and 5 minutes, “depending on the design, IP and the required capacity throughput”. They’re also developing ‘R3ex Arena’ solutions with multiple robots, and thanks to the company’s expertise in mixed reality, are promising a “unique onboard and offboard solution that allows the guests in the queue to game using BYOD/smartphones and AR technology”.
Leap Motion units attached to the headsets perform hand tracking duties, which Holovis say is ‘very stable’ despite the wild ride – “we developed a specific software interface to handle the synchronisation of the Leap data with our motion positioning and control systems to ensure perfect real-time interaction with correct positional referencing of the hands”.
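Holovis hasn’t published details of that interface, but the core of the problem is a coordinate-frame transform: Leap Motion reports hand positions relative to the headset, while the robot arm moves the rider through world space. A minimal Python sketch of the idea (all names and numbers here are illustrative, not Holovis’ implementation):

```python
import numpy as np

def hand_world_position(hand_pos_headset, headset_to_world_rot, headset_world_pos):
    """Re-express a Leap Motion hand position (reported relative to the
    headset) in the world frame that the ride's motion system works in."""
    return headset_to_world_rot @ hand_pos_headset + headset_world_pos

# Example: platform pitched 30 degrees nose-up, rider's head 2 m above the floor.
theta = np.radians(30)
rot = np.array([[1, 0, 0],
                [0, np.cos(theta), -np.sin(theta)],
                [0, np.sin(theta),  np.cos(theta)]])
hand = np.array([0.0, -0.1, -0.4])   # 40 cm in front of the visor
print(hand_world_position(hand, rot, np.array([0.0, 2.0, 0.0])))
```

The real system also has to do this continuously and with minimal latency while the arm is moving, which is presumably where the “specific software interface” comes in.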
Combining motion technology, interactivity and high-quality visuals means greater performance requirements than the mobile VR solutions regularly used on VR roller coasters can handle. In the first demonstration, R3ex renders on a PC using Oculus Rift headsets (without the IR cameras, as players are tightly strapped in, making positional tracking unnecessary), but Holovis say they are hardware agnostic and “can work with whichever platform is right for the application”.
The graphics, the story’s integration with the motion profile and the gesture interactivity were all created in-house for their proprietary Interact system. Holovis have developed ‘all the elements’ in R3ex with a team of over 100 people based in the UK, US and China, working closely with KUKA to ensure that both pre-programmed motion profiles and real-time guest-controlled motion are delivered safely.
Promising “new levels of interactivity, graphical quality and sustained physical forces on the rider”, this type of ride has the potential to be a more suitable pairing (compared to a roller coaster) for VR at theme parks and other attractions.
Explorations in VR Design is a journey through the bleeding edge of VR design – from architecting a space and designing groundbreaking interactions to making users feel powerful.
Sound is essential for truly immersive VR. It conveys depth and emotion, builds and reinforces interactions, and guides users through alien landscapes. Combined with hand tracking and visual feedback, sound even has the power to create the illusion of tactile sensation.
In this Exploration, we’ll cover the fundamentals of VR sound design, then take a deep dive into the auditory world of Blocks. Along the way, we’ll break a few laws of physics and uncover the surprising complexity of physical sound effects.
What Can Great Sound Design Achieve in VR?
Presence and Realism in 3D Space
When it comes to depth cues, stereoscopic vision is a massive improvement on traditional monitors. But it’s not perfect. For this reason, sound is more than just an immersive tool – how (and where) objects around you sound has an enormous effect on your understanding of where they are, especially when you’re not looking at them. This applies to everything from background noises to user interfaces.
Engines like Unity and Unreal are constantly getting better at representing sound effects in 3D space – with binaural audio, better reverb modeling, better occlusion and obstruction modeling, and more. The more realistic that zombie right behind you sounds, the more your hair stands on end.
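As a toy illustration of the two simplest cues – inverse-square distance attenuation and left/right panning – here’s a Python sketch; real engines layer far more sophisticated binaural (HRTF) models on top of cues like these:

```python
import numpy as np

def stereo_gains(source_pos, listener_pos, listener_right):
    """Toy spatialization: inverse-square distance attenuation plus naive
    left/right panning from the direction to the source."""
    offset = np.asarray(source_pos, dtype=float) - np.asarray(listener_pos, dtype=float)
    dist = max(np.linalg.norm(offset), 1e-6)
    attenuation = 1.0 / max(dist, 1.0) ** 2        # inverse square law
    pan = np.dot(offset / dist, listener_right)    # -1 = hard left, +1 = hard right
    return attenuation * (1 - pan) / 2, attenuation * (1 + pan) / 2

# A source directly to the listener's right is louder in the right channel.
print(stereo_gains([1, 0, 0], [0, 0, 0], np.array([1.0, 0.0, 0.0])))
```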
Mood and Atmosphere
Music plays a crucial role in setting the mood for an experience. Blocks has a techno vibe with a deep bass, inspired by ambient artists like Ryuichi Sakamoto.
Weightless, by contrast, features soft piano tracks that feel elegant and contemplative.
If you imagine shuffling the soundtracks between these examples, you can understand how it would fundamentally change each experience.
Building and Reinforcing Interactions
Sound communicates the inception, success, failure, and overall nature of interactions and game physics, especially when the user’s eyes are drawn elsewhere. Blocks, for example, is designed with a wide range of sounds – from the high and low electronic notes that signal the block creation interactions, to the echoes of blocks bashing against the floor.
For game developers, this is also a double-edged sword that relies on careful timing, as even being off by a quarter second can disrupt the experience.
Tutorial Audio
It’s sad but true – most users don’t read instructions. Fortunately, while written instructions have to compete with a huge variety of visual stimuli, you have a lot more control over what your user hears.
Using the abstract state capabilities in Unity’s Mecanim system, you can easily build a flow system so that your audio cues are responsive to what’s actually happening. Just make sure that the cues work within the narrative and don’t become repetitive.
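Here’s a sketch of that flow in plain Python rather than Unity (clip names and events are hypothetical): each tutorial step plays its cue once on entry and only advances when the user actually performs the action, so the audio responds to real progress rather than a fixed timeline.

```python
class TutorialAudio:
    """Minimal state-machine sketch of a tutorial-audio flow."""

    # state -> (cue to play on entry, event that advances, next state)
    FLOW = {
        "intro":      ("welcome.wav",   "hands_visible", "show_hands"),
        "show_hands": ("reach_out.wav", "block_created", "done"),
        "done":       (None, None, None),
    }

    def __init__(self, play_clip):
        self.play_clip = play_clip    # callback into the host audio engine
        self.state = "intro"
        self.play_clip(self.FLOW[self.state][0])

    def on_event(self, event):
        _, trigger, next_state = self.FLOW[self.state]
        if event == trigger:
            self.state = next_state
            clip = self.FLOW[next_state][0]
            if clip:
                self.play_clip(clip)

# tutorial = TutorialAudio(play_clip=print)
# tutorial.on_event("hands_visible")   # advances and plays reach_out.wav once
```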
Setting Boundaries
Virtual reality is an exciting medium, but it can take first-time users a few minutes to master its limitations. Our hand tracking technology can only track what it can see, so you may want to design interaction sounds that fade out as users approach the edge of the tracking field of view.
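One way to implement such a fade, sketched in Python with illustrative angles (not Leap Motion’s actual specifications), is to attenuate interaction-sound volume as the hand’s angle from the sensor’s forward axis approaches the edge of the tracking cone:

```python
import math

def edge_fade(hand_dir, sensor_forward, half_fov_deg=70.0, fade_band_deg=15.0):
    """Volume multiplier for interaction sounds near the edge of the
    tracking cone. Both arguments are unit vectors; angles are assumptions."""
    cos_angle = sum(h * f for h, f in zip(hand_dir, sensor_forward))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    fade_start = half_fov_deg - fade_band_deg
    if angle <= fade_start:
        return 1.0                  # comfortably inside the view: full volume
    if angle >= half_fov_deg:
        return 0.0                  # outside the tracking cone: silent
    return 1.0 - (angle - fade_start) / fade_band_deg   # linear fade between
```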
Evoking Touch
In the absence of touch feedback, visual and auditory feedback can fill the cognitive gap and reinforce which elements of a scene are interactive, and what happens when the user “touches” them. This is because our brains “continuously bind information obtained through many sensory channels to form solid percepts of objects and events.” Some users even describe phantom sensations in VR, which are almost always associated with compelling sound design. To achieve this level of immersion, sounds must be perfectly timed and feel like they fit with the user’s actions.
Sound Design in Blocks
We’ve already talked about its ambient-inspired soundtrack, but you might be surprised to learn the sound effects in Blocks were one of our biggest development challenges – second only to the physical object interactions, an early prototype of the Leap Motion Interaction Engine.
Magic and Progression
One of the core design ideas behind Blocks was that we never imply a specific device underneath anything. For example, there are no whirring or mechanical noises when the arm HUD appears. It’s just something that magically appears from nowhere. The block creation sounds are also minimal, suggesting a natural progression. This was central to the narrative we wanted to tell – the miraculous power to create things with your bare hands.
This philosophy was also reflected in the physical sound effects, which were designed to suggest the embodiment of the object itself, rather than a specific material. When you grab something, a minimal, subtle clicking sound plays. Nothing fancy – just tactile, quick, and precisely timed.
Getting the Right Physical Sound Effects
Here’s where it got challenging. To ensure a natural and immersive experience, the physical sound of block impacts is driven by 33 distinct effects, which are modified by factors like block size, collision velocity, and some random variation that gives each block its own unique character. This aspect of the design proved nontrivial, but it was also a fundamental component of the final product.
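A minimal Python sketch of that approach – the clip names, parameter ranges, and mappings are our assumptions, not the actual Blocks values:

```python
import random

def impact_sound(block_size, collision_velocity, clips, max_velocity=10.0):
    """Pick one of the pre-recorded impact clips, then vary volume with
    collision velocity, pitch with block size, and add slight randomness
    so no two impacts sound identical."""
    clip = random.choice(clips)
    volume = min(collision_velocity / max_velocity, 1.0)
    pitch = 1.0 / max(block_size, 0.1)      # bigger blocks sound deeper
    pitch *= random.uniform(0.95, 1.05)     # per-impact variation
    return clip, volume, pitch

print(impact_sound(2.0, 6.5, ["impact_01.wav", "impact_02.wav", "impact_03.wav"]))
```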
Since the blocks don’t have a representative material (such as metal or glass), finding the right sound took time. In creating the Blocks audioscape, sound designer Jack Menhorn experimented with kitty litter, plastic jugs, cardboard boxes, and other household objects. The final sound suite was created by putting synths into cardboard boxes and slamming the boxes into each other.
Violating the Laws of Physics
In an abstract environment with simple geometry, sound design is the difference between disbelief and physical presence. Sometimes this involves breaking the laws of physics. When you have to decide between being accurate and delighting your user, always opt for the latter.
In the real world, sound follows the inverse square law, getting quieter as the source gets farther away. The Unity game engine models this real-world falloff by default. But a block that lands silently after being thrown a long distance isn’t very satisfying. With Blocks, we created a normal falloff for a number of meters, but beyond that point the falloff stops and blocks stay at the same volume, regardless of how far away they are.
At the same time, the reverb goes up as blocks get farther away – creating an echo effect. In the real world, this would be impossible, since there are no walls or anything in the space that suggests there should be reverb. This is all just part of setting the rules for virtual worlds in ways that feel human, even as they violate the laws of physics. So far, no one has complained.
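The whole trick fits in a few lines. A hedged Python sketch, with distances that are illustrative rather than the actual Blocks values:

```python
def blocks_falloff(distance, falloff_end=15.0):
    """Inverse-square falloff out to `falloff_end` metres, then the volume
    is clamped so distant impacts stay audible, while the reverb mix keeps
    growing with distance to preserve a sense of depth."""
    clamped = min(distance, falloff_end)
    volume = 1.0 / max(clamped, 1.0) ** 2                 # falloff stops here
    reverb_mix = min(distance / (2 * falloff_end), 1.0)   # echo grows with distance
    return volume, reverb_mix

print(blocks_falloff(5.0))    # close: loud, dry
print(blocks_falloff(40.0))   # far: same clamped volume, much more reverb
```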
The Future of VR Sound Design
Imagine all the different ways that you can interact with a coffee mug, and how each action is reflected in the sound it makes. Pick it up. Slide it across the table. Tip it over. Place it down gently, or slam it onto the table. All of these actions create different sounds. Of course, if it breaks, that’s a whole other problem space with different pieces!
This is the root of the biggest challenge on the horizon for sound design in VR – the economy of scale. When you move away from a simple scene with a few objects, to fully realized scenes with many different objects, everything in the scene has to be interactive, and that includes sound. You need to have variations and sensitivity.
This is one of the reasons why we recommend only having a few objects in a scene, and making the interactions for those objects as powerful as possible. As VR experiences grow in size and complexity, these new realities will need richer soundscapes than ever before.
Fully functioning hand-tracking might be a ways off from becoming the standard form of VR input, but Leap Motion is making a big step toward that future today, taking its Interaction Engine to version 1.0 and introducing some major new features.
The Interaction Engine has been available in early Beta since last year, but this full release focuses on what could be a major application for hand-tracking going forward — interfaces.
Leap Motion has built a new user interface module that allows developers to create their own accessible menus and systems that can be navigated a little like Tom Cruise navigates menus in Minority Report. Users reach out to virtual panels to press buttons and alter meters. The company is also adding support for systems like wearables and widgets, enabling wrist-mounted menus and more.
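To give a feel for what such an interface involves (this is a toy Python sketch, not Leap Motion’s actual API), a panel button can be driven by how far a fingertip has pushed past the panel’s surface, with hysteresis so the state doesn’t flicker at the threshold:

```python
class PanelButton:
    """Toy reach-out-and-press button: fingertip penetration depth into
    the panel plane drives the state, with two thresholds for hysteresis."""

    def __init__(self, press_depth=0.010, release_depth=0.004):  # metres
        self.press_depth = press_depth
        self.release_depth = release_depth
        self.pressed = False

    def update(self, fingertip_depth):
        """fingertip_depth: how far the fingertip is past the panel surface."""
        if not self.pressed and fingertip_depth > self.press_depth:
            self.pressed = True
            return "pressed"
        if self.pressed and fingertip_depth < self.release_depth:
            self.pressed = False
            return "released"
        return None

button = PanelButton()
print(button.update(0.012))  # "pressed"
print(button.update(0.007))  # None: still held, within the hysteresis band
print(button.update(0.002))  # "released"
```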
Also updated is the core physics engine, which should make using Leap Motion a much more reliable and immersive experience going forward.
Perhaps the most exciting addition to the engine, though, is Oculus Touch and Vive controller support. The combination of these two technologies is very interesting: Touch already has basic gesture recognition, but imagine being able to hold a controller and still extend a finger to press a button.
The company has also launched a new Graphic Renderer that can curve the user interface and render it in one draw call. This is specifically aimed at mobile and standalone headsets.
Leap Motion’s hand-tracking technology has existed for years, but found a new lease of life in VR. We’ve already seen the company’s tech integrated into Qualcomm’s reference design for standalone VR headsets. Now that Google has partnered with Qualcomm for its WorldSense devices, however, we’re not sure what role Leap Motion will play in them.
The SVVR Expo is currently taking place in Silicon Valley, showcasing the latest developments in virtual reality and augmented reality. The Go Touch VR team formed in January 2016 to make touching objects in VR feel more real. Their solution: small modules that attach to individual fingers. A brilliant idea?
Go Touch VR
The company is taking pre-orders for its development kits on its website, and these should be the same units the startup is showing at the SVVR Expo. Each fingertip attachment contains a small plate that can apply pressure to your finger. According to Road to VR, this makes grabbing objects with the device feel very real.
The current system consists of three attachments that can be fitted to the fingers. While these attachments can apply pressure, they are not tracked in space. So if you want a realistic representation, you also need a Leap Motion controller so that your hands can be tracked. The Go Touch device is thus a complementary product that solves one of the Leap Motion controller’s big problems – the lack of feedback.
For all the enthusiasm, it should not be forgotten that this is not yet a final version for retail or for arcades. It may take some time before the team can offer a finished product. Moreover, hand tracking is still a niche, and it is an open question when consumers will benefit from this development. The number of applications for the Leap Motion controller also remains very limited. Perhaps another reason to visit a VR arcade?
We are also at the SVVR Expo and will try to get a demo.
The video below shows a build of Leap’s Blocks demo designed for mobile VR headsets like the Qualcomm reference design it’s already been integrated into. Here users can create different shapes by pinching their hands together and then pulling them apart. They can then pick up the blocks by making a grabbing shape with their hands. It didn’t run quite as smoothly as this when I tried it at the 2017 Mobile World Congress, but I was able to do everything shown in the video.
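The pinch gesture at the heart of the demo is conceptually simple. A toy detector in Python, where the threshold is our assumption rather than Leap Motion’s actual value:

```python
def is_pinching(thumb_tip, index_tip, threshold=0.025):
    """Register a pinch when the thumb and index fingertips come within
    a small distance (metres) of each other."""
    dist = sum((t - i) ** 2 for t, i in zip(thumb_tip, index_tip)) ** 0.5
    return dist < threshold

# With both hands pinching, the distance between the two pinch points
# could then set the size of the block being created.
print(is_pinching((0.00, 0.10, -0.30), (0.01, 0.11, -0.30)))  # True
```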
Unlike its original sensor, released as an external add-on for PCs and their compatible headsets, its latest tech is built directly into VR devices. That means companies that take Qualcomm’s reference design and build their own mobile VR headsets, the first of which will be releasing later this year, can include hand-tracking if they so choose, but are under no obligation to do so. I speculated that the optional nature of Leap’s support might well help the tech in the long run, as developers won’t need any additional peripherals to support it, just what’s already embedded inside the headset.
The first mobile VR headsets based on Qualcomm’s tech are expected to launch later this year. Whether or not they support hand-tracking remains to be seen.
Qualcomm has debuted an updated version of their VR Headset Reference Design now with Leap Motion’s new 180-degree hand-tracking to bring gesture control to mobile VR headsets. The new headset and Leap Motion tracking module was shown off during last week’s GDC 2017.
Qualcomm’s VR Headset Reference Design has been upgraded to the company’s new Snapdragon 835 mobile platform. The purpose of the headset, which the company calls the VRDK (Virtual Reality Development Kit), is to act as a foundation for Qualcomm’s device partners to make their own VR headsets based on Qualcomm’s mobile computing hardware.
And now, Qualcomm’s latest VRDK brings hand-tracking into the mix thanks to Leap Motion. Leap Motion has been working on hand-tracking technology since 2010, and in recent years has pivoted its focus toward use as an input technology for VR. And while the company’s initial hand tracking device (originally built for use as a desktop peripheral) has seen some use in VR by strapping it to the front of a headset, the limited field of view meant that users had to hold their hands up in front of their face for the device to track them. Ultimately, VR controllers have thus far become the de facto standard for motion input on tethered VR headsets.
But when it comes to mobile VR, where the goal is a single, self-contained unit that doesn’t rely on external tracking sensors or beacons, Leap Motion may have found a perfect fit; hand-tracking is more immersive than the limited rotation-only controllers that we see with Daydream and others (like the newly announced Gear VR controller). Having the tracking be totally on-board also means one less piece of equipment to tote around, helping to keep mobile VR portable and easy to use.
Leap Motion identified this sweet spot a while back and has been teasing a new mobile solution that would address the field-of-view limitation that came from strapping the company’s pre-VR device onto VR headsets. The company formally announced the mobile made-for-VR module in late 2016, and now we’re seeing the first glimpses of integration into Qualcomm’s newest VRDK, which I got to try out at GDC 2017 last week.
Though the new Leap Motion mobile module is technically still an attachment to Qualcomm’s VRDK, Leap Motion says that it will be directly integrated into mobile VR headsets built on the VRDK that opt for the hand-tracking tech.
The new mobile module as seen at GDC 2017 hugs closely to the Snapdragon 835 VRDK and was clearly made to fit the device specifically. With two wide-angle lenses, Leap Motion says the module provides a 180-degree field of view for hand-tracking. Indeed, I could feel a significant difference between the new module and the old one. With the headset on and my hands out in front of me, I could grab objects and move them out of my own field of view through the headset, and when I looked down I could see that I was still holding the object.
The increased tracking field of view is bolstered by smart tweaks to the hand-tracking software: if I was holding an object and then turned my head (causing the object to truly leave the tracking module’s field of view), the software would remember that I was holding that object, and in which hand, once it came back into view, often re-identifying my hand holding the object before it re-entered the headset’s own field of view. It’s a big improvement over the compelling-but-frustrating experience of the original desktop module.
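The behavior is easy to sketch in Python, though the logic here is inferred from what I saw rather than Leap Motion’s actual implementation:

```python
class GrabMemory:
    """When a hand that is holding an object leaves the sensor's field of
    view, remember the grab; when a hand reappears on the same side,
    re-attach the held object immediately."""

    def __init__(self):
        self.lost_grabs = {}   # "left"/"right" -> held object id

    def on_hand_lost(self, side, held_object):
        if held_object is not None:
            self.lost_grabs[side] = held_object

    def on_hand_found(self, side):
        return self.lost_grabs.pop(side, None)  # object to re-attach, if any

memory = GrabMemory()
memory.on_hand_lost("right", "block_7")
print(memory.on_hand_found("right"))  # "block_7" - still holding it
```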
The mobile module doesn’t just have a wider field of view, it’s also built for high power efficiency so that it makes sense to add to mobile VR headsets. Leap Motion says that the module runs at 10x the speed of the original device at significantly lower power. It’s also very tiny.
Finally jumping the hurdle of VR input will eventually require a combination of many different technologies currently in their infancy. Hand-tracking looks likely to play a big part in that future, and Leap Motion is one of a few companies leading the charge in this department.
Leap’s latest, well, leap is to bring its controller-free hand tracking tech to mobile VR headsets. We saw a primitive implementation of the system at the end of last year, but since then it’s been officially integrated into its first mobile VR headset, Qualcomm’s standalone reference design kit that other companies will be able to take and sell their own branded devices with.
That ultimately means that Leap Motion might be officially integrated into not one but several standalone devices within the coming year. Hand-tracking is entirely optional for other companies to include, but doing so, according to Leap, is relatively inexpensive (the original external sensor cost just $80 at launch) and adds very little to the overall weight of a device.
It obviously also doesn’t require any additional controllers or other hardware, and even if the tech is integrated, using it in software is still optional. That’s important; many third-party or experimental VR peripherals not made by headset manufacturers don’t garner much support, but being embedded inside a headset without forcing its use upon people is brilliantly unintrusive. Developers don’t have to shoehorn support in or worry about any install base other than that of the headset itself, and the user is left with options.
We’ve already seen a different kind of hand-tracking in a similar reference design kit from Intel at CES, though I was intrigued to see this option simply because, of the two reference designs, Qualcomm’s is the one I’ve personally found to offer better inside-out tracking, at least in this early stage.
Leap’s demo was its usual one, in which you can put your hands together, pinch, and stretch out to create different-sized blocks. We’ve seen it many times but it worked well on Qualcomm’s device, with the usual glitchy caveats: fingers moving when they weren’t supposed to, hands sticking to blocks when a grabbing motion wasn’t being made, and gestures occasionally not being recognized.
Hopefully Leap can iron out these issues even further before it starts showing up in consumer headsets, though what’s here already is undeniably impressive.
Ultimately, the main questions about Leap are the same ones facing any hand-tracking software on any VR headset right now. While it’s liberating to wave your hands in the air, actually interacting with items lacks haptic feedback. There’s nothing to make you feel the button you push or stop you putting your hand through a desk. There is work going on in this area, but it’s still very early and we doubt we’ll see any kind of integration with Leap anytime soon.
Still, the company envisions its tech being used with productivity apps and other such software that isn’t necessarily as reliant on haptic feedback. Convincing others that this is the perfect input mechanism for those apps that form such a promising part of VR’s future may be key to Leap and other hand-tracking tech staying relevant as more popular devices like Oculus Touch and the HTC Vive wands continue to be improved and refined.
Hand-tracking won’t be the dominant form of VR input for some time, but that doesn’t necessarily matter to Leap Motion. For now, its technology can quite happily exist as additive to mobile and PC-based VR headsets, and not essential to them. That’s why adding it to Qualcomm’s reference design makes so much sense for the company; it’s going to encourage manufacturers to implement the tech and grow a wider install base of hand-tracked headsets organically. Hopefully, great software will follow suit.
Leap Motion’s integration in the Qualcomm reference design is smart technology, but it’s an even smarter business move.
Qualcomm Technologies, Inc. has announced a partnership with hand tracking specialist Leap Motion to bring its interface to the Qualcomm Snapdragon 835 mobile platform.
Hand tracking allows for a much more natural, intuitive interface between virtual reality (VR) users and virtual worlds. The two companies will be showcasing this technology at the upcoming Game Developers Conference (GDC) in San Francisco, and at the Mobile World Congress (MWC) in Barcelona next week.
“As we deliver the new Snapdragon mobile platform for greater immersion with untethered virtual reality HMDs, natural user interfaces like hand movements will help consumers more intuitively interact with VR content and transform the consumer experience. We’re thrilled to work closely with a VR technology leader like Leap Motion so that we can make that possible,” said Tim Leland, vice president, product management, Qualcomm Technologies, Inc. “The Qualcomm Snapdragon 835 was designed to combine six degrees of positional tracking, high VR frame rates, immersive audio and enhanced 3D graphics with real-time rendering in a compact, stand-alone headset for the ultimate VR experience.”
“Technology works best when technology disappears,” said David Holz, Chief Technology Officer, Leap Motion. “Untethered, mobile VR headsets with intuitive, hand-based interaction and position tracking bring a level of quality, immersion, and accessibility to VR unlike anything that’s been seen before. This relationship with a mobile VR processing leader like Qualcomm Technologies is an important step towards making virtual reality truly ubiquitous, and we believe it has the potential to fundamentally transform the human experience.”
To coincide with the Leap Motion announcement, Qualcomm has also revealed a new VR development kit (VRDK) for the Snapdragon 835 mobile platform, giving developers early access to a VR head mounted display (HMD) built with the Snapdragon 835.
“With this new VRDK, we’re providing virtual reality application developers with advanced tools and technologies to accelerate a new generation of VR games, 360-degree VR videos and a variety of interactive education, enterprise, healthcare and entertainment applications,” said Cristiano Amon, executive vice president, Qualcomm Technologies, Inc., and president, QCT. “We see great potential for the exciting new experiences made possible by truly mobile, untethered virtual reality that’s always connected to the internet, and we’re excited to help mobile VR developers more efficiently deliver compelling and high-quality experiences on upcoming Snapdragon 835 VR-capable products.”
The HMD included in the Snapdragon 835 VRDK consists of:
Display: Four-megapixel (2560×1440) WQHD AMOLED display (two megapixels per eye)
Motion Tracking: Six degrees of freedom (6DoF) via two monochromatic stereo one-megapixel (1280×800) cameras with fish-eye lenses, plus an integrated IMU (gyroscope, accelerometer, magnetic compass) with a fast interface to the Snapdragon 835 mobile platform’s sensor core
Eye Tracking: Two monochromatic VGA global shutter cameras with active depth sensing
Memory: 4GB LPDDR4 DRAM and 64GB UFS flash storage
Connectivity: Wireless, with Wi-Fi, Bluetooth and USB 3.1 Type-C (power)
The VRDK is due to launch in the second quarter of 2017, engineered to help developers improve VR performance for upcoming devices built with the Snapdragon 835 that are expected to ship in the second half of 2017.
For further updates from Qualcomm and Leap Motion keep reading VRFocus.
Last September we reported on the fact that Qualcomm was launching its own VR development kit with the ability to deliver standalone VR. What made the VR820 so compelling was that it had 6DoF tracking as well as integrated compute (Snapdragon 820) on par with all the latest flagship phones. It even had support for eye tracking, which we now know came through a partnership with none other than SMI. However, one thing was missing: hand tracking. In fact, Intel was already demoing hand tracking this year at CES with its Project Alloy prototype.
Anyone who has used mobile VR knows that controllers are nice, but unless you can ‘see’ your hands and interact with your surroundings with your hands, the immersion is lost. HTC and Valve achieve this with their Vive controllers, which are super low latency and extremely accurate, and Oculus does it with its Touch controllers and their extremely natural ergonomics. On mobile, in many cases you’re stuck with either a Bluetooth gamepad on Samsung or something like the Daydream controller, which, simply put, isn’t good enough. Thankfully, the team at Leap Motion has been working tirelessly to deliver hand tracking, and late last year launched a much more compact hand-tracking solution specifically aimed at mobile form factors.
Now that their technology has been miniaturized, it can be integrated into platforms. One such platform, launching at MWC and GDC (since both shows are happening simultaneously), is Qualcomm’s new Snapdragon 835 VR development kit. This new development kit features a 2560×1440 AMOLED display, 6DoF tracking, eye tracking, foveated rendering and many other performance and power-saving features. The system is essentially an upgrade over the Snapdragon 820 developer kit that Qualcomm launched at IFA 2016; the real improvements are increased performance, power savings and support for Leap Motion. While we don’t yet know the performance of the Snapdragon 835, the expectation is that its GPU will be quite a bit faster than the Snapdragon 820’s, which is a blessing for VR. The Snapdragon 835 VRDK is expected to be available in Q2 through the Qualcomm Developer Network. The device is really designed to help developers optimize their apps for the Snapdragon 835 HMDs that are due out in the second half of this year.
In addition to announcing the partnership and support of Leap Motion and a new VR development kit based on Snapdragon 835, Qualcomm is also announcing an HMD accelerator program. This program is specifically aimed at accelerating the time to market for HMD manufacturers, which has been an issue for some companies. The program is designed to help HMD manufacturers reduce their engineering costs and time to market so that they can seed the market with these HMDs faster. Part of this program utilizes the newly announced Snapdragon 835 VR HMD and will connect OEMs with ODMs like Thundercomm or Goertek, the two leading HMD ODMs in the world. The program is designed to help OEMs modify the reference Snapdragon 835 VR HMD and enable pre-optimized features like SMI’s eye-tracking and Leap Motion’s hand tracking.
These three announcements are very closely intertwined and show where mobile VR, and more specifically standalone VR, is going. Mobile VR itself will still benefit from the advances that result from these new developments; however, standalone VR is currently the focus of this platform. The interesting thing about the mobile industry and players like Qualcomm is that they can iterate so much more quickly than their PC counterparts that we are seeing mobile HMD feature sets leapfrog PC. The fact that the Snapdragon 835 VR platform will support both eye tracking and hand tracking is huge because both are natural interfaces. Combining hand tracking, eye tracking and voice recognition in a single device means that a user can naturally interface with their VR HMD without ever needing to touch anything. Ultimately, hands-free VR is the holy grail, and I think Qualcomm has brought us one step closer to that reality.
Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including Google, Intel, Qualcomm and Samsung cited or related to this article. I do not hold any equity positions with any companies cited in this column.