Haptic feedback remains an elusive goal for the VR industry. Many companies are attempting to harness the additional immersion haptics can bring to VR, but nothing has caught on just yet. We recently reported on the EXOS glove, which lets you feel your way through virtual spaces, and even a rifle accessory that adds realistic recoil to VR shooters. A vest is a more involved haptics project and, over on Kickstarter, Nullspace VR wants players to wear the Hardlight VR Suit and feel a virtual world all over.
The Hardlight VR Suit comes equipped with 16 positional haptic sensors and vibration nodes. With that many sources, the suit should be able to give accurate feedback depending on where you’re hit, and can even make a sensation travel across multiple nodes, such as when a sword slashes across your upper torso. The suit covers far more of the upper body than older haptic vest projects typically did, bringing feedback to the shoulders, chest, arms, abdomen, and upper back. It also includes a tracking system that augments what a VR headset can provide by supplying limb positioning relative to the headset while measuring inertia.
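To picture how a “traveling” effect like that sword slash might be sequenced across zones, here is a minimal Python sketch. The zone names and SuitDriver interface are hypothetical stand-ins for illustration; Hardlight’s actual SDK may look nothing like this.

```python
import time

# Hypothetical zone layout; Nullspace VR has not published an SDK shaped like this.
UPPER_TORSO_SWEEP = ["left_shoulder", "left_chest", "right_chest", "right_shoulder"]

class SuitDriver:
    """Stand-in for whatever interface actually drives the suit's pads."""
    def pulse(self, zone: str, intensity: float, duration_s: float) -> None:
        print(f"pulse {zone}: intensity={intensity:.2f}, {duration_s * 1000:.0f} ms")

def sweep(driver: SuitDriver, zones: list, total_s: float = 0.3) -> None:
    """Fire adjacent zones in sequence so an impact reads as motion."""
    step = total_s / len(zones)
    for zone in zones:
        driver.pulse(zone, intensity=0.8, duration_s=step)
        time.sleep(step)  # stagger onsets so the sensation travels

sweep(SuitDriver(), UPPER_TORSO_SWEEP)  # e.g. a sword slash across the chest
```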
Feedback vests aren’t a new development for the gaming industry. At every convention or expo for the last handful of years, you were likely to come across some company trying to recreate the sensation of being shot or the kinetic impact of a grenade exploding nearby. None of them stuck, but the market for that type of device was incredibly niche, considering they’d still be paired with a fairly flat, screen-bound gaming experience. The technology is still pretty niche now, but certainly less so, since it is being developed as a companion to a platform built on the very immersion such a device aims to enhance.
The Hardlight VR Suit has been funded to the tune of over $127,000 against a goal of $80,000, and that figure will likely climb during the remaining 13 days of the campaign. The projected delivery time frame for the suit is September of this year, but hardware-based crowdfunding projects often run into unintended delays. We’ll have more updates on this project in the coming months.
Striker VR used the opportunity of GDC 2017 to showcase its high-end virtual reality rifle and tease the next generation of mixed reality gun accessories.
Striker VR is a startup of dedicated VR hardware developers. Its product is a realistic-feeling rifle capable of syncing with VR experiences and creating a deeper layer of immersion for gun-based games. Inside the Striker VR rifle is a battery, wireless electronics and a haptic motor. This motor is what delivers the kick you feel every time you pull the trigger.
I tried the show floor model of the Striker at GDC and I can say definitively that it is the most realistic VR gun peripheral I’ve ever experienced. The recoil on the rifle is strong enough to feel almost uncomfortable, which is exactly what you would want in a firearm facsimile.
On top of the kick, the rifle’s weight is also a source of added realism. This thing is heavy enough that you’ll feel the strain in your shoulders after just a few shots. That could open up some interesting VR fitness applications for the Striker while also making the fake weapon feel more like the genuine article.
During my demo I was strapped into a wireless VR backpack and given a Striker VR rifle and an Oculus Rift headset. An Optitrack large-scale positional tracking system provided enough horsepower to turn 50 square feet of show floor space into a wide-open digital playground. According to Striker VR, it has forged something of a partnership with Optitrack.
Inside the headset I saw a basic white expanse full of multicolored balloons. Without any prompting I opened fire on these innocent plastic spheres and, to my delight, discovered that my weapon had not one but three different modes of fire. The first was a basic semi-automatic rifle burst, the second was a grenade launcher, and the third was a Gears of War-style chainsaw blade. Each was given its own haptic identity by the motor: the grenade kicked the hardest, the chainsaw rumbled consistently, and so on.
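Each mode’s haptic identity amounts to a mapping from fire mode to waveform parameters: how hard the motor kicks, how long each impulse lasts, and whether it repeats. The numbers below are invented for illustration; Striker VR has not published its tuning.

```python
from dataclasses import dataclass

@dataclass
class HapticPattern:
    amplitude: float  # 0.0 to 1.0, how hard the motor kicks
    pulse_ms: int     # length of a single impulse
    repeat_hz: float  # 0 for one-shot recoil, >0 for continuous rumble

# Illustrative values only, not Striker VR's actual tuning.
FIRE_MODES = {
    "semi_auto": HapticPattern(amplitude=0.7, pulse_ms=40, repeat_hz=0.0),
    "grenade":   HapticPattern(amplitude=1.0, pulse_ms=120, repeat_hz=0.0),  # kicks hardest
    "chainsaw":  HapticPattern(amplitude=0.4, pulse_ms=15, repeat_hz=45.0),  # steady rumble
}

def on_trigger_pull(mode: str) -> HapticPattern:
    """Look up the impulse the motor should play for the current fire mode."""
    return FIRE_MODES[mode]

print(on_trigger_pull("grenade"))
```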
According to company reps in the booth, Striker is currently exploring a number of options including acquiring Vive Trackers, working with Vive arcade owners, and beginning pre-orders for their new, market-ready design.
This new model will have the battery in the back, to balance the weight better, and will also feature a more powerful haptic motor and sleeker overall design. Pre-orders for the updated rifle will soon be made available to “location-based” customers only, according to the company.
This means that the hardware is being sold in batches to arcades and larger venues. A commercial version is not yet on the horizon but, according to the company, it is something that may become available in the future.
How do you think virtual reality will improve over the next few years? You’re probably hoping for better ways to see, hear and touch virtual worlds. Michael Abrash, chief scientist at Oculus, seems to agree: when he outlined his predictions for the next five years of VR last October, he focused on these three senses.
But one sense Abrash didn’t mention was smell. Using your nose in VR might sound slightly unnecessary, superfluous even – an optional extra once visuals, audio and haptics have been perfected.
Yet smell is central to how we perceive and remember the world, and without it VR will arguably always be a bloodless imitation of reality. Anosmics, as those without a sense of smell are called, have been found to suffer from a reduced quality of life and even severe depression. Describing the misery of losing her sense of smell, the documentary maker Elizabeth Zierah explained how she felt “dissociated” from the world around her. “It was as though I were watching a movie of my own life,” she wrote, and found anosmia far more traumatic than the effects of a stroke that had left her with a limp.
Smell is also the only sense directly linked to the amygdala, part of the brain closely involved in our feelings, meaning that scents can be particularly evocative of powerful emotional memories. Many of us have had the sensation of catching a whiff of something that takes us back to a particular time, place, and emotional state – something impossible in current VR.
Benson Munyan III, who researches smell and VR at the University of Central Florida, recalls driving out to his grandma’s house as a child. “And as soon as we arrived we would see rose hedges that were on her driveway. So getting out the car the first thing we would smell was rose. That has stuck with me until today.”
Munyan is one of a handful of scientists finding out how we can smell our way around VR. Having served with the US military in Iraq, Kenya and Djibouti, one of his key research interests is getting former soldiers to don VR headsets so they can face up to, and overcome, their traumatic memories. Smell has been used in VR PTSD treatment previously, he explains, but until now the difference it makes to immersion has not been quantified.
Figuring Out How Smell Affects Presence
Along with colleagues, he created a VR experience where you have to search a creepy abandoned carnival at night for your keys. In the same room, they set up a Scent Palette, a $4,000, shoebox-sized silver box that fires out certain smells at the right moment during the experience – so smoke when a ride crashes and bursts into flame; garbage from an overturned bin; and the more pleasant odors of cotton candy and popcorn.
They found that piping in smells gave participants a greater sense of presence as they made their way around the spooky carnival, while removing odors caused their sense of being there to plummet.
But there is a problem: pump too many different smells into a room for too long, and you end up with a very weird mixture of pongs. After lengthy sessions, “that room can smell of smoke, or garbage, or diesel fuel or whatever the combination is,” Munyan says.
Not only might this confuse your nose, but a consumer version would mightily annoy anyone who wants to use the living room after you without it smelling of candyfloss and garbage. Odors also need to be synchronized with your VR experience, but it takes time for a smell to reach you from a box in the corner of the room. By the time you smell smoke, you may have already moved away from a fire in the virtual world.
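One mitigation is to release each scent early by a measured diffusion delay so the odor lands on the in-game event. The scheduler below is a minimal sketch assuming a fixed, pre-measured delay; real delays vary with airflow and room size, and the API is invented.

```python
import heapq

DIFFUSION_DELAY_S = 1.5  # assumed time for a scent to reach the user's nose

class ScentScheduler:
    def __init__(self):
        self._queue = []  # min-heap of (release_time_s, scent)

    def cue(self, scent: str, event_time_s: float) -> None:
        """Queue a release early so the odor arrives with the in-game event."""
        release_at = max(0.0, event_time_s - DIFFUSION_DELAY_S)
        heapq.heappush(self._queue, (release_at, scent))

    def tick(self, now_s: float) -> None:
        """Call every frame; fires any releases whose time has come."""
        while self._queue and self._queue[0][0] <= now_s:
            _, scent = heapq.heappop(self._queue)
            print(f"releasing '{scent}' at t={now_s:.1f}s")

sched = ScentScheduler()
sched.cue("smoke", event_time_s=10.0)  # ride bursts into flame at t=10s
sched.tick(now_s=8.5)                  # scent is released 1.5s ahead
```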
Some companies are already working on these problems. Olorama, a Valencia-based company, produces kits (cost: $1,500) that they say quickly deliver up to ten smells toward headset-wearing users. Their scents include ‘pastry shop’, mojito, anchovies and ‘wet ground’ (gunpowder, blood and burning rubber are ‘coming soon’). They say that their aromas are based on ‘natural extracts’, suggesting they dissipate more rapidly than standard chemical-based scents.
Problems Adding Smells To VR
Another solution might be to build a smell machine into the VR headset itself, meaning odors reach your nose almost immediately and don’t stink out the entire room. Such an idea has already been prototyped: the FeelReal mask, launched on Kickstarter in 2015, promised not only to release smells but also to vibrate and blast your face with hot or cold air and mist.
The mask was not a success, however, and joined an already long list of failed products like Smell-o-Vision and the iSmell. The Verge described wearing a FeelReal mask as like “putting an air freshener in a new car on a hot day. Then imagine burying your face in one of the car’s plastic seats. Then imagine the car’s driver is navigating some tight curves very quickly”. It failed to raise even half of its $50,000 Kickstarter target.
But other contraptions are in the works. A Japanese lab last year came up with a prototype smell machine small enough to hook over an Oculus Rift and sit just below the nose, leaving the lower half of your face uncovered. Rather than using a fan, it atomizes smelly liquids by blasting them with acoustic waves so that they waft upward into your nostrils. The lab says that because this does away with tubes, the machine doesn’t continue to smell when it’s not supposed to, one of the problems that has plagued previous devices.
One crucial feature of this device is that it can vaporize several liquids at the same time, in different concentrations, and so could potentially combine different smells to make others. The holy grail of VR smell research is a basic ‘palette’ of smell components that could be mixed to make thousands of other odors, rather like a headset screen can create any color from a few basic ones. But this will be a considerable scientific challenge.
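To make the palette analogy concrete, an odor can be modeled as a vector of component concentrations, just as a pixel is a vector of red, green, and blue. The components and weights below are invented; as noted above, no validated odor basis like this exists yet.

```python
# Invented component basis purely for illustration.
COMPONENTS = ["smoky", "sweet", "floral", "putrid"]

def mix(weights: dict) -> list:
    """Return per-component vaporizer concentrations, normalized to sum to 1."""
    total = sum(weights.values())
    return [weights.get(c, 0.0) / total for c in COMPONENTS]

campfire = mix({"smoky": 0.8, "sweet": 0.2})   # [0.8, 0.2, 0.0, 0.0]
rose     = mix({"floral": 0.9, "sweet": 0.1})  # [0.0, 0.1, 0.9, 0.0]
print(campfire, rose)
```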
Consciousness-altering Scents
Takamichi Nakamoto, head of the lab at the Tokyo Institute of Technology which created the device, says a “huge amount of data are required to establish odor components [of different smells]. We can collect them to some degree but it is not so easy.”
“Consciousness-altering smells, for example the smell of fear present in the sweat of someone very afraid or scared, are complex mixtures and no-one knows the composition and they will not be synthetically recreated in a hurry,” says Tim Jacob, a smell expert at Cardiff University. “Smell is not like vision where from a primary color palette you can mix all colors.”
So there is a daunting list of technological challenges to solve before smell can be fully incorporated into VR. But the psychological hurdles may be even higher, because of the idiosyncratic way we all experience smell.
This is well illustrated by another experiment, published last October, where participants were told to hunt for a murderer’s knife in a VR house. Those who were exposed to the unpleasant smell of urine as they entered the virtual kitchen rated the experience as more presence-inducing – providing further evidence that smell helps us feel VR is more believable.
But participants often misidentified the urine smell as something else entirely. Some thought it was fish, others garbage, the bad breath of the killer, or the body of the victim, explains Oliver Baus, a researcher at the University of Quebec. Some even thought it was a pleasant smell because it evoked happy memories.
“We had one participant who said when they were young, they drove to school past a farm, and that’s what it smelled like,” he says.
In other words, our reaction to a particular smell is highly dependent on the context, or our previous experiences. “Although some cultural consistency in response to certain odors can be assumed to some degree, because the associations we each have acquired to odors is idiosyncratic, it cannot be assumed on the individual level and therefore cannot be used in a predictive fashion,” says Rachel Herz, adjunct professor at Brown University and author of The Scent of Desire, which explores smell.
If VR developers ever want to include smell in a game, says Baus, they are going to have to give plenty of visual cues to tell players exactly what they are smelling. “The visual is dominant,” he says.
For now, smell in VR is seen as something of a bizarre joke, like the moldy timber and blood scented candle you can light while playing Resident Evil 7. But without using this overlooked sense, VR may never be able to pack the emotional, visceral punch of our real lives. For that reason, incorporating smell may become one of the biggest tasks facing the industry over the coming decades.
On October 12, 2013 Sixense closed its Kickstarter campaign after raising over $600,000 from 2,383 backers. The purpose of the campaign was to raise money for the STEM System: a third-party hardware solution capable of providing positional tracking and hand controllers to virtual reality headsets. Today, these capabilities come standard in the three biggest, high-end VR platforms: Oculus Rift with Touch, HTC Vive and PlayStation VR. Despite a first-party market that has innovated right past it and the growing threat of irrelevance, Sixense still has no definitive plans to begin shipping its STEM systems to the backers that funded it.
On the show floor at GDC 2017 there was a familiar booth: Sixense. Multiple stations had been erected to show off different use cases for the STEM System, and top Sixense executives were inside the booth as well. Our goal was to get answers to three questions: how can the company afford this booth while being unable to ship units to backers, when will backers receive their hardware, and is Sixense out of money?
Photos taken by Anshel Sag at GDC 2017 showing equipment on display at the Sixense booth.
We spoke with Amir Rubin, CEO and co-founder of the company. I approached Rubin at his booth and asked how he could afford space at such an in-demand show so many years after the close of his company’s Kickstarter.
According to Rubin, Sixense raised money following its Kickstarter to keep the company alive and expanding. Rubin would not say how much he raised or who the new investors are. When asked to name his new sources of capital, Rubin said they were “important people.”
Rubin steered me around his booth, showing use cases for the STEM hardware. I saw STEM powering gaming experiences (which Rubin says he is working to bring to casinos) as well as medical applications. The latter were said to be training experiences that let doctors master procedures without harming patients. I asked Rubin when his 2,383 backers could expect to see a return on their contributions. Rubin brought in an additional Sixense employee to explain.
“We needed to redesign the trigger based mount,” the employee explained. “One piece was too big and needs to be slimmed down.”
Rubin said the Sixense manufacturing pipeline is distributed and complex. Designs are reworked and sent to partners in China and Oakland, California, for tooling and completion, he said. I asked Rubin what he wanted to say to the original Kickstarter backers who have been waiting for their systems. Rubin said STEM is continually being rethought and repurposed for things like medical applications. He also said the initial $600,000 influx from Kickstarter wouldn’t even cover the tooling on the plastic for these devices. According to him, Sixense has a team in Israel and the company is spending $150,000 to $175,000 on research and development each month.
According to Rubin, “every day more and more money is being poured in” to the STEM System.
However, nobody I talked to from Sixense at GDC could answer the question of when backers can expect their hardware.
Finally clearing the hurdle of VR input will require a combination of many different technologies currently in their infancy. Hand-tracking looks likely to play a big part in that future, and Leap Motion is one of a few companies leading the charge in this department.
Leap’s latest, well, leap is to bring its controller-free hand-tracking tech to mobile VR headsets. We saw a primitive implementation of the system at the end of last year, but since then it’s been officially integrated into its first mobile VR headset: Qualcomm’s standalone reference design, which other companies will be able to take and sell as their own branded devices.
That ultimately means Leap Motion could be officially integrated into not one but several standalone devices within the coming year. Hand-tracking is entirely optional for other companies to include but, according to Leap, doing so is relatively inexpensive (the original external sensor cost just $80 at launch) and adds very little to the overall weight of a device.
It also doesn’t require any additional controllers or other hardware, and even when it is integrated, using it in software is still optional. That’s important; many third-party or experimental VR peripherals not made by headset manufacturers don’t garner much support, but being embedded inside a headset without forcing its use upon people is brilliantly unintrusive. Developers don’t have to shoehorn support in or worry about any install base other than that of the headset itself, and the user is left with options.
We’ve already seen a different kind of hand-tracking in a similar reference design kit from Intel at CES, though I was intrigued to see this option simply because, of the two reference designs, Qualcomm’s is the one I’ve personally found to offer better inside-out tracking, at least in this early stage.
Leap’s demo was its usual one, in which you can put your hands together, pinch, and stretch out to create different sized blocks. We’ve seen it many times but it worked well on Qualcomm’s device, with the usual glitchy caveats: fingers moving when they weren’t supposed to, hands sticking to blocks when a grabbing motion wasn’t being made, and gestures occasionally not being recognized.
Hopefully Leap can iron out these issues even further before it starts showing up in consumer headsets, though what’s here already is undeniably impressive.
Ultimately, the main questions about Leap are the same ones facing any hand-tracking software on any VR headset right now. While it’s liberating to wave your hands in the air, actually interacting with items lacks haptic feedback. There’s nothing to make you feel the button you push or stop you putting your hand through a desk. There is work going on in this area, but it’s still very early and we doubt we’ll see any kind of integration with Leap anytime soon.
Still, the company envisions its tech being used with productivity apps and other such software that isn’t necessarily as reliant on haptic feedback. Convincing others that this is the perfect input mechanism for those apps that form such a promising part of VR’s future may be key to Leap and other hand-tracking tech staying relevant as more popular devices like Oculus Touch and the HTC Vive wands continue to be improved and refined.
Hand-tracking won’t be the dominant form of VR input for some time, but that doesn’t necessarily matter to Leap Motion. For now, its technology can quite happily exist as additive to mobile and PC-based VR headsets, and not essential to them. That’s why adding it to Qualcomm’s reference design makes so much sense for the company; it’s going to encourage manufacturers to implement the tech and grow a wider install base of hand-tracked headsets organically. Hopefully, great software will follow suit.
Leap Motion’s integration in the Qualcomm reference design is smart technology, but it’s an even smarter business move.
AMD held a press conference at GDC to demonstrate its engagement with the gaming community and made some major VR-related announcements. AMD has long supported VR through its own LiquidVR technology and has been evangelizing VR for quite some time, so these announcements at its Capsaicin and Cream event made complete sense. GDC is a developer-focused conference, so it’s worth remembering that many of these announcements will not have a direct impact on consumers, but rather an indirect one as a result of decisions made by developers.
The first major announcement from AMD was that it has worked with Valve to support asynchronous reprojection, Valve’s feature for reducing judder when an application can’t hold frame rate. The feature is akin to Oculus’ asynchronous timewarp, but for Valve’s Vive platform. AMD will support it through a driver update, and Valve will support it through an update to SteamVR, the VR component of the Steam platform. Valve actually launched this feature back in November with NVIDIA; now AMD is bringing support to its GPUs in March, a welcome addition for anyone running an AMD GPU with a Vive.
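Conceptually, asynchronous reprojection is a fallback path in the compositor: if the application’s next frame isn’t ready at vsync, the previous frame is re-warped using the latest head rotation so the world stays visually locked in place. The sketch below is a toy illustration of that decision only; Valve’s actual implementation involves pose prediction and GPU preemption.

```python
FRAME_BUDGET_MS = 1000.0 / 90.0  # ~11.1 ms per refresh on the Vive's 90 Hz panel

def compose(new_frame, last_frame, latest_head_rotation):
    """Pick what to show this refresh: a fresh render or a re-warped old one."""
    if new_frame is not None:
        return new_frame  # the app hit its deadline; present the fresh frame
    # App missed vsync: rotationally re-project the previous frame to the
    # newest head pose instead of repeating it unchanged, which would judder.
    return warp(last_frame, latest_head_rotation)

def warp(frame, rotation):
    """Placeholder for the GPU pass that rotates the image to the new pose."""
    return frame  # illustrative no-op
```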
AMD also added support for a forward rendering path in Unreal Engine 4, one of the most popular engines in the world and a staple of top game developers. Forward rendering matters for VR image quality because it allows hardware MSAA, which produces cleaner edges on a headset’s display than the screen-space anti-aliasing typically paired with deferred rendering. As a result, many applications use forward rendering to deliver faster, better-looking VR. Not every developer will find forward rendering to be their cup of tea, but supporting the option is important for AMD to stay relevant in VR.
Last but not least, AMD announced its biggest partnership of the year, and possibly in the company’s history, with game developer and publisher Bethesda. This partnership will very likely stretch into areas like VR, which is why it’s such a big deal. After all, Bethesda is bringing Fallout 4 to VR, and it sounds like the game will ship with support for Vulkan, a low-level API that can squeeze the most performance out of virtually any CPU and GPU combination. AMD’s partnership with Bethesda is clearly designed to get better support for AMD’s GPU and CPU features into games and to accelerate performance in VR and other applications.
AMD did not announce anything regarding its new GPU, code-named Vega, other than the fact that it will commercially be called Vega. Many people have been anticipating AMD’s newest GPUs built on the Vega architecture, but in the meantime NVIDIA has announced its own GTX 1080 Ti, which appears to have once again raised the bar AMD must clear to compete.
Disclosure: Anshel’s firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including AMD, NVIDIA and others. He does not hold any equity positions with any of the companies cited.
As many of you already know, NVIDIA is one of the leading purveyors of the graphics cards heavily used for VR. In fact, NVIDIA is the go-to solution for the majority of users, so it is always a good day when the company launches a new graphics card, especially a brand new high-end one. The new GeForce GTX 1080 Ti sits at the top of NVIDIA’s lineup and is designed to be the fastest card the company has offered to date. Yes, that includes the GeForce GTX Titan X Pascal, previously the company’s fastest and most sought-after graphics card.
The specs for the GPU itself are:
12 billion transistors
1.6 GHz base clock, 2.0 GHz boost (1.4 GHz base, 1.5 GHz boost on Titan X Pascal)
3,584 CUDA cores (same as Titan X Pascal)
352-bit memory bus (384-bit on Titan X Pascal)
11 Gbps memory speed (10 Gbps on Titan X Pascal)
11 GB of RAM (12 GB on Titan X Pascal)
220-watt TDP (250 watts on Titan X Pascal)
The expectation, according to NVIDIA, is that the GTX 1080 Ti will be 35 percent faster on average than a GTX 1080, which should mean it will outperform the GTX Titan X Pascal in gaming and VR. The GTX 1080 Ti will ship in March for $699. NVIDIA also killed the DVI port on the new card, which won’t really be missed: it has three DisplayPort connectors and one HDMI connector, allowing a three-monitor DisplayPort configuration plus an HDMI display, which is what I’m running at home right now.
In addition to the new GPU, NVIDIA is announcing support for VRWorks inside of Unity, including support for VR SLI, Multi-Res Shading, Lens Matched Shading and Single Pass Stereo. Thanks to the VRWorks features on NVIDIA’s GPUs, benchmarks like Basemark’s VRScore saw as much as a 40% performance uplift. NVIDIA is also introducing its own tool to measure VR quality, called FCAT VR. The tool is built on NVIDIA’s frame-capturing technology and aims to reveal real-world performance by measuring the frames actually delivered to the headset. A companion FCAT data analyzer lets anyone dig into a game or application’s frame-time behavior.
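The arithmetic underlying that kind of frame-delivery analysis is straightforward: compare each delivered frame time against the headset’s refresh budget. This sketch shows the principle only and is not FCAT VR’s actual algorithm.

```python
FRAME_BUDGET_MS = 1000.0 / 90.0  # ~11.1 ms per refresh at a headset's 90 Hz

def summarize(frame_times_ms: list) -> dict:
    """Flag frames that blew the budget; these are reprojection candidates."""
    misses = [t for t in frame_times_ms if t > FRAME_BUDGET_MS]
    return {
        "frames": len(frame_times_ms),
        "missed_vsync": len(misses),
        "worst_ms": max(frame_times_ms),
        "avg_fps": 1000.0 / (sum(frame_times_ms) / len(frame_times_ms)),
    }

print(summarize([10.9, 11.0, 12.4, 10.8, 22.3, 11.1]))  # 2 misses in 6 frames
```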
While we don’t know exactly how much faster it will be than the Titan X Pascal, the expectation is that it will be faster, and significantly faster than the GTX 1080. That is a good thing, because it means VR developers can really start to look at enabling eye-candy features in their applications, and we can start to think about increasing the resolution of VR HMDs down the road as well. Higher-resolution displays already exist, and graphics cards like the GTX 1080 Ti are going to be critical to driving those resolutions at acceptable frame rates.
Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including NVIDIA and others. I do not hold any equity positions with any companies cited.
Lower-priced cameras from long-time motion tracking company Optitrack could slash as much as 40 percent off the cost to track VR headsets and accessories over very large areas. The price cut could accelerate the roll-out of out-of-home VR experiences like The Void.
The Void covers very large regions with Optitrack cameras overhead to find the locations of people, controllers or other objects that are part of the overall story. In The Void’s first public installation in New York, Madame Tussauds offers a Ghostbusters experience that makes visitors feel like they are really catching ghosts throughout a building. Immersion can be dialed up on these “stages” by enhancing the experience with wind, heat or scent effects that tie to the story. Ghostbusters is a particularly smart fit for The Void because you wear a backpack powering the wireless headset that ends up feeling exactly like a proton pack.
When we got a look at the refined Rapture hardware from The Void, co-founder James Jensen noted the controller and headset are no longer covered with external tracking markers.
Typically, Optitrack covers objects or people with light-colored reflective balls or dots to track their movements. It turns out The Void is one of the very first systems equipped with Optitrack’s latest “active” setup, which embeds lights in objects rather than covering them with easy-to-break balls. The Void is also now employing a significant upgrade to the visuals seen inside its Rapture VR helmet, and the startup aims to open 20 of its hyper-immersive “stages” this year.
While Valve Software is working on improved base stations for its innovative lighthouse tracking system used by the HTC Vive, we haven’t heard a definitive answer one way or the other about whether the technology might one day be extensible to cover very large regions. Today, a large-scale virtual world like those made by The Void turns to a camera-based tracking technology like Optitrack. IMAX VR, in contrast, equipped room-sized pods with Vive tracking base stations for its VR arcade initiative.
“In 2015, the number of out-of-home VR tracking experiences that we sold into, it was a couple dozen systems,” said Optitrack Chief Strategy Officer Brian Nilles. “In 2016, we probably sold 400 to 500 systems in VR tracking. Some of them are research, some of them are R&D for universities, but a lot of them are out-of-home experiences that are in Asia, Europe and growing in North America as well. So in 2017, it seems like the market is getting traction.”
The Void is just one among a field of companies looking to establish a market for a new kind of destination entertainment mixing elements of storytelling and exploration with paintball or laser tag. A price drop like Optitrack’s with cameras tuned specifically for VR usage could be precisely the boost needed to make these types of locations more common.
From an Optitrack press release:
At the core of OptiTrack Active is a set of infra-red LEDs synchronized with OptiTrack’s low latency, high frame rate, Slim 13E cameras, delivering real time marker identification as well as positioning. This differs from OptiTrack’s passive solution, which requires that reflective markers be configured in unique patterns for each tracked object. This can add a great deal of complexity for high volume manufacturing and large-scale deployments of HMDs or weapons. With OptiTrack Active over 100 objects can be tracked simultaneously over areas greater than 100’x100’ (30mx30m)…
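The “real time marker identification” claim implies each LED can be told apart by how it is driven across camera frames. OptiTrack has not published its scheme, so the following is a loudly hypothetical illustration of one classic approach: encoding a marker ID as an on/off pattern over successive synchronized exposures.

```python
def decode_marker_id(blink_pattern: list) -> int:
    """Treat an observed on/off sequence across frames as the bits of an ID."""
    marker_id = 0
    for lit in blink_pattern:  # 1 = LED seen lit in that frame, 0 = dark
        marker_id = (marker_id << 1) | lit
    return marker_id

# Four frames of observations give 16 distinct IDs; more frames, more markers.
print(decode_marker_id([1, 0, 1, 1]))  # -> 11
```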
The newer Slim 13E cameras are priced around $1,500, while the equivalent hardware for the older “passive” dot-tracking system cost around $2,500; that $1,000 difference per camera is the 40 percent savings. Covering large regions can require dozens of these cameras, so the cost adds up very quickly. The image below, provided by Optitrack, imagines an enormous space with cameras placed evenly overhead.
Last September we reported that Qualcomm was launching its own VR development kit with the ability to deliver standalone VR. What made the VR820 so compelling was that it had 6DoF tracking as well as integrated compute (Snapdragon 820) on par with the latest flagship phones. It even had support for eye tracking, which we now know came through a partnership with none other than SMI. However, one thing was missing: hand tracking. In fact, Intel was already demoing hand tracking this year at CES with its Project Alloy prototype.
Anyone who has used mobile VR knows that controllers are nice, but unless you can ‘see’ your hands and interact with your surroundings using them, the immersion is lost. HTC and Valve do this with their super low latency, extremely accurate Vive controllers, and Oculus does it with its Touch controllers and their extremely natural ergonomics. On mobile, in many cases you’re stuck with either a Bluetooth gamepad on Samsung or a controller like the Daydream controller, which, simply put, isn’t good enough. Thankfully, the team at Leap Motion has been working tirelessly on hand tracking and late last year launched a much more compact tracking solution aimed specifically at mobile form factors.
Now that the technology has been miniaturized, it can be integrated into platforms. One such platform, launching at MWC and GDC (since both shows are happening simultaneously), is Qualcomm’s new Snapdragon 835 VR development kit. It features a 2560×1440 AMOLED display, 6DoF tracking, eye tracking, foveated rendering and many other performance and power-saving features. The system is essentially an upgrade over the Snapdragon 820 developer kit that Qualcomm launched at IFA 2016, with the real improvements being increased performance, power savings and support for Leap Motion. While we don’t yet know the performance of the Snapdragon 835, the expectation is that its GPU will be quite a bit faster than the Snapdragon 820’s, which is a blessing for VR. The Snapdragon 835 VRDK is expected to be available in Q2 through the Qualcomm Developer Network and is really designed to help developers optimize their apps for the Snapdragon 835 HMDs due out in the second half of this year.
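Eye tracking is what makes foveated rendering pay off: the GPU spends full shading resolution only where the eye is pointed and progressively less toward the periphery. Here is a toy falloff model; the band sizes and scale factors are invented, not Qualcomm’s.

```python
import math

def shading_scale(pixel_xy, gaze_xy, fovea_px=200, mid_px=500):
    """Return the fraction of full shading resolution to spend at a pixel."""
    dist = math.dist(pixel_xy, gaze_xy)
    if dist < fovea_px:
        return 1.0   # full resolution where the eye is actually looking
    if dist < mid_px:
        return 0.5   # half resolution in the near periphery
    return 0.25      # quarter resolution in the far periphery

print(shading_scale((960, 720), gaze_xy=(1000, 700)))  # near the gaze -> 1.0
```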
In addition to announcing the partnership and support of Leap Motion and a new VR development kit based on Snapdragon 835, Qualcomm is also announcing an HMD accelerator program. This program is specifically aimed at accelerating the time to market for HMD manufacturers, which has been an issue for some companies. The program is designed to help HMD manufacturers reduce their engineering costs and time to market so that they can seed the market with these HMDs faster. Part of this program utilizes the newly announced Snapdragon 835 VR HMD and will connect OEMs with ODMs like Thundercomm or Goertek, the two leading HMD ODMs in the world. The program is designed to help OEMs modify the reference Snapdragon 835 VR HMD and enable pre-optimized features like SMI’s eye-tracking and Leap Motion’s hand tracking.
These three announcements are closely intertwined and show where mobile VR, and more specifically standalone VR, is going. Mobile VR itself will still benefit from the advances that result from these developments, but standalone VR is currently the focus of this platform. The interesting thing about the mobile industry and players like Qualcomm is that they iterate so much more quickly than their PC counterparts that we are seeing mobile HMD feature sets leapfrog PC. The fact that the Snapdragon 835 VR platform will support both eye tracking and hand tracking is huge, because both are natural interfaces. Combining hand tracking, eye tracking and voice recognition in a single device means a user can interact naturally with a VR HMD without ever needing to touch anything. Ultimately, hands-free VR is the holy grail, and I think Qualcomm has brought us one step closer to that reality.
Disclosure: My firm, Moor Insights & Strategy, like all research and analyst firms, provides or has provided research, analysis, advising, and/or consulting to many high-tech companies in the industry, including Google, Intel, Qualcomm and Samsung cited or related to this article. I do not hold any equity positions with any companies cited in this column.
Back in August of last year, Valve started to roll out its innovative, royalty-free tracking technology. The company made a development kit available to licensees, but only if they attended a $3,000 training session teaching the ins and outs of the tech. The introductory course was likely a bit of quality control, but its price was also a daunting obstacle for some. That is no longer a concern: Valve is removing the course requirement, making the highly regarded tracking technology more readily available.
Valve has over 500 companies signed up currently, though that number is sure to change a great deal in response to this new development. The original in-person training course will still be available, but the coursework (in English or Chinese) will be available for free.
On top of all this, the SteamVR base stations that emit lasers to track sensors throughout the room will be available directly from Valve later this year.
The tech itself opens up a plethora of opportunities for enhancing the immersion of VR. SteamVR Tracking works with low-weight sensors that can be placed on various objects so they can be brought into virtual spaces. For example, players could be handed realistic props for baseball, ping pong, or even shooters, and those props would be tracked accurately in whatever experiences were built around them.
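In application code, a tracked prop boils down to a pose (position plus orientation) delivered every frame and pinned to a virtual object. The types below are hypothetical stand-ins rather than the real OpenVR API, which reports device poses as device-to-world matrices.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple  # (x, y, z) in meters, room space
    rotation: tuple  # orientation quaternion (w, x, y, z)

@dataclass
class VirtualProp:
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (1.0, 0.0, 0.0, 0.0)

def sync_prop(tracker_pose: Pose, prop: VirtualProp) -> None:
    """Pin the virtual bat, paddle, or rifle to the physical prop each frame."""
    prop.position = tracker_pose.position
    prop.rotation = tracker_pose.rotation

bat = VirtualProp()
sync_prop(Pose((0.1, 1.2, -0.5), (1.0, 0.0, 0.0, 0.0)), bat)
print(bat.position)
```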
At the beginning of the year, we addressed the idea of SteamVR Tracking potentially being 2017’s most important VR technology, and it is very encouraging to see it made available in such a way. As it makes its way into the hands of more creatives and engineers, we’ll hopefully be able to find out if a more immersive hardware and accessory ecosystem will bring VR into more homes.