Hands-on: HaptX Glove Delivers Impressively Detailed Micro-pneumatic Haptics, Force Feedback

The company formerly known as AxonVR, which has raised more than $5 million in venture capital, is rebranding to HaptX and revealing a functional prototype of a VR glove which uses micro-pneumatics for detailed haptics and force feedback to the fingers. After trying the prototype for myself, I came away impressed with the tech. The company’s next challenge is to turn the prototype into something sleeker, smaller, and far more practical.

Fitting Procedure

Photo by Road to VR

Meeting with HaptX co-founder Jake Rubin in Silicon Valley earlier this month, I got to try the latest prototype of the company’s wild-looking haptic VR glove—a monstrous piece of equipment hooked up to some massive cabling. Putting it on—with the help of two people by my sides—I felt like I was preparing for a medical procedure, as the pair showed me how to carefully guide my fingers into the right places, pull out some fabric slack, and then tighten the glove to my hand with a ratcheting mechanism to ensure a snug fit. But the point of this prototype is not about size and fit, it’s all about function. And function it did—the haptics and force feedback were the most responsive and detailed I’ve tried to date, in part thanks to the glove’s micro-pneumatics.

Micro-pneumatics

Photo by Road to VR

The HaptX glove is based on innovative micro-pneumatic technology. The company has developed a method for producing thin, flexible fabrics manufactured with a series of tiny air channels along their length, which terminate in small inflatable circles that act as “haptic pixels,” according to Jake Rubin, one of the company’s two co-founders and its CEO. The inflatable circles, just a few millimeters across, are aligned into grids; by precisely controlling when and which haptic pixels inflate, a convincing sensation can be created, simulating the feeling of an insect crawling along your finger or a marble rolling around in the palm of your hand.

The glove also features force feedback: the ability to restrict the movement of your fingers to simulate holding objects. This too is based on the company’s micro-pneumatic technology, which Rubin explained works by inflating stoppers along the joints of your fingers to restrict their movement. The effect is that when you reach out to grab an object, say, a baseball, your fingers stop right where they would come into contact with the baseball’s surface.
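Conceptually, the scheme Rubin describes (a grid of inflatable 'haptic pixels' for tactile detail, plus pneumatic joint stoppers for force feedback) might be sketched as follows. All names, grid sizes, and parameters here are illustrative assumptions, not HaptX's actual SDK:

```python
# Hypothetical sketch of the control scheme described above: each fingertip
# and palm region carries a small grid of inflatable "haptic pixels", and
# pneumatic stoppers at the finger joints restrict motion for force feedback.
# All names and values are illustrative, not HaptX's actual SDK.

from dataclasses import dataclass, field

GRID_W, GRID_H = 4, 5  # e.g. a 4x5 patch of millimeter-scale pixels

@dataclass
class HapticPatch:
    # Inflation level per pixel, 0.0 (flat) to 1.0 (fully inflated).
    pixels: list = field(
        default_factory=lambda: [[0.0] * GRID_W for _ in range(GRID_H)]
    )

    def apply_contact(self, cx, cy, pressure, radius=1.0):
        """Inflate pixels near a contact point, falling off with distance."""
        for y in range(GRID_H):
            for x in range(GRID_W):
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                falloff = max(0.0, 1.0 - d2 / (radius ** 2 + 1e-9))
                self.pixels[y][x] = min(1.0, pressure * falloff)

@dataclass
class FingerJoint:
    stopper_pressure: float = 0.0  # 0 = free movement, 1 = locked

def grab_object(joints, at_surface_flags):
    """Lock each joint's stopper once that joint has reached the angle
    where the virtual object's surface would be."""
    for joint, at_surface in zip(joints, at_surface_flags):
        joint.stopper_pressure = 1.0 if at_surface else 0.0
```

The sketch only captures the control logic; the real glove drives these effects pneumatically from contact information supplied by the physics engine.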

HaptX CEO Jake Rubin | Photo by Road to VR

Both effects were impressively responsive and quite convincing. I’ve tried a few other similar systems, but the haptics from the HaptX glove blew the others away. The glove puts the haptic material across the palm of your hand and on the tips of your fingers—totaling some 100 individual haptic pixels—allowing you to feel a finely detailed sensation of pressure in all those places. The range of tactile sensations was ultimately surprising, as I discovered when I was thrown into the company’s farm-themed menagerie of tactile examples.

Feeling the Farm

Photo by Road to VR

Rubin walked me through the demo experience, built using HaptX’s own SDK, which he says largely leverages Unreal Engine’s physics system to tell the glove when and where to apply haptic effects and when and how to engage the force feedback.

With the glove on my right hand, and wearing an HTC Vive headset, I was looking down at a miniature barnyard with some little sunflowers off to the right and a tiny patch of wheat in front of a barn. Rubin encouraged me to start poking and prodding at the scene. Each of the glove’s fingers is tracked by a proprietary magnetic tracking system which Rubin claims is capable of sub-millimeter precision. Indeed it worked well.

As I reached out with my index finger to gently touch the leaves on the side of the sunflower, I could feel pressure against my finger that quickly and closely followed the visuals, making it easy to connect the feeling with the image.

Next to the sunflowers was a little grey storm cloud, and when I poked it, pea-sized raindrops began falling from within. Stretching out my palm to catch them, I felt a convincing pitter-patter of pressure right on my palm. A similarly convincing moment came when I brushed my palm across the tops of the tiny wheat plants.

Photo by Road to VR

Eventually a baseball-sized tractor came rolling out of the barn. When I went to pick it up like a little toy, my fingers stuck in place—seemingly right against the virtual tractor’s surface—and wouldn’t budge. It was a convincing effect, especially combined with the haptics putting pressure on the tips of my fingers as though I was holding something. Although there’s no way for the glove to simulate the weight of the object, making it feel real at least in terms of the volume it takes up is a big step. ‘Mock grabbing’ items with controller-less VR hand input feels really unnatural, but the quality of this force feedback remedied that with ease.

There was more to do and see in the demo, including some tiny critters that came out of the barn to dance around on my palm so that I could feel their little steps (including the eight legs of a spider, which Rubin tells me is the most contentious part of the demo). In the end, the whole experience provided me with a new benchmark for small-scale haptics in VR.

Continued on Page 2: Enterprise First »

The post Hands-on: HaptX Glove Delivers Impressively Detailed Micro-pneumatic Haptics, Force Feedback appeared first on Road to VR.

‘Haptic Shape Illusion’ Allows VR Controllers to Simulate Feel of Physically Larger Objects

In a study led by Eisuke Fujinawa at the University of Tokyo, a team of students created a procedure for designing compact VR controllers that feel physically larger. Exploring the concept of ‘haptic shape illusion’, the controllers have data-driven, precise mass properties, aiming to simulate the same feeling in the hand as the larger objects on which they are based.

Simulating the feel of real objects is a fundamental haptics challenge in VR. Today’s general-purpose motion controllers work best when the virtual object is reasonably similar in size and weight to the controller itself; very large or heavy virtual objects immediately seem unrealistic when picked up.

One solution is to use specific controllers for a given application—for instance attaching a tracker to a real baseball bat; in a hands-on with one such solution, Road to VR’s Ben Lang described the significance of gripping a real bat and how that influenced his swing compared to a lightweight controller. But swinging a controller the size and weight of a baseball bat around your living room probably isn’t the best idea.

As shown in the video below, researchers from the University of Tokyo attempted to create much smaller objects that retain the same perceived size. The team designed an automated system which takes the original weight and size of an object and then creates a more compact but similar feeling output through precise mass arrangement.

The paper refers to several ecological psychology studies into how humans perceive the size of an object through touch alone, supporting the idea that perceived length and width are strongly related to the moment of inertia about the hand position.

The team concentrated its efforts on this haptic shape perception, collecting data from participants wielding different sample controllers in VR to determine their perceived sizes, having never seen the controllers in reality. This data allowed the creation of a ‘shape perception model’, which optimises the design of a large object within smaller size constraints, outputting CAD data for fabrication.

The object is deformed to fit the size constraints, holes are cut out, and weights are placed at specific points to maintain the original moment of inertia.
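The key invariant in this process is the moment of inertia about the grip point. As a hand-rolled illustration (not the paper's actual optimizer), one can model an object as point masses along its length and solve for the counterweight a smaller controller needs in order to feel the same:

```python
# Illustrative calculation (not the paper's actual optimizer): model an
# object as point masses along one axis and compute its moment of inertia
# about the grip point. A compact controller "feels" like the larger
# object when this value matches.

def moment_of_inertia(masses, positions, grip=0.0):
    """I = sum(m_i * (x_i - grip)^2) for point masses (kg*m^2)."""
    return sum(m * (x - grip) ** 2 for m, x in zip(masses, positions))

# A ~1 m "bat": 1 kg spread uniformly along its length, gripped at one end.
bat_I = moment_of_inertia([0.1] * 10, [0.05 + 0.1 * i for i in range(10)])

def counterweight_for(target_I, base_I, weight_pos):
    """Mass needed at weight_pos so that base_I + m * pos^2 == target_I."""
    return (target_I - base_I) / weight_pos ** 2

# A 0.4 m controller body: to reproduce bat_I within the shorter length,
# a counterweight is concentrated near the far end.
base_I = moment_of_inertia([0.05] * 4, [0.05, 0.15, 0.25, 0.35])
m_extra = counterweight_for(bat_I, base_I, 0.4)
```

Note how matching a long object's inertia within a short body requires a substantial mass placed far from the grip, which is exactly why the fabricated controllers carry weights at specific points.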

Image courtesy Fujinawa et al.

The team had VR developers in mind, as this approach could offer a potential benefit in demonstrating a product with a more realistic controller. The CAD data output means that smaller, safer prototype controllers that give the impression of wielding larger objects can be created quickly with a laser cutter or 3D printer.

Further information and the full paper are available on Fujinawa’s website. The research is being presented at this week’s VRST 2017, the 23rd ACM Symposium on Virtual Reality Software and Technology held in Gothenburg, Sweden.

The post ‘Haptic Shape Illusion’ Allows VR Controllers to Simulate Feel of Physically Larger Objects appeared first on Road to VR.

Tactai Wants To Bring Better Touch Feedback To Virtual Reality

Virtual reality and augmented reality have delivered realistic visuals and sound to virtual experiences, but they’re completely lacking a sense of touch. Tactai wants to change that with a new kind of multi-modal touch feedback technology.

If it works, it could lead to a new generation of VR and AR headsets that can add a new sense to immersive experiences and games. Tactai’s technology goes further than past touch feedback — or haptic — devices since it gives you a much more fine-grained sense of what you’re touching, based on a demo I saw. Tactai hopes to be part of the standard technologies in the next set of VR devices.

“We want to bring a rich sense of touch into the digital world to empower our interactions in a unique way,” said Steven Domenikos, cofounder and CEO of Tactai, in an interview with GamesBeat.

Above: Steven Domenikos is CEO of Tactai.

Image Credit: Dean Takahashi

Waltham, Mass.-based Tactai has created software that shows off its “ultra high-fidelity haptics.” The technology is in its prototype stage, and the company is seeking partners to bring products to the market that can enhance our digital interactions. Domenikos believes the tech can be used in everything from smartphones to pen-based tablets to VR and AR hand controllers.

In the demo, which you can see in the video, Domenikos used a standard pen input device to allow me to feel the texture of objects that I touched. When he activated the software associated with a stone tile, I moved the pen over the screen showing the stone tile. It felt rough and bumpy, just like real stone. The pen vibrated and enabled me to feel the sensations.

Above: Tactai uses a pen to demo the feeling of a stone tile.

Image Credit: Tactai

He also showed me the feeling of ABS plastic as well. It can give you the sensation of pressure and counter pressure. You can feel temperature, softness, hardness, smoothness, curvature, and texture as you do in the real world. Tactai has created algorithms to replay fine patterns, softness, textures, pressure, heat, cold, friction, and other sensations in a finger-sized device. When I tried the demo, I could feel how the sandpaper felt different from the glass, and the stone felt different from the plastic.

He also showed me a fingertip wearable device that Tactai built, so you can touch and feel and manipulate an object in VR. When you combine these feelings with visuals and sound (and maybe one day smell and taste), you’ll get a much richer sense of a digital world. Without touch, the VR worlds that you experience can feel hollow, Domenikos said. Touch creates a much better sense of “presence” — or the feeling you have been transported to another reality.

“Without touch, VR is interesting, but touch can make it compelling,” Domenikos said. “What will make the experiences variable is the type of hardware.”

Above: With this device, you can put your finger into VR and feel things.

Image Credit: Tactai

The company is delivering the software that it will license to partners, and it has built reference hardware that can serve as the foundation for the products that the partners are creating. Tactai creates a new material by scanning it into digital form, and then, it creates a waveform that mimics mother nature.

“That is all done with software,” he said. “We take into consideration the real-time movements of the user. If you press hard, it feels different. If you press soft, it feels another way.”

Domenikos believes that compelling, immersive AR and VR experiences can be created for gaming, e-commerce, training, film, and education. Tactai already worked with Ericsson to co-develop a tactile VR experience for mobile devices at the CES tech trade show in Las Vegas in January. Tactai is working with an online retailer, a digital media distributor, and a credit card processor. Of course, the ultimate customers could be the porn companies.

“When I give a demo, people always talk about that,” Domenikos said. “We’re all imaginative people.”

Domenikos hopes a shared VR experience will be possible — like playing a Jenga tabletop blocks game with a friend.

“What’s going to drive this is the ability to have a more shared, immersive experience,” he said.

Domenikos started the company with his cofounder Katherine Kuchenbecker, a professor at the University of Pennsylvania, in 2014, and they formally incorporated in January 2016. Their work builds on haptics research going back 15 years.

Rivals include Immersion, Tactical Haptics, and Ultrahaptics. But Tactai wants to create fine-grained sensations, which can be localized to particular areas. It can give you a sense of touching a keyboard when you tap a part of a touchscreen on a smartphone or a tablet. Right now, phones have haptic actuators that are meant only to vibrate a smartphone. That’s very low fidelity and coarse-grained touch feedback, Domenikos said.

“We can now deliver a very rich sensation that can be used in a variety of sensations,” Domenikos said.

The company has about 15 employees and contractors in the Boston area and Montreal. Revenue will start coming this year from licenses.

This post by Dean Takahashi originally appeared on VentureBeat.

SIGGRAPH 2017: This Controller Changes Shape To Match Virtual Objects

A new controller shown at SIGGRAPH could change shape from rigid to squishy in a demonstration that could lead to more immersive touch sensations in virtual worlds.

The research from Cornell University’s Organic Robotics Lab builds on an earlier demonstration that pumped air into a Vive controller for a variety of sensations, including firing a gun and a liquid passing through the grip of the controller.

This new demonstration used a Vive Tracker, leaving the entirety of the base of the handheld device as a playground for haptic innovation. The overall project is only six months old and Vive Trackers only started shipping to developers in recent months, so there is a lot that still might be possible with this approach which works by pumping air into the device to change its shape.

The project is a collaboration between NVIDIA and Cornell, with PhD candidates Ben Mac Murray and Bryan Peele working on it. NVIDIA’s VR Funhouse software was used to demonstrate a pair of virtual swords. One sword was flimsy and unable to pop balloons while the other was firm and sharp and able to pop them with ease. The controller changed its rigidity based on the amount of air pumped into it, simulating a rough approximation of each sword. It is a relatively simple demonstration, but Peele said he thinks the research can be taken much further very quickly, using sensors and actuators to react to different people and how hard they are gripping the controller.

“You can now match whatever object you have in a game to whatever that visual representation is — be it a sword or a gun — can actually become that object in your hand,” said Peele. “Rather than having the same diameter and the same rigidity for every object in the game, our controller can actually morph and change shape, change stiffness. So a sponge feels different than a rock and a rope feels different than a sword.”
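The behavior Peele describes could be reduced to a simple mapping from the held virtual object to a pump-pressure command. This sketch uses invented stiffness values and an assumed actuator limit, not the Cornell/NVIDIA implementation:

```python
# Illustrative mapping for a shape-changing pneumatic controller:
# more air pressure -> stiffer, firmer grip; less -> soft and squishy.
# Stiffness values and the pressure limit are assumptions, not the
# researchers' actual code or hardware specs.

# Per-object target stiffness on a 0..1 scale, per the examples quoted:
OBJECT_STIFFNESS = {
    "sponge": 0.1,
    "rope":   0.2,
    "sword":  0.9,
    "rock":   1.0,
}

MAX_PRESSURE_KPA = 50.0  # assumed pump/actuator limit

def pressure_for(obj_name):
    """Map the held virtual object to a pump pressure command (kPa)."""
    stiffness = OBJECT_STIFFNESS.get(obj_name, 0.5)  # default: medium
    return stiffness * MAX_PRESSURE_KPA
```

In a real system this lookup would be driven by the game engine whenever the player's hand picks up a new object, with the pump adjusting pressure continuously rather than in a single step.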

One major problem to overcome with this approach is that each controller prototype was tethered. Peele said they are considering a variety of options to make it more portable. This type of technology might have potential in a VR arcade setting where people wear backpack PCs with large batteries to provide the power needed to make it work.

“Energy requirements are a technical challenge, but we are currently developing a portable system that wouldn’t need to be tethered to the computer,” Peele wrote in an email.

SIGGRAPH 2017: Feel A Hot Desert Or Freezing Mountain In This Doctor Strange-Inspired VR Project

Researchers from National Taiwan University and Tamkang University are presenting a project at SIGGRAPH this week that shows potential for VR arcades as it recreates realistic weather effects that could enhance immersion.

Their system is suspended from the ceiling with modules inside that send down cold air, hot air, wind, and mist. It includes a Vive Tracker on top, and its modules can pivot so the environmental effects can be directed straight at a person in VR.

I took a quick trip through a sample experience that perfectly showcased the system’s potential. First I started in a neutral environment that seemed to resemble a monastery from Doctor Strange. And just like the movie, using a circular motion with the Vive controller in hand I opened a portal through which I could see another part of the world. The moment I stepped through the portal I encountered what felt like the environment of the new place. This included a wet cave, a cold mountaintop and a windy desert. In each place I felt the wet mist, cold air or hot wind on my neck and arms, dramatically enhancing my sense of place.

The rig has some limitations in its current version. Some of the machinery is loud enough to hear even with headphones on, and you can only move backward and forward in a small space to receive the full weather effect. The unit could use better directional control — perhaps putting the whole thing on a swivel mechanism — if it is to extend its effects into a larger region. You might also need a waterproof HMD if you spend too long in a misty environment.

The project is called “AOES” and carries the title “Enhancing Teleportation Experience In Immersive Environments With Mid-Air Haptics.” If you are at SIGGRAPH, it is in the Emerging Technologies area.

While SIGGRAPH is a showcase for many projects that might never actually become real products, the ideas shared here often inspire creators to envision new ways of making technologies more compelling or affordable for mass consumption. I for one would love to see this technology used at an IMAX VR center.

Nvidia Takes on Challenge of Improving AR

VRFocus have already reported on the challenges that are facing virtual reality (VR) and its continued development with regards to human vision and perception, but little so far has been said about the similar problems facing augmented reality (AR) as that area also continues to grow and develop. Graphics card manufacturers Nvidia are taking on that very issue.

Nvidia Inventions, the Research and Development area of Nvidia, are working on two areas that are relevant to VR and AR. The first involves what researchers have dubbed ‘varifocal displays’. As discussed by Michael Abrash in his Oculus blog post, fixed-focus VR and AR displays can present a problem to human vision. Using new research, Nvidia are working on a new type of optical layout that uses a holographic back-projection to display virtual images. This new technology could also lead to VR and AR displays that are thinner and lighter than currently available headsets.

Another project Nvidia are working on, in collaboration with the University of North Carolina, Saarland University and the Max Planck Institute, involves a deformable membrane mirror for each eye, which can be adjusted depending on where a separate eye-tracking system sees the user is looking.

Nvidia AR research

Nvidia are also working on haptic feedback systems to enhance the immersion of VR and AR. One prototype system is a VR controller that allows users to experience different textures as they play, its soft skin able to produce force feedback as well as replicate the feel of different materials and textures.

The second project involves a squishy foam sword such as children might play with, which can transform in a moment to feel like the solid cord-wrapped handle of a katana, or the solid metal of a broadsword hilt. Nvidia have already integrated those two types of haptic controllers into their in-house VR Funhouse experience, so users can feel the solid hit of a mallet in whack-a-mole, or feel the recoil of a gun in a shooting gallery.

VRFocus will continue to bring you news of research into new VR/AR display and feedback methods.

E3 2017: Hands-On With VR Hand Controller CaptoGlove

Immersive experiences are the foundation of VR’s growth, and developers continue to find ways to keep the illusion from being broken for as long as they can. Controller input is one of the biggest offenders, but the team behind CaptoGlove is utilizing our natural gestures to keep players engaged.

The glove is equipped with many sensors and can track the movement of your wrist and the individual fingers. It also has a customization tool to make a large number of gestures work in many different ways for any experience. We got a chance to try out a demo at E3.

The CaptoGlove was fully funded on Kickstarter and is already making its way into the homes of those that supported it during the campaign. It also made our list of the best hardware we saw at E3, where I went into detail on my experience with the device. In that piece we noted:

Once I got my bearings with the glove, which I was wearing while mimicking the action of holding a helicopter’s control stick, I flew closer to the ground and between buildings. The accuracy and response time allowed me to maneuver deftly, an impressive feat in DCS World for sure. The creators of the tool have plans to add haptic sensors to the base glove in the future, so we could be witnessing an affordable and functional new step for immersive input in VR.

To clarify what “got my bearings with the glove” means: I initially was thinking of the CaptoGlove’s input in the wrong way. I moved in wide, sweeping motions as if using a large joystick, but those movements weren’t met with much input, and I realized the glove itself wasn’t being tracked by anything external. So I adjusted my strategy, keeping my arm in the same place but turning and tilting my wrist in more controlled motions, and that got a much better and more accurate response in the game.

The glove gets as warm as any glove normally would and was quite comfortable throughout the demo. It isn’t so comfortable that you forget you’re wearing one, but you won’t find yourself getting tired of it with extended play. Further, the glove is designed practically, with individual sensors that can be removed with ease if they have to be replaced. The same functionality that allows for that type of repair also opens the door for updates to the glove without having to buy a new version altogether, an element explained to me by a rep at the demo. There are also plans to add a haptic feedback accessory, though that is still far off.

This Flexible Thermoelectric Skin Has Made Me a Believer in Thermal Haptics

Korea-based TEGway is developing ThermoReal, a thermoelectric array which can generate heat and cold with impressively low latency. The flexible nature of ThermoReal could make it suitable for integration into VR controllers, gloves, and more.

I’ve tried a few different thermal haptic devices throughout the course of my VR reporting, but nothing that really impressed me. Usually the effects are hard to notice because they don’t feel particularly hot or cold, and they take so long to activate that it’s hard to sell the illusion that the effect is being caused by something happening in the virtual world.

I got to try the ThermoReal thermoelectric skin at the Vive X Batch 2 demo day in San Francisco this week and it’s led me to become a believer in the value of thermal haptics for the first time. That’s thanks to three things:

Latency

ThermoReal—which is a thermoelectric generator based on something called the Seebeck Effect—is impressively quick to react. I held a prototype wand which had the ThermoReal skin embedded in it as I watched a video of a man jumping into a river. The moment he plunged into the water I could feel the wand get cold to the touch. Another video showed a car blowing up and the heat effect kicked in almost immediately with very little ‘spin up’ time. Keep an eye on the ‘thermal imaging’ section of the clip above to see how quickly the device changes temperatures.

TEGway’s Thermoreal prototype device | Photo by Road to VR

In addition to hot and cold, the device can do both at the same time in close proximity, which is perceived as an amplified ‘pain’ effect compared to just using heat alone.

Our sense of temperature is not nearly as latency-sensitive as our senses of sight or hearing, but thermal haptics must still be fast enough to help our brains connect what we’re seeing with what we’re feeling. For many potential thermal haptic scenarios, it feels like ThermoReal has passed an important latency threshold that helps sell that illusion.

Amplitude

It isn’t just the speed of the hot or cold effect, but the extent of it too. I was impressed with how the device could achieve its maximum level of cold so quickly.

Even more than the cold effect, the heat effect was so great that I had to loosen my grip on the ThermoReal prototype at times; I was honestly concerned the device could burn me. I asked one of the creators if there was any risk of injury and was told that the device would only get up to 4°C hotter than body temperature. Based on how hot it felt, I’m still skeptical of that claim, though it’s possible that the rate of heat increase (rather than the measured temperature itself) could signal to my brain a more severe sensation of heat; I’ll be interested to learn more about the minimum and maximum possible temperatures of the device.
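For intuition, a thermoelectric (Peltier-type) element like ThermoReal can be modeled as a surface whose temperature moves in proportion to drive current, with polarity selecting heating or cooling. The coefficient and the clamp at the claimed +4°C ceiling below are assumptions, not TEGway specifications:

```python
# Toy model of driving a thermoelectric (Peltier-type) haptic element:
# current direction picks heating vs. cooling, magnitude sets the rate.
# The rate coefficient and the safety clamp are assumptions for
# illustration, not TEGway's actual specifications.

BODY_TEMP_C = 37.0
MAX_DELTA_C = 4.0  # the claimed ceiling above body temperature

def update_temperature(surface_temp_c, current_a, dt_s, k_c_per_amp_s=2.0):
    """Advance the skin-contact temperature by one timestep.

    Positive current heats, negative current cools (Peltier effect);
    the result is clamped to the claimed +4 C safety ceiling."""
    surface_temp_c += current_a * k_c_per_amp_s * dt_s
    return min(surface_temp_c, BODY_TEMP_C + MAX_DELTA_C)
```

Such a model also suggests why the device can feel hotter than its measured temperature: a large current produces a steep temperature ramp, and skin responds to the rate of change as much as to the absolute value.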

Form-factor

Thermoelectric generators like ThermoReal are not new. What is new, says TEGway, is the form-factor of their device. It takes the form of a flexible skin-like array of conductors which can be curved and wrapped around various surfaces, which could make it perfect for integration into VR controllers, gloves, or even suits.

They say it’s the “world-first ‘Stand-Alone’ high performance flexible [thermoelectric device].”

– – — – –

For any good haptic device, figuring out how to use it is always the hard part. For ThermoReal, there’s a number of promising applications beyond simply making the player feel hot in a hot environment and cold in a cold environment.

As a few examples to get your imagination churning, the speed and amplitude of the temperature effects should be suitable for conveying the temperature of objects held in the user’s hand. That could mean, for instance, allowing the player to feel it when their energy-weapon has overheated, or feel the cold of a snowball when held in their hand.

The company also says the ThermoReal skin can create temperature changes in discrete areas, potentially allowing for the feeling of virtual objects moving across the player’s hand. You can imagine a sticky snail crawling across your hand, or possibly even larger creatures—like a snake coiling around your leg—if the tech was integrated into a suit-like device covering a larger portion of the player’s body.

Continued on Page 2: Lingering Questions »

The post This Flexible Thermoelectric Skin Has Made Me a Believer in Thermal Haptics appeared first on Road to VR.