HaptX Launches Enterprise-focused Haptic Gloves Dev Kit

HaptX (formerly AxonVR) today unveiled the HaptX Gloves Development Kit, an “industrial-grade” VR haptic glove that uses micro-pneumatics for detailed haptics and delivers force-feedback to the fingers.

The HaptX Gloves dev kit includes two gloves, each with 130 tactile actuators which, the company says in a press statement, provide “realistic touch across the hand and fingertips.” An exoskeleton provides force-feedback by physically stopping your fingers when you grasp a virtual object like a dial or handle.

We went hands-on with an early prototype of HaptX’s gloves last year, and Road to VR’s Ben Lang came away impressed with the tech, though he noted that the company’s ultimate challenge would be turning the prototype into something sleeker, smaller, and far more practical.

Dean Takahashi of VentureBeat got a chance to try the current dev kit itself, saying the feeling of the actuators “was more fine-grained in terms of sensations.” Images suggest, however, that the dev kit hasn’t slimmed down substantially: it still relies heavily on a central control box and off-the-shelf Vive Trackers for positional tracking of the user’s hands.

Image courtesy VentureBeat

While the setup is admittedly not ideal for consumers at the moment, the company has remained steadfast in its market segment, appealing to industrial and government organizations that can deploy training solutions with the aim of transferring skills learned in VR to the workplace.

SEE ALSO
Hands-on: HaptX Glove Delivers Impressively Detailed Micro-pneumatic Haptics, Force Feedback

“With HaptX Gloves, leading automotive and aerospace companies can touch and interact with their vehicles before they are built, radically reducing time and cost for design iterations,” said Jake Rubin, founder and CEO of HaptX. “Industrial and government organizations can deploy virtual training solutions that build real muscle memory, providing a safe, cost-effective and flexible alternative to live training.”

HaptX reportedly took on $5.8 million in seed funding in its opening financing round in late 2016.

The HaptX Gloves Development Kit made its worldwide debut today at the Future of Storytelling Summit in New York City and the GeekWire Summit in Seattle.

Check out the mixed reality launch trailer below to get an idea of how virtual training works with HaptX’s gloves.


XRI: Cross-Reality Interaction

Widespread consumer adoption of XR devices will redefine how humans interact with both technology and each other. In the coming decades, the standard mouse and QWERTY keyboard may fade as the dominant computing UX, giving way to holographic UI, precise hand/eye/body-tracking and, eventually, powerful brain-to-computer interfaces. One key UX question designers and developers must answer is: how to input?

That is, by what means does a user communicate and interact with your software and to what end? Aging 2D input paradigms are of limited use, while new ones are little understood or undiscovered altogether. Further, XRI best practices will vary widely per application, use case and individual mechanic.

The mind reels. Though these interaction patterns will become commonplace in time, right now we’re very much living through the “Cinema of Attractions” era of XR tech. As such, we’re privileged to witness the advent of a broad range of wildly creative immersive design solutions, some as fantastic as they are impractical. How have industry best practices evolved?

Controllers

These may seem pedestrian, but it’s easy to forget that the first controllers offering room-scale, six degrees-of-freedom (6-DoF) tracking only hit the market in 2016 (first Vive’s Wands, then Oculus’ more ergonomic Touch, followed by Windows’ muddled bastardization of the two in 2017). With 6-DoF XR likely coming to mobile and standalone systems in 2018, where are controller interfaces headed?

Well, Valve has been developing its “Knuckles” controllers (worn as much as held, allowing users freer gestural tracking and expression) for over a year, but they were conspicuously excluded from the CES launch announcement of the Vive Pro.

One controller trend we did see at CES: haptics. Until now, handheld inputs have largely utilised general vibration for haptic feedback. The strength of the rumble can be throttled up or down, but with just one vibratory output, developers’ power to express information through physical feedback has been limited. It’s a challenging problem: how do you simulate physical resistance where there is none?

Left: the HaptX Glove, Right: the Tactical Haptics Reactive Grip Motion Controller

HaptX Inc. is one firm leading advances in this field with its HaptX Gloves, a pair of Nintendo Power Glove-style devices featuring tiny air pockets that dynamically expand and contract to provide simulated touch and pressure in VR in real time. All reports indicate some truly impressive tech demos, though perhaps at the cost of form factor: the hardware involved looks heavy-duty, and removing the glove appears several degrees more difficult than setting down a Vive Wand.

Theirs strikes me as a specialty solution, perhaps more suited to location-based VR or commercial/industrial applications. (Hypothetical: would a Wand/Touch-like controller with this type of actuator built into the grips provide any UX benefit at the consumer level?) Meanwhile, Tactical Haptics is exploring this problem through a different lens, using a series of sliding plates and ballasts in its Reactive Grip Motion Controller, which tries to simulate some of the physical forces and resistance one feels wielding objects with mass in meatspace. This is perhaps a more practical haptics approach for consumer adoption: they’re still simple controllers, but the added illusion of physical force could be a truly compelling XRI mechanic (for more, check out their white paper on the tech).

Hand-Tracking

Who needs a controller? For some XR applications, the optimal UX will take advantage of the same built-in implements with which humans have explored the material world for thousands of years: their hands.

Tracking a user’s hands in real time with 27 degrees of freedom (four per finger, five in the thumb, six in the wrist), absent any handheld implement, allows users to interact with physical objects in their environment as one normally would (useful in MR contexts), or to interact with virtual assets and UI in a more natural, frictionless and immersive way than, say, pulling a trigger on a controller.
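As a quick sanity check, that 27-DoF breakdown maps naturally onto a simple data structure. Here's a minimal Python sketch (the field layout is hypothetical; real SDKs such as Leap Motion's expose richer skeletal models):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HandPose:
    """Toy 27-DoF hand model: 6 wrist + 5 thumb + 4 fingers x 4 joints."""
    wrist: List[float] = field(default_factory=lambda: [0.0] * 6)    # position + rotation
    thumb: List[float] = field(default_factory=lambda: [0.0] * 5)    # joint angles
    fingers: List[List[float]] = field(
        default_factory=lambda: [[0.0] * 4 for _ in range(4)])       # index..pinky

    def total_dof(self) -> int:
        return len(self.wrist) + len(self.thumb) + sum(len(f) for f in self.fingers)

assert HandPose().total_dof() == 27  # 6 + 5 + 4*4
```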

And of course, I defy you to test such software without immediately making rude gestures with it.

Pricier AR/MR rigs like Microsoft’s HoloLens have hand-tracking technology baked in, though reliability, field of view and latency vary. However, most popular VR headsets on the market don’t yet offer this integration natively. Thankfully, the Leap Motion hand-tracking sensor, available as a desktop peripheral for years, is being retrofitted by XR developers with compelling results. For additional reading, and to see some UX possibilities in action, I’d recommend this great series by Leap Motion designer Martin Schubert.

These hand-eye interaction patterns have been entrenched in our brains over thousands of years of evolution and (for most of us) decades of first-hand experience. This makes them feel real and natural in XR. As drawbacks go, the device adds yet another USB peripheral and extension cable to my life (surely I will drown in a sea of them), and there are still field of view and reliability issues. But as the technology improves, this set of interactions works so well that it can’t help but become an integral piece of XRI. To allow for the broadest range of use cases, I’d argue that all advanced/future XR HMDs need to feature hand-tracking natively (though optionally, per application, of course).

Interestingly enough, the upcoming Vive Pro features dual forward-facing cameras in addition to its beefed-up pixel density, and Vive has since confirmed that hand-tracking can be done using them. Developers and designers would do well to start grokking XR hand-tracking principles now.

Eye-Tracking

Though the state of the art has advanced, too much of XRI has been relegated to holographic panels attached at the wrist. While this is no doubt an extremely useful practice, endless new possibilities for UI and gameplay mechanics emerge once you add high-quality, low-latency eye tracking to any HMD-relative heads-up display UI and/or any XR environment beyond it.

Imagine browsing menus more effortlessly than ever, using only your eyes to make selections, or targeting distant enemies in shooters with greater precision. Consider also the effects of eye-tracking in multiplayer VR and the possibilities that unlocks. Once combined with 3D photogrammetry scans of users’ faces or hyper-expressive 3D avatars, we’ll be looking at real-time, photorealistic telepresence in XR spaces (if you’re into that sort of thing).
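To make the menu case concrete, gaze selection is often built as a dwell timer: a "click" fires once the gaze ray has rested on the same item long enough. A minimal frame-loop sketch, with a made-up item name and threshold:

```python
import time

class GazeDwellSelector:
    """Fires a selection once gaze rests on one item for dwell_seconds."""
    def __init__(self, dwell_seconds: float = 0.8):
        self.dwell_seconds = dwell_seconds
        self.current_item = None
        self.gaze_start = 0.0

    def update(self, hovered_item, now=None):
        """Call once per frame with whatever the gaze ray currently hits."""
        now = time.monotonic() if now is None else now
        if hovered_item != self.current_item:
            # Gaze moved to a new item (or off the UI): restart the timer.
            self.current_item, self.gaze_start = hovered_item, now
            return None
        if hovered_item is not None and now - self.gaze_start >= self.dwell_seconds:
            self.gaze_start = now   # prevent repeat-firing every frame
            return hovered_item     # selection event
        return None

selector = GazeDwellSelector()
selector.update("menu:settings", now=0.0)
assert selector.update("menu:settings", now=1.0) == "menu:settings"
```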

Wrist-mounted UI has proliferated in XR — but only goes so far. Eye-tracking will usher in many HMD-relative UI possibilities.


Eye-tracking isn’t just promising as an input mechanism. The tech will also allow hardware and software developers to utilise a technique called foveated rendering. Basically, the human eye only sees sharply near the very center of your gaze; things get blurrier further out into your visual periphery. Foveated rendering takes advantage of this wetware limitation by precisely tracking the position of your eyes from frame to frame and rendering whatever you’re looking at in full detail on (theoretically) higher-resolution screens. Simultaneously, the quality of everything you’re not looking directly at is downgraded, which you won’t notice because your pathetic human eyes literally can’t. This will allow richer XR on lower-powered systems and let high-end systems stretch possibilities even further with higher-resolution screens.
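The core idea reduces to a quality falloff keyed to gaze eccentricity; real implementations apply this per tile deep in the GPU pipeline, but a toy sketch with invented constants shows the shape of it:

```python
import math

def shading_quality(point_xy, gaze_xy, fovea_radius=0.1, falloff=0.5):
    """Render-quality factor in (0, 1] for a normalized screen position."""
    dist = math.dist(point_xy, gaze_xy)
    if dist <= fovea_radius:
        return 1.0  # full quality where the fovea is pointed
    # Degrade smoothly toward the periphery, clamped to a floor quality.
    return max(0.25, 1.0 - (dist - fovea_radius) / falloff)

print(shading_quality((0.52, 0.50), gaze_xy=(0.5, 0.5)))  # ~1.0, near the gaze
print(shading_quality((0.95, 0.10), gaze_xy=(0.5, 0.5)))  # 0.25, far periphery
```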

Tobii’s eye-tracking technology embedded in a custom Vive

While Oculus and Google have acquired eye-tracking companies in recent years, the current industry leader appears to be Tobii. Their CES demos were reportedly extremely impressive; but considering they retrofit a new Vive for each dev kit, their solution is not mass-market at this point, and likely pricey, since you have to seek approval just to receive a quote. Still, the potential benefits of eye-tracking for XRI are so great that we’ll surely see native adoption of this tech by major HMD manufacturers in coming hardware generations (hopefully through a licensing deal with Tobii).

Voice & Natural Language Processing

As the explosion in Alexa use has taught us, many users love interacting with technology using their voices. Frankly, the tech to implement keyword and phrase recognition at relatively low cost is already there for developers to utilise; it’s officially low-hanging fruit in 2018.

On the local processing side, Windows 10 voice recognition runs on any PC with that OS, though it currently fares better with shorter keywords and a low confidence threshold. (Check out this great tutorial for Unity implementation on Lightbuzz.com.) Alternatively, you can offload more complex phrases and vocal data to powerful, highly-optimized Google or Amazon processing centers. At their most basic, these services transform vocal data into string values you can store and program logic against, but certainly many other kinds of analyses of and programmatic responses to the human voice are possible through the lens of machine learning: stress signals, emotional cues, sentiment evaluation, behavior anticipation, etc.
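At its simplest, that pipeline boils down to: the recognizer emits a string plus a confidence score, and your app dispatches on it. A hedged sketch (the phrases, threshold, and handlers are invented for illustration):

```python
def fire_weapons(context):
    print("Firing!")

def send_companion(context):
    print(f"Companion runs to {context.get('pointed_at', 'nowhere')}")

COMMANDS = {
    "computer fire": fire_weapons,   # hypothetical keyword -> game action
    "go": send_companion,
}

def on_speech_recognized(transcript: str, confidence: float, context: dict):
    if confidence < 0.7:             # discard low-confidence recognitions
        return
    handler = COMMANDS.get(transcript.lower().strip())
    if handler:
        handler(context)

on_speech_recognized("Computer fire", confidence=0.92, context={})
on_speech_recognized("Go", confidence=0.88, context={"pointed_at": "the doorway"})
```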

At the OS/always-on level, some Alexa-like voice-controlled task rabbit has to be in the pipeline (Rift Core 2.0 already gives me access to my Windows desktop, and therefore Cortana); that’s assuming Amazon’s automated assistant doesn’t grace the XR app stores herself. At the individual app level, this powerful input may be the most widely available yet underutilised in XR (though for the record, I see it primarily as an optional mechanic, not one that should be required for many experiences). When I’m dashing starboard to take on space pirates in From Other Suns, I want to be able to yell “Computer, fire!” so badly; this would be so pure. In Fallout 4 VR, I want to yell “Go!” and point to exactly where Dogmeat should run (I pulled this off with my buddy BB-8 in a recent project). Developers and designers should look for more chances to use voice recognition as implementation costs continue to fall.

Brain-Computer Input

Will we eventually arrive at a point where the most human of inputs, our physical and vocal communications, are no longer necessary to order each and every task? Can we interact with a computer using our minds alone? Proponents of a new generation of brain-computer interfaces (BCIs) say yes.

At a high level, the current generation of such technology exists as helmet- or headband-like devices that generally use safe and portable electroencephalography (EEG) sensors to monitor various brain waves. These sensors typically output floating-point values per type of wave tracked, and developers can program different responses to that data as they please.
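In code, that data shape is easy to picture: one float per band per poll, with app logic layered on top. A sketch with a simulated device (the SDK call and threshold are hypothetical; the band names are standard EEG terminology):

```python
import random

BANDS = ("delta", "theta", "alpha", "beta", "gamma")

def read_band_powers():
    """Stand-in for a headset SDK call; returns one float per wave band."""
    return {band: random.random() for band in BANDS}

CALM_ALPHA_THRESHOLD = 0.6  # would be tuned per user during calibration

def update(app_state):
    powers = read_band_powers()
    # e.g. a meditation app brightens the scene as relaxed alpha rises
    app_state["calm"] = powers["alpha"] >= CALM_ALPHA_THRESHOLD
    return app_state

print(update({}))
```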

Neurable’s Vive integration

Though studied for decades, this technology has not yet reached maturity. The major caveat right now is that a given person’s ability to project and/or manipulate the specific brainwaves tracked (as measured by each device’s array of EEG sensors) will vary, and can require a lot of calibration and practice.

Still, recent advances appear promising. Neurable is perhaps the leader in integrating an array of EEG and other BCI sensors with a Vive VR headset. On the content side, Midwest US-based StoryUp XR is using another BCI, the Muse, to drive a mobile VR app with the users’ “positivity,” which they say corresponds to a particular brainwave picked up by the headset that users can learn to manipulate. StoryUp, part of the inaugural Women In XR Fund cohort, hopes to bring these kinds of therapeutic and meditative XR experiences to deployed military, combat veterans and the general public, using BCIs as both a critical input and a monitor of user progress.

It will likely be decades before you’re able to dictate an email via inner monologue or directly drive a cursor with your thoughts — and who knows whether such sensitive operations will even be possible without invasive surgery to hack directly into the wetware. (Yes, that was a fun and terrifying sentence to write). I would wager, however, that an eye-tracking-based cursor combined with “click” or “select” actions driven by an external BCI will become possible within a few hardware generations, and may well end up being the fastest, most natural input in the world.

Machine Learning

Imagine an AI-powered XR OS a decade from now: one that can utilise and analyse all the above inputs, divining user intent and taking action on the user’s behalf. One that, if unsure of itself, can seek clarification in natural language or in a hundred other ways. It can acquire your likes and dislikes through experience and observation as easily as you might for a new friend, constructing a model of your overall XR interaction preferences: with the AI itself, with other humans, and with the virtual realities you visit and the physical ones you augment. Such a system will, at the very least, be able to model and emulate human social graces and friendship.

Any such system will also have unparalleled access to your most sensitive personal and biometric data. The security, privacy and ethical concerns involved will be enormous and should be given all due consideration. In his talk on XR UX at Unity HQ last fall, Unity Labs designer and developer Dylan Urquidi said he sees blockchain technology as a possible medium for context-aware, OS-level storage of these kinds of permissions or preferences. That approach would leave ultimate ownership and decision-making power over this data with the user, who could allow or deny access to individual applications and subsystems as desired.

I’m currently working on a VR mechanic using a neural net trained from Google QuickDraw data to recognize basic shapes drawn with Leap Motion hand-tracking — check out my next piece for more.
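As a teaser of the front end of that pipeline (and not the actual implementation), the usual first step for QuickDraw-style recognition is rasterizing the tracked stroke into a small bitmap a trained classifier can consume:

```python
import math

GRID = 28  # QuickDraw bitmap models are commonly trained on 28x28 images

def rasterize(stroke, grid=GRID):
    """Burn a list of normalized (x, y) points into a grid x grid bitmap."""
    image = [[0] * grid for _ in range(grid)]
    for x, y in stroke:
        col = min(int(x * grid), grid - 1)
        row = min(int(y * grid), grid - 1)
        image[row][col] = 1
    return image

# A rough circle traced in mid-air, as the hand tracker might report it:
stroke = [(0.5 + 0.4 * math.cos(t / 20 * 2 * math.pi),
           0.5 + 0.4 * math.sin(t / 20 * 2 * math.pi)) for t in range(20)]
bitmap = rasterize(stroke)
# bitmap would then be fed to the trained model for classification
print(sum(map(sum, bitmap)), "cells set")
```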

Machine learning is likely the most important yet least understood technology coming to XR and computing at large. It’s on designers and developers to educate themselves and the public on how they’re leveraging these technologies and their users’ data safely and responsibly. For myself, machine learning is the first problem domain I’ve encountered in programming where I don’t grok all the mathematics involved.

As such, I’m currently digging through applied linear algebra coursework and Andrew Ng’s great machine learning class on Coursera.org in an effort to better understand this most arcane frontier (look out for my next piece, where I’ll apply some of these concepts and train a neural net to identify shapes drawn in VR spaces). While I’m not ready to write the obituary for the QWERTY keyboard just yet, these advances make it clear that in terms of XRI, the times are a-changin’.

Grabbing Virtual Objects with the HaptX Glove (Formerly AxonVR)

Jake Rubin

The HaptX Glove that was shown at Sundance was one of the most convincing haptics experiences that I’ve had in VR. While it was still primitive, I was able to grab a virtual object in VR and, for the first time, have enough haptic feedback to convince my brain that I was actually grabbing something. The glove uses a combination of exoskeletal force feedback and the company’s patented microfluidic technology, and HaptX has significantly reduced the size of the external box driving the experience since the demo I saw at GDC (back when the company was named AxonVR), thanks to a number of technological upgrades and the ditching of temperature feedback.


Joe Michaels

I had a chance to talk with CEO & co-founder Jake Rubin and Chief Revenue Officer Joe Michaels at Sundance about why enterprise and military training customers are really excited about this technology, some of the potential haptics-inspired interactive storytelling possibilities, how they’re refining the distribution of haptic resolution and fidelity to provide the optimal experience, and their collaboration with SynTouch’s texture-data models in striving to create a haptic display technology that can simulate a wide range of textures.

SEE ALSO
Hands-on: HaptX Glove Delivers Impressively Detailed Micro-pneumatic Haptics, Force Feedback

HaptX was using a Vive Tracker puck for arm orientation, but they had to develop customized magnetic tracking to get the level of precision required to track individual finger movements, and one side effect is that their technology could start to be used as an input device. Some of HaptX’s microfluidic technologies, combined with a new air valve that is 1000x more precise, could also enable unique haptics applications: sensory replacement or substitution, for example, or assisting data visualization in a similar way that sound enhances spatialization through a process called sonification.

Photo by Road to VR

Overall, HaptX is making rapid progress with their haptics technologies, and they’ve crossed the threshold of being useful enough for a number of different enterprise and military training applications. Rubin isn’t convinced that VR haptics will ever fully trick the brain in a way that’s totally indistinguishable from reality, but they’re getting to the point where it’s good enough to be used creatively in training and narrative experiences. Perhaps soon we’ll see some of HaptX’s technology in location-based entertainment applications created by storytellers who got to experience it at Sundance this year, and I’m really looking forward to seeing how their texture haptic display evolves over the next year.




Haptx Is Working On VR’s Long-Awaited Touch Glove


Virtual reality can never be too immersive, at least until we get to something like the Star Trek Holodeck. While VR’s visuals and sound make you feel like you’re wandering in a 3D space, the feeling disappears the second you touch something. That’s why Haptx is the latest company to try to tackle this problem — with haptic gloves that enable you to feel things that you touch in VR.

VR has been slow in taking off with consumers, and it just might take another generation or two of the technology before it lives up to its potential. Haptic technology, which could really bring the force feedback and sensation of touch, might be necessary in the next generation of VR systems.

Jake Rubin, CEO of Seattle-based Haptx, showed me his company’s haptic gloves at the recent VRX event in San Francisco. I could feel the texture, size, weight, and impact of objects in a virtual environment. The system enables tactile feedback, force feedback, and motion tracking.

Above: Haptx lets you feel a fox walking across your hand.

Image Credit: Haptx

Rubin said the company’s Haptx Skin is a microfluidic smart textile that delivers high fidelity tactile feedback. The gloves have more than 100 points where air bubbles can be inflated to displace your skin and make you feel something as you move your hand through the virtual world. Haptx combines this with motion-tracking technology to figure out where your hand is in a 3D space and the kind of feedback it should send to your hand.

“We have proprietary algorithms that simulate interaction with any virtual object,” Rubin said. “There are no electronic motors in the glove. It is microfluidics.”
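A hedged sketch of how such a loop might look (this is not HaptX's actual algorithm; the actuator grid, units, and pressure scale are all invented): contact points reported by the physics simulation drive the nearest tactile points with a pressure proportional to penetration depth.

```python
import math

# Actuator positions on the palm, in hand-local coordinates (cm).
ACTUATORS = [(float(x), float(y)) for x in range(10) for y in range(10)]

def actuator_pressures(contacts, radius=0.8, max_pressure=1.0):
    """contacts: list of ((x, y), depth) pairs from the physics engine."""
    pressures = [0.0] * len(ACTUATORS)
    for (cx, cy), depth in contacts:
        for i, (ax, ay) in enumerate(ACTUATORS):
            if math.dist((ax, ay), (cx, cy)) <= radius:
                # Deeper penetration -> firmer skin displacement, clamped.
                pressures[i] = min(max_pressure, pressures[i] + depth)
    return pressures

# A virtual object resting near the center of the palm:
pressures = actuator_pressures([((4.5, 4.5), 0.3), ((5.5, 4.5), 0.6)])
print(sum(1 for p in pressures if p > 0), "of", len(ACTUATORS), "points driven")
```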

The demo showed me a farm in VR. I could run my fingers through strands of grain in a field and feel them on my skin. I could touch the clouds in the sky, and the stars as well, and each felt different, due to different resistive forces. A fox jumped into my hand as I held it open. That was a lot more fun than feeling the eight legs of a spider walking across my hand.

I asked if the spider would bite me.

“We found that was a little much for people,” Rubin said.

Raindrops fell on my hand, and as the drops bounced on my hand, I could feel them. I was giggling while it was happening. For the finale, a bunch of UFOs invaded the farm and I had to knock them out of the way. Each time I hit one, I could feel its force on my hand.

“It’s the haptic equivalent of a visual display,” Rubin said. “Instead of a pixel that changes color, you have a pixel that changes pressure. We have a high-density, high-displacement, and high-bandwidth solution in a light and thin package.”

I used the demo with an HTC Vive VR headset that was connected by wire to a big box. That box controls the flow of air into the glove, which also had a custom sensor attached to it. The glove has panels in the palm and fingers, allowing you to feel the shape and texture of objects. There’s a force feedback exoskeleton in the fingers with microfluidic actuators that apply resistance to your fingers as you grab things.

Above: You can feel the rocks and the wheat in this demo with a Haptx Skin glove.

Image Credit: Haptx

The glove is still a prototype, and it’s pretty bulky. But it sure beats a rumble game controller, which is used in video game systems today but isn’t very useful in enterprise applications. And it’s a proof of concept demonstrating that the tech can fit in a glove.

“Haptics today are limited to vibrations, which buzz when you touch something,” Rubin said. “It doesn’t tell you anything about the size, shape, and texture. We believe microfluidics can displace your skin in the same way that happens in real life. It is augmented with force feedback to provide resistance.”

Right now, the system can only handle one glove, but soon it will handle two. The company hopes to ship Haptx gloves to its first 20 customers in the middle of 2018. After that, it will scale up full production.

“We’re already talking to top studios, intellectual property creators, theme parks, and arcades,” Rubin said. “Those will be some of our first customers. Initially, we’re focused on location-based entertainment.”

The initial applications will be arcades and theme parks. Beyond gaming, the company is looking at markets in entertainment, design, manufacturing, and training.

Rubin started the company in 2012 with cofounder Bob Crockett, who heads the biomedical engineering department at California Polytechnic State University in San Luis Obispo. Rivals include Dexta Robotics, Tactai, and Ultrahaptics. Years ago, Jaron Lanier’s data gloves were tracking systems only, and they tracked only one kind of motion.

Haptx raised $9 million in funding from NetEase, Dawn Patrol Ventures, the Virtual Reality Company, Keeler Investment Group, former Twitter CEO Dick Costolo, Disney Imagineering executive Jon Snoddy, Digital Kitchen founder Paul Matthaeus, and others.

The company has more than 30 people. It previously operated under the name AxonVR.

This post by Dean Takahashi originally appeared on VentureBeat.


AxonVR Rebrands as HaptX and announces Haptic Gloves

AxonVR has previously been involved in virtual reality (VR) technology through its patents for wearable haptic technology for interfacing with VR. The company has now rebranded and announced a new line of haptic gloves.

AxonVR has officially rebranded as HaptX and announced its first product, the HaptX Gloves, haptic wearable devices that allow VR users to experience realistic tactile feedback. The gloves feature over 100 points of tactile feedback and are capable of up to five pounds of resistance per finger, with sub-millimetre motion tracking.

“HaptX Gloves are the result of years of research and development in haptic technology,” said Jake Rubin, Founder and CEO, HaptX Inc. “What really sets HaptX Gloves apart is the unprecedented realism they deliver. Our patented microfluidic technology physically displaces the skin the same way a real object would when touched, closely replicating its texture, shape, and movement.”

“We’ve reviewed the wearable haptic solutions out there, and the HaptX prototype provides the most realistic feedback by far,” said Dr. Jeremy Fishel, Chief Technology Officer of SynTouch, the leading tactile evaluation company. “HaptX marks a fundamental breakthrough in our industry’s ability to simulate touch.”

“Enterprise and entertainment users require a higher level of immersion than today’s VR controllers can deliver,” said Joe Michaels, Chief Revenue Officer at HaptX. “HaptX Gloves will allow our customers to get more out of their VR applications—whether they’re administering virtual training, designing three-dimensional objects, or developing a VR game.”

“This name change reflects our company’s dedication to delivering realistic touch through advanced haptic technology,” added Rubin. “HaptX Gloves are a huge step toward our long-term goal to deliver a full-body haptic platform, ushering in a world where virtual reality is indistinguishable from real life.”

HaptX Gloves are planned to begin shipping to select customers in 2018.

VRFocus will bring you further information on HaptX as it becomes available.

HaptX Glove: Micro-Pneumatics for Finger Force Feedback

Not just grasping objects in VR, but feeling them too: several projects want to make this possible and thereby render the virtual reality experience even more lifelike. Our colleagues at Road To VR have now been able to try out the HaptX Glove, and they came away impressed.


HaptX Glove: Pixel Pressure for Fine Touch and Resistance

Backed by $5 million in venture capital, the freshly renamed HaptX has developed a prototype VR glove. The idea is not new; our Swiss friends at Sensoryx, for example, have realized a solution with VRfree that is even usable on mobile, though it offers no force feedback. Researchers at UC San Diego are trying to achieve that with artificial muscle chambers, but it could take years before a product ever emerges from that work, if one does at all.

HaptX relies on micro-pneumatics to realize a force-feedback system for the fingers. The company promises to produce the first glove that offers super-fine tactile feedback, with a force of up to five pounds per finger. The forces are generated by small inflatable air tubes arranged in circular patterns, which the manufacturer calls “haptic pixels”; the glove has around 100 of these “pixels,” allowing the feedback to be tuned very finely, from a faint tingling to a marble rolling around in the hand.


Force Feedback for Grasping

Force feedback can be generated in addition, for which HaptX likewise uses micro-pneumatic elements. Here, inflatable “stoppers” restrict the movement of the fingers, simulating the holding of objects. According to Ben Lang of Road To VR, both systems work impressively well; of all the systems he has tested so far, HaptX is far superior to the rest. The tracking, based on a proprietary magnetic system, also worked reliably in the hands-on. Lang tried out various demos; for example, he felt the tingle of spider legs, and he picked up a toy tractor. There are still limitations: the weight of an object cannot be felt, and the system is intended for subtle effects and smaller objects. It is not suited to swinging a sword or firing a weapon.
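The stopper logic itself is simple to sketch (a toy model under assumed units, not HaptX's implementation): each frame, if the fingertip has reached a virtual surface, the joint is braked at its current angle and may open but not close further.

```python
def update_finger(joint_angle, target_angle, fingertip_depth, braked_angle=None):
    """
    joint_angle:     current flexion angle (degrees)
    target_angle:    where the user's real finger is trying to move
    fingertip_depth: penetration into the virtual surface (0 = no contact)
    braked_angle:    angle at which the pneumatic stopper engaged, if any
    """
    if fingertip_depth > 0.0 and braked_angle is None:
        braked_angle = joint_angle            # inflate the stopper here
    if braked_angle is not None:
        return min(target_angle, braked_angle), braked_angle  # may open, not close
    return target_angle, None                 # free movement, stopper deflated

angle, brake = update_finger(35.0, 50.0, fingertip_depth=0.2)
print(angle)  # the finger is held at 35 degrees while grasping
```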


This is partly why HaptX is initially targeting industrial customers; next year the company will supply at least 20 businesses. According to Ben Lang, though, the biggest challenge for the firm will be shrinking the much-too-large, clunky prototype down to an ergonomic size. Putting the glove on also still takes too long. HaptX believes it can solve these problems, for instance by offering the gloves in several sizes; the prototype comes in only one, which therefore has to serve every hand size. But the company’s ultimate goal is not just a glove: it is a full-body suit for total immersion.

(Source: Road To VR)


AxonVR Secures Patent for “Realistic Touch” VR

AxonVR has announced that it has secured a new foundational patent for a brand new way to interact with the world in virtual reality (VR). The company hopes the new patent will set it on the path to being the first to introduce technology that allows true immersion in the virtual world.

The patent specifically mentions “haptic technology for wearable devices,” suggesting unique feedback in wearables. AxonVR’s HaptX platform includes HaptX Skin, a haptic smart textile; HaptX Skeleton, a literal force-feedback exoskeleton; and the HaptX SDK, the software toolkit allowing developers to create unique experiences for all of these devices.

“This patent is a product of years of cutting-edge research and development in haptic technology,” said Jake Rubin, Founder and CEO, AxonVR. “It provides comprehensive protection for the novel microfluidic technology at the heart of AxonVR’s HaptX Platform.”

The idea of “Realistic Touch” in VR is fascinating, and would make the virtual space that much harder to separate from the real world. We’re imagining VR gloves that allow true individual finger replication and texture sensation… Fascinating.

One of the patent’s inventors, Dr. Robert Crockett, wants people to know that this technology is truly groundbreaking: “The novel microfluidic architecture disclosed in this patent provides an unprecedented combination of high displacement, high bandwidth, high spatial resolution, and small size for wearable haptic devices.”

Dr. Crockett continues: “No other haptic technology can offer this combination of features, all of which are essential for delivering realistic touch feedback.”

If AxonVR’s plans for this patent come to fruition, it could make for one of the most immersive and incredible VR experiences available – potentially even essential to the future of VR.

For everything on AxonVR’s developments and the latest in VR, stay with VRFocus.

AxonVR is Building a Generalized Haptic Display

AxonVR was awarded US Patent No. 9,652,037 for a “whole-body human-computer interface” on May 16th, which includes an external exoskeleton as well as a generalized haptic display made out of microfluidic technology. I had a chance to demo AxonVR’s HaptX™ haptic display, which uses a “fluidic distribution laminate” with channels and actuators to form a fluidic integrated circuit of sorts that can simulate variable stiffness and friction of materials.

At GDC, I stuck my hand into a 3-foot cube device with my palm facing upward. I could drop virtual objects into my hand, and an array of tactile pixels simulated the size, shape, weight, texture, and temperature of those virtual objects. The virtual spider in my hand was the most convincing demo, as the visual feedback helped convince my brain that I was holding the virtual object. Most of the sensations were focused on the palm of the hand, and the fidelity was not high enough to provide convincing feedback to my fingertips. The temperature demos were also impressive, but they were a large contributor to the bulkiness and size of the demo. The company is in the process of miniaturizing the system and integrating it with an exoskeleton for more force feedback, and the temperature features are unlikely to make it into mobile implementations of the technology.


I had a chance to talk with AxonVR CEO Jake Rubin about the process of creating a generalized haptic device, their plans for an exoskeleton for force feedback, and how they’re creating tactile pixels to simulate the cutaneous sensation of different shapes and texture properties. Rubin said that the Experiential Age only has one end point, and that’s full immersion. In order to create something like the Holodeck, Rubin thinks that a generalized haptic device will unlock an infinite array of applications and experiences, analogous to what general computing devices have enabled. AxonVR’s system is not going to be ready for consumer home applications any time soon, but their microfluidic approach to haptics is a foundational technology that is going to be proven out in simulation training, engineering design, and digital out-of-home entertainment applications.




AxonVR Raises $5.8 Million Seed Investment for Advanced VR Haptics

AxonVR has announced the closure of a $5.8 million seed investment, claimed to be the largest secured by a VR haptics company to date. The company intends to use the money to build out its HaptX platform, which will be licensed directly to businesses such as theme parks and VR arcades.

With a few exceptions, haptics in VR usually amounts to using vibration as a proxy for a touch event. AxonVR, however, is trying to change this by actually applying localized pressure to the points of virtual contact. What this means is that when you place a virtual object on your hand, you would feel pressure at many points of contact instead of just a general vibration from a controller. To accomplish this, AxonVR has created a haptic skin that has hundreds of dots that are linked to pneumatic actuators. These actuators inflate the corresponding dots in the skin whenever the virtual object comes into contact with the player.
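Per dot, the drive hardware reduces to a small control loop: open the valve in proportion to the gap between target and measured pressure. A toy sketch (the gains, units, and plant model are invented; AxonVR has not published its control scheme):

```python
def valve_command(target_kpa, measured_kpa, gain=0.5):
    """Valve duty cycle in [-1, 1]: positive inflates the dot, negative vents."""
    error = target_kpa - measured_kpa
    return max(-1.0, min(1.0, gain * error))

# Simulate one dot inflating toward a 10 kPa contact target:
pressure = 0.0
for _ in range(10):
    duty = valve_command(10.0, pressure)
    pressure += 2.0 * duty  # crude plant model: pressure follows the valve
print(round(pressure, 2))   # approaches 10.0
```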

SEE ALSO
AxonVR is Making a Haptic Exoskeleton Suit to Bring Your Body and Mind into VR

Road to VR went hands-on with AxonVR’s haptic skin technology in a demo at this year’s IMMERSE Summit. The demo was similar to the one featured in AxonVR’s video, with the exception of the thermal feedback system (which was not present at the time).

To try out the technology, I placed my hand, palm upwards, in a slot on the side of a large metal box that contained the pneumatic drive system needed to make the haptic skin function. Using a Vive controller in my other hand, I was able to place a variety of objects on my virtual hand and feel the corresponding pressure response from the HaptX skin.

The most impressive part of the demo was when I got to place a virtual deer on my hand. I could feel the individual points of contact as the deer moved its legs around my hand. While the experience raised the bar for what is possible in haptic feedback, AxonVR has an even grander vision for the future.

The company’s HaptX Skeleton is a full-body exoskeleton that uses force feedback to enable both locomotion and macro-haptic feedback for entire limbs. While the company did not provide a date for when the HaptX Skeleton would be available, AxonVR president Mark Kroese says they will have a product shipping sometime in 2017.

SEE ALSO
Hands-on: 4 Experimental Haptic Feedback Systems at SIGGRAPH 2016

AxonVR was founded in 2012 and has offices in both San Luis Obispo, California and Seattle, Washington.
