Former Microsoft Senior Researcher, Now At Facebook, Recounts Haptics Innovations

Pictured: the Haptic Revolver VR controller

Former Microsoft Senior Researcher Dr. Hrvoje Benko gave a talk entitled ‘The Future of AR Interactions’ at the International Symposium on Mixed and Augmented Reality (ISMAR) in October. This week the talk was uploaded to the ISMAR YouTube channel.

Dr. Benko had worked at Microsoft since 2005 before moving to Facebook Reality Labs (formerly Oculus Research) in late 2017, where he now leads the human-computer interfaces (HCI) division.

A Great Display Is Not Enough

A core point that Benko stressed multiple times during the talk is that a great AR display in itself is not good enough — a new input paradigm that takes advantage of spatial computing is needed.

Benko used the example of smartphones with large displays that existed before the iPhone but lacked a multitouch input interface. He pointed out how HoloLens and other current AR devices try, unsuccessfully, to reuse existing input techniques.

Finger Tracking Is Not Enough

Benko explained that while finger tracking technology is rapidly progressing, humans don’t often interact with empty air — we interact with objects. The only time we tend to use our hands in empty air is when gesticulating during speech.

The lack of haptic feedback when relying on finger tracking alone is jarring, he claims, and free-air input is therefore unlikely to be the basis of future interfaces.

Surfaces May Be The Key

Benko pointed out that mixed reality interfaces could leverage the already existing surfaces in the environment to provide real haptic feedback.

Menus could appear on the nearest table or wall, and your fingers could manipulate the virtual UI elements on these surfaces.

This obviously requires a very advanced sensor system with a precise understanding of all the major objects in the room, as well as almost perfect finger tracking.
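To make the idea concrete, here is a minimal sketch of how a system could snap a menu onto the nearest detected surface. It assumes the headset’s scene understanding already yields a list of flat surfaces; the DetectedPlane type and place_menu_on_nearest_surface function are illustrative names, not any shipping API.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class DetectedPlane:
    """Hypothetical output of a scene-understanding system: one flat surface."""
    center: np.ndarray  # a 3D point on the plane (metres, world space)
    normal: np.ndarray  # unit normal of the plane

def place_menu_on_nearest_surface(hand_pos, planes):
    """Return a position and orientation (plane normal) for a menu panel,
    snapped to whichever detected surface is closest to the user's hand."""
    best_point, best_normal, best_dist = None, None, float("inf")
    for plane in planes:
        # Distance from the hand to the plane, measured along the plane normal.
        offset = hand_pos - plane.center
        signed_dist = float(np.dot(offset, plane.normal))
        if abs(signed_dist) < best_dist:
            best_dist = abs(signed_dist)
            # Project the hand onto the plane to get the menu anchor point.
            best_point = hand_pos - signed_dist * plane.normal
            best_normal = plane.normal
    return best_point, best_normal

# Example: a table top 0.7 m high and a wall 1.5 m in front of the user.
planes = [
    DetectedPlane(center=np.array([0.0, 0.7, 0.5]), normal=np.array([0.0, 1.0, 0.0])),
    DetectedPlane(center=np.array([0.0, 1.2, 1.5]), normal=np.array([0.0, 0.0, -1.0])),
]
hand = np.array([0.1, 0.8, 0.4])
print(place_menu_on_nearest_surface(hand, planes))  # snaps the menu to the table
```

In practice the menu would also need to be oriented toward the user and checked against the limits of the detected surface, but the core operation is just this nearest-plane projection, after which every press of a virtual button coincides with a real fingertip-on-surface contact.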

‘Haptic Retargeting’

Haptic retargeting, another line of work from Benko’s Microsoft Research years, subtly warps the rendered position of the user’s hand so that a single physical prop can serve as the haptic proxy for several different virtual objects.

Microsoft Shows New Research in Haptics With ‘CLAW’ VR Controller Prototype

As VR displays pack in ever more pixels and new controller types such as Valve’s Knuckles dangle tantalizingly in front of us, one thing that often goes overlooked is haptic feedback. Plenty of companies are working to define the first real haptic standard beyond the small vibration motors already found in VR motion controllers, and it now appears Microsoft can be counted among them with its newly revealed CLAW controller prototype.

CLAW, as Microsoft describes in a recent blog post, is a handheld VR controller designed to “augment the typical controller functionality with force feedback and actuated movement to the index finger.”

Microsoft researchers say CLAW mimics the feeling of grasping virtual objects, touching virtual surfaces, and depressing triggers, and that it changes its haptic configuration by sensing differences in the user’s grasp.

The prototype contains a servo motor coupled with a force sensor, which imparts force on the index finger during grasping and touching. It uses HTC’s Vive Tracker for positional tracking and incorporates a vibrating actuator at the index fingertip to mimic virtual textures. While somewhat less exciting in its implications, CLAW can also reconfigure into a trigger mode that delivers force feedback to simulate pulling the trigger of a gun.
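The blog post does not spell out the control logic, but a rough sketch of how grasp sensing might switch between the sensations described above could look like the following. All names, modes, and numbers here are illustrative assumptions, not Microsoft’s implementation.

```python
from enum import Enum, auto

class HapticMode(Enum):
    FREE = auto()     # finger moves freely, no resistance
    GRASP = auto()    # resist finger closure around a virtual object
    TOUCH = auto()    # stop the finger at a virtual surface
    TRIGGER = auto()  # simulate a spring-loaded trigger

def select_mode(holding_tool_with_trigger, object_in_palm, fingertip_near_surface):
    """Pick a haptic configuration from what the controller senses about the grasp."""
    if holding_tool_with_trigger:
        return HapticMode.TRIGGER
    if object_in_palm:
        return HapticMode.GRASP
    if fingertip_near_surface:
        return HapticMode.TOUCH
    return HapticMode.FREE

def servo_force(mode, penetration_m, stiffness_n_per_m):
    """Force (newtons) for the index-finger servo, using a simple spring model:
    resistance grows with how far the finger has pushed into the virtual object,
    surface, or trigger."""
    if mode is HapticMode.FREE or penetration_m <= 0.0:
        return 0.0
    return stiffness_n_per_m * penetration_m

# Example: squeezing a soft ball vs. a rigid block at the same finger position.
mode = select_mode(holding_tool_with_trigger=False, object_in_palm=True,
                   fingertip_near_surface=False)
print(mode.name, servo_force(mode, penetration_m=0.01, stiffness_n_per_m=200.0), "N")   # soft ball
print(mode.name, servo_force(mode, penetration_m=0.01, stiffness_n_per_m=2000.0), "N")  # rigid block
```

Even in this toy version the appeal of a single-finger design is visible: one actuator and one spring model cover grasping, touching and trigger pulls simply by changing what the penetration depth and stiffness refer to.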


Microsoft researchers carried out two user studies, detailed in the full research paper. The first study, researchers say, “obtained qualitative user feedback on the naturalness, effectiveness, and comfort when using the device,” while the second examined how easily users transition between grasping and touching during use.

Microsoft’s CLAW may not be the foundation of a new haptic controller standard, but to its credit, it does reduce the complexity of 5-finger setups considerably by focusing solely on the index finger. While it’s clear that force feedback and a buzzing actuator on a single finger isn’t what we’d call anywhere near ‘complete’, it’s certainly a step in a different direction.


Liveblog: VR World Congress 2017 – “Virtual Futures Track: Perception = Reality”

VRFocus is back for our second day at this year’s VR World Congress (VRWC), set to be another packed conference that brings over 2,000 attendees from across the globe, representing all fields of virtual reality (VR) and its related industries, to Bristol in the UK. The event is crammed full of talks, experiences, software and hardware, with representatives from Microsoft, Leap Motion, IBM, AMD, the Royal Opera House, Samsung, Ultrahaptics, Oculus Story Studio, the BBC and many more in attendance.

VRFocus will be bringing you content throughout today. Next up is Hrvoje Benko of Microsoft Research: “Hrvoje will showcase how recent advances in sensing and display technologies make it possible to manipulate user’s perception in surprising ways.”

Your liveblogger for the event is Peter Graham.


Join us throughout the day on VRFocus for more liveblogs and stories from VRWC and, of course, the world at large.

Experimental Controllers From Microsoft Research Offer A New Way To Feel Objects In VR

We’re eager to go hands-on with Valve’s new prototype Vive controllers, but these new experiments from Microsoft’s Research division might be even more exciting.

The two prototypes, named NormalTouch and TextureTouch, come from a paper titled ‘High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers’ by Hrvoje Benko, Christian Holz, Mike Sinclair, and Eyal Ofek. They’re also position-tracked, though the title is long enough already. The first controller features a platform not dissimilar to the analogue sticks seen on a gamepad. Rather than you pushing the stick around to move in a game world, though, the stick rises, lowers and tilts to match the surfaces and objects you interact with in the virtual world.

As you can see in the video, if you run a finger along a flat table the stick remains level, but when your hand travels over an object the stick adapts to replicate that change. As a finger passes over a ball, the stick tilts and moves with your hand, simulating the curved shape of the object. What’s more, force feedback lets you feel the stiffness of surfaces and stops your hand passing through an item, meaning a balloon would give under your finger with far less resistance than a block of concrete, for example. It can even be used to push objects.
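One way to picture what NormalTouch is doing: tilt the fingertip platform to match the surface normal at the contact point, and push back with a spring force proportional to how far the finger has pressed past the surface. The sketch below illustrates that general technique under simplified assumptions; it is not the researchers’ actual implementation.

```python
import numpy as np

def platform_pose(finger_pos, surface_point, surface_normal, stiffness_n_per_m):
    """Drive a NormalTouch-style tilting platform: orient it to the surface
    normal under the finger and resist penetration with a spring force.
    (Illustrative sketch only.)"""
    n = surface_normal / np.linalg.norm(surface_normal)
    # Penetration depth: how far the finger has gone past the surface along the normal.
    penetration = float(np.dot(surface_point - finger_pos, n))
    # Platform tilt angles relative to its level ("flat table") pose.
    pitch_deg = np.degrees(np.arctan2(n[2], n[1]))
    roll_deg = np.degrees(np.arctan2(n[0], n[1]))
    force_n = stiffness_n_per_m * max(penetration, 0.0)  # 0 N while hovering above the surface
    return pitch_deg, roll_deg, force_n

# Flat table: the platform stays level and applies no force until the finger presses in.
print(platform_pose(np.array([0.0, 0.75, 0.0]),   # fingertip 5 cm above the table
                    np.array([0.0, 0.70, 0.0]),   # contact point on the table
                    np.array([0.0, 1.0, 0.0]), 800.0))

# Side of a ball: the platform tilts toward the ball's surface normal and pushes back slightly.
print(platform_pose(np.array([0.049, 0.70, 0.0]),
                    np.array([0.05, 0.70, 0.0]),
                    np.array([1.0, 0.2, 0.0]), 800.0))
```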

TextureTouch, meanwhile, uses a 4×4 matrix of actuated pins to better replicate the surface of objects, applying feedback as you drag your hand across them. The prototypes’ creators have tested the controllers and published their findings online: in tasks like tracing a virtual object with a finger, the new controllers were rated higher than either vibrotactile feedback or visual feedback alone.
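TextureTouch’s pin array can be thought of as sampling a small patch of the virtual surface directly under the fingertip. The sketch below shows one way that could work, with a heightfield function standing in for the virtual geometry; the function names and dimensions are assumptions for illustration, not the published implementation.

```python
import numpy as np

def pin_heights(heightfield, finger_x, finger_y, pin_pitch_m=0.0025, max_travel_m=0.002):
    """Sample a virtual height map under the fingertip into a 4x4 grid of pin
    extensions, the way a TextureTouch-style pin array might render texture.
    `heightfield(x, y)` returns the surface height (metres) at a world position."""
    heights = np.zeros((4, 4))
    for row in range(4):
        for col in range(4):
            # Centre the 4x4 grid of pins on the fingertip position.
            x = finger_x + (col - 1.5) * pin_pitch_m
            y = finger_y + (row - 1.5) * pin_pitch_m
            heights[row, col] = heightfield(x, y)
    # Map relative heights into the pins' mechanical travel range.
    heights -= heights.min()
    if heights.max() > 0:
        heights = heights / heights.max() * max_travel_m
    return heights

# Example: ridges running along the y axis; each column of pins sits at a different height.
ridges = lambda x, y: 0.001 * (1.0 + np.sin(x / 0.002))
print(np.round(pin_heights(ridges, finger_x=0.0, finger_y=0.0) * 1000, 2))  # millimetres
```

Dragging the fingertip across such a surface would continually refresh the 4×4 pin heights, which is what gives the sensation of texture rather than a single flat contact.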

There’s no denying these are intriguing solutions to VR’s current feedback problem, though they’re far away from any sort of consumer implementation. For now, the best we have is the vibration feedback provided by the Oculus Touch and HTC Vive wands.

File these controllers away with other interesting experiments, like Oculus Research’s haptic feedback prototypes.