Former Microsoft Senior Researcher Dr. Hrvoje Benko gave a talk entitled ‘The Future of AR Interactions’ at the International Symposium on Mixed and Augmented Reality (ISMAR) in October. This week the talk was uploaded to the ISMAR YouTube channel.
Dr. Benko worked at Microsoft from 2005 until late 2017, when he moved to Facebook Reality Labs (formerly Oculus Research). He now leads the human-computer interfaces (HCI) division there.
A Great Display Is Not Enough
A core point that Benko stressed multiple times during the talk is that a great AR display in itself is not good enough — a new input paradigm that takes advantage of spatial computing is needed.
Benko used the example of smartphones with large displays that existed before the iPhone but lacked a multitouch input interface. He pointed out how HoloLens and other current AR devices unsuccessfully try to adapt existing input techniques.
Finger Tracking Is Not Enough
Benko explained that while finger tracking technology is rapidly progressing, humans don’t often interact with empty air — we interact with objects. The only time we tend to use our hands in empty air is when gesticulating during speech.
The lack of haptic feedback with only finger tracking, he claims, is jarring, and is unlikely to be the basis of future interfaces.
Surfaces May Be The Key
Benko pointed out that mixed reality interfaces could leverage the already existing surfaces in the environment to provide real haptic feedback.
Menus could appear on the nearest table or wall, and your fingers could manipulate the virtual UI elements on these surfaces.
This obviously requires a very advanced sensor system with a precise understanding of all the major objects in the room, as well as almost perfect finger tracking.
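As a rough illustration of the surface-based interaction Benko describes, the core geometric test is deciding when a tracked fingertip "touches" a virtual button projected onto a sensed real-world plane. The sketch below is my own minimal interpretation, not anything shown in the talk; the function name, contact threshold, and circular-button geometry are all assumptions:

```python
import math

TOUCH_DISTANCE_M = 0.01  # 1 cm contact threshold (an assumed value)

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def _norm(a):
    return math.sqrt(_dot(a, a))

def fingertip_touches_button(fingertip, plane_point, plane_normal,
                             button_center, button_radius):
    """Return True when the tracked fingertip (a 3-D point in metres) is within
    TOUCH_DISTANCE_M of a sensed surface plane AND its projection onto that
    plane falls inside a circular virtual button drawn on the surface."""
    n_len = _norm(plane_normal)
    n = (plane_normal[0] / n_len, plane_normal[1] / n_len, plane_normal[2] / n_len)
    # Signed distance from the fingertip to the plane.
    dist = _dot(_sub(fingertip, plane_point), n)
    # Fingertip projected onto the plane.
    projected = (fingertip[0] - dist * n[0],
                 fingertip[1] - dist * n[1],
                 fingertip[2] - dist * n[2])
    return abs(dist) <= TOUCH_DISTANCE_M and \
        _norm(_sub(projected, button_center)) <= button_radius
```

In a real system the plane would come from a surface-reconstruction pipeline and the fingertip from hand tracking, which is why Benko stresses that both must be nearly perfect for this to feel right.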
‘Haptic Retargeting’
Microsoft Shows New Research in Haptics With ‘CLAW’ VR Controller Prototype

As VR displays pack in ever more pixels and new controller types such as Valve’s Knuckles dangle tantalizingly in front of us, one thing that often goes overlooked is haptic feedback. Plenty of companies are working to define the first real haptic standard beyond the tiny vibration motors already seen in VR motion controllers, and it now appears Microsoft can be counted among them with its newly revealed CLAW controller prototype.
As Microsoft describes in a recent blog post, CLAW is a handheld VR controller designed to “augment the typical controller functionality with force feedback and actuated movement to the index finger.”
Microsoft researchers say it mimics the feeling of grasping virtual objects, touching virtual surfaces, and depressing triggers. They also say it changes its corresponding haptic configuration by sensing differences in the user’s grasp.
The prototype contains a servo motor coupled with a force sensor, which imparts force on the index finger during grasping and touching. Using HTC’s Vive Tracker for positional tracking, the prototype also incorporates a vibrating actuator at the index fingertip to mimic virtual textures. While somewhat less exciting in its implications, CLAW can also reconfigure to a trigger mode that delivers haptic force feedback to simulate pulling a trigger on a gun.
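Based only on the behavior described above, the device's grasp-sensing mode switch could be sketched as a small decision function. This is a hypothetical illustration; the mode names, force threshold, and sensor fields are my own assumptions, and Microsoft's actual control logic is not public:

```python
from dataclasses import dataclass

GRASP_FORCE_THRESHOLD = 2.0  # newtons; an illustrative, made-up value


@dataclass
class SensorFrame:
    index_force: float      # reading from the force sensor under the index finger (N)
    touching_surface: bool  # finger currently intersects a virtual surface
    trigger_mode: bool      # application has reconfigured the device as a trigger


def select_haptic_mode(frame: SensorFrame) -> str:
    """Pick a haptic configuration from the current grasp, mirroring the
    grasp / touch / trigger behaviors the researchers describe."""
    if frame.trigger_mode:
        return "trigger"  # servo simulates trigger resistance
    if frame.index_force >= GRASP_FORCE_THRESHOLD:
        return "grasp"    # servo resists the finger closing on a virtual object
    if frame.touching_surface:
        return "touch"    # fingertip vibration actuator renders surface texture
    return "idle"
```

For example, a firm squeeze (`index_force` above the threshold) would select grasp feedback, while a light finger resting on a virtual surface would select texture rendering instead.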
Microsoft researchers carried out two user studies, detailed in the full research paper. The first study, researchers say, “obtained qualitative user feedback on the naturalness, effectiveness, and comfort when using the device,” while the second examined the ease of transitioning between grasping and touching during use.
Microsoft’s CLAW may not be the foundation of a new haptic controller standard, but to its credit, it considerably reduces the complexity of five-finger setups by focusing solely on the index finger. While force feedback and a buzzing actuator on a single finger is nowhere near what we’d call ‘complete’, it’s certainly a step in a different direction.
The post Microsoft Shows New Research in Haptics With ‘CLAW’ VR Controller Prototype appeared first on Road to VR.
Liveblog: VR World Congress 2017 – “Virtual Futures Track: Perception = Reality”
VRFocus is back for our second day at this year’s VR World Congress (VRWC), set to be another packed conference, bringing over 2,000 attendees from across the globe, representing all fields of virtual reality (VR) and its related industries, to Bristol in the UK. The event is crammed full of talks, experiences, software and hardware, with representatives from Microsoft, Leap Motion, IBM, AMD, the Royal Opera House, Samsung, Ultrahaptics, Oculus Story Studio, the BBC and many more in attendance.
VRFocus will be bringing you content throughout today. Next up is Hrvoje Benko of Microsoft Research: “Hrvoje will showcase how recent advances in sensing and display technologies make it possible to manipulate user’s perception in surprising ways.”
Your liveblogger for the event is Peter Graham.
Join us throughout the day on VRFocus for more liveblogs and stories from VRWC and, of course, the world at large.