Researchers Exploit Natural Quirk of Human Vision for Hidden Redirected Walking in VR

Researchers from Stony Brook University, NVIDIA, and Adobe have devised a system that hides so-called ‘redirected walking’ techniques behind saccades, natural eye movements during which vision is briefly suppressed, like a momentary blind spot. Redirected walking subtly changes the direction a user is walking to create the illusion of moving through a virtual space larger than the physical space would allow.

Update (4/27/18): The researchers behind this work have reached out with the finished video presentation for the work, which has been included below.

Original Article (3/28/18): At NVIDIA’s GTC 2018 conference this week, researchers Anjul Patney and Qi Sun presented their saccade-driven redirected walking system for dynamic room-scale VR. Redirected walking uses novel techniques to steer users in VR away from real-world obstacles like walls, with the goal of creating the illusion of traversing a larger space than is actually available to the user.

There are a number of ways to implement redirected walking, but the strengths of this saccade-driven method, the researchers say, are that it’s hidden from the user, widely applicable to VR content, and dynamic, allowing the system to direct users away from objects newly introduced into the environment, and even moving objects.

The basic principle behind their work is the exploitation of a natural quirk of human vision, saccadic suppression, to hide small rotations of the virtual scene. Saccades are the quick eye movements that occur when we shift our gaze from one part of a scene to another. When not tracking a moving object or fixating on a single point, our eyes don’t sweep slowly and continuously between gaze points; instead they dart rapidly, each movement taking tens of milliseconds.

An eye undertaking regular saccades

Saccadic suppression occurs during these movements, essentially rendering us blind for a brief moment until the eye reaches its new point of fixation. With precise eye-tracking technology from SMI and an HTC Vive headset, the researchers are able to detect and exploit that temporary blindness to hide a slight rotation of the scene from the user. As the user walks forward and looks around the scene, it is slowly rotated, just a few degrees per saccade, such that the user reflexively alters their walking direction in response to the new visual cues.
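The core loop described above, detect a saccade from eye-tracking data, then inject a small scene rotation while the eye is in flight, can be sketched roughly as follows. This is a minimal illustration, not the researchers' implementation; the velocity threshold and per-saccade rotation cap are assumed placeholder values, not figures from their work.

```python
import math

# Illustrative constants (assumptions, not the researchers' published values).
SACCADE_VELOCITY_DEG_S = 180.0    # gaze speed above which motion is treated as a saccade
MAX_ROTATION_PER_SACCADE = 2.0    # degrees of hidden scene yaw injected per saccade

def gaze_angular_velocity(prev_dir, curr_dir, dt):
    """Angular speed (deg/s) between two unit gaze-direction vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(prev_dir, curr_dir))))
    return math.degrees(math.acos(dot)) / dt

def redirect_during_saccade(prev_gaze, curr_gaze, dt, desired_yaw_deg):
    """Return the hidden scene yaw (degrees) to apply this frame.

    Rotation is injected only while the eye is saccading, when
    saccadic suppression hides the change from the user; at all
    other times the scene is left untouched.
    """
    if gaze_angular_velocity(prev_gaze, curr_gaze, dt) > SACCADE_VELOCITY_DEG_S:
        # Clamp so each saccade hides only a few degrees of rotation.
        return max(-MAX_ROTATION_PER_SACCADE,
                   min(MAX_ROTATION_PER_SACCADE, desired_yaw_deg))
    return 0.0
```

In practice the per-frame yaw would be applied to the virtual camera's parent transform, so the user reflexively corrects their walking path in response to the shifted visual cues.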

This method allows the system to steer users away from real-world walls even when they appear to be walking in a straight line in the virtual world, creating the illusion that the virtual space is significantly larger than the corresponding physical space.

A VR backpack allows a user at GTC 2018 to move through the saccadic redirected walking demo without a tether. | Photo by Road to VR

The researchers have devised a GPU-accelerated real-time path planning system, which dynamically adjusts the hidden scene rotation to redirect the user’s walking. Because the path planning routine runs in real time, Patney and Sun say that it can account for objects newly introduced into the real-world environment (like a chair), and can even be used to steer users clear of moving obstacles, like pets or potentially even other VR users inhabiting the same space.
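One piece of such a planner is deciding, each frame, which way to bias the hidden rotation. The toy function below picks a rotation sign that steers the user away from the nearest obstacle; it is a hypothetical stand-in for the researchers' GPU path planner, but because it is re-evaluated every frame it illustrates how newly added or moving obstacles can be handled.

```python
import math

def choose_redirection_sign(user_pos, user_heading_deg, obstacles):
    """Return +1.0 or -1.0: the sign of the hidden scene yaw to apply.

    user_pos: (x, y) position on the floor plane.
    user_heading_deg: walking direction, degrees, 0 = +x axis.
    obstacles: list of (x, y) obstacle positions (walls, chairs, pets...).
    """
    nearest = min(obstacles, key=lambda o: math.dist(user_pos, o))
    # Bearing from user to the nearest obstacle, relative to heading.
    bearing = math.degrees(math.atan2(nearest[1] - user_pos[1],
                                      nearest[0] - user_pos[0]))
    relative = (bearing - user_heading_deg + 180.0) % 360.0 - 180.0
    # Obstacle to the left (+) -> rotate the scene so the user veers right (-),
    # and vice versa.
    return -1.0 if relative > 0 else 1.0
```

Re-running this each frame against an updated obstacle list is what makes the redirection dynamic: a chair placed in the room mid-session simply appears in `obstacles` on the next frame.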

The research is being shown off in a working demo this week at GTC 2018. An academic paper based on the work is expected to be published later this year.


The VR Job Hub: Ultrahaptics, Adobe Research, Force Field VR and more

Whether you’re an experienced designer, programmer, or engineer, or you’ve just been inspired after reading VRFocus articles, the jobs listed here are located worldwide, from major game players to humble indie developers – the one thing they all have in common is that they are all jobs in VR.

View the new listings below for more information:

Location | Company | Role | Link
Manchester, UK | Red Frog Digital Ltd | Mobile App R&D Software Engineer | Click here to apply
Bristol, UK | Ultrahaptics | Visual Designer (Contract) | Click here to apply
London, UK | Facebook | Software Engineer, Social VR | Click here to apply
San Francisco, CA | Adobe Research | VR Video Engineer | Click here to apply
Amsterdam | Force Field VR | QA Tester | Click here to apply
Amsterdam | Force Field VR | Lead Programmer | Click here to apply
Amsterdam | Force Field VR | Senior Character Artist | Click here to apply

As usual, you can check last week’s edition for further job listings. If you are an employer looking for someone to fill a role in VR, AR or other related areas in the industry and want that position to be featured on next week’s VR Job Hub, please send details to either or

Check back with VRFocus next Sunday at 3PM BST and every Sunday for the latest roles in the VR industry.