Cybershoes for Oculus Quest Impressions: Surprisingly Effective VR Movement

Cybershoes for Oculus Quest let you move convincingly in VR using your actual legs without ever needing to physically stand up from your chair. And, believe it or not, they do a pretty good job. Here are our first impressions of Cybershoes for Oculus Quest. The Kickstarter campaign for Cybershoes on Quest is fully funded at over twice its goal, with an end date of December 31.

Cybershoes on Oculus Quest

Over two years ago I wrote about my experience using the original iteration of the Cybershoes, which were designed to be used with a PC VR headset tethered to a PC. This newest model supports both Quest and PC VR.

The most intrusive part of the previous setup was that you had to dangle the HMD wire above your head with a little fishing-rod-style contraption that was a pain to set up and took up lots of space. That’s no longer the case with the wireless, standalone Oculus Quest headset.

It might seem redundant to use a device that lets you move around in VR with your legs when you could just, you know, stand up and move around in VR with your legs using a Quest already, but there are some unique advantages to Cybershoes. Not only does it mean you don’t need to worry about your Guardian boundaries and room-size constraints, but it should also help tremendously for those with motion sickness concerns.


The act of swinging your feet and twisting around in a chair adds the physical element most VR is missing, which can, for many people, alleviate VR sickness woes. Personally, I don’t get motion sick or VR sick, so I can neither confirm nor deny the effectiveness, but many users have reported positive results after using these and the previous PC VR version.

The concept here is very simple. You strap on these open-style shoes that slide on the ground and simulate actual movement. The bottoms of the shoes have sensors that tell your VR headset which direction you’re moving.

If a game has analog stick / gamepad movement support at all, it should work with no problems. On Oculus Quest, many of the top games, like Arizona Sunshine, Myst, and The Walking Dead: Saints & Sinners, work great right out of the box.
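To picture how foot movement ends up as standard game input, here is a minimal, hypothetical sketch in TypeScript; the sensor fields, ranges, and mapping below are illustrative assumptions, not Cybershoes’ actual SDK or protocol.

```typescript
// Hypothetical sketch: mapping foot-slide sensor data to a virtual thumbstick.
// The sensor fields and ranges below are assumptions for illustration only,
// not the real Cybershoes protocol.

interface FootSensorSample {
  slideVelocity: number; // cm/s along the floor; negative = backwards
  yawDegrees: number;    // chair/body heading reported by the receiver
}

interface VirtualStick {
  x: number; // -1..1, strafe (unused here)
  y: number; // -1..1, forward/backward
  yawDegrees: number;
}

const MAX_SLIDE_VELOCITY = 120; // cm/s treated as full stick deflection (assumed)

function toVirtualStick(sample: FootSensorSample): VirtualStick {
  // Clamp and normalise slide speed so games see a standard -1..1 axis.
  const y = Math.max(-1, Math.min(1, sample.slideVelocity / MAX_SLIDE_VELOCITY));
  return { x: 0, y, yawDegrees: sample.yawDegrees };
}

// Example: sliding at 60 cm/s maps to half stick deflection.
console.log(toVirtualStick({ slideVelocity: 60, yawDegrees: 90 }));
```

Since the output looks like ordinary thumbstick input, any game with smooth locomotion support can consume it without needing to know the shoes exist.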

Thankfully, setup is dead simple. All you have to do is strap the Cybershoes onto your feet, attach the little receiver box, which is incredibly light, to the front of your headset, and plug it into the side. Beyond that, there are no wires to worry about at all.

At first it takes some getting used to. Rubbing your feet across the floor to move isn’t exactly a natural movement, nor is it super intuitive, but it starts to click after a while. The concept is the same as you see in other movement solutions, such as Virtuix’s treadmill-style Omni, but you’re seated instead.


Admittedly, I don’t see myself using Cybershoes for Quest very often even though they absolutely do work as advertised. To me, the physicality of standing up and moving around a room is far more immersive and important than rubbing my feet on the ground. However, I can see some use cases for this.

If you get motion sick easily and traditional artificial locomotion in games like The Walking Dead: Saints & Sinners always makes you uncomfortable, then you could totally try using Cybershoes as a way to circumvent that side effect. Alternatively, if you have a disability that prevents you from standing for long periods of time but you can still move your legs, then this is an excellent middle ground.

So, to be perfectly clear: yes, Cybershoes works as intended, and removing the wire from PC VR makes it far more user-friendly, but, just like the 3dRudder, I fail to see a compelling reason to use this instead of just moving around a room. Even if you don’t have enough space for roomscale and would just be standing in one spot and leaning around, I’d still rather do that than be restricted to sitting in a chair while in VR, especially when using a standalone, wireless, roomscale headset like the Oculus Quest or Quest 2.


Maybe that will change once more developers add support, but as it stands it’s hard to imagine a world where sitting down with sensor shoes is the ideal way of enjoying otherwise roomscale VR.

Cybershoes for Oculus Quest Reaches Kickstarter Goal After One Day


Want to use your feet to move around in virtual reality? Cybershoes has developed a system for exactly that purpose, letting you walk through virtual reality while seated. That system is now set to come to the Oculus Quest and Oculus Quest 2 as well.


The funding goal for Cybershoes for the Oculus Quest and Oculus Quest 2 was reached on the very first day of the campaign. It should be said, however, that the company asked for “only” 30,000 US dollars, which is not a particularly large amount of money for a hardware product.

If you do not yet own a Cybershoes system for PC, you can currently back the Kickstarter as an early bird for 279 US dollars and should receive the product in April 2021.

If you already own Cybershoes for PC, you only need to invest 49 US dollars and will receive the receiver for the Oculus Quest and Oculus Quest 2 in April 2021.

You can find the Cybershoes Kickstarter campaign here.

The Oculus Quest 2 is still not available in Germany, but you can order the headset without any problems via Amazon FR. You can find the Oculus Quest 2 on Amazon FR here.

(Source: Road to VR)


Half-Life: Alyx Locomotion Development Explained In Deep Dive Valve Video

Valve posted a nearly 11-minute video providing a deep dive look at the development of the locomotion systems in Half-Life: Alyx.

The video explains how they moved from the teleportation system they used in 2016’s The Lab — their experimental collection of VR software — to the three systems included in Half-Life: Alyx. The game includes ‘Continuous’ locomotion, ‘Shift’ movement, and ‘Blink’ teleport, with the three methods for traversing City 17 making the game work for a wide range of play styles and comfort preferences.
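As a rough illustration of how a game might branch between those three options (this is a generic sketch, not Valve’s code), the movement update can simply dispatch on the selected comfort mode:

```typescript
// Illustrative only; Half-Life: Alyx's actual implementation is not public.
type LocomotionMode = "continuous" | "shift" | "blink";

interface Vec3 { x: number; y: number; z: number; }

function moveTowards(from: Vec3, to: Vec3, t: number): Vec3 {
  return {
    x: from.x + (to.x - from.x) * t,
    y: from.y + (to.y - from.y) * t,
    z: from.z + (to.z - from.z) * t,
  };
}

function applyLocomotion(mode: LocomotionMode, from: Vec3, to: Vec3, dt: number): Vec3 {
  if (mode === "continuous") {
    // Smooth analog movement: advance a little every frame.
    return moveTowards(from, to, Math.min(1, dt * 2.0));
  }
  if (mode === "shift") {
    // A quick dash: a very fast but still visible transition.
    return moveTowards(from, to, Math.min(1, dt * 10.0));
  }
  // "blink": instant teleport with no intermediate frames.
  return to;
}

// Example: one 11 ms frame of "shift" covers a visible chunk of the distance.
console.log(applyLocomotion("shift", { x: 0, y: 0, z: 0 }, { x: 2, y: 0, z: 0 }, 0.011));
```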

“Our initial tests and playtesters’ desires led us to the system we have today, the main goals of which are to ensure the end player position is a valid place for the player’s body and that the path to get there is viable,” Valve’s Greg Coomer says in the video. “To get to this point we had to solve a whole variety of problems.”
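A minimal sketch of that kind of validation, with stand-in helpers rather than Valve’s actual engine queries, might look like this:

```typescript
// Illustrative teleport validation, loosely following the two goals described
// in the video: the end position must be a valid place for the player's body,
// and the path to get there must be viable. The helper checks are stand-ins
// for real engine queries, not Valve's actual code.

interface Vec3 { x: number; y: number; z: number; }

const MAX_TELEPORT_RANGE = 8; // metres (assumed)

// Stand-in for a capsule overlap test against level geometry.
function bodyFitsAt(position: Vec3): boolean {
  return position.y >= 0; // placeholder: anything above the floor plane
}

// Stand-in for a navmesh or sweep test between the two points.
function pathIsViable(from: Vec3, to: Vec3): boolean {
  const dx = to.x - from.x;
  const dz = to.z - from.z;
  return Math.hypot(dx, dz) <= MAX_TELEPORT_RANGE;
}

function isValidTeleport(from: Vec3, to: Vec3): boolean {
  // Goal 1: the destination must have room for the player's body.
  // Goal 2: the route between the two points must be traversable.
  return bodyFitsAt(to) && pathIsViable(from, to);
}
```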

The video breaks down how the height of players affected the development of their systems and how they developed audio systems to ground players in the world of Half-Life: Alyx, like footstep volume and timing changing based on how far a player is moving. Playtesters also expected a louder sound if they teleported from a high to a low area, according to the video, and Valve developers implemented the feature as a result.
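As an illustrative sketch of that audio behaviour (the numbers are assumptions, not Valve’s tuning), footstep loudness and cadence can be scaled with movement speed, and the landing sound with the height of the drop:

```typescript
// Illustrative only: scale footstep loudness/cadence with speed, and landing
// loudness with the height difference of a teleport. Numbers are assumptions.

function footstepVolume(speedMetresPerSecond: number): number {
  // Louder steps the faster the player moves (0..1 volume).
  return Math.min(1, speedMetresPerSecond / 4);
}

function footstepIntervalSeconds(speedMetresPerSecond: number): number {
  // Faster movement -> shorter gap between footstep sounds.
  return Math.max(0.25, 0.8 - 0.1 * speedMetresPerSecond);
}

function landingVolume(heightDropMetres: number): number {
  // Teleporting from a high spot to a low one produces a louder landing sound.
  return Math.min(1, 0.3 + 0.2 * Math.max(0, heightDropMetres));
}

console.log(footstepVolume(2));          // 0.5
console.log(footstepIntervalSeconds(2)); // 0.6
console.log(landingVolume(3));           // 0.9
```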

The video provides a really insightful overview of the movement systems in Half-Life: Alyx and some of the subtler aspects of its design that players might have overlooked during their playthroughs. Check it out above.


Infinite Walking In VR Thanks To A Clever Brain Hack

Locomotion in virtual reality (VR) has been something of a challenge for developers. For the most immersive experience, VR should be able to track the movement of the user, moving when they do, but this comes with several issues. Those issues might now be solved thanks to the work of a group of researchers.

Computer scientists from Stony Brook University, Nvidia and Adobe have been working together to create a new framework that allows VR users to experience infinite walking in the virtual world, even though they are limited to a small physical space in the real world.

The framework utilises a natural function of the human eye in order to ‘hack’ the brain. The work revolves around something called the saccade. This is something that the human eye does when looking at different points in your field of vision, such as when scanning a room or viewing a painting. These saccades occur without conscious direction and can happen several times a second.

When a saccade is happening, the brain ignores the input coming in from the eye to avoid confusion, something called ‘saccadic suppression’. The process used by the research team takes advantage of this by using head and eye-tracking to detect when saccadic suppression is occurring and redirects the users’ walking path, making them walk in a circle without being consciously aware of it.

“In VR, we can display vast universes; however, the physical spaces in our homes and offices are much smaller,” says lead author of the work, Qi Sun, a PhD student at Stony Brook University and former research intern at Adobe Research and NVIDIA. “It’s the nature of the human eye to scan a scene by moving rapidly between points of fixation. We realized that if we rotate the virtual camera just slightly during saccades, we can redirect a user’s walking direction to simulate a larger walking space.”
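In code, the core trick can be sketched roughly as follows; the velocity threshold and rotation gain are illustrative assumptions rather than the paper’s exact parameters:

```typescript
// Minimal sketch of saccade-triggered redirected walking. The threshold and
// per-frame rotation gain are illustrative; the paper tunes these carefully.

const SACCADE_VELOCITY_THRESHOLD = 180;   // deg/s of eye rotation (assumed)
const REDIRECT_GAIN_DEG_PER_FRAME = 0.15; // extra camera yaw injected (assumed)

interface FrameInput {
  eyeAngularVelocityDegPerSec: number; // from the eye tracker
  desiredRedirectionSign: -1 | 0 | 1;  // which way to nudge the walking path
}

function extraCameraYaw(input: FrameInput): number {
  // During a saccade the brain suppresses visual input, so a small camera
  // rotation goes unnoticed and bends the user's real-world walking path.
  const saccadeInProgress =
    input.eyeAngularVelocityDegPerSec > SACCADE_VELOCITY_THRESHOLD;
  return saccadeInProgress
    ? REDIRECT_GAIN_DEG_PER_FRAME * input.desiredRedirectionSign
    : 0;
}

// Example: a 400 deg/s eye movement lets us rotate the camera by 0.15 degrees.
console.log(extraCameraYaw({ eyeAngularVelocityDegPerSec: 400, desiredRedirectionSign: 1 }));
```

Applied over many saccades, these tiny unnoticed rotations accumulate, steering the user along a curved real-world path while the virtual path appears straight.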

The resulting research paper is titled ‘Towards Virtual Reality Infinite Walking: Dynamic Saccadic Redirection’, and the team will be presenting their work at SIGGRAPH 2018, which is due to take place from 12th-16th August in Vancouver, Canada.

For news on the latest developments in VR technology, keep checking back with VRFocus.

3dRudder Releases New Chrome Extension Update

Virtual reality (VR) locomotion controller 3dRudder has released a new update that brings support for controlling web pages with the user’s feet.


This new control scheme is possible thanks to the new 3dRudder WebSocket server. By opening a constant communication channel between a user’s web browser and the 3dRudder, a real-time foot-based control scheme can be achieved.

The new 3dRudder WebSocket server can be called from any web page to let users interact with apps and in-page content with their feet. This opens up new ways for users to engage with content thanks to the integrated solution that 3dRudder provides. The released Chrome extension acts as an example, allowing users to control video playback with their feet on websites such as YouTube, Dailymotion, Vimeo, Amazon Prime Video, Twitch and many others.
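The article doesn’t document the server’s message format, so the browser-side sketch below assumes a hypothetical JSON payload and local port purely for illustration; it shows the general pattern of opening a WebSocket and turning foot input into playback commands:

```typescript
// Browser-side sketch of foot-controlled video playback. The endpoint URL and
// message shape below are assumptions for illustration; consult 3dRudder's
// own documentation for the real protocol.

interface RudderMessage {
  pitch: number; // forward/backward tilt, -1..1 (assumed)
  yaw: number;   // rotation, -1..1 (assumed)
}

const socket = new WebSocket("ws://localhost:15698"); // hypothetical local port
const video = document.querySelector("video");

socket.onmessage = (event: MessageEvent<string>) => {
  if (!video) return;
  const msg: RudderMessage = JSON.parse(event.data);

  // Tilt forward/backward to play or pause.
  if (msg.pitch > 0.5) void video.play();
  else if (msg.pitch < -0.5) video.pause();

  // Twist left/right to seek a few seconds at a time.
  if (Math.abs(msg.yaw) > 0.5) video.currentTime += Math.sign(msg.yaw) * 5;
};
```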


This new application for the 3dRudder opens the door to many exciting new possibilities. Some of the examples that 3dRudder suggest include the creation and manipulation of 3D objects in cloud-based applications for CAD design. Educational providers could take advantage of this new integration as well to make web-based learning more engaging. Of course, playing online videogames and web-based VR applications would also be an ideal use.

The 3dRudder’s locomotion system gives users a more natural sense of travel within virtual space. As only the feet need to be used, the hands remain free to perform additional actions, which helps reduce the control clutter that might lead to confusion. The 3dRudder also removes the limits of physical space: there is no need to worry about walking into your surroundings, as you remain seated while still having the feeling of free movement.

The 3dRudder was announced back in 2015 and released in mid-2016. Since then, a wireless version was revealed at CES 2017, and just last month a price cut was announced, bringing the price down to $99 (USD). With the newly released 3dRudder WebSocket server, the future of 3dRudder and web-based controls looks promising.

For more on 3dRudder in the future, keep reading VRFocus.

The Mage’s Tale Receives Locomotion Update and More

The VR dungeon crawler The Mage’s Tale from InXile Entertainment was recently released for the Oculus Rift with Touch support. While the VR title was generally well received, its dated locomotion methods were a thorn in many players’ sides. The developers have taken the issue on board and announced an update that, among other improvements, enables free locomotion.

New The Mage’s Tale Update Improves Gameplay and Performance


The dungeon crawler RPG The Mage’s Tale offers everything that makes genre fans’ hearts beat faster: its own leveling and crafting system, dark dungeons with plenty of loot to discover, and a role-playing atmosphere reminiscent of old pen-and-paper games. All of this is seasoned with a pinch of humor and Easter eggs, and of course it is all in VR. In total, 10 to 15 hours of gameplay provide lasting fun.

However, in the first version, teleport-only locomotion detracts from the game’s otherwise great overall package. After all, titles like Arizona Sunshine and Onward show how smooth locomotion can work. Brian Fargo, CEO of developer InXile Entertainment, announced an upcoming update via Twitter last week that addresses the problem. The update is expected to arrive this week and will give players the option to switch to free locomotion.

In an interview, the CEO explained: “We listen closely to our community. VR makes players much more sensitive to game elements, such as UI and features, that can destroy immersion. Some players were bothered by the crosshair, for example, so we now offer an option to turn it off. Another point of criticism is locomotion, which we are changing with the upcoming update. Loading times will also be shortened.”

The update therefore includes several options that improve immersion, and the shorter loading times should also be welcome news for players. All in all, the update improves both gameplay and performance. We are confident that it will have a positive effect on The Mage’s Tale.

(Sources: UploadVR | Brian Fargo Twitter)


Sprint Vector Gets Bigger With New Features

Survios has announced more of what we can expect from their virtual reality (VR) parkour-skiing hybrid, Sprint Vector. The videogame was praised by those who played it at GDC 2017, and the developers are keen to point out that there’s still more to come.

Sprint Vector is a game that takes on the task of locomotion in VR in a unique way. Instead of simply moving in the virtual world by pushing an analogue stick forward, players swing their arms in wide, alternating arcs and use the triggers to maximise the distances they can run and leap.
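As a rough sketch of how that kind of arm-swing locomotion can be driven (the constants are assumptions, not Survios’ actual values), controller swing speed is converted into forward velocity while the triggers are held:

```typescript
// Illustrative arm-swing locomotion: forward speed is proportional to how
// vigorously the controllers are being swung. Constants are assumptions.

interface ControllerState {
  verticalSpeed: number; // m/s of the controller's up/down motion
  triggerHeld: boolean;
}

const SWING_TO_SPEED = 1.5; // metres/s of movement per metre/s of swing (assumed)
const MAX_RUN_SPEED = 9;    // metres/s cap (assumed)

function forwardSpeed(left: ControllerState, right: ControllerState): number {
  // Only count hands whose trigger is held while swinging.
  const swing =
    (left.triggerHeld ? Math.abs(left.verticalSpeed) : 0) +
    (right.triggerHeld ? Math.abs(right.verticalSpeed) : 0);
  return Math.min(MAX_RUN_SPEED, swing * SWING_TO_SPEED);
}

// Example: both arms swinging at 2 m/s gives 6 m/s of forward movement.
console.log(forwardSpeed(
  { verticalSpeed: 2, triggerHeld: true },
  { verticalSpeed: 2, triggerHeld: true },
));
```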

VRFocus played the game earlier this year, and said; “Whether it’s the motion of your arms or the design layout, Sprint Vector doesn’t induce those feelings of simulator sickness that many may expect when looking at [the game]. The subject is always a difficult one to approach due to everyone being affected differently, but it would seem that the direct approach to controlling your own speed – and subsequently making your body move more – can help mitigate those issues developers try to avoid using controls like teleportation.”

New features players can expect in Sprint Vector include built-in weapons, items and power-ups. New running techniques – drifting and wall running – just add to the variety of moves and traversal the game offers, also potentially opening the door to new kinds of levels.

Each player can now blast objects in the environment to change the path forward, and grab Nitro and Slow Mines – the former boosts your speed, while the latter can be left in wait to slow down your foes.

Sprint Vector is no longer just a tech demo for a new system of movement, as Survios Game Design Director Mike McTyre makes clear; “When we first unveiled Sprint Vector earlier this year, we were blown away by the reaction to the Fluid Locomotion system.”

McTyre continues; “It’s one thing to be able to talk about seamless motion controls in VR, but it’s quite another to be able to implement them at such high speeds without causing the player any discomfort. Now, we’re taking that core experience and adding in fun weapons and power-ups to make the game more competitive and exciting for both players and spectators.”

Sprint Vector will be at E3, along with Survios’ critically acclaimed title Raw Data.

For more on VR, stay on VRFocus.

New Oculus Dev Blog Teaches Best Practices for User Interfaces in VR

When using virtual reality (VR) for the first time, nothing is worse than a bad experience for the user – it can put them off the medium for life. As you can imagine, no one fears this more than head-mounted display (HMD) manufacturers such as Oculus, who have released another dev blog to help prospective VR developers build comfortable, fun experiences.

Oculus’ Chris Pruett works on the Oculus Content team with a large number of VR developers, so he knows the common pitfalls of a poor user experience in VR.

Pruett outlines what he calls the VR Motion Design Fundamentals;

  • Don’t accelerate, rotate, or decelerate the VR camera. The vestibular system can feel acceleration and rotation but not fixed-speed linear motion.
  • No locking or animation of the VR camera. The camera position belongs to the person inside the headset now. It’s controlled by the movements of their head and neck.
  • Limit the amount of “pixel flow” in people’s peripheral vision.
  • No judder! Your app must maintain a high and consistent frame rate at all times.

HMD manufacturers and VR storefronts alike want to improve the overall quality of VR experiences. Google also outlined tips for building comfortable VR experiences this week.

Pruett goes into much more detail in the full Oculus dev blog, and touches on many techniques that make for better VR experiences, such as Linear Speed, Tunnel Vision, Head Steering, Depth Cue Issues, Tracked Space Size, Polygon Clipping and more. If you’re looking to develop for VR, this is a good place to start.
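To make one of those techniques concrete, here is a minimal sketch (not Oculus’ code) of the ‘Tunnel Vision’ idea: a vignette that narrows the visible field as artificial movement speeds up, limiting pixel flow in the peripheral vision as the fundamentals above recommend.

```typescript
// Illustrative comfort vignette: the faster the artificial locomotion, the
// narrower the visible field of view. All numbers here are assumptions.

const FULL_FOV_DEGREES = 100;     // no vignette when stationary (assumed)
const MIN_FOV_DEGREES = 60;       // tightest tunnel at full speed (assumed)
const MAX_SPEED_METRES_PER_SEC = 5;

function vignetteFov(speed: number): number {
  // Interpolate linearly between the full and the restricted field of view.
  const t = Math.min(1, Math.max(0, speed / MAX_SPEED_METRES_PER_SEC));
  return FULL_FOV_DEGREES + (MIN_FOV_DEGREES - FULL_FOV_DEGREES) * t;
}

console.log(vignetteFov(0));   // 100: no restriction while standing still
console.log(vignetteFov(2.5)); // 80: half speed, moderately narrowed
console.log(vignetteFov(5));   // 60: full speed, maximum vignette
```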

We’re seeing more and more tools that help developers move into the VR space – it seems the future is very bright for the industry.

For everything on VR development and industry news, stay on VRFocus.
