Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking inside a VR headset—is often discussed in the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking was talked about as a distant technology for XR for many years, but the hardware is finally becoming widely available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye, and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s start with the use case many people are already familiar with. Foveated rendering aims to reduce the computational power required to display demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast. You can think of it like a camera with a large sensor that has just a few megapixels, plus another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most people think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic that, without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason people overestimate the foveal region of their vision seems to be that the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.


Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region the fovea sees, and drastically cutting the complexity of the scene in our peripheral vision, where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution and field-of-view of XR headsets increase, the power needed to render complex scenes grows quickly.

Eye-tracking of course comes into play because we need to know, quickly and with high precision, where the center of the user’s gaze is at all times in order to pull off foveated rendering. While it’s difficult to do this without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.
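To make the idea concrete, here’s a minimal sketch in Python of how a renderer might map the angle between the gaze ray and a given screen region to a resolution scale. The band sizes and scales are purely illustrative assumptions, not any shipping headset’s tuning:

```python
import math

# Illustrative foveation bands: (max angular offset in degrees, resolution scale).
# Real headsets tune these per-device; these numbers are assumptions.
FOVEATION_BANDS = [
    (5.0, 1.0),             # within ~5 deg of gaze: full resolution
    (15.0, 0.5),            # mid-periphery: half resolution
    (float("inf"), 0.25),   # far periphery: quarter resolution
]

def angular_offset_deg(gaze_dir, region_dir):
    """Angle between the gaze ray and the ray through a screen region
    (both given as unit vectors)."""
    dot = sum(g * r for g, r in zip(gaze_dir, region_dir))
    dot = max(-1.0, min(1.0, dot))  # clamp for floating-point safety
    return math.degrees(math.acos(dot))

def resolution_scale(gaze_dir, region_dir):
    """Pick the render-resolution scale for a region given the current gaze."""
    offset = angular_offset_deg(gaze_dir, region_dir)
    for max_angle, scale in FOVEATION_BANDS:
        if offset <= max_angle:
            return scale

# A region 20 degrees off-gaze lands in the quarter-resolution band.
rad = math.radians(20)
print(resolution_scale((0, 0, 1), (math.sin(rad), 0, math.cos(rad))))  # 0.25
```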

Automatic User Detection & Adjustment


In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (interpupillary distance, the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.

In more advanced headsets, this process can be invisible and automatic—the headset measures IPD on its own and a motorized adjustment moves the lenses into the correct position without the user needing to be aware of any of it, as on the Varjo Aero, for example.
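As a rough illustration of the measurement itself, the sketch below simply averages the distance between tracked pupil positions over many frames to smooth out per-frame noise. The sample format is hypothetical; every vendor’s eye-tracking SDK exposes this data differently:

```python
import math

def estimate_ipd_mm(samples):
    """Average the left-to-right pupil distance over many samples.

    `samples` is a list of (left_pupil_xyz, right_pupil_xyz) tuples in
    millimeters in a shared headset coordinate frame (a hypothetical
    format for illustration)."""
    return sum(math.dist(left, right) for left, right in samples) / len(samples)

samples = [((-31.8, 0.1, 0.0), (31.9, -0.2, 0.1)),
           ((-31.7, 0.0, 0.0), (32.0, 0.1, 0.0))]
print(f"Estimated IPD: {estimate_ipd_mm(samples):.1f} mm")  # ~63.7 mm
```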

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.
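A quick worked example helps here: accommodation demand is conventionally measured in diopters (the reciprocal of the fixation distance in meters), which makes it easy to see how much harder near focus is than far focus:

```python
def accommodation_demand_diopters(distance_m):
    """Accommodation demand in diopters is 1 / fixation distance (meters)."""
    return 1.0 / distance_m

# A finger at 10 cm demands 10 D of accommodation; an object at 6 m
# demands almost none, which is why near focus is the hard case.
for d in (0.1, 0.25, 1.0, 6.0):
    print(f"{d:>5.2f} m -> {accommodation_demand_diopters(d):.2f} D")
```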

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with the same finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double images of objects far behind your finger. When you then focus on those distant objects instead, you’ll see a double image of your finger.
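The geometry of vergence is simple enough to compute directly. For a fixation point straight ahead, each eye rotates inward by atan((IPD/2) / distance), so the total vergence angle is twice that. The sketch below (assuming a typical 64 mm IPD) shows how quickly the angle falls off with distance:

```python
import math

def vergence_angle_deg(ipd_m, distance_m):
    """Total vergence angle for a fixation point straight ahead."""
    return 2 * math.degrees(math.atan((ipd_m / 2) / distance_m))

# A finger at 15 cm needs ~24 degrees of vergence; at 100 m the eyes
# are essentially parallel.
for d in (0.15, 0.5, 2.0, 100.0):
    print(f"{d:>6.2f} m -> {vergence_angle_deg(0.064, d):5.2f} deg")
```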

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light from every object shown on that display comes from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes into conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There are a number of approaches to varifocal displays, perhaps the simplest of which is an optical system where the display is physically moved back and forth from the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
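A minimal version of that calculation might look like the following sketch, which finds the point of closest approach between the two gaze rays (in practice they rarely intersect exactly due to tracking noise) and reports its distance from the eyes:

```python
import numpy as np

def focal_distance(left_origin, left_dir, right_origin, right_dir):
    """Estimate fixation depth from two gaze rays via closest approach."""
    d1 = left_dir / np.linalg.norm(left_dir)
    d2 = right_dir / np.linalg.norm(right_dir)
    r = left_origin - right_origin
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b
    if abs(denom) < 1e-9:              # rays nearly parallel: looking far away
        return float("inf")
    t1 = (b * e - c * d) / denom       # parameter along the left ray
    t2 = (a * e - b * d) / denom       # parameter along the right ray
    p1 = left_origin + t1 * d1
    p2 = right_origin + t2 * d2
    fixation = (p1 + p2) / 2           # midpoint of closest approach
    eye_center = (left_origin + right_origin) / 2
    return float(np.linalg.norm(fixation - eye_center))

# Eyes 64 mm apart, both aimed at a point 0.5 m straight ahead:
left_eye, right_eye = np.array([-0.032, 0, 0.0]), np.array([0.032, 0, 0.0])
target = np.array([0.0, 0.0, 0.5])
print(focal_distance(left_eye, target - left_eye,
                     right_eye, target - right_eye))  # ~0.5
```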


A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, approximating the blurring of objects outside the focal plane of the user’s eyes.
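One way such an approximation could work, sketched here with an arbitrary tuning constant rather than any shipped implementation, is to scale blur with the difference in diopters between the fixated plane and each object:

```python
def blur_strength(focal_m, object_m, gain=1.0):
    """Depth-of-field proxy: blur grows with the diopter difference
    between the fixated plane and the object. `gain` is a tuning
    constant a real renderer would map to an actual blur kernel size."""
    return gain * abs(1.0 / focal_m - 1.0 / object_m)

# Fixating at 0.5 m: an object at 0.5 m is sharp, one at 5 m blurs heavily.
print(blur_strength(0.5, 0.5))  # 0.0
print(blur_strength(0.5, 5.0))  # 1.8
```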

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram high-resolution pixels across our entire field-of-view. Doing so is not only costly, but also runs into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to a higher field-of-view than could otherwise be achieved with a single flat display.

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.
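Conceptually, the compositing step behind such a system is just pasting the pixel-dense inset over the upscaled low-density background at the gaze point. The toy sketch below shows the idea and assumes the inset fits entirely within the background; a real system blends the seam optically and/or with a soft mask rather than a hard edge:

```python
import numpy as np

def composite_foveated(background, inset, center_xy):
    """Paste a high-density inset image over a low-density background
    (already upscaled to the same pixel grid), centered on the gaze point.
    Assumes the inset lies fully inside the background bounds."""
    h, w = inset.shape[:2]
    x0 = int(center_xy[0] - w / 2)
    y0 = int(center_xy[1] - h / 2)
    out = background.copy()
    out[y0:y0 + h, x0:x0 + w] = inset   # hard seam; real systems blend
    return out
```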


VRChat Will Add Native Eye Tracking Support, Including On Quest Pro

VRChat plans to add native support for eye tracking.

This includes both PC (via the OSC protocol) and the Quest Store app on Quest Pro, which Meta just announced is being cut to $1000.

The developers released a preview video running standalone on Quest Pro:

VRChat also added support for controller-free hand tracking to the Quest Store app late last year. There are no announced plans for native support of Quest Pro’s face tracking yet, however. Both eye tracking and face tracking are already supported in Meta’s Horizon suite, but aren’t yet in other third-party social platforms like Rec Room or Bigscreen.

Existing avatars with moving eyes should “just work” without changes. Initially both eyelids will be controlled together, so you’ll be able to blink but not wink; winking is planned for a future SDK update.
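For tinkerers on PC, the OSC route means eye data can be forwarded to VRChat with any OSC library. The sketch below uses Python’s python-osc package; the two addresses follow VRChat’s documented OSC eye-tracking endpoints at the time of writing, but verify them against the current docs before relying on this:

```python
from pythonosc.udp_client import SimpleUDPClient

# VRChat listens for OSC input on localhost:9000 by default.
client = SimpleUDPClient("127.0.0.1", 9000)

def send_gaze(pitch_deg, yaw_deg, eyes_closed):
    """Forward one eye-tracking sample to VRChat over OSC."""
    # Combined gaze direction as pitch/yaw in degrees.
    client.send_message("/tracking/eye/CenterPitchYaw", [pitch_deg, yaw_deg])
    # 0.0 = fully open, 1.0 = fully closed (drives blinking).
    client.send_message("/tracking/eye/EyesClosedAmount", eyes_closed)

send_gaze(-5.0, 12.0, 0.0)  # looking slightly down and to the right, eyes open
```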

PSVR 2 Horror Shooter ‘Switchback’ Shows Off Unique Eye-tracking Uses in New Video

Don’t blink, because PSVR 2’s eye-tracking may get you more than you bargained for in the headset’s upcoming on-rails horror shooter The Dark Pictures: Switchback VR, which aims to toss some extra scares your way when you least suspect it.

PSVR 2 is releasing on February 22nd, and in its 100+ game content lineup is a unique horror game from the makers of Until Dawn: Rush of Blood, which tosses you back into another rollercoaster thrill ride and arms you with plenty of guns to fend off whatever goes bump in the night.

Besides bringing high-quality VR to PS5, Sony’s next-gen headset also packs in eye-tracking, which many games are using for easier UI selection and foveated rendering—useful, but not terribly exciting stuff.

Some developers though, including Supermassive Games, are integrating the feature into their core gameplay loop, which in Switchback’s case allows enemies to move around specifically when your eyes are closed.

In a new gameplay video, Supermassive shows off the feature as it plays out beyond the big ‘DON’T BLINK’ doors, revealing a room full of grotesque mannequins which only move when you blink—and they’re entirely focused on attacking you if they can.

Alejandro Arque Gallardo, Game Director at Supermassive, says there’s also set to be another mannequin type that uses eye-tracking but, cryptically, will work in “a completely different way.”

We’ve linked to the timestamp (above) where Arque Gallardo discusses Switchback’s eye-tracking mechanic. The full video also delves into haptics, adaptive triggers, spatial audio, and the multiple areas you can encounter in the game.

The Dark Pictures: Switchback VR is launching on March 16th, priced at $40. You can pre-order the game here. In the meantime, make sure to check out our growing list of all confirmed games coming to PSVR 2.

PlayStation VR2 Eye Tracking Will Let You Wink At People In VR

PSVR 2’s eye tracking can detect each eye blinking, letting you wink in VR.

Details of the eye tracking capabilities were unveiled during Unity’s GDC 2022 talk on PlayStation VR2.

As well as tracking what you’re looking at – your gaze – the talk revealed that PSVR 2 can also track your pupil diameter and per eye blink states.

Human pupil diameter expands in darkness and contracts in brightness, but it also changes based on your emotional state. Theoretically, developers could use this to gauge whether you’re having a strong emotional response to what you’re experiencing. Applying pupil diameter to an avatar could change the nature of social VR, and the per-eye blink states could enable winking in social spaces. Additionally, developers of horror games might be able to choose the moment to trigger a scare in a far more targeted way. We asked Sony to comment on its approach to data collection of pupil diameter and blink state and will update this post if we hear back.
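As a sketch of how developers might consume those signals (using a hypothetical sample format, since Sony’s actual runtime API is only available under its developer program), per-eye openness separates winks from blinks, and a rolling baseline can flag pupil-diameter spikes:

```python
from collections import deque

def classify_eyes(left_open, right_open, threshold=0.2):
    """Classify one frame of per-eye openness (0 = closed, 1 = open)."""
    left_closed = left_open < threshold
    right_closed = right_open < threshold
    if left_closed and right_closed:
        return "blink"
    if left_closed or right_closed:
        return "wink"
    return "open"

class PupilResponseDetector:
    """Flags pupil-diameter spikes against a rolling baseline. Pupil size
    also varies with scene brightness, so a real system would have to
    factor out luminance; this only shows the baseline-deviation idea."""
    def __init__(self, window=300, ratio=1.15):
        self.history = deque(maxlen=window)   # last few seconds of samples
        self.ratio = ratio

    def update(self, diameter_mm):
        self.history.append(diameter_mm)
        baseline = sum(self.history) / len(self.history)
        return diameter_mm > baseline * self.ratio  # possible arousal spike

print(classify_eyes(0.05, 0.9))  # "wink"
```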

The flagship capability of eye tracking though is, of course, foveated rendering. This means the system renders at full resolution only where you’re currently looking, freeing up performance since the rest of your view is rendered at lower resolution. That extra performance can be used to improve the graphical fidelity of the environment or port large-scale flatscreen games to VR without needing to make lower quality assets. You can read about the specific performance advantages here.

PSVR 2 Foveated Rendering Provides 3.6x Faster Performance – Unity

As reported by Michael Hicks of Android Central, a Unity panel at GDC last week revealed new details about PSVR 2, including ways developers will be able to harness eye tracking in their experiences and the performance benefits gained from doing so.

There was no full PSVR 2 reveal at GDC from Sony this year, but as we reported last week, they were showing the headset to developers at the conference alongside the Unity talk. We weren’t at the talk ourselves so we can’t verify the data, but hopefully it’ll be posted online later.

The big focus seemed to be on detailing the headset’s eye-tracking. As previously reported, PSVR 2’s eye-tracking will be able to provide foveated rendering. This is when an experience uses eye-tracking data to only fully render areas of the screen that the user is directly looking at, whilst areas in your peripheral vision aren’t fully realized. This can greatly improve performance if done right.

According to Android Central, the Unity talk revealed that GPU frame times are 3.6x faster when using foveated rendering with eye-tracking on PSVR 2, or just 2.5x faster when using foveated rendering alone (which presumably only reduces detail at the fixed edges of the headset’s field of view, a technique commonly used on Quest headsets).

Running the VR Alchemy Lab demo with dynamic lighting and shadows on PSVR 2, frame time reportedly dropped from 33.2ms to 14.3ms. In another demo — a 4K spaceship demo — CPU thread performance was 32% faster and GPU frame time went down from 14.3ms to 12.5ms.

Moving beyond performance, developers also outlined the various ways eye tracking can be implemented into experiences on PSVR 2. The headset will be able to track “gaze position and rotation, pupil diameter, and blink states.”

This means you will be able to magnify what a player is looking at — particularly useful for UI design — or use eye data to make sure the game grabs the right item when the player is looking at something they want to pick up.

Developers will also be able to tell if the player is staring at an NPC or even winking at them, allowing them to program custom responses from NPCs to those actions. Likewise, eye tracking can be used for aim assist when throwing an item, for example, so that the item is course-corrected and thrown in a direction closer to the player’s intention, based on their gaze.
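One simple way to implement that kind of gaze-based aim assist is to nudge the throw direction toward the gaze ray while preserving throw speed; the blend strength below is illustrative, not a value from any shipped game:

```python
import numpy as np

def assisted_throw(throw_velocity, gaze_dir, assist=0.3):
    """Blend the physical throw direction toward the gaze direction.
    `assist` (0..1) is a design-tunable strength; speed is preserved."""
    speed = np.linalg.norm(throw_velocity)
    throw_dir = throw_velocity / speed
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    blended = (1 - assist) * throw_dir + assist * gaze_dir
    blended /= np.linalg.norm(blended)
    return blended * speed

v = assisted_throw(np.array([0.2, 1.0, 1.0]), np.array([0.0, 0.7, 1.0]))
print(v)  # same speed, direction pulled toward the gaze ray
```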

Eye-tracking will also mean more realistic avatars in social experiences, and the ability to create ‘heat maps’ of players’ gazes while playtesting games. This would let developers iterate on puzzles and environments to improve the experience based on eye-tracking data.
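A playtest heat map like that can be as simple as accumulating dwell time into a coarse screen-space grid. The sketch below assumes gaze points arrive as normalized (u, v) coordinates in the 0..1 range:

```python
import numpy as np

class GazeHeatmap:
    """Accumulates gaze samples (u, v in 0..1) into a coarse grid,
    weighted by how long the player dwelled on each spot."""
    def __init__(self, width=64, height=36):
        self.grid = np.zeros((height, width))

    def record(self, u, v, dwell_seconds):
        x = min(int(u * self.grid.shape[1]), self.grid.shape[1] - 1)
        y = min(int(v * self.grid.shape[0]), self.grid.shape[0] - 1)
        self.grid[y, x] += dwell_seconds

    def normalized(self):
        peak = self.grid.max()
        return self.grid / peak if peak > 0 else self.grid

heatmap = GazeHeatmap()
heatmap.record(0.5, 0.5, 0.25)   # a quarter-second fixation at screen center
```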

Last month, it was reported that Tobii was “in negotiation” to supply PSVR 2’s eye-tracking technology.

In non-eye tracking news, it was also confirmed at the Unity talk that developers will be able to create asymmetric multiplayer experiences for PSVR 2, where one player is playing in VR while others play using the TV. This was something we saw used in the original PSVR, too.

For more info on PSVR 2, check out our article with everything we know so far.

Tobii Negotiating With Sony to Supply PlayStation VR2’s Eye Tracking

In a rather surprising statement today, eye-tracking specialist Tobii has revealed that it is currently talking to Sony Interactive Entertainment (SIE) in regards to supplying its tech for PlayStation VR2.

It was certainly a very short but sweet statement, saying: “Tobii AB, the global leader in eye tracking and pioneer of attention computing, announces it is currently in negotiation with Sony Interactive Entertainment (“SIE”) to be the eye tracking technology provider in SIE’s new VR headset, PlayStation VR2 (PS VR2).”

Tobii added that the negotiations “are ongoing,” with no details to share regarding how a deal would financially impact the company. The only reason the details have been shared is due to EU Market Abuse Regulations.

PlayStation VR2 was initially teased in early 2021, with eye-tracking confirmed by Sony during its CES 2022 presentation. What makes the reveal so surprising is the timing. Whilst SIE hasn’t yet confirmed a launch date, there have been suggestions it could be this year. If the eye tracking component is still being negotiated then that could mean waiting even longer for PlayStation VR2. Key features like this are usually settled upon well in advance so videogame companies know what they’re working with.

PlayStation 5 VR Controller

Even so, having Tobii on board would mean the eye tracking is in very good hands. Tobii already supplies the likes of the HP Reverb G2 Omnicept Edition and the Pico Neo 2 Eye, both of which are enterprise-level VR headsets.

Eye-tracking is commonly used to enhance VR experiences, allowing avatars to be more expressive or, thanks to foveated rendering, reducing the processing workload for improved visuals. PlayStation VR2 will utilise all of these features. SIE recently launched a new landing page for the headset where you can delve into its capabilities and sign up for more info. Unfortunately, there’s no picture of the headset just yet.

As further details of this announcement come to light, gmw3 will keep you updated.

Update: PSVR 2 to Include Tech from the Biggest Name in Eye Tracking

Tobii, a global leader in eye-tracking, announced earlier this year that it was in talks with Sony to include its tech in the upcoming PlayStation VR2. Now the company has confirmed its eye-tracking is integrated in PSVR 2.

Update (July 1st, 2022): Tobii has officially announced it is a key manufacturer of PSVR 2’s eye-tracking tech. The company says in a press statement that it will receive upfront revenue as part of the deal starting in 2022, which is expected to represent more than 10% of Tobii’s revenue for the year.

“PlayStation VR2 establishes a new baseline for immersive virtual reality (VR) entertainment and will enable millions of users across the world to experience the power of eye tracking,” said Anand Srivatsa, Tobii CEO. “Our partnership with Sony Interactive Entertainment (SIE) is continued validation of Tobii’s world-leading technology capabilities to deliver cutting-edge solutions at mass-market scale.”

The original article follows below:

Original Article (February 7th, 2022): Tobii released a short press statement today confirming that negotiations are ongoing, additionally noting that it’s “not commenting on the financial impact of the deal at this time.”

It was first revealed that Sony would include eye-tracking in PSVR 2 back in May 2021, with the mention that it will provide foveated rendering for the next-gen VR headset. Foveated rendering allows the headset to render scenes in high detail exactly where you’re looking and not in your peripheral. That essentially lets PSVR 2 save precious compute power for more and better things.

Founded in 2001, Tobii has become well known in the industry for its eye-tracking hardware and software stacks. The Sweden-based firm has partnered with VR headset makers over the years and can be found in a number of devices, such as HTC Vive Pro Eye, HP Reverb G2 Omnicept Edition, Pico Neo 2 Eye, Pico Neo 3 Pro Eye, and a number of Qualcomm VRDK reference designs.

It’s still unclear when PSVR 2 is slated to arrive, although it may be positioned to become the first true commercial VR headset to feature eye-tracking—that’s if PSVR 2 isn’t beaten out by Project Cambria, the rumored ‘Quest Pro’ headset from Meta which is also said to include face and eye-tracking.

Vive Pro 2, Focus 3 Getting Eye-Tracking Add-On, Focus Getting Hand-Tracking

Neither the HTC Vive Pro 2 nor the Vive Focus 3 feature integrated eye-tracking, but support will apparently come via an add-on device in the future.

As announced by Vive China President Alvin Wang Graylin, both headsets will be compatible with the new Droolon F2 Eye-Tracking module that’s set to ship later this year. You may have heard that name before – in 2019 we reported that the first generation of the device, created by Chinese startup 7invensun, would support the Vive Cosmos.


Don’t hold your breath for a big western launch, though. Both the Droolon F1 and even the add-on the company made for the original Vive in 2017 were promised as worldwide releases but we never really saw them reach North America or Europe in a meaningful way. HTC itself didn’t mention the device in any of our pre-briefings, so we’re not likely to see the kit make a big splash in the west.

That said, the device does have US pricing – the Droolon F2 is shipping in Q3 for $299. That’s double the price of the F1.

It’s curious that Vive Pro 2 itself doesn’t have integrated eye-tracking given that the previous version of the device, the Vive Pro Eye, did. Eye-tracking has several uses for VR headsets, including foveated rendering that can improve performance by only fully rendering the part of the display you’re directly looking at. That said, it’s also true that there’s little in the way of consumer-level software that integrates the feature right now.

On Monday, we reported that the new PlayStation 5 VR headset will have its own form of gaze tracking with foveated rendering. There’s also speculation a possible Quest Pro device could integrate the feature too.

Pico Announces Neo 3 Pro, Neo 3 Pro Eye For Enterprise Market

Pico Interactive announced two new headsets this week — the Pico Neo 3 Pro and the Pico Neo 3 Pro Eye, both aimed at an enterprise market and available soon in North America and Europe.

The two new headsets follow from the launch of the standard consumer-focused Pico Neo 3 headset last month, which is available exclusively in China. Pico promised that headsets aimed at an enterprise market would follow, and the Neo 3 Pro and Neo 3 Pro Eye are just that.

Pico Neo 3 Pro
The Pico Neo 3 Pro

Like the Neo 3, they are both powered by the Snapdragon XR2 platform and now feature optical 6DoF controller tracking, as opposed to the electromagnetic controller tracking seen in the Neo 2 Pro and Neo 2 Pro Eye. Both headsets feature a single 5.5″ display with a resolution of 3664 x 1920 and a 90Hz refresh rate. The field of view (FOV) on both headsets is ever so slightly narrower than the last generation, down to 98 degrees from 101 in the Neo 2 headsets. However, while the Neo 2 generation had a fixed IPD, the Neo 3 Pro and Neo 3 Pro Eye offer adjustable IPD settings of 58mm, 63.5mm and 69mm.

Just like the standard Neo 3 model, both the Pro and the Pro Eye feature upgraded WiFi 6 capabilities, along with improved guardian support thanks to the increase to 4 cameras from only 2 last generation. The Neo 2 headsets only supported a 3x3m guardian, but the Neo 3 Pro and Pro Eye now support a 10x10m space.

Pico Neo 3 Pro Eye
The Pico Neo 3 Pro Eye

Continuing the partnership from the Neo 2 generation, the Neo 3 Pro Eye will feature built-in eye-tracking capabilities from Tobii.

Both headsets offer DisplayPort and NVIDIA’s Direct Mode support, which promises streaming of PC VR content at 4K 90Hz via Pico VR Streaming when connected to a VR-ready computer via DisplayPort.

In terms of pricing, the Neo 3 Pro sits at $699 USD and the Neo 3 Pro Eye at $899 USD. Both headsets will be available to pre-order soon for enterprise customers on the Pico Interactive website.

HP Reverb G2 Omnicept Will Cost $1,249 in May

HP Reverb G2 Omnicept Edition

HP is really the one company carrying the torch for Microsoft’s Windows Mixed Reality (WMR) system, having launched the HP Reverb G2 at the end of 2020. It’s not stopping there, though. After announcing an enterprise-focused model called the Reverb G2 Omnicept in September, today HP has announced it’ll be available in May priced at $1,249 USD.

HP Reverb G2 Omnicept Edition

That price might sound a little steep, but it’s not when compared to other pro-level headsets and how much HP has packed into the device. There are a lot of sensors, all designed to help provide developers and companies with data-driven insight depending on their requirements. Built into the HP Reverb G2 Omnicept Edition are sensors for eye tracking, heart rate monitoring, facial movements, and even pupil dilation.

All of this can be used for any number of use cases, from training scenarios that look at how users cope in particular situations, to mental health and well-being monitoring, to creating more realistic, immersive experiences where co-workers can collaborate with expressive avatars.

For creators, alongside the hardware launch, HP will also release the Omnicept software development kit (SDK) in four options depending on the organisation and planned use. HP Omnicept SDK ‘Core’ will be free but won’t offer the Inference Engine SDK, whilst the Academic version is free for educational use or a 2% revenue share for for-profit use. The Developer Edition of the SDK is a flat 2% revenue share, with Enterprise Edition pricing tailored to the company. The Core version also has a couple of extra exclusions: the Pulse Rate Variability API and HP VR Spatial Audio are omitted. The latter uses dynamic head-related transfer functions (HRTFs) to create personalised sound for a more immersive experience.

HP Reverb G2 Omnicept Edition

As previously reported, the HP Reverb G2 Omnicept Edition’s other specs still mirror those of its consumer cousin, with a 2160×2160 per-eye resolution, Valve’s off-ear headphones, four cameras for inside-out tracking, a 90Hz refresh rate, a 114-degree FOV, and a 6 meter cable for plenty of freedom to move around.

The HP Reverb G2 Omnicept Edition will be available to order through HP’s own website in May. If you’re just looking for a decent PC VR headset for gaming then the standard HP Reverb G2 goes for $599. For further updates on HP’s VR plans, keep reading VRFocus.