Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking inside a VR headset—is often discussed in the context of foveated rendering and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most people think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason people overestimate the foveal region of their vision seems to be that the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

SEE ALSO
Abrash Spent Most of His F8 Keynote Convincing the Audience That 'Reality' is Constructed in the Brain

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region the fovea sees, and drastically cutting down the complexity of the scene in our peripheral vision, where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution and field-of-view of XR headsets increase, the power needed to render complex scenes grows quickly.

Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times, quickly and with high precision, in order to pull off foveated rendering. While it’s difficult to do this without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.
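To make the idea concrete, here is a minimal sketch of how a renderer might map a pixel’s angular distance from the gaze point to a shading rate. The thresholds, pixels-per-degree figure, and rate tiers below are made-up illustrative values, not those of any shipping headset:

```python
import math

def shading_rate(pixel, gaze, fovea_deg=5.0, mid_deg=15.0, px_per_deg=20.0):
    """Pick a render resolution scale for a pixel based on its angular
    distance (eccentricity) from the user's gaze point, both given in
    screen pixels. All thresholds here are illustrative assumptions."""
    eccentricity = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1]) / px_per_deg
    if eccentricity <= fovea_deg:
        return 1.0       # foveal region: shade every pixel
    elif eccentricity <= mid_deg:
        return 0.25      # near periphery: one shaded sample per 4 pixels
    else:
        return 0.0625    # far periphery: one shaded sample per 16 pixels
```

In a real engine this decision happens per tile on the GPU (e.g. via variable rate shading), not per pixel on the CPU, but the mapping from eccentricity to detail level is the same idea.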

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (interpupillary distance—the distance between one’s pupils). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users whose IPD is outside the range supported by the headset.

In more advanced headsets, this process can be invisible and automatic: the headset measures IPD behind the scenes and a motorized mechanism moves the lenses into the correct position without the user needing to be aware of any of it, as on the Varjo Aero, for example.
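As a sketch of how such a measurement could work, assuming a hypothetical tracker API that reports each pupil’s center position in millimetres in headset space (the adjustment range below is likewise an assumed value):

```python
import math

def measure_ipd(samples):
    """Estimate IPD (in mm) by averaging the distance between the left
    and right pupil centers over a series of eye-tracker samples.
    Each sample is a pair of (x, y, z) positions, one per eye."""
    distances = [math.dist(left, right) for left, right in samples]
    return sum(distances) / len(distances)

def check_ipd(ipd_mm, min_mm=58.0, max_mm=72.0):
    """Flag users whose IPD falls outside a headset's (hypothetical)
    mechanical adjustment range."""
    return min_mm <= ipd_mm <= max_mm
```

Averaging over many samples smooths out per-frame tracker noise; a motorized headset would then drive the lens separation toward the measured value.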

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
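That coupling is geometric: both cues are functions of the same fixation distance, which is why they normally stay locked together. A small worked example (assuming a typical 64 mm IPD):

```python
import math

def vergence_deg(distance_m, ipd_m=0.064):
    """Total vergence angle (degrees) when fixating a point straight
    ahead: each eye rotates inward by atan((IPD/2) / distance), so the
    angle between the two gaze rays is twice that."""
    return 2 * math.degrees(math.atan((ipd_m / 2) / distance_m))

def accommodation_dioptres(distance_m):
    """Accommodation demand in dioptres: the reciprocal of the
    fixation distance in metres."""
    return 1.0 / distance_m
```

Fixating something 30 cm away demands roughly a 12-degree vergence angle and about 3.3 dioptres of accommodation; at 6 m, both fall to nearly zero. Because each value determines the other, the brain can treat them as one reflex.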

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes into conflict with vergence in such headsets, which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There are a number of approaches to varifocal displays, perhaps the simplest of which is an optical system where the display is physically moved back and forth relative to the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
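A minimal sketch of that ray-intersection step, reduced to a top-down 2D view for clarity. The eye positions and gaze directions are illustrative; a real tracker reports 3D rays, and because noisy rays rarely intersect exactly, production systems find the closest point between them instead:

```python
def focal_depth(eye_l, gaze_l, eye_r, gaze_r):
    """Estimate fixation depth from two gaze rays in the horizontal
    plane (top-down view). Each eye is an (x, z) position with the user
    looking along +z; each gaze direction is a (dx, dz) vector.
    The rays' intersection gives the depth the user is looking at."""
    (xl, zl), (dxl, dzl) = eye_l, gaze_l
    (xr, zr), (dxr, dzr) = eye_r, gaze_r
    denom = dxl * dzr - dzl * dxr       # 2D cross product of the directions
    if abs(denom) < 1e-9:
        return float('inf')             # parallel rays: gaze at infinity
    t = ((xr - xl) * dzr - (zr - zl) * dxr) / denom
    iz = zl + t * dzl                   # z of the intersection point
    return iz - (zl + zr) / 2           # depth in front of the eyes
```

The resulting depth would then drive the display actuator (or lens) to the matching focal plane.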

SEE ALSO
Oculus on Half Dome Prototype: 'don't expect to see everything in a product anytime soon'

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.
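One simple way to approximate this, sketched below: blur each object in proportion to the difference in dioptres (1/metres) between its distance and the eye-tracked focal distance. The aperture and clamp constants are made-up tuning values, not any headset’s real parameters:

```python
def blur_radius(obj_dist_m, focal_dist_m, aperture=4.0, max_blur=12.0):
    """Approximate per-object blur (in pixels) for simulated depth of
    field: objects at the eye-tracked focal distance stay sharp, and
    blur grows with the dioptre difference from the focal plane."""
    dioptre_diff = abs(1.0 / obj_dist_m - 1.0 / focal_dist_m)
    return min(aperture * dioptre_diff, max_blur)
```

Working in dioptres rather than metres matches how the eye behaves: a half-metre error matters enormously up close but is imperceptible at a distance.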

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram high-resolution pixels across our entire field-of-view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.
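Conceptually, the compositing step of a dynamic foveated display can be sketched as pasting a pixel-dense inset over a low-detail background at the gaze position. This is a toy stand-in for what Varjo’s hardware does optically, with images as plain 2D lists of pixel values:

```python
def composite_foveated(bg, inset, gaze_xy):
    """Paste a small pixel-dense inset over a larger low-detail
    background image, centred on the gaze position. Both images are
    2D lists of pixel values; gaze_xy is (x, y) in background pixels."""
    h, w = len(inset), len(inset[0])
    top, left = gaze_xy[1] - h // 2, gaze_xy[0] - w // 2
    out = [row[:] for row in bg]                 # copy, leave bg untouched
    for y in range(h):
        for x in range(w):
            by, bx = top + y, left + x
            if 0 <= by < len(out) and 0 <= bx < len(out[0]):
                out[by][bx] = inset[y][x]        # inset wins where it overlaps
    return out
```

In a dynamic system the `gaze_xy` input would update every frame from the eye tracker, steering the high-resolution region to follow the fovea.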

Continued on Page 2: Better Social Avatars »

PSVR 2 Will Use Tobii Eye Tracking, Company Confirms

The upcoming PlayStation VR2 headset will use eye tracking from specialist group, Tobii.

Tobii itself confirmed the news in a press release today. This follows on from a February 2022 announcement that the company was “in negotiation” with Sony to be integrated into the new headset.

PSVR 2 — which doesn’t yet have a release date — is expected to use eye tracking for a variety of applications, including foveated rendering. This refers to a technique that tracks the user’s gaze and then fully renders only the exact center of where they’re looking. Areas in the peripheral vision aren’t fully rendered — a process that’s intended to be imperceptible to the user — reducing the overall demand on the system powering the VR experience.

In other words, this could help PS5 VR games run and look better, though we’re yet to see the feature in action.

For years now Tobii has developed eye tracking hardware that’s already been integrated into other VR headsets. In 2018, for example, the company worked with Qualcomm to develop a standalone VR reference design with eye-tracking. In 2019, Tobii technology was integrated into the HTC Vive Pro Eye. More recently it was integrated into the HP Reverb G2 Omnicept Edition and the Pico Neo 3 Pro Eye.

Tobii says it expects the deal to represent “more than 10%” of its revenue in 2022.

While we still don’t know exactly when PSVR 2 is launching, reports are pointing towards an early 2023 release for the device and today’s announcement does suggest that Sony might only just be assembling all the necessary components to mass produce the device. This would be in line with a report from prominent supply chain analyst, Ming-Chi Kuo, who noted PSVR 2 would begin mass production in H2 2022, speculating this would mean a Q1 2023 launch. As of today it’s H2 2022.

You can keep up with everything we currently know about PSVR 2 right here.

Tobii Could Supply Eye Tracking for PSVR 2

According to Tobii, the company is currently in negotiations with Sony over an eye tracking module for PSVR 2.

PSVR 2

Tobii has years of expertise in providing eye tracking solutions for VR headsets. HP, Pico and HTC have relied on Tobii for years, and now Sony could soon join the customer list. Since HP, Pico and HTC offer their eye-tracking only to enterprise customers, the deal with Sony could become Tobii’s biggest VR deal yet.

Today’s press release reveals that negotiations are underway between Tobii and Sony. These negotiations are to determine whether Tobii will become the supplier of eye tracking for PlayStation VR 2. The release states: “This information is information that Tobii AB (publ) is obliged to make public pursuant to the EU Market Abuse Regulation.”

The intriguing question is why negotiations are still happening at all. After all, there is already a product page for PSVR 2, and its release does not appear to be far off. Sony may have decided on short notice not to rely on an in-house eye tracking solution.

We hope the new headset arrives in time to sit under the Christmas tree this year. As soon as there is new information from Sony, you will find it on our blog.

(Source: Upload VR)

The post Tobii Could Supply Eye Tracking for PSVR 2 was first seen on VR∙Nerds.

Tobii Negotiating With Sony to Supply PlayStation VR2’s Eye Tracking

In a rather surprising statement today, eye-tracking specialist Tobii has revealed that it is currently talking to Sony Interactive Entertainment (SIE) in regards to supplying its tech for PlayStation VR2.

It was certainly a very short but sweet statement, saying: “Tobii AB, the global leader in eye tracking and pioneer of attention computing, announces it is currently in negotiation with Sony Interactive Entertainment (“SIE”) [to] be the eye tracking technology provider in SIE’s new VR headset, PlayStation VR2 (PS VR2).”

Tobii added that the negotiations “are ongoing”, with no details to share regarding how a deal would financially impact the company. The only reason the details have been shared is due to EU Market Abuse Regulations.

PlayStation VR2 was initially teased in early 2021, with eye-tracking confirmed by Sony during its CES 2022 presentation. What makes the reveal so surprising is the timing. Whilst SIE hasn’t yet confirmed a launch date, there have been suggestions it could be this year. If the eye tracking component is still being negotiated then that could mean waiting even longer for PlayStation VR2. Key features like this are usually settled upon well in advance so videogame companies know what they’re working with.

PlayStation 5 VR Controller

Even so, having Tobii on board would mean the eye tracking is in very good hands. Tobii already supplies the likes of the HP Reverb G2 Omnicept Edition and the Pico Neo 2 Eye, both of which are enterprise-level VR headsets.

Eye-tracking is commonly used to enhance VR experiences, allowing avatars to be more expressive or, thanks to foveated rendering, reducing the processing workload for improved visuals. PlayStation VR2 will utilise all of these features. SIE recently launched a new landing page for the headset where you can delve into its capabilities and sign up for more info. Unfortunately, there’s no picture of the headset just yet.

As further details of this announcement come to light, gmw3 will keep you updated.

Tobii ‘In Negotiation’ To Supply PlayStation VR2’s Eye Tracking Tech

Tobii says it’s “currently in negotiation” with Sony to be the eye tracking tech provider for the PSVR2 headset.

While a few VR headset companies like Varjo and XTAL use in-house eye tracking technology, all others so far use Tobii’s. It’s present in HTC Vive Pro Eye, HP Reverb G2 Omnicept Edition, Pico Neo 2 Eye, and Pico Neo 3 Pro Eye.

Sony first confirmed a new Playstation VR headset was in development a year ago, and in May reliable sources told us the headset would feature eye tracking capable of foveated rendering. Sony officially revealed specifications for PlayStation VR2 last month, including eye tracking and foveated rendering.

Today’s press release reveals that “negotiations are ongoing” between Tobii and Sony to become the eye tracking supplier for PSVR2. But the release states “This information is information Tobii AB (publ) is obliged to make public pursuant to the EU Market Abuse Regulation”, suggesting this potential partnership is being announced not for marketing or public relations purposes, but due to this regulatory requirement.

Sony hasn’t said anything about PSVR2’s release date, but it has announced the full specifications (including the eye tracking), revealed the controllers, and even opened signups to be notified when it’s available to pre-order. Given that every PlayStation console and the original PSVR headset launched late in the year (with the exception of PS2 which released early in Japan), it’s reasonable to assume PSVR2 could launch this holiday season. So the obvious question here is: why is this partnership still being negotiated so late in the product’s development?

Sony & Tobii could simply still be working out the exact terms of the deal – this would very likely be Tobii’s largest deal ever, but there is another possibility. Everyone’s eyes are slightly different, and the task of tracking eyes with extremely high precision and low latency gets progressively harder as companies tackle the outliers. No company yet claims eye tracking good enough to cover 100% of the population with the precision needed for foveated rendering. Sony has been working on eye tracking since at least 2014, and the tech has even shown up in its patent filings. Was Sony intending to use in-house tech but discovered late in wide scale user testing that it isn’t capable of tracking the wide variety of people it wants to target with PSVR2? Tobii told us in an interview that even after its two decades of operation and experience shipping its tech in real headsets, it targets 95%-98% population coverage.

Regardless of who supplies the technology, eye tracking should let PlayStation VR2 offer enhanced graphics with foveated rendering, more precise throwing and targeting mechanics, new NPC interactions, and enable true eye contact in multiplayer games – all of which will push VR forwards.

Update: PSVR 2 to Include Tech from the Biggest Name in Eye Tracking

Tobii, a global leader in eye-tracking, announced earlier this year that it was in talks with Sony to include its tech in the upcoming PlayStation VR2. Now the company has confirmed its eye-tracking is integrated in PSVR 2.

Update (July 1st, 2022): Tobii has officially announced it is a key manufacturer of PSVR 2’s eye-tracking tech. The company says in a press statement that it will receive upfront revenue as a part of this deal starting in 2022 and revenue from this deal is expected to represent more than 10% of Tobii’s revenue in 2022.

“PlayStation VR2 establishes a new baseline for immersive virtual reality (VR) entertainment and will enable millions of users across the world to experience the power of eye tracking,” said Anand Srivatsa, Tobii CEO. “Our partnership with Sony Interactive Entertainment (SIE) is continued validation of Tobii’s world-leading technology capabilities to deliver cutting-edge solutions at mass-market scale.”

The original article follows below:

Original Article (February 7th, 2022): Tobii released a short press statement today confirming that negotiations are ongoing, additionally noting that it’s “not commenting on the financial impact of the deal at this time.”

It was first revealed that Sony would include eye-tracking in PSVR 2 back in May 2021, with the mention that it will provide foveated rendering for the next-gen VR headset. Foveated rendering allows the headset to render scenes in high detail exactly where you’re looking and not in your peripheral. That essentially lets PSVR 2 save precious compute power for more and better things.

Founded in 2001, Tobii has become well known in the industry for its eye-tracking hardware and software stacks. The Sweden-based firm has partnered with VR headset makers over the years and can be found in a number of devices, such as HTC Vive Pro Eye, HP Reverb G2 Omnicept Edition, Pico Neo 2 Eye, Pico Neo 3 Pro Eye, and a number of Qualcomm VRDK reference designs.

It’s still unclear when PSVR 2 is slated to arrive, although it may be positioned to become the first true commercial VR headset to feature eye-tracking—that’s if PSVR 2 isn’t beaten out by Project Cambria, the rumored ‘Quest Pro’ headset from Meta which is also said to include face and eye-tracking.

Pico Announces Neo 3 Pro, Neo 3 Pro Eye For Enterprise Market

Pico Interactive announced two new headsets this week — the Pico Neo 3 Pro and the Pico Neo 3 Pro Eye, both aimed at an enterprise market and available soon in North America and Europe.

The two new headsets follow from the launch of the standard consumer-focused Pico Neo 3 headset last month, which is available exclusively in China. Pico promised that headsets aimed at an enterprise market would follow, and the Neo 3 Pro and Neo 3 Pro Eye are just that.

The Pico Neo 3 Pro

Like the Neo 3, they are both powered by the Snapdragon XR2 platform and now feature optical 6DoF controller tracking, as opposed to the electromagnetic controller tracking seen in the Neo 2 Pro and Neo 2 Pro Eye. Both headsets feature a single 5.5″ display with a resolution of 3664 x 1920 and a 90Hz refresh rate. The field of view (FOV) on both headsets is ever so slightly lower than the last generation, down to 98 degrees from 101 in the Neo 2 headsets. However, while the Neo 2 generation had a fixed IPD, the Neo 3 Pro and Neo 3 Pro Eye offer adjustable IPD settings at 58mm, 63.5mm and 69mm.

Just like the standard Neo 3 model, both the Pro and the Pro Eye feature upgraded WiFi 6 capabilities, along with improved guardian support thanks to the increase to 4 cameras from only 2 last generation. The Neo 2 headsets only supported a 3x3m guardian, but the Neo 3 Pro and Pro Eye now support a 10x10m space.

The Pico Neo 3 Pro Eye

Continuing the partnership from the Neo 2 generation, the Neo 3 Pro Eye will feature built-in eye-tracking capabilities from Tobii.

Both headsets offer DisplayPort and NVIDIA’s Direct Mode support, which promises streaming of PC VR content at 4K 90Hz via Pico VR Streaming when connected to a VR-ready computer via DisplayPort.

In terms of pricing, the Neo 3 Pro sits at $699 USD and the Neo 3 Pro Eye at $899 USD. Both headsets will be available to pre-order soon for enterprise customers on the Pico Interactive website.

HP Reverb G2 Omnicept Will Cost $1,249 in May

HP Reverb G2 Omnicept Edition

HP is really the one company carrying the torch for Microsoft’s Windows Mixed Reality (WMR) system, having launched the HP Reverb G2 at the end of 2020. It’s not stopping there, though. After announcing an enterprise-focused model called the Reverb G2 Omnicept in September, today HP has announced it’ll be available in May, priced at $1,249 USD.

That price might sound a little steep, but it’s not when you consider other pro-level headsets and the amount HP has packed into the device. There are a lot of sensors, all designed to help provide developers and companies with data-driven insight depending on their requirements. Built into the HP Reverb G2 Omnicept Edition are sensors for eye tracking, heart rate monitoring, facial movements and even pupil dilation.

All of this can be used for any number of use cases: from training scenarios that look at how users cope in particular situations, to mental health and wellbeing monitoring, to creating a more realistic, immersive experience where co-workers can collaborate with expressive avatars.

For creators, alongside the hardware launch, HP will also release the Omnicept software development kit (SDK) in four options depending on the organisation and planned use. HP Omnicept SDK ‘Core’ will be free but won’t offer the Inference Engine SDK, whilst the Academic version is free for educational use or carries a 2% revenue share when used for profit. The Developer Edition of the SDK is a flat 2% revenue share, with Enterprise Edition pricing tailored to the company. The Core version also has a couple of extra exclusions: the Pulse Rate Variability API and HP VR Spatial Audio are omitted. The latter uses dynamic head-related transfer functions (HRTFs) to create personalised sound for a more immersive experience.

As previously reported, the HP Reverb G2 Omnicept Edition’s other specs still mirror its consumer cousin, with a 2160×2160 per eye resolution, Valve’s off-ear headphones, four cameras for inside-out tracking, a 90Hz refresh rate, a 114-degree FOV and a 6 meter cable for plenty of freedom to move around.

The HP Reverb G2 Omnicept Edition will be available to order through HP’s own website in May. If you’re just looking for a decent PC VR headset for gaming then the standard HP Reverb G2 goes for $599. For further updates on HP’s VR plans, keep reading VRFocus.

Tobii, Valve & OpenBCI Collaborate on ‘Galea’ VR Brain-Computer Interface

OpenBCI - Galea

When it comes to interfacing with virtual reality (VR) worlds, currently you really only have the option of physical controllers or very basic hand tracking, unless you have the cash to buy expensive gloves. The future could very well be in brain-computer interfaces (BCI) like Galea, the one Valve, Tobii, and OpenBCI are currently collaborating on, with a beta programme slated to launch early next year.

Valve Index

Brooklyn-based neurotechnology company OpenBCI revealed its Galea hardware and software platform last year, a combination of a mixed reality (MR) headset with state-of-the-art biosensing and brain-computer interfacing (BCI) tech. The device features a wealth of sensors, including electroencephalography (EEG), electrooculography (EOG), electromyography (EMG), electrodermal activity (EDA), and photoplethysmography (PPG), to detect human emotions.

The project jumped into the spotlight last month thanks to New Zealand’s 1 NEWS interview with Valve’s Gabe Newell confirming the company had been working with OpenBCI. “If you’re a software developer in 2022 who doesn’t have one of these in your test lab, you’re making a silly mistake,” Newell said.

The latest news on the project sees Tobii join as another partner, so alongside all the kit above, built into a headset with elements from the Valve Index, it’ll also include eye-tracking.

Valve Brain Interface
Image credit: Mike Ambinder/Valve Corporation

“We are excited to work with Valve and OpenBCI to explore the future of immersive gaming by combining the power of Tobii eye tracking and OpenBCI’s advanced brain computer interface technology,” said Anand Srivatsa, Division CEO of Tobii Tech in a statement.   

BCI could well be built into all future headsets to see when people are happy, sad, surprised or bored; as Newell himself puts it, “simply because there’s too much useful data.”

If you’re a developer interested in how neurotechnology can transform VR experiences then head over to OpenBCI’s Galea beta project website to register. Limited quantities of developer units will be shipped to beta program participants in early 2022. For further updates, keep reading VRFocus.

Pico Neo 2 And Its Eye Tracking Variant Now Available Worldwide

Pico Interactive is making its Neo 2 line of standalone headsets available for purchase worldwide.

The base model is $700 while an eye-tracking variant powered by Tobii is $900.

Pico Neo 2 Controllers (Electromagnetic)

We tried both models at CES in January and while the eye-tracking wasn’t perfect in the early demo, it also worked without calibration and the electromagnetic controller tracking technology was very interesting. The controllers were able to track even when they were behind my back, unlike the kind of tracking used with Facebook’s Oculus Quest.

The Neo 2 headsets run on Qualcomm’s Snapdragon 845 chips, feature an SD expansion slot and are supposed to be able to stream content from a VR Ready PC “over wireless 2X2 MIMO 802.11ac 5G link with a common MIMO 5G router.”

The headsets are primarily pitched toward businesses but may offer an intriguing alternative for some folks looking to step outside the Facebook ecosystem for VR hardware. HTC also offers the Vive Focus Plus priced starting around $800 while Facebook’s Quest starts at $400 but is priced around $1,000 when bundled with features and support tailored toward businesses.

Pico’s Neo 2 Eye version is meant to allow “businesses to gain a deeper understanding of customer behavior, enhance training efficiency, improve productivity and increase overall safety at work,” according to the company. The eye tracking variant is also said to include dynamic foveated rendering to reduce “shading load in some applications” while increasing frame rate.

The headsets are 4K resolution with a 101-degree field of view and weigh 340 grams without the headband. Those specifications are as stated by Pico, and comparing things like resolution and field of view in VR can be especially tricky because there’s no industry standard method for comparing these measurements. Likewise, streaming VR content from a PC to a standalone headset can lead to comfort issues in certain situations depending on a range of conditions, including the amount of traffic on your local area network.

We’ve requested a review unit from Pico so we can test it out and report back with extended hands-on experience.

The post Pico Neo 2 And Its Eye Tracking Variant Now Available Worldwide appeared first on UploadVR.