Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside a VR headset—is often talked about in the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

For many years, eye-tracking was talked about as a distant technology for XR, but the hardware is finally becoming widely available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of the Varjo Aero, Vive Pro Eye, and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is very poor at picking up detail and color, and is better tuned for spotting motion and contrast. You can think of it like a camera with a large sensor that has just a few megapixels, plus a smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most people think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic that, without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason people overestimate the foveal region of their vision seems to be that the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

SEE ALSO
Abrash Spent Most of His F8 Keynote Convincing the Audience That 'Reality' is Constructed in the Brain

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cutting down the complexity of the scene in our peripheral vision, where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution and field-of-view of XR headsets increase, the power needed to render complex scenes grows quickly.

Eye-tracking of course comes into play because, to pull off foveated rendering, we need to know quickly and precisely where the center of the user’s gaze is at all times. While it’s difficult to do this without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.
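To make that concrete, here’s a minimal sketch of the core decision a foveated renderer makes: choose a shading rate for each screen tile from its angular distance to the tracked gaze direction. The tier boundaries below are illustrative; real implementations do this on the GPU via variable rate shading, but the logic is the same.

```python
import math

# Illustrative foveation tiers: (max angle from gaze in degrees, shading rate).
# A rate of 1.0 means full resolution; 0.25 means one shade per 4x4 pixel block.
FOVEATION_TIERS = [
    (5.0, 1.0),     # foveal region: full detail
    (15.0, 0.5),    # near periphery: half rate
    (180.0, 0.25),  # far periphery: quarter rate
]

def shading_rate(tile_dir, gaze_dir):
    """Pick a shading rate for a screen tile given unit view-space
    direction vectors for the tile center and the tracked gaze."""
    dot = sum(a * b for a, b in zip(tile_dir, gaze_dir))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    for max_angle, rate in FOVEATION_TIERS:
        if angle <= max_angle:
            return rate
    return FOVEATION_TIERS[-1][1]

# A tile 20 degrees off-gaze gets shaded at quarter rate:
g = math.radians(20)
print(shading_rate((math.sin(g), 0.0, math.cos(g)), (0.0, 0.0, 1.0)))  # 0.25
```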

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.
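To illustrate the idea (and only that; no shipping headset publicly documents such an API), profile switching could reduce to matching a per-user signature from the tracker, such as an iris feature vector, against enrolled profiles:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms if norms else 0.0

def identify_user(live_signature, profiles, threshold=0.95):
    """Return the enrolled profile that best matches the live signature,
    or None so the system can fall back to a guest profile."""
    best = max(profiles, default=None,
               key=lambda p: cosine_similarity(live_signature, p["signature"]))
    if best and cosine_similarity(live_signature, best["signature"]) >= threshold:
        return best  # load this user's environment, library, progress, settings
    return None
```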

Eye-tracking can also be used to precisely measure IPD (interpupillary distance, the distance between the centers of one’s pupils). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users whose IPD is outside the range supported by the headset.

In more advanced headsets, this process can be automatic and invisible—the headset can measure IPD and use a motorized adjustment to move the lenses into the correct position without the user needing to be aware of any of it, as on the Varjo Aero, for example.
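What would the measurement step look like? A minimal sketch, assuming the eye-tracker reports each pupil center in millimeters in a headset-fixed coordinate frame (the function, field layout, and supported range here are illustrative, not any particular headset’s API):

```python
def check_ipd(left_pupil_mm, right_pupil_mm, supported_range_mm=(58.0, 72.0)):
    """Measure IPD as the horizontal separation of the tracked pupil
    centers, then either suggest an adjustment or warn the user."""
    ipd = abs(right_pupil_mm[0] - left_pupil_mm[0])
    lo, hi = supported_range_mm
    if not lo <= ipd <= hi:
        return ipd, (f"Warning: measured IPD {ipd:.1f}mm is outside the "
                     f"supported {lo:.0f}-{hi:.0f}mm range.")
    return ipd, f"Set lens separation to {ipd:.1f}mm (or drive the motorized adjustment)."

# e.g. pupil centers reported at x = -31.5mm and +31.9mm:
print(check_ipd((-31.5, 0.0, 0.0), (31.9, 0.0, 0.0)))
```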

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well, but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with the finger trick from above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double images of objects far behind your finger. When you then focus on those objects behind your finger, you’ll see a double image of your finger instead.
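The geometry here is easy to put numbers on: for an object straight ahead at distance d, each eye rotates inward by roughly atan((IPD/2)/d), so the total vergence angle falls off quickly with distance. A quick sketch (the 63mm IPD is just a typical value, not a measured one):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Total angle between the two eyes' lines of sight for an
    object straight ahead at the given distance."""
    return 2 * math.degrees(math.atan((ipd_m / 2) / distance_m))

for d in (0.1, 0.5, 2.0, 100.0):
    print(f"{d:>6.1f} m -> {vergence_angle_deg(d):5.2f} deg")
# 0.1m: ~35 deg; 0.5m: ~7.2 deg; 2m: ~1.8 deg; 100m: ~0.04 deg (near parallel)
```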

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).
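To see why accommodation never changes, run the numbers with the thin-lens equation: with the display just inside the eyepiece’s focal length, every pixel is imaged to one fixed virtual distance, and that is the distance your eye must focus on no matter what’s drawn. (The focal length and display distance below are illustrative, not any specific headset’s optics.)

```python
def virtual_image_distance_m(display_dist_m, focal_len_m):
    """Thin-lens equation 1/f = 1/do + 1/di, solved for the image distance.
    Only meaningful when the display sits inside the focal length, in which
    case di comes out negative, i.e. a virtual image at a fixed distance."""
    inv_di = 1.0 / focal_len_m - 1.0 / display_dist_m
    if inv_di == 0:
        return float('inf')  # display exactly at the focal length
    return -1.0 / inv_di     # positive result = virtual image distance

# e.g. a 40mm focal-length lens with the display 38mm away:
print(virtual_image_distance_m(0.038, 0.040))  # ~0.76m, for every pixel drawn
```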

That fixed accommodation comes into conflict with vergence in such headsets, which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There are a number of approaches to varifocal displays, perhaps the simplest of which is an optical system where the display is physically moved back and forth relative to the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
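That intersection step can be sketched directly. In practice the two gaze rays almost never intersect exactly, so implementations typically take the midpoint of their closest approach, a standard closest-point-between-two-lines computation; the sketch below assumes the tracker provides an origin and unit direction per eye:

```python
import numpy as np

def vergence_point(origin_l, dir_l, origin_r, dir_r):
    """Midpoint of closest approach between the two gaze rays; its
    distance from the eyes sets the varifocal display's focal depth."""
    w = origin_l - origin_r
    a, b, c = dir_l @ dir_l, dir_l @ dir_r, dir_r @ dir_r
    d, e = dir_l @ w, dir_r @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:   # rays nearly parallel: user is looking far away
        return None         # fall back to the headset's far-focus setting
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    p_l = origin_l + t_l * dir_l
    p_r = origin_r + t_r * dir_r
    return (p_l + p_r) / 2
```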

SEE ALSO
Oculus on Half Dome Prototype: 'don't expect to see everything in a product anytime soon'

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.
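To sketch the core of such a depth-of-field pass: each pixel’s blur-circle size follows from how far its depth is, in diopters, from the eye-tracked focal distance (the 4mm pupil below is a typical assumed value, not something a headset would necessarily measure):

```python
def defocus_blur_rad(obj_dist_m, focus_dist_m, pupil_mm=4.0):
    """Angular diameter (radians) of the blur circle for an object at
    obj_dist_m when the eye is focused at focus_dist_m: pupil diameter
    times the defocus in diopters. This would scale each pixel's blur
    kernel in a simulated depth-of-field pass."""
    defocus_diopters = abs(1.0 / focus_dist_m - 1.0 / obj_dist_m)
    return (pupil_mm / 1000.0) * defocus_diopters

# Focused at 0.5m, an object 5m away is ~1.8 diopters defocused:
print(defocus_blur_rad(5.0, 0.5))  # ~0.0072 rad, roughly 0.4 degrees of blur
```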

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field of view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to wider fields of view than could otherwise be achieved with a single flat display.

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.
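For a sense of the software side of a dynamic version, here’s an illustrative compositing sketch: paste the pixel-dense inset over the wide image at the gaze point, feathering the border so the seam isn’t visible. (In Varjo’s shipping system the two displays are combined optically; this only illustrates the gaze-driven placement logic, and all names are hypothetical.)

```python
import numpy as np

def composite_foveated(wide_img, inset_img, gaze_px, feather_px=32):
    """Blend the high-resolution inset over the wide image, centered on
    the gaze point, with a linear alpha ramp at the inset's border.
    Assumes the inset lies fully inside the wide frame."""
    h, w = inset_img.shape[:2]
    y0, x0 = int(gaze_px[1]) - h // 2, int(gaze_px[0]) - w // 2
    # Alpha mask: 1.0 in the middle, ramping to 0.0 over feather_px at edges.
    yy = np.minimum(np.arange(h), np.arange(h)[::-1])
    xx = np.minimum(np.arange(w), np.arange(w)[::-1])
    alpha = np.clip(np.minimum.outer(yy, xx) / feather_px, 0.0, 1.0)[..., None]
    out = wide_img.astype(float)
    region = out[y0:y0 + h, x0:x0 + w]
    out[y0:y0 + h, x0:x0 + w] = alpha * inset_img + (1 - alpha) * region
    return out.astype(wide_img.dtype)
```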


Eye Tracking is Coming to Vive Focus 3 & Vive Pro 2 Q3 2021


It’s been a big week for virtual reality (VR) hardware announcements thanks to HTC Vive’s very first ViveCon 2021 event. The all-in-one Vive Focus 3 was finally unveiled after much teasing, whilst the big surprise was the Vive Pro 2, HTC’s new flagship PC headset. While both devices have plenty of tech built in, one thing they don’t have is eye-tracking, which will be made available later in the year as an add-on.

HTC Vive Focus 3

Eye-tracking will come by way of Chinese company 7invensun, whose technology VRFocus has previously come across in Pimax and other headsets. 7invensun’s involvement wasn’t part of the main ViveCon keynote yesterday; instead it was revealed by HTC China President Alvin Wang Graylin, who confirmed that the Droolon F2 eye-tracking module would be available for customers in Q3 2021, priced at $299 USD.

That’s double the price of the company’s previous eye-tracking module for HTC Vive headsets, which retailed at $149. What’s more unusual is that neither headset has eye-tracking natively built in, considering the HTC Vive Pro Eye has been available for several years now, especially where the Vive Focus 3 is concerned. The standalone headset is aimed purely at the enterprise market, which is where eye-tracking would be most applicable, offering analytics and training use cases. The Vive Pro 2 bridges the consumer and enterprise markets, so keeping the cost down is understandable; plus, very little content actually uses eye-tracking.

Graylin’s tweet also mentions that the Droolon F2 eye-tracking module will be available in ‘most markets’. Considering 7invensun’s tech has mostly been aimed at the Chinese market, the launch could be the first time customers in Europe and the US can easily acquire its eye-tracking hardware. Vive Focus 3 and Pro 2 owners will have to wait a few months to find out.

HTC Vive Pro 2

The eye-tracking module adds to the ever-growing list of accessories for the HTC Vive platform. The most recent additions include the Vive Tracker 3.0 and the Vive Facial Tracker. If you want a VR platform with lots of accessory options then HTC Vive has definitely got you covered.

7invensun has a long history with HTC Vive, being one of the startups involved in the Vive X accelerator programme. As further updates on other hardware partners are released, VRFocus will let you know.

HTC’s Vive Cosmos And Vive Focus Are Getting An Eye Tracking Addon

HTC’s Vive Cosmos, Vive Focus, Vive Focus Plus, and Vive are getting an eye tracking addon from Chinese startup 7invensun.

Called Droolon F1, the addon is priced at $149. The startup claims it takes just minutes to install, and adds only 60 grams to the headset (a roughly 10% weight increase). It connects via USB and has two sampling rate options, 120Hz and 240Hz.

This isn’t 7invensun’s first Vive eye tracking addon. Back in April 2017, the company announced a $220 eye tracking addon for the original Vive. At the time HTC told us that it would launch in the west in Q3 2017, but we’re not aware of this actually happening. Notably, 7invensun is a member of HTC’s Vive X accelerator initiative.

Droolon F1 uses the same SRanipal SDK from HTC, so content developed for HTC’s Vive Pro Eye enterprise headset should work without any updates needed. This is an improvement over the original addon which used its own separate SDK.

Eye tracking has several uses in VR. It can detect the user’s interpupillary distance to enable the optimal optical calibration. It can be used in social VR to communicate real eye movements, and eye contact. It can also be used by advertisers to collect data on what the user is looking at.

But most importantly, it can enable foveated rendering. The human eye is only high resolution in the very center, as you can notice by looking around your room. VR headsets can take advantage of this by only rendering where you’re directly looking in high resolution. Everything else can be rendered at a significantly lower resolution. However, there doesn’t seem to be any confirmation on whether Droolon F1’s tracking quality is sufficient for foveated rendering at this time.

Foveated rendering on Vive Pro Eye

Preorders will open in November, and HTC claims it will start worldwide shipping in December. The company hasn’t stated which countries it will ship to, but we’ve reached out to clarify this and will update this article with their response.


7invensun Announces $150 Eye-tracking Module Supporting All HTC Headsets

Today at HTC’s Vive Developer Meeting in Beijing and its simultaneous Shanghai-based Vive Ecosystem Conference, Chinese eye-tracking startup 7invensun debuted a new eye-tracking module that’s not only affordable at $150 (¥1100 RMB), but is designed to support all of HTC’s VR headsets, past and present.

Called Droolon F1, 7invensun’s new eye-tracking module was primarily announced to target HTC’s upcoming Vive Cosmos, the inside-out tracked PC VR headset arriving on October 3rd for $700.

However, as an official product partner with HTC, 7invensun says that Droolon F1 will also be compatible with the original Vive, Vive Pro, Vive Focus, and Vive Focus Plus.

Image courtesy 7invensun, via CNW

There isn’t any official information in English yet, but according to Chinese publication CNW (Chinese), Droolon F1 connects to the headset via USB, which also carries synchronized data for both eyes. The standard version, CNW reports, has a sampling rate of 120 Hz, but can be customized to run at 240 Hz.

At the time of this writing, neither 7invensun nor HTC has mentioned which countries it intends to support at Droolon F1’s launch; however, the $150 USD price was unveiled onstage at the enterprise-focused Shanghai event, which points to a probable offering outside of China. Pre-orders are said to begin sometime in November, with shipping taking place in December.

Image courtesy HTC

Unlike 7invensun’s previous aGlass modules, which used its own proprietary eye-tracking API, Droolon F1 is said to use HTC’s official eye-tracking API, SRanipal SDK. This essentially allows developers to target both Droolon F1 and Vive Pro Eye, HTC’s enterprise-focused headset with integrated eye-tracking.

SEE ALSO
Oculus Job Listing Points to Eye-tracking in 'next gen AR/VR products'

The module is also said to be smaller than the company’s previous eye-trackers, weighing in at only 60g (~2.1 oz). It’s said to feature more convenient assembly, taking a purported three minutes of setup time.

Image courtesy 7invensun, via CNW

7invensun initially launched its first aGlass module (DK1) for HTC Vive at the end of 2016, and released the second iteration in 2017. Shortly afterwards, the company was accepted into the second batch of HTC’s Vive X accelerator, which provided it with financial investment and mentorship. One short year later, 7invensun confirmed aGlass DK2’s compatibility with Vive Pro.

As with its previous products, Droolon F1 is being targeted at developers. Current apps and games can’t make ready use of eye-tracking without being built specifically for it, although the low price point may prove tempting not only to a wide array of developers, but also to (ultra) early adopters who want to try the first experiences made specifically with eye-tracking in mind.

If you want to learn about eye-tracking and its many uses, check out our deep dive article on why eye-tracking is a game changer for VR.


HTC Vive Is Getting A $220 Plug-And-Play Eye Tracking Peripheral Next Month

Eye tracking is one feature that could benefit the performance and affordability of high-end virtual reality headsets. The HTC Vive should become the first mainstream headset to put that theory to the test.

A Chinese startup known as 7invensun (pronounced seven-in-ven-sun) is announcing it will release a new eye tracking module for the Vive next month. The module is called the aGlass, and it will be available for “limited pre-order sales” next month, according to HTC. The company is referring to this first roll-out as a developer kit, but pre-orders are open to anyone. The system will cost about $220 USD.

Unlike other eye tracking solutions that require hardware to be installed at the manufacturer level, the 7invensun devices are modular in nature. The thin plastic overlays can be placed inside the Vive headset manually by the average VR user, according to the company. The eye trackers are designed to be wired directly to the headset over USB: two separate USB cords connect to the aGlass devices, and are then joined by a USB combiner and fed into the Vive’s single port.

The aGlass consists of two separate trackers built specifically to fit alongside the lenses of the Vive. Each tracker has a halo of IR lights combined with sensors that can track the movements of each of your eyes and eyelids. It is said to support customized lenses depending on the specific vision concerns of the individual customer.

This type of tech can have a variety of use cases but the most immediate is foveated rendering.

Foveated rendering is a process that combines eye tracking and software to adjust the way a VR experience is rendered in real time. With foveated rendering, the PC running your Vive only has to render the greatest detail in the small area on which your eyes are directly focused. This dramatically lowers the cost of the hardware required to successfully show a convincing VR experience. According to 7invensun spokespeople, this tech could allow Vive to run on older generation graphics hardware.

Currently, VR demands graphics cards and CPUs that are among the most powerful that the various manufacturers can provide. With foveated rendering, however, users can lower the workload demanded by their Vives and run VR on older, cheaper hardware from NVIDIA, AMD, Intel, etc.

The aGlass comes with custom software that lets you manually apply foveated rendering to any HTC Vive experience and adjust the amount of the effect being applied. In a demonstration, we saw the device running with NVIDIA’s VR Funhouse experience, with a performance jump from 45 frames per second to 90 with foveated rendering applied. This functionality will only be available with NVIDIA graphics cards at first, according to the company.

According to a spokesperson for Vive, the release of aGlass ties into the team’s stated goal for 2017 which is to “expand the ecosystem” for the headset by providing cutting edge peripherals like this, the TPCast wireless VR system and the Vive Tracker. To that end, Vive is officially referring to the aGlass system as an “upgrade kit” for the Vive.

7invensun is a member of the Vive X accelerator’s second class. This is Vive’s in-house startup incubator that previously gave rise to TPCast and other VR-specific startups.

The aGlass will only work with the Vive upon release. HTC emphasizes that they are not making that a requirement for 7invensun, which has full freedom to develop this hardware for other headsets in the future.

Update: after publishing, HTC confirmed that the price for this system will be around $220 USD. 


Vive X Teams Reveal Innovation Breakthroughs Toward Next-Gen VR Tech

Last year HTC formed the Virtual Reality Venture Capital Alliance (VRVCA) alongside some of the biggest investors in the VR industry. This alliance offered a total of $10bn USD to VR content creators in conjunction with the Vive X accelerator program. The VRVCA has just held its fourth closed-door members meeting in San Francisco, with 10 teams from 5 different regions showcasing their achievements.

These startups included 7invensun, ObEN, and TPCast, making progress in interactive eye tracking, social connection, and wireless connectivity respectively.

aGlass DK II

7invensun, a Vive X batch II team based in Beijing, has created real-time eye tracking for HTC Vive systems in the form of an upgrade kit known as aGlass. The aGlass DK II will start limited pre-order sales next month, priced at RMB 1,500 (approx. $219 USD). Each set of aGlass DK II comes supplied with pairs of 200°, 400°, and 600° myopia lenses, and supports customised lenses.

“It’s a great honour for us to attend VRVCA this year and we’re super excited about the benefits aGlass can bring to lower hardware requirements and thus enable broader access to high quality VR. Thanks to Vive X, we are able to work and share resources with a variety of Vive X teams, and jointly create more VR/AR eye-tracking solutions in game interaction, financial payment, travel guide, and so on. Our vision is to empower a new and cool way of human-computer interaction through eye tracking.” said Thomas Huang, Founder and CEO of 7invensun in a statement.

ObEN, meanwhile, has built artificial intelligence (AI) technology that integrates with China’s WeChat, a popular social platform, allowing HTC Vive users and WeChat’s 889 million mobile users to interact with each other.

“China is a world leader in both mobile internet and VR adoption. We are thrilled to join forces with Tencent and HTC Vive to tackle the challenge of improving Social VR experiences using ObEN’s AI technology,” said Nikhil Jain, cofounder and CEO, ObEN. “Today’s showcase not only enables shared experiences between users of China’s top social and VR platform but is also an early demonstration of how ObEN’s Personal AI can drive and create AI generated content.”

TPCast, on the other hand, needs little introduction. The company developed a wireless solution last year which has seen widespread media coverage. At the showcase, TPCast announced its Business Edition tether-less upgrade kit, a multi-user solution that allows up to six users to interact in one physical space. The product is due to launch later this year.


“We believe this industry milestone will really change the game completely for multiuser VR and ultimately unleash the tether-less VR world!” said Michael Liu, CEO of TPCAST. “We sincerely appreciate VRVCA for giving us this opportunity to show our latest tether-less VR product to the world. Also we would like to express our gratitude to HTC Vive for being always supportive to us.”

VRFocus will continue its coverage of Vive X and VRVCA, reporting back with the latest technological updates.