Vive Eye-tracking & Corrective Lens Add-on Confirmed Compatible with Vive Pro

7invensun, one of the latest companies accepted into HTC’s Vive X accelerator program, is launching an eye-tracking add-on made specially for the Vive headset. The device, which went on sale last year, uses a series of IR LEDs and a near-eye camera to track the user’s eyes in real-time, enabling a range of potential benefits for VR.

Update (5/30/18): 7invensun has confirmed that the company’s latest iteration of its aGlass eye-tracking add-on, the aGlass DK2, is compatible with the Vive Pro in addition to the original Vive. All that’s required is an adapter to convert the device’s original plug to the Vive Pro’s USB-C port.

The company says it’s also now offering ‘Advanced’, ‘Analytical’, and ‘Ultimate’ editions of the eye-tracking add-on, which provide capabilities beyond the consumer version, such as pupil diameter, pupil position, eyelid distance, iris recognition, and customizable calibration.

Photo by Road to VR

Original Article (4/27/17): Meeting with 7invensun at HTC’s Vive X accelerator space in San Francisco today, I got to see the aGlass eye-tracking development kit in action. The device, which comes as two compact assemblies that easily insert around the Vive’s lenses, uses a ring of IR LEDs to illuminate each eye while a small camera records the movements and sends the data back to the computer for processing. The devices are smartly powered from the Vive’s extra USB port that’s hidden under the cable retainer at the top of the headset.

Foveated Rendering and More

Image courtesy 7invensun

Eye-tracking can be used for a number of useful purposes in VR, perhaps one of the biggest being foveated rendering, which reduces the resolution in your peripheral vision, where it doesn’t need to be as sharp, while leaving the center of your vision at full quality. This can significantly reduce the processing power required to render a high-quality, high-framerate VR scene. Within the VR industry, foveated rendering via eye-tracking is widely considered a necessary technology for one day achieving retinal resolution in VR headsets, since it reduces the processing power needed to render on ultra-high pixel density displays.
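To make the idea concrete, here’s a minimal illustrative sketch of how a renderer might scale resolution with angular distance from the tracked gaze point. This is not 7invensun’s or NVIDIA’s actual technique; the tier angles and scale factors are assumptions chosen purely for illustration.

```python
# A minimal sketch of foveated rendering's core idea: render resolution falls
# off with angular distance from the tracked gaze point. The tier angles and
# scale factors below are illustrative assumptions, not vendor values.
FOVEA_DEG = 10.0      # assumed full-resolution region around the gaze point
PERIPHERY_DEG = 30.0  # assumed start of the lowest-resolution region

def shading_scale(pixel_angle_deg: float, gaze_angle_deg: float) -> float:
    """Return a render-resolution scale (1.0 = full res) for a screen region
    based on its angular distance from where the eye is pointed."""
    offset = abs(pixel_angle_deg - gaze_angle_deg)
    if offset <= FOVEA_DEG:
        return 1.0                 # sharp where the eye is actually looking
    if offset >= PERIPHERY_DEG:
        return 0.25                # coarse in the far periphery
    # linear falloff between the fovea and the periphery
    t = (offset - FOVEA_DEG) / (PERIPHERY_DEG - FOVEA_DEG)
    return 1.0 - 0.75 * t

if __name__ == "__main__":
    for angle in (0, 5, 15, 25, 40):
        print(f"{angle:>2} deg from gaze -> {shading_scale(angle, 0.0):.2f}x resolution")
```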

Foveated rendering is already up and running with the aGlass dev kit, and is presently said to work on any application with no modifications, so long as the computer is equipped with an NVIDIA GPU. The company aims to extend the capability to AMD cards as well. My current understanding is that other functions that employ aGlass’s eye-tracking data are GPU-independent, but foveated rendering is a special case because of the current rendering technique.

Beyond foveated rendering, eye-tracking can be used to more realistically animate the eyes of your VR avatar, simulate depth-of-field effects, serve as conscious or unconscious input inside VR apps, and more.

Corrective Lenses

Beyond eye-tracking, the aGlass add-on also opens the door to corrective lenses for those who normally wear glasses. Today’s high-end VR headsets lack variable focus control, so those with glasses are either stuck trying to fit them inside the headset or going without them entirely (and suffering a blurry virtual image). The aGlass add-on is designed for easy insertion of custom-made lenses that rest over the Vive’s original lenses, and the development kit comes with three focus powers. And if you don’t need corrective lenses, the device works just fine without them.

Hands-on With the aGlass Dev Kit

I got to try the aGlass dev kit myself at the Vive X accelerator office in San Francisco today.

The calibration process took several tries and quite a bit of fidgeting with the headset to get right. In my experience this is not entirely unique to aGlass, but speaks to the underlying challenges of designing a robust eye-tracking system that can handle a wide range of face and eye structures, not to mention those with eye problems (just an FYI, I’m fortunate to have 20/20 vision).

SEE ALSO
Eye-tracking is a Game Changer for VR That Goes Far Beyond Foveated Rendering

Once we did get the system calibrated—which involves looking at a series of dots on the screen and takes only a few seconds if it works on the first try—foveated rendering was demonstrated using NVIDIA’s VR Funhouse. Foveated rendering only works well if the eye-tracking is fast and accurate enough that it’s hard for the user to notice the effect happening at all.
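The dot-fixation step exists to fit a per-user mapping from the eye camera’s raw pupil measurements to positions on the display. As a rough sketch of that idea, using a simple affine fit and made-up numbers (commercial trackers use more elaborate eye models and more calibration points), it might look like this:

```python
import numpy as np

# Hypothetical calibration data: raw pupil-center readings from the eye camera
# (in camera pixels) paired with the known on-screen dot positions the user was
# asked to fixate. All values are made up for illustration.
raw_pupil = np.array([[312, 240], [410, 238], [316, 330], [412, 332], [362, 285]], float)
dot_target = np.array([[-0.8, 0.8], [0.8, 0.8], [-0.8, -0.8], [0.8, -0.8], [0.0, 0.0]], float)

# Fit an affine map (gaze ~= [px, py, 1] @ A) via least squares; this is one
# common baseline approach rather than any vendor's actual algorithm.
X = np.hstack([raw_pupil, np.ones((len(raw_pupil), 1))])
A, *_ = np.linalg.lstsq(X, dot_target, rcond=None)

def gaze_from_pupil(px: float, py: float) -> np.ndarray:
    """Map a raw pupil position to a normalized gaze point on the display."""
    return np.array([px, py, 1.0]) @ A

print(gaze_from_pupil(362, 285))  # should land near the center dot (0, 0)
```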

Photo by Road to VR

I was told the aGlass has 5ms latency, and this was fast enough that—under the right rendering settings—I could barely tell that foveated rendering was turned on, which is great. The particular method they were using to achieve foveated rendering was described as “proprietary,” and it worked better than a number of other attempts I’ve seen, which often introduce a blurry feeling in the periphery that gives the effect away and doesn’t feel quite right.

The particular laptop powering this demo lacked the power to be formally called ‘VR ready’, and when foveated rendering was disabled I could easily see lots of jitter because the computer couldn’t keep up with the rendering workload. Once foveated rendering was enabled, the computer was able to hit the necessary 90 FPS consistently.

Improvements to Come

The installation seems dead simple, though the current dev kit adds a little bulk around the Vive’s lenses, which slightly restricts the IPD setting on the lower end. The average IPD is often quoted at around 64mm, but with the aGlass installed I was only able to bring the Vive’s IPD down to 65mm, though the company says it aims to fix this for the consumer version of the device.

Another issue for me was the need to dial the Vive’s lens-relief out a few notches to get the calibration to work correctly. Doing so means reducing the headset’s field of view, which is of course not ideal. The problem may be due to the angle of the eye-tracking camera, which is mounted on the aGlass bezel below the eye. The company says it hopes to improve the robustness of calibration by increasing the ‘sweet spot’ of the system so that it can better handle varying eye and facial structures.

The aGlass dev kit also has two soft rubber flaps that I could feel gently pushing against the inside corners of my eyes. It seems these may be needed to keep the IR light from interfering with the Vive’s head sensor (which is positioned on the headset right between your eyes). It didn’t hurt, but it was slightly uncomfortable, since you don’t usually have something touching that part of your face. The company says that the consumer version won’t require the flaps.

aGlass Dev Kit Release Date

7invensun says the aGlass eye-tracking dev kit add-on for Vive will go on sale in China this month, priced around $220. In the US, the company expects to offer the dev kit for sale in Q3. There’s no word yet on when a consumer version of the device will become available.


Samsung’s Facesense Recognizes Facial Expressions For VR Navigation

With body language, humans are able to express a range of feelings with only the subtlest shifts in demeanor. Our facial expressions alone can show off a huge range of responses. Interaction with VR is something various companies continue to try to nail down, attempting to enable users to manipulate these worlds while breaking immersion as little as possible. Samsung’s experimental Facesense is a new attempt at changing the game, harnessing the power of our facial expressions for hands-free VR interaction.

On April 14th–15th at the VRLA Expo, Samsung showed a new creation from its C-Lab division. C-Lab cultivates ideas that are more experimental, and this particular one provides a new way to navigate within VR. Detailed in an announcement, Facesense tracks the electrical signals that are created any time we speak, change our expression, or shift our gaze. Those signals are then used for navigation input along with a few spoken commands.

We tried the technology briefly at VRLA and it was a very early concept. It likely won’t be something that dominates as a primary means of interaction within VR, but it could be a complement to VR controllers. We’ll have to see if Facesense can find a degree of consistency with its input across all users. It could also serve as an option for those who cannot use VR controllers, opening up accessibility to virtual technology in a big way.


Eye-tracking Glasses Give Glimpse into Professional Pianist’s Perception

For immersive technologies like AR and VR to be effective, they need to understand not only our world, but also us. But we don’t even understand ourselves all that well yet; as we capture more biometric information on our quest to lose ourselves in augmented and virtual realities, we’re learning more about ourselves too.

Last month we wrote about a Disney Research project which used virtual reality to study human perception. The task—catching a ball—was simple, but the reality behind how the brain and body coordinate to get a simple task done can often confound our intuition.

Now Function has teamed up with eye-tracking specialists Tobii to equip a professional pianist with eye-tracking glasses to see what happens (see the video heading this article). Pianist Daniel Beliavsky sets about playing some complex arrangements, and the results are quite interesting, revealing how much we have yet to learn about our own perception.

SEE ALSO
Tobii Recommends Explicit Consent for Recording Eye Tracking Data

In the video from Beliavsky’s perspective, a yellow dot shows what his eyes are fixed on. Rather than looking constantly at his hands to ensure each note is landing correctly, his eyes are darting about, often leading the hands to the next location on the keyboard, but also frequently returning to the point between his hands where he’s able to gather information about both at the same time with his peripheral vision.

This video highlights not just the potential usefulness of eye-tracking for input in immersive technologies, but also the challenges of using the raw data to understand intent. While we can use eye-tracking input for passive things like foveated rendering, using it for interaction is a much more complex problem.

Using the input from Beliavsky’s eyes alone, it may be possible to predict his next likely moves, but because of the complexity of how the brain, hands, and eyes are interacting in this case, doing so may prove extremely difficult.


If the computer had to guess what was most important to Beliavsky while he was fixated between his hands, it might guess it’s the keys between his hands. But actually Beliavsky was using a combination of muscle memory and peripheral visual cues to make his performance work, while at times the point at which his eyes were directly fixated was not important at all. The raw data in this case misrepresents the intent of the user. Intent represents a major challenge for the usefulness of biometrics beyond simple passive input and data collection.

The more we learn about human perception, the deeper our virtual and augmented worlds will be able to immerse us. One day, it’s feasible that we might be able to fully emulate input and output of human perception, but today we’re at just the beginning of that journey.


ResearchVR Episode 38 – FOVE and Eye-Tracking in VR

This week on ResearchVR you have a chance to catch up with our latest episode, where Az analyzed FOVE in detail with Jim Preston, Director of Strategy and Business Development.

The last few weeks were tough – between our day jobs and preparation for VR@Cebit, we did not have much time for the work needed to deliver the best-quality audio and information for our episodes. But now we are back on track, and we have an extensive list of stimulating discussions and deep dives with VR and AR experts.

Jim Preston, the Director of Strategy and Business Development at FOVE, worked for years at EA as a producer, climbing the ranks to the top of the ladder. Now he leads the effort to partner with teams and companies from North America and Europe to create the best-quality content for the FOVE.

Episode Preview

The ecosystem of modern virtual reality is developing faster than any technology before it. We are quickly reaching the point where just looking or pointing won’t be enough. We need to continuously improve input systems so that more becomes possible. After all, software cannot do more than the hardware enables.

That is why the combination of eye-tracking and HMDs is one of the most interesting technologies currently available. Instead of targeting the best possible hardware at any cost, FOVE took on the challenge of bringing eye-tracking to a wide range of consumers. This is tricky when all the other elements of the HMD already stack up the costs.

Learn (listen to or watch) all about the trade-offs and solutions that FOVE implemented in Episode 38 – FOVE and eye-tracking in VR with Jim Preston.


Tobii Recommends Explicit Consent for Recording Eye Tracking Data

The eye tracking company Tobii had some VR demos that they were showing on the GDC Expo Hall floor as well as within Valve’s booth. They were primarily focused on the new user interaction paradigms made available by eye gaze: selecting specific objects, directing actions, and even driving locomotion. I had a chance to catch up with Johan Hellqvist, VP of Products and Integrations at Tobii, where we discussed some of the eye tracking applications being demoed. We also had a deeper discussion about what types of eye tracking data should be recorded and the consent that application developers should secure before capturing and storing it.


One potential application that Hellqvist suggested was amplifying someone’s eye dilation in a social VR context as a way of broadcasting engagement and interest. He said that there isn’t established science connecting dilation with someone’s feelings, but this example brought up an interesting point about what types of data from an eye tracker should or should not be shared or recorded.

Hellqvist says that, from Tobii’s perspective, application developers should get explicit consent for any type of eye tracking data that they want to capture and store. He says, “From Tobii’s side, we should be really, really cautious about using eye tracking data to spread around. We separate using eye tracking data for interaction… it’s important for the user to know that’s just being consumed in the device and it’s not being sent [and stored]. But if they want to send it, then there should be user acceptance.”

Hellqvist says our eye gaze is semi-conscious data that we have limited control over, and that what to do with that data will ultimately be up to each application developer. Tobii has a separate part of its business that does market research with eye tracking data, but he cautions that using eye tracking within consumer applications is a completely different context than market research, and one that should require explicit consent.
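None of this corresponds to a published Tobii API; purely as a rough illustration of the policy Hellqvist describes, an application-side gate might distinguish on-device interaction (no prompt needed) from recording or transmission (explicit opt-in), for example:

```python
from dataclasses import dataclass, field

# A hypothetical consent model sketched from Hellqvist's description: gaze used
# purely for on-device interaction needs no prompt, while any recording or
# transmission of eye data requires explicit opt-in from the user.
@dataclass
class EyeDataPolicy:
    consented_uses: set = field(default_factory=set)  # e.g. {"record", "transmit"}

    def allow(self, use: str) -> bool:
        if use == "interact":          # consumed locally, never stored
            return True
        return use in self.consented_uses

policy = EyeDataPolicy()
assert policy.allow("interact")        # interaction works out of the box
assert not policy.allow("record")      # recording blocked until opt-in

policy.consented_uses.add("record")    # the user explicitly grants recording
assert policy.allow("record")
```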

SEE ALSO
Watch: Tobii Reveals Vive VR Eye Tracking Examples, Including 'Rec Room'

Hellqvist says, “It’s important to realize that when you do consumer equipment and consumer programs that the consumer knows that his or her gaze information is kept under control. So we really want from Tobii’s side, if you use the gaze for interaction then you don’t need the user’s approval, but then it needs to be kept on the device so it’s not getting sent away. But it should be possible that if the user wants to use their data for more things, then that’s something that Tobii is working on in parallel.”

Tobii will be actively working with the OpenXR standardization initiative to see if it makes sense to put some of these user consent flags within the OpenXR API. In talking with other representatives from OpenXR about privacy, I got the sense that the OpenXR APIs will be a lot lower level than these types of application-specific requirements. So we’ll have to wait for OpenXR’s next update in the next 6-12 months to see whether Tobii was able to formalize any type of privacy protocols and controls within the OpenXR standard.

SEE ALSO
Valve Talks 'OpenXR', the Newly Revealed Branding for Khronos Group's Industry-backed VR Standard

Overall, the Tobii and SMI VR demos that I saw at GDC proved to me that there are a lot of really compelling social presence, user interface, and rendering applications of eye tracking. However, there are still a lot of open questions around the intimate data that will be available to application developers and the privacy and consent protocols that will inform users and provide them with some level of transparency and control. It’s an important topic, and I’m glad that Tobii is leading an effort to bring more awareness to this issue within the OpenXR standardization process.



VR Training for Safety-Critical Professions

Scientists at the University of Exeter are working with Cineon Productions and experts from the nuclear industry to develop a shared training program for industries with safety-critical conditions.

Increased Safety Through Virtual Reality

The training program is called Cineon Training and is intended for future use in the military, defense service, aviation, and nuclear work. The new VR training is meant to prevent accidents in these hazardous fields. Dr. Sam Vine of the University of Exeter is leading the project in collaboration with Cineon Productions and the nuclear experts.

Cineon Training is still in development and is intended to be applicable across many different fields. The training is based on 360-degree technology with VR headsets and aims to increase employees’ effectiveness and prevent dangers from accidents. In addition, eye-tracking and physiological monitoring of trainees are used to gain an understanding of the learning effects. The project leads also want to use this to find out how errors arise during an operation, especially in stressful situations.

Sound Training Based on Research Findings

The goal of the project is to create comprehensive training through a combination of technology, scientific theory, and measurement methods such as eye-tracking. This should help employees learn to act more effectively in critical situations without putting them in physical danger. It is known that simulating a dangerous situation can considerably improve reactions to real danger; the employee is able to fall back on learned behaviors. The project builds on ten years of research at the university, which is also developing the software used. The work with the nuclear industry has already borne fruit, but those responsible want to collaborate more closely in the future with experts from aviation, emergency medicine, mining, and construction.

The team is hosting a one-day workshop on April 27th for safety experts within the nuclear industry and trainers from other fields. According to Dr. Sam Vine, the methods and VR technologies used simulate stressful and high-risk environments through the VR headsets.

The training sounds promising and will hopefully soon be used internationally. The long-term effects still need to be researched, but it is already clear how much influence VR technologies now have in the real world.

(Source: phys.org)


Deepening Social Presence with SMI Eye Tracking

At GDC this year, SensoMotoric Instruments (SMI) showed a couple of new eye tracking demos at Valve’s booth. They added eye tracking to avatars in the social VR experiences of Pluto VR and Rec Room, which provided an amazing boost to the social presence within these experiences.

There are so many subtle body language cues that are communicated non-verbally through someone else’s eye contact, gaze position, or even blinking. Since it’s difficult to see your own eye movement due to saccades, it’s best to experience eye tracking in a social VR context. Without a recording of your eyes in social VR, you have to rely on looking at a virtual mirror as you look to the extremes of your periphery, observing your vestibulo–ocular reflex as your eyes lock gaze while you turn your head, or winking at yourself.


I had a chance to catch up with SMI’s head of OEM business, Christian Villwock, at GDC to talk about the social presence multiplier of eye tracking, the anatomy of the eye, and some of the 2x performance boosts they’re seeing with foveated rendering on NVIDIA GPUs.

Valve, SMI and Tobii Preview VR’s Eye Tracking Future In HTC Vive

Over the last week we learned that by spending essentially $300 to purchase three Vive Trackers, you will be able to bring your legs, feet and torso into VR — so you can kick a dinosaur in the face without even looking at it. Dinosaur kicking for $300 is certainly funny, but it’s also a great example of a broad effort by developers and hardware manufacturers to make virtual worlds more responsive to human behavior. Another is more robust hand and finger tracking, so the incredible variety of quick and precise movements in your hands are accurately represented in a virtual world. Still another example is eye tracking, and we’ve seen demonstrations from both Tobii and SMI in the HTC Vive offering a glimpse of how much better future VR systems will be at understanding our behavior.

A look inside a headset with eye tracking from Tobii.

New Tools For Game Designers

After a few minutes using the tech from SMI and Tobii, I noticed I was starting to unlearn a behavior I’d grown accustomed to in first-generation VR. Namely, I’ve gotten in the habit of pointing my head directly at objects to interact with them. That’s because current VR systems only understand where your head is pointed. Some games, particularly those on mobile VR, use this “gaze detection” as the primary method of interacting with the world. Tobii, in contrast, offered a very interesting test where I tried to throw a rock at a bottle in the distance. My aim was so-so on the first few throws, but that was without eye-tracking. When eye-tracking was turned on, they asked me to pick up a glowing orb and throw that instead. This time, almost every throw collided with a bottle.

Initially, I couldn’t understand why I’d want the computer to help me so much. As long as I kept my eye on the bottle and made a decently strong throw, I’d hit my target every time. The glowing orb could be recalled by pressing a button on the controller too, so I could throw the orb and, the instant it collided with a bottle, recall it back to my hand like Thor’s hammer. It was just a simple tech demo, but once my brain started getting accustomed to this new capability, I made a game out of seeing how quickly I could eliminate all the bottles by throwing the orb, recalling it the moment it collided, locking eyes on the next target, and then immediately throwing it again.

This is what it took for me to realize just how empowering eye tracking will be for VR software designers. The additional information it provides will allow creators to make games that are fundamentally different from the current generation. With the example of throwing that orb, it was like I had been suddenly handed a superpower and I naturally started using it as such — because it was fun. It is up to designers to figure out how much skill will be involved in achieving a particular task when the game knows exactly what you’re interested in at any given moment.
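Tobii hasn’t said how its demo weights the gaze data, but one plausible way a designer could implement this kind of assist is to blend the physical throw direction toward whatever the eyes are locked onto. The sketch below is an assumption for illustration, not the demo’s actual code; the `assist` parameter is a made-up tuning knob.

```python
import numpy as np

def assisted_throw_dir(throw_dir, hand_pos, gaze_target, assist=0.5):
    """Blend the raw throw direction toward the gazed-at target.

    `assist` is a hypothetical tuning parameter: 0 = pure physics,
    1 = the throw goes exactly where the player is looking."""
    throw_dir = throw_dir / np.linalg.norm(throw_dir)
    to_target = gaze_target - hand_pos
    to_target = to_target / np.linalg.norm(to_target)
    blended = (1 - assist) * throw_dir + assist * to_target
    return blended / np.linalg.norm(blended)

# Example: a slightly-off throw gets nudged toward the bottle the eyes lock onto.
hand = np.array([0.0, 1.2, 0.0])
bottle = np.array([0.3, 1.0, 3.0])
raw_dir = np.array([0.0, 0.1, 1.0])
print(assisted_throw_dir(raw_dir, hand, bottle))
```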

This is a screen grab from Tobii’s demo showing my eye movements over ten seconds. The purple lines represent what caught my eye in that virtual world over that length of time. This type of data is already used to optimize video game design.

Higher Resolution Headsets May Need Eye Tracking

Eye tracking will be useful for other purposes too, including foveated rendering and social VR. Foveated rendering focuses the most detail in the center of your vision where your eyes are actually pointed. Your eyes see less detail in the periphery, so if the computer knows exactly where your eyes are pointed it dials up the amount of detail in the right spot while saving resources in places you’ll never notice. As manufacturers look at putting higher resolution displays in VR headsets, eye tracking that enables foveated rendering may become fundamental to that effort because it could help keep computers at affordable prices despite pushing more pixels.
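A rough back-of-the-envelope calculation shows why this matters more as pixel counts rise. All numbers here are illustrative assumptions, not vendor figures or measurements:

```python
# Back-of-the-envelope estimate of foveated rendering's savings on a
# hypothetical high-resolution headset. All values are assumptions.
full_fov_deg = 110            # assumed per-eye field of view
fovea_deg = 20                # region kept at full resolution
panel_pixels = 2000 * 2000    # hypothetical high-res per-eye panel

fovea_fraction = (fovea_deg / full_fov_deg) ** 2               # ~3% of the image area
full_res_pixels = panel_pixels * fovea_fraction
periphery_pixels = panel_pixels * (1 - fovea_fraction) * 0.25  # quarter resolution

shaded = full_res_pixels + periphery_pixels
print(f"Shaded pixels per eye: {shaded:,.0f} of {panel_pixels:,} "
      f"({shaded / panel_pixels:.0%})")   # roughly a quarter of the full workload
```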

Make Eye Contact

Eye tracking also dramatically increases the expressiveness in communication. In Valve’s booth at GDC, both SMI and Tobii demonstrated a 3-person social VR experience in which I hung out with other folks in VR and had a conversation. Tobii showed its technology integrated with the popular multiplayer world Rec Room while SMI allowed me to chat with someone in Seattle as if he was standing right next to me. Social interaction in VR with current consumer technology is fairly awkward. You can get some sense of a person’s interest via their hand and head movements, but to really connect with someone you need eye contact and both Tobii and SMI enabled that natural connection regardless of physical distance.

I wouldn’t say any of these technologies are consumer ready just yet, but they do show a sophistication, ease of use and affordability that we haven’t seen before. In fact, all the technologies mentioned in this post are being distributed to select developers as kits so they can start to build software around these upcoming advancements. FOVE is distributing an eye-tracking headset too. Meanwhile, both Google and Facebook have acquired eye tracking technologies within the last year — underscoring the expectation that the technology will power future headsets. It indicates that we are getting much closer to the realization of next-generation systems that will enable far more compelling and responsive virtual worlds compared with the ones we have today.

“I like to think of this as an extension of the development of the human-computer interface,” said Valve Developer Yasser Malaika, in an interview with UploadVR. “You started with command lines where you needed a lot of memorization, then moved to GUIs…now with VR we’re bringing more of the human body into it…your whole body the computer can now respond to. And adding eyes is another layer where it’s more responsive to you. It is the computer adapting to you rather than it asking you to adapt to it.”


Watch: Tobii Reveals Vive VR Eye Tracking Examples, Including ‘Rec Room’

After hyping everyone up recently with a short teaser video, eye-tracking specialists Tobii have revealed more details and video examples of their eye tracking in action at GDC, including integration with the social VR application Rec Room.

We wrote recently about Tobii‘s cunning and quietly impressive teaser video, which gave a small glimpse at what the company might be prepping to demonstrate at GDC this week. Well, the company has now revealed more examples of VR eye-tracking integration in a series of short videos covering various use cases.

As we wrote then, Swedish company Tobii, which has specialised in eye tracking since its inception in 2001, has produced gaze-detection hardware in various guises for some time now, and recently announced plans to bring its tech to virtual reality via a $50M funding round.

The first video is a repeat of the teaser which snuck onto the subreddit /r/vive last week. We called it “unnervingly effective” at the time, and indeed it still is. The demo shows a virtual avatar standing in front of two virtual mirrors, demonstrating how much more relatable the character shown in the right-hand mirror (with eye tracking enabled) is than the one in the left (sans eye tracking).

Next up is a snippet of gameplay from the virtual reality social and multiplayer gaming app Rec Room, with developer Against Gravity apparently having integrated Tobii’s technology into the application. In this case, two players enjoy a game of poker, a brilliantly simple example of how the subtle tics and tells betrayed by your eye movements can impact social interactions while gaming. As Tobii put it:

Eye tracking can help you actually merge with your character, making interactions with the virtual world become more seamless – Just glance at another character in the VR world and watch the character react accordingly.

Finally, a demo which shows how input taken from gaze detection can be used directly to control aspects of a VR gameworld.

Using eye tracking the game can better understand where you are looking making aiming feel more realistic and consistent. Instead of needing to perfectly position your body before the throw you can focus on using the right amount of force and angle to hit your target bullseye.

However, we still have no details on how the eye tracking works, how it’s integrated or what the plans are to implement said tracking going forward. We’re trying to dig up more details at GDC this week.


SMI and Valve Are Working Together on an Eye-Tracking System for OpenVR

Eye-tracking is regarded by many as one of the most important technologies for the future of virtual reality. SensoMotoric Instruments (SMI) and Valve are now working together to make that dream a reality soon.

Cooperation in the Service of Progress

The collaboration was formed to integrate eye-tracking technology into the OpenVR software development kit (SDK) and make it available through the programming interfaces. This would also make it possible for other companies to use this support in their VR software. Together, they have already integrated SMI’s technology into selected HTC Vive units. These units have been delivered to research partners, who are now examining how eye-tracking can best work in VR worlds.

In fact, Google presented even more mature technology just last week with its face-scanning method, which places a user’s entire face into videos. Whether Valve or HTC are currently working on consumer add-ons to enable eye-tracking for the Vive is still unclear; it is at least more likely that future versions of the device will offer the capability. Other manufacturers, such as FOVE, are already relying on eye-tracking technology.

Eye-Tracking Has Gained Importance

Even if it initially seems as though eye-tracking will only be used as yet another input method for VR experiences, it is good for much more: the technology is highly important for so-called foveated rendering. In the human eye, the fovea offers the best resolving power, i.e. the highest sharpness of the human visual system. If eye tracking makes it clear where the fovea is currently directed, that area is rendered at the highest quality, with quality falling off in rings around it. This very efficient approach means that apps don’t always have to render an entire screen at full quality, which could lower some of the barriers to producing VR hardware.

In addition, eye-tracking will one day be necessary to create lifelike social interactions between virtual avatars in VR worlds. The more real the virtual bodies appear, the more important accurate eye-tracking will become.

(Featured image: Fove)
