Qualcomm And Tobii Partnership Looks For Better AR/VR Eye-Tracking

Eye-tracking technology has a number of benefits for virtual reality (VR) and augmented reality (AR), allowing for a range of improvements to immersive experiences. As demand for the technology rises, Qualcomm is poised to meet it thanks to its collaboration with Tobii.

Tobii and Qualcomm have partnered to create a full reference design and development kit for the Qualcomm Snapdragon Mobile VR platform, integrating Tobii’s EyeCore eye-tracking algorithms and hardware.

Eye-tracking can benefit VR and AR in a variety of ways. This includes techniques such as foveated rendering, which allows processing power to be focussed on the area the user is currently looking at, with graphics in the periphery of their view rendered at lower quality, saving power and increasing efficiency.
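
As a rough sketch of how that trade-off might look in practice, the snippet below picks a resolution scale for a screen tile based on its distance from the gaze point. The radii and scale factors are illustrative values, not anything published by Tobii or Qualcomm.

```python
import math

def render_scale_for_tile(tile_centre, gaze_point, inner_radius=0.15, outer_radius=0.35):
    """Choose a resolution scale for a screen tile from its distance to the gaze point.

    Both points are in normalized screen coordinates (0..1); the radii and
    scale factors below are illustrative, not vendor-specified values.
    """
    distance = math.hypot(tile_centre[0] - gaze_point[0],
                          tile_centre[1] - gaze_point[1])
    if distance < inner_radius:
        return 1.0    # foveal region: full resolution
    if distance < outer_radius:
        return 0.5    # near periphery: half resolution
    return 0.25       # far periphery: quarter resolution

# Example: the user looks at the screen centre; a corner tile gets quarter resolution.
print(render_scale_for_tile((0.1, 0.1), (0.5, 0.5)))  # -> 0.25
```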

The technology also allows for better hand-eye coordination within the VR world, creating more natural interactions, and can let users make eye contact with virtual avatars, facilitating an important part of human interaction.

“Increased interest in the untethered, mobile VR segment, in conjunction with Qualcomm’s innovation and technology leadership in this space, further strengthens our excitement about the potential of this market opportunity for Tobii eye tracking,” said Oscar Werner, president of Tobii Tech. “At its core, eye tracking fundamentally enables hardware manufacturers to build smarter and more capable devices with greater mobility, that in turn deliver truly immersive and natural experiences to delight users.”

“Qualcomm is focused on transforming the way that people use mobile technologies for entertainment and productivity,” said Hiren Bhinde, director of Product Management, XR Technologies, Qualcomm Technologies, Inc. “We added support for Tobii’s eye tracking solution to our new Snapdragon 845 VR development kits to help developers create new experiences using a higher quality of gaze interaction that we think will ultimately provide consumers with more intuitive, visually immersive experiences.”

Further developments on eye-tracking in VR and AR will be reported on here at VRFocus.

Eye-Tracking Is Both Incredibly Important And Very Risky

At CES last week one of my first demos was with Tobii and its eye-tracking technology. I left the demo convinced that once you’ve tried a VR headset with eye-tracking included you’ll never want to wear one without it again.

Google, Facebook and Apple all purchased eye-tracking companies over the last few years because these tech giants know what benefits are possible with the technology. If you know where a person is pointing their eyes at any given moment you can do things with software interaction and optimization that were never possible before. For example, eye-tracking could allow next generation headsets to dramatically upgrade resolution without adding a ton to rendering cost. That’s by way of foveated rendering — where the greatest detail is only drawn directly in front of your eyes. Manufacturers could even use eye-tracking to measure the distance between your pupils, which could help people maximize the 3D effect seen inside a VR headset just by getting it set up right.
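
On the pupil-distance point, here is a back-of-the-envelope sketch of how a headset might estimate interpupillary distance (IPD) from tracked pupil positions; the coordinate frame, millimetre units and sample values are assumptions made purely for illustration.

```python
def estimate_ipd_mm(left_pupil_mm, right_pupil_mm):
    """Estimate interpupillary distance from 3D pupil positions in millimetres.

    Positions are assumed to be reported in a shared headset coordinate frame.
    """
    dx = right_pupil_mm[0] - left_pupil_mm[0]
    dy = right_pupil_mm[1] - left_pupil_mm[1]
    dz = right_pupil_mm[2] - left_pupil_mm[2]
    return (dx * dx + dy * dy + dz * dz) ** 0.5

# Averaging over many samples smooths out per-frame tracking noise (made-up data).
samples = [((-31.5, 0.2, 0.0), (31.8, -0.1, 0.1)),
           ((-31.7, 0.1, 0.0), (31.6,  0.0, 0.0))]
ipd = sum(estimate_ipd_mm(l, r) for l, r in samples) / len(samples)
print(round(ipd, 1))  # roughly 63 mm for this synthetic data
```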

But more important to developers, eye-tracking completely changes the way people interact with a virtual world. Tobii sells a modified HTC Vive with its eye-tracking tech installed and, in a series of demos, I was given the freedom to flip eye-tracking on or off at any time. Not once did I prefer eye-tracking off; it so enhanced the experience of interacting with a virtual world that I began to dislike using the HTC Vive without the feature. Below is a look at the demos I tried, each of which was either more immersive or easier to accomplish than if I had been using a headset without eye-tracking.

Bringing Your Eyes To Life

Esports: High-Tech Analysis and VR Livestream for CS:GO Premier 2017

Esports captivates countless viewers and regularly triggers outbursts of emotion among fans, who cheer on their teams at tournaments, sometimes in enormous arenas. One of the best-known esports titles is Counter-Strike: Global Offensive, and its CS:GO Premier 2017 tournament is coming up soon. The organisers at Eleague have come up with something special for it: thanks to a partnership with Tobii, Sliver.tv and Dojo Madness, the entire tournament can be watched in VR, and technical innovations provide detailed analysis.

Watching CS:GO Premier 2017 via VR Livestream

The CS:GO Premier 2017 features a total of 16 teams competing against each other in Valve’s hit title Counter-Strike: Global Offensive, with a prize of one million US dollars awaiting the winners. To analyse and present this spectacle properly, the organisers at Eleague have now entered into a partnership with Tobii, Sliver.tv and Dojo Madness.

The partnership enables several technical innovations. Thanks to Tobii’s eye-tracking technology, for example, the players’ gaze directions can be analysed using special sensors mounted on the computer monitors. The eye movements are then processed and made visible to viewers as a graphical overlay, making it easier to follow how the players react. The players themselves are not supposed to be distracted by any of this.

Dojo Madness also contributes an exciting improvement to the spectator experience: its Shadow.gg platform provides a new form of visualisation with an integrated heatmap.
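
As a rough illustration of how a spectator-facing gaze overlay of this kind could be built, the sketch below accumulates gaze samples into a coarse heatmap grid. The grid size and decay factor are arbitrary choices, not details of Tobii’s or Dojo Madness’s actual pipeline.

```python
GRID_W, GRID_H = 64, 36   # coarse heatmap grid laid over a 16:9 frame
DECAY = 0.98              # older gaze samples fade out over time

heatmap = [[0.0] * GRID_W for _ in range(GRID_H)]

def add_gaze_sample(grid, gaze_x, gaze_y):
    """Accumulate one normalized (0..1) gaze sample into the heatmap grid."""
    col = min(int(gaze_x * GRID_W), GRID_W - 1)
    row = min(int(gaze_y * GRID_H), GRID_H - 1)
    for r in range(GRID_H):
        for c in range(GRID_W):
            grid[r][c] *= DECAY      # fade older samples
    grid[row][col] += 1.0            # reinforce the cell the player is looking at

# A renderer would colour each cell by intensity and draw it over the stream.
for x, y in [(0.52, 0.48), (0.53, 0.47), (0.20, 0.80)]:
    add_gaze_sample(heatmap, x, y)
```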

The biggest innovation, however, comes from the partnership with Sliver.tv, which will make it possible to experience the tournament in VR. The company’s new technology can convert 2D computer games into 360-degree VR livestreams and stream them to a VR headset. Apart from PlayStation VR (PSVR), all current VR headsets are compatible with the technology.

The CS:GO Premier 2017 runs from 8 September 2017 until 30 September 2017, with the play-off phase starting on 10 October. We are looking forward to the innovative new features and wish all participating teams the best of luck in their battle for the title.

(Sources: Venturebeat | Eleague)

Tobii Integrates Eye-Tracking Technology Into HTC Vive

Tobii Pro, the research-focused division of eye-tracking company Tobii, has announced a new solution that can be integrated into a commercial HTC Vive headset. The tool, along with a software development kit, is being made available to researchers.

The Tobii Pro integration combines an HTC Vive Business Edition with Tobii eye-tracking technology designed to track all types of eyes, collecting eye-tracking data at 120 Hz and allowing researchers to study new possibilities in areas such as psychology, human behaviour and human performance. Retrofitting the eye-tracking technology into a readily available commercial virtual reality (VR) headset lets researchers run scenarios that were previously considered too difficult or costly.

“Combining eye tracking with VR is growing as a research methodology and our customers have started to demand this technology to be part of their toolkit for behavioral studies. The Tobii Pro VR Integration is our first step in making eye tracking in immersive VR a reliable and effective research tool for a range of fields. It marks our first major expansion of VR-based research tools,” said Tom Englund, president, Tobii Pro.

“Eye tracking in immersive VR will open up opportunities for new ways of evaluating research questions that leverage the ability to control the environment and the net gain for researchers will be stronger insights that will be more predictive of real-world behaviour,” said Dr. Tim Holmes, director of research and development at Acuity Intelligence and Honorary Research Associate at Royal Holloway, University of London.

The SDK that accompanies the solution allows eye-tracking data to be collected for live interaction and analysis, and is compatible with Matlab, Python and .Net with Unity.
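
For the Python side, here is a minimal sketch of subscribing to a gaze data stream, assuming the standard Tobii Pro Python bindings (the tobii_research package) are used with this integration; the exact field names should be checked against the SDK documentation.

```python
import time
import tobii_research as tr  # Tobii Pro SDK Python bindings

samples = []

def on_gaze(gaze_data):
    # gaze_data arrives as a dictionary of gaze fields; here we keep the
    # left/right gaze points on the display area plus the device timestamp.
    samples.append({
        "t": gaze_data["device_time_stamp"],
        "left": gaze_data["left_gaze_point_on_display_area"],
        "right": gaze_data["right_gaze_point_on_display_area"],
    })

eyetrackers = tr.find_all_eyetrackers()
if eyetrackers:
    tracker = eyetrackers[0]
    tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, on_gaze, as_dictionary=True)
    time.sleep(5)  # record for five seconds (~600 samples at 120 Hz)
    tracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, on_gaze)
    print(f"collected {len(samples)} gaze samples")
```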

A video demonstration of the eye-tracking solution can be viewed below.

VRFocus will bring you further information on Tobii eye-tracking technology as it becomes available.

Tobii Unveils its Eye Tracking Development Kit for HTC Vive

Tobii, the eye tracking specialist, has announced availability of its development kit for HTC Vive, aimed squarely at developers looking to unlock the potential of the technology in virtual reality (VR).

The VR4 for Vive Kit is a retrofit service for the HTC Vive business edition. It contains Tobii VR4 (a reference implementation of Tobii eye tracking designed to support HMDs), an HTC Vive business edition HMD kit, the Tobii software development kit (SDK) and Tobii example applications showcasing eye tracking interactions.

Tobii HTC Vive headset

“Our existing demonstrations have resulted in a strong demand for a VR development kit with Tobii eye tracking,” said Oscar Werner, president of Tobii Tech in a statement. “Eye tracking is the next natural step for VR, and with this offering we are extending our commitment to the VR developer community to help provide the best experience possible for the end user while unleashing developer creativity.”

As the kit is designed solely for VR developers, with Tobii retrofitting the HTC Vive business edition, it won’t be made publicly available. Instead, those who are interested will need to register their details through a dedicated website, after which Tobii will contact them with a purchase quote (the company hasn’t released even a rough figure on what that might be), with shipments due to begin next month.

Eye tracking is likely to be one technology that will advance the immersion of future VR headsets as manufacturers and developers come to understand its benefits. Already built into the FOVE 0 headset, eye tracking in its most basic form can allow users to highlight in-game objects or interact with an NPC, while more advanced use comes from its combination with foveated rendering. This allows a player’s point of focus to be fully rendered at maximum resolution while the resolution of the surrounding periphery is lowered, reducing the PC performance needed to display VR software and thus the cost of buying a suitably powerful rig.
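
A minimal sketch of that basic interaction case, highlighting the in-game object a player is looking at, is shown below. It assumes the application already receives a gaze ray (origin and direction) from the eye tracker; a real engine would raycast against colliders rather than using the simple angular test here, and the 3-degree threshold is an arbitrary choice.

```python
import math

def gaze_hit(gaze_origin, gaze_dir, objects, max_angle_deg=3.0):
    """Return the scene object whose centre lies closest to the gaze ray, if any.

    `gaze_dir` is assumed to be a unit vector in the same coordinate frame as
    the object centres.
    """
    best, best_angle = None, max_angle_deg
    for obj in objects:
        to_obj = [obj["centre"][i] - gaze_origin[i] for i in range(3)]
        length = math.sqrt(sum(c * c for c in to_obj)) or 1e-9
        cos_a = sum(to_obj[i] * gaze_dir[i] for i in range(3)) / length
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = obj, angle
    return best

scene = [{"name": "door", "centre": (0.0, 1.5, 4.0)},
         {"name": "npc",  "centre": (1.0, 1.6, 3.0)}]
looked_at = gaze_hit((0.0, 1.6, 0.0), (0.0, 0.0, 1.0), scene)
print(looked_at["name"] if looked_at else "nothing")  # -> door
```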

VRFocus will continue its coverage of Tobii, reporting back with the latest announcements.

Eye-tracking Glasses Give Glimpse into Professional Pianist’s Perception

For immersive technologies like AR and VR to be effective, they need to understand not only our world, but also us. But we don’t even understand ourselves all that well yet; as we capture more biometric information on our quest to lose ourselves in augmented and virtual realities, we’re learning more about ourselves too.

Last month we wrote about a Disney Research project which used virtual reality to study human perception. The task—catching a ball—was simple, but the reality behind how the brain and body coordinate to get a simple task done can often confound our intuition.

Now Function has teamed up with eye-tracking specialists Tobii to equip a professional pianist with eye-tracking glasses to see what happens (see the video heading this article). Pianist Daniel Beliavsky sets about playing some complex arrangements, and the results are quite interesting, revealing how much we have yet to learn about our own perception.

In the video from Beliavsky’s perspective, a yellow dot shows what his eyes are fixed on. Rather than looking constantly at his hands to ensure each note is landing correctly, his eyes are darting about, often leading the hands to the next location on the keyboard, but also frequently returning to the point between his hands where he’s able to gather information about both at the same time with his peripheral vision.

This video highlights not just the potential usefulness of eye-tracking for input in immersive technologies, but also the challenges of using the raw data to understand intent. While we can use eye-tracking input for passive things like foveated rendering, using it for interaction is a much more complex problem.
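
To make the gap between raw samples and intent a little more concrete, here is a sketch of a simple dispersion-based fixation filter, a common first step when turning a raw gaze stream into something an interaction system can reason about. It is a generic illustration rather than anything from Tobii’s pipeline, and the thresholds are arbitrary.

```python
def detect_fixations(samples, max_dispersion=0.02, min_duration=0.1):
    """Group raw gaze samples (t, x, y) into fixations using a dispersion window.

    A fixation is a run of samples at least `min_duration` seconds long whose
    combined x/y spread stays below `max_dispersion` (normalized screen units).
    """
    fixations, start = [], 0
    while start < len(samples):
        end = start
        while end + 1 < len(samples):
            window = samples[start:end + 2]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            end += 1
        duration = samples[end][0] - samples[start][0]
        if duration >= min_duration:
            window = samples[start:end + 1]
            cx = sum(s[1] for s in window) / len(window)
            cy = sum(s[2] for s in window) / len(window)
            fixations.append((cx, cy, duration))
            start = end + 1
        else:
            start += 1
    return fixations

# Two synthetic 120 Hz dwells separated by a saccade yield two fixations.
dwell_a = [(i / 120.0, 0.50, 0.50) for i in range(24)]
dwell_b = [(0.3 + i / 120.0, 0.80, 0.30) for i in range(24)]
print(detect_fixations(dwell_a + dwell_b))
```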

Using the input from Beliavsky’s eyes alone, it might be possible to predict his next likely moves, but because of the complexity of how the brain, hands, and eyes are interacting in this case, doing so would be extremely difficult.

If the computer had to guess what was most important to Beliavsky while he was fixated between his hands, it might guess it’s the keys between his hands. But actually Beliavsky was using a combination of muscle memory and peripheral visual cues to make his performance work, while at times the point at which his eyes were directly fixated was not important at all. The raw data in this case misrepresents the intent of the user. Intent represents a major challenge for the usefulness of biometrics beyond simple passive input and data collection.

The more we learn about human perception, the deeper our virtual and augmented worlds will be able to immerse us. One day, it’s feasible that we might be able to fully emulate input and output of human perception, but today we’re at just the beginning of that journey.

Tobii Partners with Teotl Studios to Integrate Eye Tracking into The Solus Project VR

Today, eye tracking specialist Tobii has announced a new partnership with developer Teotl Studios to integrate its technology into the developer’s sci-fi adventure, The Solus Project VR.

Both companies are working together to adapt the existing eye-tracking enhanced PC version of the videogame to the virtual reality (VR) format. The collaboration continues Tobii’s expansion into enabling eye tracked VR content with game publishers around the globe.

Eye tracking is seen as one of the next steps in VR, allowing more intuitive and natural interactions with virtual worlds and the characters that inhabit them.

Tobii has already implemented its tech in over 55 PC titles, including Elite: Dangerous and Ghost Recon Wildlands. To date it’s built into several gaming hardware products, including notebooks such as the Alienware 17, MSI GT72 Tobii, Acer Aspire V Nitro and Predator 21 X, and monitors such as the Predator Z271T and Predator XB271HUT.

“Seeing how much more immersive The Solus Project became with eye tracking on PC, it was obvious we had to add it to the VR version,” said Sjoerd De Jong, developer at Teotl Studios. “Even though we’ve already received many accolades for the VR version, I highly recommend players to see how much richer the experience has become with eye tracking.”

“With the rapidly growing interest in VR and, more specifically, eye-tracking VR, we’re tapping into our extensive relationships with PC gaming studios to explore how to best adapt their games as well as develop the next generation of games for future VR HMD’s with eye tracking,” said Oscar Werner, president of Tobii Tech.

Tobii will be showcasing The Solus Project VR eye tracking demo at the Silicon Valley Virtual Reality (SVVR) expo in San Jose, California. Demos will be run using HTC Vive headsets following the R&D collaboration between Tobii and Valve.

The Solus Project launched in early 2016 for PC, then added support for Oculus Rift, HTC Vive and OSVR a short while later.

VRFocus will continue its coverage of Tobii and The Solus Project, reporting back with further updates.

Tobii Recommends Explicit Consent for Recording Eye Tracking Data

The eye tracking company Tobii had some VR demos that they were showing on the GDC Expo Hall floor as well as within Valve’s booth. They were primarily focusing on the new user interaction paradigms made available by eye gaze: selecting specific objects, directing action, and even locomotion determined by where you look. I had a chance to catch up with Johan Hellqvist, VP of products and integrations at Tobii, where we discussed some of the eye-tracking applications being demoed. We also had a deeper discussion about what type of eye-tracking data should be recorded and the consent that application developers should secure before capturing and storing it.
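
As a sketch of how gaze-directed locomotion of this kind might work (an assumption about the mechanic, not Tobii’s actual implementation), the application could intersect the gaze ray with the floor and teleport the player to that point when a confirmation input fires.

```python
def gaze_teleport_target(gaze_origin, gaze_dir, floor_height=0.0):
    """Intersect the gaze ray with a flat floor plane and return the landing point.

    Returns None when the user is looking at or above the horizon. A real
    implementation would raycast against level geometry instead of a plane.
    """
    if gaze_dir[1] >= 0.0:          # looking up or level: no floor hit
        return None
    t = (floor_height - gaze_origin[1]) / gaze_dir[1]
    return (gaze_origin[0] + t * gaze_dir[0],
            floor_height,
            gaze_origin[2] + t * gaze_dir[2])

# Eyes at 1.6 m, looking slightly downward and forward.
target = gaze_teleport_target((0.0, 1.6, 0.0), (0.0, -0.4, 0.9))
if target is not None:
    print(f"teleport to {target}")   # -> roughly (0.0, 0.0, 3.6)
```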

One potential application that Hellqvist suggested was amplifying someone’s eye dilation in a social VR context as a way of broadcasting engagement and interest. He said that there isn’t explicit science connecting dilation with someone’s feelings, but this example raised an interesting point about what type of data from an eye tracker should or should not be shared or recorded.

Hellqvist says that, from Tobii’s perspective, application developers should get explicit consent for any type of eye-tracking data they want to capture and store. He says, “From Tobii’s side, we should be really, really cautious about using eye tracking data to spread around. We separate using eye tracking data for interaction… it’s important for the user to know that’s just being consumed in the device and it’s not being sent [and stored]. But if they want to send it, then there should be user acceptance.”

Hellqvist says our eye gaze is semi-conscious data that we have limited control over, and that what to do with that data will ultimately be up to each application developer. Tobii has a separate part of its business that does market research with eye-tracking data, but he cautions that using eye tracking within consumer applications is a completely different context from market research, one that should require explicit consent.

Hellqvist says, “It’s important to realize that when you do consumer equipment and consumer programs that the consumer knows that his or her gaze information is kept under control. So we really want from Tobii’s side, if you use the gaze for interaction then you don’t need the user’s approval, but then it needs to be kept on the device so it’s not getting sent away. But it should be possible that if the user wants to use their data for more things, then that’s something that Tobii is working on in parallel.”
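
A minimal sketch of the separation Hellqvist describes, with gaze data always usable for on-device interaction but stored or exported only behind an explicit opt-in, might look like the following; the class and flag names are hypothetical.

```python
class GazeDataPolicy:
    """Hypothetical consent gate between the eye tracker and an application.

    Gaze samples are always available for on-device interaction, but they are
    only logged (and thus available to send anywhere) if the user has opted in.
    """

    def __init__(self, recording_consent=False):
        self.recording_consent = recording_consent
        self._log = []

    def handle_sample(self, sample, interaction_callback):
        interaction_callback(sample)          # interaction always works locally
        if self.recording_consent:            # storage requires explicit opt-in
            self._log.append(sample)

    def export_log(self):
        if not self.recording_consent:
            raise PermissionError("user has not consented to gaze recording")
        return list(self._log)

policy = GazeDataPolicy(recording_consent=False)
policy.handle_sample({"x": 0.5, "y": 0.5}, lambda s: None)
# policy.export_log() would raise PermissionError until consent is granted.
```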

Tobii will be actively working with the OpenXR standardization initiative to see if it makes sense to put some of these user-consent flags within the OpenXR API. In talking with other representatives from OpenXR about privacy, I got the sense that the OpenXR APIs will be a lot lower level than these types of application-specific requirements. So we’ll have to wait for OpenXR’s next update, expected in the next 6-12 months, to see whether Tobii was able to formalize any privacy protocols and controls within the OpenXR standard.

Overall, the Tobii and SMI VR demos that I saw at GDC proved to me that there are a lot of really compelling social presence, user interface, and rendering applications of eye tracking. However, there are still a lot of open questions around the intimate data that will be available to application developers and the privacy and consent protocols that will inform users and provide them with some level of transparency and control. It’s an important topic, and I’m glad that Tobii is leading an effort to bring more awareness to this issue within the OpenXR standardization process.

