Epic Aims to Take on Steam with Newly Announced Epic Games Store

Steam is widely considered the de facto platform for PC games, VR or otherwise, but Unreal Engine creator Epic Games wants to change that with a new storefront that it says will leave more revenue with developers than Steam and other major digital distribution platforms.

Update (12:20 PM ET): Tim Sweeney has confirmed in a Game Informer interview that VR games will have a home on the platform, although the store “doesn’t have any sort of VR user interface.”

Both Steam and the Oculus Store take an industry-standard cut of 30% of a game’s revenue. Epic CEO Tim Sweeney today announced in a blog post that the company will soon launch its own dedicated storefront that takes only a 12% slice of the pie; the 12% rate applies to games built with Epic’s Unreal Engine 4, Unity, and other game engines alike.

Fittingly dubbed ‘Epic Games Store’, the distribution platform will likely take the place of the Epic Games launcher on PC and Mac, which most famously features battle royale sensation Fortnite and the Unreal Engine itself.

Image courtesy Epic Games

There wasn’t a specific announcement regarding support for VR games (see update), although as Khronos Group’s OpenXR standard marches ever onward in its mission to make market fragmentation a thing of the past, it’s possible those “open platforms” will also include VR. Epic Games is a prominent member of the Khronos Group, working alongside a host of industry players including Oculus, HTC, Valve, Microsoft, Google, AMD, NVIDIA, and Unity.

Epic Games Store is said to launch soon, beginning what Sweeney calls “a long journey to advance the cause of all developers.” The store will first launch with a set of games curated by Epic for PC and Mac, then open to other games and “open platforms” throughout 2019.

Content creators like YouTubers can also take part in what the company calls ‘Support-A-Creator’, an opt-in program that provides revenue-sharing kickbacks to creators who refer players to buy a game on the Epic Games Store. Developers can set the specific percentage shared with content creators, although Epic intends to cover the first 5% of creator revenue-sharing for the first 24 months.
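To make the arithmetic concrete, here is a small illustrative sketch of the split described above. This is not Epic’s published formula; the function name and the assumption that the creator share is deducted from the developer’s cut are mine.

```python
def developer_take(sale_price, creator_share=0.05, epic_covers_creator=True):
    """Illustrative sketch of the Epic Games Store split described above.

    Epic takes a 12% store fee; the developer sets a creator share, and
    during the first 24 months Epic covers the first 5 percentage points
    of that share (so a 5% share costs the developer nothing).
    """
    store_fee = 0.12 * sale_price
    creator_cut = creator_share * sale_price
    if epic_covers_creator:
        # Epic absorbs the first 5% of creator revenue-sharing.
        creator_cut = max(0.0, creator_cut - 0.05 * sale_price)
    return sale_price - store_fee - creator_cut

# On a $100 sale with the default 5% creator share during the promo
# period, the developer keeps $88, versus $70 under a 30% storefront cut.
```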

“We’ve built this store and its economic model so that Epic’s interests are aligned with your interests,” explains Sweeney, talking directly to developers. “Because of the high volume of Fortnite transactions, we can process store payments, serve bandwidth, and support customers very efficiently. From Epic’s 12% store fee, we’ll have a profitable business we’ll grow and reinvest in for years to come!”

Sweeney says more details (and launch games) for Epic Games Store will be revealed at The Game Awards this Thursday, December 6th.

The post Epic Aims to Take on Steam with Newly Announced Epic Games Store appeared first on Road to VR.

First Ever OpenXR Demo Heading To Siggraph 2018 This Week

Khronos Group’s anticipated VR/AR standard API, OpenXR, is finally getting its first ever public demo at Siggraph 2018 this week.

The platform, which was announced at GDC last year, will be shown running an Unreal Engine VR demo inside both the StarVR and Microsoft’s Windows ‘Mixed Reality’ VR headsets at the event in Vancouver, Canada, from August 12th – 16th.

OpenXR is designed to streamline development of VR and AR applications for studios working across the broad range of headsets and control inputs in the market today. Its interface is able to automatically adjust apps and control schemes to accommodate the varying specifications of these devices without asking developers to spend large amounts of time doing it all manually. The group includes over 140 members, with major players like Google, Sony, Valve, HTC, Samsung, Magic Leap, Oculus, AMD and Nvidia involved.

Crucially, StarVR and Windows headsets offer very different specifications. The latter, for example, is one of the few home VR devices to currently offer inside-out tracking, and can run apps outside the SteamVR and Oculus ecosystems on PC. StarVR, meanwhile, is intended for location-based deployments and offers a wider field of view (FOV) and higher display resolution than many other VR devices, as well as support for a range of peripherals. The two headsets should serve as a good example of how VR development can be simplified across two very different devices, then.

Next up on the OpenXR roadmap is a provisional release for developers to test via an adopters program before a full launch later down the line. Dates for either haven’t yet been announced.

The post First Ever OpenXR Demo Heading To Siggraph 2018 This Week appeared first on UploadVR.

How VR’s Next Generation Shows Key Signs of a Maturing Market

VR feels like it’s simultaneously moving fast and slow. It’s been two years and four months since the leading PC VR headsets, the Oculus Rift and HTC Vive, hit shelves. The leading high-end headset overall, PSVR, has only been around for one year and nine months. That’s a significant amount of time for individual products, but just a blip in the context of the formation of a brand new ecosystem and medium. Looking forward to the next generation of VR, advancements in specs and experience are not the only improvement; the industry as a whole is becoming more cohesive, which will ultimately benefit everyone involved. Here’s how.

OpenXR API Standard

OpenXR is an in-development standard that’s open and royalty free. It’s being developed by a consortium consisting of effectively every major player in the VR market, including chip makers, headset makers, game engines, publishers, and VR app stores. The standard is being developed under the Khronos Group, the organization behind a number of major graphics standards like OpenGL, WebGL, and Vulkan.

Public supporters of OpenXR. | Image courtesy Khronos Group

OpenXR aims to foster greater interoperability between major pieces of the VR ecosystem: apps, game engines, and headsets. The goal is to make it easier to ‘write once and run anywhere’, reducing the redundancy and complexity required for an app, game engine, or headset to support the multitude of options available on the market.

The OpenXR project is building an ‘Application Interface’, which sits between VR apps and content platforms, and a ‘Device Layer’, which sits between the content platforms and individual VR headsets and devices. The idea is that the Application Interface and the Device Layer should be standardized so that everyone can design against a common target rather than needing to maintain individual support for many different platforms and devices.
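The layering idea above can be sketched in miniature. This is only a conceptual illustration: the real OpenXR standard is a C API (with entry points like xrCreateInstance), and all class and method names below are hypothetical.

```python
# Conceptual sketch of the two standardized boundaries described above:
# apps target a common application interface, and any headset plugs in
# behind a common device layer. Names here are illustrative, not OpenXR's.
from abc import ABC, abstractmethod

class DeviceLayer(ABC):
    """Standard boundary between a runtime and any headset driver."""
    @abstractmethod
    def get_pose(self): ...
    @abstractmethod
    def submit_frame(self, frame): ...

class VendorAHeadset(DeviceLayer):
    def get_pose(self): return (0.0, 1.6, 0.0)
    def submit_frame(self, frame): pass

class VendorBHeadset(DeviceLayer):
    def get_pose(self): return (0.1, 1.7, 0.2)
    def submit_frame(self, frame): pass

class Runtime:
    """Standard application interface: apps target this, not a device."""
    def __init__(self, device: DeviceLayer):
        self.device = device

    def render_loop_once(self, app_render):
        pose = self.device.get_pose()
        frame = app_render(pose)
        self.device.submit_frame(frame)
        return frame

def my_app_render(pose):
    return f"frame rendered for head pose {pose}"

# The same app code runs against either headset, today's or tomorrow's:
for headset in (VendorAHeadset(), VendorBHeadset()):
    Runtime(headset).render_loop_once(my_app_render)
```

Because the app only ever touches `Runtime`, swapping headsets (or adding one that ships years later) requires no changes on the app side, which is the interoperability win the article describes.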

OpenXR is a broadly supported initiative to create an industry standard method of interfacing between VR headsets and software. | Images courtesy Khronos Group

Hypothetically this means that a game supporting OpenXR that launches tomorrow could work with an OpenXR-supported headset that launches in five years, since both were built targeting the same Application Interface and Device Layer. Similarly, an OpenXR headset from one company could work with OpenXR controllers from another company, letting users mix and match devices more easily.

Similarly, if a new game engine wanted to offer support for VR, it needs only to target the OpenXR Application Interface to potentially work with every headset and VR platform that supports OpenXR.

This doesn’t necessarily mean that every VR platform that supports OpenXR will support every headset that supports OpenXR (for instance, the HTC Vive on the Oculus Store), but it does mean that the technical capability is in place if platform stakeholders want to move to an open device approach.

In total, this makes it easier for new players to jump into the VR market, whether that be with a game, a game engine, a content platform, or even a new headset or accessory. This benefits everyone in the market by creating more options for developers and customers, and fosters competition which leads to better products.

Image courtesy Khronos Group

In March this year, the OpenXR group announced its latest progress and did a deep dive into the technical scope of the OpenXR spec. While the group hasn’t announced an official release of the standard, the latest timeline shows that they are nearing an initial release, which we expect could come by the end of the year.

VirtualLink Connection Standard

Public supporters of the VirtualLink project

Most major tethered headsets today require two or three plugs on the end of the cable which need to be connected to the host PC, and they differ from one headset to the next. The Rift, for instance, has one HDMI plug and one USB plug. The Vive has one DisplayPort plug, one USB plug, and a separate power plug that needs to connect to a wall outlet.

VirtualLink, backed by most of the major players in the VR industry, is a newly announced connection standard that aims to simplify headset plugs into a single, well-specified connector.

Based on USB-C, the VirtualLink connector offers four high-speed HBR3 DisplayPort lanes (which are “scalable for future needs”), a USB3.1 data channel for on-board cameras, and up to 27 watts of power. The standard is said to be “purpose-built for VR,” being optimized for latency and the needs of next-generation headsets.
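The practical benefit is that the connector’s capabilities become a single, known quantity. The toy check below illustrates that idea; the dictionary keys and function are hypothetical (real negotiation happens in USB-C alternate-mode handshakes, not application code).

```python
# Hypothetical capability check illustrating why one well-specified
# connector helps: with VirtualLink, "has the port" implies all of these.
VIRTUALLINK_PORT = {
    "displayport_hbr3_lanes": 4,   # four high-speed DisplayPort lanes
    "usb": "3.1",                  # data channel for on-board cameras
    "power_watts": 27,             # up to 27 W of power delivery
}

def port_supports(headset_requirements, port=VIRTUALLINK_PORT):
    """Return True if the port meets every stated requirement.

    Illustrative only; contrast with today's situation, where customers
    must separately verify HDMI/DisplayPort versions, USB port speeds,
    and power availability for each headset.
    """
    return (
        headset_requirements.get("displayport_hbr3_lanes", 0)
            <= port["displayport_hbr3_lanes"]
        and headset_requirements.get("power_watts", 0) <= port["power_watts"]
        and headset_requirements.get("usb", "3.1") == port["usb"]
    )
```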

The new connector will not only simplify the connection on the end of the headset’s cable, but will also make it easier for customers to understand whether a given computer will support a certain headset, which eases the process of buying a VR Ready system.

Whereas today you might think your system meets all the specifications for a VR headset, only to find that your USB ports don’t actually support the speeds required for the headset, or you don’t have the right HDMI spec, the VirtualLink standard means that both customers and headset makers can count on the connector to support a well specified set of capabilities.

VirtualLink was just announced last month and is expected to debut with the next generation of GPUs and PC VR headsets; the former we could see by the end of the year.

Continued on Page 2

The post How VR’s Next Generation Shows Key Signs of a Maturing Market appeared first on Road to VR.

Accessible XR Development After VRTK

As a developer moving from the web and app world into 3D and XR, I’ve had to constantly re-evaluate my platform and tool choices as the industry evolves at tweetstorm velocity. Today’s XR development pipeline is clogged by a glut of proprietary hardware and software APIs and SDKs from competing firms like Oculus, HTC, Microsoft, Google, Apple, Sony and Valve (SteamVR) — to say nothing of emerging third-party peripherals like Logitech’s VR-tracked keyboard, the new AR-enabling Zed Mini dual-eye camera for the Rift or Vive, or any other industry-disrupting Kickstarters that might’ve sprung up since I started typing this paragraph.

Left to right: a bunch of cool stuff I want.

Each platform’s fine — even technologically stunning, one might argue — with respective strengths, weaknesses and use cases. But the distinctions force XR developers to ask hard questions: Where is the market going? How do I invest my skill-building time? What devices should my app support? What platform can I get a job working on? Developers must be business analysts as much as creative technologists to stay relevant. It’s easy to suffer choice paralysis with such a wide array of options, and easier still to bet on the wrong technology and lose.

Personally, I also face certain technical, logistic and financial realities as an independent XR developer in the Midwest (US), where the industry hasn’t proliferated as it has in major coastal cities. Thankfully, game engines like Unity and Unreal are rapidly democratizing this space. Both engines seek to bridge the gaps between the various XR SDKs, employing thousands of engineers to ensure their software plays nicely with just about any significant third-party API. For example, as I wrote about in August, the Oculus SDK integrates beautifully with Unity and comes equipped with many of the scripts and prefabs needed to quickly prototype, develop and deploy a custom Rift app.

I miss bossing around my hand-modeled #MadeWithBlocks BB-8. Check out my deep dive on this project, The Future of VR Creation Tools.

That’s fantastic, but it’s still non-standard. To port the same Unity app to the HTC Vive or a Windows HMD is non-trivial — not impossible or even terribly difficult, but non-trivial. Maintaining your app for multiple SDKs over the long haul is similarly non-trivial. Non-trivial costs money and time and we’re all short on both.

Instead imagine if XR practitioners had to worry less about betting on the right platform or device and could instead focus on creating unique and compelling experiences, content and UX. The first step down that path was VRTK — but sadly, one of the best tools to combat the VR SDK surplus will soon be hobbled by the loss of its founder.

VRTK: The Open Source Approach

This free, open source Unity toolkit aims to knit together a single workflow for a variety of VR APIs. It comes with the same stock prefabs and scripted mechanics you might find included in any single proprietary SDK, but makes each piece of functionality identical whether deployed to Oculus, SteamVR (read: Vive and, with v3.3.0, Windows HMDs) or Daydream — covering all major VR HMD manufacturers today.

It’s a boon to anyone wanting to dip their toes into VR development. Think of it: want to implement teleportation locomotion over a Unity NavMesh? Just drop the component onto your player prefab. Want to test out grab mechanics, or a quick bezier pointer? VRTK’s demo scenes have you covered, and they’ll work easily on a variety of devices. Since it’s open source, you’re also free to dive in and customize the code. Struggling to get a feature working in your own project? Check out its implementation across a variety of SDKs — not a bad way to grok new XR coding concepts.

Sadly, VRTK’s creator is sunsetting the woefully underfunded project. The UK-based developer TheStoneFox — who until recently was actively seeking contributors, partnerships and support — announced recently that he will be stepping back from the project after version 3.3.0. Though VRTK boasts an active Slack community, a growing list of “made with” titles and a recent Kickstarter, TheStoneFox was unable to attract the support necessary to sustain it for the long term.

Now, as this premier open source VR development pipeline expediter fades from view, what, if anything, will replace it?

OpenXR: One API to Rule Them All

The VRTK approach — using Unity scripting to knit together similar mechanics across a spectrum of VR SDKs — is necessary in the current fragmented development landscape, but there are downsides. Someone in the community still has to monitor the various proprietary SDK updates, and your end-user VRTK app still has to be mindful of VRTK’s own changes over time. In this way, VRTK treated the symptoms of VR SDK overload but was not equipped to address the root cause. Enter OpenXR, the Khronos Group’s upcoming industry standard:

The standard, announced in December 2016, is being written now and is quickly gaining traction among industry players (with the notable exception of Magic Leap). Instead of forcing developers to grapple with a variety of proprietary SDKs and all the accompanying business consequences, companies will instead tailor their hardware and software to comply with OpenXR’s spec. Khronos, the non-profit responsible for shepherding the Vulkan, OpenGL, OpenGL ES and WebGL standards, is leading the charge. Cue the infographics!

On the left, the problem — on the right, the solution:

Images courtesy of https://www.khronos.org/openxr.

“Each VR device can only run the apps that have been ported to its SDK. The result is high development costs and confused customers — limiting market growth,” reads some fairly accurate marketing copy on their website. “The cross-platform VR standard eliminates industry fragmentation by enabling applications to be written once to run on any VR system, and to access VR devices integrated into those VR systems to be used by applications.”

A working group of industry heavyweights has agreed the standard should be extensible to allow for future innovation and should support a range of experiences — anything from a 3-DoF controller all the way up to high-end, room-scale devices.

The only thing missing is a realistic timetable before this standard has an impact on the development community and its day-to-day workflow. Until the market-movers get their act together, we’ll be left scrambling (and patching up VRTK projects, in many cases).

OpenXR supporters: everyone except Magic Leap.

The Cinema of Attractions: Slow Your Reel

But should we so quickly welcome industry standardization while the technology is still so new and full of possibilities? That’s the question asked in a recent Voices of VR podcast by Kent Bye and Rebecca Rouse. The two discussed the early days of cinema — when exploration and experimentation were the status quo — and Rouse drew striking parallels between that era and the current period in XR production and development.

Pure spectacle then and now. Left: a Cinema of Attractions-era still. Right: Chocolate VR.

“[Scholars of early film] came up with this term ‘cinema of attractions’ because they saw an incredible wealth of diversity and kind of range of exuberant experimentation in those early pieces, so they were very hard to sort of clump them together — there was such diversity — but this ‘attraction’ idea was a large enough umbrella, because all of those early pieces are in some way showing off the technology’s capabilities and generate this experience of wonder or amazement for the viewer. And the context in which they were shown is that of attractions, so they were shown at world’s fairs and as a part of vaudeville shows with other kinds of performances and displays.”

 — Rebecca Rouse, assistant professor of communication & media at Rensselaer Polytechnic Institute

Sounds eerily familiar, huh? The whole podcast is well worth a listen, but tl;dr: while there are obvious consumer and market advantages to XR standards, Rouse argues that perhaps we shouldn’t jump the gun — not during this era of frenetic, often avant-garde XR experimentation across art, science, cinema and gaming. Looking around the industry, it’s hard to disagree.

EditorXR

One man-eating-the-camera-brilliant new application of XR technology is Unity Labs’ EditorXR. Created by Unity’s far-future R&D team (whose roles often find them working on projects and products five-to-ten years away from consumer adoption), EditorXR offers you an interface to create custom XR Unity scenes entirely within virtual reality.

Oh! And there’s flying, among other superpowers — soar through your scene like Superman or scale the whole thing down to a pinhole. They’ve literally ported the Unity inspector, hierarchy and project windows (again among others) to an increasingly user-friendly VR UI pane on your wrist. With the latest update, you’re able to:

  • hook into Google’s Poly asset database web API in real-time inside VR
  • create multiplayer EditorXR sessions for editing Unity scenes with friends and collaborators
  • run EditorXR with Unity’s primary version 2017.x editor

It’s still new and I’ve encountered bugs, but it’s a foregone conclusion that this tech will become a standard feature of Unity’s scene creation process as XR technology matures and proliferates. Even their alpha and beta efforts evoke the same sense of wonder and possibility that early Cinema of Attractions-era moviegoers must have felt.

For more insight on the design side, check out this deep dive on the future of XR UX design by Unity Lab’s Dylan Urquidi or the Twitter feed of Authoring Tools Group Lead, Timoni West.

ML-Agents

Another experimental Unity project, ML-Agents, explores one of the most promising avenues for the future of XR development, design and UX: machine learning. Using so-called “reinforcement learning” techniques, which expressly don’t feed the model any sample data or rules for analysis, ML-Agents instead applies simple rewards and punishments (in the form of tiny float values) based on the outcomes of agents’ [usually very narrowly defined] behaviors.

Stretched out over hundreds of thousands, if not millions, of trial-and-error training steps, the computer experiments with its abilities and forms a model of how best to achieve the desired goal. In this way, your agents become their own teachers — you just write the rubric.
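The reward-driven loop described above can be sketched with a toy agent. Note the caveat: ML-Agents itself trains neural-network policies in Unity; this tabular stand-in only illustrates the reward/punishment idea, and every name in it is hypothetical.

```python
import random

# Minimal sketch of reward-driven trial and error: no sample data or
# rules are given, just small positive/negative float rewards.
def train_agent(episodes=5000, seed=0):
    rng = random.Random(seed)
    actions = ["left", "right"]        # narrowly defined behavior set
    value = {a: 0.0 for a in actions}  # the agent's learned estimates

    for _ in range(episodes):
        # Explore occasionally; otherwise exploit the best-known action.
        if rng.random() < 0.1:
            a = rng.choice(actions)
        else:
            a = max(value, key=value.get)
        # The "rubric": tiny float reward for the desired outcome.
        reward = 0.1 if a == "right" else -0.1
        # Nudge the estimate toward the observed outcome.
        value[a] += 0.05 * (reward - value[a])
    return value

# After enough trial and error, the agent ranks "right" above "left"
# purely from the reward signal, never having been told the rule.
```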

The original GitHub commit contained some basic demo scenes and the development community quickly took up the torch from there. Unity’s Alessia Nigretti followed up the original blog with one describing how to integrate ML-Agents into a 2D game. On Twitter, @PunchesBears has been demonstrating similar concepts — and showing that often enough, Agents respond to developers’ carefully calculated reward system in ways they don’t anticipate. Similar to actual gamers, no?

In one of my favorite applications of ML-Agents, the developer Blake Schreurs actually brings a 6-DoF robo-arm Agent trained to seek a moving point in space into virtual reality — with slightly terrifying results once he assigns that moving target to his face.

Imagine someone applying this training model to actual robotics and fat-fingering the wrong key. Or don’t, whatever. 

He’s down for the count! I was immediately reminded of the audiences pouring out of theaters in 1895, afraid they’d be run down by the Lumière brothers’ Arrival of a Train at La Ciotat. We’re still in the salad days of both machine learning and XR development compared to where we hope to be 10 or even 50 years from now. In that time, some combination of traditional or procedural AI with these new machine learning approaches will doubtless lead to great developments in gaming and XR at large — or even in the very design process and daily workflow of computing itself.

Rift OS Core 2.0

With Rift’s new Core 2.0 OS, your entire Windows PC is accessible from your right-hand menu button. Being able to view and use your desktop apps, as well as pin windows inside other VR apps, introduces new possibilities for XR workflows (and even for traditional computing workflows) in VR.

While working on my next project, entirely within VR, I can watch Danny Bittman’s great Unity rendering and lighting tutorial on YouTube in a pinned browser while messing with those same settings on my wrist in EditorXR. I can watch @_naam craft original assets in Google Blocks at the same time I do, or I could gather assets from the Poly database and deploy them to my Unity scene in real-time VR, pulling up Visual Studio to code some game logic as I please.

That sounds pretty goddamn metaversal to me — and before long, we likely won’t even need code.

The XR Developer of the Future Is Not a Developer

If XR technology is to go mainstream, the development process must be as efficient and accessible as possible — and likely even open to non-developers through content creation and machine learning applications. Spanning sciences and disciplines, there’s so much more to talk about and speculate over that this piece hasn’t even touched on (next time I’ll examine WebVR and A-Frame as viable XR development pathways). More and more pieces of this accessible, standardized XR development pipeline will fall into place as the immersive computing revolution rolls on, though I’m thankful the XR industry isn’t ready to ditch its Cinema of Attractions ethos quite yet.

Microsoft Joins OpenXR, Becoming a Decisive Backer in the Open, Royalty-free VR/AR Standard

Khronos Group, the consortium behind the OpenXR project which aims to standardize the way applications communicate with AR and VR headsets, just added Microsoft to its ranks. Among its count of members, the OpenXR working group consisted of nearly every major player in the industry except Microsoft until now.

By virtue of its Windows operating system, which nearly every PC VR headset relies on to function, Microsoft joining the OpenXR initiative represents a win for the others involved, which include industry players like Google, Oculus, HTC, AMD, NVIDIA, Epic Games, Unity, Intel, Qualcomm, Sony, Samsung and Valve.

image courtesy Khronos Group

Although guessing at a company’s motives is a bit like reading tea leaves sometimes, Microsoft taking part in building OpenXR makes a strong case for its ultimate interest in growing the open, royalty-free standard, and not trying to create its own internally developed “DirectXR” that would essentially dictate how headsets will talk to their OS. Up until now, it wasn’t clear which way Microsoft was headed.

Khronos says work on the actual OpenXR project has already begun, and that it stands to eliminate the market fragmentation caused by VR applications and engines having to be ported and customized to run on multiple VR runtimes, and by VR sensors and displays having to be integrated with multiple driver interfaces.

There are, however, two reluctant holdouts left: Apple and Magic Leap. It remains to be seen what either will bring to the table.

The post Microsoft Joins OpenXR, Becoming a Decisive Backer in the Open, Royalty-free VR/AR Standard appeared first on Road to VR.

Revive Creator Joins OpenXR Initiative ‘to help create a truly open VR standard’

CrossVR, developer of Revive, the software that allows HTC Vive users to play games from the Oculus platform, today announced it’s joining one of the leading initiatives to create an open standard for VR and AR apps and devices, known as OpenXR.

Led by the Khronos Group, OpenXR aims to eliminate industry fragmentation by creating a standard, royalty-free API that enables applications to target a wide variety of AR and VR headsets. Those already involved in the initiative include the likes of Oculus, HTC, Samsung, Valve, Epic Games, Unity, AMD and NVIDIA, to name a few. Khronos has already helped create several open standards, including WebGL, Vulkan, and OpenGL.

image courtesy Khronos Group

Jules Blok, the creator and driving force behind Revive, announced on his Patreon early this morning that CrossVR would be officially joining as an Associate Member, something he says will “represent your interests to help ensure that the next generation of VR headsets will have a truly open standard.”

Blok initially stated that, upon reaching the $2,000 per month donation mark, he would invest the $3,500 it took to join as an Associate Member, a non-voting position in the group that allows for full participation in OpenXR’s development.

Having recently reached his goal, in large part thanks to a $2,000 monthly recurring donation from Oculus founder Palmer Luckey, Blok contacted the Khronos Group to confirm he had the $3,500 membership fee and was ready to join. To his surprise, Khronos waived the fee, giving him free entrance into the working group.

Blok says the money originally earmarked for the membership fee will be spent on the Revive project instead. Learn more about Revive (and how to install it) on CrossVR’s GitHub.

The post Revive Creator Joins OpenXR Initiative ‘to help create a truly open VR standard’ appeared first on Road to VR.

‘Lone Echo’ and ‘Echo Arena’ Now Work on HTC Vive with ReVive Hack

Platform exclusivity is a divisive issue in VR: on one hand, big financial backing helps create awesome games like Lone Echo [9/10]; on the other hand, if you chose the ‘wrong’ headset, you’re boxed out of what might otherwise become one of your favorite games. Thanks to ReVive, a free hack which allows Vive users to play games from the Oculus platform, you can now play both Lone Echo and its free multiplayer companion game Echo Arena with an HTC Vive.

With the advent of Revive, a project built by Jules Blok (aka CrossVR), the hack became central to discussion of Oculus’ approach to building a VR platform when Oculus modified their DRM in a way that prevented Revive from functioning, thus blocking Vive users from playing Oculus games. Community outcry over the decision eventually led Oculus to reverse that particular stance on DRM, saying that in the future they wouldn’t use headset verification as part of the platform’s security protections.

Now, two of the most well-received Oculus-funded games—both the campaign mode Lone Echo selling for $40 and the free multiplayer mode Echo Arena—have gained unofficial support for the HTC Vive. And with a native 360-degree setup already supported by Oculus, it’s practically plug-and-play. Of course, there’s also no telling if Oculus’ decision will hold into the future, so the mantra “buyer beware” is still in effect for potential Revive users looking to purchase on the Oculus Store.

OpenXR (formerly Khronos VR) is also looking to unite what it considers a fragmented market by advocating a universal cross-platform standard that, according to the developers, enables applications to be written once to run on any VR system, and to access VR devices integrated into those VR systems to be used by applications. Names like Epic Games, AMD, ARM, Valve, Google and even Oculus are helping with the initiative.

Legendary programmer and Oculus CTO John Carmack had this to say about OpenXR:

“Khronos’ open APIs have been immensely valuable to the industry, balancing the forces of differentiation and innovation against gratuitous vendor incompatibility. As virtual reality matures and the essential capabilities become clear in practice, a cooperatively developed open standard API is a natural and important milestone. Oculus is happy to contribute to this effort.”

Oculus founder Palmer Luckey, who left the company back in March, has also backed ReVive financially to the tune of $2,000 per month to support its continued development.

The post ‘Lone Echo’ and ‘Echo Arena’ Now Work on HTC Vive with ReVive Hack appeared first on Road to VR.

Tobii Recommends Explicit Consent for Recording Eye Tracking Data

The eye tracking company Tobii had some VR demos on show at the GDC Expo Hall as well as within Valve’s booth. They were primarily focused on the new user interaction paradigms made possible by eye tracking: using eye gaze to select specific objects, direct action, and even drive locomotion. I had a chance to catch up with Johan Hellqvist, VP of products and integrations at Tobii, to discuss some of the eye tracking applications being demoed. We also had a deeper discussion about what type of eye tracking data should be recorded and the consent that application developers should secure before capturing and storing it.

One potential application that Hellqvist suggested was amplifying someone’s eye dilation in a social VR context as a way of broadcasting engagement and interest. He said that there isn’t explicit science to connect dilation with someone’s feelings, but this example brought up an interesting point about what type of data from an eye tracker should or should not be shared or recorded.

Hellqvist says that, from Tobii’s perspective, application developers should get explicit consent for any type of eye tracking data they want to capture and store. He says, “From Tobii’s side, we should be really, really cautious about using eye tracking data to spread around. We separate using eye tracking data for interaction… it’s important for the user to know that’s just being consumed in the device and it’s not being sent [and stored]. But if they want to send it, then there should be user acceptance.”

Hellqvist says our eye gaze is semi-conscious data that we have limited control over, and that what to do with that data will ultimately be up to each application developer. Tobii has a separate part of its business that does market research with eye tracking data, but he cautions that using eye tracking within consumer applications is a completely different context than market research, and one that should require explicit consent.
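Hellqvist’s distinction between gaze data consumed on-device for interaction and gaze data that is recorded or transmitted can be sketched as a simple consent gate. The following Python sketch is purely illustrative; the class and method names are hypothetical and not part of any Tobii or OpenXR API:

```python
from dataclasses import dataclass, field


@dataclass
class EyeTrackingPolicy:
    """Illustrative consent gate modeling Tobii's recommendation:
    gaze data may drive on-device interaction without consent, but
    recording or transmitting it requires an explicit opt-in."""
    consented_uses: set = field(default_factory=set)

    def grant(self, use: str) -> None:
        # Called only after the user explicitly opts in,
        # e.g. to "record" or "transmit".
        self.consented_uses.add(use)

    def allowed(self, use: str) -> bool:
        # On-device interaction is always permitted; anything that
        # persists or shares gaze data needs explicit consent.
        if use == "interaction":
            return True
        return use in self.consented_uses


policy = EyeTrackingPolicy()
print(policy.allowed("interaction"))  # True: consumed on the device only
print(policy.allowed("record"))       # False: no consent given yet
policy.grant("record")
print(policy.allowed("record"))       # True: user explicitly opted in
```

The point of the sketch is the default: every use other than on-device interaction is denied until the user grants it, mirroring the opt-in posture Hellqvist describes.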

SEE ALSO
Watch: Tobii Reveals Vive VR Eye Tracking Examples, Including 'Rec Room'

Hellqvist says, “It’s important to realize that when you do consumer equipment and consumer programs that the consumer knows that his or her gaze information is kept under control. So we really want from Tobii’s side, if you use the gaze for interaction then you don’t need the user’s approval, but then it needs to be kept on the device so it’s not getting sent away. But it should be possible that if the user wants to use their data for more things, then that’s something that Tobii is working on in parallel.”

Tobii will be actively working with the OpenXR standardization initiative to see if it makes sense to put some of these user consent flags within the OpenXR API. In talking with other OpenXR representatives about privacy, I got the sense that the OpenXR APIs will operate at a much lower level than these kinds of application-specific requirements. So we’ll have to wait for OpenXR’s next update, expected in the next 6–12 months, to see whether Tobii was able to formalize any privacy protocols and controls within the OpenXR standard.

SEE ALSO
Valve Talks 'OpenXR', the Newly Revealed Branding for Khronos Group's Industry-backed VR Standard

Overall, the Tobii and SMI VR demos that I saw at GDC proved to me that there are a lot of really compelling social presence, user interface, and rendering applications of eye tracking. However, there are still a lot of open questions around the intimate data that will be available to application developers, and around the privacy and consent protocols that will inform users and provide them with some level of transparency and control. It’s an important topic, and I’m glad that Tobii is leading an effort to bring more awareness to this issue within the OpenXR standardization process.


Music: Fatality & Summer Trip

The post Tobii Recommends Explicit Consent for Recording Eye Tracking Data appeared first on Road to VR.

GDC 2017: Google Daydream Support Coming To Unity On March 31st

At GDC 2017, Unity held a keynote discussing the engine’s impact on the gaming industry, future updates, and much more. Unity, both the engine and the company behind it, has been very accommodating when it comes to VR development, having added a feature that lets developers work from within VR itself and provided a framework that makes developing for VR easier in general. Thus, it’s no surprise that the new 5.6 update to the engine included a couple of announcements that benefit VR platforms. The update itself also now has a release date.

The focus on stage was on fixes for NavMesh and on the progressive lightmapper, which improves the lighting workflow with path tracing solutions and flexible ways to merge shadows. In between feature details, surprise guests were brought up to demo the new Power Rangers: Legacy Wars mobile game: the original Green Ranger himself, Jason David Frank, and Ludi Lin, the Black Ranger in the new film. The two played a match against each other, won by the Green Ranger of course, and then shared another announcement for Unity 5.6: for Android and iOS, the update adds support for Google Daydream and Google Cardboard, and the full launch of the update will happen on March 31st.

Unity 5.6 will also ship with support for the Vulkan API, a GPU standard from the Khronos Group, which announced updates to the API and the new VR/AR standard OpenXR at GDC this year. The demo shown on stage was a mobile game that saw a 10–15% reduction in power consumption when using Vulkan instead of OpenGL ES. Hopefully, with the added VR extensions for the API, highly demanding immersive experiences will see a rise in computing efficiency.

If you’d like to get early access to Unity engine features, you can join the beta via the company’s website; 5.6 is part of the beta as of today. Stay tuned to UploadVR for more news from GDC 2017.


GDC 2017: Khronos Group Unveils VR/AR Standard OpenXR

In December of last year, the Khronos Group made headlines by adding Epic Games to the list of companies supporting its pursuit of an API standard for virtual and augmented reality. Representatives for Google VR, Intel, and others had already voiced their support of Khronos’ work by that point, but Epic’s Unreal Engine gave Khronos the prospect of much wider adoption of the standard. A couple of months later, that standard now has a name, as Khronos unveils the OpenXR working group at GDC 2017.

Visual representation of the fragmented industry without a standard.

As noted when we reported on Epic Games’ support for Khronos’ standard, there’s a degree of fragmentation in the VR and AR industry as different groups attempt to innovate with their own interfaces. OpenXR aims to address the fragmentation created by developers having to port to each vendor’s APIs. The OpenXR website points out that this practice leads to higher development costs and confused customers, which limits industry growth. With a single set standard, application developers only have to write code once, and it will run everywhere.
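The “write once, run everywhere” idea behind a common API can be illustrated with a minimal abstraction layer. This Python sketch is hypothetical (OpenXR itself is a C API, and all names here are invented for illustration): application code targets one shared interface, while each vendor runtime supplies its own backend.

```python
from abc import ABC, abstractmethod


class XRRuntime(ABC):
    """Hypothetical common interface, standing in for a shared
    standard like OpenXR that every vendor runtime implements."""

    @abstractmethod
    def begin_frame(self) -> str:
        ...


class VendorARuntime(XRRuntime):
    def begin_frame(self) -> str:
        return "vendor-A frame"


class VendorBRuntime(XRRuntime):
    def begin_frame(self) -> str:
        return "vendor-B frame"


def render(runtime: XRRuntime) -> str:
    # Application code is written once against the shared interface,
    # so it runs unmodified on any conforming runtime.
    return runtime.begin_frame()


print(render(VendorARuntime()))  # vendor-A frame
print(render(VendorBRuntime()))  # vendor-B frame
```

Without such a shared interface, the `render` function would need a separate code path per vendor SDK, which is exactly the porting cost the OpenXR working group is trying to eliminate.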

Also announced at GDC: Khronos’ GPU standard Vulkan is gaining momentum. A handful of games utilizing the standard have been released since it became available back in February of 2016, and Khronos is releasing new extensions for VR and multi-GPU functionality.

Vulkan is a testament to the benefits of a standard, and Khronos is looking for more companies to join OpenXR now that the group has moved beyond the exploratory phase and is developing the actual standard. Samsung, Oculus, Valve, and a great many others are already on board, and many more will likely follow now that there’s a finish line in sight that benefits the industry collectively.
