Oculus App Lab is a way for developers to publish their apps on Quest without being subject to curation, but it comes with a few caveats like not being visible in the main Quest store. Since its launch, App Lab games also haven’t been able to offer DLC or in-app purchases, but the company says it’s working to add that capability later this year.
While App Lab games can be free or paid, they can’t currently offer ‘add-ons’ like downloadable content (DLC) or in-app purchases (IAP). That effectively rules out the free-to-play and live-service pricing models (where a game is offered for free up-front and supported by add-on purchases) for App Lab developers.
Fortunately that should be changing later this year. Road to VR reached out to Oculus about the topic; the company said, “we’re currently working on adding support for in-app purchases on App Lab and are aiming for an update later this year.”
The addition of DLC and in-app purchase capabilities could have major implications for App Lab developers.
For instance, the unexpectedly popular early access game Gorilla Tag is free, but as a multiplayer title, the costs associated with running the game scale with the number of players. On Steam, the developer is able to offer a paid ‘Supporter Pack’ which gives players an optional way to support the game’s ongoing development in exchange for in-game cosmetics. But on Quest, where most of Gorilla Tag’s players play, this isn’t currently possible.
The same thing applies to games that would like to add additional content modules to their game over time. Beat Saber, for instance, sells new song packs on an ongoing basis to give players more to do in the game. If Beat Saber was on App Lab, this wouldn’t currently be possible because of the DLC and IAP limitation.
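For context, on the main Quest store add-ons are handled through the Oculus Platform SDK’s IAP interface, and it seems reasonable to assume App Lab support will look similar once it ships, though Oculus hasn’t confirmed any details. A minimal Unity sketch using the existing store entry points (the SKU string here is hypothetical; real SKUs are defined on the developer dashboard):

```csharp
using Oculus.Platform;
using Oculus.Platform.Models;

public class AddOnStore
{
    // Hypothetical SKU for illustration only.
    private const string SupporterPackSku = "supporter_pack";

    public void BuySupporterPack()
    {
        // Opens the system checkout flow for the given SKU.
        IAP.LaunchCheckoutFlow(SupporterPackSku).OnComplete(msg =>
        {
            if (msg.IsError)
            {
                UnityEngine.Debug.LogError(msg.GetError().Message);
                return;
            }
            Purchase purchase = msg.GetPurchase();
            UnityEngine.Debug.Log($"Purchased: {purchase.Sku}");
        });
    }

    public void RestorePurchases()
    {
        // Enumerates the add-ons the current user already owns.
        IAP.GetViewerPurchases().OnComplete(msg =>
        {
            if (msg.IsError) return;
            foreach (Purchase p in msg.GetPurchaseList())
                UnityEngine.Debug.Log($"Owned: {p.Sku}");
        });
    }
}
```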
The latest update to the Oculus Integration for Unity includes a beta release of a spectator camera capability which will allow developers to show a third-person view to spectators watching on a smartphone or web browser.
Today if a friend wants to watch the action going on inside your Quest game, you can cast your first-person view to your smartphone (through the Oculus app), to a TV (with Chromecast), or to a web browser (through the Casting page). Though this offers a window into the virtual world for spectators, the first-person view from a VR headset isn’t always the best way to see what’s going on because of the cropped field-of-view and motion of the player’s head.
To improve spectating, some VR games on PC and console show the action from a third-person view, which shows clearer context about what’s happening in the scene and eliminates the shaking head motion. Soon, developers on Quest will be able to add the same kind of feature to their games.
The feature can also be used in tandem with the typical first-person view, allowing the developer or user to switch from one to another as they please from within the application.
The feature may also open the door to developers being able to add a stabilized first-person view, as spectator views in some PC and console VR games have done. Oculus also previously showed a version of the feature where the spectator could even control the orientation of the third-person camera through the smartphone, though it isn’t clear if that functionality is included in this beta release.
Oculus notes that rendering the extra third-person view will have an impact on app performance, so we’d expect the feature to be limited to applications which are highly optimized and already run well on Quest, or developers could choose to enable the feature only on Quest 2 thanks to its additional power.
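Oculus hasn’t published details of the beta API itself, but conceptually the feature amounts to rendering a second, non-stereo camera alongside the headset’s view. A rough engine-level sketch of that idea in Unity (generic Unity code, not the Oculus integration’s actual interface; the follow offsets and smoothing values are illustrative):

```csharp
using UnityEngine;

// A third-person camera that follows the VR rig and renders to its own
// target, leaving the HMD's stereo rendering untouched.
[RequireComponent(typeof(Camera))]
public class SpectatorCamera : MonoBehaviour
{
    public Transform player;       // the VR rig to follow
    public RenderTexture output;   // frame consumed by the casting/encoding path
    public float smoothing = 5f;   // higher = snappier follow

    private Camera cam;

    void Start()
    {
        cam = GetComponent<Camera>();
        cam.stereoTargetEye = StereoTargetEyeMask.None; // never render to the HMD
        cam.targetTexture = output;
    }

    void LateUpdate()
    {
        // Trail the player from behind and above, smoothing away head jitter.
        Vector3 desired = player.position - player.forward * 2f + Vector3.up * 1.5f;
        transform.position = Vector3.Lerp(transform.position, desired, smoothing * Time.deltaTime);
        transform.LookAt(player.position);
    }
}
```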
Oculus Developer Hub 1.8.0 adds the ability to remotely launch URLs in Oculus Browser, including for WebXR experiences.
If you don’t know what Oculus Developer Hub (ODH) is: it’s a Windows & macOS app designed to make it easier to manage & develop for Oculus Quest headsets.
ODH lets you see the headset’s view, access screenshots & recordings, track performance metrics, view device logs, sideload apps, download Oculus SDKs, disable the proximity sensor, and more. While ODH is aimed at developers, we consider it a must-have for power users too.
The flagship new feature is the ability to launch URLs on your headset from your PC. That can be a regular website, a 360 YouTube video, or even a full-fledged WebXR experience – SideQuest has a section for these. It’s a lot more convenient than using the in-VR keyboard to manually enter the URL you want. For WebXR developers, it could even be a game changer.
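Under the hood, launching a URL on the headset presumably maps to Android’s standard intent mechanism; if you skip ODH and use adb directly (assuming developer mode and a USB or Wi-Fi connection), the rough equivalent is:

```
adb shell am start -a android.intent.action.VIEW -d "https://example.com/my-webxr-demo/"
```

On Quest, a VIEW intent for an http(s) URL opens in Oculus Browser by default; the URL above is just an example.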
The update also adds the ability to upload builds directly to App Lab & Oculus Store. That was already possible from the Oculus developer dashboard in a browser, but now you won’t need to leave the Developer Hub app.
It also adds Metrics Recording, meaning OVRMetrics performance data is saved to files which you can open to see detailed data & graphs.
The Unity Cube is an experiment from developer Tony “SkarredGhost” Vitillo to test the limits of what Oculus will allow into the App Lab program. Vitillo submitted a fully functional application which simply presents the user with a cube in a blank environment—and Oculus accepted it.
Oculus App Lab is an alternative path for developers to publish applications on Quest. For a long time the only official way to distribute an app on the headset was to submit it to the main Quest store, but Oculus would only accept applications which meet certain quality criteria, like how much content the app offered to users and whether it was appropriately polished. This made it difficult for experimental and smaller projects to get their app in front of the Quest audience.
Earlier this year, Oculus finally began offering an alternative distribution approach for Quest, called App Lab, which allows developers to submit applications for distribution—without any judgement on quality or scope—with the caveat that App Lab apps aren’t shown in the main Quest store, leaving it up to developers to point their audience to the app’s page.
To test whether Oculus is truly taking a hands-off approach to the content of App Lab apps, developer Tony “SkarredGhost” Vitillo created The Unity Cube. As the name implies, the app is simply a blank Unity environment with a grey cube—that you can’t even interact with. Even at the great price of free, this app would never stand a chance of making it onto the main Quest store. But could it make it onto App Lab?
Behold, The Unity Cube! | Image courtesy Tony Vitillo
Indeed, it turns out that Oculus won’t judge the scope of App Lab apps, as long as they meet technical requirements and content guidelines (ie: limitations on adult or illegal content, and certain app categories).
Thanks to the experiment, Vitillo also learned some things about the App Lab submission process which he shared in his breakdown of the project. We boiled down his lessons below:
The App Lab submission process is fairly easy and shouldn’t take more than an hour
App Lab store page requirements must be met before an app is accepted, including at least five screenshots that all look different
Apps which request permissions that are not actually utilized (ie: use of microphone) will be rejected from App Lab until the permissions are removed (see the manifest note after this list)
The submission took 5–6 weeks to be reviewed. It was initially rejected for not meeting some technical requirements. After resubmission it took 4–5 days to be approved.
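On the permissions point: for a typical Unity/Android build, an unused microphone permission is just an entry in AndroidManifest.xml (sometimes pulled in automatically by a plugin), and removing it means deleting that line before building:

```xml
<!-- Delete this entry if the app never records audio; App Lab rejects apps
     that request permissions they don't use. -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```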
From Vitillo’s perspective, this experiment shows that Oculus is as hands-off as they promised to be regarding the content of App Lab apps.
“Most importantly, Facebook lets you publish whatever you want. I mean, they have published an app with just a cube… this means there is absolutely no content curation. And I’m very happy about it, it means that on App Lab there is a lot of space for freedom and creativity,” he wrote. “I would still like it to be a bit more open, allowing cloud streaming applications and [adult] experiences (both are forbidden at the moment), and I would like App Lab applications to be more visible… but as a first step, I think it’s good. You can publish on App Lab whatever you want. Be brave, I even submitted a cube and I’ve been approved!”
In the decade that I’ve spent studying and reporting on VR I’ve had the fortune of experiencing a wide range of VR game design—a craft which is still far from ‘figured out’. And while I have no intention of becoming a game designer by profession, I thought it might be a good learning exercise to try my hand at designing something in VR to better understand the challenges and opportunities therein. And here I’d like to share a bit about what I learned through that process.
For context, this was a personal project conceived by myself and my developer friend Henry Ventura. We set out to see how effectively we could translate a weapon from a non-VR game into a working VR version that maintained the weapon’s ‘character’.
We looked to the arsenal of Respawn Entertainment’s Apex Legends (and its cousin Titanfall) for inspiration. The game’s iconic weapons are all quite beautifully modeled and animated, giving us a great blueprint of both function and ‘character’ to work from. We chose the Wingman revolver, and here’s what we came up with in less than a day of development time.
This was built with Unreal Engine 4 running on PC; unfortunately we won’t be distributing a demo build.
What I Learned
Although I helped a bit with textures and sounds, my role in this project was primarily interaction direction—that is, directing the way the weapon, and its various interactions, should feel.
Choose Your Weapon
The first part of the process was, of course, choosing which weapon we were going to build. If you were building an actual VR game with guns, the types of guns you choose could massively impact the rest of the game because of how their interactions dictate both the player’s attention and the game’s overall pacing.
The choice of the Wingman revolver was highly influenced by its one-handed operation. While two-handed weapons can definitely be fun in VR games, they also demand more of the player’s attention and result in the player having one less hand to interact with the environment around them.
With the Wingman in particular, the reload gave us a great bang for our buck because the player gets to use both hands for the reload without needing to dedicate the off-hand entirely to the weapon. It’s clear that Valve’s Half-Life: Alyx was also built around this idea, as the game features only single-handed weapons—even going so far as turning a shotgun (pretty much always a two-handed weapon) into a shotgun-pistol.
Character Acting
A major goal of this project was to bring the Wingman’s essential ‘character’—the way you imagine that it would feel—into VR. If we were building from scratch, we’d need to spend time figuring out what that character should be. In this case, Respawn already did much of that heavy lifting by giving us a very clear vision of the gun’s look and feel through animations and sounds that we drew upon.
I was actually quite surprised how effectively we were able to bring the character of the gun into VR. The Wingman is a hefty gun that feels like something an expert gunslinger would confidently use. Our finished VR version, especially with physics-driven gun-spinning as an optional flourish, really does make one feel like a badass gunslinger.
Our success in effectively translating the gun’s character into VR is interesting in itself because it suggests that designing VR weapons in the ‘traditional way’ (ie: by animating how you’d like them to look and feel) could be an equally valid starting point for figuring out how the weapon should behave in VR. In the traditional method, it’s an animator’s vision which defines the feel; that artistry, it turns out, can act as excellent ‘concept art’ for how a gun should be designed to feel in its VR incarnation.
A carefully designed weapon in VR can make the player themselves feel a certain way. It isn’t just about pointing the gun at enemies to make them disappear. A VR gun can be almost like a ‘costume’ that gives players license to ‘play act’ in a way encouraged by the game designer.
A little secret… the fan-firing seen in the video (when the off-hand fires the gun by pulling the hammer back) isn’t a real mechanic; it was the result of ‘acting’. We came across the idea while literally playing around with the gun and acting out a fantasy.
When we realized how fun it ‘felt’ (even though we were pretending the mechanic existed), we knew we had to include it. Since this was a prototype, there was no need to build out the actual mechanic when it could simply be ‘performed’ to get the point across. Fake it ’til you make it.
Communication During Development
While we were developing and testing the Wingman interactions, I found it was often difficult to articulate exactly what kind of changes I’d like to see. I take it this is normal for all game design, but it seems especially true for VR given the immense freedom that the player has to interact with objects.
In many cases one of us would be using the Wingman in VR while the other watched via screenshare. This worked reasonably well for trying things out and talking about them on the fly, but at times I wished we could simply both be in VR looking at the same thing together. Being able to point with my finger in 3D space, and to have my own version of the gun to (again) ‘act out’ how certain things should work, would have sped up development and made for clearer communication.
Capturing the Video was a Surprisingly Large Task That Could Have Been Optimized
I have a newfound appreciation for the Valve developer who live-captured all of the shots for the Half-Life: Alyx trailer, which really does a good job of selling what the experience feels like when you’re actually inside of it.
We really wanted to have one continuous sequence that would both demonstrate the full operation of the gun and truly show what it felt like. Between choreographing the actions and practicing the gun spins and aiming, it took several hours to get a shot we were finally happy with—and this was only a 30-second video!
Simply using the gun in a way that you naturally would in VR, and capturing the output screen, looked pretty bad because there was lots of head movement and the cropped field of view meant it was hard to tell what the framing would look like to the spectator.
Our process was to have one of us using the gun in VR while screensharing to the other. The person outside of VR would help direct each take to ensure that things were framed well and that there wasn’t too much camera movement from the spectator’s perspective. In hindsight, there are a few things we could have done differently to speed up the process and end up with a better result.
While the live direction was certainly helpful, our biggest stumbling block was the skill needed for execution! Aiming had no assists and gun-spinning was fully-physics based, so a major portion of the time it took was simply the practice necessary to do it all with no mistakes (all while ensuring the head movements didn’t impact framing). We had tons of takes that were almost right, but we often missed a barrel or had some other minor stumble.
It certainly would have saved us time if we had scripted the barrels to register a hit with each trigger pull (no aiming required). While this risks looking bad if the aim is too far off, it would have given us a much larger margin of error to work with, and allowed us to spend more time perfecting other aspects of the shot.
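For illustration, a cheat like that might look something like this in Unity (all names here are hypothetical; this isn’t from our project code):

```csharp
using UnityEngine;

// A target that can be 'hit'; stands in for whatever your scene uses.
public class Barrel : MonoBehaviour
{
    public void RegisterHit() { /* play explosion, destroy, score, etc. */ }
}

// Capture-session cheat: every trigger pull hits the nearest barrel within a
// generous cone in front of the muzzle, instead of raycasting precisely.
public class AutoHitForCapture : MonoBehaviour
{
    public Transform muzzle;
    public float maxAngle = 25f; // forgiveness cone, in degrees

    public void OnTriggerPulled()
    {
        Barrel best = null;
        float bestAngle = maxAngle;
        foreach (Barrel barrel in Object.FindObjectsOfType<Barrel>())
        {
            float angle = Vector3.Angle(muzzle.forward,
                                        barrel.transform.position - muzzle.position);
            if (angle < bestAngle) { bestAngle = angle; best = barrel; }
        }
        if (best != null) best.RegisterHit();
    }
}
```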
Another useful change would have been to project a frustum onto the environment that matched the field of view of the spectator camera. This would have allowed the person in VR to see exactly what portion of the action would be captured in the final output, and also provided clear feedback to show how much the spectator camera was being moved when the head was moved. A smoothed spectator camera (as seen in some VR games) would have helped too.
The end-all-be-all solution would be to record the actions through the game engine so that we could make minor adjustments and then re-render the scene after the fact. This would also have given the option to render the video in higher fidelity (which would be especially useful for anyone making a production VR trailer).
– – — – –
I’m really happy with how the project turned out. Not only was the gun fun to use, but I gained a new appreciation for the challenges and opportunities of a small slice of VR game design.
While VR is capable of much more than just shooters, there continues to be a huge demand for VR games based on gun combat. If the industry is going to be beholden to that demand, I would personally love to see more attention paid to the ‘character’ and interactions of VR guns.
To me, the interactions that happen within arm’s reach are among the most compelling parts of a VR game. VR guns should be thought of less as laser pointers that cause damage to distant enemies and more as interactive tools.
Thinking first and foremost about how the player will interact with the gun (rather than how the gun will interact with enemies) seems like an essential part of designing a VR gun with character.
While we stuck with a design rooted in reality, this project has further convinced me that there’s a huge untapped opportunity for creative ‘sci-fi’ guns that work in totally novel ways that are uniquely fun in VR. If guns are going to be the essential ‘tool’ the player is given in many VR games, I’d love to see more thinking outside of the realistic assault rifle, SMG, sniper, and shotgun paradigm.
Oculus today introduced a program called App Lab, a new effort which will allow developers to distribute their VR apps on Quest without being subject to a quality review ‘curation’ process. Though apps still need to meet Oculus’ technical guidelines, and won’t appear in the Quest store like formally approved apps, they can be independently distributed and even sold without a cumbersome sideloading process.
With Quest, Oculus introduced a ‘curated’ approach to the headset’s app store. Since launch, any app published on the Quest store has needed to meet technical, content, and quality guidelines. That means Oculus has manually reviewed each application and made a judgement call on whether or not it meets the quality bar it’s hoping to maintain. This has meant that many developers who would like to offer their apps on Quest have been barred from doing so.
Oculus has now delivered a long-promised workaround that makes it easier for developers to distribute their apps on the headset, even if the company isn’t yet ready to include them in the store.
Called App Lab, the program allows developers to upload their app to the Oculus store infrastructure and create essentially the same product page that approved apps get to use. The only major caveat is that App Lab apps won’t be shown to users who are browsing the usual Quest store, leaving it up to developers to point their audience to the app’s product page. Luckily, as long as users know the exact name of the app, they can even find it through the regular Quest store search.
In order to be part of App Lab, apps still need to follow Oculus’ technical and content guidelines (meaning no adult apps, or other content which is against the company’s policies), but apps will not be reviewed for quality.
What’s more, App Lab apps can be free or paid, and will appear in the user’s Quest library just like any other app the user owns. App Lab apps can even be updated through the same automatic update process as apps in the Quest store, and can access “the majority of standard platform features, including automatic update distribution, platform integration and SDKs, app analytics, release channels, and more,” Oculus says.
App Lab, which allows developers to skirt Quest curation, is a more streamlined take on the unofficial distribution channels which have relied on ‘sideloading’. Sideloading is intended to let developers load and run applications directly on Quest without downloading them through a hosted platform, but it has also functioned as a back door for users to run applications downloaded outside the store.
A popular platform and application, called SideQuest, sprang up to formalize that process, effectively creating an unofficial Quest store for users to easily browse, download, and sideload apps onto their headset.
Fortunately, Oculus has embraced SideQuest, and worked directly with the creators to allow SideQuest product pages to point users directly to App Lab hosted apps. “Because App Lab apps do not require sideloading, developer mode, or a PC to install, we expect that this support will dramatically increase the reach of SideQuest apps that use App Lab for distribution,” Oculus says. “SideQuest supports App Lab apps starting today, and community-focused platforms of that nature may play a bigger role in Quest’s future.”
Beyond providing an avenue for experimental and less polished games to be distributed on Quest, App Lab will also make it easier for non-game applications to make it onto the headset.
Playing Oculus Quest games with friends over the past few months revealed a major system-wide flaw: most developers don’t properly support Oculus Party voice chat.
That’s not to put blame on developers – Facebook’s system architecture is non-standard and somewhat confusing. The result for players is a Party voice chat system that feels like it barely ever works.
The cause isn’t a bug – the system is technically working as intended.
At this point you may be wondering: what does it even mean to “support” Party voice chat, and why does doing so matter?
The Problem
Just like Xbox Live or PlayStation Network, the Oculus platform lets players invite friends to a ‘Party’ – a background group voice call. Since Quest is architected like a console, you can’t use 3rd party alternatives like Discord (at least not backgrounded).
Some games even let you launch directly into a session with your Party – but that’s rarely used & unrelated to the issue I’m discussing today.
To get microphone access, apps call the platform method SetSystemVoipSuppressed(true), which suppresses the system-level Party chat. This method is supposed to be used temporarily, with the app setting it back to false when the microphone stops being needed.
But here’s the problem: most multiplayer Quest games set it to true throughout the entire app lifecycle to ensure the microphone is always available. This has serious consequences for players trying to team up & coordinate to play your game.
The Consequences
A group of Quest-owning friends decides to play some VR games together. The first hops on and invites the others to a Party. They join and together decide on a game.
Since the game calls SetSystemVoipSuppressed(true) on launch and never un-suppresses it, each friend will stop transmitting to & hearing from the others in the Party immediately upon the game loading.
In the best case scenario – games with intuitive friend-based invite systems (eg. Population ONE) – the group of friends will be able to hear each other again relatively soon by creating an in-game party.
In the worst case scenarios – games using invite codes or passwords (eg. Onward) – there is now no way for the friends to group up & hear each other without taking off their headsets & exchanging the code via their phones (or using the clunky Oculus text chat messaging system).
In both cases, the games mute the Party audio long before they’ve finished loading, meaning the player is initially left in a silent black void.
Players experiencing these issues often assume either the game or the Oculus Party system is broken. A quick “Oculus Party chat not working” Google search confirms this, with legions of frustrated users.
The Solution
The solution is to only enable SetSystemVoipSuppressed for the times your app needs the microphone, not throughout the full app lifecycle.
When the player exits an active multiplayer lobby or session, re-enable their Party voice chat by calling:
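```csharp
// Unity, via the Oculus Platform SDK; the native equivalent is
// ovr_Voip_SetSystemVoipSuppressed(false).
Oculus.Platform.Voip.SetSystemVoipSuppressed(false);
```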
They’ll be able to coordinate with their friends again to get back into a session. You should leave Party chat un-suppressed throughout menus, offline tutorials, and single player content – anywhere except an active multiplayer lobby.
Given the issues around coordinating into lobbies, it could even be argued that Party chat shouldn’t be suppressed until there’s more than one occupant in the lobby.
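Putting the rule together, a minimal sketch of the suppress/unsuppress lifecycle (the method and parameter names here are placeholders for whatever your networking layer provides; only the Voip call is from the Platform SDK):

```csharp
using Oculus.Platform;

public class PartyChatPolicy
{
    // Call this whenever session state or lobby occupancy changes.
    public void UpdateMicOwnership(bool inMultiplayerSession, int occupants)
    {
        // Only take the microphone (suppressing Party chat) once there is
        // actually someone to talk to in-game; menus, tutorials, and
        // single-player content leave Party chat running.
        bool appNeedsMic = inMultiplayerSession && occupants > 1;
        Voip.SetSystemVoipSuppressed(appNeedsMic);
    }
}
```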
Making these changes will make it easier & less frustrating for groups of friends to play your game on Oculus Quest.
Long-time Valve VR programmer Joe Ludwig, working in his own capacity, is building an open-source platform to bring AR-like utilities into virtual reality. The project, called Aardvark, is something of an evolution of the VR dashboards we know today, bringing new functionality into interactive and spatially-aware ‘gadgets’ that can run inside any VR application.
Joe Ludwig is a Valve programmer who has been closely involved with the company’s VR efforts since the early days. Lately he’s been working on his own open-source project called Aardvark which essentially wants to bring augmented reality into VR—meaning a ‘layer’ for lightweight spatial apps to run inside of virtual reality spaces.
Like other VR environments, SteamVR already has a dashboard which the user can call up while inside any application to access useful information, like browsing their game library or changing settings.
While VR dashboards provide useful core functionality, they are essentially big floating screens that appear on top of your current virtual reality app. Aardvark, on the other hand, aims to allow small AR-like utilities called ‘gadgets’ to run inside existing VR applications to provide additional functionality.
For instance, you might want to build a screenshot tool that takes the form of a virtual camera which the player uses to take photos of the virtual world. Rather than building such functionality into a single game, that kind of tool could potentially be built as an Aardvark gadget which could operate inside of any VR application. Similarly, simple utilities like timers, web video players, Twitch chat boxes, drawing boards, friends lists, etc, could be built as Aardvark gadgets which the player can make use of no matter what game they’re inside.
Aardvark is still in quite early development, with mostly basic example gadgets so far, but Ludwig gives a breakdown of what kinds of things they could do and what they look like running inside of a VR environment:
Interestingly, Aardvark gadgets are, in a sense, written as ‘web apps’, where the gadget’s functionality is defined similarly to a webpage, and Aardvark is the ‘browser’ that renders it into the virtual space. But it’s not like WebXR, which actually renders its own full scene directly. Ludwig says this approach is primarily for performance and scalability.
[…] Aardvark in some ways is my whitepaper […] what I think is the right approach is that JavaScript works very well in a declarative environment already. When you open a web page, what you’re looking at is some HTML and some CSS and some images that were generated by JavaScript. And that JavaScript runs—not every time you need to generate a pixel because your monitor’s refresh rate is 60Hz—what the JavaScript does is it either declares in the first place or manipulates the declared HTML elements and then those run through a layout engine that’s written in C++ that chews on them, does it very quickly—figures out how big all the boxes are, figures out how big the fonts are—renders all that stuff […] that all feeds into that rectangle that’s on your monitor, and the JavaScript only runs when you click a thing or when you drag a thing or when you mouse over a thing.
So the JavaScript runs at the events that happen at a human time scale—or an interaction timescale—where they’re a few times a second instead of 90 times a second or 144 times a second [the rendering rate of VR headsets]. And the native code—the C++ code—does the smooth animation of the video or the smooth animation of the controls sliding in over the course of several frames when you mouse over a thing—that’s all in C++. You express your intent through these declarative approaches of HTML and CSS, and then the native code—the system of the web browser—actually does the work to render that to the user.
So Aardvark does a similar thing. In Aardvark, at no point do you take that WebXR approach—which is to ask the system ‘where the hand is, load a model, draw the model where the hand is’. You don’t do that [in Aardvark]. What you do is you say ‘draw this model of the hand’ and you hand that down to Aardvark and Aardvark says ‘oh I’m drawing this model relative to the hand’ […] but the statement you’re making is ‘draw it on the hand’. What that means is that 11ms later, when your hand moves a few millimeters to the left, Aardvark knows it’s on the hand and it draws it at the new hand position.
So Aardvark needs to run at framerate, but none of the gadgets need to run at framerate. And if you have a gadget that’s slow, it doesn’t matter, because it doesn’t have to run at framerate. So between the performance implications of doing things in a declarative way, and the visual fidelity implications of using a scene-graph to composite instead of using these depth-buffers and pixel maps to composite, I think that Aardvark is taking an approach that’s more scalable in a lot of ways and will end up with higher quality and higher fidelity results in a lot of ways. But part of the reason that I’m building and working on it is to kind of prove out that thesis. I don’t think it’s settled yet. […] eventually we’ll find out what the answer is.
The ‘browser’ approach also brings other benefits. For one, gadgets can be built with the sort of functions you’d expect from any website—the ability to render text, load images, and pull information from other parts of the web. Being based on the web also means distribution and maintenance of gadgets is easy, says Ludwig, because gadgets are basically web pages that anyone can access via a URL. As long as you know how to write the gadget, distributing it is as easy as hosting a website and sending people the URL.
In Ludwig’s discussion on Voices of VR, he notes that development of the platform is still ongoing and much of the functionality is minimally defined—that way Aardvark can evolve naturally to fit the use-cases that gadget builders envision.
Right now, Ludwig says, the project is mainly looking for contributors to experiment with building their own gadgets. If you’re interested in building gadgets or contributing to the underlying platform, check out the Aardvark GitHub page.
Since 2017, the online distribution platform Steam has published best-of lists of the most successful VR games of the year. With a bit of research, the most recent available list, covering 2019, offers a glimpse behind the scenes. Guest author Adrian Maleska looked into the following questions: Which 3D engines were used in the most successful VR games? Which categories/genres were most frequented? And finally: what influence did pricing have on sales success?
The current VR best-seller list covers the past year, 2019. According to its sparse labeling, it honors “the top VR games of 2019 measured by total revenue that year”. The corresponding 2017 list reads “top VR experiences by gross revenue…”, so it can be assumed that all of the top-seller lists are based on gross revenue. That also makes clear that placement is determined not by units sold but by revenue, which is relevant insofar as a high-priced title needs fewer unit sales than a cheaper one to reach a higher rank.

The Platinum, Gold, Silver, and Bronze tiers group the most successful games and applications, the ones that made the cash register ring the loudest. Steam gives no information about how much revenue each title earned, but one thing is certain: the higher a title’s position in the list, the more revenue it generated.
The 3D Engines Behind the Games
While software makers don’t treat the engine they use as a secret, the information is not always easy to find. There are developers who proudly present the engine’s logo prominently in their trailer or banner, but they are a minority. Getting at the engine information requires searching more widely. Besides Wikipedia.de, the game databases MobyGames.com and IndieDB.com served as sources for rough orientation. In a few cases the engine information was already waiting there, which simplified the search immensely; in most cases, however, more extensive research was needed. A clear attribution could ultimately be made via the developer teams’ websites, their job postings, or listings on job boards: whoever is looking for Unity or Unreal software developers will, in all likelihood, be using those engines in production.

It was harder to dig up this information for titles from solo developers, who often have no website of their own and no team to recruit for. The forums of the 3D engines sometimes provided answers, and some forthcoming developers answered players’ engine questions directly on Steam and Reddit.
The easiest to classify were the big studios of the games industry, each of which relies on its own in-house engine. Valve uses the Source engine for “Half-Life: Alyx”; Bethesda the Creation Engine for Fallout 4 VR and Skyrim VR; Rockstar Games the Rockstar Advanced Game Engine in “L.A. Noire: The VR Case Files”. Croteam, known for Serious Sam, unsurprisingly uses the Serious Engine for “The Talos Principle VR”. And in id Software’s “DOOM VFR”, how could it be otherwise, beats the 3D heart of the id Tech engine.
These Are the Most Successful 3D Engines of 2019
The in-house engines of the big game makers account, with 8 titles, for just 11.63% of the top sellers. This was to be expected: they are proprietary solutions meant for internal use and thus not available for licensing by third parties. The wide field is therefore dominated by the licensable engines Unity and Unreal, which together clearly lead the top-seller list with 76 titles and an 88.37% share.

Unity takes the largest share with 54 titles (62.79%), followed at a considerable distance by Unreal with 22 titles (25.58%). The notion that developers earn the most money with Unreal (“Most developers use Unity, but the money is made with Unreal”, https://www.vrnerds.de/die-meisten-entwickler-nutzen-unity-geld-macht-man-aber-mit-unreal/) does not appear to hold for VR games, at least. That position is reinforced by the Unity-to-Unreal ratio in the highest-revenue Platinum tier: 63.64% to 18.18%.

Note: the complete list of top sellers and their 3D engines can be found at the end of the article.
Average Title Price per 3D Engine
Striking is the large price premium on titles that use proprietary 3D engines. In the prestigious Platinum tier it amounts to a hefty 35.79 euros over the other engine types; one tier down, in Gold, the premium is 26.72 euros; in Bronze it becomes more moderate at 9.64 euros. Across all tiers the premium is 19.17 euros, which puts proprietary titles at twice the price of titles built with Unity or Unreal. This development is hardly surprising, since the high-priced titles include established brands such as Fallout 4 VR, Skyrim VR, and Half-Life: Alyx.

No such large differences in average price exist between Unity and Unreal titles; their pricing stays at a comparable level. It is surprising, however, that Unity titles cost somewhat more than Unreal titles: +2.43 euros in Platinum, +2.02 euros in Gold, and marginally more (+0.56 euros) in Silver. Bronze is the outlier, with Unreal titles costing on average 4.63 euros more than Unity titles.

Looking at the average prices per tier, prices climb almost continuously with the tier level; only Gold titles are slightly cheaper than Silver titles. The Bronze tier has the widest price span, from 9.99 euros for the cheapest title to 49.99 euros for the most expensive.

Analyzing the minimum and maximum prices yields further insights. The price ceiling for a VR game sits at 59.99 euros and is reserved for established game brands with plenty of content; only in that constellation do players seem willing to spend such sums. There also appears to be a minimum viable price for reaching a given award tier. Apart from the Gold tier, which is a small anomaly, the tier entry prices are clearly delineated: a title needs a price of at least 9.90 euros to have any chance at a Bronze award; for Silver it is 14.99 euros, for Gold 12.99 euros, and for Platinum 19.99 euros.

As in retail, prices are consistently set not at round numbers but at threshold values, most often ending in 99 cents.
The Highest-Grossing VR Categories
Overall, the Action category leads the popularity scale with 43 titles, followed by Adventure with 12. Simulations come to 8 titles, Indie titles are represented with 5 entries, and RPGs with 4. Casual, Sports, and Tools each share an equal 3 titles, while Racing occupies only 2. The remaining categories (Design & Illustration, MMOs, and Strategy) are represented by one title each.

The mere 2 titles in the Racing category are no surprise, as virtual reality is currently only partially suited to this kind of game. Google Tilt Brush, the exotic bird, is the sole occupant of Design & Illustration. Somewhat puzzling is the low popularity of strategy games, with Final Assault as the only representative. MMOs likewise claim just one slot, but the obvious extra effort of building an MMO makes that explicable.

It is notable that titles in the Action category participate particularly often in Steam’s Early Access program: 14 titles do.

Worth mentioning is the fact that genres do not always match the category under which games are released on Steam. VR Paradise could only be called a simulation with a wink, since it is in fact an adult game. The boxing game “The Thrill of the Fight – VR Boxing” belongs more to the Sports category than to the Action category its publisher chose.
| Category | Total | Platinum | Gold | Bronze | Early Access |
| --- | --- | --- | --- | --- | --- |
| Adventure | 12 | – | 2 | 7 | – |
| Action | 43 | 7 | 6 | 20 | 14 |
| Design & Illustration | 1 | – | – | – | – |
| Casual | 3 | – | – | 3 | – |
| Indie | 5 | 2 | – | 2 | 1 |
| MMO | 1 | – | – | – | – |
| Racing | 2 | – | – | 2 | – |
| RPG | 4 | 2 | – | 2 | 1 |
| Simulation | 8 | – | 2 | 6 | 1 |
| Sports | 3 | – | 1 | 2 | – |
| Strategy | 1 | – | – | 1 | – |
| Tools | 3 | – | 1 | 2 | – |
Anomalies and Surprises
In the action game “Space Junkies”, Ubisoft Montpellier presents its new 3D engine, named Brigitte, for the first time. It appears to be a completely new development that has not been used in any other game, so one might speculate that the engine was tailored exclusively to VR games. It remains to be seen whether further Ubisoft VR titles will bet on Brigitte.

Another surprise is the complete absence of Crytek’s CryEngine among the top sellers. The explanation: the only well-known VR game built on CryEngine, “The Climb”, is available only in the Oculus Store, not on Steam. Fans surely wonder whether the blockbusters Far Cry or Crysis will ever be realized in VR. For Far Cry there is hope, as the brand belongs to Ubisoft, which is pushing virtual reality in its games. Whether Crysis VR will ever see the light of day remains questionable; that brand belongs to EA Games, a publisher rather skeptical of VR.

One might also have expected the presence of the free 3D engine Godot, a serious free alternative to Unity and Unreal, yet Godot is not represented by a single VR game among the top sellers. This could be read as a statement on the state of the Godot engine: it is not yet mature enough for commercial VR development, or it simply does not yet have developers producing good games with it.

The big anomaly among the titles is Half-Life: Alyx, which could already be pre-ordered in 2019 but was not released until March 23, 2020. One might ask whether Valve Index buyers, who receive Half-Life: Alyx for free, are counted in the top-seller list.
Steam’s Moving Data
One might assume that the 2019 titles would see no further movement in the list, the year being closed, but that is clearly not the case here. Since the first visit to the 2019 list, the positions of titles within the sub-tiers have not only shifted continuously; individual titles have even slipped out of their original tiers. “Angry Birds VR: Isle of Pigs”, for example, was initially listed under Silver and later fell into the midfield of Bronze. With a finalized statistic this should not happen, since it would require the title’s revenue to drop after the fact. It is conceivable that many players refunded the title, or that the publisher granted a retroactive discount out of the goodness of its heart, but neither is likely. So the question remains what exactly Steam wants to show us with this list. Does the year 2020 count as well? Or does Steam use an internal accounting period or fiscal year that need not correspond to the calendar year? Possibly the first quarter of 2020 is included too. These are conjectures, as clear answers are not to be found at Steam. A scrap of uncertainty thus remains, and as everyone knows, a moving target is harder to hit.

For this analysis, only the data from the first snapshot, taken on February 11, 2020, was used. The evaluation within that window should therefore be consistent, carry meaning, and suffice for orientation.
Overview of the 3D Engines

Below, the 3D engines of the Steam VR top-seller list are briefly introduced.
Unity Engine
Developer: Unity Technologies
Current version: 2019.3
Licensable: yes
Scripting language: C#
Visual scripting: via commercial plugins (Playmaker, Bolt); native from version 2020.1
First release: 2004
Operating systems: Windows, macOS, Linux (beta)
Unity is a development environment for interactive applications and games from Unity Technologies, which was founded in 2004 under the name Over the Edge by an international trio: the Icelander David Helgason, the Dane Nicholas Francis, and the German Joachim Ante. The company was renamed Unity Technologies in 2006.

A selection of games built with the Unity engine:
Deus Ex: The Fall, Assassin’s Creed: Identity, Dead Effect 1-2, Angry Birds 2 / Epic, Rust, Battlestar Galactica Online, Final Fantasy IX, Job Simulator, The Lab, Pokémon Go, Superhot, Beat Saber, Endless Space 1-2, NASCAR Heat, Wasteland 2, Hearthstone: Heroes of Warcraft, The Elder Scrolls: Legends, Ori and the Blind Forest, Escape from Tarkov, Prey for the Gods
Unreal Engine
Developer: Epic Games
Current version: 4.2
Licensable: yes
Scripting language: C++
Visual scripting: native via Blueprints
First release: 1998
Operating systems: Windows, macOS, Linux
Unreal Engine is the development environment for interactive applications and games from Epic Games, first shipped in 1998 together with the first-person shooter of the same name, Unreal. Epic Games was founded in 1991 by the American Tim Sweeney, initially under the name Epic MegaGames; the renaming to Epic Games followed in 1999.

A selection of games built with the Unreal engine:
Unreal, Unreal Tournament, Deus Ex, Rune, Duke Nukem Forever, BioShock, Vanguard: Saga of Heroes, Warpath, Alien Breed 1-3, Borderlands 1-2, Gears of War 1-4, Infinity Blade, Mass Effect 1-3, Medal of Honor, Eve: Valkyrie, Hellblade: Senua’s Sacrifice, Warhammer 40,000: Eternal Crusade
Creation Engine
Developer: Bethesda Game Studios
Current version: 1.9.35.0
Licensable: no
Scripting language: C++
First release: 2011
Operating systems: Windows
The Creation Engine is Bethesda Game Studios’ game engine and a further development of the Gamebryo engine. It is not intended for licensing by third-party developers and is used exclusively for internal productions. There is, however, the Creation Kit, a tool intended for modding games built with the Creation Engine. The Creation Engine was used to create, among others, the role-playing games The Elder Scrolls V: Skyrim, Fallout 4, and Fallout 76.

Serious Engine

The Serious Engine is Croteam’s in-house game engine and was used, among others, for the first-person shooter Serious Sam and the puzzle game The Talos Principle. Croteam was founded in Croatia in 1993 by six friends: Admir Elezović, Davor Hunski, Alen Ladavac, Roman Ribarić, Dean Sekulić, and Davor Tomičić.
Rockstar Advanced
Developer: Rockstar San Diego
Current version: unknown
Licensable: no
Scripting languages: unknown
First release: 2006
Operating systems: unknown
The Rockstar Advanced Game Engine (RAGE) is a game engine developed by Rockstar San Diego, in collaboration with other Rockstar studios, for in-house use. Rockstar Advanced has been used in productions including Grand Theft Auto IV and V, Red Dead Redemption 1 and 2, and Max Payne 3.

id Tech Engine

id Software is an American game company founded in 1991 by four members of the company Softdisk: John Carmack, John Romero, Tom Hall, and Adrian Carmack (no relation to John Carmack). The id Tech engine has been used, in its various iterations, in games including Doom 1-3, Quake 1-3, Wolfenstein 3D, and Rage.

Source Engine

The Source engine was developed by Valve and is used both for internal development and for licensing. Valve also intends to open the successor Source 2 engine to third-party developers in the future, on the condition that products built with it be released on Valve’s Steam distribution platform.

The Source engine is used by, among others, Half-Life 1-2, Team Fortress 2, Portal 1-2, and Left 4 Dead 1-2, and by third-party developers for Vampire: The Masquerade – Bloodlines, E.Y.E.: Divine Cybermancy, and Postal III. The new Source 2 engine was first used in Dota 2, as well as in the then-upcoming Half-Life: Alyx.

Brigitte Engine

The Brigitte engine was developed by Ubisoft Montpellier for use in Ubisoft’s own productions and was first deployed in the VR game Space Junkies. As the youngest of the engines presented here, little information about it is available.
List of All 2019 VR Top Sellers with 3D Engine Attribution
Adrian Maleska is an enthusiastic VR developer of the first hour. As a programmer and designer he has contributed to numerous games since the late 1980s. He is currently working on his own indie VR games: www.maleska.de
The latest version of the Oculus Integration for Unity, v23, adds experimental OpenXR support for Quest and Quest 2 application development. A new technique for reducing positional latency called ‘Phase Sync’ has been added to both the Unity and Unreal Engine 4 integrations; Oculus recommends that all Quest developers consider using it.
OpenXR Support for Oculus Unity Integration
OpenXR, the industry-backed standard that aims to streamline the development of XR applications, has made several major steps this year toward becoming production ready. Today Oculus released new development tools which add experimental OpenXR support for Quest and Quest 2 applications built with Unity.
OpenXR aims to allow developers to build a single application which is compatible with any OpenXR headset, rather than needing to build a different version of the application for each headset runtime.
While Unity is working on its own OpenXR support, the newly released v23 Oculus Integration for Unity adds support for an “OpenXR experimental plugin for Oculus Quest and Oculus Quest 2.” This should allow for the development of OpenXR applications based on the features provided by the Oculus Integration for Unity.
Phase Sync Latency Reduction in Unity and Unreal Engine
The v23 Oculus integrations for Unity and Unreal Engine 4 also bring new latency reduction tech called Phase Sync, which can reduce positional tracking latency with ‘no performance overhead’, according to Oculus. The company recommends “every in-development app to enable [Phase Sync], especially if your app is latency sensitive (if it uses hand tracking, for example).”
While Quest has long used Asynchronous Timewarp to reduce head-rotation latency by warping the rendered frame to the most recent rotational data just before it goes to the display, positional tracking doesn’t benefit from this technique.
One way to reduce positional tracking latency is to minimize the amount of time between when a frame starts rendering and when it actually reaches the display. Ideally the frame will finish rendering just before being sent to the display; if it finishes early, all of the time between when the frame is finished and when it is sent to the display becomes added positional latency.
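To put illustrative numbers on it: at Quest’s 72Hz refresh rate the frame budget is roughly 13.9ms, so a frame that finishes rendering in 6ms and then sits waiting for the compositor picks up roughly 8ms of extra positional latency compared to one that finishes just in time.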
Phase Sync introduces dynamic frame timing which adjusts on the fly to make sure frames are being completed in an optimal way for latency reduction.
Unlike the Oculus PC SDK, the Oculus Mobile SDK has been using fixed-latency mode to manage frame timing since its inception. The philosophy behind fixed-latency mode is to finish everything as early as possible to avoid stale frames. It achieves this goal well, but with our release of Quest 2, which has significantly more CPU and GPU compute than our original Quest, a lot of apps can finish rendering their frames earlier than planned. As a result, we tend to see more “early frames” […]
Compared with fixed-latency mode, Phase Sync handles frame timing adaptively according to the app’s workload. The aim is to have the frame finish rendering right before our compositor needs the completed frame, so it can save as much latency as possible, and also not missing any frames. The difference between Phase Sync and fixed-latency mode can be illustrated in the following graph on a typical multi-threaded VR app.
Image courtesy Oculus
Luckily, turning on Phase Sync is as easy as checking a box with the v23 Unity and Unreal Engine integrations from Oculus (details here).