Meta’s new Avatar SDK is now available for all Unity developers, with Unreal Engine support coming next year.
Announced at last year’s Connect, Meta Avatars replaces the legacy Oculus Avatar SDK first launched with the Oculus Touch controllers in late 2016. Back then, avatars had a basic monochrome style with the eyes always hidden by a virtual headset or sunglasses. A major update released alongside Oculus Go in 2018 added skin tones, and another just before Oculus Quest in 2019 added lipsync, microexpressions, and simulated eye movement.
The editor lets you create a full-body avatar, but in apps avatars are shown only from the torso up. A VR system with only head and hand tracking can make a reasonable guess at your elbow position, but it’s not really possible to do the same for legs.
Meta is already using these new avatars across its Horizon suite of social VR apps, including Horizon Home, the upcoming update to Quest’s Home software that adds social features. Meta hopes as many third-party apps as possible will leverage the SDK so users have a consistent virtual identity across a wide range of apps. The developers of Eleven Table Tennis and Cook-Out have expressed their intention to support it.
The SDK supports Quest and Windows-based VR platforms. However, users on non-Meta platforms can’t customize their own avatar. Instead, they can pick from 32 default avatars – though developers will need to build their own interface for this.
Controller-free hand tracking now works properly in OpenXR Unity apps on Quest.
This means apps can now use both hand tracking and passthrough at the same time, since the Passthrough API requires OpenXR.
OpenXR is the open standard API for VR and AR development. It was developed by Khronos, the same non-profit industry consortium that manages OpenGL. OpenXR’s working group includes all the major companies in the space, such as Meta, Sony, Valve, Microsoft, HTC, NVIDIA, and AMD – but notably not Apple.
But until now, all releases of the Oculus Integration for Unity have included a caveat:
Support for hand-tracking is currently restricted to the baseline OpenXR spec. Therefore, additional hand-tracking features such as collision capsules, hand input metadata, and runtime hand meshes are not yet supported. In addition, there is a known compatibility issue with the thumb trapezium bone (Thumb0) in the OpenXR-based OVRPlugin.
Attempting to use hand tracking with OpenXR resulted in the thumb being in the wrong position, on top of the missing features mentioned above.
With release v35, this limitation notice is now gone and the thumb appears mostly correct. I say mostly because the thumb still doesn’t fully make contact with the index finger when pinching, but the gap is very small.
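For developers who want to verify pinch behavior in their own project, a minimal check using the OVRHand component looks something like the sketch below. It is written from memory of the Oculus Integration (prefab path and member names included), so verify it against your SDK version:

```csharp
using UnityEngine;

// Attach to any GameObject and assign the OVRHand component found under
// OVRCameraRig > TrackingSpace > RightHandAnchor > OVRHandPrefab.
public class PinchLogger : MonoBehaviour
{
    [SerializeField] private OVRHand hand;

    void Update()
    {
        // Only trust the data when tracking is active and confidence is high.
        if (!hand.IsTracked || hand.HandConfidence != OVRHand.TrackingConfidence.High)
            return;

        // GetFingerIsPinching reports whether the given fingertip is touching
        // the thumb tip, the gesture affected by the old Thumb0 issue.
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
            Debug.Log($"Index pinch, strength {strength:F2}");
        }
    }
}
```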
You can expect Oculus Store and App Lab apps to start shipping updates with hand tracking and passthrough API in the coming weeks.
Voice SDK is powered by Wit.ai, the voice interface company Meta acquired in 2015. Wit is server-side, so it won’t work offline and adds some latency.
Meta says this new API enables searching for in-app content and “voice driven gameplay” such as verbal magic spells or a conversation with a virtual character. While last month’s Oculus SDK release included Voice SDK, it was marked ‘experimental’ so couldn’t be included in Oculus Store or App Lab apps.
Voice SDK can actually do more than just return text from speech. It has natural language processing (NLP) features to detect commands (e.g. cancel), parse entities (e.g. distance, time, quantity), and analyze sentiment (positive, neutral, or negative). You can read the Voice SDK documentation here.
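To give a sense of what comes back from Wit, here is a rough sketch of a plain text query against Wit.ai's /message REST endpoint using UnityWebRequest. The Voice SDK wraps this for you (including microphone capture and speech-to-text); the token and API version string below are placeholders, and the response handling assumes Unity 2020.2 or newer:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Illustrative only: sends a text utterance to Wit.ai and logs the raw NLP
// response (intents, entities, and traits such as sentiment).
public class WitTextQuery : MonoBehaviour
{
    private const string WitToken = "YOUR_WIT_SERVER_TOKEN"; // placeholder

    // Usage: StartCoroutine(Query("cancel the last order"));
    public IEnumerator Query(string utterance)
    {
        string url = "https://api.wit.ai/message?v=20211101&q=" +
                     UnityWebRequest.EscapeURL(utterance);

        using (UnityWebRequest req = UnityWebRequest.Get(url))
        {
            req.SetRequestHeader("Authorization", "Bearer " + WitToken);
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError(req.error); // Wit is server-side, so this fails offline
                yield break;
            }

            // JSON containing detected intents (e.g. "cancel"), resolved
            // entities (distance, time, quantity) and traits like sentiment.
            Debug.Log(req.downloadHandler.text);
        }
    }
}
```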
Spatial Anchors is a new experimental feature for mixed reality, meaning it can’t be used in Store or App Lab apps yet.
Developers can already use passthrough – the view from the Quest’s greyscale tracking cameras – as a layer (e.g. the background) or on a custom mesh (e.g. a desk in front of you). Spatial Anchors are world-locked reference frames that let apps place content in a specific position users define in their room. The headset will remember these anchor positions between sessions. You can read the documentation for Spatial Anchors here.
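For context, a bare-bones Unity passthrough setup looks roughly like the sketch below. The OVRPassthroughLayer component and its members are written from memory of the Oculus Integration (v31 and later), so treat the exact names as assumptions and check the current Passthrough API docs; the experimental Spatial Anchors interface is omitted here since it may still change:

```csharp
using UnityEngine;

// Rough passthrough setup. Requires passthrough to be enabled on OVRManager
// and an OVRPassthroughLayer component in the scene.
public class PassthroughSetup : MonoBehaviour
{
    [SerializeField] private OVRPassthroughLayer passthroughLayer; // set to Underlay in the Inspector
    [SerializeField] private MeshFilter deskSurface; // optional mesh roughly matching your desk

    void Start()
    {
        // Used as an underlay, the passthrough feed becomes the scene background.
        passthroughLayer.textureOpacity = 1.0f;
        passthroughLayer.edgeRenderingEnabled = true;
        passthroughLayer.edgeColor = Color.cyan;

        // Alternatively, project passthrough onto a specific surface
        // (e.g. the desk in front of you) instead of the whole view.
        if (deskSurface != null)
            passthroughLayer.AddSurfaceGeometry(deskSurface.gameObject, updateTransform: true);
    }
}
```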
Next year Meta plans to add Scene Understanding as an experimental feature. Users will be prompted to (manually) mark out their walls and furniture, and to enter their ceiling height – a process called Scene Capture. This will need to be done for each room but the headset should remember the Scene Model between sessions.
Masterpiece Studio (formerly MasterpieceVR) today announced it’s releasing a free edition of its latest professional VR creator suite, Masterpiece Studio Pro. The free license targets individuals looking to use the suite non-commercially.
The free version is said to contain the entire set of features of Masterpiece Studio Pro, which is a subscription-based service aimed at freelancers, teams, and educators using its creation tools for work.
Like the original 2019-era Masterpiece Studio, Masterpiece Studio Pro lets users create 3D assets within VR, using motion controllers to draw, sculpt, texture, optimize, rig, skin, and animate things like characters or objects. The Pro version launched back in April 2021.
Image courtesy Masterpiece Studio
One of the biggest caveats with the original was that exporting models was restricted to paying users. That’s still the case with the free version of Pro, although the studio has now created a public library where creations can be published and viewed.
“We believe this Free version will help showcase your work, bring value to other creatives, and help build the creative community of the future,” the studio says on its Steam page.
The Ontario, Canada-based startup is pitching the free license as a way to support VR indie creators by not only letting them learn the ropes of their software for free, but also by establishing a way to share and remix those publicly shared creations. You can find it on PC VR headsets for free over at Steam and Viveport.
At Connect 2021 last week, Meta revealed a new Quest rendering technology called Application Spacewarp which it says can increase the performance of Quest apps by a whopping 70%. While similar to the Asynchronous Spacewarp tech available to Oculus PC apps, Meta says Application Spacewarp will produce even better results.
The original article, which overviews the Application Spacewarp rendering tech, continues below.
Original Article (November 5th, 2021): Given that Quest is powered by a mobile processor, developers building VR apps need to think carefully about performance optimization in order to hit the minimum bar of 72 FPS to match the headset’s 72Hz display. It’s even harder if they want to use the 90Hz or 120Hz display modes (which make apps look smoother and reduce latency).
Considering the high bar for performance on Quest’s low-powered hardware, anything that can help boost app performance is a boon for developers.
The technique works by allowing applications to render at half framerate (for instance, 36 FPS instead of 72 FPS); the system then generates a synthetic frame, based on the motion in the previous frame, to fill in every other frame. Visually, the app appears to be running at the same rate as a full-framerate app, but only half of the normal rendering work needs to be done.
Image courtesy Meta
An application targeting 36 FPS has twice as much time to render each frame compared to running at 72 FPS; that extra time can be spent by developers however they’d like (for instance, to render at a higher resolution, use better anti-aliasing, increase geometric complexity, put more objects on screen, etc.).
Of course, Application Spacewarp itself needs some of the freed up compute time to do its work. Meta, having tested the system with a number of existing Quest applications, says that the technique increases the render time available to developers by up to 70%, even after Application Spacewarp finishes its work.
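To put rough numbers on that, the per-frame budgets work out as follows (plain arithmetic, no SDK calls involved; remember that Application Spacewarp's own overhead eats into the doubled budget):

```csharp
using UnityEngine;

// Back-of-the-envelope frame budgets for native rendering vs. Application Spacewarp targets.
public static class FrameBudget
{
    public static float Milliseconds(int fps) => 1000f / fps;

    public static void LogBudgets()
    {
        Debug.Log($"Native 72 FPS:         {Milliseconds(72):F1} ms per frame"); // ~13.9 ms
        Debug.Log($"AppSW 36 FPS @ 72Hz:   {Milliseconds(36):F1} ms per frame"); // ~27.8 ms
        Debug.Log($"AppSW 45 FPS @ 90Hz:   {Milliseconds(45):F1} ms per frame"); // ~22.2 ms
        Debug.Log($"AppSW 60 FPS @ 120Hz:  {Milliseconds(60):F1} ms per frame"); // ~16.7 ms
    }
}
```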
Developer Control
Developers using Application Spacewarp can target 36 FPS for 72Hz display, 45 FPS for 90Hz, or 60 FPS for 120Hz.
Meta Tech Lead Neel Bedekar posits that 45 FPS for 90Hz display is the “sweet spot” for developers using Application Spacewarp because it requires less compute than the current minimum bar (45 FPS instead of 72 FPS) and results in a higher refresh rate (90Hz instead of 72Hz). That makes it a fairly easy ‘drop-in’ solution that improves how the app runs without requiring any additional optimization.
Of course 60 FPS for 120Hz display would be even better from a refresh rate standpoint, but in this case a 60 FPS app using Application Spacewarp would require additional optimization compared to a native 72 FPS app (because of the overhead compute used by Application Spacewarp).
Meta emphasizes that Application Spacewarp is fully controllable by the developer on a frame-by-frame basis. That gives developers the flexibility to use the feature when they need it or disable it when it isn’t wanted, even on the fly.
Developers also have full control over the key data that goes into Application Spacewarp: depth buffers and motion vectors. Meta says that this control can help developers deal with edge cases and even find creative solutions to best take advantage of the system.
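As a rough illustration of that control, a per-frame or per-scene toggle might look like the sketch below. OVRManager.SetSpaceWarp is the entry point described for the Unity integration, but treat the exact name and its requirements (the Oculus URP fork, motion-vector passes on your shaders) as assumptions to verify against the developer guide:

```csharp
using UnityEngine;

// Sketch of toggling Application Spacewarp at runtime; names are assumptions
// to check against Meta's AppSW developer guide for your SDK version.
public class SpacewarpController : MonoBehaviour
{
    private bool spaceWarpEnabled;

    // Enable for heavy scenes; disable for moments where synthetic frames
    // might produce visible artifacts (e.g. fast-moving UI or thin geometry).
    public void SetSpacewarp(bool enabled)
    {
        if (enabled == spaceWarpEnabled)
            return;

        OVRManager.SetSpaceWarp(enabled); // assumed API; verify it exists in your integration
        spaceWarpEnabled = enabled;
    }
}
```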
Lower Latency Than Full Framerate
Meta says that, combined with other techniques, Quest applications using Application Spacewarp can have even lower latency than their full-framerate counterparts (that aren’t using the extra tech).
That’s thanks to additional techniques available to Quest developers—Phase Sync, Late Latching, and Positional Timewarp—all of which work together to minimize the time between sampling the user’s motion input and displaying a frame.
Differences Between Application Spacewarp (Quest) and Asynchronous Spacewarp (PC)
While a similar technique has been employed previously on Oculus PC called Asynchronous Spacewarp, Meta Tech Lead Neel Bedekar says that the Quest version (Application Spacewarp) can produce “significantly” better results because applications generate their own highly-accurate motion vectors which inform the creation of synthetic frames. In the Oculus PC version, motion vectors were estimated based on finished frames which makes for less accurate results.
Application Spacewarp Availability
Application Spacewarp will be available to Quest developers beginning in the next two weeks or so. Meta is promising the technique will support Unity, Unreal Engine, and native Quest development right out of the gate, including a “comprehensive developer guide.”
Per the update at the top of the article, Application Spacewarp is now available to Quest developers.
Unity developers will be able to integrate the new Oculus Avatars 2.0 from December, with Unreal Engine support coming next year.
Announced at last year’s Connect in September 2020, Avatars 2.0 replaces the legacy Oculus Avatar SDK first launched with the Oculus Touch controllers in late 2016. Back then, avatars had a basic monochrome style with the eyes always hidden by a virtual headset or sunglasses. A major update released alongside Oculus Go in 2018 added skin tones, and another just before Oculus Quest in 2019 added lipsync, microexpressions, and simulated eye movement.
While the editor lets you create a full-body avatar, in apps you’ll only be seen from the torso up. A VR system with head and hand tracking can make a reasonable guess as to your elbow position, but it’s not really possible to do the same for legs.
Facebook is using the new avatars across its entire Horizon suite of social VR apps. That will include Horizon Home, the upcoming update to Quest’s Home software that adds social features. As part of its “metaverse” push, Facebook hopes as many third-party apps as possible will leverage the SDK so users have a consistent virtual identity across a wide range of apps. However, there’s no word yet on whether the SDK supports other VR platforms such as SteamVR. Multi-platform developers are unlikely to be enticed if they have to integrate a different avatar SDK for each system.
Oculus Quest developers will get speech recognition with the next SDK release. Tracked keyboard support and a Unity hand interaction library are planned for next year.
Called Voice SDK Experimental, speech recognition will be powered by Wit.ai, a voice interface company Facebook acquired in 2015. The company says this will enable searching for in-app content and “voice driven gameplay” such as verbal magic spells or a conversation with a virtual character. Facebook says Voice SDK is “free to sign up and get started”, which may suggest using it in a full app will incur a fee.
Tracked Keyboard support will arrive next year. Oculus Quest already supports a tracked keyboard – specifically the Logitech K830 Bluetooth model – but it currently only works in Oculus Home, for 2D apps like Oculus Browser. Allowing developers to bring a tracked keyboard into their apps could open up new productivity use cases. In the meantime, Immersed built its own manual system for showing your keyboard in VR.
Adding hand interactions such as grabbing and pushing is still a challenge in VR development. While there are a number of interaction frameworks available, each has its shortcomings and some require importing a heavy library developers may not need. Facebook says its Interaction SDK Experimental Unity library, slated for next year, “can be used together, independently, or even integrated into other interaction frameworks”. It will support both controllers and hand tracking, and include “a set of ready to use, robust interaction components like grab, poke, target and select”.
Developers will also be able to ship mixed reality apps on App Lab and the Oculus Store, with Spatial Anchors coming soon and Scene Capture coming next year. You can read more about the mixed reality developer features here.
Oculus plans to further open up the mixed reality capabilities of Quest with new tools that will allow developers to build apps which more intelligently integrate with the user’s real room. In the near future developers will also be permitted to distribute mixed reality apps to customers via the Quest store or Oculus App Lab for the first time.
Oculus first began unlocking Quest’s mixed reality capabilities with the Passthrough API which allowed developers to tap into the headset’s pass-through video view for the first time earlier this year. Now the company is announcing a more advanced set of tools, which it calls the Presence Platform, which will allow developers to build more advanced mixed reality applications.
The Presence Platform includes the Insight SDK, Interaction SDK, and Voice SDK.
Insight SDK
The main building block of the Insight SDK is the Passthrough feature, which developers previously had access to in an experimental form. That feature is moving out of experimental and into general availability starting with the next developer update.
Additionally, the Insight SDK includes Spatial Anchors, which gives developers the ability to place virtual objects in the scene and have them persist between sessions. For instance, a piano learning app could allow you to mark the location of your piano and then remember where it is any time you open the app.
The Insight SDK further includes Scene Understanding, which Oculus says allows developers to build “scene-aware experiences that have rich interactions with the user’s environment.” This includes a geometric and semantic representation of the user’s space, meaning developers can see the shape of the room and get a useful idea of what’s in it. For instance, the Scene Understanding feature will let developers know which parts of the scene are walls, ceilings, floors, furniture, etc., all of which can be used as surfaces on which virtual content can be naturally placed.
Oculus says the developer will see a “single, comprehensive, up-to-date representation of the physical world that is indexable and queryable.” You can think of this like the headset building a map of the space around you that developers can use as a guide upon which to build a virtual experience that understands your physical space.
However, users will need to do some work on their end in order to generate this map for apps that need it, including marking their walls and tracing over their furniture.
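To make “indexable and queryable” a little more concrete, here is a purely hypothetical sketch. None of the types below exist, since Scene Understanding hasn't shipped; they are invented stand-ins meant only to illustrate the kind of query a developer could run against a captured Scene Model:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Purely hypothetical: SceneModel, ScenePlane, and SemanticLabel are invented
// names used only to illustrate querying a captured Scene Model.
public class PlaceOnWalls : MonoBehaviour
{
    [SerializeField] private GameObject posterPrefab;

    void Start()
    {
        // Imagine asking the headset's scene model for every surface
        // the user marked as a wall during Scene Capture...
        IReadOnlyList<ScenePlane> walls = SceneModel.Query(SemanticLabel.Wall);

        // ...then anchoring virtual content flush against each one.
        foreach (ScenePlane wall in walls)
            Instantiate(posterPrefab, wall.Center, wall.Rotation);
    }
}

// Stand-in types so the sketch compiles; a real SDK would provide these.
public enum SemanticLabel { Wall, Ceiling, Floor, Furniture }
public struct ScenePlane { public Vector3 Center; public Quaternion Rotation; }
public static class SceneModel
{
    public static IReadOnlyList<ScenePlane> Query(SemanticLabel label) =>
        new List<ScenePlane>(); // placeholder: real data would come from the runtime
}
```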
Crucially Oculus says that the Insight SDK will enable developers to build feature-rich mixed reality apps “without needing access to the raw images or videos from your Quest sensors.” We’ve reached out to the company to further clarify if Oculus itself will send the raw sensor footage off of the headset for any processing, or if it will all happen on-device.
The Scene Understanding portion of the Insight SDK will launch in an experimental form early next year, according to the company.
Interaction SDK
Another part of the Presence Platform is the Interaction SDK which will give Unity developers a ready-made set of simple interactions for hands & controllers, like poking buttons, grabbing objects, targeting, and selecting. This saves developers time in building their own versions of these commonly used interactions in their apps.
Oculus says the goal of the Interaction SDK is to “offer standardized interaction patterns, and prevent regressions [in tracking performance of specific interactions] as the technology evolves,” and further says that the system will make it easier for developers to build their own interactions and gestures.
The company says that the Interaction SDK (and the previously announced Tracked Keyboard SDK) will become available early next year.
Voice SDK
The Voice SDK portion of the Presence Platform will open up voice-control to Quest developers, which Oculus says can drive both simple navigation functions (like quickly launching your favorite Beat Saber song with your voice) and gameplay (like casting a voice-activated spell).
The system is based on Facebook’s Wit.ai natural language platform which is free to use. Oculus says the Voice SDK will arrive in an experimental form in the next developer release.
Mixed Reality Apps on the Quest Store and App Lab
While not all of the Presence Platform SDKs will arrive at the same time, as of the next Quest developer release, devs will be allowed to ship mixed reality apps via the Quest store or App Lab. That release is expected next month.
The World Beyond Sample App
Early next year Oculus says it will make available a sample project called The World Beyond which developers can use as a starting point for building atop the Presence Platform features. The app will also be made available to users.
Carrier Command 2 launched earlier this month alongside a separate version including full VR support. While the non-VR version of the game sits at a ‘Mostly Positive’ 75% user rating, the VR version quickly tanked to a 27% ‘Mostly Negative’ rating. The root cause: a rush to cram VR into the game with limited testing and a lack of feedback from experienced VR players. While developer Geometa is working hard to deliver fixes, this avoidable stumble right out of the gate hampers the odds of success for the VR version.
User reviews of Carrier Command 2 VR are in the gutter just one week after the game launched. The core issue, however, isn’t that VR isn’t a good fit for the game, but that the developer simply didn’t take their time with the implementation.
We suspected as much when the VR version was announced just two weeks before the game was set to launch, and after the demo period for the non-VR version had passed.
Developer Geometa admitted that VR support wasn’t originally part of its development plan, but—prompted by player requests—felt it would be easy to drop VR into place considering the ‘hands-on’ design of the non-VR version where players control the entire game from the bridge of an aircraft carrier.
Image courtesy Geometa
“While VR was not originally in our roadmap, the literalism of the diegetic interfaces within the game has made it very easy to introduce VR to Carrier Command—this is the same game, with the same balance and same mechanics!” the studio wrote ahead of launch.
And to be fair, the game does look like it could be a good fit for VR. The issues that quickly earned it a 27% ‘Mostly Negative’ rating from users are overwhelmingly about technical missteps when it comes to VR, and much less about the content of the game itself.
It’s clear that a lack of user testing across the range of PC VR hardware and players is to blame. And let’s be honest—it’s far from easy for a small developer to test with every headset out there. But to give your VR game its best chance of success, you’ve got to find a way to get that crucial, early feedback.
Serious credit to the studio for their quick work. But it pains me (and I’m sure them too) to know that the game got battered with bad reviews right out of the gate due to these relatively easy to fix technical issues. That red ‘Mostly Negative’ text is like a scarlet letter that can scare away plenty of curious customers.
‘Carrier Command 2 VR’ user reviews over time
With the fixes deployed so far the studio is steadily climbing out of the hole it dug, but it’s going to be an uphill battle.
Not the First and Not the Last
Carrier Command 2 VR is far from the first VR game to launch with VR-related technical issues that could have been fixed ahead of launch to spare the game from getting slammed with early negative reviews.
VR shooter Larcenauts is another recent example. While the game launched pretty much flawlessly on Quest and Oculus PC, it was completely broken for non-Oculus headsets on Steam right out of the gate. The studio didn’t clearly communicate that non-Oculus headsets wouldn’t be supported at launch, and it got blasted with reviews from understandably confused customers who bought the game and couldn’t play it.
Image courtesy Impulse Gear
While the game’s most recent reviews are ‘Mostly Positive’ at 72%, the overall reviews are clearly stained by the launch issues, sitting at a ‘Mixed’ 66%.
‘Larcenauts’ user reviews over time
Even big studios with serious QA resources aren’t immune. Industry heavyweight Respawn Entertainment launched Medal of Honor: Above and Beyond back in December 2020. It was the first Oculus-funded title to launch on Steam, and it too got battered by early negative reviews.
Image courtesy Respawn Entertainment
While the issues weren’t only technical in nature, many of them were the kind of thing that you’d probably uncover easily by testing the game with just a handful of experienced VR players ahead of launch.
Basic options like smooth turning and video settings weren’t available, and the game’s ‘face-scope’ sniper rifle and pace-breaking ‘Victory!’ screen between missions were universally disliked (and quickly removed).
Over the course of two months Respawn released four patches addressing a litany of issues, many of which were identified by players on day one.
Like Larcenauts, Medal of Honor: Above and Beyond has battled its way back to a ‘Mostly Positive’ 70% rating among recent reviews, but the issues clearly left a mark on the overall rating, which now sits at a ‘Mixed’ 62% that can easily steer away interested customers.
‘Medal of Honor: Above and Beyond’ user reviews over time
– – — – –
So what’s the takeaway here? Making VR games is hard. It’s a science and art that is not nearly as ‘figured out’ as non-VR games—even for top industry talent like Respawn.
To send your game out the door with the best chance of success, testing early with VR users is key—Valve made a huge point about this when we talked with them about building Half-Life: Alyx. It’s hard to get your hands on every headset out there, so find willing enthusiasts and gather their feedback in a structured and actionable way.
Oculus is making a hard shift away from its proprietary developer APIs in favor of OpenXR, an industry-backed project that aims to standardize the development of VR and AR applications. As of the latest SDK update, the company says OpenXR will become “the preferred API for all new applications going forward.”
OpenXR is a royalty-free standard that aims to standardize the development of VR and AR applications, making for a more interoperable ecosystem. The standard has been in development since April 2017 and is supported by virtually every major hardware, platform, and engine company in the VR industry, including key AR players.
Image courtesy The Khronos Group
OpenXR has seen a slow but steady rollout since reaching version ‘1.0’ in 2019; this new announcement from Oculus is sure to hasten the pace significantly.
The move begins with the v31 SDK update, in which Oculus is shifting to OpenXR as the “preferred API for all new applications going forward.” According to Oculus, that means only its OpenXR SDK will receive “full support” (like QA testing, bug fixes, and up-to-date documentation). New developer features, like the recently announced passthrough API, will be delivered only through OpenXR extensions from this point forward.
Applications built with the older Oculus Mobile and Oculus PC SDKs will of course continue to work on existing headsets, but starting on August 31st, Oculus is downgrading those SDKs to “compatibility support” only, which means limited QA testing, only critical bug fixing, and no new developer features.
One year later, on August 31st, 2022, Oculus will require that new applications be built with OpenXR, and the Oculus Mobile and Oculus PC SDKs will move to “unsupported” status.
Even after that date, older applications built with the Oculus Mobile and Oculus PC SDKs will continue to work on existing headsets, but Oculus is pushing hard to get all new applications built with OpenXR.
While both Unity and Unreal Engine—the two most commonly used tools for building VR applications—offer some degree of support for OpenXR, neither has shifted to OpenXR as the default for building new VR applications.
In Unity, OpenXR support is still considered “experimental.” Oculus expects that the Unity OpenXR plugin won’t be “fully supported” until early 2022, at which point it will become the recommended option for building VR applications.
As for Unreal Engine, Oculus plans to make an OpenXR backend plugin the default in the v32 release of the Oculus SDK, and expects “full support” for OpenXR in Unreal Engine with the release of Unreal Engine 5 (expected in early 2022). Once UE5 gets its full release, Oculus says that new VR projects for Oculus headsets built with UE5 will be required to use OpenXR.
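For developers unsure which backend a given build is actually running on, a quick runtime check can help. XRSettings is standard Unity; the OVRPlugin property below is written from memory of recent Oculus Integration releases and should be treated as an assumption to verify against the version you ship with:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs which XR device and (assumed) OVRPlugin backend the build is using.
public class BackendCheck : MonoBehaviour
{
    void Start()
    {
        Debug.Log($"Loaded XR device: {XRSettings.loadedDeviceName}");
        Debug.Log($"OVRPlugin XR API: {OVRPlugin.nativeXrApi}"); // assumed property; expect OpenXR on v31+
    }
}
```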