Developers: How To Properly Support Oculus Quest’s Party Voice Chat

Playing Oculus Quest games with friends over the past few months revealed a major system-wide flaw: most developers don’t properly support Oculus Party voice chat.

That’s not to put blame on developers – Facebook’s system architecture is non-standard and somewhat confusing. The result for players is a Party voice chat system that feels like it barely ever works.

The cause isn’t a bug – the system is technically working as intended.

At this point you may be wondering: what does it even mean to “support” Party voice chat, and why does doing so matter?

The Problem

Just like Xbox Live or PlayStation Network, the Oculus platform lets players invite friends to a ‘Party’ – a background group voice call. Since Quest is architected like a console, you can’t use third-party alternatives like Discord (at least not running in the background).

Some games even let you launch directly into a session with your Party – but that’s rarely used & unrelated to the issue I’m discussing today.

Quest only lets one app – or the system – use the microphone at a time. The Oculus Platform SDK lets developers disable Party Chat so the microphone is available for their own multiplayer voice chat system:

Oculus.Platform.VoIP.SetSystemVoipSuppressed(true);

This method is meant to be used temporarily: set it back to false as soon as the microphone is no longer needed.

But here’s the problem: most multiplayer Quest games set it to true throughout the entire app lifecycle to ensure the microphone is always available. This has serious consequences for players trying to team up & coordinate for your game.

The Consequences

A group of Quest-owning friends decides to play some VR games together. The first friend hops on and invites the others to a Party. They join, and together the group decides on a game.

Since the game calls SetSystemVoipSuppressed(true) on launch and never un-suppresses it, each friend will stop transmitting to & hearing from the others in the Party immediately upon the game loading.

In the best-case scenario – games with intuitive friend-based invite systems (eg. Population ONE) – the group of friends will be able to hear each other again relatively soon by creating an in-game party.

In the worst-case scenarios – games using invite codes or passwords (eg. Onward) – there is now no way for the friends to group up & hear each other without taking off their headsets & exchanging the code via their phones (or using the clunky Oculus text messaging system).

In both cases, the game mutes the Party audio long before it has finished loading, meaning the player is initially left in a silent black void.

Players experiencing these issues often assume either the game or the Oculus Party system is broken. A quick “Oculus Party chat not working” Google search confirms this, with legions of frustrated users.

The Solution

The solution is to suppress system VoIP only while your app actually needs the microphone, not throughout the full app lifecycle.

When the player exits an active multiplayer lobby or session, re-enable their Party voice chat by calling:

Oculus.Platform.VoIP.SetSystemVoipSuppressed(false);

They’ll be able to coordinate with their friends again to get back into a session. You should leave Party chat un-suppressed throughout menus, offline tutorials, and single player content – anywhere except an active multiplayer lobby.

Given the issues around coordinating into lobbies, it could even be argued that Party Chat shouldn’t be suppressed until there’s more than one occupant in the lobby.
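Putting this together, the whole lifecycle can be sketched as a small Unity component. This is a minimal sketch assuming Unity with the Oculus Platform SDK; the `OnSessionJoined`/`OnSessionLeft` methods are hypothetical hooks you would wire up to your own networking events:

```csharp
using UnityEngine;

// Sketch: suppress Party Chat only while an active multiplayer session
// needs the microphone. OnSessionJoined/OnSessionLeft are hypothetical
// hooks for your own networking layer, not Oculus SDK callbacks.
public class PartyChatManager : MonoBehaviour
{
    // Call when the player actually enters an active multiplayer session.
    public void OnSessionJoined()
    {
        Oculus.Platform.VoIP.SetSystemVoipSuppressed(true);
    }

    // Call when the player leaves the session or returns to the menu.
    public void OnSessionLeft()
    {
        Oculus.Platform.VoIP.SetSystemVoipSuppressed(false);
    }

    // Safety net: don't leave Party Chat muted if the app closes mid-session.
    private void OnApplicationQuit()
    {
        Oculus.Platform.VoIP.SetSystemVoipSuppressed(false);
    }
}
```

The key design point is that suppression lives with the session, not with the app: menus, tutorials, and single-player content never touch the setting.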

Making these changes will make it easier & less frustrating for groups of friends to play your game on Oculus Quest.

Oculus Quest Getting Passthrough+ API For AR Apps, Starting With Spatial

Facebook intends to release an API next year that would essentially allow developers to support AR mode in their Oculus Quest apps.

Both Oculus Quest headsets feature four ultra-wide-angle monochrome cameras. The system uses them for positional tracking and controller tracking, but you see the real world through them when setting up the Guardian system or by double-tapping the headset. This is known as video passthrough.

It’s called “Passthrough+” because it uses computer vision algorithms to fuse the views from two of the cameras into a depth-correct image. That’s needed because the physical cameras are actually quite far from your real eyes; if their raw output were shown, the scale and perspective would be wrong – you’d get a headache over time.

Remote work & meetings software Spatial already has access to the new API, and shared a short preview of it in action:

https://www.youtube.com/watch?v=aiw6OonAUCI

Note that while this API will let developers superimpose virtual objects onto the real world, doing so doesn’t require giving those developers access to the camera feeds.

Facebook isn’t giving any concrete details yet, but says it will share more on the feature next year. It’s possible this move could see it effectively become a dev kit for Facebook’s future AR ambitions.

Set Oculus Link Resolution Easily With This New Graphics Option

Oculus PC software v19 adds the ability to choose between three render resolution options for Oculus Link. Alternatively, you can have the resolution set automatically based on your graphics card.

Oculus Link is the feature which lets Oculus Quest act as a PC VR headset via a USB cable. This gives Quest owners who own a gaming PC access to the Oculus Rift library and SteamVR.

Games running on Quest itself don’t have graphics options. As on a console, developers pick a balance of detail that maintains a solid framerate. But PC components vary in performance, so allowing different render resolutions makes sense for Oculus Link, where your PC renders the graphics.

You can find the new option in the Devices tab of the Oculus PC app. Click on ‘Quest and Touch‘ and scroll down to ‘Graphics Preference‘:

The resolutions of these options aren’t shown in the Oculus software – but we tested each option in a Unity app, logging the eye texture resolution each resulted in:

  • Prioritize Quality: 2784×3056 per eye
  • Balanced: 2448×2688 per eye
  • Prioritize Performance: 1568×1728 per eye
  • Automatic: ‘Performance’ for GTX 970, ‘Balanced’ for RTX 2070
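If you want to verify these figures yourself, Unity’s standard `XRSettings` API exposes the per-eye texture size the runtime requested. A minimal sketch (assuming the eye texture resolution scale is left at its default of 1.0):

```csharp
using UnityEngine;
using UnityEngine.XR;

// Logs the per-eye render texture resolution requested by the runtime,
// which reflects the Link 'Graphics Preference' setting.
public class EyeResolutionLogger : MonoBehaviour
{
    private void Start()
    {
        Debug.Log($"Eye texture: {XRSettings.eyeTextureWidth}x{XRSettings.eyeTextureHeight} " +
                  $"(scale {XRSettings.eyeTextureResolutionScale})");
    }
}
```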

Changing the setting requires restarting the Oculus software, so unfortunately you can’t switch mid-game. It’s unclear whether it has any effects other than resolution.

Valve’s competing SteamVR platform allows precise selection of resolution, with the ability to save per-game values. The third-party tool Oculus Tray Tool brings similar functionality to the Rift platform.

If you want to keep it simple, this system gives you a solution to slow performance in certain games: select ‘Prioritize Performance’ mode.

The setting seems to work whether you’re using USB 2.0 or USB 3.0.

The Oculus SDK offers a feature to dynamically set resolution to maintain framerate, based on current GPU utilization. Some Oculus Store games use this feature. We tested in Unity and found this dynamic system takes priority when enabled, ignoring the Link resolution setting – keep that in mind if you don’t notice a difference in some games.

Oculus Software v19 is currently available on the Public Test Channel. To opt in, navigate to the Beta tab of the Oculus PC app’s Settings. You’ll see it start to download in the Library tab.

Will you be setting Prioritize Performance to maintain a solid framerate? Or will you lock yours to Prioritize Quality to get the sharpest visuals? Let us know in the comments below!

The post Set Oculus Link Resolution Easily With This New Graphics Option appeared first on UploadVR.

Oculus Quest Gets Dynamic Fixed Foveated Rendering To Balance Quality & Performance

The Oculus Quest now has a Dynamic Fixed Foveated Rendering (FFR) feature, which developers can use instead of manually setting the FFR level.

UPDATE April 28: this feature is now available for Unity, the game engine used for the majority of Oculus Quest content.

This article was originally published December 20.

Fixed Foveated Rendering is a rendering feature that developers can use on Oculus Quest. It renders the periphery of the lenses at a lower resolution than the center, making it easier for the software to maintain a consistent and comfortable frame rate by shaving down detail in places where it’s less noticeable. There are four levels of FFR developers can choose from: Low, Medium, High, and High Top.

FFR can make it easier for developers to port their PC VR games to Quest. However, the High and High Top levels can be very noticeable to the user. As we stated in our review of the Quest headset:

In the game’s opening training montage I couldn’t help but point my eyes down and see two blurs for feet running on a treadmill. Tilting my head up over text to move it into the foveated area revealed the scale and size of the effect.

Dynamic FFR allows developers to let the Oculus system dynamically adapt the level of foveation based on the GPU utilization. This means that unless it is needed at that time for performance, users won’t see the pixelation and blur seen in some Quest titles today.

The feature is off by default, however, so developers will need to add it to their games via a software update to get the benefits.

For Unity, this can be done by setting useDynamicFixedFoveatedRendering to true on the OVRManager script.
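As a minimal sketch of that setup – assuming the Oculus Integration for Unity, where `useDynamicFixedFoveatedRendering` and `fixedFoveatedRenderingLevel` are static properties on `OVRManager` – enabling the feature might look like:

```csharp
using UnityEngine;

// Sketch: opt into Dynamic FFR via the Oculus Integration's OVRManager.
// As we understand the docs, the level set here acts as the ceiling;
// the runtime scales foveation between Off and this level based on GPU load.
public class DynamicFoveationSetup : MonoBehaviour
{
    private void Start()
    {
        OVRManager.fixedFoveatedRenderingLevel = OVRManager.FixedFoveatedRenderingLevel.High;
        OVRManager.useDynamicFixedFoveatedRendering = true;
    }
}
```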

The post Oculus Quest Gets Dynamic Fixed Foveated Rendering To Balance Quality & Performance appeared first on UploadVR.

Oculus SDK Drops Support For Samsung Gear VR

Recent versions of the Oculus Mobile SDK drop support for the Samsung Gear VR mobile headset.

This means that if developers want to continue to support Gear VR in their future app updates, they won’t be able to leverage new Oculus SDK features or bug fixes.

The Samsung Gear VR is a smartphone-based VR headset. As with Google Cardboard and its plastic derivatives, users slot in their smartphone, which acts as the display and computer. Unlike Cardboard, however, it features a dedicated gyroscope and accelerometer, and runs the same Oculus Mobile platform and store as the Oculus Go.

According to Facebook’s FAQ on the topic, existing Gear VR apps can still be downloaded. However, the company doesn’t mention what exactly will happen if a developer releases an update with the latest SDK version. We assume that Gear VR owners will be served the last compatible version, but we’ve reached out to Facebook to confirm.

This could present a huge problem for multiplayer games and apps which support the Gear VR. Developers may be forced to drop multiplayer support for Gear VR if they need to update their Oculus SDK to take advantage of newer features such as finger tracking for Quest.

Gear VR’s Eulogy

At Oculus Connect 6 back in September, Oculus’ then-CTO John Carmack essentially declared the Gear VR dead.

This was prompted by the news that Samsung’s Galaxy Note 10 will not support VR, and speculation that the regular Galaxy S line will follow suit next year.

Google’s competing Daydream smartphone VR platform is also essentially dead, with neither the Pixel 3a nor Pixel 4 supporting it and sales of the headset itself ending.

Smartphone-based VR created a lot of problems. The time it takes to slot in and out the phone, and the fact the user’s phone is unusable while docked into the headset, makes people less likely to want to use VR on a regular basis. A Gear VR session could also end after a matter of minutes, depending on device and conditions, due to the phone’s processor reaching its thermal limits. Smartphones pack all of their components into an incredibly small space. While Samsung improves its passive cooling design almost every year, there are physical limitations which can’t be overcome packing VR into a device designed first as a phone.


Standalone VR headsets, though, incorporate their own screens and computing hardware and are designed for better cooling. Despite standalones having roughly the same graphical limitations as smartphone VR, Carmack claims that the Oculus Go sees Rift-like retention levels, whereas Gear VR’s were much lower.

The post Oculus SDK Drops Support For Samsung Gear VR appeared first on UploadVR.

Oculus Brings More Lifelike Sound Propagation to Audio SDK 1.34

Despite being often overlooked in favor of more attention-grabbing visuals, audio is an essential component of creating presence in VR. In a quest to create increasingly lifelike audio in VR environments, Oculus recently pushed out an update to its Audio SDK that lets developers create more realistic audio by generating real-time reverb and occlusion based on the app’s geometry.

Now in beta, the so-called ‘Audio Propagation’ tool comes in the Oculus Audio SDK 1.34 update, and produces more accurate audio propagation with “minimal set up,” the company says in a developer blog post.

The Audio Propagation tool generates reverb from game geometry, so to change how a scene sounds developers simply need to tag the applicable meshes in the scene and select an acoustic material for each mesh; that includes things like plaster on brick, ceramic tile, and carpet to name a few.
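In Unity terms, the tagging step amounts to attaching propagation components to meshes. A rough sketch follows – note that the `ONSPPropagationGeometry` and `ONSPPropagationMaterial` component names are assumptions based on the Oculus Spatializer Unity plugin, so check the Audio SDK guide before relying on them:

```csharp
using UnityEngine;

// Rough sketch of tagging scene geometry for Audio Propagation.
// The ONSPPropagationGeometry / ONSPPropagationMaterial component names are
// assumptions based on the Oculus Spatializer Unity plugin; the usual
// workflow is to add and configure them in the Inspector rather than in code.
public class TagMeshForPropagation : MonoBehaviour
{
    private void Start()
    {
        // Include this object's mesh in the propagation simulation...
        gameObject.AddComponent<ONSPPropagationGeometry>();

        // ...and give the surface an acoustic material (e.g. carpet, tile),
        // chosen on the component in the Inspector.
        gameObject.AddComponent<ONSPPropagationMaterial>();
    }
}
```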

The update also comes with reverb models for several types of spaces, including indoor, outdoor, and asymmetrical spaces, setting it apart from conventional reverb solutions.

Facebook Reality Labs previously teased some of this in its OC5 developer talk, ‘Spatial Audio for Oculus Quest and Beyond’.

The talk goes on to explain that state-of-the-art ‘AAA’ games tend to implement a work-intensive process of adding independent reverb presets for each room in a scene and then fading between them as the user moves from one room to another – hardly how sound travels in the physical world. Some developers also implement a portal system to handle occlusion problems.

Oculus’ solution is real-time rather than prebaked, and it’s touted as quicker for developers to produce, allowing dynamic geometry – like a door opening or closing – to still provide correctly reverberated audio.


Valve’s Steam Audio plugin provides both baked and real-time options; however, the company says in its lengthy Unity setup guide that it “incurs a CPU overhead.” Just how much overhead Oculus’ solution incurs, we aren’t sure at this time.

This isn’t Oculus’ first go at more realistic audio. Previously, the Audio SDK included something the company calls ‘the shoebox model’, which essentially created a standard-sized cube around you that direct sounds would then bounce off of.

Oculus provides Audio Propagation guides for both Unity and Unreal Engine. While we haven’t experienced the results for ourselves yet, we’re hoping the company’s stalwart support of real-time, geometry-based audio propagation will become a standard in the VR games and apps yet to come.

The post Oculus Brings More Lifelike Sound Propagation to Audio SDK 1.34 appeared first on Road to VR.

Oculus Audio SDK Update Adds Geometry-Based Sound Propagation


The latest update to the Oculus Audio SDK adds the long-awaited dynamic audio propagation feature.

The Audio SDK spatializes audio sources in real time using head-related transfer functions (HRTFs). It also supports volumetric and ambisonic sounds. This new update improves how it handles reflections and occlusion.

The Old Behavior

The spatializer originally simulated audio reflections by assuming a predefined box around the user. That, however, assumed the user was at the center of the box, and it obviously doesn’t work properly when moving around a scene.

In early 2018 a feature called Dynamic Room Modeling was added. This lets developers define the current room as a positioned 3D box. When the user moves to a new room, the developer can update the box for the new space.

This required a relatively large amount of effort on the developer’s part, however, and only fully works in perfectly rectangular spaces. It also couldn’t model the transition between different-sized spaces – such as going from inside to outside.

The New Update

The new update accurately models occlusion and reflections of sound in real time based on the scene geometry. The developer simply needs to tag each object with an acoustic material to let it know how it should absorb or reflect sound. Materials like carpet will absorb far more than materials like metal.

How the Audio SDK now ‘sees’ a scene

Valve’s competing Steam Audio has had geometry-based occlusion since late last year, but its reflections have to be prebaked. Facebook’s new update brings VR audio to a new level of realism by modelling reverb in real time. The simulation performs well even on mobile VR, even with many sound sources. This will be important for the upcoming Oculus Quest.

UPDATE: article previously stated that Steam Audio had feature parity. Thanks to reddit user /u/Hethree for the correction.


The post Oculus Audio SDK Update Adds Geometry-Based Sound Propagation appeared first on UploadVR.

Oculus Announce Asynchronous Spacewarp Version 2.0

Back in late 2016 Oculus announced a new technology designed to help reduce the system hardware requirements for virtual reality (VR) experiences while maintaining content quality across a wider array of hardware. The technology, titled Asynchronous Spacewarp (ASW), uses frame-rate smoothing techniques to almost halve the central processing unit (CPU) and graphics processing unit (GPU) time required to produce nearly the same output from the same content. Now a new version of ASW is on the way, and with it the increased performance the technology provides will be expanded on further.

The news came from the recent Oculus Connect 5 event, which featured numerous announcements including the new Oculus Quest standalone head-mounted display (HMD). The newly announced ASW 2.0, however, is equally exciting thanks to the improvements it will bring to the experiences users will be able to enjoy.

One of the first big improvements is that the technology will be combined with Oculus’ Positional Timewarp (PTW) system, with HMD motion correction handled by PTW rather than ASW. Combining the two technologies brings a number of benefits on both sides, including leveraging PTW’s timing-free corrections and offering better activation and deactivation of ASW.

Thanks to PTW and ASW working together, the new 2.0 version will provide noticeable improvements, resulting in less artefacting and less stress on the hardware. The technology will also better handle depth within titles, enabling more immersive results when users look around a scene and focus on objects at different distances. Again, the results will be most notable as users move around, while developers will find it easier than before to manage the technologies and the processing requirements of their applications.

You can see the section from the Oculus Connect 5 PC SDK keynote that talks about ASW 2.0 in the below video at around the 31 minute mark.

VRFocus will be sure to bring you all the latest on this new version of the technology in the future, along with other updates to the Oculus Rift PC SDK so stay tuned for more.


Oculus Introduces Experimental New Locomotion Types

Methods of movement within virtual reality (VR) have become a hot topic recently. A number of VR users have emerged who regard the teleportation methods common in VR titles as lazy, and are demanding more realistic forms of locomotion. Of course, this needs to be balanced against the needs of other players to avoid the dreaded simulation sickness. With this in mind, Oculus have added eight experimental new locomotion methods to its software development kit.

Oculus are encouraging developers to explore the new options. The company is careful to note that though some of the techniques might, at first glance, appear disruptive to immersion, players can quickly become accustomed to them, so any reduction in enjoyment is minimised.

There are a few new types of ‘static world’ techniques, which offer users a static point of reference to counteract the movement of the rest of the virtual world. There are forms of ‘cockpit view’, motion-controlled locomotion such as that used in Lone Echo, and some more unusual and esoteric techniques, such as the curiously named ‘artificial tilt’ method, which apparently ‘departs from any notion of stasis’.

Most of the new techniques are quite difficult to describe in text without visual aids, but Oculus has released the source code for developers to test for themselves. The full list of new locomotion methods is here:

  • Artificial Tilt
  • Counter Optic Flow
  • Unreal World Beyond Static Cockpit
  • Process Reducing Relevance of Mismatch
  • Ski-pole World Manipulation
  • Portals into a Static World
  • Window into the Moving World
  • Emerging Static in the Periphery

Further information can be found on the Oculus blog.

VRFocus will continue to report on new developments in VR hardware or software.