Oculus Improves Iteration Time for Quest Developers Using UE4

Oculus has updated its UE4 Quest development tools so that developers can spend less time waiting and more time testing and iterating on their VR applications. The changes are similar in scope to recent improvements to Unity Quest development.

Because Quest is a standalone headset, testing how an application will run on the device requires packaging the application and deploying it to the headset. Because this process can take a few minutes from the time you start packaging until you see your app in the headset, testing small changes is very time consuming. Iteration time—how quickly one can make changes and then see them—is a key factor in the efficient creation of any media. For developers building Quest applications, the shorter the time between making changes and testing them, the more time they can spend honing their applications to be polished and performant.

SEE ALSO
Oculus is (still) Covering Unreal Engine Royalties for $5M in Revenue Per-game Through 2025

Oculus has introduced two changes with the goal of significantly speeding up the iteration process for developers building Quest applications with Unreal Engine 4. Some of the improvements also apply to developers building applications in their own game engine using the native Oculus Mobile SDK.

Skip APK Build When Iterating on Code

The first change allows developers to skip recompiling and repackaging their game after making code changes.

Instead of requiring the APK to be rebuilt to include newly compiled binaries, Oculus made a change to the Quest’s operating system which allows libraries to load automatically from the application’s dataDir rather than from the installed package, provided that a file of the same name exists, a special flag is set in the package settings, and the application is marked debuggable.

Oculus says the option can be found in Editor Preferences > General > Experimental, and also notes that “this option bypasses the normal Android APK build logic. If any changes are made that require a new APK to be generated, such as Java changes or Android manifest changes, you will need to disable this option temporarily to allow it to be built and installed.”

The blog post also explains how the function can be used in native development environments.

Image courtesy Oculus

Using the VR Template that comes with UE4 as a test case, Oculus says the Skip APK Build change improved time-to-launch by 2.95 times, from around three and a half minutes down to 66 seconds.

FASTBuild Support for UE4 Code Compilation

Oculus says it has implemented FASTBuild support, “a high performance, open-source build system […] [that] supports highly scalable compilation, caching, and network distribution.” This allows developers to speed up code compilation by distributing the work across computers on the developer’s local (or VPN) network.

The company shared instructions for using FASTBuild in its blog post, and says that the tool can speed up builds of both the UE4 editor and “any Oculus VR game projects,” which we understand to mean that it can benefit both Quest and Rift projects.

Image courtesy Oculus

For the ShowdownVR sample, a distributed FASTBuild compile with access to 36 threads took 13 minutes and 39 seconds, compared to 28 minutes and 28 seconds for a local compile on 12 threads (just over twice as fast). A fully cached build took just 3 minutes and 30 seconds. “These timings include all preprocessing, precompiled header generation, code compiling, and linking,” Oculus noted.

The post Oculus Improves Iteration Time for Quest Developers Using UE4 appeared first on Road to VR.

Dev Tool Uses Quest Hand-tracking to Quickly Model Realistic Hand Poses for VR

Grabbing objects in VR may be one of the medium’s most fundamental interactions, but making it work well isn’t as easy as you might think. Developers often need to spend time hand-animating hand models so that they appear to hold each object in a realistic way. Developer Luca Mefisto has built a smart tool which uses Quest’s hand-tracking to let developers motion-capture hand poses, making the whole process quicker, the results more realistic, and the experience ultimately more immersive for players.

Update (July 20th, 2020): Developer Luca Mefisto has released the first version of his HandPosing tool which uses Oculus Quest hand-tracking to quickly author realistic hand-poses for virtual reality interactions. The tool is available on GitHub.

“This is a work in progress, and things are subject to change. I hope it serves others either as a useful tool or at least as a starting point for their grabbing-interaction implementations,” Mefisto writes.

The original article, which outlines the benefits and functions of the HandPosing tool, continues below.

Original Article (July 7th, 2020): Some VR games employ various methods of ‘dynamic’ animation to create realistic hand poses when players grab objects in VR (Lone Echo, for instance). Generally that’s proprietary tech, which means any developers wanting to do the same would need to build a similar system from scratch (not an easy task).

Rather than do that, some games cut out the hand-posing problem entirely by simply making your virtual hands disappear when you grab objects (Vacation Simulator, for instance).

Developers that want to keep the player’s hands visible need to create hand-poses manually so that when you grab an object, your virtual hand grips the object in a realistic way. It’s not that this is a difficult task per se, but it can be immensely time consuming.

At minimum you need one custom hand-pose for every uniquely shaped object in a given game. Even then, consider how many different ways players might want to hold a single object… even if you cut out unlikely poses, you still may need four or five poses for a single object to cover the most obvious grips. If there’s 100 uniquely shaped objects in a game, that could mean animating 400 or 500 hand-poses.

SEE ALSO
Three Totally Creative Uses of Quest Hand-tracking

VR developer Luca Mefisto wants to make this whole process quicker and easier, allowing developers to make more realistic poses in less time. He’s building a tool which smartly leverages Quest’s hand-tracking feature to let developers take a ‘snapshot’ of their own hand gripped around virtual objects.

The tool then allows developers to define valid positions for the pose, allowing the hand to snap realistically to the nearest valid position on the object.

Objects can also have multiple poses and grabbing points to cover different ways of grabbing the same object (like the scissors below).
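While the HandPosing repository itself is the authoritative reference, the core idea can be sketched in a few lines of Unity C#: record the tracked bone rotations from the hand skeleton as a pose, store a grip point on the object, and snap to the nearest stored pose at grab time. In this hypothetical sketch, OVRSkeleton and OVRBone come from the Oculus Unity integration; everything else is illustrative and not Mefisto’s actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of hand-tracking-driven pose authoring: capture the
// tracked hand's bone rotations as a reusable pose, then snap to the
// nearest recorded grip point when grabbing. Not Mefisto's implementation.
public class HandPoseSnapshot : MonoBehaviour
{
    [SerializeField] private OVRSkeleton skeleton; // tracked hand skeleton

    [System.Serializable]
    public class CapturedPose
    {
        public List<Quaternion> boneRotations = new List<Quaternion>();
        public Vector3 gripPointLocal; // hand anchor in the object's local space
    }

    public List<CapturedPose> poses = new List<CapturedPose>();

    // Call while the user's tracked hand is gripped around the object.
    public CapturedPose Capture(Transform grabbable)
    {
        var pose = new CapturedPose();
        foreach (var bone in skeleton.Bones)
            pose.boneRotations.Add(bone.Transform.localRotation);
        pose.gripPointLocal = grabbable.InverseTransformPoint(skeleton.transform.position);
        poses.Add(pose);
        return pose;
    }

    // At grab time, pick the stored pose closest to where the hand actually is.
    public CapturedPose NearestPose(Transform grabbable, Vector3 handWorldPos)
    {
        CapturedPose best = null;
        float bestSqrDist = float.MaxValue;
        foreach (var pose in poses)
        {
            Vector3 world = grabbable.TransformPoint(pose.gripPointLocal);
            float sqrDist = (world - handWorldPos).sqrMagnitude;
            if (sqrDist < bestSqrDist) { bestSqrDist = sqrDist; best = pose; }
        }
        return best;
    }
}
```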

Though the tool requires Quest’s hand-tracking for creating the poses, Mefisto says the hand-pose tool will work for games that employ hand-tracking or controllers.

Though the tool is so far unnamed, the developer plans to release it as an open source project to the VR development community. You can follow Mefisto on Twitter to see updates on the tool’s development and keep an eye out for its release.

The post Dev Tool Uses Quest Hand-tracking to Quickly Model Realistic Hand Poses for VR appeared first on Road to VR.

Unreal Engine Apps Can Now Be Built with Quest Hand-tracking

An update to Oculus developer tools has brought a handful of updates, including support for Quest hand-tracking in Unreal Engine 4.

Oculus released controllerless hand-tracking on Oculus Quest as a beta feature back in late 2019. At the time, the company had only added support to the Oculus Unity integration, meaning that developers building apps in Unreal Engine didn’t have access to the feature.

Hand-tracking on Quest went from beta to a fully-fledged feature last month, allowing developers to publish third-party apps with hand-tracking in the Oculus Quest store.

Now Oculus has updated its Unreal Engine integration with support for Quest hand-tracking in the v17.0 release. This allows developers working in Unreal Engine to make their app work with both controllers and hands, or hands-only, by selecting the appropriate option in the OculusVR plugin, and rigging up the rest of their app according to the newly released documentation.

SEE ALSO
Oculus is (still) Covering Unreal Engine Royalties for $5M in Revenue Per-game Through 2025

The v17.0 release for both Unity and Unreal Engine also adds new capabilities to help developers achieve consistent color grading across Oculus’ different headsets (all of which use different displays).

Both Unity and Unreal Engine integrations now allow developers to choose a specific color space to work in; grading an app’s colors against a specific color space allows each headset to more accurately display the colors intended by the developer, even when the displays have different color capabilities.

Oculus published a new ‘Color and Brightness Mastering Guide’ for developers which overviews the four supported color space standards and provides recommendations for color mastering to “avoid issues with low-level banding, hue shift, and under or over-saturation.”

We recommend app developers to master all of their applications for the Oculus Rift and Rift S to the Rift CV1 color space on an Oculus Rift CV1, Rec.2020 color space for Oculus Quest, and Rec.709 color space for Oculus Go. The OLED display has a wider color gamut than the LCD and allows for richer visual experiences. VR apps authored for the Oculus Go and Rift S color spaces tend to have dull or washed out colors when viewed on the Oculus Quest and Rift CV1 displays.

Color space documentation specific to Unity, Unreal Engine, and the Oculus Mobile SDK has been added.
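In Unity, the selection is exposed through the OVRManager component. Here’s a minimal sketch, assuming the colorGamut field and ColorSpace enum as they appear in the v17 integration (exact names may differ; verify against the documentation):

```csharp
using UnityEngine;

// Hedged sketch: tell the runtime which color space the app was mastered
// against. Assumes the v17 Oculus Unity integration's OVRManager.colorGamut
// field and OVRManager.ColorSpace enum; check names against the docs.
public class MasteredColorSpace : MonoBehaviour
{
    void Start()
    {
        // Per Oculus' guidance, Quest apps should be mastered against Rec.2020.
        if (OVRManager.instance != null)
            OVRManager.instance.colorGamut = OVRManager.ColorSpace.Rec_2020;
    }
}
```

In practice this would more likely be set once in the OVRManager inspector rather than from script.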

The post Unreal Engine Apps Can Now Be Built with Quest Hand-tracking appeared first on Road to VR.

Oculus is (still) Covering Unreal Engine Royalties for $5M in Revenue Per-game Through 2025

With the recent news that Unreal Engine was permanently waiving engine royalties for the first $1 million in app revenue, we were reminded of a similar program for VR apps based on Unreal Engine 4 that Oculus established back in 2016. We reached out to Oculus which confirmed that the program, which covers UE4 royalties for the first $5 million in revenue, is still in place and will continue through 2025.

Back in 2016, just a few months after Oculus launched its first Rift headset, the company announced a UE4 Royalty Payment program.

Although Epic Games announced last week that it will permanently waive Unreal Engine royalties for the first $1 million in app revenue, Oculus confirmed that its own program remains in place to cover Unreal Engine 4 royalties for the first $5 million in revenue from the Oculus store, per-app, through 2025.

While the change to Epic’s own royalty structure makes the Oculus program just a little less sweet, it’s still effectively free money back into the pockets of developers building VR apps with UE4.

Previously the program would have saved developers up to $250,000 per application; with the core changes to Unreal Engine’s royalty structure, the Oculus program will now save developers up to $200,000 (assuming all revenue from the Oculus store), though that first $50,000 will still get waived anyway given Epic’s new policy.

SEE ALSO
Unreal Engine 5 Tech Demo on PS5 Shows Where Next-gen Graphics are Headed

An Oculus spokesperson told Road to VR that the royalty waiver program only applies to Unreal Engine 4, but the company will consider extending it to Unreal Engine 5 as well, which is due out in 2021. The company also clarified that the calculation for covering royalties on the first $5 million in revenue is based on gross revenue (which means before the 30% Oculus store cut).

While the Oculus royalty waiver program is applicable for apps on Quest, Rift, and Go, we aren’t clear on whether or not the same app launched on two or more headsets would be counted as a single app or separate apps in the eyes of the program. We’ve reached out to Oculus for clarity.

The post Oculus is (still) Covering Unreal Engine Royalties for $5M in Revenue Per-game Through 2025 appeared first on Road to VR.

Bored During Lockdown? Get an Intro to Quest Development in This Live Online Workshop

An upcoming online workshop by XR development educator Circuit Stream will teach developers the basics of building applications for Oculus Quest.

With many of us stuck at home with some extra time on our hands from not traveling (or getting dressed), it may be an opportune time to learn new skills.

Since 2015 Circuit Stream has been educating developers on creating AR and VR applications; later this month the company will run its 25th ‘XR Development with Unity‘ course, a 10-week program covering the ins-and-outs of creating AR and VR applications.

But if you’re not ready to jump in that deep, Circuit Stream is holding a one-day ‘Intro to Oculus Quest Development‘ live workshop online next week on Saturday, May 16th. The company says the class is open to beginners; Unity experience and an Oculus Quest are both recommended but not required.

By the end of the Oculus Quest Workshop, you will have developed an application for the Oculus Quest, understand the foundations you need to develop for mobile VR and get hands on with new features like hand tracking.

After the workshop, you’ll hear from a former Oculus designer on how to successfully design apps for the Quest and learn about resources available in the Quest ecosystem.

The $350 workshop runs for five hours and is taught live by a Unity Certified Instructor. Circuit Stream says participants can expect to learn the following during the workshop:

  • How to build an Oculus Quest app from scratch in Unity
  • How to use Unity’s build pipeline for Quest (Android SDK tools)
  • How to optimize your Unity app for Oculus Quest
  • How to design interactive UI/UX features for Quest
  • How to add hand tracking to your Oculus Quest app
  • How to publish your app to the Oculus store
  • How to design mobile VR experiences around the industry’s best practices

You can get more details and sign up for the workshop on the event page.

SEE ALSO
Oculus 'Designing for Hands' Document Introduces Best Practices for Quest Hand-tracking

Circuit Stream hosts a range of other workshops, some free and on-demand, focusing on topics like introductory AR development, AR face tracking, VR training, building UI for hand-tracking, AI character behavior, and more. You can see the complete list of on-demand and upcoming live workshops here.

The post Bored During Lockdown? Get an Intro to Quest Development in This Live Online Workshop appeared first on Road to VR.

Unreal Engine 4.25 Improves Support for HoloLens 2, Magic Leap, Adds Azure Spatial Anchors

Unreal Engine 4.25 launched this week bringing a host of improvements to the engine’s XR functionality. HoloLens 2 and Magic Leap saw the most attention, including the addition of Azure Spatial Anchors, Microsoft’s cross-platform system which enables shared augmented spaces across devices.

Unreal Engine is one of the most popular game engines for building XR content. Each release brings improvements to the engine’s XR capabilities with new features and bug fixes. This week Unreal Engine 4.25 was released and saw a host of improvements, especially to HoloLens 2 and Magic Leap.

Epic Games says that Unreal Engine now offers production-ready support for HoloLens 2. This follows a range of bug fixes and newly supported capabilities in the engine, including Microsoft Spatial Audio, App Remoting from packaged UE apps, mixed-reality capture, and beta support for Azure Spatial Anchors.

Image courtesy Microsoft

Azure Spatial Anchors is Microsoft’s cloud-based system which allows AR devices to recognize discrete real-world locations between sessions and to synchronize and share the position of spatial content between devices for multi-user applications.

Image courtesy Magic Leap

Magic Leap also saw a bunch of improvements to streamline development in Unreal Engine 4.25, including the ability to set up shared world experiences using new features of the Magic Leap SDK like GameMode, PlayerController, and GameState. Epic also says it has improved the AugmentedReality interface to make it easy to port smartphone-based AR projects over to Magic Leap.

SEE ALSO
Magic Leap Announces Layoffs & Pivot Away From Near-term Consumer Ambitions

On the VR side, Unreal Engine 4.25 sees a range of bug and crash fixes, and SDK updates.

Oculus’ OVRPlugin has been updated to version 1.45, along with an update to Oculus Audio which adds support for ARM64 on Quest. SteamVR has been updated to 1.5.17 and SteamAudio, Valve’s positional audio system, has been updated as well, now with support for high-quality stereo layers and dynamic geometry.

For more detail, check out the complete release notes for Unreal Engine 4.25.

The post Unreal Engine 4.25 Improves Support for HoloLens 2, Magic Leap, Adds Azure Spatial Anchors appeared first on Road to VR.

Valve Releases Beta OpenVR Support For Unity’s New XR Plugin System

Valve released a beta OpenVR package for the Unity game engine’s new XR plugin system. Unity is used to make the majority of VR games.

When Unity 2019.3 shipped publicly in January, the engine deprecated its built-in VR support, including for OpenVR, Valve’s application programming interface (API) for SteamVR. This was replaced by a new modular XR plugin system.

Under the new system, Unity ‘officially’ worked with 7 XR platforms: Apple’s ARKit, Google’s ARCore, Microsoft’s HoloLens & WMR, Magic Leap, Oculus, and PlayStation VR. Support for these platforms can be enabled with a few clicks. These platforms are “fully supported” by Unity, and the company is “directly” working with them on “deep platform integration, improvements to our engine, and optimizations to our XR tech stack for the platform”.

However, the engine also allows third parties to write their own plugins. At the time, Unity stated that Valve was working on such a plugin for OpenVR, which would be shipped separately from Unity by Valve.

That is what has now been released, and it’s available on Valve’s GitHub.

Input System Not Yet Complete

Valve describes this initial version as a Beta, and warns that developers should not release titles with it just yet.

Currently, the input system works by mapping specific buttons on a simulated per-controller basis. Games developed with this plugin cannot yet create OpenVR Actions.

That means players won’t be able to use SteamVR’s built in system for remapping controls. It also means developers don’t yet have access to the SteamVR Skeletal Input API.

Valve plans to rectify these issues in future versions. For now, Valve offers the following workaround:

We’ve created custom legacy bindings and hooked them up to the Unity Input System to give you individual access to as many controller sensors as possible. You can modify these bindings while in play mode by going to the SteamVR interface, Menu, Settings, Controllers, Manage Controller Bindings, and Custom. These are saved to a folder in your project at Assets/StreamingAssets/SteamVR/[bindings].json. We’ve included default bindings for a variety of supported SteamVR controllers.

If you would like your controller included in this default list please create an issue on our github with your preferred legacy binding and unity input system layout.
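Until full Actions support arrives, input flows through Unity’s Input System. As a rough illustration of polling a control that way (the generic <XRController> binding path is a stock Input System layout; the exact control layouts Valve’s plugin registers may differ):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch: read a controller's trigger through Unity's Input
// System using a generic XR binding path. Not specific to Valve's plugin;
// its registered layouts and control names may differ.
public class TriggerReader : MonoBehaviour
{
    private InputAction triggerAction;

    void OnEnable()
    {
        triggerAction = new InputAction(binding: "<XRController>{RightHand}/trigger");
        triggerAction.Enable();
    }

    void Update()
    {
        float pull = triggerAction.ReadValue<float>();
        if (pull > 0.5f)
            Debug.Log($"Right trigger pulled: {pull:F2}");
    }

    void OnDisable() => triggerAction.Disable();
}
```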


The post Valve Releases Beta OpenVR Support For Unity’s New XR Plugin System appeared first on UploadVR.

Oculus Makes Improvements to Iteration Time for Unity Quest Developers

Oculus has introduced three changes to significantly speed up the iteration process for developers building Quest applications with Unity.

Because Quest is a standalone headset, testing how an application will run on the device requires packaging the application and deploying it to the headset. Because this process can take a few minutes from the time you start packaging until you see your app in the headset, testing small changes is very time consuming. Iteration time—how quickly one can make changes and then see them—is a key factor in the efficient creation of any media. For developers building Quest applications, the shorter the time between making changes and testing them, the more time they can spend honing their applications to be polished and performant.

SEE ALSO
Dev Shares Crash Course on VR Game Optimization for Oculus Quest & Mobile Hardware

Oculus says long iteration time when building for Quest is a top pain point brought up by developers. The company has introduced three improvements for the Quest Unity integration which can drastically reduce iteration time. Two of the three changes are available with Unity 2018.1 and later, though one requires 2018.2 or later.

OVR Build APK and Run

The first is ‘OVR Build APK and Run’, a command which uses a cache to reduce the time it takes to package the app. Oculus says this method produces the same APK that would come from the normal ‘Unity Build and Run’, but does so 10–50% faster.

Image courtesy Oculus

OVR Quick Scene Preview

The second is ‘OVR Quick Scene Preview’ which automatically divides projects into multiple asset bundles and uploads them to Quest. After making changes to the app and using the command, only the bundles which contain changes need to be uploaded to the headset.

Image courtesy Oculus

Oculus tested OVR Quick Scene Preview with three published Quest apps—Beat Saber, Dead and Buried 2, and Superhot VR—and found that it improved iteration time by more than 80% for each app.
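Quick Scene Preview handles the bundling automatically, but it builds on Unity’s standard asset bundle pipeline, which splits content into separately loadable archives so that only changed pieces need rebuilding. For context, here is a minimal editor script that builds bundles for Android; the output path and compression option are illustrative, not what Oculus’ tool uses:

```csharp
using UnityEditor;

// Illustrative only: the standard Unity asset-bundle build that OVR Quick
// Scene Preview's workflow builds on. Place in an Editor/ folder; the
// output folder must already exist.
public static class BundleBuilder
{
    [MenuItem("Build/Build Asset Bundles (Android)")]
    public static void Build()
    {
        BuildPipeline.BuildAssetBundles(
            "Assets/StreamingAssets",                       // output path (assumption)
            BuildAssetBundleOptions.ChunkBasedCompression,  // LZ4, good for incremental loads
            BuildTarget.Android);                           // Quest is an Android device
    }
}
```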

Shader Stripping

The last change is what Oculus calls Shader Stripping (this one requires Unity 2018.2 or later), which can speed up both the ‘OVR Build APK and Run’ and ‘OVR Quick Scene Preview’ processes. Unity applications running on Quest only load Tier2 shaders, Oculus says, which means time spent packaging shaders of any other tier is wasted.

Image courtesy Oculus
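Oculus hasn’t detailed the implementation, but the 2018.2 requirement lines up with Unity’s IPreprocessShaders build callback, which was added in that release and lets an editor script remove shader variants from a build. A hedged sketch of the idea (not necessarily Oculus’ code):

```csharp
using System.Collections.Generic;
using UnityEditor.Build;
using UnityEditor.Rendering;
using UnityEngine;
using UnityEngine.Rendering;

// Hedged sketch: strip every shader variant not compiled for graphics
// Tier2, since Quest only loads Tier2 shaders. Uses Unity 2018.2's
// IPreprocessShaders callback; Oculus' actual implementation may differ.
class StripNonTier2Shaders : IPreprocessShaders
{
    public int callbackOrder => 0;

    public void OnProcessShader(Shader shader, ShaderSnippetData snippet,
                                IList<ShaderCompilerData> data)
    {
        // Iterate backwards so removals don't shift upcoming indices.
        for (int i = data.Count - 1; i >= 0; i--)
        {
            if (data[i].graphicsTier != GraphicsTier.Tier2)
                data.RemoveAt(i); // this variant is never loaded on Quest
        }
    }
}
```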

Oculus also offered up an explanation for when developers should use ‘OVR Build APK and Run’ or ‘OVR Quick Scene Preview’:

OVR Quick Scene Preview is for fast iteration on scenes and assets and does not build an APK that is representative of your final project. As you are developing your project, OVR Quick Scene Preview is useful for reducing iteration time. When close to shipping or when wanting to see a closer representation of what your final project will look and run like, use OVR Build APK and Run. Both OVR Build APK and Run and OVR Quick Scene Preview build in development mode and should not be used to create a final shippable bundle.

For more details, Oculus points developers toward its developer documentation which has been updated with these new iteration improvements.

The post Oculus Makes Improvements to Iteration Time for Unity Quest Developers appeared first on Road to VR.

Open-source Modeling & Animation Tool ‘Blender’ Now Includes Basic VR Support

Blender, a popular free open-source modeling and animation tool, just launched its 2.83 update, which brings basic VR support via the OpenXR API. The recent release lets users step into their 3D scenes to see them up close and at scale; new VR features are expected in future releases.

Update (June 4th, 2020): Blender version 2.83 is now live, which includes the ability to inspect scenes from a VR headset. The creators have also released a feature showcase video highlighting some of the other features arriving with 2.83, namely OpenVDB import, OptiX viewport denoising, and a new physics-enabled Cloth Brush.

The original article continues below:

Original Article (April 8th, 2020): The next version of Blender, version 2.83, planned for release in late May, will include a first wave of VR support, the team recently announced. VR support is being added via the OpenXR API, which will allow the software to interface with any headset supporting OpenXR (which has wide support in the VR industry, though it is still in the early stages of rolling out to individual headsets).

Initially, Blender’s VR support will only allow for scene inspection, which means users can look at their creations up close and at scale. For those using Blender to create assets and animations for use in VR games, being able to see their creations in-headset before being imported into a game engine could help streamline the production process.

More VR features are expected to be added in the future. Last year a Blender developer said “We have an awesome team of people interested in working on [XR].”

As Blender is a very complex piece of software, it’s unlikely that the full feature set (or even half of it) will be functional in VR. However, it’s conceivable that in the future users might also be able to move objects in their scene in VR, and perhaps even do basic modeling and things like rigging and ‘puppeteering’ for animation.

Well before Blender’s upcoming official VR support, third-party plugins like Blender XR by MARUI have already made it possible to view and interact with Blender scenes in VR. However, official support, especially via OpenXR, should help futureproof the feature by ensuring compatibility with future headsets.

The post Open-source Modeling & Animation Tool ‘Blender’ Now Includes Basic VR Support appeared first on Road to VR.

Oculus Quest Devs Can Now Let The System Menu Show Inside Their Apps

Quest’s new menu UI supports being displayed while inside a VR app. Developers can now release updates to enable this, on a per-game basis.

Last month, Facebook revealed a UI overhaul for the Oculus Quest standalone headset. As well as a new look and multi-window support for 2D system apps, the system menu can now display inside VR apps when brought up.

The PC-based Rift platform got a similar feature in beta back in late 2017, and it became fully available a year later. Just like with that release, it will only work within an app if the developer updates it with support.

How To Support In Unity

Most Oculus Quest apps are built using the Unity game engine. To add support for system overlays in Unity, developers need to update OVRPlugin to at least v13 and enable ‘Focus Aware‘ in the OVRManager script attached to OVRCameraRig.

To check when the system menu is opened and closed, developers should use the OVRManager.InputFocusLost and OVRManager.InputFocusAcquired events.

Since the system menu takes over input handling and displays controllers visually, apps should hide hands or controllers when it’s open. Facebook also recommends hiding any objects within 2 meters of the player, since otherwise there could be “unexpected visual artifacts” (likely referring to depth conflicts).
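In code, handling those events might look like the following minimal sketch; the InputFocusLost and InputFocusAcquired events are those named in Oculus’ documentation, while the hand-hiding logic is illustrative:

```csharp
using UnityEngine;

// Minimal sketch: hide hand/controller models while the system menu has
// input focus. OVRManager's focus events come from the Oculus Unity
// integration (v13+); the handsRoot object is an illustrative stand-in.
public class FocusAwareHands : MonoBehaviour
{
    [SerializeField] private GameObject handsRoot; // hand/controller models

    void OnEnable()
    {
        OVRManager.InputFocusLost += HideHands;     // system menu opened
        OVRManager.InputFocusAcquired += ShowHands; // system menu closed
    }

    void OnDisable()
    {
        OVRManager.InputFocusLost -= HideHands;
        OVRManager.InputFocusAcquired -= ShowHands;
    }

    void HideHands() => handsRoot.SetActive(false);
    void ShowHands() => handsRoot.SetActive(true);
}
```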

Taking the time to add this feature means users can stay within your virtual world while still being able to quickly invite friends, take screenshots, manage settings, and, in the future, potentially even text chat with friends.

As the Quest operating system expands, this could lead to true multi-tasking in standalone VR some day in the future.

The post Oculus Quest Devs Can Now Let The System Menu Show Inside Their Apps appeared first on UploadVR.