Apple Might Open Applications For Vision Pro Development Kits In July

A page on Apple's website suggests it will open applications for Vision Pro development kits in July.

A subpage of the visionOS section of the Apple Developer site, titled 'Work with Apple', provides three options for developers interested in bringing apps to the new spatial computing platform. A label on the page reads 'Available in July'.

Vision Pro compatibility evaluations: developers of existing iPhone and iPad apps will be able to request a report from Apple "on your app or game’s appearance and how it behaves in visionOS."

Vision Pro developer labs: Apple will host sessions in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo – developers can apply to attend a session and experience their visionOS, iPadOS, or iOS apps running on a Vision Pro.

Vision Pro developer kits: "To support great ideas for apps and games for visionOS," Apple will make Vision Pro developer kits available. "Stay tuned for how to apply," the page notes.

visionOS Simulator

To build spatial apps for visionOS, Apple recommends using its own suite of tools: the Xcode IDE, SwiftUI for user interfaces, and its ARKit and RealityKit frameworks for handling tracking, rendering, physics, animations, spatial audio, and more. Apple even announced Reality Composer Pro, essentially its own engine editor.

For developers unable to acquire a dev kit, Apple is adding a visionOS Simulator to Xcode in the version 15 update. This lets developers interact with their apps in a simulated environment as if they were running on Vision Pro. Apps can then be tested on real hardware at Apple's hosted developer labs, or once the developer acquires a development kit.
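
For a sense of what the simulator runs, here is a minimal sketch of a single-window SwiftUI app of the kind visionOS supports; the app and view names are placeholders for illustration, not Apple sample code.

```swift
import SwiftUI

// Minimal single-window SwiftUI app of the kind the visionOS Simulator can run.
// "HelloSpatialApp" and "ContentView" are placeholder names for illustration.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, visionOS")
            .font(.title)
            .padding()
    }
}
```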

We'll certainly be keeping a close eye on Apple's website in July.

Apple to Open Locations for Devs to Test Vision Pro This Summer, SDK This Month

Ahead of the Apple Vision Pro’s release in ‘early 2024’, the company says it will open several centers in a handful of locations around the world, giving some developers a chance to test the headset before it’s released to the public.

It’s clear that developers will need time to start building Apple Vision Pro apps ahead of its launch, and it’s also clear that Apple doesn’t have heaps of headsets on hand for developers to start working with right away. In an effort to give developers the earliest possible chance to test their immersive apps, the company says it plans to open ‘Apple Vision Pro Developer Labs’ in a handful of locations around the world.

Starting this Summer, the Apple Vision Pro Developer Labs will open in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino.

Apple also says developers will be able to submit a request to have their apps tested on Vision Pro, with testing and feedback being done remotely by Apple.

Image courtesy Apple

Of course, developers still need new tools to build for the headset in the first place. Apple says devs can expect a visionOS SDK and updated versions of Reality Composer and Xcode by the end of June to support development for the headset. That will be accompanied by new Human Interface Guidelines to help developers follow best practices for spatial apps on Vision Pro.

Additionally, Apple says it will make available a Vision Pro Simulator, an emulator that allows developers to see how their apps would look through the headset.

Developers can find more info when it’s ready at Apple’s developer website. Closer to launch Apple says Vision Pro will be available for the public to test in stores.

Guest Post: How CM Games Ported Into The Radius To Quest

In spring 2021, our VR team at CM Games decided to port the PC version of Into the Radius VR to the Meta Quest Store. We released it in September 2022, and it paid for itself within a week. Now it’s time to share the details.

This article will be helpful for game developers who are considering porting their games to, or publishing on, the Meta Quest Store. We will explain how we came to Meta Quest 2, what issues we encountered when porting the PC version, and how we solved them.

The following piece is a guest post sent to UploadVR by CM Games, written by Into the Radius producer Aleksei Shulga. 

Choose The Best Porting Studio

Let’s discuss how you can arrange porting in general.

  • Co-development. A partner is integrated into your work process, with your internal team heavily involved in production and calling the shots, while the partner studio provides the missing expertise and human resources.
  • Outsource model. You entrust the game to a partner to take care of it from the beginning through platform certification, with some supervision and guidance from your side. Sometimes a revenue-share (revshare) model is used.

For Into the Radius, we opted for a co-dev model from the get-go for the following reasons:

  • Accelerated work pace. The game was constantly updated as part of our premium live game business model. While we negotiated the porting, the game was already in production, and a major update was planned. If we wanted to use the outsourcing model, we would have had to wait until the game was more or less finished to outsource it; otherwise, no one would want to estimate the final cost with an ever-changing work scope.
  • Preserving gameplay. There is a vast difference in specs between the PC and Standalone VR, and our design team had to make sure that if they had to make concessions, it wouldn’t ruin the experience.
  • Future updates. We wanted to keep updating the game, so we either had to depend on a partner to continue to do it or be able to do it ourselves.

To sum up, we didn’t have much choice regarding the model. Your experience may differ, but it’s critical to think through your product plans for a few years after porting.

That’s why we chose to collaborate with a potential porting partner. We also encourage you to pay attention to the following points in your selection process:

  • Know the team. During the search process, you should interview the potential partners’ team members to ensure they are competent and communicate effectively. Remember that companies always put their best foot forward and present the best employees to the client. But there is no guarantee that these particular people will be working on your project or doing so for an extended period. It can be helpful to specify that you want to meet the team working directly on the project before you sign the contract or even put these employees’ names on it, providing for the possibility of replacing them for various reasons.
  • Partner portfolio matters. A good portfolio is a guarantee not only that the game can be ported to another platform, but also that the port will be of high quality. The Meta Quest Store is still a closed platform; therefore, a partner’s portfolio will also help when pitching.

In our case, we partnered with Snowed In Studios. They are part of Keywords Studios, which is well-known to Meta and a respected service provider globally.

  • Price. When you first contact a studio, try to get a very rough ballpark estimate of the cost of the work involved; we found that estimates can vary by more than thirty times between studios. Save time by skipping negotiations with studios that are clearly out of reach or implausibly cheap. If you are working on a co-development model, you will likely sign a time-and-materials agreement, since you will keep changing the game as you port it, and make monthly payments for the work approved by your side.
  • Ability to port. Into the Radius is a challenging game to port to mobile hardware: large open spaces where players can roam freely, and manual handling of items, with each item meticulously detailed rather than represented as an icon in an inventory slot. To make sure the port can be done with an acceptable level of gameplay compromise, it’s essential to manage risks. The earlier you do this, the less money you will spend on a potentially failing project.

Make all the cuts needed to reach the target frame rate, see how bad it looks and which areas are the most challenging, and try to figure out whether and how you can handle them, at least theoretically.

Quest Store Pitch

The Meta Quest Store is a closed and carefully curated platform with high standards for the technical quality of the games released there. The platform determines whether you can release a game in the main store or have to choose alternatives such as App Lab and SideQuest.

Chances are, if this is your studio’s first VR game and you don’t have other platforms where your game already runs great, it’s easier to release on App Lab first, collect data, iterate on the product, and then pitch it to the main store.

The game was doing well on Steam, which helped us make contact with Meta. We got contacts, met the people in charge, and prepared a pitch deck with all the data and plans we already had. We also held several face-to-face meetings.

We chose this option because we had already had success on Steam, selling over 100,000 copies and receiving highly positive feedback. We outlined our goals in the pitch deck, after which Meta representatives expressed doubts about the possibility of a 1-to-1 port. But we were able to convince them, thanks partly to thorough communication and our choice of porting partner.

As of September 2022, Meta was releasing about 5-10 games to the store each month to give each product its time in the limelight. That’s why platform representatives want to ensure you deliver on time and with high quality.

Porting Stage

Porting a live game is more complicated than porting a finished one. You need to prepare the original version, merge the original and ported versions, and then manage both streams. For a better workflow, communication is key.

  • Refactoring the original version. When the porting team started their work, the ITR team was already busy with the 2.0 update, reworking the graphics, game maps, and even the gameplay loop. So we decided to work in separate branches before the release of version 2.0, with the porting team focusing on optimizing Unreal Engine and the overall game systems, while content optimization was to happen after the merge.
  • Monthly planning. The porting studio acts as a contracted group of employees who port the game in stages. Thus, we can easily calculate the budget for a given period. We encourage you to discuss the priorities of each stage, leverage your partner’s experience, and give feedback on the plans and results obtained. Things change, and if both your teams are on the same page about where you’re going now, you can solve problems more effectively.
  • Regular check-ups. Our senior team members joined the porting team’s daily meetings twice a week. It helped strengthen ties between the teams and better align developer efforts. The porting team was integrated into our Slack and Jira, and we tried to eliminate as many communication barriers as possible. We also had weekly meetings with management to assess progress, team composition, and goals for the next sprint.
  • Different time zones. We had an 8-hour time difference between studios. It turned out not to be as detrimental as we first thought. As long as you have a couple of hours of overlap in the working day, it’s fine in our experience. With some workflow adjustment, it even ensures uninterrupted work on the binary files, because when one team finishes its workday, the other is just starting.
  • Communication with Meta. You should also keep your Meta curator informed about the status of the version. Set deadlines for milestones, report them, and meet them. If you fall behind on your deadlines, notify them in advance. The more professional and committed you are, the more credibility you will have.

Public Announcement

After six months of porting, we were allowed to announce, with a teaser, that we were preparing a Quest 2 version of the game. The release window was still expected to be September, but in most cases you are not allowed to announce any dates until certification is complete. The public announcement also included the information that there would be a closed beta.

Beta

We announced the beta at the end of the teaser. To participate, players had to open a landing page and leave an email address to sign up for testing.

It was a fully functional version of the game with all the necessary content, with most of the team’s efforts focused on performance improvements and bug fixes. We decided to divide beta access into several waves with an increasing number of participants in each, starting with 20 select users from the core community and growing to 2,000 in the later waves.

The beta was also critical for us to answer questions from the community and potential buyers. For example, does the trailer reflect the actual gameplay? Is the game truly a fully-featured Into The Radius? The beta release gave users enough information to alleviate any concerns.

Certification

Before your game can be released to the store, it must pass the official certification process. At the start of the process you supply a complete build, and the designated quality assurance vendor tests it for compliance with the Virtual Reality Checks (VRCs). All bugs are logged and must be addressed in a timely manner, or a waiver must be requested from your project’s curator.

Two VRC points usually cause the most problems. The first is performance: there should be no long FPS drops during the game. The second is that the player’s progress should never be blocked in any way.

Also, remember that all marketing assets and the store page must be approved and ready several weeks before release. The earlier you have everything ready, the sooner you can publish the page if certification goes smoothly. And the earlier you publish the page, the more time you have to drive traffic using marketing tools.

Summary

Porting Into the Radius to Meta Quest 2 was a challenge because of our free-update business model and the game’s scale and complexity. But it turned out to be justified, both financially and in terms of audience response. The process can be much easier, and require much less investment, for simpler games or games with a different structure.

In any case, if you are considering porting your game to Meta Quest 2, ask yourselves these questions:

  • How complex / hardware-demanding is your game? 
  • Do you have the budget or available investment to do it? 
  • Do you have good metrics to show from other platforms?

Then choose a porting approach and present everything to Meta. If they are interested, decide whether you will do it yourself, find a partner, or use a mixed scheme as we did.

Once again, communication is key to the success of multi-stakeholder projects. Not only do you need to make a great product, you need to deliver it on time and communicate well along the way.


Aleksei Shulga, the director and producer of Into The Radius VR, went from 3D artist to developer to producer over a 15+ year career in the industry.


Meta Releases Haptics SDK For Quest Controllers, With 34 Premade Patterns

Meta released a Haptics SDK for Unity, along with Haptics Studio for authoring haptics.

These tools appear to be the result of Meta’s acquisition of the German startup Lofelt last year, as we speculated at the time of the acquisition. Lofelt offered a haptics SDK for Unity and its flagship product was a haptics authoring tool called Lofelt Studio.

Haptics Studio

Haptics Studio is a desktop app for Windows and macOS with a VR companion app for Quest headsets.

It lets developers create haptics clips and wirelessly test them on Quest 2 and Quest Pro controllers. Clips can be created from audio files or by editing one of the existing haptic samples.

Meta recommends testing on Quest Pro controllers when developing haptics, because they have a higher fidelity haptic actuator.

Finished haptics clips can be exported as .haptic files which can be used in the Haptics SDK for Unity.

Haptics SDK

The Haptics SDK for Unity allows developers to integrate the .haptic files created in Haptics Studio into their apps.

According to Meta, the actuator in Quest 2 controllers has a lower vibration sample rate than the actuator in the Pro controllers, and it runs at a fixed frequency. At runtime, the Haptics SDK detects the haptic capabilities of the controller currently in use and “optimizes the haptic pattern.” Meta says this also ensures support for future Quest controllers.

Quest Pro controllers also have independent actuators under the index trigger and thumb rest, but the Haptics SDK documentation doesn’t mention anything about this.

The SDK includes 34 premade haptic files any Quest developer can use, including the feeling of water, bushes, rowing, grass, snow, touching virtual UI, punching, opening a box, opening a drawer, flicking switches, using a servo, and hitting or scraping with a sword.

Meta Interaction SDK Gets Hand Tracking Teleport Gesture With Demo Game

Meta added a hand tracking teleportation system to its Interaction SDK.

The Interaction SDK is a Unity framework providing high quality common hand interactions for controllers and hand tracking. It includes direct object grabbing, distance grabbing, interactable UI elements, gesture detection, and more. This means developers don’t have to reinvent the wheel, and users don’t have to relearn interactions between apps using the SDK.

The next version of the Interaction SDK adds gestures and visualization for teleportation and snap turning when using controller-free hand tracking. Gesture-based locomotion systems like this are necessary for adding hand tracking to apps and games where you explore a virtual world.

To point to where you want to teleport, you turn your hand to the side and extend your index finger and thumb while closing your other fingers to your palm. To perform the teleport, just pinch your index finger to your thumb. It’s somewhat similar to the pinch “click” used in the Quest system interface, but with your hand rotated.
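
As a rough illustration of the logic described above, here is a hedged Swift sketch of the aim-then-pinch check; the HandPose structure and the distance thresholds are hypothetical stand-ins, not Meta's Interaction SDK API.

```swift
import simd

// Hypothetical hand-pose data; real hand tracking exposes far more joints than this.
struct HandPose {
    var thumbTip: SIMD3<Float>
    var indexTip: SIMD3<Float>
    var middleCurl: Float   // 0 = extended, 1 = fully curled
    var ringCurl: Float
    var pinkyCurl: Float
}

// Aiming pose: index and thumb extended and apart, other fingers closed toward the palm.
func isTeleportAim(_ hand: HandPose) -> Bool {
    let othersClosed = hand.middleCurl > 0.8 && hand.ringCurl > 0.8 && hand.pinkyCurl > 0.8
    return othersClosed && simd_distance(hand.thumbTip, hand.indexTip) > 0.03
}

// Confirmation: pinching index finger to thumb triggers the teleport.
func isTeleportConfirm(_ hand: HandPose) -> Bool {
    return simd_distance(hand.thumbTip, hand.indexTip) < 0.015
}
```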

Some hand tracking apps such as Waltz Of The Wizard already implement their own teleportation gesture, but Interaction SDK should let any developer add it without needing to build their own.

You can try out Meta’s hand tracking teleportation system in the First Hand demo on App Lab. It showcases many Interaction SDK features, and now has a Chapter 2 to show locomotion too.

Quest Gets A System-Wide Blocking API, But Will Developers Use It?

Quest now has a system-wide Blocking API as part of the Platform SDK.

This means developers can keep people you already blocked in other apps away from you.

While Meta provides a friends and matchmaking service, most developers use their own systems so they can release on multiple platforms, including SteamVR and non-VR consoles. But this means that if someone harasses, trolls, or annoys you in one app, blocking them won’t stop them from interacting with you in other apps.

When a user taps an in-app block button, developers can now trigger a system popup to block the offender’s Oculus account too. Given that this popup returns the response to the app, it could even replace the application’s own blocking interface.

Developers can also retrieve a list of app-scoped unique identifiers for people who own the same app and whom the current user has already blocked at the system level. This information can be used to at least mute those users, or ideally to prevent them from being in the same session at all.
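
Here is a minimal sketch of how an app might consume that information, with hypothetical Swift stand-ins for the actual Platform SDK calls:

```swift
// Hypothetical stand-ins for the Platform SDK's block popup and blocked-user query;
// the names and signatures here are illustrative, not Meta's actual API.
protocol SystemBlocking {
    // Shows the system block popup and reports whether the user confirmed the block.
    func launchBlockFlow(userID: UInt64, completion: @escaping (Bool) -> Void)
    // Returns app-scoped IDs of users the player has already blocked at the system level.
    func blockedUserIDs() -> Set<UInt64>
}

struct SessionFilter {
    let blocking: any SystemBlocking

    // Keep system-blocked users out of the same multiplayer session entirely.
    func eligiblePeers(from candidates: [UInt64]) -> [UInt64] {
        let blocked = blocking.blockedUserIDs()
        return candidates.filter { !blocked.contains($0) }
    }
}
```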

Documentation for Blocking API is available on the Oculus Developer Website.

Using the Blocking API is currently encouraged but not required. It’s unclear how many developers will actually adopt the feature, especially those focused on multiple platforms. Given the recent interest from wider media in negative interactions in social VR spaces, though, might Meta one day enforce its use for apps on the Quest Store?

Quest Developers Can Now Use Your Walls & Furniture For Mixed Reality

Quest developers can now use your walls & furniture from Room Setup.

Room Setup shipped last week as an experimental feature. It lets you mark out your walls, doors, windows, and furniture from inside Quest’s real world passthrough view, using the VR controllers.

With the release of v40 of the Quest SDK, developers can now access these walls, doors, and windows – as well as your couch and desk – to build mixed reality applications. Marking your couch was added to the Quest OS in February 2021, and marking your desk in April 2021, but developers haven’t been able to access these until now.
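
As a rough sketch of the kind of data this exposes, the snippet below models room-layout anchors and a simple use of them; the types and field names are hypothetical, not the actual Quest SDK (which is exposed through Unity and native APIs).

```swift
// Hypothetical room-layout data an app might receive from scene understanding;
// the types and fields are illustrative, not the actual Quest SDK API.
enum SceneAnchorKind { case wall, door, window, couch, desk }

struct SceneAnchor {
    let kind: SceneAnchorKind
    let center: SIMD3<Float>    // position of the marked plane or volume
    let extents: SIMD3<Float>   // its width, height, and depth
}

// Example use: hang a virtual poster on every real wall the user marked in Room Setup.
func placePosters(on anchors: [SceneAnchor], spawnPoster: (SceneAnchor) -> Void) {
    for anchor in anchors where anchor.kind == .wall {
        spawnPoster(anchor)
    }
}
```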

Quest 2’s passthrough view is low resolution and black & white – it was originally only intended for setting up your Guardian boundary. So while apps using this new scene understanding will run on Quest 2, it’s clear the real purpose of the new functionality is for Project Cambria, Meta’s upcoming “high end” headset with high resolution color passthrough. From a practical perspective, Quest 2 seems to be a low end development kit for mixed reality, not the ideal target device.

Having to mark out walls, doors, windows, couches, desks and furniture manually is a fairly arduous and imprecise process. Last week Meta’s CTO explained why this isn’t yet automatic, saying “Segmentation is getting better all the time but still has error. The risk of getting it wrong is a concern as it relates to how people can safely navigate a physical space.” Consulting CTO John Carmack went into more detail on Twitter, saying “There are a good number of smart people at Meta working on understanding the world from camera images, but most of it isn’t production ready.”

But will this always be the case for Project Cambria? It will feature much higher resolution cameras and active depth sensing, which Meta says “helps reconstruct surfaces with higher fidelity and accuracy”. In the announcement of the v40 SDK, Meta Product Manager Rangaprabhu Parthasarathy hinted “in the future, devs will be able to add more bells and whistles to their apps that are only available on this device”.

Documentation for Scene Understanding in Unity is available on the Oculus Developer website.

Tech Secrets Behind ‘Cosmonious High’s’ Cast of Interactive Characters

Cosmonious High contains 18 characters across six species, all created by a team with zero dedicated animators. That means lots and lots of code to create realistic behaviors and Owlchemy-quality interactivity! The ‘character system’ in Cosmonious High is a group of around 150 scripts that together answer many design and animation problems related to characters. Whether it’s how they move around, look at things, interact with objects, or react to the player, it’s all highly modular and almost completely procedural.

This modularity enabled a team of content designers to create and animate every single line of dialogue in the game, and for the characters to feel alive and engaging even when they weren’t in the middle of a conversation. Here’s how it works.

Guest Article by Sean Flanagan & Emma Atkinson

Cosmonious High is a game from veteran VR studio Owlchemy Labs about attending an alien high school that’s definitely completely free of malfunctions! Sean Flanagan, one of Owlchemy’s Technical Artists, created Cosmonious High’s core character system amongst many other endeavors. Emma Atkinson is part of the Content Engineering team, collectively responsible for implementing every narrative sequence you see and hear throughout the game.

The Code Side

Almost all code in the character system is reusable and shared between all the species. The characters in Cosmonious High are a bit like modular puppets—built with many of the same parts underneath, but with unique art and content on top that individualizes them.

From the very top, the character system code can be broken down into modules and drivers.

Modules

Every character in Cosmonious High gets its behavior from its set of character modules. Each character module is responsible for a specific domain of problems, like moving or talking. In code, this means that each type of Character is defined by the modules we assign to it. Characters are not required to implement each module in the same way, or at all (e.g. the Intercom can’t wave.)

Some of our most frequently used modules were:

CharacterLocomotion – Responsible for locomotion. It specifies the high-level locomotion behavior common to all characters. The actual movement comes from each implementation. All of the ‘grounded’ characters—the Bipid and Flan—use CharacterNavLocomotion, which moves them around on the scene Nav Mesh.

CharacterPersonality – Responsible for how characters react to the player. This module has one foot in content design—its main responsibility is housing the responses characters have when players wave at them, along with any conversation options. It also houses a few ‘auto’ responses common across the cast, like auto receive (catching anything you throw) and auto gaze (returning eye contact).

CharacterEmotion – Keeps track of the character’s current emotion. Other components can add and remove emotion requests from an internal stack.

CharacterVision – Keeps track of the character’s current vision target(s). Other components can add and remove vision requests from an internal stack.

CharacterSpeech – How characters talk. This module interfaces with Seret, our internal dialogue tool, directly to queue and play VO audio clips, including any associated captions. It exposes a few events for VO playback, interruption, completion, etc.

It’s important to note that animation is a separate concern. The Emotion module doesn’t make a character smile, and the Vision module doesn’t turn a character’s head—they just store the character’s current emotion and vision targets. Animation scripts reference these modules and are responsible for transforming their data into a visible performance.
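
To make the pattern concrete, here is a small Swift sketch of the module-and-request-stack idea; the names echo the modules above, but the implementation details are illustrative rather than Owlchemy's actual (Unity) code.

```swift
enum Emotion { case neutral, happy, worried }

// Other components push and remove emotion requests; the module only stores the
// current winner and never animates anything itself.
final class CharacterEmotion {
    private var requests: [(source: String, emotion: Emotion)] = []

    func addRequest(from source: String, emotion: Emotion) {
        requests.append((source: source, emotion: emotion))
    }

    func removeRequest(from source: String) {
        requests.removeAll { $0.source == source }
    }

    // The most recent request wins; animation drivers read this value each frame.
    var current: Emotion { requests.last?.emotion ?? .neutral }
}

// The same stack idea applies to vision targets.
final class CharacterVision {
    private var targets: [String] = []
    func addTarget(_ target: String) { targets.append(target) }
    func removeTarget(_ target: String) { targets.removeAll { $0 == target } }
    var currentTarget: String? { targets.last }
}

// A character type is defined by the modules assigned to it; not every character
// implements every module (the Intercom, for example, has no locomotion).
struct Character {
    let emotion: CharacterEmotion
    let vision: CharacterVision?
}
```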

Drivers

The modules that a character uses collectively outline what that character can do, and can even implement that behavior if it is universal enough (such as Speech and Personality.) However, the majority of character behavior is not capturable at such a high level. The dirty work gets handed off to other scripts—collectively known as drivers—which form the real ‘meat’ of the character system.

Despite their more limited focus, drivers are still written to be as reusable as possible. Some of the most important drivers—like CharacterHead and CharacterLimb—invisibly represent some part of a character in a way that is separate from any specific character type. When you grab a character’s head with Telekinesis, have a character throw something, or tell a character to play a mocap clip, those two scripts are doing the actual work of moving and rotating every frame as needed.

Drivers can be loosely divided into logic drivers and animation drivers.

Logic drivers are like head and limb—they don’t do anything visible themselves, but they capture and perform some reusable part of character behavior and expose any important info. Animation drivers reference logic drivers and use their data to create character animation—moving bones, swapping meshes, solving IK, etc.

Animation drivers also tend to be more specific to each character type. For instance, everyone with eyes uses a few instances of CharacterEye (a logic driver), but a Bipid actually animates their eye shader with BipedAnimationEyes, a Flan with FlanAnimationEyes, etc. Splitting the job of ‘an eye’ into two parts like this allows for unique animation per species that is all backed by the same logic.
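
A compact sketch of that logic/animation split, again with illustrative Swift stand-ins rather than Owlchemy's actual scripts:

```swift
// Logic driver: stores where the eye should look and how open it is,
// but does nothing visible on its own.
final class CharacterEye {
    var lookTarget = SIMD3<Float>(repeating: 0)
    var openAmount: Float = 1.0
}

// Animation drivers read logic drivers each frame and produce per-species visuals.
protocol EyeAnimationDriver {
    func apply(from eye: CharacterEye)
}

struct BipedAnimationEyes: EyeAnimationDriver {
    func apply(from eye: CharacterEye) {
        // e.g. push eye.lookTarget and eye.openAmount into the Bipid eye shader parameters
    }
}

struct FlanAnimationEyes: EyeAnimationDriver {
    func apply(from eye: CharacterEye) {
        // same logic data, different visual treatment for the Flan species
    }
}
```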


Meta Releases UE4 Graphics Demo to Show What Quest 2 Can Do with Expert Optimization

In an effort to help PC VR developers bring their content to Quest 2, Meta has ported Showdown, an old UE4 VR graphics showcase, to the headset as a case study in optimization best practices.

Showdown is a UE4 PC VR demo originally made by Epic Games back in 2014 to show off high-fidelity VR graphics running at 90Hz on a GTX 980 GPU at a 1,080 × 1,200 (1.3MP) per-eye resolution.

Eight years later, you can now run Showdown on Quest 2 at 90Hz on the headset’s Snapdragon XR2 chip at 1,832 × 1,920 (3.5MP) per-eye resolution.

Meta ported the short demo as a case study in optimizing PC VR content to run on Quest 2.

And while the app has been heavily optimized and doesn’t look as good as its PC VR counterpart—decent anti-aliasing, lighting, and high-res textures are missing—it shows that developers don’t have to shy away from lots of objects, particles, and effects just because they’re targeting Quest 2.

The video above looks slightly worse than the experience in the headset due to a low-ish bitrate recording and the visibility of fixed foveated rendering (lower resolution in the corners of the image), which is significantly less noticeable in the headset itself thanks to the blurring of the lens. Here’s Showdown running on PC if you’d like to see a comparison.

It’s not the best-looking thing we’ve seen on Quest 2, but it’s a good reminder that Quest 2’s low-power mobile chip can achieve something akin to PS2 graphics at 90Hz.

Meta’s Zac Drake published a two-part breakdown of the process of profiling the app’s performance with the company’s App Spacewarp tech, and the process of optimizing the app to run at 90Hz on Quest 2.

The GTX 980 GPU (which Showdown originally targeted on PC) is at least six times more powerful than the GPU in Quest 2… so there was a lot of work to do.

While the guide is specific to projects built with UE4, the overall process, as summarized by Drake, applies to optimizing any project to run on the headset:

  1. Get the project building and running on Quest
  2. Disable performance-intensive features
  3. Measure baseline performance
  4. Optimize the stripped-down project
  5. Optimize individual features as we re-enable them:
     • Re-enable the feature
     • Measure performance impact
     • Optimize as needed

Although it’s plenty possible to get ambitious PC VR games running on Quest 2, building from the ground up with the headset in mind from the outset is sure to bring better results, as developer Vertical Robot is hoping to prove with its upcoming Red Matter 2.
