‘Batman: Arkham Shadow’ Development Started in Late 2020, Rocksteady Alums Among the Dev Team

Batman: Arkham Shadow is the next big first-party title from Meta, headed exclusively to Quest 3 later this year. In an interview with the founder of the studio behind the game, we learned more about the origin of the title, the team behind it, and which of the original Batman: Arkham games it’s most closely aligned with.

Batman: Arkham Shadow has a lot to live up to. Not only Camouflaj’s last game—Iron Man VR, which we called “VR’s first great superhero game”—but also the legacy of the core Batman: Arkham games, which to many represent not just some of the best superhero games ever made, but some of the best games, period.

So when studio founder Ryan Payton says he’s “confident [Batman: Arkham Shadow] will be the best game we’ve shipped to date,” he’s putting a lot on the line.

But there’s reason to believe that Camouflaj can pull it off. The studio now has years of experience in VR game design, and has proven itself able to create fun and comfortable mechanics that many thought couldn’t work well in VR, like high speed flying using repulsor jets on your hands in Iron Man VR.

The studio also has the backing of Meta itself, which acquired Camouflaj in late 2022. What’s more, as we learn in our interview with Payton, Camouflaj picked up two Rocksteady alums, both of whom worked on the original Batman: Arkham games.

Below is our interview with Payton, which covers the studio’s transition into Meta, its learnings from developing Iron Man VR, and how the studio is positioning itself to deliver on the fantasy of being not just Batman, but specifically the Batman that players came to love in the Arkham games.

Batman: Arkham Shadow Interview with Camouflaj Founder Ryan Payton

Q: How has the transition been after the Meta acquisition?

A: Being part of Meta has been absolutely terrific. I’m reminded every day that my core thesis for Camouflaj joining Meta has thus far proven true: becoming a first-party studio has further enabled Camouflaj to pursue our goals of creating high quality, meaningful games. Our next release, Batman: Arkham Shadow, has definitely benefited from Camouflaj being part of the larger Meta organization, which is exciting for us as developers, and is great for players, too.

Q: How was Batman chosen as the topic of your next game?

A: When we were wrapping development work on Marvel’s Iron Man VR back in 2020, Meta called us and said they loved the game and would really love to see what Camouflaj could do with something of even greater size, scope, and ambition. We explored a few options but quickly settled on Batman, which then kicked off a six-month discussion with Warner Bros. Interactive about not only creating a big, exclusive Batman title for Meta Quest 3, but one that is an official entry in the storied Arkham franchise.

Since day one WB and DC have been terrific partners. They’ve provided guidance where necessary while also yielding a surprising degree of freedom for Camouflaj to explore areas of the Arkham franchise that have yet to be developed, encouraging us to put our stamp on those moments. We’ve struck, in my opinion, the perfect balance between IP partner and development studio to enable us to make something special.

As a lifelong Batman fan and admirer of the Arkham games, this project has been a dream come true. Batman: Arkham Shadow is the manifestation of all the strengths and learnings Camouflaj has developed over the past eight years developing VR games. I’m confident it will be the best game we’ve shipped to date.

Q: How long has Batman: Arkham Shadow been in development in earnest?

A: Development began in earnest sometime in Q3 of 2020, so it will have been about a four-year journey once we wrap development later this year.

Q: How did you approach the game’s overarching direction? There are expectations from the previous Arkham games, but given that this is a VR game, surely there are some unique considerations—were there any aspects that felt ‘essential’ in the game? Any that had to be thrown out because they didn’t fit in VR?

A: We initially went to WB with a back-of-the-napkin list of core Arkham features we felt were essential to getting right in VR. At the top of that list was the freeflow, rhythm-based combat that the team at Rocksteady revolutionized with Arkham Asylum. We also spoke about the importance of nailing the feeling of being Batman, with his graceful movements, the ability to grapple up to perches, run and slide through vent covers, and silently take down enemies from the shadows.

Early in the negotiations with WB we showed up with two pivotal prototypes: one focused on combat, and the other on locomotion. Thankfully they were impressed with what we delivered, and thus development work on Batman: Arkham Shadow commenced. From there we went further down the checklist of classic Arkham elements we knew needed to work great in VR, including exploration, boss battles, investigations, gadgets, puzzles, collectibles, and character-driven cinematics.

When players get their hands on Batman: Arkham Shadow later this year, I think they’ll be impressed with the degree to which we’ve not only faithfully translated all the classic Arkham elements into VR, but how fresh and exciting it all feels.

Q: What, if anything, learned during the development of Iron Man VR has informed the design of Batman: Arkham Shadow?

A: Mechanically Iron Man VR and Batman: Arkham Shadow are very different games, which has been a challenge in its own right. One thing I think we’ve proven to excel at as a studio is creating highly polished VR mechanics that feel comfortable despite how fast-moving our games tend to be. When we first showed off Iron Man VR, people doubted our ability to make flying as Iron Man feel fast and comfortable, and yet the team pulled it off. With Arkham Shadow, we knew grappling up to high perches and then gliding down onto unsuspecting enemies, in VR, could be a challenge from a comfort perspective, not to mention how intense Arkham’s freeflow combat is. Yet again, however, I suspect the public will be surprised at how great everything feels when they get their hands on Arkham Shadow later this year. This is all truly a testament to the incredible colleagues I have here at Camouflaj.

Q: What, if anything, from Iron Man VR was not as successful as you would have liked, and you either scrapped or majorly reworked it to make things better for Batman: Arkham Shadow?

A: During the development of Arkham Shadow, we supported the amazing development team at Endeavor One to bring Iron Man VR to Quest 2 in time for holiday 2022. Although some of our engineers were concerned that it would be a distraction from our core work on Arkham Shadow, it ultimately ended up being an incredibly positive experience as it provided valuable insights in terms of how to push the Quest hardware. Supporting the development effort in bringing Iron Man VR to Quest ensured Arkham Shadow wouldn’t be our first Quest-focused game. From there we applied those learnings, allowing Arkham Shadow to take full advantage of all the strengths of Meta Quest 3.

Q: How has the development and design of Batman: Arkham Shadow been similar or different from Iron Man VR?

A: I don’t think it’s an exaggeration to say that the development of Arkham Shadow couldn’t be more different from how we built anything prior. We learned a lot developing our previous titles, and knew we needed to evolve as a team—to take it to the next level.

Throughout developing Iron Man VR I had a nagging feeling there was a better way, and then I came across a gentleman named Bill Green, who had left Rocksteady after shipping all the Arkham titles there. I was able to convince him to join Camouflaj, and since then Bill has truly transformed the way we develop games. Bill in turn convinced Sophie Leal-Bolea, a designer on both Arkham Origins and Arkham Knight, to join Camouflaj as well; she has also been an incredible boon to the team.

Since Bill and Sophie joined, we’ve incorporated many of their lessons and transformed the way we make games here. Tangibly it means our designers and artists fully embrace the nature of highly iterative work, and develop pipelines and tools to support drastically uprooting areas that aren’t working for the game and story, and being equipped to make improvements in a matter of days. (The same changes that, back on Iron Man VR, would have taken us weeks if not months.) Bill and Sophie’s approach has supercharged the way we make games here, and I couldn’t be more thankful.

Ultimately it all comes down to the end user experience, so my hope is that players will find that, like the previous Arkham games, the world of Arkham Shadow feels expertly crafted and a joy to explore. We have some incredibly fun Arkham Asylum-inspired dungeons in particular that I can’t wait to see players explore.

Q: As far as ‘game type’ goes, what are you aiming for? A straightforward linear narrative adventure?

A: Above all, the structure of Arkham Shadow is inspired by the first game in the series: Arkham Asylum. This means the game isn’t open world, but it affords a high degree of freedom for players to explore a courtyard-like area that connects the game’s dungeons. Players are free to return to dungeons in order to unlock secrets once they obtain a new gadget. I absolutely love that game loop, and it feels appropriate for the setting of the game, which we have yet to fully lift the veil on…

– – — – –

Batman: Arkham Shadow is due out Fall 2024. Do you think it will live up to the bar set by its predecessors? Sound off in the comments below!

The post ‘Batman: Arkham Shadow’ Development Started in Late 2020, Rocksteady Alums Among the Dev Team appeared first on Road to VR.

Oculus Founder’s New XR Headset Built Around “military requirements” but Also “used for non-military stuff”

This week at AWE 2024, Oculus founder Palmer Luckey spoke briefly about a new XR headset he’s developing. Though the headset would see Luckey returning to the XR space, this one is being built with military, rather than consumer, uses in mind.

Luckey confirmed earlier this month that he intends to build a new XR headset, but revealed almost no details about the project.

During a panel discussion today at AWE 2024, he offered just a bit more, saying the headset’s design is “being driven by military requirements, but also going to be used for non-military stuff. It’s really cool, it’s really something.”

Though Luckey was on stage with Bigscreen Beyond creator Darshan Shankar, the two didn’t indicate any joint work.

Luckey said he’s announcing the project now because trying to keep it secret means fewer opportunities to find key collaborators and suppliers.

His mention of the upcoming headset’s military requirements suggests the project originates from within his current company, Anduril, a tech-focused military contractor.

Although “military requirements” can often be seen as synonymous with “incredibly expensive,” Luckey has grown Anduril into a multi-billion dollar company on the premise that major military contractors charge too much and deliver more slowly than they ought to.

Given the headset’s military focus, it seems unlikely it would plug into any consumer XR ecosystems like Horizon OS or SteamVR. That leaves it up in the air whether the headset will be built on a proprietary platform—and how it will support the “non-military stuff” that Luckey mentioned. That “stuff” likely refers to enterprise-focused use-cases like training, education, and design.

Luckey founded Oculus in 2012, the company whose Rift headset was the spark that rebooted the modern era of VR. As a rapidly growing startup, Oculus attracted the attention of Meta (at the time Facebook), which acquired the company in 2014 for more than $2 billion. Luckey continued in VR under Meta’s roof for several years but was eventually pushed out of the company due to backlash to his politics. After leaving Meta, Luckey went on to found Anduril, a tech-defense startup which itself went on to achieve a multi-billion dollar valuation. Though Luckey hasn’t been active in XR since leaving Meta, he’s continued to be looked to as a thought leader in the space.

The post Oculus Founder’s New XR Headset Built Around “military requirements” but Also “used for non-military stuff” appeared first on Road to VR.

‘Gorilla Tag’ Has Topped $100M in Revenue, Making it One of VR’s Most Successful Games

VR studio Another Axiom today announced that its breakout title, Gorilla Tag, has surpassed $100 million in revenue. The company shared other key metrics about its player population that shine light on the state of the VR market.

Not long after its launch back in early 2021 it became clear there was something special about Gorilla Tag. Its minimalistic ‘tag’ gameplay, unique arm-based locomotion, and novel social architecture made for simple social fun. And it turns out, people really like simple social fun.

This week the studio behind the game, Another Axiom, offered up some key metrics for Gorilla Tag that show just how large it has become. Here’s the quick and dirty:

  • $100 million in total revenue
  • 10 million lifetime players
  • 3 million monthly active users
  • 1 million daily active users

Gorilla Tag’s revenue comes primarily through in-game cosmetics, which allow for try-ons and a social shopping experience.

These figures make Gorilla Tag one of the most successful and most popular VR games to date. Not only are lots of people playing the game, Another Axiom also revealed the average playtime is nearly 60 minutes. That’s doubly impressive considering how physical a game it can be.

As for what’s next? The studio isn’t leaving Gorilla Tag behind, but it’s busy at work on a spiritual successor to Lone Echo, the game which inspired Gorilla Tag’s movement in the first place.

The post ‘Gorilla Tag’ Has Topped $100M in Revenue, Making it One of VR’s Most Successful Games appeared first on Road to VR.

Quest ‘Augments’ Feature for Concurrent AR Apps Needs More Time to Cook, Says Meta CTO

Last year Meta announced the so-called Augments feature, planned for Quest 3, which would allow persistent mini AR apps to live in the world around you. Now, eight months after the headset hit store shelves, Meta’s CTO explains why the feature has yet to ship.

Augments was announced as a framework for developers to build mini AR apps that could not just live persistently in the space around you, but also run concurrently alongside each other—similar to how most apps work on Vision Pro today.

Image courtesy Meta

And though Meta had shown a glimpse of Augments in action when it was announced last year, it seems the company’s vision (and desire to market that vision) got ahead of its execution.

This week Meta CTO Andrew “Boz” Bosworth responded to a question during an Instagram Q&A about when the Augments feature would ship. He indicated the feature as initially shown wasn’t meeting the company’s expectations.

We were playing with [Augments] in January and we decided it wasn’t good enough. It was too held back by some system architecture limitations we had; it ended up feeling more like a toy and it didn’t really have the power that we think it needed to deliver on the promise of what it was.

So we made a tough decision there to go back to the drawing board, and basically [it needed] a completely different technical architecture. Starting from scratch basically. Including actually a much deeper set of changes to the system to enable what we wanted to build there. I think we made the right call—we’re not going to ship something we’re not excited about.

But it did restart the clock, and so [Augments is] going to take longer than we had hoped to deliver. I think it’s worthwhile, I think it’s the right call. But that’s what happened.

We’re only two-and-a-half months out from Meta Connect 2024, which would be the one-year anniversary of the Augments announcement. That’s where we’re likely to hear more about the feature, but at this point it’s unclear if it could ship by then.

The post Quest ‘Augments’ Feature for Concurrent AR Apps Needs More Time to Cook, Says Meta CTO appeared first on Road to VR.

Hands-on: Logitech’s MX Ink for Quest Could Legitimize VR Styluses as a Whole

Over the last decade I’ve reported on and tested many different VR styluses, but none of them have actually caught on. But the new MX Ink stylus for Quest stands a real chance at legitimizing the VR stylus as a whole, thanks to its thoughtful design, strong lineup of launch apps, and tight integration with Quest’s software.

This week Logitech announced MX Ink, an officially endorsed ‘Made for Meta’ stylus supporting Quest 2 and Quest 3 (see the full announcement details here). It’s the first time Meta has allowed any other company to harness its tracking technology in a third-party product. That alone makes MX Ink unique, but there’s more that makes this the device that could legitimize VR styluses as a whole.

The first styluses are thought to have been invented five millennia ago. And there’s a reason they’ve stuck with humanity ever since: a stylus amplifies the precision with which we can point. While that seems rather simple, it makes information tasks like writing, drawing, calculating, and designing significantly more practical and useful than using our fingers alone.

So it’s not surprising that we’ve seen many attempts to bring a VR stylus to life.

Just to name a few: in 2017 an enterprising developer hacked together a chunky prototype using a Vive Tracker and a pressure-sensitive stylus tip; in 2018 a company called Massless designed its own prototype VR stylus that it hoped to bring to market; even Wacom has been toying with the idea. Hell, Logitech already made a VR stylus back in 2019… but at $750, it’s no wonder it never made it to general availability.

So what could be different about Logitech’s new MX Ink? Well for one, the price is significantly more palatable than what’s come before. The $130 price point is a pretty easy sell for professionals for whom the added precision of a stylus could actually improve their workflow.

Logitech is also smartly launching some ‘nice to have’ extras for those who are really serious about making the MX Ink part of their workflow.

There’s the Inkwell dock which, for only another $40, gives you an easy place to store and charge the stylus so it’s ready for your next use. And there’s the MX Mat, for $50, which Logitech pitches as the ideal surface to make it feel like you’re drawing on a paper-like material when using the stylus.

Photo by Road to VR

But more important than price or accessories is the first-party integration with Meta and the strong lineup of supported software out of the gate.

Logitech worked directly with Meta, not only to adopt Quest’s tracking technology, but also to build the stylus’ software experience right into Horizon OS. Pairing the MX Ink is just like pairing one of the headset’s own controllers, without any extra hardware or software needed. Even the stylus’ settings—which let you control things like hand selection, button bindings, and pressure curves—are baked right into the system’s own Settings menu.

It’s even got a proper ‘Meta’ button on the end (where the eraser would be), making it easy to pull up the headset’s menu.

And then there’s the strong lineup of software that will work right out of the gate. Logitech has locked in a solid swath of VR design apps for MX Ink support:

  • Adobe Substance Modeler
  • Gravity Sketch
  • PaintingVR
  • Arkio
  • Engage
  • OpenBrush
  • GestureVR
  • ShapesXR
  • Elucis by RealizeMedical

If Logitech plays its cards right, MX Ink could be the first VR stylus that really sticks the landing. So needless to say, I was intrigued to try it.

Hands-on With Logitech MX Ink for Quest

Photo by Road to VR

Last week I swung by Logitech’s San Jose, CA office to check out an early version of the stylus for myself. Compared to the company’s last VR stylus, the MX Ink is significantly more compact. Even so, I was impressed with the tracking.

Photo by Road to VR

Even with my hand covering a significant area of the stylus, there were seemingly enough IR LEDs hidden under the stylus’ shell to provide continuous tracking no matter how I held or twisted the stylus. The company said it even put IR LEDs toward the tip of the MX Ink so it could be held like a wand or a palette knife.

Logitech says the stylus is ‘as accurate as the Quest controllers’—but that doesn’t mean it can’t be more precise. Using a stylus as a pointing device means you can use your dexterous fingers to manipulate the input position in a very fine way; far more so than twisting your wrist alone (which is what primarily drives fine controller motion).

That was obvious while I was using the MX Ink to draw and sketch directly onto a real table in front of me. The pressure sensitive tip also made it feel natural to vary line width as needed.

Photo by Road to VR

I also tried using the MX Ink stylus against a whiteboard while using Quest 3’s mixed reality view. The tight latency and accuracy of the stylus really made it feel like I was leaving marks on the whiteboard. It was a whole layer of immersion that I wasn’t expecting to feel while trying the stylus.

This sense of actually leaving real marks on the whiteboard only made the next part even more mind-bending… I could lift the stylus from the surface while holding the button on the barrel and extend my drawing into the third dimension. Watching my strokes literally leap off the page like this was just plain fun.

While pressing the MX Ink against a real surface, the tip communicates the amount of pressure to the headset and thus changes the thickness of the line you draw. But when you’re using the stylus to draw in 3D, suddenly there’s no way for the system to know how much pressure you’re using, right? Actually, no; Logitech smartly made the button on the barrel of the stylus pressure sensitive itself, so you can squeeze softer or harder to define the width of brush strokes, even when you’re drawing in the air.
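To illustrate how a pressure reading might translate into stroke width, here is a hypothetical mapping of the kind a drawing app could use. This is a sketch under assumptions, not Logitech’s actual driver code; the `curve` exponent stands in for the user-tunable “pressure curves” mentioned in the stylus’ settings.

```python
# Hypothetical pressure-to-width mapping (illustration only, not MX Ink code).
# The curve exponent is an assumption standing in for a tunable pressure curve.
def brush_width(pressure, min_w=1.0, max_w=12.0, curve=1.5):
    """Map a normalized pressure reading in [0, 1] to a brush width."""
    p = min(max(pressure, 0.0), 1.0)   # clamp noisy or out-of-range readings
    return min_w + (max_w - min_w) * (p ** curve)
```

A curve exponent above 1 makes light touches produce thin lines while reserving the widest strokes for firm pressure, which is why pressure curves are usually exposed as a user preference rather than hard-coded.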

The MX Ink even includes a haptic engine for feedback. So even if you’re using it against a virtual surface, the stylus can let you know when you’re touching the canvas.

– – — – –

I’m impressed with the level of thoughtfulness in the design of MX Ink. It’s clear the company has carried over some important lessons learned from its previous experiments with VR styluses.

MX Ink has a reasonable price point, direct integration with the most popular headsets on the market, and a strong lineup of supporting apps. Logitech is giving the VR stylus—as a category—its best chance yet at really catching on.

The essential pieces are in place. The thing that will make or break this product is now likely down to how well integrated it is into the workflow of key applications. My understanding is that developers have a huge range of control over exactly how their applications will handle MX Ink. Half-hearted implementations could kill what otherwise looks like a strong product.

With MX Ink not due to launch until September, there’s time still for applications to tighten up their implementations, so we’ll have to wait to see how it all comes together.

The post Hands-on: Logitech’s MX Ink for Quest Could Legitimize VR Styluses as a Whole appeared first on Road to VR.

VisionOS 2 Enables WebXR by Default, Unlocking a Cross-platform Path to Vision Pro

We’ve known that Apple planned to support WebXR for quite some time, but with VisionOS 2, the company is enabling the feature for all users. WebXR allows developers to deliver cross-platform XR experiences directly from the web, with no gatekeepers to approve or reject content.

WebXR is the widely supported web standard that allows developers to deliver AR and VR content from the web.

Just like anyone can host a webpage online without any explicit approval from another party, WebXR allows the same for AR and VR content. And because it’s delivered through the browser, accessing and sharing WebXR experiences is as easy as clicking or sending a link—like this one.

Vision Pro has offered initial WebXR support since its launch, but it required users to manually enable the feature by digging into Safari’s settings.

With VisionOS 2—available today as a developer preview, and coming to all users this Fall—WebXR will be enabled by default, making it easy for anyone to access WebXR through the headset. Vision Pro thus joins headsets like Quest, HoloLens 2, and Magic Leap 2 in supporting WebXR content.

Though WebXR is “supported” on VisionOS 2, our understanding is that it only supports VR (or ‘fully immersive’) experiences. WebXR is also capable of delivering AR experiences (where virtual content is merged with a view of the real world), but VisionOS 2 doesn’t yet support that portion of the standard.

There are many reasons why developers might want to use WebXR to build experiences over native apps that are distributed through a headset’s proprietary store.

For one, any headset with WebXR support can run any compatible WebXR experience, meaning developers can build one experience that works across many headsets, rather than needing to make multiple builds for multiple headsets, then uploading and managing those builds across multiple platform stores.

Like a webpage, WebXR content can also be updated at any time, allowing developers to tweak and enhance the experience on the fly, without needing to upload new builds to multiple stores, or for users to download a new version.

WebXR also has no gatekeepers. So content that wouldn’t be allowed on, say, Apple or Meta’s app stores—either for technical or content-related reasons—can still reach users on those headsets. That could include adult content that’s explicitly forbidden on most platform app stores.

The post VisionOS 2 Enables WebXR by Default, Unlocking a Cross-platform Path to Vision Pro appeared first on Road to VR.

VisionOS 2 Improvement Targets Key Vision Pro Critique Among Developers

For basic Vision Pro interactions like navigating apps and scrolling web pages, the headset’s look-and-pinch input system works like magic. But if you want to go more ‘hands-on’ with virtual content, the headset’s full hand-tracking leaves much to be desired.

Compared to Quest 3, Vision Pro’s full hand-tracking has notably more latency. That means when moving your hands it takes longer for the headset to register the movement. Especially in interactive content where you directly grab virtual objects, this can make the objects feel like they lag behind your hand.

Changes coming in VisionOS 2 stand to improve hand-tracking. Apple detailed the changes in a developer session at WWDC 2024 this week.

For one, the headset will now report estimated hand positions at 90Hz instead of the previous 30Hz. That means the system can reflect changes in hand position in one-third of the time, also making the movement of the hand smoother thanks to more frequent updates. This only applies to a small portion of the overall latency pipeline (which we previously estimated at a total of 127.7ms) but it could reduce hand-tracking latency by as much as 22ms in the best case scenario.
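The arithmetic behind that best-case figure is straightforward: at 30Hz a new hand pose arrives roughly every 33.3ms, while at 90Hz one arrives roughly every 11.1ms, so a worst-timed frame sees a pose that is about 22ms fresher. A quick sanity check:

```python
# Best-case latency saving from raising the hand-pose update rate
# from 30 Hz to 90 Hz: the gap between consecutive pose updates shrinks.
old_interval_ms = 1000 / 30   # ~33.3 ms between pose updates at 30 Hz
new_interval_ms = 1000 / 90   # ~11.1 ms between pose updates at 90 Hz
best_case_saving_ms = old_interval_ms - new_interval_ms
print(round(best_case_saving_ms, 1))  # prints 22.2, matching the ~22 ms figure
```

Note this only trims the pose-sampling stage; the rest of the pipeline (camera capture, rendering, display) is untouched, which is why the overall latency remains well above this saving.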

Here’s a look at that in action:

It’s an improvement, but you can still easily see the latency of the teapot compared to the hand, even with this slow movement.

For a snappier experience, VisionOS 2 will alternatively allow developers to enable hand-tracking prediction, which provides an estimate of the user’s future hand position. While this doesn’t truly reduce latency, it can reduce perceived latency in many cases. Similar prediction techniques are common across various XR tracking systems; it’s quite surprising that Vision Pro wasn’t already employing one—or at least wasn’t making it available to developers.
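To illustrate the general idea (Apple’s actual predictor is undisclosed), a minimal predictor can linearly extrapolate the hand’s last observed velocity a few milliseconds into the future:

```python
# Hedged sketch of hand-pose prediction via linear extrapolation; real
# trackers typically also filter sensor noise before extrapolating.
# `dt` is the time between the two samples; `lookahead` is how far
# ahead (in seconds) to predict.
def predict_position(pos, prev_pos, dt, lookahead):
    """Extrapolate a 3D position `lookahead` seconds into the future."""
    velocity = tuple((a - b) / dt for a, b in zip(pos, prev_pos))
    return tuple(a + v * lookahead for a, v in zip(pos, velocity))

# A hand moving +0.1 m along x between two 90 Hz samples (9 m/s),
# predicted 30 ms ahead lands at roughly x = 1.27 m.
predicted = predict_position((1.0, 0.0, 0.0), (0.9, 0.0, 0.0), 1 / 90, 0.030)
```

The trade-off is overshoot: when the hand suddenly stops or changes direction, the extrapolated pose keeps going for a moment, which is why prediction reduces perceived latency “in many cases” rather than all of them.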

Here’s a look at predictions in action:

Now we can see the virtual teapot staying much more aligned to the user’s hand. Granted, this isn’t likely to look quite as good with faster motions.

We’ll be looking forward to putting Vision Pro’s hand-tracking latency to the test with VisionOS 2 soon!

The post VisionOS 2 Improvement Targets Key Vision Pro Critique Among Developers appeared first on Road to VR.

Hand-tracking Action Arcade Game ‘Thrasher’ Coming to Quest & Vision Pro Next Month, PC VR Later

From the artist and composer behind indie hit Thumper (2016), the forthcoming Thrasher is officially launching on July 25th on Quest and Vision Pro. A PC VR and flatscreen version is planned for release at a later date.

We’ve known Thrasher was in the works for some time now, but this unique-looking game now has an official release date of July 25th, priced at $20. The game is coming to Quest and is now confirmed to be releasing on Vision Pro the very same day. The PC VR and flatscreen version will launch at a later date.

A new trailer shows us a better look at the game’s chaotic yet mesmerizing gameplay:

Apparently based purely on hand-tracking (though likely falling back to controllers on PC VR), the gameplay looks undeniably unique. Here’s how the developers describe it:

TRANSCEND SPACETIME

Immerse yourself in a dazzling odyssey where music, visuals, and gameplay mesh into one transcendent experience. Journey from the depths of primordial gloom to the heights of celestial bliss, culminating in a heart pounding reckoning with a cosmic baby god.

YOU V THE UNIVERSE

Swoop, dash and thrash at breakneck speed, busting through obstacles and stacking up combos, leading to nine jaw dropping encounters with mysterious leviathans that will challenge your skills, and your sanity.

POWER UP

Deploy power-ups to supercharge your space eel and max out your combos. Create a destructive rainbow spray of bullets, bulldoze everything in a blaze of color and light, or slow things down to create a perfect path through the chaos.

SOUND AND FURY

Lose yourself in the enthralling soundtrack created by designer Brian Gibson, bassist for the band Lightning Bolt. THRASHER is a spatial audio showcase, creating a stunning sensory experience.

CHILL OR CHALLENGE

Vibe out and enjoy the wild journey, or push yourself to the limit by chaining together massive combos to worm your way up the rankings.

Will you be giving Thrasher a try?

The post Hand-tracking Action Arcade Game ‘Thrasher’ Coming to Quest & Vision Pro Next Month, PC VR Later appeared first on Road to VR.

VR Support Planned for Serene Photography Game Built with UE5

Lushfoil Photography Sim, a serene photography game built on Unreal Engine 5, is expected to get optional PC VR support following its initial release.

In development by solo developer Matt Newell and to be published by Annapurna Interactive, Lushfoil Photography Sim is designed as a serene walking simulator and photography game that immerses players in beautiful landscapes with a photorealistic style.

Players are equipped with a camera with a range of realistic settings, currently including “shutter speed, ISO, aperture, white balance, and different lens types,” and the game includes a “learning tool for newcomers that covers the basics of exposure and other settings.”

A demo version of the game was released this week as part of Steam Next Fest. Though the game doesn’t currently support VR, developer Matt Newell has confirmed to Road to VR that PC VR support is planned to be added sometime after the game’s initial release.

Given the game’s emphasis on photorealistic visuals, and its Unreal Engine 5 foundation, the developer doesn’t expect a port to Quest or PSVR 2 to be practical for the game.

Lushfoil Photography Sim’s initial release doesn’t have a firm date yet (officially “coming soon”), and it’s not yet clear how long it will take for the addition of VR support after that.

The post VR Support Planned for Serene Photography Game Built with UE5 appeared first on Road to VR.

Some of Vision Pro’s Biggest New Development Features Are Restricted to Enterprise

VisionOS 2 is bringing a range of new development features, but some of the most significant are restricted to enterprise applications.

VisionOS 2 will bring some of the most requested development features to the headset, but Apple says it's reserving some of them for enterprise applications only.

Developers who want to use the features will need 'Enterprise' status, which means having at least 100 employees and being accepted into the Apple Developer Enterprise Program ($300 per year).

Apple says the restriction on the new dev capabilities is to protect privacy and ensure a predictable experience for everyday users.

Here’s a breakdown of the enterprise-only development features coming to VisionOS 2, which Apple detailed in a WWDC session.

Vision Pro Camera Access

Up to this point, developers building apps for Vision Pro and VisionOS couldn't actually 'see' the user's environment through the headset's cameras. That limits developers' ability to create Vision Pro apps that directly detect and interact with the world around the user.

With approval from Apple, developers building Vision Pro enterprise apps can now access the headset’s camera feed. This can be used to detect things in the scene, or to stream the view for use elsewhere. This is popular for ‘see what I see’ use-cases, where a remote person can see the video feed of someone at a work site in order to give them help or instruction.

Developers could also use the headset’s camera feed with a computer vision algorithm to detect things in view. This might be used to automatically identify a part, or verify that something was repaired correctly.

Even with Apple's blessing to use the feature, enterprise apps will still need to explicitly ask the user for camera access each time the feature is used.
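For developers curious what this looks like in practice, the enterprise camera feed follows ARKit's existing data-provider pattern on visionOS. The sketch below is a hedged illustration only: it assumes the enterprise license and main-camera entitlement are in place, and exact type and method names (such as `CameraFrameProvider` and `cameraFrameUpdates`) may differ from Apple's final API.

```swift
import ARKit

// Hedged sketch: streaming Vision Pro's main camera in an enterprise app.
// Assumes the enterprise license file and the main-camera-access
// entitlement are configured; names and signatures are approximations.
func streamCameraFrames() async throws {
    let session = ARKitSession()
    let provider = CameraFrameProvider()

    // The user must grant camera authorization (re-prompted per use).
    _ = await session.requestAuthorization(for: [.cameraAccess])

    try await session.run([provider])

    // Pick a supported format for the main (left) camera.
    guard let format = CameraVideoFormat
        .supportedVideoFormats(for: .main, cameraPositions: [.left])
        .first else { return }

    // Consume frames as they arrive; each frame could be handed to a
    // computer-vision pipeline or streamed for 'see what I see' support.
    if let updates = provider.cameraFrameUpdates(for: format) {
        for await frame in updates {
            handle(frame) // hypothetical downstream handler
        }
    }
}
```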

Barcode and QR Code Detection

Image courtesy Apple

Being able to use the headset’s camera feed naturally opens the door for reading QR codes and barcodes, which allow structured data to be transmitted to the headset visually.

Apple is providing a readymade system for developers to detect, track, and read barcodes using Vision Pro.

The company says this could be useful for workers retrieving an item in a warehouse, who could immediately confirm they've found the right thing by looking at the barcode on the box, or who could scan a barcode to easily pull up instructions for assembling something.
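The ready-made detection system is exposed as another ARKit data provider. The following is a hedged sketch of how an enterprise app might subscribe to barcode detections; the provider name (`BarcodeDetectionProvider`), symbology identifiers, and the shape of the anchor updates are assumptions modeled on Apple's other ARKit providers.

```swift
import ARKit

// Hedged sketch: detecting and reading barcodes/QR codes on Vision Pro.
// Requires the enterprise barcode-detection entitlement; API details
// here are approximations, not a definitive implementation.
func detectBarcodes() async throws {
    let session = ARKitSession()
    let provider = BarcodeDetectionProvider(symbologies: [.qr, .ean13])

    try await session.run([provider])

    // Each update carries an anchor positioned on the detected code,
    // along with its decoded payload.
    for await update in provider.anchorUpdates where update.event == .added {
        let anchor = update.anchor
        print("Detected code with payload: \(anchor.payloadString ?? "n/a")")
    }
}
```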

Neural Engine Access

Enterprise developers will have the option to tap into Vision Pro’s neural processor to accelerate machine learning tasks. Previously developers could only access the compute resources of the headset’s CPU and GPU.
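On other Apple platforms, opting a machine-learning model into the Neural Engine is done through Core ML's compute-units setting, and the visionOS enterprise entitlement presumably gates the same routing. A minimal sketch, assuming `MyModel` is a placeholder for any compiled Core ML model class:

```swift
import CoreML

// Core ML model configuration selecting the Neural Engine.
// On Vision Pro, ANE access is gated behind the enterprise entitlement;
// "MyModel" is a hypothetical compiled-model class, not a real API.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine  // prefer ANE, fall back to CPU

let model = try MyModel(configuration: config)
```

The alternative values (`.cpuOnly`, `.cpuAndGPU`, `.all`) correspond to the CPU/GPU-only access developers had previously.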

Object Tracking

Although the new Object Tracking feature is coming to VisionOS 2 more broadly, there are additional enhancements to this feature that will only be available to enterprise developers.

Object Tracking allows apps to include reference models of real-world objects (for instance, a model of a can of soda), which can be detected and tracked once they’re in view of the headset.

Enterprise developers will have greater control over this feature, including tweaking the maximum number of tracked objects, choosing whether to track static or dynamic objects, and changing the object detection rate.
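The extra knobs are plumbed through a tracking configuration on the object-tracking provider. A hedged sketch follows; the configuration type and its property names (`maximumTrackableInstances`, `detectionRate`) are assumptions, and `myReferenceObjects` stands in for reference objects an app would load from disk.

```swift
import ARKit

// Hedged sketch: enterprise-only tuning of visionOS 2 object tracking.
// Property names are assumptions; check Apple's enterprise API docs.
var config = ObjectTrackingProvider.TrackingConfiguration()
config.maximumTrackableInstances = 10  // raise the tracked-object cap
config.detectionRate = 10.0            // detection attempts per second

let provider = ObjectTrackingProvider(
    referenceObjects: myReferenceObjects,  // hypothetical: loaded .referenceobject files
    trackingConfiguration: config
)
```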

Greater Control Over Vision Pro Performance

Enterprise developers working with VisionOS 2 will have more control over the headset’s performance.

Apple explains that, out of the box, Vision Pro is designed to strike a balance between battery life, performance, and fan noise.

But some specific use-cases might need a different balance of those factors.

Enterprise developers will have the option to increase performance by sacrificing battery life and fan noise. Or perhaps stretch battery life by reducing performance, if that’s best for the given use-case.

There are more new developer features coming to Vision Pro in VisionOS 2, but those above will be restricted to enterprise developers only.

The post Some of Vision Pro’s Biggest New Development Features Are Restricted to Enterprise appeared first on Road to VR.