New Evidence Suggests Meta is Still Working on PC VR Cloud Game Streaming for Quest

It was first discovered years ago that Meta has been testing cloud game streaming to bring PC VR games to Quest, but the feature has yet to see the light of day (or even an official announcement). New evidence suggests the feature, codenamed Avalanche, is still in active development and may support specific games from the Oculus Studios catalog, like Lone Echo (2017).

Standalone VR is no doubt the most convenient way to play VR, but running VR content on what amounts to a smartphone chip is no small task. Developers need to drastically cut back on graphics and sometimes features in order to maintain acceptable performance in VR games.

Meta fortunately offers the Quest Link feature to stream PC VR content from your own gaming PC to the headset. This makes it easy to enjoy the high-quality visuals of PC VR games with the convenience of a tetherless headset.

But what if you don’t already own a powerful gaming PC? Well, you might be in luck.

Meta has long been testing a cloud game streaming feature for Quest, codenamed Avalanche, which would allow users to play PC VR games by rendering the game in the cloud and streaming it to the headset. It’s very similar to Quest Link, except instead of streaming from your own gaming PC, it’s streaming from a gaming PC that’s somewhere in the cloud.

References to Avalanche cloud game streaming on Quest have been spotted at least as early as 2022, but two years later the feature still hasn’t been launched or even announced.

But hope may not be lost. A new reference to Avalanche has purportedly been spotted in the experimental settings of the forthcoming Quest v67 update. Clicking the option even asks the user which specific game they want to play—in this case the only option shown is Lone Echo (2017), a seminal title in the early days of VR, and still a graphical showcase by most PC VR standards.

Though the Avalanche session never connected to the game, the goal is clear. Meta could use the feature to bring its catalog of impressive PC VR exclusives to a much wider Quest audience. This would mean players would get to play these games at much greater graphical quality than Quest would be able to handle on its own, and without Meta needing to make any major modifications to the titles.

Quest’s Avalanche feature would be far from the first PC VR cloud game streaming service out there. Similar offerings from third parties like PlutoSphere and Shadow proved out the concept, but struggled to gain traction—in no small part because Meta doesn’t allow VR cloud game streaming services on the Quest store. This, unfortunately, wouldn’t be the first time the company actively disallowed certain services on its headset while building its own version.

The post New Evidence Suggests Meta is Still Working on PC VR Cloud Game Streaming for Quest appeared first on Road to VR.

2D to 3D Photo Conversion in visionOS 2 is the Real Deal

The 2D to 3D photo conversion feature coming to Vision Pro in visionOS 2 makes a novel capability meaningful for the first time.

Cue “Apple didn’t even do it first!” in the comments.

You’re not wrong. Over the years, seemingly a hundred different startups have promised to turn 2D photos into 3D.

Even Meta had a go at it when it added 2D to 3D photo conversion to Facebook several years ago. But they never really caught on… probably because seeing 3D photos on a smartphone isn’t that exciting, even if Facebook added a little ‘wiggle’ animation to show the depth on 2D displays.

When it comes to features that people actually want to use—it doesn’t matter who does it first. It matters who does it well.

This headline says “the real deal,” because Apple has, in fact, actually done it well with Vision Pro. The 2D to 3D conversion doesn’t just look good, the feature is actually implemented in a way that takes it beyond the novelty of previous attempts.

The feature is part of visionOS 2.0, which is currently available in a developer beta. Apple says the feature creates “spatial photos” from your existing 2D images (which of course just means stereoscopic ‘3D’).

Granted, even though it’s “just stereoscopic,” seeing your own photos in 3D really adds a layer of depth to them (figuratively and literally). While a 2D photo can remind us of memories, a 3D photo feels much closer to actually visiting the memory… or at least seeing it through a window.

In visionOS 2.0, just go to the usual Photos app, then open any photo and spot the little cube icon at the top left. Click it and the headset analyzes the photo and converts it to 3D in just two or three seconds. With a click you can also return to the original.

The results aren’t perfect but they’re very impressive. It’s unfortunate I can’t actually show them to you here—since I have no way to embed a 3D photo in this page, and 99.9% of you are probably reading this on a 2D display anyway—but it’s the best automatic 2D to 3D photo conversion that I’ve personally seen.

The speed and accuracy are doubly impressive because the conversion happens 100% on-device. Apple isn’t sending your photos off to a server to crank out a 3D version with cloud processing resources and then sending it back to your headset. That makes the feature secure by default (and available offline), which is especially important when it comes to a dataset as personal as someone’s photo library.

Across the photos you’d find in the average person’s library—pictures of people, pets, places, and occasionally things—the conversion algorithm seems to handle a wide range of these very well.

While the feature works best on real-life photography, you can also use it on synthetic imagery, like digital artwork, AI-generated photos, 3D renderings, and the like. Results vary, but overall I was impressed with the feature’s ability to create plausible 3D depth even from synthetic imagery that never had any real 3D depth in the first place.

The thing the algorithm seems to struggle with the most is highly reflective and translucent surfaces. It often ends up ‘painting’ the reflections right onto the reflecting object, rather than projecting them ‘into’ the object with correct depth.

The only major limitation at the moment is that 2D to 3D photo conversion doesn’t seem to want to work on panoramic images. On Vision Pro panoramas can already be blown up and wrapped around you in a way that feels life-sized, but they would still get another layer of emotional impact from being 3D-ified.

It’s unclear why this limitation exists at present, but it’s likely either because panoramas tend to be very high resolution (and would take longer than a few seconds to convert), or Apple’s 2D to 3D algorithm needs more training on wide field-of-view imagery.

Beyond that limitation, the thing that really makes this feature… a feature (not just a ‘technical possibility’), is that it’s built right in and works in the places and ways you’d expect.

Not only can you send spatial photos to other users who can view them in 3D on their own headset, you can also start a SharePlay session and view them together—an incredible way to share moments and memories with the people that matter to you.

And it’s easy to actually get the photos you want onto your headset for viewing.

Many people will have their iCloud photos library synced with their headset, so they’ll already have all their favorite photos ready to view in 3D. I personally don’t use iCloud photos, but I was easily able to select some of my favorite photos from my iPhone and AirDrop them, which automatically opened the Photos app so they were right in front of me in the headset.

Further, you can just save any old photo to your headset—be it from Facebook, a website, or another app—and use the 2D to 3D conversion feature to view them with a new layer of intrigue.

And this is what makes this visionOS 2.0 feature different from 2D to 3D conversion software that has come before it. It’s not that Apple has any groundbreaking quality advantage in the conversion… it’s the fact that they made the experience good enough and easy enough that people will actually want to use it.


‘Batman: Arkham Shadow’ Development Started in Late 2020, Rocksteady Alums Among the Dev Team

Batman: Arkham Shadow is the next big first-party title from Meta, headed exclusively to Quest 3 later this year. In an interview with the founder of the studio behind the game, we learned more about the origin of the title, the team behind it, and which of the original Batman: Arkham games it’s most closely aligned with.

Batman: Arkham Shadow has a lot to live up to: not only Camouflaj’s last game—Iron Man VR, which we called “VR’s first great superhero game”—but also the legacy of the core Batman: Arkham games, which to many represent not just some of the best superhero games ever made, but some of the best games, period.

So when studio founder Ryan Payton says he’s “confident [Batman: Arkham Shadow] will be the best game we’ve shipped to date,” he’s putting a lot on the line.

But there’s reason to believe that Camouflaj can pull it off. The studio now has years of experience in VR game design, and has proven itself able to create fun and comfortable mechanics that many thought couldn’t work well in VR, like high speed flying using repulsor jets on your hands in Iron Man VR.

The studio also has the backing of Meta itself, which acquired Camouflaj in late 2022. What’s more, as we learn in our interview with Payton, Camouflaj picked up two Rocksteady alums, both of whom worked on the original Batman: Arkham games.

Below is our interview with Payton, which covers the studio’s transition into Meta, its learnings from developing Iron Man VR, and how the studio is positioning itself to deliver on the fantasy of being not just Batman, but specifically the Batman that players came to love in the Arkham games.

Batman: Arkham Shadow Interview with Camouflaj Founder Ryan Payton

Q: How has the transition been after the Meta acquisition?

A: Being part of Meta has been absolutely terrific. I’m reminded every day that my core thesis for Camouflaj joining Meta has thus far proven true: becoming a first-party studio has further enabled Camouflaj to pursue our goals of creating high quality, meaningful games. Our next release, Batman: Arkham Shadow, has definitely benefited from Camouflaj being part of the larger Meta organization, which is exciting for us as developers, and is great for players, too.

Q: How was Batman chosen as the topic of your next game?

A: When we were wrapping development work on Marvel’s Iron Man VR back in 2020, Meta called us and said they loved the game and would really love to see what Camouflaj could do with something of even greater size, scope, and ambition. We explored a few options but quickly settled on Batman, which then kicked off a six-month discussion with Warner Bros. Interactive about not only creating a big, exclusive Batman title for Meta Quest 3, but one that is an official entry in the storied Arkham franchise.

Since day one WB and DC have been terrific partners. They’ve provided guidance where necessary while also yielding a surprising degree of freedom for Camouflaj to explore areas of the Arkham franchise that have yet to be developed, encouraging us to put our stamp on those moments. We’ve struck, in my opinion, the perfect balance between IP partner and development studio to enable us to make something special.

As a lifelong Batman fan and admirer of the Arkham games, this project has been a dream come true. Batman: Arkham Shadow is the manifestation of all the strengths and learnings Camouflaj has developed over the past eight years developing VR games. I’m confident it will be the best game we’ve shipped to date.

Q: How long has Batman: Arkham Shadow been in development in earnest?

A: Development began in earnest sometime in Q3 of 2020, so it will have been about a four-year journey once we wrap development later this year.

Q: How did you approach the game’s overarching direction? There are expectations from the previous Arkham games, but given that this is a VR game, surely there are some unique considerations—were there any aspects that felt ‘essential’ in the game? Any that had to be thrown out because they didn’t fit in VR?

A: We initially went to WB with a back-of-the-napkin list of core Arkham features we felt were essential to getting right in VR. At the top of that list was the freeflow, rhythm-based combat that the team at Rocksteady revolutionized with Arkham Asylum. We also spoke about the importance of nailing the feeling of being Batman, with his graceful movements, the ability to grapple up to perches, run and slide through vent covers, and silently take down enemies from the shadows.

Early in the negotiations with WB we showed up with two pivotal prototypes: one focused on combat, and the other on locomotion. Thankfully they were impressed with what we delivered, and thus development work on Batman: Arkham Shadow commenced. From there we went further down the checklist of classic Arkham elements we knew needed to function great in VR, including exploration, boss battles, investigations, gadgets, puzzles, collectibles, and character-driven cinematics.

When players get their hands on Batman: Arkham Shadow later this year, I think they’ll be impressed with the degree to which we’ve not only faithfully translated all the classic Arkham elements into VR, but how fresh and exciting it all feels.

Q: What, if anything, did you learn during the development of Iron Man VR that has informed the design of Batman: Arkham Shadow?

A: Mechanically Iron Man VR and Batman: Arkham Shadow are very different games, which has been a challenge in its own right. One thing I think we’ve proven to excel at as a studio is creating highly polished VR mechanics that feel comfortable despite how fast-moving our games tend to be. When we first showed off Iron Man VR, people doubted our ability to make flying as Iron Man feel fast and comfortable, and yet the team pulled it off. With Arkham Shadow, we knew grappling up to high perches and then gliding down onto unsuspecting enemies, in VR, could be a challenge from a comfort perspective, not to mention how intense Arkham’s freeflow combat is. Yet again, however, I suspect the public will be surprised at how great everything feels when they get their hands on Arkham Shadow later this year. This is all truly a testament to the incredible colleagues I have here at Camouflaj.

Q: What, if anything, from Iron Man VR was not as successful as you would have liked, which you either scrapped or majorly reworked to make things better for Batman: Arkham Shadow?

A: During the development of Arkham Shadow, we supported the amazing development team at Endeavor One to bring Iron Man VR to Quest 2 in time for holiday 2022. Although some of our engineers were concerned that it would be a distraction from our core work on Arkham Shadow, it ultimately ended up being an incredibly positive experience as it provided valuable insights in terms of how to push the Quest hardware. Supporting the development effort in bringing Iron Man VR to Quest ensured Arkham Shadow wouldn’t be our first Quest-focused game. From there we applied those learnings, allowing Arkham Shadow to take full advantage of all the strengths of Meta Quest 3.

Q: How has the development and design of Batman: Arkham Shadow been similar to or different from Iron Man VR?

A: I don’t think it’s an exaggeration to say that the development of Arkham Shadow couldn’t be more different from how we built anything prior. We learned a lot developing our previous titles, and knew we needed to evolve as a team—to take it to the next level.

Throughout developing Iron Man VR I had a nagging feeling there was a better way, and then I came across a gentleman named Bill Green who had left Rocksteady after shipping all the Arkham titles over there. I was able to convince him to join Camouflaj, and since then Bill has truly transformed the way we develop games. Bill in turn convinced Sophie Leal-Bolea, a designer on both Arkham Origins and Arkham Knight, to also join Camouflaj, and she has been an incredible boon to the team.

Since Bill and Sophie joined, we’ve incorporated many of their lessons and transformed the way we make games here. Tangibly it means our designers and artists fully embrace the nature of highly iterative work, and develop pipelines and tools to support drastically uprooting areas that aren’t working for the game and story, and being equipped to make improvements in a matter of days. (The same changes that, back on Iron Man VR, would have taken us weeks if not months.) Bill and Sophie’s approach has supercharged the way we make games here, and I couldn’t be more thankful.

Ultimately it all comes down to the end user experience, so my hope is that players will find that, like the previous Arkham games, the world of Arkham Shadow feels expertly crafted and a joy to explore. We have some incredibly fun Arkham Asylum-inspired dungeons in particular that I can’t wait to see players explore.

Q: As far as ‘game type’ goes, what are you aiming for? A straightforward linear narrative adventure?

A: Above all, the structure of Arkham Shadow is inspired by the first game in the series: Arkham Asylum. This means the game isn’t open world, but it affords a high degree of freedom for players to explore a courtyard-like area that connects the game’s dungeons. Players are free to return to dungeons in order to unlock secrets once they obtain a new gadget. I absolutely love that game loop, and it feels appropriate for the setting of the game, which we have yet to fully lift the veil on…

– – — – –

Batman: Arkham Shadow is due out Fall 2024. Do you think it will live up to the bar set by its predecessors? Sound off in the comments below!


Oculus Founder’s New XR Headset Built Around “military requirements” but Also “used for non-military stuff”

This week at AWE 2024, Oculus founder Palmer Luckey spoke briefly about a new XR headset he’s developing. Though the headset would see Luckey returning to the XR space, this one is being built with military, rather than consumer, uses in mind.

Luckey confirmed earlier this month that he intends to build a new XR headset, but revealed almost no details about the project.

During a panel discussion today at AWE 2024, he offered just a bit more, saying the headset’s design is “being driven by military requirements, but also going to be used for non-military stuff. It’s really cool, it’s really something.”

Though Luckey was on stage with Bigscreen Beyond creator Darshan Shankar, the two didn’t indicate any joint work together.

Luckey said he’s announcing the project now because trying to keep it secret means fewer opportunities to find key collaborators and suppliers.

His mention of the upcoming headset’s military requirements suggests the project originates from within his current company, Anduril, a tech-focused military contractor.

Although “military requirements” can often be seen as synonymous with “incredibly expensive,” Luckey has grown Anduril into a multi-billion-dollar company on the premise that major military contractors charge too much and deliver more slowly than they ought to.

Given the headset’s military focus, it seems unlikely that it would plug into any consumer XR ecosystems like Horizon OS or SteamVR. That leaves it up in the air whether the headset will be built on a proprietary platform—and how it will support the “non-military stuff” that Luckey mentioned. That “stuff” likely refers to enterprise-focused use-cases like training, education, and design.

Luckey founded Oculus in 2012, the company whose Rift headset was the spark that rebooted the modern era of VR. As a rapidly growing startup, Oculus attracted the attention of Meta (at the time Facebook), which acquired the company in 2014 for more than $2 billion. Luckey continued in VR under Meta’s roof for several years but was eventually pushed out of the company due to backlash to his politics. After leaving Meta, Luckey went on to found Anduril, a tech-defense startup which itself went on to achieve a multi-billion-dollar valuation. Though Luckey hasn’t been active in XR since leaving Meta, he’s continued to be looked to as a thought leader in the space.


‘Gorilla Tag’ Has Topped $100M in Revenue, Making it One of VR’s Most Successful Games

VR studio Another Axiom today announced that its breakout title, Gorilla Tag, has surpassed $100 million in revenue. The company shared other key metrics about its player population that shine light on the state of the VR market.

Not long after its launch back in early 2021 it became clear there was something special about Gorilla Tag. Its minimalistic ‘tag’ gameplay, unique arm-based locomotion, and novel social architecture made for simple social fun. And it turns out, people really like simple social fun.

This week the studio behind the game, Another Axiom, offered up some key metrics for Gorilla Tag that show just how large it has become. Here’s the quick and dirty:

  • $100 million in total revenue
  • 10 million lifetime players
  • 3 million monthly active users
  • 1 million daily active users

Gorilla Tag’s revenue comes primarily through in-game cosmetics, which allow for try-ons and a social shopping experience.

These figures make Gorilla Tag one of the most successful and most popular VR games to date. Not only are lots of people playing the game; Another Axiom also revealed that the average playtime is nearly 60 minutes. That’s doubly impressive considering how physical a game it can be.

As for what’s next? The studio isn’t leaving Gorilla Tag behind, but it’s busy at work on a spiritual successor to Lone Echo, the game which inspired Gorilla Tag’s movement in the first place.


Quest ‘Augments’ Feature for Concurrent AR Apps Needs More Time to Cook, Says Meta CTO

Last year Meta announced the so-called Augments feature, planned for Quest 3, which would allow persistent mini AR apps to live in the world around you. Now, eight months after the headset hit store shelves, Meta’s CTO explains why the feature has yet to ship.

Augments was announced as a framework for developers to build mini AR apps that could not just live persistently in the space around you, but also run concurrently alongside each other—similar to how most apps work on Vision Pro today.

Image courtesy Meta

And though Meta had shown a glimpse of Augments in action when it was announced last year, it seems the company’s vision (and desire to market that vision) got ahead of its execution.

This week Meta CTO Andrew “Boz” Bosworth responded to a question during an Instagram Q&A about when the Augments feature would ship. He indicated the feature as initially shown wasn’t meeting the company’s expectations.

We were playing with [Augments] in January and we decided it wasn’t good enough. It was too held back by some system architecture limitations we had; it ended up feeling more like a toy and it didn’t really have the power that we think it needed to deliver on the promise of what it was.

So we made a tough decision there to go back to the drawing board, and basically [it needed] a completely different technical architecture. Starting from scratch basically. Including actually a much deeper set of changes to the system to enable what we wanted to build there. I think we made the right call—we’re not going to ship something we’re not excited about.

But it did restart the clock, and so [Augments is] going to take longer than we had hoped to deliver. I think it’s worthwhile, I think it’s the right call. But that’s what happened.

We’re only two-and-a-half months out from Meta Connect 2024, which will mark the one-year anniversary of the Augments announcement. That’s where we’re likely to hear more about the feature, but at this point it’s unclear if it could ship by then.


Hands-on: Logitech’s MX Ink for Quest Could Legitimize VR Styluses as a Whole

Over the last decade I’ve reported on and tested many different VR styluses, but none of them have actually caught on. But the new MX Ink stylus for Quest stands a real chance at legitimizing the VR stylus as a whole, thanks to its thoughtful design, strong lineup of launch apps, and tight integration with Quest’s software.

This week Logitech announced MX Ink, an officially endorsed ‘Made for Meta’ stylus supporting Quest 2 and Quest 3 (see the full announcement details here). It’s the first time Meta has allowed any other company to harness its tracking technology in a third-party product. That alone makes MX Ink unique, but there’s more that makes this the device that could legitimize VR styluses as a whole.

The first styluses are thought to have been invented five millennia ago. And there’s a reason they’ve stuck with humanity ever since: a stylus amplifies the precision with which we can point. While that seems rather simple, it makes information tasks like writing, drawing, calculating, and designing significantly more practical and useful than using our fingers alone.

So it’s not surprising that we’ve seen many attempts to bring a VR stylus to life.

Just to name a few: in 2017 an enterprising developer hacked together a chunky prototype using a Vive Tracker and a pressure-sensitive stylus tip; in 2018 a company called Massless designed its own prototype VR stylus that it hoped to bring to market; even Wacom has been toying with the idea. Hell, Logitech already made a VR stylus back in 2019… but at $750, it’s no wonder it never made it to general availability.

So what could be different about Logitech’s new MX Ink? Well for one, the price is significantly more palatable than what’s come before. The $130 price point is a pretty easy sell for professionals for whom the added precision of a stylus could actually improve their workflow.

Logitech is also smartly launching some ‘nice to have’ extras for those who are really serious about making the MX Ink part of their workflow.

There’s the Inkwell dock which, for only another $40, gives you an easy place to store and charge the stylus so it’s ready for your next use. And there’s the MX Mat, for $50, which Logitech pitches as an ideal, paper-like surface to draw on with the stylus.

Photo by Road to VR

More important than price or accessories, though, is the first-party integration with Meta and the strong lineup of supported software out of the gate.

Logitech worked directly with Meta, not only to adopt Quest’s tracking technology, but also to build the stylus’ software experience right into Horizon OS. Pairing the MX Ink is just like pairing one of the headset’s own controllers, without any extra hardware or software needed. Even the stylus’ settings—which let you control things like hand selection, button bindings, and pressure curves—are baked right into the system’s own Settings menu.

It’s even got a proper ‘Meta’ button on the end (where the eraser would be), making it easy to pull up the headset’s menu.

And then there’s the strong lineup of software that will work right out of the gate. Logitech has locked in a solid swath of VR design apps for MX Ink support:

  • Adobe Substance Modeler
  • Gravity Sketch
  • PaintingVR
  • Arkio
  • Engage
  • OpenBrush
  • GestureVR
  • ShapesXR
  • Elucis by RealizeMedical

If Logitech plays its cards right, MX Ink could be the first VR stylus that really sticks the landing. So needless to say, I was intrigued to try it.

Hands-on With Logitech MX Ink for Quest

Photo by Road to VR

Last week I swung by Logitech’s San Jose, CA office to check out an early version of the stylus for myself. Compared to the company’s last VR stylus, the MX Ink is significantly more compact. Even so, I was impressed with the tracking.

Photo by Road to VR

Even with my hand covering a significant area of the stylus, there were seemingly enough IR LEDs hidden under the stylus’ shell to provide continuous tracking no matter how I held or twisted it. The company said it even put IR LEDs toward the tip of the MX Ink so it could be held like a wand or a palette knife.

Logitech says the stylus is ‘as accurate as the Quest controllers’—but that doesn’t mean it can’t be more precise. Using a stylus as a pointing device means you can use your dexterous fingers to manipulate the input position in a very fine way; far more so than twisting your wrist alone (which is what primarily drives fine controller motion).

That was obvious while I was using the MX Ink to draw and sketch directly onto a real table in front of me. The pressure sensitive tip also made it feel natural to vary line width as needed.

Photo by Road to VR

I also tried using the MX Ink stylus against a whiteboard while using Quest 3’s mixed reality view. The tight latency and accuracy of the stylus really made it feel like I was leaving marks on the whiteboard. It was a whole layer of immersion that I wasn’t expecting to feel while trying the stylus.

This sense of actually leaving real marks on the whiteboard only made the next part even more mind-bending… I could lift the stylus from the surface while holding the button on the barrel and extend my drawing into the third dimension. Watching my strokes literally leap off the page like this was just plain fun.

While pressing the MX Ink against a real surface, the tip communicates the amount of pressure to the headset and thus changes the thickness of the line you draw. But when you’re using the stylus to draw in 3D, suddenly there’s no way for the system to know how much pressure you’re using, right? Actually, no; Logitech smartly made the button on the barrel of the stylus pressure sensitive itself, so you can squeeze softer or harder to define the width of brush strokes, even when you’re drawing in the air.
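The pressure-to-width mapping described above is essentially what a configurable "pressure curve" (one of the stylus settings mentioned earlier) controls. Here's a minimal, hypothetical sketch of how an app might implement one; the function name and parameters are my own illustration, not Logitech's or Meta's actual API:

```javascript
// Hypothetical pressure curve: map a normalized 0–1 pressure reading
// (from the tip against a surface, or the squeezable barrel button in the air)
// to a stroke width in app units. gamma > 1 makes light touches produce
// thinner lines; gamma < 1 makes width ramp up quickly.
function strokeWidth(pressure, { min = 1, max = 12, gamma = 2 } = {}) {
  const p = Math.min(1, Math.max(0, pressure)); // clamp sensor noise to [0, 1]
  return min + (max - min) * Math.pow(p, gamma);
}
```

An app could feed this either input source interchangeably, which is what makes the pressure-sensitive barrel button a clever stand-in for surface pressure when drawing in 3D.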

The MX Ink even includes a haptic engine for feedback. So even if you’re using it against a virtual surface, the stylus can let you know when you’re touching the canvas.

– – — – –

I’m impressed with the level of thoughtfulness in the design of MX Ink. It’s clear the company has carried over some important lessons learned from its previous experiments with VR styluses.

MX Ink has a reasonable price point, direct integration with the most popular headsets on the market, and a strong lineup of supporting apps. Logitech is giving the VR stylus—as a category—its best chance yet at really catching on.

The essential pieces are in place. The thing that will make or break this product is now likely down to how well integrated it is into the workflow of key applications. My understanding is that developers have a huge range of control over exactly how their applications will handle MX Ink. Half-hearted implementations could kill what otherwise looks like a strong product.

With MX Ink not due to launch until September, there’s time still for applications to tighten up their implementations, so we’ll have to wait to see how it all comes together.


visionOS 2 Enables WebXR by Default, Unlocking a Cross-platform Path to Vision Pro

We’ve known for quite some time that Apple planned to support WebXR, and with VisionOS 2, the company is enabling the feature for all users. WebXR allows developers to deliver cross-platform XR experiences directly from the web, with no gatekeepers to approve or reject content.

WebXR is the widely supported web standard that allows developers to deliver AR and VR content from the web.

Just like anyone can host a webpage online without any explicit approval from another party, WebXR allows the same for AR and VR content. And because it’s delivered through the browser, accessing and sharing WebXR experiences is as easy as clicking or sending a link—like this one.

Vision Pro has had initial WebXR support since launch, but users had to manually enable the feature by digging into Safari’s settings.

With VisionOS 2—available today as a developer preview, and coming to all users this fall—WebXR will be enabled by default, making it easy for anyone to access WebXR through the headset. Vision Pro thus joins headsets like Quest, HoloLens 2, and Magic Leap 2 in supporting WebXR content.

Though WebXR is “supported” on VisionOS 2, our understanding is that it only supports VR (or ‘fully immersive’) experiences. WebXR is also capable of delivering AR experiences (where virtual content is merged with a view of the real world), but VisionOS 2 doesn’t yet support that portion of the standard.

There are many reasons why developers might want to use WebXR to build experiences rather than native apps distributed through a headset’s proprietary store.

For one, any headset with WebXR support can run any compatible WebXR experience, meaning developers can build one experience that works across many headsets, rather than needing to make multiple builds for multiple headsets, then upload and manage those builds across multiple platform stores.

Like a webpage, WebXR content can also be updated at any time, allowing developers to tweak and enhance the experience on the fly, without needing to upload new builds to multiple stores, or for users to download a new version.

WebXR also has no gatekeepers. So content that wouldn’t be allowed on, say, Apple or Meta’s app stores—either for technical or content-related reasons—can still reach users on those headsets. That could include adult content that’s explicitly forbidden on most platform app stores.


VisionOS 2 Improvement Targets Key Vision Pro Critique Among Developers

For basic Vision Pro interactions like navigating apps and scrolling web pages, the headset’s look-and-pinch input system works like magic. But if you want to go more ‘hands-on’ with virtual content, the headset’s full hand-tracking leaves much to be desired.

Compared to Quest 3, Vision Pro’s full hand-tracking has notably more latency. That means when moving your hands it takes longer for the headset to register the movement. Especially in interactive content where you directly grab virtual objects, this can make the objects feel like they lag behind your hand.

Changes coming in VisionOS 2 stand to improve hand-tracking. Apple detailed the changes in a developer session at WWDC 2024 this week.

For one, the headset will now report estimated hand positions at 90Hz instead of the previous 30Hz. That means the system can reflect changes in hand position in one-third of the time, also making the movement of the hand smoother thanks to more frequent updates. This only applies to a small portion of the overall latency pipeline (which we previously estimated at a total of 127.7ms) but it could reduce hand-tracking latency by as much as 22ms in the best case scenario.
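The 22ms figure follows directly from the update periods. At 30Hz a hand movement can, in the worst case, wait a full ~33ms before the next sample reports it; at 90Hz that wait drops to ~11ms. A quick check of the arithmetic:

```python
# Best-case reduction in sampling latency when hand poses are reported
# at 90 Hz instead of 30 Hz. In the worst case a movement happens just
# after a sample, so it waits one full update period to be reported.
period_30hz_ms = 1000 / 30   # ~33.3 ms between updates
period_90hz_ms = 1000 / 90   # ~11.1 ms between updates
best_case_saving_ms = period_30hz_ms - period_90hz_ms
print(round(best_case_saving_ms, 1))  # 22.2
```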

Here’s a look at that in action:

It’s an improvement, but you can still easily see the latency of the teapot compared to the hand, even with this slow movement.

For a snappier experience, VisionOS 2 will alternatively allow developers to enable hand-tracking prediction, which provides an estimate of the user’s future hand position. While this doesn’t truly reduce latency, it can reduce perceived latency in many cases. Similar prediction techniques are common across various XR tracking systems; it’s quite surprising that Vision Pro wasn’t already employing it—or at least not making it available to developers.
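At its simplest, pose prediction is extrapolation: given the hand’s current position and velocity, estimate where it will be when the frame is actually displayed. This is a minimal sketch of the general idea only — Apple’s actual predictor in VisionOS 2 is not documented here and is surely more sophisticated (filtering, higher-order motion models, and so on):

```python
def predict_position(pos, vel, lookahead_s):
    """Linearly extrapolate a tracked position into the future.

    pos: current position (x, y, z) in meters
    vel: estimated velocity (vx, vy, vz) in meters/second
    lookahead_s: how far ahead to predict, in seconds

    Sketch of the basic technique; real XR predictors handle noise and
    acceleration, which simple linear extrapolation does not.
    """
    return tuple(p + v * lookahead_s for p, v in zip(pos, vel))
```

For a hand at `(0.0, 1.0, 0.0)` moving at 0.5 m/s along x, predicting 100ms ahead gives `(0.05, 1.0, 0.0)` — the virtual object is drawn where the hand is about to be, which reduces perceived lag but can overshoot during fast direction changes.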

Here’s a look at predictions in action:

Now we can see the virtual teapot staying much more aligned to the user’s hand. Granted, this isn’t likely to look quite as good with faster motions.

We’ll be looking forward to putting Vision Pro’s hand-tracking latency to the test with VisionOS 2 soon!


Hand-tracking Action Arcade Game ‘Thrasher’ Coming to Quest & Vision Pro Next Month, PC VR Later

From the artist and composer behind indie hit Thumper (2016), the forthcoming Thrasher is officially launching on July 25th on Quest and Vision Pro. A PC VR and flatscreen version is planned for release at a later date.

We’ve known Thrasher was in the works for some time now, but this unique-looking game now has an official release date of July 25th, priced at $20. The game is coming to Quest and is now confirmed to be releasing on Vision Pro the very same day. The PC VR and flatscreen version will launch at a later date.

A new trailer shows us a better look at the game’s chaotic yet mesmerizing gameplay:

Apparently based purely on hand-tracking (though likely falling back to controllers on PC VR), the gameplay looks undeniably unique. Here’s how the developers describe it:

TRANSCEND SPACETIME

Immerse yourself in a dazzling odyssey where music, visuals, and gameplay mesh into one transcendent experience. Journey from the depths of primordial gloom to the heights of celestial bliss, culminating in a heart pounding reckoning with a cosmic baby god.

YOU V THE UNIVERSE

Swoop, dash and thrash at breakneck speed, busting through obstacles and stacking up combos, leading to nine jaw dropping encounters with mysterious leviathans that will challenge your skills, and your sanity.

POWER UP

Deploy power-ups to supercharge your space eel and max out your combos. Create a destructive rainbow spray of bullets, bulldoze everything in a blaze of color and light, or slow things down to create a perfect path through the chaos.

SOUND AND FURY

Lose yourself in the enthralling soundtrack created by designer Brian Gibson, bassist for the band Lightning Bolt. THRASHER is a spatial audio showcase, creating a stunning sensory experience.

CHILL OR CHALLENGE

Vibe out and enjoy the wild journey, or push yourself to the limit by chaining together massive combos to worm your way up the rankings.

Will you be giving Thrasher a try?
