Hands-on: Apple Vision Pro isn’t for Gaming, But it Does Everything Else Better

While Apple’s new Vision Pro headset isn’t going to satisfy the existing base of consumer VR users, it’s mastering the rest of the basics better than anyone else.

Probably 90% of what consumers are using VR headsets for today is entertainment, and most of that entertainment is gaming. And if you’re among those people using such headsets today, you’ll reasonably be disappointed that Apple Vision Pro lacks controllers and isn’t going to be playing top VR games anytime soon. But for everyone else, it’s a back-to-basics approach that’s laying a sturdy foundation to build upon in the future.

Today at Apple’s headquarters I got to check out Vision Pro for myself. Unfortunately the company didn’t permit any photos or footage during the demo, but the clips below are a fair representation of what I saw.

Photo by Road to VR

Apple Vision Pro (AVP, let’s call it) is doing what only Apple can: carving out a subset of what other devices do, and making sure that subset of things is done really well. And given the current state of UX on most other headsets, this is a reckoning that was a long time coming.

Look & Tap

It starts with the input. Apple is leaning heavily into using your eyes as a cursor, and a pinch gesture as a click. The headset has cameras on the bottom that face downward so that even subtle pinches from your hand in your lap are visible and detected. But you don’t see a floating cursor where your eyes are, nor do you see a laser pointer shooting out of your hand. You just look at the thing you want to press, then do a quick pinch.

On paper you might think this sounds shoddy. But remember, this is Apple. They’ve tested and refined this system six ways from Sunday, and it works so well that after a minute or two you hardly think about how you’re interacting with the headset, you just are.

The pinch input is responsive and reliable. It felt so natural that the two or three times the headset missed my pinch during a 30-minute demo, it felt really weird, because my brain was already convinced of its reliability.

This look-and-pinch system is so simple for the headset’s basic input that I won’t be surprised if we see other companies adopt it as soon as possible.

Reality First

So there’s the simple input and then there’s a passthrough-by-default view. This is an MR headset after all, meaning it can easily do augmented reality—where most of your view is of the real world, with some virtual content; or virtual reality—where all of your view is virtual content.

When you put AVP on your head, you instantly see the outside world first. In fact, the way that Apple defers to the passthrough view shows that they want to treat fully immersive experiences as the exception rather than the rule. Generally you won’t pop into a fully immersive scene unless you actively make the decision to do so.

The passthrough view is certainly best-in-class, but we’re still probably two generations away from it truly feeling like there’s nothing separating your eyes from the real world. Granted, I was able to read all the text on my phone with no issue, which has been the ‘bar’ for passthrough quality that I’ve been waiting to see exceeded.

Beautiful Virtual Displays

The imperfect passthrough resolution stands in contrast to the exceptional display resolution, which exhibits not even a hint of screen-door effect. It may not be ‘retina resolution’ (generally agreed to be around 60 pixels per degree), but it’s good enough that I won’t know how far off it is from retina resolution until I sit down with an objective test target to find out.

That’s a long way of saying that the headset’s display has excellent resolution with great clarity across the lens. Top of the class.

This clarity is helped by the fact that Apple has done its Apple-y thing and ensured that panels, text, and images consistently render with superb quality. The entire interface feels iOS-polished, with animations and easy-to-use buttons and controls. The interface was so simple to use that the demo chaperones had a hard time keeping me on task as I wanted to flick through menus and move floating apps around the room.

But here’s the thing: probably 75% of what Apple showed me was essentially just floating screens. Whether it was videos or a floating iMessage app or the web browser, it’s clear that Apple wants Vision Pro to be, first and foremost, great at displaying flat content to the user.

The other 25% of what I saw, while very impressive all around, felt like just the start of a journey for Apple to build out a broader library of immersive experiences.

Record & Rewatch Memories

AVP might not be a VR gaming headset, but it does at least one thing that no other headset does: capture volumetric memories using its on-board cameras. Using the button on the top of the headset you can capture volumetric photos and videos with just a press.

Apple showed me a demo of a volumetric video capture of a group of kids blowing out candles on a birthday cake. It was like they were right in front of me. I’d never even seen these kids before but I could immediately feel their giddy emotions as they giggled and bounced around… as if I was sitting right there while it was happening. Not to mention that the quality was good enough, at least in this best-case-scenario demo capture, that my first thought had nothing to do with the framerate or quality or dynamic range, but purely of the emotion of the people in front of me.

That instant connection—to people I don’t even know—was a clear indicator that there’s something special to this. I can already imagine watching a volumetric video of a cherished memory, or of a loved one that has passed, and I know it would be a powerful experience.

Doing it Right

And here’s the thing: I’ve seen plenty of volumetric video demos before. This isn’t a new idea, not even close. The thing that’s novel here is that everyday users could potentially shoot these videos on their own, and readily watch, share, and store them for later. On other headsets you’d need a special camera for capturing, special software for editing, a player app, and a sharing app to make the same thing happen.

This is the ‘ecosystem’ part of XR that’s missing from most other headsets. It’s not about what’s possible—it’s about what’s easy. And Apple is focused on making using this headset easy.

Continue on Page 2: Immersion Isn’t Off the Table »

Apple to Open Locations for Devs to Test Vision Pro This Summer, SDK This Month

Ahead of the Apple Vision Pro’s release in ‘early 2024’, the company says it will open several centers in a handful of locations around the world, giving some developers a chance to test the headset before it’s released to the public.

It’s clear that developers will need time to start building Apple Vision Pro apps ahead of its launch, and it’s also clear that Apple doesn’t have heaps of headsets on hand for developers to start working with right away. In an effort to give developers the earliest possible chance to test their immersive apps, the company says it plans to open ‘Apple Vision Pro Developer Labs’ in a handful of locations around the world.

Starting this Summer, the Apple Vision Pro Developer Labs will open in London, Munich, Shanghai, Singapore, Tokyo, and Cupertino.

Apple also says developers will be able to submit a request to have their apps tested on Vision Pro, with testing and feedback being done remotely by Apple.

Image courtesy Apple

Of course, developers still need new tools to build for the headset in the first place. Apple says devs can expect a visionOS SDK and updated versions of Reality Composer and Xcode by the end of June to support development on the headset. That will be accompanied by new Human Interface Guidelines to help developers follow best practices for spatial apps on Vision Pro.

Additionally, Apple says it will make available a Vision Pro Simulator, an emulator that allows developers to see how their apps would look through the headset.

Developers can find more info when it’s ready at Apple’s developer website. Closer to launch Apple says Vision Pro will be available for the public to test in stores.

Watch Apple’s WWDC Keynote Right Here at 10AM PT

Apple’s WWDC keynote is today, and the company is heavily expected to reveal an immersive headset for the first time. Here’s where to see the action live.

Apple’s WWDC keynote will be held at 10AM PT on June 5th. You can catch the official livestream from Apple embedded below:

Follow for Up-to-the-minute Updates

I’ll be on-site at Apple Park for the WWDC keynote, and maybe more than that… if you want the most up-to-the-minute updates for what comes after the keynote, follow along on Twitter: @benz145.

What to Expect

We’re expecting that Apple’s WWDC keynote will focus first on its existing products, including major updates to its mobile and desktop operating systems, with the potential for a revamped 15-inch MacBook Air.

But of course the thing we’re looking for is the rumored announcement of Apple’s first XR headset, which we expect will come at the end of the keynote—though we’re still 50/50 on whether or not it’ll be preceded by the words “one more thing,” which the company hasn’t dropped since 2020.

Rumors for what an Apple XR headset might actually do or look like have varied substantially over the years, though recent leaks suggest the following:

  • Resolution: Dual Micro OLED displays at 4K resolution (per eye)
  • FOV: 120-degrees, similar to Valve Index
  • Chipset: Two 5nm chips. Includes a main SoC (CPU, GPU, and memory) and a dedicated image signal processor (ISP). Chips communicate via a custom streaming codec to combat latency.
  • Battery: Waist-mounted battery, connected via MagSafe-like power cable to the headset’s headband. Two-hour max battery life, although hot-swappable for longer sessions.
  • Passthrough: ISP chip contains custom high-bandwidth memory made by SK Hynix, providing low latency color passthrough
  • Audio: H2 chip, providing ultra-low latency connection with the second-generation AirPods Pro and future AirPods models. No 3.5mm jack and possibly no support for non-AirPods BT headphones.
  • Controller: Apple is said to favor hand-tracking and voice recognition to control the headset, but it has tested a “wand” and a “finger thimble” as alternative control input methods.
  • Prescription Lenses: Magnetically attachable custom prescription lenses for glasses-wearers.
  • IPD Adjustment: Automatic, motorized adjustment to match the wearer’s interpupillary distance.
  • Eye Tracking: At least one camera per-eye for things like avatar presence and foveated rendering
  • Face & Body Tracking: More than a dozen cameras and sensors capture both facial expressions and body movements, including the user’s legs.
  • Room Tracking: Both short- and long-range LiDAR scanners to map surfaces and distances in three dimensions.
  • App Compatibility: Said to have the ability to run existing iOS apps in 2D.

It’s very likely that this is only an initial announcement of the company’s headset, with a heavy focus on what developers will be able to do with it (need we remind you, this is Apple’s Worldwide Developers Conference). We don’t expect it to launch until later this year at the earliest, but when it does it’s not clear if Apple will position the device as a sort of early-adopter development kit, or market it to consumers outright. The latter seems less likely considering the rumored price between $1,500–$3,000.

While Apple pretty much never launches any product as a ‘dev kit’, an XR headset might be such a shift for the company and its army of iOS developers that they will need that interim step to hone the experience ahead of a full-blown push to consumers. We’ll find out soon enough.

Hands-on: CREAL’s Light-field Display Brings a New Layer of Immersion to AR

More than four years after I first caught wind of their tech, CREAL’s light-field display continues to be one of the most interesting and promising solutions for bringing light-fields to immersive headsets. At AWE 2023 I got to check out the company’s latest tech and saw first hand what light-fields mean for immersion in AR headsets.

More Than One Way to Focus

So first, a quick recap. A light-field is a fundamentally different way of showing light to your eyes compared to the typical displays used in most headsets today. The key difference is about how your eyes can focus on the virtual scene.

Your eyes have two focus methods. The one most people are familiar with is vergence (also called stereoscopy), where both eyes point at the same object to bring overlapping views of that object into focus. This is also what makes things look ‘3D’ to us.

But each individual eye is also capable of focusing in a different way by bending the lens of the eye to focus on objects at different distances—the same way that a camera with only one lens focuses. This is called accommodation.

Vergence-Accommodation Conflict

Most XR headsets today support vergence (stereoscopic focus), but not accommodation (single-eye focus). You may have heard this called the Vergence-Accommodation Conflict, known to the industry as ‘VAC’ because it’s such a pervasive challenge for immersive displays.

The reason for the ‘conflict’ is that normally the vergence and accommodation of your eyes work in tandem to achieve optimal focus on the thing you want to look at. But in a headset that supports vergence but not accommodation, your eyes must split these typically synchronized functions and operate them independently.

It might not be something you ‘feel’ but it’s the reason why in a headset it’s hard to focus on things very near to you—especially objects in your hands that you want to inspect up close.
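To put rough numbers on why near objects are the problem: optometry measures both focus demands in diopters, the reciprocal of distance in meters. The worked example below is purely illustrative and assumes a headset with a fixed focal plane at 2 m (a common ballpark for today’s displays, not a figure from this article):

```latex
% Both demands in diopters: D = 1 / (distance in meters).
% Assumes an illustrative fixed focal plane at 2 m.
\[
A_{\text{accommodation}} = \frac{1}{2\,\mathrm{m}} = 0.5\,\mathrm{D},
\qquad
V_{\text{vergence}} = \frac{1}{0.25\,\mathrm{m}} = 4\,\mathrm{D}
\]
% A virtual object held at 0.25 m asks the eyes to converge as if at
% 4 D while accommodating at 0.5 D: a 3.5 D mismatch. The same object
% at 2 m would produce no mismatch at all.
```

Because the mismatch shrinks quickly with distance, the conflict is felt most strongly on objects very near to you.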

The conflict between vergence and accommodation isn’t just uncomfortable for your eyes; in a surprising way, it can also rob the scene of immersion.

CREAL’s Solution

And this is where we get back to CREAL, a company that wants to solve the Vergence-Accommodation Conflict with a light-field display. Light-field displays structure light in the same way that we see it in the real world, allowing both of the focus functions of the eyes—vergence and accommodation—to work in tandem as they normally do.

At AWE 2023 this week, I got to check out the company’s latest light-field display tech, and came away with an added sense of immersion that I haven’t felt in any other AR headset to date.

I’ve seen CREAL’s static bench-top demos before, which show floating imagery through the lens to a single eye, demonstrating that you can indeed focus (accommodate) at different depths. But you won’t really see the magic until you see a light-field with both eyes and head-tracking, which is exactly what I got to do this week at AWE.

Photo by Road to VR

On an admittedly bulky proof-of-concept AR headset, I got to see the company’s light-field display in its natural habitat—floating immersively in front of me. What really impressed me was when I held my hand out and a little virtual turtle came floating over to the palm of my hand. Even though it was semi-transparent, and not exceptionally high resolution or accurately colored, it felt… weirdly real.

I’ve seen all kinds of immersive XR experiences over the years, and holding something in your hand sounds like a banal demo at this point. But there was just something about the way this little turtle looked—thanks to the fact that my eyes could focus on it in the same way they would in the real world—that made it feel more real than I’ve ever really felt in other headsets. Like it was really there in my hand.

Photo by Road to VR

The trick is that, thanks to the light-field, when I focused my eyes on the turtle in my hand, both the turtle (virtual) and my hand (real) were each in proper focus—something that isn’t possible with conventional displays—making both my hand and the turtle feel more like they were inhabiting the same space right in front of me.

It’s frustratingly impossible to explain exactly how it appeared via text alone; this through-the-lens video from CREAL gives some idea of what I saw, but can’t quite show how it adds immersion over other AR headsets:

It’s a subtle thing, and the added immersion probably only meaningfully impacts objects within arm’s reach or closer—but then again, that distance is where things have the potential to feel most real to us, because they’re in our carefully watched personal space.

Digital Prescriptions

Beyond just adding a new layer of visual immersion, light-field displays stand to solve another key problem: vision correction. Most XR headsets today don’t support any kind of prescription vision correction, which means that perhaps more than half of the population must either wear their corrective lenses while using these devices, buy some kind of clip-on lens, or just suffer through a blurry image.

But the nature of light-fields means you can apply a ‘digital prescription’ to the virtual content that exactly matches the user’s corrective prescription. And because it’s digital, this can be done on-the-fly, meaning the same headset could change its digital vision correction from one user to the next. Doing so means the focus of the virtual image can match the real-world image for those with and without glasses.

Continue on Page 2: A More Acceptable Form-factor »

Quest 3 Will Continue to Support PC VR Thanks to Oculus Link

Like its predecessors, Quest 3 will be able to plug into high-end gaming PCs to play top PC VR titles.

Meta might have largely abandoned PC VR, but it’s not ready to pull the plug completely.

A spokesperson for the company has confirmed with Road to VR that Quest 3 will continue to support Oculus Link (also known as Quest Link) and Air Link.

Oculus Link allows users to plug Quest headsets into their PC via USB-C to interface with the Oculus PC software. From there users can use the headsets, including Quest 3 when it launches, to play Oculus PC games like Lone Echo and SteamVR games like Half-Life: Alyx.

Air Link, which offers the same PC VR capability—except wirelessly over Wi-Fi—is also confirmed for Quest 3.

As with prior versions of the headset, this could be a lifeline for the PC VR space which would otherwise be shrinking if not for a glut of Quest 2 users using their headset to play PC VR games. Quest 2 became the most popular headset used on SteamVR shortly after it launched, and has remained there ever since, holding a significantly larger share of usage than any other headset on the platform (including Meta’s older dedicated PC VR headsets like Rift and Rift S).

Meta Shows First Glimpse of Quest 3 Mixed Reality Gameplay and Improvements Over Quest Pro

With Quest 3 now officially announced, Meta is emphasizing the device’s improved MR capabilities.

Meta CEO Mark Zuckerberg took to Instagram to share a first look at mixed reality gameplay on Quest 3 which was announced yesterday.


A post shared by Mark Zuckerberg (@zuck)

The video shows the headset’s full color passthrough MR mode, which allows it to present a view of the outside world while selectively adding virtual content to the scene.

We also see some shots of virtual objects attached to the wall, like a glass window into an undersea world, or a zombie jumping through a window into the room to attack the player. While Quest 2 and Quest Pro have done the same in the past, Quest 3’s new depth sensor should make attaching virtual objects to walls, floors, and ceilings more convincing thanks to a more precise map of the world around the headset.

We also see Meta CTO Andrew “Boz” Bosworth jump into the action, showcasing a co-presence experience where both Zuckerberg and Bosworth battle each other virtually but in the same physical space.

Beyond Quest Pro

It’s difficult to tell from the footage how Quest 3’s passthrough resolution compares to Quest Pro. However, it’s notable that the footage doesn’t show any of the obvious color fringing that was an artifact of Quest Pro’s passthrough architecture, which used multiple black-and-white cameras fused with the color from a single RGB camera. That ought to be solved now that Quest 3 will include two RGB cameras, allowing stereoscopic capture of color information rather than the monoscopic capture of Quest Pro.

Another common artifact of Quest Pro (and Quest 2) passthrough is the warping of objects (especially hands) that are close to the headset. This is caused by a breakdown of the computer-vision depth estimation which struggles with near-field objects, especially when they’re moving.

It’s difficult to tell from the footage we have so far, but there’s a good chance that Quest 3 significantly reduces these passthrough warping artifacts thanks to its included depth sensor. Whereas Quest 2 and Quest Pro estimate the distance to objects and surfaces around the headset with computer vision, Quest 3’s depth sensor will provide much more reliable distance measurements which the system can use to judge how far it should render each part of the scene.

It will be interesting to see if the prior issue with color fringing on Quest Pro manifests in the same way with depth. With a single depth sensor, the headset only has a monoscopic depth view, whereas it will have a stereoscopic view of the real world. Ostensibly the stereoscopic view of the world will be projected onto the depth map, and ‘depth fringing’ may occur around near-field objects for the same reason that we saw color fringing on Quest Pro.

Meta is Dropping the Price of Quest 2 to Make Way for Quest 3

With the announcement of Quest 3, Meta is dropping the price of Quest 2 starting on June 4th to make way for its new flagship headset.

The story of Quest 2’s price takes yet another turn. Let’s recap.

When the headset launched in 2020, it started with a somewhat unbelievably low price of $300 for the 64GB model, and later the company sweetened that deal even further by offering the headset with 128GB for the same price. That price point was apparently so aggressively low that Meta may even have been losing money on each headset sold, which prompted the company to raise the price of the base model last year to $400. This was purportedly in response to supply chain and inflation struggles.

But now, just about a year later, Meta has announced that it’s (re)reducing the price of Quest 2, apparently to make room for Quest 3 which will launch this Fall starting at $500. It seems likely this move is an effort to start selling off remaining Quest 2 stock, and perhaps to better differentiate Quest 2 from Quest 3.

So, starting June 4th, Quest 2 (128GB) will return to its original $300 price point, with the 256GB model priced at $350.

Even in the face of Quest 3, that remains a killer deal for the most complete standalone VR headset on the market, one that’s been very hard for other players in the space to contend with. Maybe (just maybe) another reason for this price change is to highlight the contrast between Quest 2 and the rumored $1,500–$3,000 price point of Apple’s first headset.

Zuckerberg Might be Teasing the First Glimpse of Quest 3 Ahead of Quest Gaming Showcase

Zuckerberg took to Instagram to tease an announcement for… something… ahead of Meta’s big Quest Gaming Showcase tomorrow. Signs are pointing to Quest 3.

We have no insider info on what might be behind the countdown clock on Zuckerberg’s Instagram story, but the obvious guess is it could be our first look at Quest 3.

Quest 3 recently got a somewhat mysterious hands-on preview from, as far as we can tell, only a single reporter—Mark Gurman. That just happens to be the same reporter who’s been leading the charge in recent months with apparent insider info about Apple’s rumored headset that’s expected to be announced next week.

Zuckerberg’s countdown teaser takes us to 7AM on June 1st, the same day as the Quest Gaming Showcase, the company’s biggest annual event for VR game reveals. And in the background of the photo we can see a view of Zuckerberg’s feet, apparently from a camera on the device that he’s holding.

Image courtesy Mark Zuckerberg

With Meta having cancelled other hardware products like its Portal smart video speakers, it’s hard to imagine what else Zuckerberg would be physically holding other than a new hardware product.

It would certainly make sense for Meta to offer a glimpse of Quest 3. Apple’s rumored headset announcement is expected in just a few days. Meanwhile, Quest 2 is now nearly three years old, which might make an Apple headset look especially new and shiny to Questers looking for the next big thing.

In any case, tomorrow promises to be a big day for Quest gaming news, and maybe more. Stay tuned.

New Leap Motion 2 Brings High-end Hand-tracking to Standalone Headsets

Ten years after the launch of Leap Motion—which garnered praise for offering some of the best hand-tracking in the industry—the company has announced a next-generation version of the device which now supports standalone XR headsets in addition to Windows and MacOS.

Years before the modern era of VR, Leap Motion set out to build a hand-tracking module that it hoped would revolutionize human-computer interaction. Launched initially in 2013, the device was praised for its impressive hand-tracking, but failed to find a killer use-case when used as an accessory for PCs. But as the VR spark began anew a few years later, Leap Motion’s hand-tracking started to look like a perfect input method for interacting with immersive content.

Between then and now the company pivoted heavily into the VR space, but didn’t manage to find its way into any major headsets until well after the launch of first-gen VR headsets like Oculus Rift and HTC Vive (though that didn’t stop developers from attaching the Leap Motion module and experimenting with hand-tracking). Over the years the company kept honing its hand-tracking tech, improving the software stack to make tracking with the first-generation module better over time.

First generation Leap Motion | Image courtesy Leap Motion

(It should be noted that Leap Motion was once both the name of the device and the company itself; Leap Motion merged with another company to form Ultraleap back in 2019.)

More recently the company has built newer versions of its hand-tracking module—including integrations with headsets from the likes of Varjo and Lynx—but never sold that newer hardware as a standalone tracking module that anyone could buy. Until now.

Leap Motion 2 is the first new standalone hand-tracking module from the company since the original, and it’s already available for pre-order, priced at $140, and expected to ship this Summer.

Purportedly built for “XR, desktop use, holographic displays, and Vtubing,” Ultraleap says the Leap Motion 2 is its “most flexible camera ever” thanks to support for Windows, MacOS, and standalone Android headsets with Qualcomm’s XR2 chip.

Image courtesy Ultraleap

From a specs standpoint, the company says the new tracker has “higher resolution cameras, increased field-of-view, and 25% lower power consumption, all in a 30% smaller package for optimum placement and convenience.”

Ultraleap says that Leap Motion 2 will give developers an easy way to experiment with high-quality hand-tracking by adding it to headsets like Varjo Aero, Pico Neo 3 Pro, and Lenovo’s ThinkReality VRX. The company also plans to sell a mount for the device to be attached to XR headsets, as it did with the original device.

Image courtesy Ultraleap

And with the launch of this next-gen hand-tracking module, Ultraleap says it’s moving on from the original Leap Motion tracker.

“Existing customers [using the first Leap Motion module] may continue to access the latest compatible software including the soon-to-be-released Gemini for macOS. Support will also continue to be provided. Future versions of the software will not deliver any performance improvements to the original Leap Motion Controller device,” the company says.

Ultraleap says it has sold more than 1 million Leap Motion trackers to date, with some 350,000 developers having built apps and experiences using the company’s hand-tracking tech.

VR Horror Hit Returns With ‘Five Nights at Freddy’s: Help Wanted 2’, Trailer Here

One of VR’s most popular horror games is getting a sequel. Five Nights at Freddy’s: Help Wanted 2 is officially heading to PSVR 2 and promises to improve on the original.

Developer Steel Wool Studios says that FNAF: Help Wanted 2 will bring new mini-games and feature familiar characters and locations from the series, while breaking new ground as well.

Help Wanted 2 will feel familiar to players who experienced the first game, but with all new games, locations, story, and animatronics. Do your best to complete your work as fast and as diligently as you can, but be careful. One wrong move at this job can lead to… unexpected consequences. Utilizing the enhanced power and fidelity of PS VR2, this title will be the most immersive, heart-racing Five Nights at Freddy’s title ever. PS VR2 sense controller and headset haptics let players feel every step, rumble, and shake as you race to complete your tasks on time. VR brings players even closer to the animatronics than ever before, just not too close, they have been known to bite.

The original FNAF: Help Wanted is available on the original PSVR, Quest, and PC VR, and while the studio hasn’t confirmed that the sequel will reach all the same platforms, we’d guess that’s where things are headed (except probably not the original PSVR now that Sony has moved on to PSVR 2).

The FNAF: Help Wanted 2 release date is planned for “late 2023.”