Apple Vision Pro Supports Unity Apps & Games

Apple confirmed Vision Pro supports porting Unity apps and games.

Acknowledging the existing Unity VR development community, Apple said "we know there is a community of developers who have been building incredible 3D apps for years" and announced a "deep partnership" with Unity in order to "bring those apps to Vision Pro".

This partnership involves "layering" Unity's real-time engine on top of RealityKit, Apple's own high-level framework (arguably an engine) for building AR apps.

This approach means Unity apps can run alongside other visionOS apps in your environment, a concept Apple calls the "shared space."

Unity apps will get access to visionOS features including the use of real-world passthrough as a background, foveated rendering, and the native system hand gestures.

Apple also has its own Mac-based suite of tools for native spatial app development. You use the Xcode IDE, SwiftUI for user interfaces, and its ARKit and RealityKit frameworks for handling tracking, rendering, physics, animations, spatial audio, and more. Apple even announced Reality Composer Pro, which is essentially its own engine editor.

Vision Pro will have a "brand new" App Store for immersive apps as well as iPhone and iPad apps compatible with the headset.

Apple Might Open Applications For Vision Pro Development Kits In July

A page on Apple's website suggests it will open applications for Vision Pro development kits in July.

The subpage of the visionOS section of the Apple Developer site titled 'Work with Apple' provides three options for developers interested in bringing apps to the new spatial computing platform. A label on the page reads 'Available in July'.

  • Vision Pro compatibility evaluations: developers of existing iPhone and iPad apps will be able to request a report from Apple "on your app or game’s appearance and how it behaves in visionOS."
  • Vision Pro developer labs: Apple will host sessions in Cupertino, London, Munich, Shanghai, Singapore, and Tokyo – developers can apply to attend a session and experience their visionOS, iPadOS, or iOS apps running on a Vision Pro.
  • Vision Pro developer kits: "To support great ideas for apps and games for visionOS," Apple will make Vision Pro developer kits available. "Stay tuned for how to apply," the page notes.

To build spatial apps for visionOS, Apple recommends using its own suite of tools: the Xcode IDE, SwiftUI for user interfaces, and its ARKit and RealityKit frameworks for handling tracking, rendering, physics, animations, spatial audio, and more. Apple even announced Reality Composer Pro, essentially its own engine editor.
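
To give a sense of what that stack looks like in practice, here's a minimal sketch of a spatial app combining SwiftUI and RealityKit. It's based on the frameworks Apple announced; RealityView and the scene setup are assumptions drawn from Apple's developer materials rather than confirmed shipping code:

```swift
import SwiftUI
import RealityKit

// A minimal sketch of a visionOS spatial app: SwiftUI drives the UI,
// RealityKit renders the 3D content. Names like RealityView reflect
// Apple's announced APIs, but exact signatures may differ.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack {
            Text("Hello, spatial computing")
            // RealityView hosts RealityKit entities inside SwiftUI.
            RealityView { content in
                // A 10 cm white sphere floating in the window's volume.
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.1),
                    materials: [SimpleMaterial(color: .white, isMetallic: false)]
                )
                content.add(sphere)
            }
        }
    }
}
```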

For developers unable to acquire a dev kit, Apple is adding a visionOS Simulator to Xcode in the version 15 update. This lets developers interact with their apps in a simulated environment as if running on Vision Pro. Developers will still be able to test these apps on real hardware at Apple's hosted developer labs, or once they acquire a development kit.

We'll certainly be keeping a close eye on Apple's website in July.

Apple Vision Pro Full Specs, Features & Details

Today Apple finally revealed Vision Pro, its $3500 AR/VR headset launching in 2024. Here's a rundown of its full specs and features.

Vision Pro is an ultra high-end headset packing the highest resolution displays and most sensors ever in an AR/VR product. It introduces features never before shipped in a headset, and its visionOS rethinks the line between 2D, AR, and VR.

Lightweight Design With Tethered Battery

Vision Pro has a custom lightweight aluminum alloy frame supporting a curved "three-dimensionally formed" laminated glass front plate to achieve a thin design.

What also keeps Vision Pro light is the separation of the battery from the headset. Some other headsets like Meta Quest Pro and Pico 4 have their battery in the rear of the strap, but Apple's design takes it off your head entirely with an external battery tethered to a magnetic connector on the left of the headband.

Apple claims the external battery lasts 2 hours under the following test conditions:

Video playback, internet browsing, spatial video capture, and FaceTime. Video playback tested in conjunction with an Environment, using 2D movie content purchased from the Apple TV app. Internet browsing tested using 20 popular websites. FaceTime tested between two Apple Vision Pro units with Personas enabled. Tested with Wi-Fi associated to a network.

Alternatively, Vision Pro can be used perpetually without the battery by plugging it into a power source. Apple hasn't yet detailed which power sources are supported or whether the required adapter is included.

A Plethora Of Cameras & Sensors

Vision Pro has a total of twelve cameras, four depth sensors, a LiDAR sensor, and six microphones.

Passthrough AR

Six of the twelve cameras are under the front glass.

Two of these six capture high resolution color to provide the headset's passthrough view of the real world, streaming "over one billion color pixels per second."

The other four front cameras perform headset positional tracking and other general computer vision tasks.

Hand Tracking

One purpose of the four depth sensors is hand tracking. Apple describes the hand tracking quality as "so precise, it completely frees up your hands from needing clumsy hardware controllers."

Vision Pro lacks any kind of tracked controllers, though it supports playing 2D games on a virtual screen with a gamepad.

Environment Meshing

The LiDAR sensor is used to perform "real time 3D mapping" of your environment in conjunction with the other front cameras.

Apple claims Vision Pro has a "detailed understanding" of your floors, walls, surfaces, and furniture, which apps can leverage. One example Apple gave was virtual objects casting shadows on real tables, but this only scratches the surface of what should be possible.
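
Apple's ARKit framework on visionOS is expected to expose this scene understanding to apps. A minimal sketch, assuming the provider-based API Apple has announced (exact signatures may differ in the shipping SDK):

```swift
import ARKit

// Hedged sketch: subscribe to the headset's reconstructed scene mesh.
// ARKitSession and SceneReconstructionProvider follow Apple's announced
// visionOS ARKit API; details may differ in the final release.
func observeSceneMesh() async throws {
    let session = ARKitSession()
    let sceneReconstruction = SceneReconstructionProvider()
    try await session.run([sceneReconstruction])

    // Each MeshAnchor is a chunk of reconstructed geometry (floor, wall,
    // table...) that apps can use for occlusion, shadows, or physics.
    for await update in sceneReconstruction.anchorUpdates {
        print("Mesh anchor updated:", update.anchor.id)
    }
}
```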

Face & Eye Tracking For FaceTime & More

Two downward-facing cameras track your face, while four internal IR cameras beside them track your eyes, helped by a ring of LED illuminators around the lenses.

Vision Pro's eye tracking serves three purposes: authentication, foveated rendering, and driving your FaceTime avatar.

Apple is calling its new iris scanning authentication OpticID, following the naming scheme of TouchID and FaceID from its other devices. OpticID is how you unlock Vision Pro, and it also works with Apple Pay purchases and password autofill. As with TouchID and FaceID, the biometric data powering OpticID is processed on-device by a Secure Enclave Processor.
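
Apple hasn't detailed the developer-facing API, but since OpticID follows TouchID and FaceID, apps will presumably gate content through the existing LocalAuthentication framework. A hedged sketch, assuming visionOS keeps the same API surface as iOS:

```swift
import LocalAuthentication

// Assumption: visionOS exposes the same LocalAuthentication flow that
// iOS uses for FaceID/TouchID, with OpticID as the biometric behind it.
let context = LAContext()
context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                       localizedReason: "Unlock your saved passwords") { success, error in
    if success {
        print("Authenticated via biometrics")
    } else {
        print("Authentication failed: \(String(describing: error))")
    }
}
```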

Foveated rendering is a technique where only the small region of the display your eyes are currently looking at is rendered in full resolution, freeing up performance since the rest is rendered at lower resolution. The freed-up GPU resources can be used for better performance, increased rendering resolution, or higher graphics settings. The technique leverages the fact that our eyes see in high resolution only at the fovea, the very center of the retina.
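
Some back-of-the-envelope arithmetic shows why this matters. Assuming an illustrative 100° field of view with only a 20° window rendered at full resolution and the periphery at half resolution per axis (our assumptions, not Vision Pro specs):

```swift
// Illustrative foveation arithmetic; all numbers are assumptions.
let fovDegrees = 100.0      // total field of view per axis
let fovealDegrees = 20.0    // full-resolution window per axis
let peripheralScale = 0.5   // resolution scale outside the fovea, per axis

let fovealFraction = (fovealDegrees / fovDegrees) * (fovealDegrees / fovDegrees) // 0.04
// Pixels shaded relative to rendering the whole view at full resolution:
let relativeCost = fovealFraction + (1 - fovealFraction) * peripheralScale * peripheralScale
print(relativeCost) // ≈ 0.28, i.e. roughly 70% fewer pixels shaded
```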

Finally, eye tracking combines with the downward-facing cameras to track your facial expressions in real time to drive your FaceTime Persona, Apple's take on photorealistic avatars. Meta has been showing off research toward this for over four years now, but it looks like Apple will be first to ship – albeit not at the same quality as Meta's research.

R1 Chip For Ultra Low Latency

To fuse the input from all these cameras, sensors, and microphones together, Apple developed a custom chip it calls R1.

Apple claims R1 "virtually eliminates lag, streaming new images to the displays within 12 milliseconds."

For comparison, the founder of the French headset startup Lynx claims Meta Quest Pro's passthrough latency is 35-60 milliseconds. It's unclear if this is a like-for-like comparison though.

AR-VR Digital Crown

Vision Pro has only two physical controls, both on top: a button to capture "spatial videos" and "spatial photos" at any moment, and a Digital Crown.

Pressing the Digital Crown brings up the system Home View. But turning it controls your level of immersion, all the way from full AR to full VR. If you go halfway, for example, you'll see VR in front of you and AR behind you.

On existing headsets like Meta Quest and Pico 4, passthrough is a toggle option, meaning you have to choose between full and no immersion. Apple wants to instead let you choose exactly how engaged to be with your real surroundings.
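
On the developer side, Apple's announced SwiftUI scene types suggest apps declare which immersion styles they support and let the Digital Crown do the rest. A minimal sketch, assuming the immersion-style API Apple has described (names may differ in the shipping SDK):

```swift
import SwiftUI

// Hedged sketch: an app declares its supported immersion styles, and
// the Digital Crown moves the user along the .progressive range.
// Names follow Apple's announced visionOS SwiftUI API; details may differ.
@main
struct ImmersiveApp: App {
    @State private var style: ImmersionStyle = .progressive

    var body: some Scene {
        ImmersiveSpace(id: "world") {
            Text("Immersive content goes here")
        }
        .immersionStyle(selection: $style, in: .mixed, .progressive, .full)
    }
}
```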

EyeSight & Person Awareness

A feature unique to Vision Pro is an external display that shows your eyes to other people in the room and indicates how aware of them you are. Apple calls this technology EyeSight.

When you're in an AR app EyeSight shows a colored pattern in front of your eyes, and when you're in a VR app it shows only the pattern with your eyes invisible.

When someone comes close to you, Vision Pro will show a cutout of the person and EyeSight will reveal your eyes to them.

Apple described making sure that you're "never isolated from the people around you" as one of its "foundational design goals" for Vision Pro, and the company sees it as a clear differentiator from fully opaque headsets like Meta's Quest line.

Micro OLED Displays With 23 Million Total Pixels

Vision Pro features dual micro-OLED panels with unprecedented pixel density. Apple says each is "the size of a postage stamp" yet together they have 23 million pixels, fewer than rumored.

Apple didn't reveal the exact resolution, but 23 million total pixels would suggest a per-eye resolution of around 3400×3400 for a square aspect ratio, or around 3200×3600 for the 9:10 aspect ratio typically used in headsets. We don't know the exact aspect ratio in Vision Pro, however.
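
For those curious, here's the arithmetic behind those estimates (the per-eye figures are estimates, not confirmed specs):

```swift
// Estimating per-eye resolution from the 23 million total pixel count.
let totalPixels = 23_000_000.0
let perEye = totalPixels / 2                 // 11.5 million pixels per eye

// Square aspect ratio: side length is the square root.
let square = perEye.squareRoot()             // ≈ 3391 → "around 3400×3400"

// 9:10 aspect ratio: width w and height (10/9)·w, so w²·(10/9) = perEye.
let width = (perEye * 9 / 10).squareRoot()   // ≈ 3217
let height = width * 10 / 9                  // ≈ 3574 → "around 3200×3600"
```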

Apple confirmed Vision Pro's displays support wide color gamut and high dynamic range, but didn't reveal detailed specs like peak brightness.

M2 Chip For 'Unparalleled' Performance

Vision Pro is powered by the same Apple Silicon M2 chip used in recent Macs.

Apple says this delivers "unparalleled standalone performance", and allows Vision Pro to "maintain a comfortable temperature and run virtually silent."

Compared to the rumored specs of the next generation Qualcomm Snapdragon chipset Meta Quest 3 will use, Apple's M2 should deliver roughly 25% faster single-threaded CPU performance, 75% faster multi-threaded, and roughly 15% more GPU power.

However, without knowing the exact clock speeds of the processors in each headset, this is only a very rough comparison.

visionOS With Hand & Eye Control

visionOS is Apple's custom "spatial operating system" for Vision Pro, and presumably for future headsets in the Vision line too.

Apple describes visionOS as "familiar, yet groundbreaking." It presents you with floating 2D apps that you scroll through with a flick of your fingers. You target menu items just by looking at them, and tap your fingers together to click.

Many of Apple's first party apps and services are available in visionOS, including Notes, Messages, Safari, Keynote, Photos, FaceTime, and Apple Music.

Apple walks us through visionOS

Rather than just existing within their 2D frame, many of Apple's apps "become spatial," taking up space around you. In FaceTime group calls for example, each person's webcam view becomes its own floating rectangle, with spatial audio coming from each person. Apple also gave the example of being able to pull out 3D models from messages into real space.

Vision Pro also lets you expand your Mac's display into a huge virtual display wirelessly just by looking at it.

A major focus of visionOS is watching movies and TV shows on a huge virtual screen, including support for viewing 3D movies from Apple's library with depth.

Personalized Ray Traced Spatial Audio

Vision Pro has "audio pods" on its side, each with two drivers. Apple describes it as the "most advanced Spatial Audio system ever."

If you have an iPhone with a TrueDepth FaceID sensor you can scan your face to enable Personalized Spatial Audio, where the system will tailor the sound to your head and ear geometry for the most precise possible spatial audio.

Vision Pro also uses a technique called Audio Ray Tracing, where it scans the features and materials of your space to "precisely match sound to your room." This technique is also used in Apple's HomePod speakers.

Apple claims Vision Pro buyers will be "totally convinced that sounds are coming from the environment around you."

Pricing & Availability

Apple Vision Pro will go on sale in the US in early 2024 starting at $3500. It'll be available online and in Apple Stores.

Apple says more countries will get Vision Pro "later next year," but didn't disclose exactly which countries.

Watch Apple Reveal Its AR/VR Headset At 10am PT Today

Apple's WWDC keynote, where it will reportedly unveil its AR/VR headset, starts at 10am Pacific Time.

That's:

  • 12pm Central Daylight Time (CDT)
  • 1pm Eastern Daylight Time (EDT)
  • 5pm Coordinated Universal Time (UTC)
  • 6pm British Summer Time (BST)
  • 7pm Central European Summer Time (CEST)

Apple hasn't directly said it will announce an AR or VR device today, but multiple sources including Bloomberg's Mark Gurman, The Information's Wayne Ma, and supply chain analyst Ming-Chi Kuo have all pointed to today as the reveal for the headset. Further, Apple recently trademarked 'xrOS', trademarked 'Reality Pro' last year, teased today's event with the tagline 'Code New Worlds', and invited UploadVR's Ian Hamilton to attend in person. All signs point to a reveal.

The headset reportedly features a thin & light design with a carbon fiber frame and a tethered waist-mounted battery pack. Its plethora of cameras and sensors are said to enable high resolution color passthrough for AR, hand tracking, eye tracking, face tracking, environment meshing, and potentially even body tracking. Some even suggest it will have a screen on the front showing the wearer's eyes.

Apple is reportedly going all-out with specifications, equipping the headset with 4K OLED microdisplays, 120° field of view pancake lenses, and the M2 chipset used in its latest MacBooks.

The xrOS operating system will reportedly support running multiple iPad apps at once in a floating interface, as well as spatial AR and VR experiences. It will apparently feature an in-air virtual keyboard for text input but can also connect to an actual keyboard, presumably via Bluetooth.

Apple is said to be working on spatial versions of key apps as well as new apps, including a version of the Freeform collaboration app letting users draw on virtual whiteboards together, a VR version of Fitness+, an immersive sports viewing service, and a VR version of FaceTime that “realistically renders a user’s face and full body in virtual reality”.

Apple Headset Specs, Release Date, & Everything We’ve Heard
Reports indicate Apple is preparing to reveal a standalone headset with VR and AR capabilities. Here’s our updated roundup of everything we’ve heard:

While the headset is expected to be revealed today, it reportedly won't actually ship until later this year due to the challenges involved in mass producing the "most complicated" device Apple has ever made.

Of course, reports of the specs and features of unreleased Apple products are sometimes wrong. We'll find out what Apple's headset truly is, what it does, and when it will arrive during the keynote later today, and we'll of course have the confirmed features and specifications here at UploadVR.com once they're revealed.

Meta To Show 'Near Retinal Resolution' Varifocal Headset Prototype At SIGGRAPH

Meta will show a varifocal VR headset prototype with close to retinal resolution at SIGGRAPH 2023 in August.

The 'Retinal-resolution Varifocal VR' headset is described as follows:

We develop a virtual reality head-mounted display that achieves near retinal resolution with an angular pixel density up to 56 pixels per degree, supporting a wide range of eye accommodation from 0 to 4 diopter (i.e., infinity to 25 cm), and matching the dynamics of eye accommodation.

SIGGRAPH is a yearly conference where researchers showcase breakthroughs in computer graphics hardware & software. Last year Meta showed off Starburst, an HDR research prototype with 20,000 nits of brightness, the highest of any known headset. This year Meta will show near-retinal resolution and varifocal optics.

UploadVR went eyes-in with Starburst last year, and we've reached out to Meta about the possibility of going hands-on with this year's prototype too.

Retinal Resolution

Angular resolution is the true measure of a head-mounted display's resolution because it accounts for differences in field of view between headsets, describing how many pixels you actually see in each degree of your vision, in pixels per degree (PPD). For example, if two headsets use the same display but one has a field of view twice as wide, both have the same resolution but the wider headset has half the PPD.
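
The calculation itself is simple division; the FOV figures below are approximate public estimates, not official specs:

```swift
// Angular resolution: horizontal pixels divided by horizontal FOV.
// This simplified version ignores lens distortion, which concentrates
// pixels unevenly across the view in real headsets.
func ppd(horizontalPixels: Double, horizontalFOVDegrees: Double) -> Double {
    horizontalPixels / horizontalFOVDegrees
}

// Same 1832-pixel-wide display at two assumed FOVs: doubling the FOV
// halves the PPD, as described above.
print(ppd(horizontalPixels: 1832, horizontalFOVDegrees: 97))  // ≈ 19
print(ppd(horizontalPixels: 1832, horizontalFOVDegrees: 194)) // ≈ 9.4
```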

‘Retinal’ is a term used to describe angular resolution exceeding what the human eye can discern. The generally accepted threshold is 60 PPD.

No consumer VR headset today comes close to this – Quest Pro reaches around 22 PPD, while Bigscreen Beyond is claimed at 32 PPD and the $1990 Varjo Aero reaches 35. Varjo’s $5500 business headsets do actually reach retinal resolution, but only in a tiny rectangle in the center of the view.

Snellen chart showing visual acuity of Quest 2 compared to 20/20 vision

Last year, Meta showed off a research prototype achieving 55 PPD called Butterscotch. Its stated purpose was to demonstrate and research the feeling of retinal resolution, and it had a field of view only half as wide as Quest 2's.

Butterscotch and the headset being shown at SIGGRAPH are not meant to be direct paths to actual products. But in December, Meta did confirm it was developing displays and optical systems to achieve retinal resolution "on our product roadmap".

Varifocal

All current headsets on the market have fixed focus lenses. Each eye gets a separate perspective but the image is focused at a fixed focal distance, usually a few meters away. Your eyes will point (converge or diverge) toward the virtual object you’re looking at, but can't actually focus (accommodate) to the virtual distance to the object.

This is called the vergence-accommodation conflict. It causes eye strain, makes VR feel less real, and can make virtual objects look blurry close up.

In 2018, Facebook showed off a prototype headset called Half Dome, which incorporated eye tracking to rapidly mechanically move the displays forward and back to adjust focus. Half Dome solved the vergence-accommodation conflict but the mechanical approach would present serious reliability problems in the real world, making it unsuitable to be shipped in products.

In 2019, Facebook described Half Dome 2 and Half Dome 3. Half Dome 2 used more reliable actuators and a more compact (but lower field of view) design. Half Dome 3, however, took a completely new approach with no moving parts. Instead of moving the displays, Half Dome 3 uses a stack of liquid crystal lens layers with voltage applied to activate them in different combinations for varying focus distances.

Meta says the varifocal prototype being shown at SIGGRAPH 2023 is capable of supporting accommodation from 25 cm to infinity, so you could focus on objects up close or in the far distance. It hasn't yet confirmed how the new prototype's varifocal technology works, though.
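
Diopters are simply reciprocal meters, which is how Meta's "0 to 4 diopter" figure maps to that 25 cm-to-infinity range:

```swift
// A diopter is 1/meters: D diopters of accommodation means focusing
// at 1/D meters. Reproduces the "0 to 4 diopter (infinity to 25 cm)"
// range from Meta's description.
func focusDistanceMeters(diopters: Double) -> Double {
    diopters == 0 ? .infinity : 1.0 / diopters
}

print(focusDistanceMeters(diopters: 4)) // 0.25 m, i.e. 25 cm
print(focusDistanceMeters(diopters: 0)) // inf: fully relaxed, distant focus
```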

In 2020, Facebook's Director of Display Systems Research described varifocal optics as "almost ready for primetime", but when showing the Mirror Lake compact varifocal concept last year Mark Zuckerberg suggested its technology won't be seen in products until “the second half of the decade”.

Horizon Workrooms Now Gives You Free Extra Monitors On Windows Too

Horizon Workrooms can now give you two extra virtual monitors on Windows, a feature formerly only available for macOS.

Workrooms is Meta's collaborative productivity app for its Quest headsets. It lets you view your PC monitor inside the headset and share your screen with teammates as avatars in a VR meeting room.

In addition to the collaboration features, Workrooms offers a solo Personal Office space, letting you choose between a few VR environments or your real room, i.e. mixed reality. Unlike the meeting spaces, your Personal Office shows up to three monitors.

On Mac, Workrooms could already create two entirely virtual extra monitors, but until now this feature wasn't available on Windows. To get a triple monitor setup you had to have three real monitors, somewhat defeating the purpose.

The latest Workrooms update brings the feature to Windows too. Your Quest 2 or Quest Pro can effectively turn your single-screen laptop into a triple monitor setup. This is also possible with the third party app Immersed.

We should note that even Meta's $1000 work-focused Quest Pro doesn't really have sufficient angular resolution to make this a practical replacement for real monitors for most people. However, leaked specs of the ultra premium headset Apple will seemingly reveal today suggest it will, and Apple will reportedly have its own solution for viewing your Mac's display in VR and AR too.

Meta Reveals How Quest 3 Controller Tracking Works & Beat Saber Founder Weighs In

Meta's CTO explained how Quest 3's controllers are tracked.

Meta officially revealed Quest 3 on Thursday, launching in fall starting at $500. It will come with Meta's new Touch Plus controllers, which lack both the IR LED rings seen in Quest 2's Touch controllers and the cameras seen in the self-tracking Touch Pro controllers.

In Quest 3's announcement, Meta simply said this is "thanks to advances in tracking technology," but in an Instagram ask me anything (AMA) session on Friday, Meta CTO Andrew Bosworth went into much more detail.

He revealed that the Touch Plus controllers still have infrared LEDs, but on the face of the controllers instead of along a tracking ring. Given these LEDs will be occluded from the headset much more often – when covered by your hand or facing away from the headset's cameras – Quest 3 also continuously runs controller-free hand tracking, and this is fused with the controller LED tracking. As with all VR controllers, this optical tracking data is combined with the onboard accelerometers and gyroscopes to produce the final tracking result.
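
Meta hasn't published the fusion algorithm itself. As a rough illustration of the general idea – dead-reckon from the IMU at a high rate and correct drift whenever an optical fix is available – here's a toy complementary-filter sketch; it's a generic technique, not Meta's actual implementation:

```swift
// Toy sensor-fusion sketch (not Meta's algorithm): blend a high-rate
// IMU prediction with an occasional optical measurement when available.
struct PoseEstimate {
    var position: SIMD3<Double>
}

func fuse(imuPredicted: PoseEstimate,
          optical: PoseEstimate?,          // nil when LEDs/hands are occluded
          opticalWeight: Double = 0.05) -> PoseEstimate {
    guard let optical else {
        return imuPredicted                 // occluded: dead-reckon on IMU alone
    }
    // Nudge the IMU prediction toward the optical fix to cancel drift.
    let blended = imuPredicted.position * (1 - opticalWeight) +
                  optical.position * opticalWeight
    return PoseEstimate(position: blended)
}
```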

Meta's approach with Touch Plus suggests Quest 3's controller-free hand tracking may be significantly improved over previous Quest headsets, likely due to its depth sensor. Meta hasn't yet said much about Quest 3's hand tracking though, other than to hint it "will take another step forward."

Bloomberg's Mark Gurman speculated that Quest 3 may have controller tracking issues in some games, and others have speculated online that Touch Plus may have inferior tracking to Quest 2's Touch. However, when asked about these concerns, Beat Saber co-founder Jaroslav Beck, who stepped down in May, replied, "It’s good. Don’t worry."

Since the development of the original Oculus Quest, Meta has used Beat Saber's Expert+ mode as an internal benchmark for controller tracking. Over the weekend a Meta employee claimed Quest 3's controllers "pass the Expert+ test" too.

Ditching the tracking rings also has significant usability advantages. As we described in our Quest Pro review, it lets you bring the controllers much closer together at any angle since you’re no longer at risk of bashing plastic you can’t see in VR. That may sound unimportant, but it actually opens up entirely new precise hand-to-hand interactions and improves existing interactions such as reloading two handed weapons.

Watch The First Quest 3 Mixed Reality Gameplay Footage Here

Meta just revealed the first Quest 3 mixed reality gameplay, showing the passthrough quality and room-aware features.

Mark Zuckerberg posted the footage to his Instagram page, showing him playing a series of mixed reality games and demos on Meta's just-announced $500 headset.

Quest 3 will feature stereo color cameras for the first time in a Meta headset. The passthrough quality seen in Zuckerberg's video looks noticeably superior to Quest Pro's, lacking the "warping" distortion and graininess seen in Meta's now-$1000 headset.

One portion of the video appears to show environment relighting, a feature already present in mobile AR platforms like Apple's ARKit, Google's ARCore, and Snapchat's Lenses. This involves changing the colors and lightness of the real portions of the scene to match a virtual light source, in this case a window to a virtual aquarium anchored to the wall. It's an example of a feature passthrough headsets can pull off convincingly because of their full control of pixels while transparent AR glasses can't because of their limited field of view.
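
In pixel terms the core idea is simple. Here's a toy sketch, in Swift for illustration, of modulating a passthrough camera pixel by a virtual light; this is our own simplification, not any platform's actual implementation:

```swift
// Toy relighting sketch (not Apple's, Google's, or Meta's actual code):
// tint and brighten a passthrough camera pixel toward a virtual light.
// Real systems run this per pixel on the GPU with proper light falloff.
func relight(cameraColor: SIMD3<Float>,
             lightColor: SIMD3<Float>,
             intensity: Float) -> SIMD3<Float> {
    let lit = cameraColor * (1 + intensity * lightColor)
    return lit.clamped(lowerBound: .zero, upperBound: .one)
}
```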

Another part of the video shows how these wall-anchored portals to virtual worlds can be used to extend the environment for mixed reality gaming, with a zombie enemy climbing through a virtual window to the room.

Yet another part shows how miniature games can be anchored to your desk or other surfaces for seated mixed reality content.

In the final section of the video Zuckerberg joins Meta's CTO Andrew Bosworth to play two colocated mixed reality games, meaning they see the same virtual objects at the same locations in the room, including ones they're holding. This can be achieved today with the Shared Spatial Anchors API Meta released in December.

The first colocated game sees Zuckerberg and Bosworth standing face to face competing in a sword fighting game, while the other shows a more relaxed scenario: sitting side by side, remotely controlling virtual miniature tanks on the floor.

Meta still hasn't gone into detail about how exactly Quest 3's environmental awareness works. To let mixed reality interact with your walls on Quest 2 and Quest Pro today, you have to arduously mark them out by hand, but Meta confirmed Quest 3 has a depth sensor and claims it's capable of "intelligently understanding and responding to objects in your physical space."

Full details of Quest 3's mixed reality technology will likely be revealed at Connect 2023 on September 27.

Meta Tests Horizon Worlds Model & Texture Importing With 'Titanborne' Shooter

Meta tested asset importing to overhaul Horizon Worlds graphics with a new shooter world called Titanborne.

The company described Titanborne as a "sneak peak at new features, improvements and creator tools currently in development", confirmed to be the model and texture importing features announced at Connect 2022 in October.

Currently Horizon Worlds creators build virtual worlds entirely inside VR, using the controllers to place and manipulate primitive shapes, then using a spatial visual scripting system to add dynamic functionality. But this results in a crude, simplistic graphics style that has faced widespread ridicule when seen in screenshots outside VR.

The ability to import models & textures should enable worlds with significantly improved graphics quality. “We started with simple graphics, and we’re doing a ton of work to meaningfully improve how Horizon will look and feel over the next year – the metaverse needs to feel inspiring”, Meta’s CTO Andrew Bosworth said in October.

Meta is currently also testing direct sharing of captured photos and videos to Instagram and Facebook Stories and members-only worlds for clubs, groups, and communities. In April the company opened the platform up to teenagers in the United States and Canada, a reversal of its previous 18+ policy.

Alongside the announcement of asset importing, Meta also said Horizon Worlds will get support for TypeScript, a popular programming language based on JavaScript, mostly used on the web, for “more dynamic and interactive worlds.” There's no indication Titanborne uses TypeScript, however.

A leaked internal Meta memo last year revealed that the executive in charge of Horizon Worlds believed it "has not found product market fit". Its competitors Rec Room and VRChat are almost always in the top 5 most popular Quest apps, while Horizon Worlds only makes the top 25.

Overhauling Horizon's graphics could be the key step needed to increase the appeal of the platform and expand the kinds of worlds possible to build in it. Meta also plans to add legs to the avatars and eventually overhaul their graphics too.

There's no word yet on when the features used for Titanborne will be available to other Horizon Worlds creators.

Bigscreen Beyond Gets Last Minute Lens Improvements, Including Wider Field Of View

Bigscreen announced last minute optical improvements to Beyond ahead of its launch later this year.

Beyond was announced in February. It's an ultra-compact and ultra-light SteamVR headset designed to enable truly comfortable long duration PC-based VR sessions. The key drivers of its tiny size and light weight are the use of OLED microdisplays and the fact that each unit is custom-fitted for the buyer's eyes, based on an iPhone face scan they provide upon ordering.

Bigscreen Beyond Proves A Point About VR Headset Weight
How does Bigscreen Beyond hold up while watching an entire movie? It proves a point about VR headset weight.

Beyond is priced at $1000 but doesn't come with positional tracking or controllers out of the box. It uses the SteamVR tracking system, so you'll need at minimum one base station - and ideally at least two - alongside your own input devices such as Valve Index controllers.

UploadVR's Ian Hamilton tried out a Beyond "pre-production model" in April, writing that it was "such a striking difference to feel so little weight on your face while still feeling entirely transported to a virtual environment".

Bigscreen today announced the optics have been improved, expanding the field of view from 93 degrees to 102 degrees - placing it between Quest 2 and Quest Pro despite its tiny size. However, this increase won't apply for buyers with an IPD of over 70mm "due to physical size limitations".

The startup has also increased the central angular resolution from 28 pixels per degree (PPD) to 32 PPD, a significant jump over the 22 central PPD of Quest Pro for example.

It also claims the new lenses improve clarity and sweet spot, while reducing artifacts such as blur.

Finally, the new lenses support a wider range of IPDs, from 53mm to 74mm, via physical lens spacing options from 55mm to 72mm tailored to each user's face scan.

Bigscreen says it's still on track to start shipping Beyond in Q3 this year in the US, and Q4 worldwide. We plan to review the finished product when it's available to give you our thoughts and analysis.