The Verge: Meta Plans True AR Glasses For 2024 With Neural Wristband Input

The Verge’s Alex Heath just released a report detailing Meta’s apparent plans for glasses over the next six years.

Facebook, now Meta, has publicly confirmed it’s working on augmented reality glasses multiple times, as far back as 2016. In late 2019 the company described a product launch as “a few years out”.

Korean news outlet The Electronic Times claimed Samsung will manufacture a custom tracking chip for Meta’s glasses, and The Information reported Meta has agreed to buy several years of the entire output of AR microLED display maker Plessey.

Nazare: Expensive AR Glasses

At the Connect 2021 conference in October Meta announced Project Nazare, “our first full augmented reality glasses”. The company didn’t, however, show a prototype or even real footage, instead showing a brief “simulated” concept clip. “We still have a ways to go with Nazare, but we’re making good progress,” Mark Zuckerberg remarked.

Heath writes that Meta plans to release Nazare as a product in 2024. It apparently weighs 100 grams, around four times the weight of typical glasses, and resembles Clark Kent’s thick black frames. Nazare will apparently have a battery life of just 4 hours and be powered by an included wireless puck which can fit in your pocket. The idea is for Nazare to be able to fully replace your smartphone. The original target field of view was 70 degrees, but “that goal likely won’t be met”, Heath writes.

The components to deliver this true AR experience – including eye tracking cameras, custom waveguides, and microLED projectors – apparently cost “multiple thousands of dollars”. As such, Nazare, at least the first generation, will be a niche product for early adopters and developers.

Meta has already spent billions of dollars developing Nazare, and plans a lighter & more advanced second generation model for 2026 as well as a third generation in 2028, Heath reports.

Hypernova: Affordable HUD Glasses

While Nazare will be very expensive true AR glasses, Heath says Meta also plans cheaper mainstream glasses in the same year (2024) codenamed Hypernova.

Hypernova will apparently pair with your smartphone to display notifications and contextually useful information in a small heads up display – similar to Google Glass but in a regular glasses frame.

Meta’s current Ray Ban Stories, which has a camera and speakers but no display

Last week The Information reported Meta plans to launch a second generation of its Ray Ban Stories smartglasses in 2023. Stories are essentially camera glasses – they have speakers for music and phone calls but no display. That report suggests the second gen won’t have a display either.

Heath seems to describe Hypernova as separate from the Stories product line. Could Hypernova actually be the third generation Ray Bans, launching a year after the second gen? That would seem to make more sense, but it’s unclear.

Neural Wristband Input For Both

Both Nazare and Hypernova will apparently be bundled with Meta’s neural wristband. Meta’s Quest headsets today support optical controller-free hand tracking, but this requires a well lit room and only works when your hands are within view of the cameras.

A different approach to hand tracking is to read the neural signals passing through your arm with a wristband using EMG (electromyography). Such a device can sense finger movement before it even happens, and can even sense incredibly subtle movements not clearly perceptible to others nearby.
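To make that concrete, here is a toy sketch of how an EMG pipeline could turn raw wrist signals into an input event. Everything in it (the sample values, the RMS amplitude feature, the threshold, and the “pinch” gesture name) is invented for illustration; Meta’s actual decoder uses many electrode channels and machine learning.

```python
# Toy sketch of EMG-style input: window the signal, measure muscle
# activation, map it to a gesture. Values and threshold are invented.

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples (millivolts)."""
    return (sum(s * s for s in window) / len(window)) ** 0.5

def classify(window, pinch_threshold_mv=0.05):
    """Map a window of samples to a hypothetical input event.

    EMG amplitude rises as motor neurons fire, often before any visible
    movement, which is why such a device can feel like it reads intent.
    """
    return "pinch" if rms(window) > pinch_threshold_mv else "rest"

print(classify([0.001, -0.002, 0.001, 0.0]))  # near-zero signal -> rest
print(classify([0.12, -0.09, 0.11, -0.10]))   # activation burst -> pinch
```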

Facebook filed a patent application for such a device in early 2019. New York based startup CTRL-Labs had been working on the same idea, so Facebook acquired the company in late 2019. Last year the company showed off its progress on this device in more detail.

Heath’s report suggests Meta plans to release a regular smartwatch this year, with a second generation the year after. But in 2024 the third generation is expected to integrate this EMG technology, and be bundled with both Nazare and Hypernova as the control method.

“Everyone I’ve talked to who has tried a prototype of the band Meta is working on says it’s one of the most impressive tech demos they’ve ever experienced. If it works at scale, the company thinks it could have the next mouse and keyboard,” Heath writes.

Meta Is Becoming A Hardware Company

If Heath’s report is accurate and Meta is able to meet Mark Zuckerberg’s ambitious timeline, the company could slowly transition from being known primarily for its software to being known just as well for its hardware.

These plans won’t go without competition of course. Apple reportedly plans its own AR glasses for 2025, and Google is acquiring suppliers to do the same. Other tech giants like Amazon likely won’t sit on the sidelines either.

But Meta has been working on these technologies for almost 8 years now, and Heath wrote that Zuckerberg is still investing more than any other company.

“Zuck’s ego is intertwined with [the glasses],” a source of Heath’s apparently said. “He wants it to be an iPhone moment.”

Apple Headset Rumor Roundup: Specs, Release Date, & Everything We’ve Heard

Bloomberg, The Information, and supply chain analyst Ming-Chi Kuo all claim Apple plans to release a standalone headset some time in the next year.

The standalone headset market is currently dominated by Meta’s $299 Quest 2. Reports suggest Apple’s headset – codenamed N301 – will be a much higher priced premium product aimed at delivering both virtual reality and mixed reality experiences.

It will reportedly feature color passthrough cameras, very high resolution displays, and a MacBook tier processor – all in a slim and relatively lightweight design.

Here’s everything we’ve heard so far about Apple’s headset:

High Quality Mixed Reality

The existence of Apple’s headset was first reported by The Information in late 2019. This report claimed N301 will “offer a hybrid of AR and VR capabilities” with cameras “mounted on the outside of the device, allowing people to see and interact with their physical surroundings”.

This report also said Apple executives claimed the headset can “map the surfaces, edges and dimensions of rooms with greater accuracy than existing devices on the market”.

In February 2021 The Information claimed the device has more than a dozen cameras, including eye tracking cameras.

How This Compares: Quest 2 can only show a grainy black & white view of the real world with no automatic scene understanding, but Meta plans to launch a higher end headset this year, codenamed Project Cambria, with color cameras and enhanced mixed reality. French startup LYNX also plans to launch a headset with color mixed reality by July.

Slim & Relatively Light Design

In January 2021 Bloomberg claimed Apple is planning to use “a fabric exterior” to reduce the device’s weight.

In The Information‘s February 2021 report the outlet claimed to have viewed images of a late-stage prototype “which show a sleek, curved visor attached to the face by a mesh material and swappable headbands”, drawing this impression:

The Information Apple VR

In March 2021 Apple supply chain analyst Ming-Chi Kuo claimed prototypes weighed 200-300 grams with a target of 100-200 grams. But in December Kuo revised this claim, saying the first generation model will weigh 300-400 grams while a future second generation will be “significantly lighter”.

How This Compares: Quest 2 weighs just over 500 grams.

High Resolution OLED Microdisplays

The Information‘s February 2021 report claimed Apple’s headset will feature dual 8K screens, but didn’t go into detail about the display technology.

But in November supply chain analyst Ming-Chi Kuo claimed the headset will feature dual 4K OLED microdisplays. This is a much more realistic prospect – The Information could have been misinterpreting dual 4K screens, which some would describe as 8K, as dual 8K.

OLED microdisplays provide the true blacks and infinite contrast of regular OLED displays but are much smaller, enabling very compact headset designs. However, microdisplays are significantly more expensive.

How This Compares: Quest 2 has a resolution of 1832×1920 per eye. The PC VR leader Varjo Aero has a resolution of 2880×2720 per eye.

rOS: A New Operating System

As far back as 2017, Bloomberg reported Apple was working on a new operating system dubbed “rOS” for “reality operating system”.

In December Apple posted a job listing for ‘AR/VR Frameworks Engineer’, with the role described as “developing an entirely new application paradigm” for “software that is deeply integrated into our operating systems”.

In January, iOS Developer Rens Verhoeven spotted a new platform “com.apple.platform.realityos” in the App Store app upload logs.

In February, “award-winning git repository surgeon” Nicolás Álvarez spotted Apple committing code to its open source GitHub repository referencing ‘TARGET_FEATURE_REALITYOS’ and ‘realityOS_simulator’ – the latter likely a feature to allow developers without the headset to test building AR or VR applications.

How This Compares: Quest 2, Vive Focus 3, and Pico Neo 3 Pro all use heavily modified versions of Google’s Android, an OS not designed for low latency VR and AR.

MacBook Level Performance

Ming-Chi Kuo’s November note claimed Apple’s headset will have a new chip with “similar computing power as the M1 for Mac”.

M1 is Apple’s first in-house PC processor, the first in a line intended to transition its Mac products from the x86 architecture that has dominated PCs for two decades to the more power efficient ARM architecture used in smartphones & tablets.

An M1 level chip would outperform any current standalone VR headset, and thus narrow the performance gap between mobile and PC based VR.

How This Compares: Quest 2, Vive Focus 3, and Pico Neo 3 Pro all use Snapdragon XR2, a variant of Qualcomm’s early 2020 flagship smartphone chip. Apple’s M1 is roughly twice as powerful.

“Thimble” Controllers?

The biggest unknown is what input method Apple’s product will use. Bloomberg’s January 2021 report claimed the headset’s high resolution cameras will enable controller-free hand tracking with a floating keyboard for text entry.

But The Information‘s February 2021 report claimed Apple was developing a “thimble-like device to be worn on a person’s finger”.

Apple Thimble

Shortly after that report, Patently Apple spotted a patent application titled Self-Mixing Interferometry-Based Gesture Input System Including a Wearable or Handheld Device. The patent described a device to “track a user’s finger movements with reference to any surface, including, in some cases, the surface of another finger, the user’s palm, and so on” which “may in some cases be used to provide input to an AR, VR, or MR application”. The filing even cites the example of detecting when the user is holding a stylus, and potentially even providing useful information on what’s being drawn.

How This Compares: Quest 2 and PC VR headsets use gaming focused controllers, which resemble an Xbox or PlayStation controller split into halves for each hand.

Price?

The Information‘s February 2021 report claimed Apple internally discussed pricing the headset around $3000, with a goal of 250,000 units in the first year.

But in Ming-Chi Kuo’s March 2021 note, he said he expects it to be priced around $1000, “similar to a high end iPhone”.

Finally, in January Bloomberg claimed Apple has “weighed prices north of $2000”. Despite this, the report also said Apple is forecasting sales of up to 10 million units in the first year.

While these reports cite widely different price expectations, it’s clear this won’t be a cheap mass market product. Expect a price with four digits, not three.

Release Date?

The Information‘s February 2021 report said Apple was aiming to release the headset in 2022.

In September Taiwanese news outlet DigiTimes claimed mass production was scheduled for Q2 2022, in time for a release in the second half of the year.

Ming-Chi Kuo’s November note predicted the headset will launch in Q4 2022.

However, in January Bloomberg claimed Apple was intending to announce in June at its Worldwide Developer Conference (WWDC) and ship later in the year. But “challenges related to overheating, cameras and software” mean the announcement could be pushed to late this year, and the release into 2023. The overheating issue is said to be caused by trying to use a laptop grade processor in a lightweight headset.

Snap Acquires Brain Reading Tech Startup NextMind For AR Glasses Input

Snap Inc. is acquiring BCI company NextMind for future AR glasses, The Verge’s Alex Heath reports.

A major challenge in shipping consumer AR glasses is the input method. A traditional controller, such as those used with many VR devices, would not be practical for glasses you want to wear out and about on the street. Similarly, while voice recognition is now a mature technology, people tend to not want to give potentially private commands out loud in front of strangers.

A brain computer interface (BCI) could one day allow users to control their glasses, and even type words and sentences, by just thinking.

NextMind is a French startup which released a $400 developer kit, a headband for brain input, two years ago. Its competitors include Neurosity.

This isn’t Snap’s first core tech acquisition for its AR glasses ambitions. In May it acquired WaveOptics, its supplier of the transparent optical waveguides and accompanying projectors in the Spectacles developer kit.

Mark Zuckerberg’s Meta began its own BCI research project in 2017, with the stated goal of “a system capable of typing 100 words-per-minute straight from your brain”. But in July 2021 the company announced it was cancelling this project to work on wrist-mounted devices to read signals passing from the brain to the hands instead.

A Snap spokesperson apparently told The Verge the NextMind acquisition is a long-term research bet, with no specific technology yet intended to ship in products.

Google Acquires MicroLED Display Startup Raxium For AR Glasses

Google acquired AR MicroLED display startup Raxium.

The acquisition suggests Google may be working on AR glasses. Meta plans to release glasses in “a few years”, and reports suggest Apple plans to do the same as early as 2025. In 2020, Meta signed a deal to buy several years’ worth of output from MicroLED supplier Plessey.

Almost all electronic displays today are either LCD, including its many variants, or OLED. LCD pixels provide color but separate backlights provide the light, which limits the contrast possible. OLED pixels are self-emissive, enabling true deep blacks and infinite contrast.

To be clear, “miniLED”, “QLED”, and similar names are just marketing terms for variants of LCD with improved backlight technology or tiny shutters for better contrast.

MicroLED is self-emissive like OLED, but should be orders of magnitude brighter, as well as significantly more power efficient. This makes it uniquely suitable for consumer AR glasses, which need to be usable even on sunny days yet powered by a small and light battery.

While all major electronics companies are actively researching microLED – including Samsung, Sony, and Apple – no company has yet figured out how to affordably mass manufacture it for consumer products.

Interestingly, The Information cited a Google source as saying the company is looking to acquire more AR glasses components suppliers. “Google has realized you can’t just own the operating system” for such devices, this person reportedly said. If true, Google could be planning to sell AR hardware, not just provide the software.

This article was originally published March 23 when the acquisition was reported by The Information. It has been updated as Google confirmed the news.

Magic Leap 2 Controllers Have Onboard Inside-Out Tracking

An image shared on Twitter of the upcoming Magic Leap 2 shows cameras on the controllers, used for inside-out tracking.

In January we reported on Magic Leap 2’s specs – or at least some of them – being shared at SPIE Photonics West 2022. But as it turns out, we missed that the company also revealed that the controller itself uses inside-out tracking. It wasn’t clear at the time what exactly that meant, but the image shared this week by entrepreneur Peter Diamandis shows two front-facing cameras.

Controllers in almost all AR and VR systems available today are either tracked by the headset or rely on external base stations. Meta’s Quest 2 for example tracks a pattern of infrared LEDs underneath the plastic ring of its controllers, while Valve’s Index controllers determine their position relative to SteamVR “Lighthouse” base stations placed in the corner of your room.

Relying on the headset for tracking has a flaw: if the controller moves out of view of the sensors or if any part of your body gets in the way, tracking will temporarily break. This isn’t a problem for many use cases, but does limit intricate two handed interactions and scenarios like looking left while shooting right. Using external base stations can alleviate most of these issues, but that increases setup time and severely limits portability – and the path from controllers to base stations can still be occluded.

Magic Leap 1 and Pico Neo 2 used magnetic tracking. Unlike visible light, the magnetic field can pass through the human body so occlusion isn’t an issue. But magnetic tracking isn’t as precise as optical tracking systems can be, and adds significant weight and cost to the hardware.

Controllers with onboard cameras promise to solve the occlusion problem while maintaining high precision by tracking themselves the same way inside-out headsets do – using a type of algorithm called Simultaneous Localization and Mapping (SLAM). SLAM essentially works by combining acceleration (from an accelerometer) and rotation (from a gyroscope) with how high contrast features in your room move relative to the cameras. Initial SLAM algorithms were hand crafted, but most today use at least some machine learning.
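The fusion idea at the heart of this can be illustrated with a toy one-dimensional complementary filter: integrate the IMU at high rate for smoothness, and let the drift-free camera estimate bleed in slowly to cancel the gyro’s accumulated error. This is a deliberate simplification (real SLAM estimates full 6DoF pose against many tracked features), and the blend weight and values are illustrative only.

```python
# 1D complementary filter: the principle behind fusing an IMU with
# camera observations. Real SLAM is far more elaborate than this.

def fuse(angle, gyro_rate, dt, camera_angle, blend=0.98):
    """One step: dead-reckon from the gyro, then nudge toward the camera fix."""
    predicted = angle + gyro_rate * dt      # smooth but drifts over time
    return blend * predicted + (1 - blend) * camera_angle  # drift correction

# A gyro with a constant bias would accumulate 0.1 rad of error over
# these 1000 steps on its own; the camera term pins it near the truth.
angle = 0.0
for _ in range(1000):
    angle = fuse(angle, gyro_rate=0.01, dt=0.01, camera_angle=0.0)
print(round(angle, 4))  # settles at a small fixed offset instead of drifting
```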

The potential downsides of this approach are the cost of a chip powerful enough to run the tracking algorithm, the reduction in battery life due to the power that chip would draw, and the need to have a well-lit environment with high contrast features such as posters – though that limitation applies to inside-out headsets too. Some have suggested tracking quality may be reduced in fast movements due to motion blur, but this shouldn’t be any more of an issue than tracking fast moving LEDs – a global shutter sensor with a low exposure time should make this a non-issue.

Meta is seemingly also planning to use controllers with onboard inside-out tracking in its upcoming Project Cambria headset. Images of Quest-like controllers with cameras instead of an LED ring first leaked in September, and the rings aren’t present in the official reveal trailer either.

Both Magic Leap 2 and Project Cambria are slated to release this year, though neither has a specific release window. They’re very different products – ML2 is a transparent AR headset designed for enterprise while Cambria is an opaque headset for VR and mixed reality – but whichever launches first will be the first AR or VR system to use this new approach to controller tracking.

Korean News: Samsung Working On Its Own AR Headset, Separate From Microsoft

Samsung is working on its own Android-based AR headset, separate from its Microsoft partnership, South Korea’s Electronic Times reports.

Four weeks ago, a report from Insider’s Ashley Stewart claimed Microsoft shelved HoloLens 3 last year in favor of letting Samsung build hardware powered by Windows mixed reality software. One of Stewart’s sources described the partnership as a “shit show”. But the Electronic Times report claims Samsung is planning “its own AR devices and the AR devices developed with Microsoft”.

The report suggests Samsung will use one of its own Exynos chips in the device, rather than sourcing from Qualcomm. Samsung has apparently completed a prototype, and “is deciding the release date”.

From 2014 Samsung partnered with Facebook on the phone-based Gear VR, technically the first widely shipped consumer VR product, but by 2019 then-CTO John Carmack declared it dead as standalone headsets took over. In January 2020 at CES, Samsung showed off a concept of AR glasses, but didn’t actually say whether this was a planned product. That same month China’s intellectual property office awarded Samsung a patent for a VR headset with four tracking cameras, but no product emerged from this either.

It’s unclear what exactly would distinguish Samsung’s own headset from its partnership with Microsoft, but if the report is to be believed, Samsung is finally ready to re-enter this market.

Magic Leap 2 Specs Suggest A Best-In-Class Seethrough AR Headset

Magic Leap 2’s specs – at least some of them – were revealed this week in a presentation at SPIE Photonics West 2022.

While Microsoft’s HoloLens came first as a business-targeted device, the original Magic Leap One launched in 2018 as the first augmented reality headset available to consumers, priced at $2300. After reportedly selling just 6000 units in the first six months, the company pivoted to targeting enterprise customers instead in 2019.

Magic Leap 2 was first teased back in late 2019 with a launch target of 2021, though no details were given. In October 2021 the company shared the first image and announced it will launch this year, claiming it will have the largest field of view of any transparent optics headset.

At Photonics West, Magic Leap’s VP of Optical Engineering Kevin Curtis revealed some key specs for Magic Leap 2 (ML2).

The headset apparently weighs 248 grams, down from Magic Leap 1’s 316 grams.

However, Magic Leap headsets use a tethered compute box attached to your waist rather than housing the battery & processor in the headset itself. Curtis says ML2’s new compute box is more than twice as powerful as ML1’s, with “more memory and storage” too. While ML1 used an NVIDIA Tegra chip, Magic Leap announced a partnership with AMD in December.

ML1 has two variants to accommodate narrower and wider interpupillary distances (IPDs). Curtis claims ML2’s eyebox is twice as large, meaning this is no longer necessary. The eyebox is the range of horizontal and vertical positions, relative to the center of the lens, within which your eyes still get an acceptable image.

While ML1 uniquely had two focal planes so near and far virtual objects were focused at different distances, there was no mention of the same technology in the ML2 spec presentation.

ML2 seems to have its own unique optical technology though: a new feature called Dynamic Dimming. A major problem with see-through AR headsets is the inability to display the color black, since their optical systems are additive – they superimpose color onto a transparent lens, but black is the absence of color. Curtis claims Dynamic Dimming can vary the lens from letting through 22% of real world light to letting through just 0.3%. At 22% the real world will be visible even in dark rooms, while 0.3% would let virtual objects remain visible even in bright outdoor conditions.
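Some rough arithmetic shows why that range matters. The ambient luminance figures below are illustrative ballpark values, not Magic Leap specs:

```python
# How much real-world light reaches the eye at each Dynamic Dimming
# setting. Ambient luminances are rough illustrative values (in nits).

def through_lens(ambient_nits, transmission):
    return ambient_nits * transmission

print(round(through_lens(10_000, 0.003), 1))  # sunny outdoors at 0.3% -> 30.0 nits
print(round(through_lens(100, 0.22), 1))      # dim room at 22% -> 22.0 nits
```

In both cases the background ends up dim enough for the display’s additive image to stand out against it.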

ML1 had one eye tracking camera per eye, but ML2 has two per eye, which Curtis says “improves image quality, minimizes render errors, and enables segmented dimming”. The latter feature wasn’t elaborated on, but may suggest the headset could vary the Dynamic Dimming level based on whether you’re looking at darker or lighter virtual objects.

Notably, Curtis did not reveal the resolution or the exact field of view. But CEO Peggy Johnson revealed the field of view in November at Web Summit as approximately 70 degrees diagonal, up from 50 degrees in the original.

If we assume the aspect ratio shared in the October tease is accurate, that would mean a horizontal field of view of roughly 45 degrees and a vertical field of view of roughly 55 degrees. This is significantly narrower than opaque passthrough headsets like LYNX R1, but much taller than competing see-through headsets like HoloLens 2.
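That estimate can be sanity-checked with a little trigonometry. Angular fields of view don’t combine like lengths, so the math works in tangent space (the flat image plane); the 0.8 width-to-height aspect ratio below is our own assumption read off the teased image, and the result lands in the same ballpark as the rough figures above.

```python
import math

def fov_from_diagonal(diag_deg, aspect_w_over_h):
    """Split a diagonal FOV into horizontal/vertical using a flat-plane model."""
    d = 2 * math.tan(math.radians(diag_deg) / 2)      # diagonal in tangent space
    height = d / math.sqrt(1 + aspect_w_over_h ** 2)  # Pythagoras on the plane
    width = aspect_w_over_h * height
    h = 2 * math.degrees(math.atan(width / 2))
    v = 2 * math.degrees(math.atan(height / 2))
    return h, v

h, v = fov_from_diagonal(70, 0.8)   # the 0.8 aspect ratio is an assumption
print(f"~{h:.0f} deg horizontal, ~{v:.0f} deg vertical")
```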

Magic Leap 1 is targeted toward enterprise but still available to individuals who want one. It’s unclear what sales path Magic Leap 2 will take, and no price or specific release date has yet been revealed.

Bloomberg: Facing Overheating Issue, Apple’s Headset Probably Won’t Ship This Year

Apple recently informed suppliers its mixed reality headset probably won’t release this year, Bloomberg reports.

Bloomberg, The Information, and supply chain analyst Ming-Chi Kuo all claim Apple is preparing to release a high end headset with high resolution color cameras for mixed reality. Recent notes from Kuo claim it will weigh 100-200 grams less than Meta’s Quest 2 and feature dual 4K OLED microdisplays.

In February 2021 The Information claimed to have viewed images of a late-stage prototype “which show a sleek, curved visor attached to the face by a mesh material and swappable headbands”. The outlet drew this impression:

The Information Apple VR

Bloomberg’s new report claims Apple was intending to announce in June at its Worldwide Developer Conference (WWDC) and ship later in the year. But “challenges related to overheating, cameras and software” mean the announcement could be pushed to late this year, and the release into 2023.

The overheating issue is said to be caused by trying to use a laptop grade processor in a lightweight headset. In November Ming-Chi Kuo reported the headset will use a new chip with “similar computing power as the M1 for Mac”.

As we noted at the time of that report, M1 is a fairly large chip. While its power efficiency is incredibly impressive for a laptop or PC, it draws much more power and generates much more heat than the smartphone-tier processors used in existing headsets like Quest 2. That’s why Apple uses its lower power A-series chips in even its most advanced iPhones.

The report also claims Apple has “weighed prices north of $2000” for the headset. Despite this, the report also says Apple is forecasting sales of up to 10 million units in the first year. Meta hasn’t revealed how many units Quest 2 has sold, but the face foam recall documents and a comment from Qualcomm’s CEO suggest somewhere in the region of 5-10 million already.

If Apple can eventually pull off putting an M1-tier chip in a 300-400 gram headset, it may deliver a significantly higher end experience than Meta. But Bloomberg’s report suggests doing so is proving harder than expected.

Nreal Light AR Glasses Review: A (Limited) Preview Of The Future

Nreal Light is the first AR glasses product available in the US. But is this new technology ready for regular consumers yet? Read on to find out.

Light is available in Verizon stores and on Verizon’s website. It weighs around three times as much as a heavy pair of sunglasses, or a third as much as a Magic Leap One headset. To achieve this form factor Light is powered by a smartphone over a USB cable – there is no battery or full-fledged chip onboard.

Price & Compatibility

Light is priced at $599. While Nreal says you can mirror any Android or iOS device to a floating virtual screen in front of you, to use the actual augmented reality capabilities including positional tracking and AR apps you’ll need a compatible Verizon flagship device:

  • Samsung Galaxy S21 Ultra 5G
  • Samsung Galaxy S21+ 5G
  • Samsung Galaxy S21 5G
  • Samsung Galaxy Z Fold3 5G
  • Samsung Galaxy S20 FE 5G UW
  • Samsung Galaxy S20 5G UW
  • Samsung Galaxy Note 20 Ultra 5G
  • OnePlus 8 5G UW

The cheapest of these is $799, so the total buy-in price if you don’t already own one starts at $1398.

Visual Quality & Field of View

To start, I should really be referring to Light as sunglasses, since the real world is darkened. Other than being darker, the view of the real world is clear and undistorted – this is the key advantage of see-through head mounted displays.

The problem with see-through display systems however is that the darker the pixel, the less opaque it will appear. True black is completely invisible, since it’s produced by turning the pixels off. This means some virtual objects can appear more like translucent holograms than real objects, unless they entirely consist of bright colors.
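This opacity limitation can be illustrated with a simple additive compositing model: a see-through display can only add light on top of the real world, never subtract it. The sketch below is an illustrative model (the `perceived` helper is ours, not anything in Nreal’s actual rendering pipeline):

```python
def perceived(real_bg, virtual_px):
    """Additive compositing: a see-through display can only ADD light,
    so what you see is the real background plus the virtual pixel,
    clamped to full brightness. It can never darken the real world."""
    return tuple(min(r + v, 1.0) for r, v in zip(real_bg, virtual_px))

# A "black" virtual pixel adds no light, so it vanishes entirely:
perceived((0.5, 0.5, 0.5), (0.0, 0.0, 0.0))  # -> (0.5, 0.5, 0.5)

# A mid-grey pixel over a bright background washes out toward white:
perceived((0.75, 0.75, 0.75), (0.5, 0.5, 0.5))  # -> (1.0, 1.0, 1.0)
```

This is why virtual objects made of bright, saturated colors hold up far better than dark ones on displays like Light’s.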

But this opacity limitation aside, Light’s dual 1920×1080 OLED microdisplays provide very impressive angular resolution. The visual quality is sharper than any (consumer) VR headset, and apps like the browser feel much like using a real 1080p monitor. Even at a distance, small text is easily readable. At no point using Light did I feel resolution was a limitation.

The major flaw with Light’s image is that it appears to blur as you move your head. It’s incredibly distracting, and suggests the displays are full persistence. The vast majority of VR and AR headsets since 2014 have used low persistence displays, precisely to avoid this blur effect.

Field of view is much more difficult to convey. I could tell you it’s 53 degrees diagonal. I could explain how that’s equivalent to sitting in front of a 19″ monitor, or 2 meters away from a 77″ television. But none of these figures really capture what it’s like through the glasses.
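For the curious, those screen-size equivalences follow from basic trigonometry: a flat screen of diagonal size s fills a diagonal FOV θ when viewed from distance d = s / (2·tan(θ/2)). A quick sketch (the function name is ours):

```python
import math

def apparent_diagonal(fov_deg, distance_m):
    """Diagonal size (in meters) of a flat screen that exactly fills
    a given diagonal field of view at the given viewing distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

METERS_PER_INCH = 0.0254

# A 53-degree diagonal FOV at 2 meters spans about 78.5 inches,
# consistent with the ~77" TV comparison:
tv_inches = apparent_diagonal(53, 2.0) / METERS_PER_INCH
```

This assumes a flat screen centered in front of the eye, so treat it as a rough equivalence rather than an exact one.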

The best way I can think to really get across the field of view is to express it as a percentage of the lens – i.e. how much of the lens can actually show pixels and how much is just regular sunglasses. Before I express that though, you need to understand a few caveats compared to regular glasses:

  • Light’s lens is narrower (vertically), so there’s empty space below your view
  • The top of Light’s frame is much thicker, so you can’t see above you
  • Light’s lens sits further in front of your eyes

With that out of the way – to my eye Light’s display extends across roughly 85% of the lens vertically and around 70% horizontally.

What this means in practice is you’ll want to position virtual screens and objects at least a few meters away to be able to see all of them at once. This severely limits what kind of content Light works well with, but this is a problem with all current see-through displays.

Comfort & Size

Nreal Light is more comfortable than any head mounted display I’ve ever used. Unlike bulky AR goggles and compact VR headsets it truly does feel like wearing a heavy pair of glasses – 109 grams to be exact.

Light comes with four separate nose pads, and everyone I demoed it to was able to find at least one they found very comfortable for their nose shape.

The only complaint I have about Light’s comfort is that it sometimes gets warm. This isn’t a dealbreaker, but when it happens it limits how long the glasses can be worn.

Tracking

Nreal Light has two tracking cameras for inside-out positional tracking. It works, but there’s noticeable bounce and drift.

While Nreal’s SDK can detect horizontal planes like your floor and table, it doesn’t generate a depth map and doesn’t map your walls in any way. As such, it doesn’t support occlusion – meaning virtual objects and screens always appear in front of real world objects, even when they’re more distant.
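Occlusion comes down to a per-pixel depth comparison between the virtual scene and a real-world depth map – exactly the data Light’s SDK doesn’t produce. A hypothetical sketch of that test (not Nreal’s actual API):

```python
def composite_pixel(virtual_color, virtual_depth, real_depth):
    """With a real-world depth map, a renderer can hide virtual pixels
    that fall behind real surfaces. Without one (as on Light), virtual
    content is always drawn on top regardless of distance."""
    if virtual_depth < real_depth:  # virtual object is closer to the eye
        return virtual_color
    return None  # occluded: let the real world show through

# A virtual object 1m away in front of a wall 2m away is visible:
composite_pixel((1.0, 0.0, 0.0), 1.0, 2.0)  # -> (1.0, 0.0, 0.0)

# The same object 3m away, behind the wall, should be hidden:
composite_pixel((1.0, 0.0, 0.0), 3.0, 2.0)  # -> None
```

Headsets like HoloLens 2 can perform this test because they continuously build a mesh of the environment; Light only detects flat horizontal planes.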

From a core technology perspective tracking is the weakest aspect of Light, and the biggest difference between it and much more expensive AR hardware such as HoloLens 2 or Magic Leap One.

Input

Like HTC’s Vive Flow virtual reality headset, Light is controlled by your smartphone acting as a rotationally tracked laser pointer.

The control scheme on the touchscreen varies between apps. Nreal has a default, but apps can render their own phone UI and virtual buttons. Since you can actually see the phone this is much more usable than Flow, but since the phone isn’t positionally tracked it’s still awkward and clunky.

Nreal’s SDK actually supports hand tracking, but bizarrely almost no apps support this, and the system software (Nebula) doesn’t either.

Software & Content

Nebula is Nreal’s system software, the default app you’ll see in AR mode. It lets you open, move, resize and reposition web browser windows or phone apps in your real room as well as being the interface for launching AR apps.

Nebula is genuinely impressive. While I wish I could point and pinch with hand tracking, even using the phone to position browser windows means you can easily watch videos or read articles anywhere, without having to hold a phone or tablet in your hand.

While the hardware field of view limits Nebula’s usefulness, this is a genuine preview of a future where physical TVs and monitors are antiques of the past and your workspace is wherever you want it to be.

Other AR apps are far less useful. You get them from Google Play – Nreal doesn’t have its own store. There are maybe two dozen in total. Most are essentially demos. AR content still feels like VR content did back in 2014, when the only widely available hardware was the Oculus developer kits.

Should You Buy One?

If you’re a software developer or tinkerer interested in building for the latest technology platforms, and $599 is a reasonable price to you, picking up Nreal Light could be a great way to get started in AR.


But what if you’re not a developer? If you frequently spend time in hotels or temporary accommodation and find yourself missing your big TV from home, Light could effectively be a huge ultra-portable floating screen.

For everyone else though, unless you’re incredibly eager to preview the future and $600 is pocket change, AR just isn’t ready for you yet.