Qualcomm Reveals AR Viewer Reference Design Based On XR1

Qualcomm announced a reference design for what it calls an “AR Smart Viewer” meant to enable manufacturers to build on the idea more quickly.

Qualcomm develops a number of reference designs that showcase some of the potential of devices built around its chips. The latest design is based on the XR1 platform and is meant to connect over a wire to a compatible smartphone, Windows PC, or dedicated processing puck. The glasses have processing inside them, distributing tasks between the viewer itself and the device to which it connects. Qualcomm says the architecture can reduce overall power consumption by 30 percent compared to glasses that leave all the processing to the connected device. The company also announced an app framework designed to make it easier for manufacturers to launch smartphone apps as windows that can be anchored to locations in the physical world.
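
To make that division of labor concrete, here is a minimal TypeScript sketch of how such a split could be organized, with the glasses handling latency-critical tracking and late-stage image correction while the host does the heavy rendering. Everything here is an illustrative assumption; Qualcomm has not published this as its API.

```typescript
// A hedged sketch of the split-processing idea described above: keep
// latency-critical perception and display correction on the glasses,
// and push heavy scene rendering to the tethered phone, PC, or puck.
// These interfaces are illustrative assumptions, not Qualcomm's SDK.
type Pose = {
  position: [number, number, number];
  rotation: [number, number, number, number]; // quaternion
};
type Frame = { pixels: Uint8Array; renderedAtPose: Pose };

interface GlassesSide {
  trackHeadPose(): Pose;                            // on-device 6DoF tracking
  warpAndDisplay(frame: Frame, latest: Pose): void; // late-stage reprojection
}

interface HostSide {
  renderScene(predictedPose: Pose): Frame;          // heavy rendering on the host
}

// One frame of the loop: the glasses send a pose, the host renders, and the
// glasses correct for head motion that happened while the frame was in flight.
function frameLoop(glasses: GlassesSide, host: HostSide): void {
  const pose = glasses.trackHeadPose();
  const frame = host.renderScene(pose);
  glasses.warpAndDisplay(frame, glasses.trackHeadPose());
}
```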

There’s support for a camera with image stabilization — meant for use cases like remote assistance — as well as dual monochrome cameras for 6DoF head tracking and hand tracking. The reference design, made by Goertek, includes a micro-OLED binocular display from BOE with frame rates up to 90Hz.

The design looks similar in overall concept to glasses from companies like Nreal and Lenovo, but Qualcomm’s reference designs are typically proof-of-concept devices that pack in features which are sometimes left out of consumer devices for reasons like cost or power consumption. Manufacturers hope to eventually pack all of the processing needed for AR glasses into a single head-worn device slim, comfortable, and stylish enough that people would wear it out of the house all day, but that goal is still a long way away from realization.

FinchRing Is A New Gesture Controller Shown With Nreal

Finch Technologies revealed a new ring-like wearable for gesture recognition.

The FinchRing was announced in connection with Mobile World Congress in Shanghai alongside a partnership with AR glasses maker Nreal.

The device requires a separate accessory — the FinchTracker — worn on the upper arm for full 6DoF tracking, and the company claims there are no field-of-view limits on tracking thanks to the internal IMU, though there’s also mention of “optical enhancement” to the device’s tracking. There’s a touchpad on the top of the FinchRing, customizable vibration patterns for haptic feedback, and up to four hours of active use per battery charge.

Finch Technologies is pitching the device as useful not just for VR or AR headsets, but also for controlling PCs, tablets, or phones with open-air gestures. We haven’t tried the device ourselves, and hands-on time is really the best way to judge the potential of input mechanisms for VR and AR headsets.

Recent reports suggest that an Apple VR headset in late-stage development may include a “thimble-like” accessory worn on the finger for interaction. So it is interesting to see another small hand-based device being explored for interaction, though of course there are lots of ideas being explored for haptics and interaction for AR and VR headsets.

Realworld Will Let You Explore The World In VR With Hand-Tracking On Quest, Also Coming To PC VR

Realworld is a newly announced in-development VR app from the creator of Cubic VR that will let users explore the actual world from inside a VR headset while connecting with friends.

If that description and the video embedded above make you think this sounds a lot like Google Earth, you’re absolutely correct. However, like Microsoft Flight Simulator, Realworld uses Bing Maps, not Google Maps.

Realworld is coming natively to Oculus Quest with additional plans for support on PC VR, mobile AR, and mixed reality devices. The eventual goal is to make it so that if you visit a location in real life, you can see markups and notes that people left via Realworld, in addition to being able to use AR to look up and see VR users from around the world.

We haven’t gotten a chance to try Realworld, but it looks a bit like Google Earth condensed down onto a tabletop to make rendering that sort of information manageable. Using a “pinch”-type gesture with both hands, you can zoom the view in and out very quickly.
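
As an illustration of how a two-hand zoom gesture like that is commonly computed, here’s a short TypeScript sketch: scale the map by the ratio of current to initial hand separation. The vectors and names are hypothetical stand-ins for whatever hand-tracking data Realworld actually uses.

```typescript
// A minimal sketch of two-hand pinch zoom, assuming hand positions arrive
// from the headset's hand-tracking API as 3D points. Illustrative only;
// this is not Realworld's actual code.
type Vec3 = { x: number; y: number; z: number };

const distance = (a: Vec3, b: Vec3): number =>
  Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);

// Zoom by the ratio of current to initial hand separation: spreading the
// hands apart zooms in, bringing them together zooms out.
function zoomFactor(startLeft: Vec3, startRight: Vec3,
                    nowLeft: Vec3, nowRight: Vec3): number {
  const start = distance(startLeft, startRight);
  const now = distance(nowLeft, nowRight);
  return start > 0 ? now / start : 1;
}

// Example: hands twice as far apart as when the gesture began -> 2x zoom.
// mapScale = baseScale * zoomFactor(sL, sR, nL, nR);
```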

In the trailer we can even see players “grab” one another (since this is multiplayer) and either shrink or grow each user to get a different perspective on the environment. Since the table is so small, you can start from a space-style continental view and then zoom all the way down to street level very smoothly. But the limited scope of the “table” format does seem to strip away the grand sense of scale found in something like Google Earth.

Luckily, you can still go “inside” the street view perspective instantly, like a 360 photo, just as you can in Wander. The table becomes something like a 3D map with to-scale models of locations, and you can teleport down to see it all around you if you’d like. Realworld will also let you sketch onto the world itself with 3D pens, and drag and drop your own 3D models directly into the world, which has some amazing possibilities. There’s much more besides, such as built-in streaming support, sticky notes, animation features, and lots of other tools the trailer only hints at for now.

You can sign up on the official Realworld website to stay up to date on details and future information.

Microsoft Introduces New HoloLens 2 Industrial Edition

Microsoft introduced a new edition of the HoloLens 2 this week, which is “designed, built, and tested to support regulated industrial environments.”

The HoloLens 2 Industrial Edition adds a few standards and certifications to the headset, along with changes to warranty and unit replacement. The Industrial Edition meets the “clean room compatible” standard, with an ISO 14644-1 Class 5-8 rating, as well as the “Intrinsic safety” standard, with a UL Class I, Division 2 rating.

The headset also comes with a two-year warranty and a ‘Rapid Replacement Program’ which Microsoft says “minimizes downtime, with advance exchange and expedited shipping.”

The HoloLens 2 started shipping in November 2019 as an enterprise-level, standalone AR headset, priced at $3,500. A successor to the original HoloLens, the HoloLens 2 made some good improvements to comfort and accessibility in our testing. In June last year, the headset was made available to purchase directly for non-enterprise customers, but not for any cheaper — it still carried the same hefty $3,500 price tag.

The Industrial Edition shipping later this year is even pricier, at $4,950 per unit. The increase is likely to cover the new standards and comprehensive replacement program, given that industrial environments might see the headset needing to be replaced or repaired more frequently.

In August last year, a new HoloLens 2 application was released that helps doctors and nurses safely identify symptoms in COVID-19 patients using volumetric capture.

HoloLens 2 Industrial Edition shipments will begin in Spring 2021, with pre-orders now open to all existing HoloLens 2 markets.

You can read more about the Industrial Edition over on Microsoft’s blog, and be sure to also check out our hands-on with the original HoloLens 2 from MWC 2019.

Panasonic Announces AR HUD For Cars With Situational Awareness

This week at CES, Panasonic announced a new augmented reality heads-up display (HUD) system for cars, which it claims offers many more features than the typical car HUD systems currently available.

Heads-up displays have been available in cars for a number of years now, but in their current form they tend to be pretty limited in scope. In most cases they simply overlay basic navigation and vehicle information on a small static display projected from the dashboard onto the windscreen, visible only to the driver.

However, Panasonic’s new AR HUD system aims to offer much more, including a larger display and situational awareness of the area in front of and around the car.

According to Panasonic, the new HUD system “projects 3D, AI-driven key information into the driver’s line of sight to help reduce driver distraction and potentially increase safety on the road.” It promises an “expanded field of view” and a system that uses AI to dynamically rearrange the graphics to adjust with the vehicle’s movements.

Panasonic provided the image embedded above as an illustration; however, it may not be fully representative of how the final product will look.

Panasonic claims that sudden changes in the environment, such as a potential collision or a cyclist on the road, can be detected and marked by the system, with environmental information updating in less than 300 milliseconds. It uses 3D imaging radar to achieve this, with “full 180-degree forward vision up to 90 meters and across approximately three traffic lanes,” alongside eye tracking that corrects for any inconsistencies between the projected HUD image and the driver’s line of sight.
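
That eye-tracking correction comes down to a parallax problem: a marker drawn on the windshield only lines up with a real-world object along the line from the driver’s eye through the glass. Here is a hedged TypeScript sketch of that geometry, approximating the windshield as a flat plane; the names are illustrative, not Panasonic’s implementation.

```typescript
// Hedged sketch: line-plane intersection used to place a HUD graphic so it
// overlays a real-world target from the driver's tracked eye position.
// The windshield is approximated as a flat plane; names are illustrative.
type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const add = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x + b.x, y: a.y + b.y, z: a.z + b.z });
const mul = (a: Vec3, s: number): Vec3 => ({ x: a.x * s, y: a.y * s, z: a.z * s });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

// eye: tracked eye position; target: 3D point to highlight (e.g. a cyclist);
// planePoint/planeNormal: the windshield plane. Returns where on the
// windshield to draw the marker, or null if the ray never crosses it.
function hudDrawPoint(eye: Vec3, target: Vec3,
                      planePoint: Vec3, planeNormal: Vec3): Vec3 | null {
  const ray = sub(target, eye);
  const denom = dot(ray, planeNormal);
  if (Math.abs(denom) < 1e-6) return null; // ray parallel to the windshield
  const t = dot(sub(planePoint, eye), planeNormal) / denom;
  return t > 0 ? add(eye, mul(ray, t)) : null;
}
```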

“Panasonic’s AR HUD solutions cover more of the roadway, with traditional cluster content like speed and fuel in the near field as well as 3D overlays in the far field, showing navigation and other critical driver data mapping spatially to the road ahead,” says President of Panasonic Automotive and Executive Director of Panasonic Smart Mobility Scott Kirchner. “And in a future with more self-driving vehicles, our AR HUD could provide an important added level of comfort and assurance for AV passengers as well.”

Would you be interested in trying out this AR HUD in your car? Let us know what you think in the comments below.

Lenovo Targets Mid-2021 For ThinkReality A3 AR Glasses

This week as part of CES 2021, Lenovo announced its new model of AR glasses, the ThinkReality A3.

The A3 is a successor to the ThinkReality A6 headset from 2019, which was an enterprise-focused AR headset aimed at taking on similar offerings from Magic Leap and Microsoft’s HoloLens.

“The smart glasses are part of a comprehensive integrated digital solution from Lenovo that includes the advanced AR device, ThinkReality software, and Motorola mobile phones,” said Jon Pershke, Lenovo Vice President of Strategy and Emerging Business. Like the A6, the A3 is an enterprise-focused AR device.

Inside the ThinkReality A3 is a Qualcomm Snapdragon XR1, stereoscopic 1080p displays, an 8MP camera for 1080p video, and dual fisheye cameras for room-scale tracking. The headset will tether to a PC or select Motorola smartphones via USB-C, depending on the edition of the glasses.

The A3 PC Edition can tether to a laptop or PC in order to “enable users to position multiple, large virtual monitors in their field of view and use Windows software tools and applications.” Lenovo says the virtual monitors are “optimized and compatible” with its ThinkPad laptops and other mobile workstations that use Intel and AMD processors.

The A3 Industrial Edition will tether to Motorola smartphones with a Qualcomm Snapdragon 800 series processor or better, providing “hands-free, AR-supported tasks in complex work environments … supported by the ThinkReality software platform, which enables commercial customers to build, deploy, and manage mixed reality applications and content on a global scale.”

Lenovo says the ThinkReality A3 glasses will be available “in select markets worldwide starting mid-2021,” with no word on pricing.

TikTok Introduces New AR Effect Using iPhone 12 Pro LiDAR Scanner

TikTok introduced a new AR effect to bring in the new year this week. However, unlike other AR effects, this one requires the iPhone 12 Pro’s LiDAR scanner.

AR effects are now a common feature of social media videos. Instagram, Facebook, and TikTok all employ AR to offer effects that interact with the user’s environment, face and more. However, a new effect from TikTok takes the AR integration even further, using the LiDAR scanner in the iPhone 12 Pro to scan objects in the environment and alter the effect accordingly.

It features a New Year’s countdown, which explodes into confetti when it reaches zero. The LiDAR scanner is used to gather information about the environment so that the confetti can then realistically fall onto objects, just like in real life. Here’s a video from TikTok demonstrating what it looks like:

As you can see in the video, the confetti falls onto the lounge realistically, scattering over the cushions, the armrests, and the chaise just like it would in real life. Some of the confetti falls lower onto the floor, and it all stays correctly in place when the camera moves.

The LiDAR scanner essentially measures the depth between points in the environment and the camera, which is what allows the confetti effect to be pulled off realistically. Because of this, the effect is only available to users with an iPhone 12 Pro — no earlier models or non-Pro iPhone 12 models come equipped with the LiDAR scanner.
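
As a rough sketch of why per-pixel depth makes this effect possible, consider the following TypeScript fragment: a particle falls down the screen until it would pass behind the real surface at its pixel, then settles there. The depthAt() function stands in for a lookup into the LiDAR depth map and is an assumption for illustration, not TikTok’s implementation.

```typescript
// Illustrative sketch of depth-aware confetti. depthAt(u, v) is assumed to
// return scene depth in meters at a screen point, the kind of per-pixel
// data a LiDAR depth map provides. Not TikTok's actual implementation.
interface Particle {
  u: number;       // screen x, 0..1
  v: number;       // screen y, 0..1, increasing downward
  depth: number;   // particle's distance from the camera, meters
  settled: boolean;
}

declare function depthAt(u: number, v: number): number;

function step(p: Particle, fallPerFrame: number): void {
  if (p.settled) return;
  p.v += fallPerFrame; // gravity: drift down the screen
  // Once the real surface at this pixel is nearer than the particle,
  // the particle has hit something: pin it so it appears to rest there.
  if (depthAt(p.u, p.v) <= p.depth) {
    p.settled = true;
  }
}
```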

Have you tried out this TikTok effect or used your iPhone 12 Pro’s LiDAR scanner for something else? Let us know in the comments below.

Developers Can Now Test Spark AR Effects Virtually Using Oculus Quest

Developers of Spark AR effects can now test their creations in VR using an Oculus Quest.

The Spark AR Player is now available for Oculus Quest, allowing developers to test Spark AR effects on a virtual phone in a virtual environment. This lets developers see how their effects will look in places they might not have access to in real life. For example, a developer creating an effect for a public area or art gallery can test how the effect looks in action without physically being at the location.

Spark AR is used by developers to create augmented reality effects that are overlaid on a phone’s camera feed in real time in apps like Instagram and Facebook Messenger. The Spark AR Player for Quest provides a virtual phone or tablet within VR, on which developers can test effects without needing to use a physical phone.

Facebook says Spark AR Player for Quest works best when testing effects that track a target or track something along a plane. It does not support certain features, such as face tracking, face effects, hand tracking, and microphone and video features. However, you can test gestures such as tapping and pinching.
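
For a sense of what testing those gestures looks like in a script, here’s a minimal sketch using Spark AR’s scripting modules to react to taps and pinches. It assumes a scene object named 'plane0' and the Touch Gestures capability enabled in the project; treat it as a sketch under those assumptions rather than canonical Spark AR documentation.

```typescript
// Minimal Spark AR script sketch: log taps and scale a plane with pinch.
// Assumes a scene object named 'plane0' and Touch Gestures enabled.
const Scene = require('Scene');
const TouchGestures = require('TouchGestures');
const Diagnostics = require('Diagnostics');

(async function () {
  const plane = await Scene.root.findFirst('plane0');

  // Tap: write to the console so the gesture is visible in the Quest player.
  TouchGestures.onTap().subscribe(() => {
    Diagnostics.log('Tap registered');
  });

  // Pinch: multiply the plane's scale by the gesture's scale signal.
  TouchGestures.onPinch().subscribeWithSnapshot(
    { lastScale: plane.transform.scaleX },
    (gesture, snapshot) => {
      plane.transform.scaleX = gesture.scale.mul(snapshot.lastScale);
      plane.transform.scaleY = gesture.scale.mul(snapshot.lastScale);
    }
  );
})();
```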

The Spark AR Player for Quest should be available to download on the Spark AR site, but at the time of writing the link in the post may be temporarily broken.

After installing the Spark AR Player on your Quest, the headset will show up in the Spark AR Studio toolbar as an option under ‘Test on Device’, provided your computer and headset are on the same network.

You can read more about testing Spark AR effects on Oculus Quest here.

How Cyberpunk 2077 Depicts The Future Of AR And VR

Cyberpunk 2077 was one of last year’s biggest, and most controversial, game releases. Set almost 60 years in the future, its Night City is a metropolis dominated by technology, including interpretations of VR and AR technology that go far beyond current applications.

Warning: The following article will discuss lore and gameplay mechanics from Cyberpunk 2077, along with minor spoilers for a mini-game that takes place in one of the side quests involving River and very minor spoilers for one objective of a main story mission that comes early in the game. There are no spoilers for the main story’s narrative or plot. It will discuss how Night City depicts VR and AR technology, both in terms of gameplay and the world’s lore. 

Based on a roleplaying game from the 1980s, Cyberpunk 2077’s vision of the future isn’t exactly one that extrapolates from our world today. Instead, the game imagines an alternate timeline inspired by the ideas and predictions of the 1980s and the original Cyberpunk RPG, released in 1988. So while everything does look futuristic, it’s a kind of retro-futurism dripping in ’80s aesthetic. There’s no social media here, and everyone still calls each other on mobile phones.

That’s not to say that modern influences aren’t present — in many ways, Cyberpunk 2077 merges a vision of the future from the 1980s with snippets of modern life and its technology. In particular, developers CD Projekt Red have taken the concepts and current state of VR and AR in 2020 and extrapolated that out into their interpretation of what that technology might look like in 60 years’ time.

In some ways, these depictions are limited in scope — it’s hard, if not impossible, to accurately predict what the future of VR and AR will look like in 50 years’ time. Nonetheless, it’s interesting to look at how the game imagines we might use virtual and augmented reality in ways that go beyond our current scope and capabilities.

Braindances and VR in Night City

The most extreme version of this is Cyberpunk’s ‘braindance’ technology, frequently shortened to BD. A BD is an immersive form of virtual reality in which the user ‘jacks in’ (connects the cyberware in their body to a piece of external technology) to a BD headset. The headset flashes a series of lights into the user’s eyes, putting them into a coma-like state where they relive a sequence recorded from someone else’s thoughts and memories.

Cyberpunk 2077 mainly uses BDs as a way for V to gain information or advance the narrative, reliving key moments through BDs that then allow V to move towards the next objective with more information. However, in a lore sense, BDs are used mainly for entertainment purposes. According to the Cyberpunk wiki, companies sell BDs that feature extreme or dream scenarios, which involve someone first seeking out and participating in those experiences in real life so that they can be recorded and sold as a BD.

There’s a large entertainment market for BDs, including those of an erotic and pornographic nature. And of course, it’s not always black and white — in a world where people’s experiences and memories can be recorded and resold, there are always going to be grey areas. One main story mission early in the game requires you to obtain an illegal BD containing questionable content from a shady dealer in the city’s red light district.

Cyberware and AR Gaming

When it comes to AR, Cyberpunk 2077 depicts technology that leans a little closer to what we have now in real life. Google Glass or Magic Leap-like AR overlays are a part of everyday life, allowing people to take AR phone calls, run diagnostics on their cyberware, and much more. The game’s UI is effectively diegetic, as it can simply be considered part of an AR overlay being used by V at all times.

However, the big difference is that the AR overlays run through cyberware upgrades that are applied directly to the user’s eyes — there’s no headsets involved at all. The upgrades can be installed by a ripper doctor (or ripperdoc, for short) who can add any number of modifications to your body, for a price.

However, the game also features its own interpretation of the future of AR gaming. In a side mission with River, V is invited to play an augmented reality video game using an AR headset and a toy gun. The game is a competitive four-player AR shooter called Trouble in Heywood, and you can watch a full playthrough of the mini-game in the video embedded above.

As you can see in the video, the AR headset modifies the color and visuals of the world around V, while also putting him and the other players into police outfits. The two kids playing with V and River are even scaled up in height to match the adults.

Split into two teams, the players then walk around their environment and shoot pop-up targets that appear behind objects or scattered across buildings. The AR game is cut short before it properly ends, but the gist is that the team with the highest score wins. Unlike the braindance, this AR game is a much more simplistic (or perhaps more realistic) depiction of where AR technology could end up in 50 years’ time.


With a world as big as Night City, these examples are probably only just scratching the surface of VR and AR representations in the game.

Let us know what you thought about Cyberpunk 2077’s depiction of AR/VR in the comments below.