Cyberpunk + XR

Cyberpunk is a science fiction subgenre set in a dystopian future that tends to focus on a “combination of lowlife and high tech”, featuring futuristic technological and scientific achievements, such as artificial intelligence and cybernetics, juxtaposed with societal collapse, dystopia or decay (per Wikipedia). Cyberpunk also often features advanced demonstrations and uses of XR technologies: Augmented Reality, Virtual Reality and Mixed Reality.

In previous blog posts, we’ve mentioned KBZ, which has created lists of Augmented Reality and Virtual Reality films, Artificial Intelligence films, Hard Sci-Fi films, Multiverse films and Technology films (often also featuring AR, VR and MR). KBZ has two new articles that look at various Cyberpunk films – the Top 20 Best Cyberpunk Films and the Top Cyberpunk Films You Haven’t Seen.

The Best Cyberpunk Films article is worth a read if you’re looking for some of the best AR, VR and MR films to watch. There’s quite a bit of crossover between Cyberpunk and XR tech and the article lists some of the best films like Ready Player One, Blade Runner 2049, Total Recall, The Matrix and others.

What we found more interesting was the Top Cyberpunk Films You Haven’t Seen article, as it has some new and obscure AR & VR films we haven’t seen yet. There’s a recent film called Karmalink that has some advanced AR concepts, a film called Hardwired that features AR advertising and display concepts (via a brain implant) and Terminal Justice, which features old-school VR HMDs, a virtual reality crime scene and AR eye implants for infrared and night-time vision. There were also two additional films we would recommend every AR & VR fan check out: Virtual Nightmare and Natural City. Virtual Nightmare uses VR similar to The Matrix, and Natural City is similar to Blade Runner and shows many advanced AR concepts integrated into a futuristic society.

KBZ also has a short video that highlights some of these films (with Cyberpunk AR, VR and MR concepts) and you can watch the video below.

The post Cyberpunk + XR appeared first on Zugara.

AR, VR, AI, Multiverse & More

We’ve referenced the KBZ Films site before as they’ve posted some lists of lesser-known Augmented Reality & Virtual Reality films. Recently, KBZ posted an article on the Best AR & VR Films that lists some great films (some of which we’ve highlighted in the past on our blog here). The article’s Top 20 rankings include some great AR & VR films such as Avalon (VR), Auggie (AR), Brainstorm (VR), Sleep Dealer (MR) and Anon (AR). It’s worth checking the article out if you’re new to the XR field and want to see a list of the Best AR & VR films. KBZ also has a list of every imaginable film about AR, VR & MR that you can find here.

The KBZ site also has some other interesting articles to check out including a list of some great films about Artificial Intelligence (AI) and another article that highlights some obscure films about the Multiverse and alternate realities. The AI list of films is especially relevant as AI technology is also usually found in AR & VR films. The film Auggie comes to mind as the AR projection through eyewear is based on AI models of the person’s subconscious. Below are videos of some of the AI and Multiverse films referenced in the articles.

Finally, if you’re into Sci-Fi, there are also some other great articles on the site, including a list of the best time travel films and best time loop films. We’ve always found those films interesting, as more recent time travel films have also included aspects of XR technologies.

The post AR, VR, AI, Multiverse & More appeared first on Zugara.

Meta Shows New Progress on Key Tech for Making AR Genuinely Useful

Meta has introduced the Segment Anything Model, which aims to set a new bar for computer-vision-based ‘object segmentation’—the ability for computers to understand the difference between individual objects in an image or video. Segmentation will be key for making AR genuinely useful by enabling a comprehensive understanding of the world around the user.

Object segmentation is the process of identifying and separating objects in an image or video. With the help of AI, this process can be automated, making it possible to identify and isolate objects in real time. This technology will be critical for creating a more useful AR experience by giving the system an awareness of the various objects in the world around the user.
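Production systems learn this with trained neural networks, but the core operation (grouping pixels into distinct, separated objects) can be illustrated with a toy connected-component pass over a binary image. Everything below is invented for illustration and is not how SAM itself works:

```python
from collections import deque

def segment_objects(grid):
    """Label each connected region of 1s in a tiny binary image.

    Returns a dict mapping object label -> set of (row, col) pixels,
    a toy stand-in for what AI-based segmentation does at scale.
    """
    rows, cols = len(grid), len(grid[0])
    seen = set()
    objects = {}
    label = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and (r, c) not in seen:
                label += 1
                # Breadth-first flood fill collects one object's pixels.
                queue = deque([(r, c)])
                seen.add((r, c))
                pixels = set()
                while queue:
                    y, x = queue.popleft()
                    pixels.add((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                objects[label] = pixels
    return objects

image = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
]
print(len(segment_objects(image)))  # 2 separate "objects"
```

Two clusters of foreground pixels come back as two distinct objects; a real segmentation model performs the far harder version of this over natural images, where objects are defined by appearance rather than pixel value.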

The Challenge

Imagine, for instance, that you’re wearing a pair of AR glasses and you’d like to have two floating virtual monitors on the left and right of your real monitor. Unless you’re going to manually tell the system where your real monitor is, it must be able to understand what a monitor looks like so that when it sees your monitor it can place the virtual monitors accordingly.
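Once the system can recognize the monitor and report where it is, the placement step itself is simple geometry. A minimal sketch, assuming a detected bounding box in meters (the function name and coordinate convention are invented for this example):

```python
def place_virtual_monitors(real, gap=0.1):
    """Given the detected real monitor's bounding box (x, y, w, h)
    in meters, return positions for two same-size virtual monitors
    placed to its left and right with a small gap."""
    x, y, w, h = real
    left = (x - gap - w, y, w, h)
    right = (x + w + gap, y, w, h)
    return left, right

# A 0.6m-wide monitor at the origin gets virtual neighbors either side.
print(place_virtual_monitors((0.0, 0.0, 0.6, 0.35)))
```

The hard part, of course, is everything before this function is called: reliably producing that bounding box for any monitor, in any lighting, from any angle.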

But monitors come in all shapes, sizes, and colors. Sometimes reflections or occluded objects make it even harder for a computer-vision system to recognize them.

Having a fast and reliable segmentation system that can identify each object in the room around you (like your monitor) will be key to unlocking tons of AR use-cases so the tech can be genuinely useful.

Computer-vision-based object segmentation has been an ongoing area of research for many years now, but one of the key issues is that in order to help computers understand what they’re looking at, you need to train an AI model by giving it lots of images to learn from.

Such models can be quite effective at identifying the objects they were trained on, but they will struggle on objects they haven’t seen before. That means one of the biggest challenges for object segmentation is simply having a large enough set of images for the systems to learn from, but collecting those images and annotating them in a way that makes them useful for training is no small task.


Meta recently published work on a new project called the Segment Anything Model (SAM). It’s both a segmentation model and a massive set of training images the company is releasing for others to build upon.

The project aims to reduce the need for task-specific modeling expertise. SAM is a general segmentation model that can identify any object in any image or video, even for objects and image types that it didn’t see during training.

SAM supports both automatic and interactive segmentation, allowing it to identify individual objects in a scene with simple inputs from the user. SAM can be ‘prompted’ with clicks, boxes, and other prompts, giving users control over what the system is attempting to identify at any given moment.
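Meta’s released `segment_anything` Python package exposes this through a predictor that accepts point and box prompts alongside an image, though running it requires downloaded model weights. The toy below mimics only the interaction shape (click a pixel, get back a mask for that object) using a flood fill in place of a trained model; the function and tolerance are invented for illustration:

```python
def segment_from_prompt(grid, click, tolerance=10):
    """Return the mask of pixels connected to the clicked pixel whose
    intensity is within `tolerance` of it, a toy analogue of a
    point-prompted segmentation: one click in, one object mask out."""
    rows, cols = len(grid), len(grid[0])
    r0, c0 = click
    target = grid[r0][c0]
    mask = {(r0, c0)}
    stack = [(r0, c0)]
    while stack:
        y, x = stack.pop()
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if (0 <= ny < rows and 0 <= nx < cols
                    and (ny, nx) not in mask
                    and abs(grid[ny][nx] - target) <= tolerance):
                mask.add((ny, nx))
                stack.append((ny, nx))
    return mask

# A bright "object" (values ~200) on a dark background (~20).
image = [
    [20, 20, 200, 200],
    [20, 20, 200, 200],
    [20, 20, 20, 20],
]
print(len(segment_from_prompt(image, (0, 2))))  # 4 pixels in the mask
```

With eye-tracking on a headset, the “click” could simply be wherever the user is looking, which is exactly the kind of coupling the next example demonstrates.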

It’s easy to see how this point-based prompting could work great if coupled with eye-tracking on an AR headset. In fact, that’s exactly one of the use-cases Meta has demonstrated with the system:

Here’s another example of SAM being used on first-person video captured by Meta’s Project Aria glasses:

You can try SAM for yourself in your browser right now.

How SAM Knows So Much

Part of SAM’s impressive abilities comes from its training data, which contains 11 million images and over 1.1 billion identified object shapes. It’s far more comprehensive than contemporary datasets, according to Meta, giving SAM much more experience in the learning process and enabling it to segment a broad range of objects.

Image courtesy Meta

Meta calls the SAM dataset SA-1B, and the company is releasing the entire set for other researchers to build upon.

Meta hopes this work on promptable segmentation, and the release of this massive training dataset, will accelerate research into image and video understanding. The company expects the SAM model can be used as a component in larger systems, enabling versatile applications in areas like AR, content creation, scientific domains, and general AI systems.

More Augmented Reality & Virtual Reality Movies To Check Out

In a previous post, we shared with you a list of every Augmented Reality & Virtual Reality Movie from KBZ Film. KBZ Film has detailed lists of films and bills itself as “The Internet’s Largest Collection of Subgenre and Microgenre Film Lists”. After viewing their AR & VR Film List, we can attest to its completeness, as they have listed quite a few obscure AR & VR films – some of which we think every AR & VR fan should check out, including Sleep Dealer and Anon.

KBZ has recently launched a video for their Top Augmented Reality & Virtual Reality Films You Haven’t Seen, featuring a few clips of the films from their list. It’s worth watching the video (embedded above) or reading their article.


The post More Augmented Reality & Virtual Reality Movies To Check Out appeared first on Zugara.

Steam Comes to Nreal’s AR Glasses, AR Hackathon Announced

One company at the forefront of augmented reality (AR) glasses is China-based Nreal, which released the Nreal Light followed by the Nreal Air. Outside of its traditional market in Asia, Nreal’s devices have only started to see global availability in the last year, and the company is increasing its content efforts accordingly. It’s doing so in a couple of ways: one with Steam compatibility and the other via its first hackathon event.

Nreal AR Cloud Gaming

Unlike AR smartglasses with features like 6DoF tracking, Nreal’s AR glasses let users connect their smartphones to watch movies or play videogames on giant virtual screens. Hence the company’s push towards native AR cloud gaming experiences with the release of “Steam on Nreal”. So yes, that does mean you can now stream Steam games from your PC onto a huge 130-inch HD virtual display.

Nreal does note that “Steam on Nreal” is a beta release that requires a bit of setup effort without going into specifics. The software isn’t yet optimized for all Steam games but gamers can enjoy titles like DiRT Rally and the Halo series. As an additional benefit, Nreal Light and Air users can already utilise Xbox Cloud Gaming via a browser inside Nebula, Nreal’s 3D system.

“We are excited to be the first to bring Steam into AR,” said Peng Jin, co-founder of Nreal in a statement. “The beta release is meant to give people a glimpse into what is possible. After all, AAA games should be played on a 200″ HD screen and they should be played free of location restrictions.”

Nreal Air

As for the AR Jam, this will be Nreal’s first augmented reality hackathon, an online international contest with more than $100,000 USD in cash prizes to be won by creators. Kicking off on 27th June 2022, the AR Jam is looking for developers to compete in the at-home fitness, art, games, video (highlighting Nebula’s multi-screen functionality) and Port (converting existing apps into AR) categories. There will also be three bonus categories should participants wish to enter: Multiplayer/Social/Networks, NFT Galleries, and Students.

“We’ve always been focused on creating consumer-ready AR experiences with groundbreaking tech, to redefine the way we interact with information and content in our everyday lives. With the AR Jam and content fund, Nreal is demonstrating its commitment to supporting pioneering developers and their AR passion projects,” Jin added.

Category winners will receive $10k, whilst those in second and third places will receive small cash prizes. Honourable mentions will get their very own Nreal Light Dev kit. The AR Jam will run until 27th July 2022.

For continued updates on Nreal and the AR market, keep reading gmw3.

Microsoft HoloLens Boss Alex Kipman Leaving Due to Misconduct Allegations

As the co-creator of HoloLens and the chief of Microsoft’s mixed reality division, Alex Kipman has been the face of the company’s immersive efforts for several years now. That’s all coming to an end, with reports stating that Kipman will be leaving Microsoft after allegations of verbal abuse and sexual harassment surfaced.

Microsoft HoloLens 2

Insider reported the allegations back in May, and it was the same site this week that first reported on Kipman resigning his position. While Microsoft has yet to officially comment on the report, Geekwire obtained an email from Scott Guthrie, the head of Microsoft’s Cloud & AI Group, announcing a restructuring of the HoloLens group.

The hardware and software teams will be split between the Windows + Devices organisation and the Experiences + Devices division respectively. This hasn’t been an overnight decision, it seems, with Guthrie stating in the email: “Over the last several months, Alex Kipman and I have been talking about the team’s path going forward. We have mutually decided that this is the right time for him to leave the company to pursue other opportunities.” Kipman won’t be leaving right away; he’ll help with the team’s transition over the next couple of months before departing Microsoft.

What this will mean for HoloLens is unclear, as Kipman was by far HoloLens’ (and mixed reality’s) most ardent supporter within Microsoft. The news comes at a turbulent time for the device as the US Army decides whether to continue with HoloLens development – called IVAS – for its soldiers, with reports suggesting that the 10-year, $21.9 billion USD contract might be delayed or reduced in size.

Alex Kipman and John Hanke at Microsoft Ignite

A Brazilian engineer, Kipman joined Microsoft in 2001 and worked within the Windows and Xbox teams – he helped create the Xbox Kinect sensor – before heading up the mixed reality division. Insider’s report last month saw dozens of staff detail his alleged behaviour to the publication. These included one employee saying Kipman watched what was essentially VR porn in front of others whilst another spoke of an incident where he kept massaging a female employee’s shoulders even though she was trying to shrug him off.

It was this pattern of continual inappropriate behaviour and unwanted touching that created an atmosphere where managers reportedly told staff women shouldn’t be left alone with him.

At the beginning of the year, the Wall Street Journal reported that more than 70 staff from the HoloLens team left Microsoft in 2021, with 40 of those joining Meta.

For continued HoloLens updates, keep reading gmw3.

Lynx R1’s Summer Launch is a “Moving Target” Due to Component Sourcing

There are a bunch of virtual reality (VR) headsets due for release later this year, with the Lynx R1 expected to arrive first for early backers. When that’ll happen, though, remains fluid, with Lynx founder Stan Larroque recently confirming the launch is a “moving target” due to external factors. On the plus side, he revealed new details regarding the headset’s controllers.


Larroque holds a live, unscripted update/Q&A stream on YouTube each month where he’s open and honest about the headset’s progress and the issues the team is currently facing. It makes for a far different approach than most other VR hardware manufacturers take, but it also highlights the problems a smaller startup can face. That includes having to adjust a launch window which had been pencilled in for June/July.

When asked about the shipping date during the Q&A portion, he said: “It’s a moving target. As I told you, sometime during the summer. I know that’s not a good answer but that’s all I can tell you. We still need answers for some of the components. We’ve secured all of the components on the main board which was a pain but we still have some things to figure between Taiwan and China; which is a complicated matter.”

So for the time being, backers will still need to be patient for one of the most interesting mixed reality (MR) headsets coming to market.


As for the controller news, Larroque said he had “very good news”. In collaboration with Finch, the Lynx R1 will get optically tracked controllers much like the Meta Quest’s, with a ring the headset can see. This will mean the Lynx R1 will be able to support a far wider array of VR games on platforms like Steam. Out of the box, the headset will still be focused on hand tracking as the primary input method, as there’s no release timing or pricing for the controllers at the moment.

Lastly, there’s been a bit of confusion around the previously announced SideQuest integration. In the stream Larroque mentions a cancelled contract without mentioning specifics but he followed this up with a statement via Twitter, clarifying that work was still ongoing.

Watch the full Lynx update for May below, and when further details arise, gmw3 will let you know.

Innovations in AR: Retail

With the global AR, VR and MR market worth $28bn in 2021 (and projected to top $250bn by 2028), it’s little wonder that companies want to hop onto the XR bandwagon. In the retail industry, the augmented reality (AR) subsector is proving particularly enticing, with retail having been one of the boldest industries adopting AR technology, particularly over the past decade. That’s been aided by AR going mainstream thanks to the advent of smartphones packed with all the sensors and capabilities necessary for advanced experiences, resulting in 810 million active mobile AR users in 2021 (up from 440 million in 2019).

That rapid increase can also partly be attributed to the COVID-19 pandemic, which has resulted in a huge shift to online shopping and e-commerce – adding $219bn to US e-commerce sales in 2020-2021. Of course, even before COVID-19, the ratio of internet sales to total sales was trending steadily upwards, but as the pandemic itself has abated, digital shoppers have remained. And as customers have moved online, they have become increasingly ready to embrace digital technologies such as AR. 

AR and the Customer Experience

Seizing on that appetite, retail brands have created a wide range of AR experiences to entice customers. Sportswear brand Nike, for instance, has built AR functionality into its app in order to properly measure shoe size. The app makes use of a smartphone camera and simply requires the user to point their phone at their feet. The app also allows customers to share their saved shoe size with Nike stores via a QR code – helping to ensure a perfectly fitting shoe.

Nike FIT Digital Foot Measurement Tool

Part of the attraction for retailers is the way the technology can build excitement and deliver unusual and buzzy customer experiences. Retail stores themselves can build in AR functionality, taking advantage of their physical space to offer more complex possibilities. Consider magic mirrors, for instance: screens which capture live views of shoppers and overlay products onto their person. AR displays can also be placed on a storefront to draw viewers inside. Timberland took exactly this approach, utilising Microsoft Kinect technology to produce a virtual fitting room at the front of a store. Shoppers could stand in front of a screen and see a virtual representation of themselves wearing Timberland clothes – all before they’d even set foot inside.

For brands without the capabilities to build these AR experiences themselves, agencies have sprung up to help retailers make the most of the technology. Rather than create their own AR apps, brands can also benefit from tie-ins with some of the biggest AR-enabled apps, with the likes of TikTok, Instagram and Snapchat all offering extensive filter options. That removes much of the legwork from getting started with AR, which is why there are so many examples, whether it’s Porsche, Coca-Cola, or Starbucks.

The branded filter approach has proven effective for marketing brands, as with over-the-counter cold and flu medicine Mucinex, which created a TikTok filter that resulted in a 42.7% increase in purchase intent.

Aside from including AR in their marketing endeavours, some retail companies have even delved into creating full-fledged AR products. Consumer product manufacturer Bic has released an app and accompanying drawing book known as DrawyBook which lets children bring their illustrations to life via an AR scan.

The Virtual Try-On

Perhaps the most popular use-case for retail AR, however, is the virtual try-on. Most of the industry’s biggest brands offer some form of the technology, which allows prospective buyers to see how a product would look on them without needing to physically try it on. Typically, such AR experiences make use of the ubiquitous phone camera to display the virtual elements in real-time. Prominent virtual try-on examples include make-up from Maybelline, clothing from ASOS and Zeekit, and shoes from Vyking.

Try-ons needn’t be limited to clothing. One good example is the IKEA Place app, which allows users to place 3D models of the company’s furniture into their own rooms to preview how they would look, automatically scaling them based on the room’s dimensions to ensure they are true to life. In the US, Home Depot has taken a similar approach, aimed at improving the experience for mobile shoppers, who make up more than two-thirds of online traffic. Home Depot said in 2020 that customers who engaged with its app’s AR features were two to three times more likely to convert.

Virtual try-ons have added benefits for retailers. It is estimated that returns cost retailers in the UK £60bn every year. If people can have a better idea of what they’re ordering before it is sent out, there’s every chance of bringing that number down – helping retailers and also the planet, as items don’t need to be sent back the other way after being delivered. Customers might be nudged into trying items virtually thanks to retailers increasingly moving away from free returns.

Room to Grow

Despite the plethora of AR options on offer, consumer interest in retail AR is still relatively low. In October 2021, a survey found that only 13% of US adults had ever used AR or virtual reality (VR) to shop. Admittedly, that was up 5% on the year before, and 37% of those questioned did say they were at least somewhat interested in using AR or VR to shop. That means that 50% of US adults have either used or are interested in using AR while shopping.

According to the Impact of Augmented Reality on Retail report, of those making use of AR, 77% use the technology to visualise differences in products, such as alternative colours and styles. Meanwhile, 72% of shoppers who used AR in their shopping journey said it resulted in them buying.

AR also has a burgeoning role when it comes to navigation and directing customers around retail stores more effectively. In the US, home improvement store Lowe’s has developed an app which overlays directions onto a smartphone’s view of the store, for instance, helping customers to more quickly find what they are looking for.


In the retail sector, AR finds a distinct niche, serving to enable new and innovative customer experiences in the never-ending battle to attract potential buyers. Retailers have already become very canny with making the most of AR opportunities using customers’ smartphones – the next frontier will see better use of physical stores themselves to deliver more complex and compelling AR experiences.

Niantic Launches Visual Positioning System For ‘Global Scale’ AR Experiences

Niantic‘s new Lightship Visual Positioning System (VPS) will facilitate interactions with ‘global scale’ persistent and synced AR content on mobile devices.

Niantic launched Lightship during its developer conference this week and you can see some footage in the video embedded above showing some phone-based AR apps using its new features starting from the 50:20 mark. The system is essentially a new type of map that developers can use for AR experiences, with the aim of providing location-based persistent content that’s synced up for all users.

Niantic is building the map from scanned visual data, which the company says will offer “centimeter-level” accuracy when pinpointing the location and orientation of users (or multiple users, in relation to each other) at a given location. The technology is similar to large-scale visual positioning systems in active development at Google and Snap.

While the promise of the system is to work globally, it’s not quite there just yet — as of launch yesterday, Niantic’s VPS system has around 30,000 public locations where VPS is available for developers to hook into. These locations are mainly spread across six key cities — San Francisco, London, Tokyo, Los Angeles, New York City and Seattle — and include “parks, paths, landmarks, local businesses and more.”
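VPS itself localizes the user by matching camera imagery against the scanned map, but once a precise position is known, pinning persistent content to places becomes coordinate bookkeeping. A minimal sketch of that last step (the anchors, ranges, and function names are made up for illustration; a real system works with full six-degree-of-freedom poses, not just latitude/longitude):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def anchor_offset_m(user, anchor):
    """Approximate east/north offset in meters from user to anchor,
    using a local equirectangular projection (fine at city scale)."""
    lat_u, lon_u = map(math.radians, user)
    lat_a, lon_a = map(math.radians, anchor)
    east = (lon_a - lon_u) * math.cos((lat_u + lat_a) / 2) * EARTH_RADIUS_M
    north = (lat_a - lat_u) * EARTH_RADIUS_M
    return east, north

def visible_anchors(user, anchors, max_range_m=50.0):
    """Return the names of anchors close enough to the user to render."""
    out = []
    for name, pos in anchors:
        east, north = anchor_offset_m(user, pos)
        if math.hypot(east, north) <= max_range_m:
            out.append(name)
    return out

user = (37.7749, -122.4194)             # in San Francisco
anchors = [
    ("mural", (37.77493, -122.41945)),  # a few meters away
    ("statue", (37.8044, -122.2712)),   # Oakland, far out of range
]
print(visible_anchors(user, anchors))  # ['mural']
```

Because every user resolves the same anchor coordinates against the same map, content placed this way stays put and appears in the same spot for everyone, which is the “persistent and synced” promise of the system.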

To expand the map, Niantic developed the Wayfarer app which allows developers to scan in new locations using their phones, available now in public beta. Niantic has also launched a surveyor program in the aforementioned six key launch cities to expedite the process.

“With only a single image frame from the end user’s camera, Lightship VPS swiftly and accurately determines a user’s precise, six-dimensional location,” according to a Niantic blog post.

Scaling VPS to a global level is a lofty goal for Niantic, but accurate maps that pin content to real-world locations could unlock far more interesting mobile AR experiences.

You can read more about Lightship VPS over on the Niantic blog.