Hands-on: Mojo Vision’s Smart Contact Lens is Further Along Than You Might Think

Until recently I hadn't had a chance to see Mojo Vision's latest smart contact lens for myself, and I'll admit I expected the company was still years away from a working lens with more than a simple notification light or a handful of static pixels. Looking through the company's latest prototype, I was impressed to find it far more capable than I had anticipated.

When I walked into Mojo Vision's demo suite at AWE 2022 last month, I was handed a hard contact lens that I assumed was a mockup of the tech the company hoped to eventually shrink down and fit into a lens. But no… the company said this was a functional prototype, and everything inside was real, working hardware.

Image courtesy Mojo Vision

The company tells me this latest prototype includes the "world's smallest" MicroLED display—a minuscule 0.48mm across, with just 1.8 microns between pixels—an ARM processor, 5GHz radio, IMU (with accelerometer, gyro, and magnetometer), "medical-grade micro-batteries," and a power management circuit with wireless recharging components.

And while the Mojo Vision smart contact lens is still much thicker than a typical contact lens, last week the company demonstrated that this prototype can work in an actual human eye, with Mojo Vision CEO Drew Perkins as the guinea pig.

Image courtesy Mojo Vision

While this looks, well… fairly creepy when actually worn in the eye, the company tells me that, in addition to making the lens thinner, it will cover the electronics with a cosmetic iris so the lens looks more natural in the future.

At AWE I wasn’t able to put the contact lens in my own eye (Covid be damned). Instead the company had the lens attached to a tethered stick which I held up to my eye to peer through.

Photo by Road to VR

When I did, I was surprised to see not just a handful of pixels but a full-blown graphical user interface with readable text and interface elements. It's all monochrome green for now (taking advantage of the human eye's greater sensitivity to green than to any other color), but the demo clearly shows that Mojo Vision's ambitions are more than just a pipe dream.

Although the physical display in the lens is opaque and sits directly in the middle of your eye, you can't actually see it; it's simply too small and too close. But you can see the image it projects.

Photo by Road to VR

Compared to every HMD that exists today, Mojo Vision's smart contact lens is particularly interesting because it moves with your eye. That means the display itself—despite having a very small 15° field-of-view—follows your gaze as you look around. And it's always sharp no matter where you look, because it always sits over your fovea (the central part of the retina that sees the most detail). In essence, it's like having 'built-in' foveated rendering. A limited FoV remains a bottleneck for many use-cases, but having the display actually move with your eye alleviates the limitation at least somewhat.

But what about input? Mojo Vision has also been steadily at work figuring out how users will interact with the device. Since I wasn't able to put the lens into my own eye, the company instead put me in a VR headset with eye-tracking to emulate what it would be like to use the smart contact lens itself. Inside the headset I saw roughly the same interface I had seen through the demo contact lens, but now I could interact with it using my eyes.

The current implementation doesn’t constrain the entire interface to the small field-of-view. Instead, your gaze acts as a sort of ‘spotlight’ which reveals a larger interface as you move your eyes around. You can interact with parts of the interface by hovering your gaze on a button to do things like show the current weather or recent text messages.
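Mojo hasn't detailed exactly how its gaze activation works, but the standard way to build this kind of hover-to-select interaction is a dwell timer: a button fires once the gaze point has rested inside it for long enough. Below is a minimal sketch of that idea in Swift; the types, the 600ms threshold, and all names are hypothetical illustrations, not Mojo's implementation.

```swift
import Foundation
import CoreGraphics

/// Dwell-based gaze selection: a button activates when the user's gaze
/// stays inside its bounds for a set duration. Hypothetical sketch.
struct GazeButton {
    let id: String
    let bounds: CGRect            // region in the eye-fixed interface
    var dwellStart: TimeInterval? // when gaze first entered, if it's inside
}

final class DwellSelector {
    var buttons: [GazeButton]
    let dwellThreshold: TimeInterval = 0.6   // assumed 600 ms dwell time

    init(buttons: [GazeButton]) { self.buttons = buttons }

    /// Call every frame with the latest gaze point from eye-tracking.
    func update(gaze: CGPoint, now: TimeInterval, onActivate: (String) -> Void) {
        for i in buttons.indices {
            if buttons[i].bounds.contains(gaze) {
                let start = buttons[i].dwellStart ?? now
                buttons[i].dwellStart = start
                if now - start >= dwellThreshold {
                    onActivate(buttons[i].id)   // e.g. reveal the weather panel
                    buttons[i].dwellStart = nil // reset so it doesn't re-fire
                }
            } else {
                buttons[i].dwellStart = nil     // gaze left; cancel the dwell
            }
        }
    }
}
```

The reset-on-exit is the important detail: it keeps quick saccades across the interface from triggering buttons by accident.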

It's an interesting and hands-free approach to an HMD interface, though in my experience the eyes themselves are not a great conscious input device, because most of our eye movements are controlled subconsciously. With enough practice it's possible that deliberately controlling your gaze for input will become as simple and seamless as using a finger on a touchscreen; ultimately another form of input might be better, but that remains to be seen.

This interface and input approach is, of course, entirely dependent on high-quality eye-tracking. Since I didn't get to put the lens on for myself, I have no indication of whether Mojo Vision's eye-tracking is up to the task, but the company claims it is an "order of magnitude more precise than today's leading [XR] optical eye-tracking systems."

In theory it should work as well as they claim—after all, what’s a better way to measure the movement of your eyes than with something that’s physically attached to them? In practice, the device’s IMU is presumably just as susceptible to drift as any other, which could be problematic. There’s also the matter of extrapolating and separating the movement of the user’s head from sensor data that’s coming from an eye-mounted device.
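To make that concrete: one plausible way to separate the two is to difference the lens IMU's orientation against a reference orientation for the head (from glasses, an earpiece, or a head-motion model). Here's a sketch of that idea in Swift using simd quaternions — an assumption about how such fusion could work, not Mojo's actual method.

```swift
import simd

/// Conceptual sketch: recover eye-in-head rotation by removing the head's
/// orientation from the lens IMU's world orientation:
///     q_eyeInHead = inverse(q_head) * q_lens
/// Both inputs would drift over time, which is exactly the concern above.
func eyeInHeadRotation(lensOrientation qLens: simd_quatf,
                       headOrientation qHead: simd_quatf) -> simd_quatf {
    simd_normalize(qHead.inverse * qLens)
}
```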

Image courtesy Mojo Vision

If the company's eye-tracking is as precise (and accurate) as claimed, it would be a major win, because it could enable the device to function as a genuine AR contact lens capable of immersive experiences, rather than just a smart contact lens for basic informational display. Mojo Vision does say it expects its contact lens to eventually handle immersive AR, including stereoscopic rendering with one lens in each eye. Even so, AR won't be properly viable on the device until a larger field-of-view is achieved, but it's an exciting possibility.

So what’s the road map for actually getting this thing to market? Mojo Vision says it fully expects FDA approval will be necessary before they can sell it to anyone, which means even once everything is functional from a tech and feature standpoint, they’ll need to run clinical trials. As for when that might all be complete, the company told me “not in a year, but certainly [sooner than] five years.”

Apple Reveals Improvements Coming in ARKit 6 for Developers

Earlier this month during Apple’s annual developer conference, WWDC 2022, the company gave developers the first look at improvements coming to Apple’s ARKit 6 toolkit for building AR apps on iOS devices.

Though Apple has yet to reveal (or even confirm) the existence of an AR headset, the clearest indication the company is serious about AR is ARKit itself, which Apple has been steadily advancing since its debut in 2017.

At WWDC 2022 Apple revealed the latest version, ARKit 6, which is bringing improvements to core capabilities so developers can build better AR apps for iPhones and iPads (and eventually headsets… probably).

Image courtesy Apple

During the 'Discover ARKit 6' developer session at WWDC 2022, Apple ARKit engineer Christian Lipski gave an overview of what's next.

Better Motion Capture

ARKit includes a MotionCapture function which tracks people in the video frame, giving developers a 'skeleton' that estimates the position of the person's head and limbs. This lets developers create apps which overlay augmented content onto a person or move content relative to them (it can also be used for occlusion, placing augmented content behind someone to more realistically embed it into the scene).
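For context, using the existing API looks roughly like this: run a body-tracking session and read joint transforms from the ARBodyAnchor that MotionCapture produces. A minimal sketch (rendering and error handling omitted):

```swift
import ARKit

// Run ARKit body tracking and read the head joint from each ARBodyAnchor.
final class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func start(_ session: ARSession) {
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Joint transforms are relative to the skeleton's root (hip) joint.
            if let head = body.skeleton.modelTransform(for: .head) {
                let headInWorld = body.transform * head
                _ = headInWorld   // place content (hat, glasses, occluder) here
            }
        }
    }
}
```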

In ARKit 6, Lipski says the function is getting a "whole suite of updates," including improved tracking of 2D skeletons, which now estimate the location of the subject's left and right ears (surely useful for face filters, virtual glasses try-ons, and similar features involving the head).

Image courtesy Apple
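Reading those new ear landmarks should look something like the sketch below. Note the joint identifier strings are my assumption — dump skeleton.definition.jointNames at runtime to see what ARKit 6 actually calls them.

```swift
import ARKit
import simd

// Sketch: reading the new ear landmarks from ARKit's 2D skeleton.
// The joint name strings below are assumptions, not confirmed identifiers.
func earLandmarks(in frame: ARFrame) -> (left: simd_float2?, right: simd_float2?) {
    guard let skeleton = frame.detectedBody?.skeleton else { return (nil, nil) }

    func landmark(named name: String) -> simd_float2? {
        guard let index = skeleton.definition.jointNames.firstIndex(of: name),
              skeleton.isJointTracked(index) else { return nil }
        return skeleton.jointLandmarks[index]   // normalized image-space coordinates
    }
    return (landmark(named: "left_ear_joint"), landmark(named: "right_ear_joint"))
}
```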

As for 3D skeletons, which give a pose estimate with depth, Apple is promising better tracking with less jitter, more temporal consistency, and more robustness when the subject is occluded by the edge of the camera frame or by other objects (though some of these enhancements are only available on iPhone 12 and up).

Camera Access Improvements

Image courtesy Apple

ARKit 6 gives developers much more control over the device’s camera while it’s being used with an AR app for tracking.

Developers can now access incoming frames in real time at up to 4K and 30 FPS on the iPhone 11 and up and the latest iPad Pro (M1). The prior mode, which uses a lower resolution but a higher framerate (60 FPS), is still available. Lipski says developers should carefully consider which mode to use: the 4K mode might be better for apps focused on previewing or recording video (like a virtual production app), while the lower-resolution 60 FPS mode might be better for apps that benefit from responsiveness, like games.
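Per the session, opting into 4K is a small configuration change. A minimal sketch (iOS 16 API; falling back to the default format on unsupported devices):

```swift
import ARKit

// Prefer the new 4K video format where the hardware supports it (ARKit 6 / iOS 16),
// otherwise keep the default lower-resolution 60 FPS format.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if let fourK = ARWorldTrackingConfiguration.recommendedVideoFormatFor4KResolution {
        config.videoFormat = fourK   // 3840x2160 at 30 FPS on supported devices
    }
    return config
}
```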

Alongside the higher video resolution, developers can now take full-resolution photos even while an AR app is actively using the camera. That means they can pluck out a 12MP image (on an iPhone 13, anyway) to be saved or used elsewhere. This could be great for an AR app where capturing photos is part of the experience; for instance, Lipski suggests an app that guides users through photographing an object which is later converted into a 3D model with photogrammetry.
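The capture itself is a single asynchronous call on the running session; a minimal sketch:

```swift
import ARKit

// Grab a full-resolution still (ARKit 6) without interrupting the AR session,
// e.g. to feed a photogrammetry pipeline later.
func captureStill(from session: ARSession) {
    session.captureHighResolutionFrame { frame, error in
        guard let frame = frame else {
            print("Capture failed:", error?.localizedDescription ?? "unknown error")
            return
        }
        let stillImage = frame.capturedImage   // full-sensor CVPixelBuffer
        _ = stillImage                         // save or hand off for processing
    }
}
```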

ARKit 6 also gives developers finer control over the camera while an AR app is using it. They can adjust settings like white balance, brightness, and focus as needed, and can read EXIF data from every incoming frame.
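In code, that means ARKit hands you the underlying AVCaptureDevice for direct adjustment. A sketch of locking focus and reading an EXIF field (the "ExposureTime" key is the standard EXIF name, assumed here rather than taken from Apple's session):

```swift
import ARKit
import AVFoundation

// Lock focus on the camera ARKit is using, mid-session (ARKit 6 / iOS 16).
func lockFocus() {
    guard let device = ARWorldTrackingConfiguration.configurableCaptureDeviceForPrimaryCamera else { return }
    do {
        try device.lockForConfiguration()
        if device.isFocusModeSupported(.locked) {
            device.focusMode = .locked   // e.g. stop refocusing during a scan
        }
        device.unlockForConfiguration()
    } catch {
        print("Camera configuration failed:", error)
    }
}

// Every ARFrame now carries EXIF metadata alongside the pixels.
func logExposure(of frame: ARFrame) {
    if let exposure = frame.exifData["ExposureTime"] {
        print("Exposure time:", exposure)
    }
}
```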

More Location Anchor… Locations

Image courtesy Apple

ARKit includes LocationAnchors which can provide street-level tracking for AR in select cities (for instance, to do augmented reality turn-by-turn directions). Apple is expanding this functionality to more cities, now including Vancouver, Toronto, and Montreal in Canada; Fukuoka, Hiroshima, Osaka, Kyoto, Nagoya, Yokohama, and Tokyo in Japan; and Singapore.

Later this year the function will further expand to Auckland, New Zealand; Tel Aviv-Yafo, Israel; and Paris, France.
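Hooking into a supported city is a matter of checking geo-tracking availability and then dropping an ARGeoAnchor at a coordinate. A minimal sketch; the coordinate below (San Francisco's Ferry Building) is purely illustrative:

```swift
import ARKit
import CoreLocation
import Foundation

// Start geo tracking and place a location anchor, if the area is supported.
func placeGeoAnchor(in session: ARSession) {
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        guard available else { return }   // only works in supported cities
        DispatchQueue.main.async {
            session.run(ARGeoTrackingConfiguration())
            let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3936)
            session.add(anchor: ARGeoAnchor(coordinate: coordinate))
        }
    }
}
```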

Plane Anchors

Plane Anchors are a tool for tracking flat objects like tables, floors, and walls during an AR session. Prior to ARKit 6, the origin of a Plane Anchor would be updated as more of the plane was discovered (for instance, moving the device to reveal more of a table than the camera saw previously). This could make it difficult to keep augmented objects locked in place on a plane if the origin was rotated after first being placed. With ARKit 6, the origin’s rotation remains static no matter how the shape of the plane might change during the session.
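From the developer side the change is transparent: you still observe plane anchors through the session delegate, and the improvement is simply that the rotation of each anchor's transform now stays put as the plane's extent grows. A minimal sketch of watching those updates:

```swift
import ARKit

// Observe plane-anchor updates. In ARKit 6 the anchor's rotation stays fixed
// as the plane's extent grows, so content parented to it no longer swivels.
final class PlaneObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("center:", plane.center)        // may shift as the plane grows
            print("transform:", plane.transform)  // rotation now remains stable
        }
    }
}
```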

– – — – –

ARKit 6 will launch with the iOS 16 update, which is available now in beta for developers and is expected to be released to the public this fall.


Nreal Air review: new augmented reality specs put a big screen in your view

AR smart glasses with displays now widely available in UK but must be connected to a smartphone to work

The first widely available augmented reality glasses have hit the UK high street, putting TV shows, movies and games on a big virtual screen just in front of your eyes. But while the Nreal Air are the first of their type on the shelves, they are limited in what consumers can do with them.

Many firms have tried to make AR glasses the next generation of technology, not least Google with its ill-fated Glass back in 2013. Snapchat and Facebook have made attempts too, both sporting cameras for recording others, but so far there have been no consumer glasses with displays for the wearer to view. Until now.


Steam Comes to Nreal’s AR Glasses, AR Hackathon Announced

One company at the forefront of augmented reality (AR) glasses is China-based Nreal, which released the Nreal Light followed by the Nreal Air. Outside its traditional market in Asia, Nreal's devices have only begun to see global availability in the past year, and the company is ramping up its content efforts to match. It's doing so in a couple of ways: Steam compatibility and its first hackathon event.

Nreal AR Cloud Gaming

Unlike AR smartglasses built around features like 6DoF tracking, Nreal's AR glasses let users connect their smartphones to watch movies or play videogames on giant virtual screens. Hence the company's push toward native AR cloud gaming experiences with the release of "Steam on Nreal". So yes, that does mean you can now stream Steam games from your PC onto a huge 130-inch HD virtual display.

Nreal does note that "Steam on Nreal" is a beta release requiring a bit of setup effort, though it doesn't go into specifics. The software isn't yet optimized for all Steam games, but gamers can enjoy titles like DiRT Rally and the Halo series. As an additional benefit, Nreal Light and Air users can already utilise Xbox Cloud Gaming via a browser inside Nebula, Nreal's 3D system.

“We are excited to be the first to bring Steam into AR,” said Peng Jin, co-founder of Nreal in a statement. “The beta release is meant to give people a glimpse into what is possible. After all, AAA games should be played on a 200″ HD screen and they should be played free of location restrictions.”

Nreal Air

As for the AR Jam, this will be Nreal's first augmented reality hackathon, an online international contest with more than $100,000 USD in cash prizes to be won by creators. Kicking off on 27th June 2022, the AR Jam is looking for developers to compete in five categories: at-home fitness, art, games, video (highlighting Nebula's multi-screen functionality), and Port (converting existing apps into AR). There will also be three bonus categories should participants wish to enter: Multiplayer/Social/Networks, NFT Galleries, and Students.

“We’ve always been focused on creating consumer-ready AR experiences with groundbreaking tech, to redefine the way we interact with information and content in our everyday lives. With the AR Jam and content fund, Nreal is demonstrating its commitment to supporting pioneering developers and their AR passion projects,” Jin added.

Category winners will receive $10k, whilst those in second and third place will receive smaller cash prizes. Honourable mentions will get their very own Nreal Light dev kit. The AR Jam will run until 27th July 2022.

For continued updates on Nreal and the AR market, keep reading gmw3.

Microsoft HoloLens Boss Alex Kipman Leaving Due to Misconduct Allegations

As the co-creator of HoloLens and chief of Microsoft's mixed reality division, Alex Kipman has been the face of the company's immersive efforts for several years. That's now coming to an end, with reports stating that Kipman will leave Microsoft after allegations of verbal abuse and sexual harassment surfaced.

Microsoft HoloLens 2

Insider reported the allegations back in May, and the same site this week was first to report Kipman's resignation. While Microsoft has yet to officially comment, GeekWire obtained an email from Scott Guthrie, head of Microsoft's Cloud + AI Group, announcing a restructuring of the HoloLens group.

The hardware and software teams will be split between the Windows + Devices organisation and the Experiences + Devices division respectively. This doesn't appear to have been an overnight decision, with Guthrie stating in the email: "Over the last several months, Alex Kipman and I have been talking about the team's path going forward. We have mutually decided that this is the right time for him to leave the company to pursue other opportunities." Kipman won't be leaving right away; he'll help with the teams' transitions over the next couple of months before departing Microsoft.

What this means for HoloLens is unclear, as Kipman was by far its (and mixed reality's) most ardent supporter within Microsoft. The news comes at a turbulent time for the device, as the US Army decides whether to continue with its HoloLens-based IVAS program for soldiers, with reports suggesting the 10-year, $21.9 billion USD contract might be delayed or reduced in size.

Alex Kipman and John Hanke at Microsoft Ignite

A Brazilian engineer, Kipman joined Microsoft in 2001 and worked within the Windows and Xbox teams – he helped create the Xbox Kinect sensor – before heading up the mixed reality division. In Insider's report last month, dozens of staff detailed his alleged behaviour to the publication, including one employee who said Kipman watched what was essentially VR porn in front of others, and another who described an incident in which he kept massaging a female employee's shoulders even as she tried to shrug him off.

It was this pattern of continual inappropriate behaviour and unwanted touching that reportedly created an atmosphere in which managers told staff that women shouldn't be left alone with him.

At the beginning of the year, the Wall Street Journal reported that more than 70 staff from the HoloLens team left Microsoft in 2021, with 40 of those joining Meta.

For continued HoloLens updates, keep reading gmw3.

Augmented eyes on Apple at developer conference

New computers, iPad overhaul and expanded Messages app on the cards, with AR glasses a possibility

Apple is set to reveal details of the software updates coming to its phones, tablets and computers at the company's annual worldwide developers' conference (WWDC).

But while new computers, an expanded Messages app, and an overhaul of the iPad’s software to make it more like a laptop are all on the cards, the biggest question mark on Monday is whether Apple will show any evidence of its forthcoming augmented reality – or AR – glasses.


Innovations in AR: Retail

With the global AR, VR and MR market worth $28bn in 2021 (and projected to top $250bn by 2028), it's little wonder that companies want to hop onto the XR bandwagon. In retail, the augmented reality (AR) subsector is proving particularly enticing; retail has been one of the boldest industries in adopting AR technology, particularly over the past decade. That adoption has been aided by AR going mainstream thanks to smartphones packed with all the sensors and capabilities necessary for advanced experiences, resulting in 810 million active mobile AR users in 2021 (up from 440 million in 2019).

That rapid increase can also be partly attributed to the COVID-19 pandemic, which drove a huge shift to online shopping and e-commerce – adding $219bn to US e-commerce sales in 2020-2021. Even before COVID-19, the ratio of internet sales to total sales was trending steadily upwards, but as the pandemic has abated, digital shoppers have remained. And as customers have moved online, they have become increasingly ready to embrace digital technologies such as AR.

AR and the Customer Experience

Seizing on that appetite, retail brands have created a wide range of AR experiences to entice customers. Sportswear brand Nike, for instance, has built AR functionality into its app to properly measure shoe size. The feature uses the smartphone camera and simply requires the user to point their phone at their feet. The app also lets customers share their saved shoe size with Nike stores via a QR code – helping to ensure a perfectly fitting shoe.

Nike FIT Digital Foot Measurement Tool

Part of the attraction for retailers is the way the technology can build excitement and deliver unusual, buzzy customer experiences. Stores themselves can build in AR functionality, taking advantage of their physical space to offer more complex possibilities. Consider magic mirrors: screens which capture live views of shoppers and overlay products onto their person. AR displays can also be placed in a storefront to draw viewers inside. Timberland took exactly this approach, utilising Microsoft Kinect technology to produce a virtual fitting room at the front of a store. Shoppers could stand in front of a screen and see a virtual representation of themselves wearing Timberland clothes – all before they'd even set foot inside.

For brands without the capabilities to build these AR experiences themselves, agencies have sprung up to help retailers make the most of the technology. Rather than create their own AR apps, brands can also benefit from tie-ins with some of the biggest AR-enabled apps, with the likes of TikTok, Instagram and Snapchat all offering extensive filter options. That removes much of the legwork from getting started with AR, which is why there are so many examples, whether it’s Porsche, Coca-Cola, or Starbucks.

The branded filter approach has proven effective for marketing, as with over-the-counter cold and flu medicine Mucinex, whose TikTok filter resulted in a 42.7% increase in purchase intent.

Aside from including AR in their marketing endeavours, some retail companies have even delved into creating full-fledged AR products. Consumer product manufacturer Bic has released an app and accompanying drawing book, known as DrawyBook, which lets children bring their illustrations to life via an AR scan.

The Virtual Try-On

Perhaps the most popular use-case for retail AR, however, is the virtual try-on. Most of the industry’s biggest brands offer some form of the technology, which allows prospective buyers to see how a product would look on them without needing to physically try it on. Typically, such AR experiences make use of the ubiquitous phone camera to display the virtual elements in real-time. Prominent virtual try-on examples include make-up from Maybelline, clothing from ASOS and Zeekit, and shoes from Vyking.

Try-ons needn't be limited to clothing. One good example is the IKEA Place app, which allows users to place 3D models of the company's furniture into their own rooms to preview how they would look, automatically scaling them based on the room's dimensions to ensure they are true to life. In the US, Home Depot has taken a similar approach, aimed at improving the experience for mobile shoppers, who make up more than two-thirds of online traffic. Home Depot said in 2020 that customers who engaged with its app's AR features were two to three times more likely to convert.

Virtual try-ons have added benefits for retailers. It is estimated that returns cost retailers in the UK £60bn every year. If people can have a better idea of what they’re ordering before it is sent out, there’s every chance of bringing that number down – helping retailers and also the planet, as items don’t need to be sent back the other way after being delivered. Customers might be nudged into trying items virtually thanks to retailers increasingly moving away from free returns.

Room to Grow

Despite the plethora of AR options on offer, consumer interest in retail AR remains relatively low. In October 2021, a survey found that only 13% of US adults had ever used AR or virtual reality (VR) to shop. Admittedly, that was up five percentage points on the year before, and a further 37% of those questioned said they were at least somewhat interested in using AR or VR to shop. Taken together, that means 50% of US adults have either used or are interested in using AR while shopping.

According to the Impact of Augmented Reality on Retail report, of those making use of AR, 77% use the technology to visualise differences in products, such as alternative colours and styles. Meanwhile, 72% of shoppers who used AR in their shopping journey said it resulted in them buying.

AR also has a burgeoning role when it comes to navigation and directing customers around retail stores more effectively. In the US, home improvement store Lowe’s has developed an app which overlays directions onto a smartphone’s view of the store, for instance, helping customers to more quickly find what they are looking for.

Summary

In the retail sector, AR has found a distinct niche, enabling new and innovative customer experiences in the never-ending battle to attract potential buyers. Retailers have already become canny about making the most of AR on customers' smartphones – the next frontier will be making better use of physical stores themselves to deliver more complex and compelling AR experiences.

Niantic Launches Visual Positioning System For ‘Global Scale’ AR Experiences

Niantic‘s new Lightship Visual Positioning System (VPS) will facilitate interactions with ‘global scale’ persistent and synced AR content on mobile devices.

Niantic launched Lightship VPS during its developer conference this week, and you can see some footage of phone-based AR apps using the new features in the video embedded above, starting from the 50:20 mark. The system is essentially a new type of map that developers can use for AR experiences, with the aim of providing location-based persistent content that's synced for all users.

Niantic is building the map from scanned visual data, which the company says will offer "centimeter-level" accuracy when pinpointing the location and orientation of a user (or multiple users relative to each other) at a given location. The technology is similar to the large-scale visual positioning systems in active development at Google and Snap.

While the promise of the system is to work globally, it's not quite there just yet — as of launch yesterday, Niantic's VPS covers around 30,000 public locations for developers to hook into. These are mainly spread across six key cities — San Francisco, London, Tokyo, Los Angeles, New York City and Seattle — and include "parks, paths, landmarks, local businesses and more."

To expand the map, Niantic developed the Wayfarer app, now in public beta, which lets developers scan in new locations using their phones. Niantic has also launched a surveyor program in the aforementioned six key launch cities to expedite the process.

“With only a single image frame from the end user’s camera, Lightship VPS swiftly and accurately determines a user’s precise, six-dimensional location,” according to a Niantic blog post.
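In developer terms, the flow Niantic describes boils down to: submit one camera frame (plus a coarse GPS hint) and receive back a six-degree-of-freedom pose relative to the mapped location. The sketch below is purely conceptual — the types and the LocalizationService are hypothetical, and Niantic's real API ships in its Unity-based Lightship ARDK:

```swift
import CoreGraphics
import simd

// Conceptual shape of a VPS localization request: one image in, one 6DoF pose out.
struct VPSPose {
    var position: simd_float3   // translation relative to the mapped location
    var rotation: simd_quatf    // orientation (3 + 3 degrees of freedom in total)
}

protocol LocalizationService {
    /// Localize a single camera frame against the VPS map near the GPS hint.
    func localize(frame: CGImage,
                  gpsHint: (latitude: Double, longitude: Double),
                  completion: @escaping (VPSPose?) -> Void)
}
```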

Scaling VPS to a global level is a lofty goal for Niantic, but accurate maps that pin content to real-world locations could unlock far more interesting mobile AR experiences.

You can read more about Lightship VPS over on the Niantic blog.

Oppo is Taking its AR Hardware to AWE 2022

We do love a good hardware announcement here at gmw3 and Oppo hasn’t disappointed as it prepares for the Augmented World Expo (AWE) in California next week. The company has announced that for the first time North American visitors will be able to demo its augmented reality (AR) hardware.

Oppo Air Glass
Oppo Air Glass. Image credit: Oppo

Oppo will be demoing three of its products at AWE 2022: the Oppo Air Glass, the AR Glass 2021, and its ColorOS Ray Tracing 3D Wallpaper. All of these have previously been revealed; the earliest, the Oppo AR Glass 2021, is the company's second AR glasses concept and was introduced at OPPO INNO Day 2020. Air Glass appeared at INNO Day 2021, whilst the 3D Wallpaper was introduced during this year's Game Developers Conference (GDC).

The Oppo Air Glass is very reminiscent of more enterprise-focused AR devices like Google Glass, providing users with time or situational information. With a sleek design, the Air Glass attaches magnetically to the user's glasses whilst housing Oppo's own Spark Micro Projector, a MicroLED and a bespoke diffractive optical waveguide display.

It'll have all the input methods you'd expect from an XR device like this, using touch, voice, head movement and hand motions to scroll through and select information. While Oppo has been developing AR technology since 2014, the Air Glass will be the company's first commercially available XR product. An actual release date has yet to be confirmed.

Oppo AR Glass 2021
Oppo AR Glass 2021. Image credit: Oppo

“With the explosion of digital information, the ways in which we interact and exchange information between the physical and digital worlds are constantly evolving,” said Yi Xu, Director of XR Technology at OPPO in a statement. “Our belief that AR can be used to create a new digital world entirely based on the real world has been the driving force behind our investment and R&D in AR technologies, including the development of fundamental technology, applications, user interfaces and ecosystems.”

Or, for something slightly different, there's always the ColorOS Ray Tracing 3D Wallpaper. Nope, this isn't some hi-tech home wallpaper; it's a ray-tracing application for smartphones. It allows Oppo phone users to interact with their wallpapers using hand gestures, so they can rotate, tap and wave whilst enjoying more vivid and lifelike wallpapers.

For further coverage from AWE 2022, keep reading gmw3.

Robotics Manufacturer Kawasaki Joins Microsoft’s “Industrial Metaverse”

The "metaverse" is a buzzword dropped alongside all sorts of industries, but for the most part metaverse platforms have been promoted as social/gaming spaces. Microsoft held its annual Build conference this week, with CEO Satya Nadella discussing a far different vision: an "industrial metaverse" that has welcomed Kawasaki into the fold.

Microsoft Kawasaki

Unlike most other metaverse platforms, where you run around virtual environments, change your avatar's clothing every five minutes and enjoy social banter, Microsoft's industrial metaverse is strictly practical. In essence, it has Kawasaki floor workers donning HoloLens 2 headsets to see holographic representations of real robotics, so they can solve any issues that arise with minimal downtime.

This process is called digital twinning: creating digital versions of real-world items and processes to aid learning or, in the case of heavy industry, to speed up repairs, increase production or start a new manufacturing line. There are plenty of possibilities, so much so that Kawasaki now joins Heinz and Boeing as a Microsoft industrial metaverse partner.

“That’s why I think you’re seeing a lot of energy in that space,” Jessica Hawk, Microsoft’s corporate vice president of mixed reality, told CNBC. “These are real world problems that these companies are dealing with … so having a technology solution that can help unblock the supply chain challenge, for example, is incredibly impactful.”

Microsoft Kawasaki

Microsoft isn't purely interested in industrial applications for connecting people with XR technology. Beyond owning Minecraft and AltspaceVR, Microsoft's metaverse ambitions stretch across a range of products, with Teams and Mesh highlighted during the conference.

“With the latest AI and Teams Rooms experiences, we’re dissolving the walls between digital and physical participation so people can present a PowerPoint together as though they were in the same location, even when they’re apart,” says Nadella. Mesh, on the other hand, is all about creation: “You can build your metaverse experiences on Mesh that are accessible from anywhere on any device, whether it’s HoloLens, VR headsets, phones, tablets or PCs.”

As Microsoft continues to explore metaverse possibilities, gmw3 will keep you updated.