The “metaverse” is a buzzword being dropped next to all sorts of industries, but for the most part metaverse platforms have been promoted as social/gaming spaces. Microsoft held its annual Build conference this week, with CEO Satya Nadella discussing a far different vision: an “industrial metaverse” that has now welcomed Kawasaki into the fold.
Unlike most other metaverse platforms, where you run around virtual environments, changing your avatar’s clothing every five minutes and enjoying social banter, Microsoft’s industrial metaverse serves a very different purpose. It essentially sees Kawasaki floor workers donning HoloLens 2 devices to view holographic representations of real robotics so they can solve any issues that arise with minimal downtime.
This process is called digital twinning: creating digital versions of real-world items and processes to aid learning or, in the case of heavy industry, to speed up repairs, increase production or bring a new manufacturing line online. There are plenty of possibilities, so much so that Kawasaki now joins Heinz and Boeing as a Microsoft industrial metaverse partner.
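At its simplest, a digital twin is a piece of software kept in sync with telemetry from its physical counterpart, so problems can be spotted and fixes rehearsed without stopping the production line. The short sketch below illustrates that basic pattern; the class, field names and thresholds are hypothetical examples rather than anything from Microsoft’s or Kawasaki’s actual tooling.

```python
from dataclasses import dataclass, field

@dataclass
class RobotArmTwin:
    """Hypothetical digital twin mirroring one robot arm on a production line."""
    arm_id: str
    joint_temps_c: dict = field(default_factory=dict)  # latest temperature per joint (Celsius)
    cycle_time_s: float = 0.0                          # duration of the most recent work cycle
    history: list = field(default_factory=list)        # past telemetry snapshots

    def update_from_telemetry(self, snapshot: dict) -> None:
        """Mirror a telemetry snapshot pushed from the physical arm's sensors."""
        self.joint_temps_c = snapshot.get("joint_temps_c", self.joint_temps_c)
        self.cycle_time_s = snapshot.get("cycle_time_s", self.cycle_time_s)
        self.history.append(snapshot)

    def faults(self, max_temp_c: float = 80.0, max_cycle_s: float = 12.0) -> list:
        """Return human-readable warnings an AR overlay could surface to a worker."""
        issues = []
        for joint, temp in self.joint_temps_c.items():
            if temp > max_temp_c:
                issues.append(f"{joint} overheating: {temp:.1f}C")
        if self.cycle_time_s > max_cycle_s:
            issues.append(f"cycle time degraded: {self.cycle_time_s:.1f}s")
        return issues

# One telemetry update from the (real) arm, followed by a health check.
twin = RobotArmTwin(arm_id="line-3-arm-7")
twin.update_from_telemetry({"joint_temps_c": {"elbow": 85.2, "wrist": 41.0},
                            "cycle_time_s": 9.8})
print(twin.faults())  # ['elbow overheating: 85.2C']
```

In a HoloLens setup, warnings like these would be rendered as holograms anchored to the physical machine rather than printed to a console.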
“That’s why I think you’re seeing a lot of energy in that space,” Jessica Hawk, Microsoft’s corporate vice president of mixed reality, told CNBC. “These are real world problems that these companies are dealing with … so having a technology solution that can help unblock the supply chain challenge, for example, is incredibly impactful.”
Microsoft isn’t purely interested in industrial applications when it comes to connecting people through XR technology. Apart from owning Minecraft and AltspaceVR, Microsoft’s metaverse ambitions stretch across a range of products, with Teams and Mesh highlighted during the conference.
“With the latest AI and Teams Rooms experiences, we’re dissolving the walls between digital and physical participation so people can present a PowerPoint together as though they were in the same location, even when they’re apart,” says Nadella. Mesh, on the other hand, is all about creation: “You can build your metaverse experiences on Mesh that are accessible from anywhere on any device, whether it’s HoloLens, VR headsets, phones, tablets or PCs.”
As Microsoft continues to explore metaverse possibilities, gmw3 will keep you updated.
When it comes to augmented reality (AR), most people will only have experienced the technology through their phones rather than specialist hardware, let alone Microsoft’s enterprise-focused HoloLens 2. In a public first for the device, an environmental art exhibit called Arcadia Earth will utilise HoloLens 2 to give guests the ability to interact with holograms as they walk through the show.
Developed in partnership with Enklu, the experience uses HoloLens 2 to bring the art exhibit to life, enhancing guests’ visits by making ecosystems interactive, animating animals and unlocking hidden gems along the way. Teaching visitors about the plight of the environment, the exhibit lets them enjoy captivating experiential art whilst learning about issues such as single-use plastic waste and overfishing.
One area is dedicated to coral reefs, home to over a quarter of the oceans’ marine life. Attendees will be surrounded by fish and other aquatic life, all within easy reach. Reefs are in danger of bleaching, a process that participants can now view in holographic form. They’ll then be offered advice, such as which sunscreens are more environmentally friendly and therefore less harmful to reefs. This information can be sent to their phones at the touch of a button.
As Arcadia Earth is a multisensory exhibit, it is also filled with physical installations and proximity-triggered audio, all aided by a friendly orb that guides guests around.
One of the benefits of an exhibition like Arcadia Earth using AR is its adaptability. New experiences can be created, holograms updated as required, or new information added as discoveries are made, encouraging return trips to educate guests about the natural world.
Arcadia Earth has a number of locations around the world, including New York, Las Vegas and Saudi Arabia. However, the HoloLens-powered experience will be based at the New York City location, rolling out later this month. General admission tickets are $39 USD peak and $33 off-peak, with the HoloLens tour priced at $59. Proceeds from ticket sales go towards planting mangrove trees, which are particularly effective at absorbing carbon from the atmosphere.
Qualcomm held its CES 2022 press conference earlier today and as part of the event revealed that it’s partnered with Microsoft to help push the future of augmented reality (AR). Qualcomm announced that the collaboration will see the pair develop custom AR chips for both consumer and enterprise devices.
These custom AR chips will focus on ushering in an era of AR glasses that are lightweight and power-efficient as well as integrating into Microsoft’s ecosystem. That’ll mean support for software like Microsoft Mesh – Microsoft’s shared mixed-reality (MR) platform – and the Snapdragon Spaces XR Developer Platform.
What this could mean is a more lightweight, consumer-friendly version of Microsoft’s HoloLens 2. A consumer edition was in fact confirmed last year by Microsoft’s Alex Kipman who said: “we are absolutely working on a consumer journey for HoloLens.” And then there was that Pokémon GO demo by Niantic Labs using the MR headset.
“This collaboration reflects the next step in both companies’ shared commitment to XR and the metaverse,” said Hugo Swart, vice president and general manager of XR, Qualcomm Technologies, Inc. in a statement. “Qualcomm Technologies’ core XR strategy has always been delivering the most cutting-edge technology, purpose-built XR chipsets and enabling the ecosystem with our software platforms and hardware reference designs. We are thrilled to work with Microsoft to help expand and scale the adoption of AR hardware and software across the entire industry.”
“Our goal is to inspire and empower others to collectively work to develop the metaverse future – a future that is grounded in trust and innovation,” adds Rubén Caballero, corporate vice president Mixed Reality, Microsoft. “With services like Microsoft Mesh, we are committed to delivering the safest and most comprehensive set of capabilities to power metaverses that blend the physical and digital worlds, ultimately delivering a shared sense of presence across devices. We look forward to working with Qualcomm Technologies to help the entire ecosystem unlock the promise of the metaverse.”
Qualcomm is heavily invested in the XR space, with chipsets like its Snapdragon XR2 platform being used in devices like Meta Quest 2. And then there’s the XR1 AR Smart Viewer Reference Design, which OEMs can utilise to enter the AR glasses market. This all feeds into the metaverse vision that most tech companies seem to be scrambling towards. As further details arise, keep reading VRFocus.
The largest defence, simulation and training conference, DSEI, descended on London, and immersive technology specialist Kevin Williams took the time to traverse the massive convention space and return with observations on the impact of VR and AR in this sector.
The reality of VR in commercial training, simulation and education is often overlooked or side-lined. The enterprise or commercial aspect of VR has proven a very lucrative part of the technology’s deployment, with many consumer headset manufacturers pivoting from a consumer-centric focus to broadening their investment to include a commercial business focus.
What I have coined the “Serious VR” landscape comprises commercial applications using more powerful hardware and focused on a core deliverable (such as training, marketing, or out-of-home entertainment), while the “Casual VR” scene is focused on consumer requirements and a price-sensitive, home gaming approach.
The best example of Serious VR was amassed in London with the holding of Defence and Security Equipment International (DSEI) 2021, covering all of the ExCeL exhibition centre and even taking up the riverside berths for presentations of the latest naval craft. The show gathered more than 30,000 attendees from the international military services and the operations that support them.
Along with warfighting, the convention gathers security, medical, training and infrastructure elements, and the show floor proved a valuable litmus test of the actual penetration of immersive technology into this commercial scene. Previous DSEI editions have seen a growing interest in VR, but this year’s show revealed a definite re-evaluation of the hype versus the reality of the technology’s value.
The first aspect of VR application observed can be described as “Direct Training”.
One of the largest military providers, BAE Systems, used DSEI to launch its new SPA-TAC platform, a sophisticated training and mission rehearsal suite of tools using virtual reality visualisation. It supports multiple users and is deployed on the latest high-end VR hardware. On the booth, the company presented both the latest VRgineers XTAL professional headset, with its impressive field of view, and the HTC Vive Pro series.
Another developer at the defence event was VRAI, a specialist dedicated to combining VR and artificial intelligence (AI) to provide enterprise and public service organisations with remote training. The ability to use the latest VR technology to create mobile training solutions in the field drives many of the applications seen. On its booth the company had a flight training solution employing the HP Reverb G2 headset; HP is one of those manufacturers that has seen the opportunity in commercial development support. Alongside this was a Cleanbox Technology headset sanitising system, offering a much-needed hygienic approach to usage in this environment.
Across the way, on the Defence and Security Accelerator (DASA) stand, was a demonstration of high-level immersion for training UK soldiers, employing the latest Varjo VR-3 professional VR headset. DASA is a government fund that invests in exploitable innovation for a safer future. The usage of VR in this application cuts training time and offers better information retention by new recruits, with the control interfaces mapped to offer realistic weapon interaction.
The latest Varjo headset hardware was also seen on many other booths. The platform is focused wholly on high-end commercial VR applications, offering performance beyond consumer headset specifications, and is deployed in automotive, aeronautical, CAD design and training applications. This marks a new phase of development in VR deployment, with the commercial sector at such a scale that it can support its own unique hardware development. On the Inzpire booth, the latest Varjo XR-3 was employed to promote its mixed reality capabilities.
One of the company’s demonstrations was a Joint Terminal Attack Controller (JTAC) training platform that was incredibly portable and rugged. Powered by two high-end PCs, it let the user wear the VR headset and see the actual binoculars and physical controls, as the MR capability dropped the real-world imagery into the virtual environment through sophisticated tracking. This was a compelling demonstration of the versatility that VR training can bring, and the level of immersion was extremely high compared to consumer applications. Also promoting the portability of training simulation, the company showed a helicopter simulator, using both VR (via an HTC Vive Pro) and a conventional screen, able to be broken down into a small case.
VR training aids that are simple to install and operate were also on display at the Lockheed Martin booth, which showcased its Armoured Fighting Vehicle (AFV) gunnery simulator. The VR configuration employs the Varjo headset and offers a means to deploy training anywhere for units. Previously, this level of training would have depended on crude flatscreen alternatives or expensive dedicated simulators, unable to be deployed in the field. VR applications are beginning to be seen as a strong middle-ground alternative.
On the British Army stand was developer and solution provider QinetiQ, which creates realistic training environments for mission rehearsal and procedures. The company presented its latest environment for infantry training and army warfighting scenarios in urban conditions, deploying the latest VR hardware with its setup of Varjo headsets. The level of visual realism and performance from its VR setup far surpassed anything comparable on consumer hardware.
The second aspect of VR application seen in this sector can be described as “Promotion and Visualisation”.
While some standalone VR headsets were seen, such as the HTC Vive Focus and an Oculus Quest 2, these were used more for promotional purposes, allowing visitors on booths a glimpse at simple information or applications. In previous years VR headsets on booths were ubiquitous, but now the focus was on high-end applications, steering away from the casual approach.
Visualisation also saw the appearance of augmented reality (AR) on the show floor. To be more accurate, the services have been employing AR in its basic form since the 1980s, with helmet-mounted optics supporting IR night vision and heads-up telemetry displays. The latest AR technology has generated a lot of headlines in defence procurement, with Microsoft awarded a $22 billion deal to supply HoloLens-based headsets for evaluation as battlefield support for the US Army.
AR was represented at DSEI by the Microsoft HoloLens 2, fielded on another part of the British Army booth alongside the developer of the application, Atos. The company is a world leader in digital transformation, providing cloud-based and information handling solutions. Its setup used the HoloLens to give the user tactical awareness of the battlefield and the deployment of resources, communicating with other users in real time, and offered a demonstration of the future strategic planning aids that this technology represents.
Overall, the clear new trend on display at DSEI 2021 was the explosion in investment in unmanned vehicles and autonomous support, ranging from naval-based helicopter drones and UAVs to the first appearance of unmanned land vehicles for support and casualty retrieval. Great advances in this sector are expected, and the use of augmented displays to track and direct these vehicles is expected to grow.
As mentioned previously, after the great hype and promise, VR has entered a more pragmatic phase in this industry. Its ubiquity has, at this point, been replaced by a focus on more grounded high-end simulation using the newly available high-end headsets. A new phase of development is about to take place, propelling Serious VR to the next level of immersion.
Having previously developed mixed reality (MR) training solutions such as HoloPatient for medical and nursing schools, GIGXR has announced a new contract to develop a chemistry training simulation for US Air Force Academy (USAFA) cadets. Called HoloChem, it’ll be an MR application for USAFA 100- and 200-level chemistry courses.
GIGXR received a $750,000 USD Phase II Small Business Innovative Research (SBIR) contract from AFWERX to develop HoloChem which will be used by cadets in conjunction with Microsoft HoloLens 2, aiding them in their chemistry studies.
The platform will utilise GIGXR’s mixed reality expertise from its other applications, providing lifelike learning experiments that take traditional chemistry curricula and make them more engaging. Like most XR teaching applications, HoloChem will utilise gamification techniques to encourage concept retention and critical thinking skills.
“Traditional chemistry labs lack the immersive, safe-to-fail environments students need to develop the critical thinking skills for using chemistry in the real world,” said Captain Wale Lawal, Ph.D., Assistant Professor at the Department of Chemistry for the U.S. Air Force Academy in a statement. “GIGXR’s mixed reality learning solutions allows USAFA educators to immerse students in ultra-realistic, interactive scenarios to transform their understanding of chemistry from rote memorization or rigid ‘cookbook chemistry’ to real-time problem-solving. This kind of learning mimics the challenges they might face in the future, which would otherwise be impossible to simulate or place our students at risk.”
“Being awarded with the Phase II SBIR Contract is an achievement we’re incredibly proud of and marks a milestone in mixed reality’s valuable impact on education,” said David King Lassman, CEO, GIGXR. “We’re honoured to be working with the U.S. Air Force Academy, whose reputation for academic excellence and technological innovation is second to none. USAFA needed a product that enabled their instructors to provide hands-on, immersive learning and future-proof execution. HoloChem simulates key real-life experiences that labs simply cannot and allows instructors to reach multiple campuses or remote students to match the ultra-realistic learning experience available to the cadets in the instructor’s own classroom.”
HoloChem is slated to be deployed in early 2022 with instructors using HoloLens 2 headsets to share content with students. They’ll also have access to the MR headsets or can connect via Android or iOS smartphones and tablets. For further updates on GIGXR’s latest mixed reality projects, keep reading VRFocus.
CES 2020 in January was a bit of a mixed bag when it came to virtual reality (VR) and augmented reality (AR) announcements. There were lots of interesting smaller updates and advancements but nothing like those from previous years – even Panasonic’s Eyeglasses weren’t that amazing. There were products which did catch VRFocus’ eye, one of them being Spatial, a mixed reality (MR) app designed for workplace collaboration.
Spatial emerged from stealth almost two years ago, with a vision to make it easier for people to work together on projects wherever they are in the world as if they were in the same office together.
This has been made possible thanks to the likes of Microsoft’s HoloLens and Magic Leap 1, allowing digitised information to be overlaid on the real world. Spatial does support the standalone VR headset Oculus Quest, but that wasn’t available to demo; the HoloLens 2, on the other hand, was.
The software is all about versatility and ease of use, letting users place sticky notes on walls and import 3D models for others to examine and assess. Before any of that takes place, though, a work environment wouldn’t feel very engaging or personal if everyone were represented as names or amorphous blobs, so Spatial has created an avatar system that builds a 3D representation of any user from a normal 2D photograph.
As you can see from the imagery, the process works surprisingly well, creating an avatar that you can genuinely connect and have a conversation with. When VRFocus spoke with Spatial’s CEO and co-founder Anand Agarawala, he noted that while eye tracking was currently supported if the hardware had the feature, the avatars could also support further facial movement, such as lips and eyebrows, once the hardware catches up.
During the demo, Agarawala dropped a model of the Mars surface into the workspace, with any of the users able to spin or resize it, all in real-time and viewable by the group, not just them. This was certainly helped by the new hand gesture features of HoloLens 2, markedly improved over the previous model, making the interactions far more fluid and natural. HoloLens 2’s wider field of view (FoV) also made for comfortable viewing, taking in more of what was going on. If the FoV was narrower then it would be difficult to imagine Spatial having the same impact, its vision constrained by the hardware.
While the Mars model was out Spatial demonstrated some of the other tools users had access to. These included putting another model (a Mars rover) onto the surface then opening a painting/drawing tool for users to visually explain ideas and processes, in this case, a possible route for the rover to take.
Because the system is designed to connect people worldwide – it supports Microsoft Teams, for example – as well as handling lots of 2D/3D data, it was noticeable that a good WiFi connection was needed. Through no fault of Spatial, the hotel connection did introduce moments of lag which, in a work environment, would hamper the experience.
Spatial has yet to be officially rolled out with a fully released product expected later this year. Enterprise customers interested can still access the app by contacting Spatial directly. The software provides a tantalising taste of what’s achievable in MR with current technology, with possibilities beyond merely office collaboration, stretching into education and more.
There was a lot going on at CES 2020 this month when it came to virtual reality (VR), augmented reality (AR) and mixed reality (MR) technologies, with Spatial’s collaboration tool of particular note. Designed for headsets like Microsoft’s HoloLens or Magic Leap 1, the tool caught our attention, and VRFocus had a chance to sit down with CEO and co-founder Anand Agarawala to learn about the company’s plans and where it’s headed.
Founded in 2016, Spatial emerged from stealth in 2018, revealing its plans for an AR tool which could help colleagues collaborate in a digital space, all in real-time. Spatial can turn any area into an augmented workspace, where users can pin sticky notes, images or videos to a wall for anyone in the group to see; 3D models can be imported, scaled and manipulated, while devices like a phone or PC can be integrated. This means you don’t actually need a headset to join if one isn’t readily available.
One of Spatial’s most distinctive features is its ability to create a digital avatar of a user from a 2D photograph. As all users need a digital representation of themselves for remote colleagues to see and interact with, the company has tried to ensure a lifelike reproduction for a more natural working environment.
In the future Spatial aims to add further avatar features when headsets allow, such as facial tracking of lips and eyebrows – eye and hand tracking are already available. The software also supports VR headsets like Oculus Quest and integrates with Microsoft Teams.
While available on the HoloLens store and Magic Leap World, Spatial has yet to officially launch. That will take place later this year, so for now, interested users have to download the software then contact Spatial at hello@spatial.io to create an account. Once officially available this process will no longer be required with both free and premium paid versions available to businesses.
Check out the full interview with Agarawala below and, for further updates from the team, keep reading VRFocus. Or for more CES 2020 coverage, why not take a look at VRFocus’ interview with HaptX or our chat with Teslasuit about its new glove.
Microsoft might not be doing much virtual reality (VR) development but its dedication to augmented reality (AR) and mixed reality (MR) has never been stronger. Although the company announced the HoloLens 2 headset earlier this year at Mobile World Congress and opened pre-orders, it gave no indication of when the device would actually launch. Today, that question has been answered, with HoloLens 2 set to arrive next month.
The announcement was made by Microsoft’s executive vice president Harry Shum in a speech at the World Artificial Intelligence Conference in Shanghai this week, Reuters reports.
No other details were given, including an actual date in September, but for those who’ve managed to pre-order the headset, it at least means shipments should begin soon.
The original HoloLens arrived back in 2016, offering businesses the first real opportunity to explore MR technology. Retailing for $3,000 USD, the headset was purely enterprise-focused. One of the main grievances with the original HoloLens was the field of view (FoV), coming in at a rather restrictive 35 degrees (the Oculus Rift offers 110 degrees, for example).
HoloLens 2 promises an improved experience over its predecessor, offering greater comfort thanks to better balancing – the battery is placed at the back of the head now – and of course, the FoV has been increased. Those who wear glasses should find the design better suited to them and the front is hinged to allow easy dropping in and out of any experience.
Naturally, the HoloLens 2 isn’t cheap, coming in at $3,500. You can’t simply pre-order the device either. Those interested will need to head to the official HoloLens 2 page to register their interest before going any further. It’s also worth noting that customers need to be based in the US, France, Germany, Ireland, New Zealand, Australia, or the UK. If you happen to be part of Microsoft’s Mixed Reality Developer Program then you can also access the HoloLens 2 Development Edition starting from $99 per month (or you can buy it outright). This edition, only available through the programme, is designed to encourage creators onto the system.
VRFocus will continue its coverage of Microsoft’s HoloLens 2, reporting back with all the latest updates.
Microsoft unveiled the HoloLens 2 mixed reality (MR) headset during the Mobile World Congress (MWC) a few months ago with a launch slated for later this year. There will, in fact, be two versions as the Redmond-based company has just announced the HoloLens 2 Development Edition, offering an easier way onto the platform for creators.
Microsoft has partnered with Unity for the release of the HoloLens 2 Development Edition so that customers who purchase the headset will receive a 3-month trial of Unity Pro and PiXYZ Plugin as part of the deal. Microsoft will also be throwing in $500 USD in free Azure credits.
One of the big hurdles to MR development has been cost, with the original HoloLens retailing for $3,000 with HoloLens 2 retailing for $3,500 when it arrives. To encourage more studios onto the platform the HoloLens 2 Development Edition will release later this year starting at $99 per month (or you can buy it outright) to registered members of Microsoft’s Mixed Reality Developer Program (which is free to be part of). Additionally, the HoloLens 2 Development Edition will not be available for pre-order, so the developer programme is the only way to get hold of this particular edition (the standard HoloLens 2 can be pre-ordered however).
“By bringing together HoloLens 2, Azure MR services, and the Unity platform, we are making it easier than ever for developers to get started building the real-time 3D experiences that are driving this 3rd wave of computing,” said Matt Fleckenstein, Senior Director, Mixed Reality, Microsoft in a statement.
“Pairing HoloLens 2 with Unity’s real-time 3D platform enables industrial businesses to create immersive, interactive experiences that accelerate business and reduce costs. The addition of Unity Pro and the PiXYZ Plugin makes it easy to import 3D design data in minutes rather than hours,” adds Tim McDonough, GM of Industrial, Unity.
Microsoft’s HoloLens 2 features a range of improvements over the previous model. The field of view (FOV) has been expanded, eye-tracking is now built in as well as full hand tracking. The headset is now said to be more comfortable thanks to a change to the centre of balance, plus the front of the headset now flips up to drop in and out of MR quickly and easily.
There’s no specific release date for HoloLens 2 at the moment. When those details are made public VRFocus will let you know.
When it comes to developing videogames, and virtual reality (VR) experiences in particular, most indie developers face a simple choice of game engine: either Unity or Epic Games’ Unreal Engine 4. Those working with the latter will be pleased to know that Unreal Engine 4.22 has just arrived, adding all sorts of new features.
When it comes to specific virtual reality (VR) or augmented reality (AR) features in Unreal Engine 4.22, Epic Games has added support for HoloLens Remote Streaming, which: “allows Unreal applications to run on a Windows desktop PC and stream the rendered result wirelessly to HoloLens over a Wi-Fi connection in real time.”
Unreal Engine 4.22 will also help developers build better-looking virtual worlds thanks to early access support for real-time ray tracing, the technology Nvidia has been shouting about ever since it introduced its RTX series of graphics cards. Ray tracing is a technique being employed in videogames to provide natural, realistic-looking lighting effects in real time, making images like those above look almost photorealistic, with dedicated RTX hardware doing much of the heavy lifting. It’s not something you’re going to see in VR titles just yet, but it’s an eventuality. The technology was showcased at the Game Developers Conference (GDC) 2019 with Troll (trailer at the bottom of the article).
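At its core, ray tracing fires rays from the camera into the scene and computes what each ray hits, which is what makes reflections and shadows behave so naturally and also what makes the technique so expensive. The toy example below shows the fundamental operation, a ray-sphere intersection test; it is a generic sketch of the idea, not code from Unreal Engine or Nvidia’s RTX stack.

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit on a sphere, or None.

    origin, direction and center are 3-tuples; direction is assumed normalised.
    Solves |o + t*d - c|^2 = r^2, a quadratic in t.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c            # a == 1 because direction is normalised
    if disc < 0.0:
        return None                   # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0  # nearest intersection in front of the origin
    return t if t > 0.0 else None

# A ray from the origin straight down the z-axis towards a unit sphere at z=5.
print(ray_sphere_intersect((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

A real renderer performs millions of such tests per frame, against triangles rather than spheres, which is why dedicated hardware acceleration matters so much for doing this in real time.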
Another important step in Unreal Engine 4.22 is the reduction of build times. Epic Games has optimized UnrealBuildTool and UnrealHeaderTool to make C++ iteration times up to 3x faster, it claims, releasing the following stats:
Full build (UE4Editor Win64 Development):

                              Unreal Engine 4.21   Unreal Engine 4.22   Improvement
Total Build Time:             436.90               326.81               30% faster
Compiling UnrealHeaderTool:   46.12                46.30                –
Generating headers:           25.05                15.50                60% faster
Compiling UE4Editor:          323.15               257.97               25% faster
UnrealBuildTool overhead:     42.58                7.04                 600% faster

Incremental build (UE4Editor Win64 Development):

                              Unreal Engine 4.21   Unreal Engine 4.22   Improvement
Total Build Time:             7.47                 2.14                 340% faster
Compiling UE4Editor:          1.19                 1.08                 –
UnrealBuildTool overhead:     6.28                 1.06                 590% faster
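For a rough sense of scale, the raw speedups can be derived directly from the before/after timings above; the snippet below simply divides the two, reporting plain speedup factors rather than the percentage labels used in Epic’s table.

```python
# Speedup factor (old time / new time) for the headline figures above.
timings = {
    "Full build, total":        (436.90, 326.81),
    "UnrealBuildTool overhead": (42.58, 7.04),
    "Incremental build, total": (7.47, 2.14),
}

for name, (old, new) in timings.items():
    print(f"{name}: {old / new:.2f}x faster")
# Full build, total: 1.34x faster
# UnrealBuildTool overhead: 6.05x faster
# Incremental build, total: 3.49x faster
```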
Epic Games’ other big announcement recently was the $100,000,000 USD Epic MegaGrants initiative, which aims to assist videogame developers, media and entertainment creators, enterprise professionals, students, educators, and tools developers who are working with Unreal Engine 4.
To learn more about Unreal Engine 4.22 head on over to the Unreal Engine blog. As the company continues to release further improvements VRFocus will keep you updated.