Innovations in AR: Heavy Industry

Augmented reality (AR) is a key pillar of Industry 4.0 (the fourth industrial revolution), sitting alongside other potentially transformative technologies like machine learning and big data. Indeed, consultancy firm PwC has estimated that industrial manufacturing and design is one of the biggest opportunities for augmented and virtual reality (VR), with their use in heavy industry having the potential to deliver a $360bn GDP boost by 2030.

In this latest edition of our series on how augmented reality is faring across a range of industries, we’ll be taking a closer look at why AR is proving so useful in heavy industry, in particular the fields of construction, manufacturing and energy.

Construction

AR is proving to be a key tool for the construction industry, whether at the design stage or during construction itself, leading a 2020 study of the architecture, engineering, and construction (AEC) industry to predict “strong growth” for AR and VR over the next 5 to 10 years.

On the design side, numerous architectural tools exist to help with space visualisation using augmented reality. One such example is The Wild, which allows designers to view 3D models in both virtual and augmented reality. Such tools can layer virtual details onto a building plan so that designs can be more readily understood by stakeholders.

That requires highly detailed and accurate 3D models, which is where AR overlaps with digital twin technology. Using those digital twins, companies like Akular offer mobile applications that let clients see what a building would look like on-site in the real world before it is built.

When it comes to actual construction, augmented reality again finds a number of uses, not least training workers on safety. That might involve AR headsets that interact with tags on potentially dangerous areas to bring up safety information, but even before workers are on-site, AR can help train them to use heavy machinery – as with the construction equipment training simulators offered by CM Labs or the Arvizio AR Instructor.

Arvizio AR Instructor

“Industries are experiencing a shortage of skilled front-line employees and view augmented reality as a means to accelerate training and efficiently transfer the expertise of experienced workers,” said Jonathan Reeves, CEO of Arvizio. “Arvizio enables organizations to rapidly upskill employees without the need for on-site coaching and mentoring. By delivering no-code authored augmented reality instruction and remote expert connectivity, AR Instructor can substantially increase productivity and reduce errors of workers performing complex operational activities.”

Meanwhile, progress capture and tracking functionality compares real-world sites with virtual models in real time to ensure construction isn’t deviating from the plan. A host of companies provide variations on that technology, such as VisualLive, which lets users view 3D models overlaid on the real world via headsets such as the Microsoft HoloLens or via mobile devices.
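At its core, this kind of progress tracking boils down to comparing where built elements actually are against where the model says they should be. The sketch below illustrates the idea only; the element names, coordinates, tolerance and data layout are illustrative assumptions, not VisualLive’s or any vendor’s actual API.

```python
import math

# Hypothetical sketch: flag elements whose surveyed (as-built) position
# deviates from the planned position in the 3D model by more than a tolerance.

TOLERANCE_M = 0.05  # assumed 5 cm allowable deviation

planned = {            # element -> planned (x, y, z) in metres, from the design model
    "column_A1": (0.00, 0.00, 3.00),
    "beam_B2":   (4.50, 0.00, 3.00),
}
as_built = {            # element -> surveyed (x, y, z) captured on-site
    "column_A1": (0.01, 0.02, 3.00),
    "beam_B2":   (4.50, 0.09, 3.01),
}

def deviation(p, q):
    """Straight-line distance between planned and as-built positions."""
    return math.dist(p, q)

for element, planned_pos in planned.items():
    d = deviation(planned_pos, as_built[element])
    status = "OK" if d <= TOLERANCE_M else "DEVIATION"
    print(f"{element}: {d * 100:.1f} cm -> {status}")
```

In a real system the as-built positions would come from on-site scans or headset tracking rather than a hand-written dictionary, and flagged deviations would be highlighted in the AR overlay.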

Manufacturing

Much of the technology we’ve covered for construction applies equally to the manufacturing industry, whether that’s learning how to use dangerous machinery or visualising the layout of equipment in a factory. That’s not to say there aren’t plenty of bespoke uses for augmented reality in the manufacturing space, however.

One early pioneer was Volkswagen, which was using augmented reality to assist service workers as far back as 2013. Its MARTA app showed step-by-step instructions on how to repair and replace certain components, overlaying its guidance on the car through an iPad. Along similar lines is Boeing’s more recent use of augmented reality to give technicians real-time, hands-free, interactive 3D wiring diagrams.

Interestingly, that technology has bled over into the consumer space with AR manuals that assist car-owners with basic maintenance operations by showing precisely where components are located within a car.

In the design space, AR has been deployed by the largest manufacturers to iterate rapidly and do away with expensive, time-consuming physical prototypes. In the case of Ford and its use of Microsoft’s HoloLens, changes can be made to a design and reflected in real time, letting teams collaboratively sculpt a new vehicle.

AR has been trusted at the very highest levels of manufacturing, too. Lockheed Martin utilised augmented reality in the creation of NASA’s Orion spacecraft, overlaying information to help with mission-critical procedures such as precisely aligning fasteners.

NASA Orion HoloLens

Energy

In the energy sector, AR has the potential to remedy significant problems faced by the industry, chief among them a brain drain caused by an ageing workforce. Indeed, the US Department of Labor estimated in 2019 that 50% of the current energy utility workforce would retire within the next ten years. The institutional knowledge being lost could be replenished more quickly with the help of AR technology.

Shell is using the remote collaboration possibilities of AR to educate workers in the field. Expert consultants can see through a worker’s eyes via an AR headset, and can even annotate the worker’s augmented reality display. That increases safety as workers interact with potentially dangerous heavy equipment in oil and gas facilities, while also allowing experienced but ageing employees to contribute remotely.

Shell AR
Image credit: Shell

The energy sector is no slouch when it comes to more specific AR solutions either, such as Upskill’s Skylight platform, which allows companies to more easily develop bespoke augmented reality apps for AR devices ranging from Google Glass to the Microsoft HoloLens 2 and mobile devices. Then there are solutions such as Adroit, which can provide guidance on repairing high-stakes equipment such as oil rigs by scanning and identifying faulty components and machinery.

Final Thoughts

In heavy industry, where the costs of prototyping are enormous and the potential risks from machinery are significant, leaning on the virtual possibilities of augmented reality is common sense – hence the interest in the technology from across the sector.

To find out more about how AR is progressing in other fields, read the previous entry in the series, where we explored the healthcare industry in particular.

Digital Twinning in the Metaverse

April 13th, 1970: the Apollo 13 spacecraft is 220,000 kilometres from Earth when an explosion rocks the crew, rupturing one of the two oxygen tanks in the service module. The blast tore apart one side of the module, depriving the crew not only of oxygen but also of water and some electrical power. The disaster echoed around the world as astronaut Jack Swigert radioed NASA mission control: “Houston, we’ve had a problem here.”

NASA capsule

From that moment, engineers and scientists at NASA rushed to put their heads together and find a solution to Apollo 13’s problem. They needed to solve the issue using only what the crew had to hand, and, crucially, without seeing the damage firsthand. In the end, the fix was remarkably simple: NASA instructed the crew to use cardboard, plastic bags and tape to build a makeshift adapter for the carbon dioxide scrubbers, keeping the cabin air breathable long enough to get them home.

There’s not much NASA could have done at the time to foresee the issue; building several spacecraft to stress-test every possible outcome would have burned through budgets swiftly. When the Apollo 13 disaster occurred, NASA engineers could no longer rely on their original designs, as the craft had failed in a way nobody had anticipated. The team in Houston needed a model on Earth that directly mirrored the craft in space.

The Digital Twin

The term ‘digital twin’ is usually traced back to 2002, though the underlying concept is a little older. A digital twin can be described as a digital copy of a physical object – a mechanism, building or process – grounded in reality. For example, a car manufacturer may create a digital twin of its main assembly plant and use it to trial new safety protocols or the installation of new machinery within a digital safe zone, before touching the real factory.

Healthcare professionals can use digital twinning to simulate rare illnesses and practise care first-hand – albeit digitally – learning the correct techniques. Government planning departments can replicate densely populated areas of cities to see how new infrastructure will impact the city and society. Environmentalists are simulating the extremes of climate change on digital twins of rainforests and oceans.

A digital twin is a living model of something physical, which will sound familiar to anyone with metaverse aspirations. Digital twins are becoming much more popular, and with the advent of more immersive technology – augmented reality (AR) and virtual reality (VR) – the concept is moving into the mainstream. That overlap points to where the metaverse can aid industry, and where digital twinning can in turn benefit from the building of the metaverse.

The Impact

To fully realise a digital twin of a location or person, sensors can be placed in the physical space to gauge temperature, humidity, footfall, heart rate and so on. That data is then streamed to the digital twin, where it is replicated and displayed in near real-time within a 3D, metaverse-style space.
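To make the data flow concrete, here is a minimal sketch of that pipeline: sensor readings are polled from the physical space and mirrored into a twin’s state, which a 3D visualisation layer could then render. The sensor names, the `RoomTwin` class and the polling interval are illustrative assumptions, not any particular platform’s API.

```python
import time
from dataclasses import dataclass, field

@dataclass
class RoomTwin:
    """Hypothetical digital twin of a single physical room."""
    state: dict = field(default_factory=dict)

    def ingest(self, sensor_id: str, value: float, timestamp: float) -> None:
        # Mirror the latest reading; a real system would also persist history
        # and push updates to the metaverse-style visualisation layer.
        self.state[sensor_id] = {"value": value, "updated_at": timestamp}

def read_sensors() -> dict:
    # Stand-in for real hardware: one illustrative reading per sensor.
    return {"temperature_c": 21.4, "humidity_pct": 48.0, "footfall_per_min": 12}

twin = RoomTwin()
for _ in range(3):                      # poll a few times for the demo
    now = time.time()
    for sensor_id, value in read_sensors().items():
        twin.ingest(sensor_id, value, now)
    print(twin.state)
    time.sleep(1)                       # near real-time polling interval
```

In practice the readings would arrive over a messaging protocol rather than a local function call, but the principle is the same: the twin’s state tracks the physical space with only a short delay.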

The opportunities for this technology are vast and far-reaching, but alongside the positives there must be a balance in how data is used. Any sensor readings and personal data being beamed back and forth to digital twins must be strongly encrypted and safeguarded. Twins of landmarks and buildings would likely contain blueprints and maps of their interiors, and any personal data relating to users must be kept safe.

The idea of constant monitoring may be off-putting for some, though the decentralised nature of the metaverse could help. If the metaverse is hosted by millions of users across a blockchain network such as Bitcoin or Ethereum, the data would be much harder to tamper with, given the security of the ledger technology, whereas a centralised server hosting this information may be more liable to attack.

Metaverse Possibilities

One way digital twinning can be used within the metaverse is to represent aspects of reality that not everyone can access. When we imagine the metaverse, we often picture fantastical, grandiose buildings and spaces. In theory, the metaverse could host a shopping district where brands create a neon-soaked, futuristic 3D building to house their products – or they might instead use a digital twin of their store in London, Milan or New York.

The same applies to famous landmarks and buildings. You might ask yourself, “why would we need a metaverse version of Buckingham Palace or the White House?” But there is great benefit in being able to step into these 3D-realised locations when physical travel isn’t possible due to economic, geographical or even cultural limitations. These buildings, and the people who operate within them, can be studied in a whole new light.

Ultimately, there are many educational reasons for these landmarks to be created within a VR metaverse; digital visitors can observe and learn about everything from business management to architecture, particularly if these are ‘living, breathing’ twins fed with sensor data that reveals a wealth of information.

If we look at our earlier example of a fashion house hosting a storefront in the metaverse, sensor or transaction data could change the contents of the metaverse shop’s shelves and rails to reflect demand in the real-world store.

Of course, this can work in reverse too. Shopping trends within the metaverse could adjust stock levels in the real-world store, with the most popular products ordered and stocked in reality; people stopping in front of an exhibit in a metaverse museum might prompt curators to move the real exhibit to an area with higher footfall. Metaverse members could even stress-test an environment digitally to aid real-world planning and development.

Digital twinning within the fashion sector is already starting to grow, with consumers able to purchase digital accessories paired with a physical item; in March 2021, RTFKT facilitated a sale of NFT sneakers designed by FEWOCiOUS. Potential buyers were able to try the sneakers on virtually using AR technology, and buyers received the NFT kicks along with a physical pair.

New Realities

In 2018, Microsoft and Mojang partnered with the Great Ormond Street Hospital charity to build the famed London hospital within Minecraft. The purpose was to allow children due to become patients to explore the building and grow comfortable with the surroundings before attending an appointment in person. Digital twinning can use more graphically sophisticated software to do similar things – and maybe it’s not a hospital, but a wildlife sanctuary for a post-pandemic school field trip, or a trip to Cape Canaveral to watch the launch of Apollo 13.

These examples work well on a 2D desktop or TV display, but they become far more immersive with mixed reality (MR). Where digital twins are already in use, they benefit greatly from AR or VR technology. VR provides that immersion: walking through the hallowed halls of an ancient castle, or exploring a virtual shopping centre due to be built in your neighbourhood.

Augmented reality, on the other hand, still utilises the reality around us rather than blocking out the natural world. So, to reach back to our previous examples, you might be walking those hallowed halls while using a tablet or smartphone to see a digital overlay delivering information or dramatic re-enactments. Or, for those in industries such as retail or construction, a 3D shopping centre constructed from planning blueprints can help realise a new vision, with users swiping through digital layers of construction.

Within the metaverse, whole buildings can be constructed and filled with digital twinning tools, allowing fully remote training for new jobs or for treating rare health problems. And because this all happens within a digital environment, it means lower financial losses and complete safety for users. In the metaverse, anything can be tried.

Have you seen Pixar’s film Soul? In it, a soul yet to be born to Earth tries out many different careers and pastimes in search of the ‘spark’ or passion that will define her life. What’s to say we won’t one day be able to wander the halls of potential careers, try them out and then pursue them – or learn something new about ourselves that changes our lives for the better?

The Future of Healthcare in the Metaverse

While medicine has traditionally been a hands-on practice, the pandemic rapidly accelerated the adoption of remote care technologies. Before COVID-19, a reported 43% of healthcare facilities offered telehealth services; in 2020, that figure rose to 95%.

Across the globe, many of our healthcare systems have also come under intense scrutiny. With the pressures of rising costs, ageing populations, limited resources and the strain of a global pandemic, the idea of bringing parts of healthcare from the hospital to the home no longer sounds like a bad one.

As our everyday lives become more and more digitised, the pandemic’s push towards digital services has unearthed new health-related opportunities and business models to explore. Let’s highlight some of the ways in which these developments will start to shape the future of healthcare in the metaverse.

Osso VR Tool Use
Image credit: Osso VR

Digital twins will revolutionise everything

The consensus amongst experts is that digital twins will be the foundation on which the metaverse is built. Digital twin technology also has the potential to transform several key areas of healthcare – including the treatment and diagnosis of patients, better-optimised preventative care, better surgery preparation and much more.

Currently, 25% of healthcare executives report using digital twins within their organisations, while an estimated 66% believe their investment in digital twins will increase as the metaverse emerges. And while we are still in the early days, healthcare leaders across the globe have already begun connecting networks of digital twins to create virtual models of supply chains, facilities and even human organs and other body parts.

Some experts even believe that everyone could one day have access to a digital twin of their genetic profile, created for them at birth. Should they fall ill, this “virtual profile” would be treated computationally first, giving doctors advanced insight into how best to treat their real bodies.

Digital twins will also certainly improve surgical practice in the metaverse. Surgery could be practised on a digital twin before the real-life procedure is carried out, enabling surgeons to reference points in the simulated anatomy as needed. This would also allow experimental techniques or treatments to be trialled on digital twins before being applied to real bodies, reducing the risk to patients.

VR Surgery
Photo by © EPStudio20 – Shutterstock.com

Several vendors have also made progress with creating customised virtual organs for patients, which can be used for research, observation and better surgical planning. Leading electronics providers Philips and Siemens have both developed digital twins of the human heart to simulate cardiac catheter interventions and other custom treatments. Dassault Systemes has also created a specialised digital heart model in collaboration with US-based hospitals; these digital twins have helped surgeons calculate the shape of a cuff between the heart and its arteries. Sim&Cure’s Sim&Size platform now helps brain surgeons treat aneurysms using simulations, allowing for better pre-operative planning.

Brian Kalis, managing director of digital health at Accenture, puts things nicely: “Digital twins have potential across both clinical and operational dimensions in the healthcare industry. The ability to model the physical world in a digital format could help with medical education, research and care delivery in the future.” 

Moreover, Kalis believes that: “Digital twins also have the potential to improve operational efficiency of healthcare enterprises through the ability to track and trace healthcare facilities, equipment and supplies in near-real-time, [allowing them to] more efficiently match supply and demand.”

It will transform medical training as we know it

VR has been used by companies to conduct medical training for a number of years now. However, emerging metaverse platforms are combining VR, AR and AI to offer more effective, real-time guidance for training medical staff. Surgical training, for instance, has immense potential to be revolutionised within the metaverse: against a backdrop of immersive experiences that replicate surgical practice, real-time guidance can be delivered within surgeons’ fields of view on XR devices.

Metaverse technology may even one day allow students or trainees to “enter” a simulated body, offering a full-scale view and replication of actual procedures. AR also provides more hands-on learning, giving medical students the opportunity to practise and visualise new techniques before performing them in real life.

Veyond Metaverse aims to be a leading future healthcare metaverse ecosystem — citing advanced cloud and real-time communication technology as part of their communication infrastructure. Under their platform, their goal is to: “bring global participants into [their] metaverse world, enabling healthcare professionals to interact in real-time. Thus, simultaneous education, training, planning and collaborative medical procedures are possible.”

It will enhance mental health resources and treatment

While some analysts suggest that the metaverse has the potential to remove users from reality and negatively impact their mental health, a great deal of research suggests that the next phase of the web will also make way for more innovative mental health treatment. As it turns out, there are multiple ways in which mental health conditions can be improved through VR technology.

A peer-reviewed study from Oxford University recently concluded that patients who tried VR therapy saw a 38% decrease in anxiety or avoidant symptoms over the course of a six-week period. Another study also found that patients suffering from paranoid beliefs noticed a reduction in their phobias after even just one VR coaching session.

VR Mental Health
Photo by © DC Studio – Shutterstock.com

Doctors are already recommending VR video games to treat mental health-related conditions such as brain fog, ADHD, PTSD and depression. In June 2020, Akili Interactive’s EndeavorRx became one of the first “prescription-strength” video games to be approved by the FDA to treat ADHD in children.

And Rey, a growing Texas-based metaverse startup, secured its Series A funding round within the last year. Rey offers VR sessions to help users work through challenges, aiming to “rewire the circuitry” that causes anxiety. Through VR, Rey’s users can access simulations of various social situations – an opportunity to better acclimatise to scenarios that may trigger their anxiety symptoms. Throughout these sessions, human coaches also provide guidance to help users develop stronger coping mechanisms.

So why exactly is virtual therapy effective? In short, because VR can trick the brain into reacting as if an encounter were real, it can also teach us healthier coping strategies – a phenomenon we may see become more commonplace in treating mental health conditions in the metaverse.

Oxford professor Daniel Freeman (who also happens to be a scientific founder at Rey) has remarked on the effectiveness of VR therapy: “The beautiful bit… is that there’s also a conscious bit of your brain saying it’s not real, therefore I can try things differently. It doesn’t break the spell — it just enables you to make the learning.”

It will pave the way for more digitised and decentralised interfaces

The COVID-19 pandemic forced people worldwide to turn to digital services for wider (and safer) healthcare access. As a result, people have become increasingly comfortable with teleconsultations and with accessing their medical data through digital services.

We are likely to see this level of comfort deepen within the metaverse — with some analysts suggesting that we will eventually see the creation of an entire meta-health ecosystem. This may come in the form of avatars for more life-like consultations, or with treatment and diagnosis being provided through data interconnectivity.

Immersive Healthcare Interface
Photo by © Elnur – Shutterstock.com

UK-based non-profit organisation DeHealth has stepped to the forefront of the digital healthcare industry, announcing a decentralised metaverse platform that it hopes will see millions of doctors and patients interact with each other in full 3D. Users can even earn virtual assets by selling their anonymised medical data. And to top things off, DeHealth also plans to power its own economy using blockchain technology: the HLT (health) token will be offered as the primary means of settlement within the ecosystem.

Anna Bondarenko, co-founder of DeHealth, has outlined the company’s goal as: “Providing people with the most advanced technologies to preserve their health, so that every person in the world, regardless of their place of residence, social status and financial capabilities, can control their health and life.” And thanks to HLT, the hope is that anyone in the metaverse will one day be able to sell and control their anonymised medical information. DeHealth is due to be available for download in late 2022, offering access to 3 million Hospital OS users.

Final thoughts

In this article, we’ve observed some of the many ways in which the metaverse could transform the course of the healthcare sector. There is a long list of opportunities: for people one day to take better control over their own healthcare data, for medical students to learn from more advanced training modules, and for surgeons to reduce the number of trials performed on patients while increasing the efficiency of their procedures through digital twin models.

Overall, health leaders should lean into the metaverse and continue to explore the ways in which it can be used to make healthcare safer, more inclusive and more accessible for all.