NVIDIA researchers developed the thinnest VR display system yet, just 2.5 mm thick.
Called Holographic Glasses, the technology could one day lead to VR headsets as thin as sunglasses. For now, though, it's research hardware with severe practical limitations.
The primary driver of the size and bulk of today's VR headsets is the optical design. Achieving a relatively wide field of view with widely used optics like Fresnel and aspheric lenses requires a long gap between the lens and the display. After adding the plastic housing needed to contain this gap, even the most minimal designs end up above 300 grams. With the need for other components like a battery and cooling, a system like Quest 2 weighs just over 500 grams, for example.
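The geometry behind that trade-off can be sketched with simple-magnifier optics: the display sits roughly at the lens's focal plane, so the lens-to-display gap is about the focal length, and field of view follows from the display size. The numbers below are illustrative, not from any headset's spec sheet.

```python
import math

def fov_degrees(display_width_mm: float, focal_length_mm: float) -> float:
    """Approximate horizontal FOV of a simple-magnifier eyepiece,
    with the display placed at the lens's focal plane."""
    half_angle = math.atan((display_width_mm / 2) / focal_length_mm)
    return 2 * math.degrees(half_angle)

# A ~90 mm panel behind a ~40 mm focal-length lens needs a ~40 mm
# lens-to-display gap and yields a FOV of roughly 97 degrees -- which
# is why wide-FOV headsets built this way cannot be thin.
print(round(fov_degrees(90, 40)))  # → 97
```

Shrinking the gap while keeping the FOV means either a much larger lens aperture or a fundamentally different optical architecture, which is exactly the problem pancake lenses and NVIDIA's holographic approach attack.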
The recent productization of pancake lenses is leading to a new class of significantly more compact headsets like HTC’s Vive Flow. Pancake lenses require a much shorter gap to the display, but the lens and panel are still very distinct elements and the result is still much thicker than glasses.
NVIDIA’s researchers are presenting a very different type of VR optical system using “a pupil-replicating waveguide, a spatial light modulator, and a geometric phase lens”. It’s a technically detailed but very clearly written paper, so rather than paraphrasing we invite you to read it in the researchers’ own words.
Remarkably, it’s a true holographic display, providing realistic depth cues to mitigate a major flaw of today’s headsets called the vergence-accommodation conflict – the discomfort your eyes feel because they converge on the apparent distance of virtual objects while focusing at the fixed focal distance of the lenses.
At 2.5 mm, it’s even thinner than the 9 mm “holographic” glasses Facebook researchers presented two years ago. NVIDIA’s research is even more impressive because Facebook’s display was monochrome rather than full color, and fixed focus rather than truly holographic.
This is groundbreaking research that may one day lead to dramatically thinner VR headsets. But for now, it has severe limitations. Even with eye tracking the eyebox is just 8mm, and the field of view is just 23°. That’s a fraction of current headsets, and narrower than even today’s primitive AR glasses. The researchers claim that by stacking two geometric phase lenses and using a larger spatial light modulator the field of view could reach as wide as 120° diagonal “without significantly increasing the total thickness”. That’s an extraordinary claim we’d need to see demonstrated in a future project to believe.
The never-ending quest for thinner, lighter and more compact virtual reality (VR) and augmented reality (AR) devices primarily starts with display technology. From packing more pixels per inch (PPI) into displays to simplifying and streamlining the optics, it isn’t an easy process. But Stanford University researchers, in conjunction with NVIDIA, have recently showcased their latest project, dubbed Holographic Glasses.
Most of a VR headset’s bulk comes from the distance between its magnifying eyepiece and the display panel; designers try to fold the light path into as short a space as possible while maintaining image quality and avoiding distortion. Hence most VR devices use Fresnel lenses, which offer a good trade-off between optic weight and light refraction.
As part of SIGGRAPH 2022, which takes place this summer, NVIDIA and Stanford University have unveiled their latest research: glasses that create 3D holographic images from a display just 2.5 millimetres thick. Even thinner than pancake lenses, the system is possible because the “Holographic Glasses are composed of a pupil-replicating waveguide, a spatial light modulator (SLM), and a geometric phase lens” that together create the holographic images.
The SLM is able to create holograms right in front of the user’s eyes, removing the need for the gap that more traditional VR optics require to produce a suitable image, while the pupil-replicating waveguide and the geometric phase lens further reduce the setup depth. To balance display quality against display size, the researchers employed an AI-powered algorithm to co-design the optics.
All of this inside a form factor that only weighs 60g.
As these are still early research prototypes, this type of technology is still years away from deployment, if it ever arrives, with pancake lenses the next major step for most VR headsets. It has been rumoured that Meta’s upcoming Project Cambria will utilise pancake optics to give it a slimmer profile.
This isn’t the only VR collaboration between Stanford and NVIDIA for SIGGRAPH 2022, they’re also working on a paper looking at a “computer-generated holography framework that improves image quality while optimizing bandwidth usage.” For continued updates on the latest VR developments, keep reading gmw3.
Manufacturing is a highly complex process, in addition to being the most important part of supply chain management. Several factors affect the manufacturing production process, such as the availability of raw materials, labour costs, inventory costs and overall marketplace demand.
Since the start of the Industrial Revolution, the effective marriage of systems and machines has allowed us to increase production times, reduce product costs and find new ways of organising work. Within the last 50 years, digital transformation has continued this trend, enabling us to better understand the physical through digital operations.
Even so, the physical has still taken precedence over the digital for most of modern times. The rise of the metaverse will allow us to reverse this dichotomy, giving us access to a primarily digital space. In the case of the manufacturing industry, we will be able to translate this digital space onto the physical world, rather than simply enhancing it.
Let’s look at some of the key ways where we can expect to see the manufacturing industry change within the metaverse.
An entrance into the creator’s economy
The metaverse will provide users with easier access to digital materials — a major shift that may very well encourage more creators and consumers to pursue industrial design. This will inevitably create new industry demands and completely change how products are made.
3D content creation tools will also become more widely available in the metaverse. This will add manufacturing to the creator’s economy, providing the general public with more tools to render and simulate 3D prototypes at their own convenience.
Just as with gaming platforms, streaming services and other forms of online content creation, we can expect the same type of growth within manufacturing and supply chain management. According to analyst firm TrendForce, industrial metaverse revenue is set to reach $540 billion by 2025.
Easier collaboration on product development
The metaverse will also enable much easier collaboration on all aspects of product development. Given that it can serve as a communal space for all stakeholders involved with a project, multiple processes – such as product design, sharing with manufacturers and iterating based on feedback – will be able to happen more rapidly and in parallel.
NVIDIA’s VR-based collaboration tool Omniverse has experienced a successful launch in the enterprise sphere. As a multi-GPU, real-time development platform for design teamwork and 3D simulation, it has become a staple for those working in the industrial sector or for those who specialise in the creation of digital twin applications.
To date, Omniverse has been downloaded by over 50,000 creators, and NVIDIA recently launched a platform subscription to allow for wider outreach. The platform has already experienced tremendous growth, with integrations from popular design platforms (such as Blender and Adobe) available for developers to use from any location. These integrations have positioned NVIDIA well as a leader for collaborative product development in the metaverse.
Workplace changes due to the pandemic have also led to a rise in collaborative XR solutions within the enterprise sector. SkyReal, an aerospace-focused software company, started its operations by helping companies collaboratively approach their various stages of manufacturing – from conception and industrialization through to training and marketing. Now, SkyReal helps aerospace teams work on CAD files in real-time, offering them an immersive experience that allows for even better collaboration capabilities.
More streamlined processes through digital twins
Digital twins are virtual representations that serve as real-time replicas of a physical object. From gaming companies to automotive manufacturers, many industries have already started using digital twins to collect real-time data and predict how objects will perform before they are manufactured and sold.
The digital twin market has been projected to grow to an incredible $86 billion by 2025. This level of growth is largely being fueled by an increase in demand for things such as predictive maintenance, industrial IoT solutions and a smarter and more energy-efficient infrastructure.
Digital twins also provide real-time data for users, allowing them to gain better insights on overall production processes. For example, automotive manufacturers are already using digital twins to better pinpoint equipment failures and ensure that all parts are meeting quality standards before being delivered to customers.
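The pattern described above – a virtual counterpart mirroring live sensor readings and flagging parts that drift out of tolerance before they ship – can be sketched in a few lines. All names and thresholds here are invented for illustration, not drawn from any manufacturer's system.

```python
from dataclasses import dataclass, field

@dataclass
class MachineTwin:
    """Minimal digital twin of one machine on a production line."""
    machine_id: str
    tolerance_mm: float                       # allowed deviation from spec
    readings: list = field(default_factory=list)

    def ingest(self, part_id: str, deviation_mm: float) -> None:
        """Mirror one live measurement from the physical line."""
        self.readings.append((part_id, deviation_mm))

    def out_of_spec(self) -> list:
        """Parts whose measured deviation exceeds the tolerance."""
        return [pid for pid, dev in self.readings
                if abs(dev) > self.tolerance_mm]

twin = MachineTwin("press-07", tolerance_mm=0.05)
twin.ingest("part-001", 0.02)
twin.ingest("part-002", 0.09)   # out of tolerance
print(twin.out_of_spec())       # → ['part-002']
```

Real deployments layer streaming ingestion, physics simulation and predictive models on top, but the core idea is the same: the twin always holds the current measured state of its physical counterpart.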
BMW has already started using a simulated system to better streamline its production process. A version of the company’s Regensburg-based production line exists solely within a computer simulation, serving as a digital twin to its physical counterpart. Before any parts enter the production line, the entire manufacturing process runs in a hyper-realistic virtual iteration of the factory. By adopting this technology, managers can now plan their production process in greater detail.
Other large companies that have adopted the use of digital twins include Microsoft, Unilever, Boeing, Siemens Energy and Ericsson. With Azure Digital Twins, Microsoft has created a leading IoT platform that features a live execution environment, allowing users to create digital representations of real-life things, people, places and processes.
In all, digital twins will be an extremely integral building block of the metaverse. They will provide us with lifelike representations of things from our physical world and come equipped with live feeds of every sensor and component they contain.
Shorter lead times
The collaborative approach offered by working in the metaverse will certainly shorten project life cycles. More robust virtual spaces will also allow manufacturers to quickly see how moving assets around can impact a production cycle. By simulating real physics and identifying potential errors, this approach lets manufacturers achieve greater efficiency and faster turnaround times.
Down the road, greater interoperability initiatives will also make product designs generally easier and faster to implement. Designers and creators will no longer have to jump through as many hoops to complete their designs and get them into the hands of manufacturers. This will result in shorter lead times, as well as an exponential increase in the number of product designs they can complete.
Supply chain transparency
In more recent years, demand for supply chain transparency has been on the rise. According to the MIT Sloan School of Management, consumers are reportedly willing to pay between 2% and 10% more for products that offer greater supply chain transparency.
What we can deduce from this data is that consumers find value in the treatment of workers in a supply chain, as well as in a company’s efforts to provide decent working conditions. Ethical concerns, such as slave labour or deforestation, have made consumers increasingly averse to purchasing products that don’t meet these standards.
With this being said, the truth is that supply chains were not originally designed to be transparent. However, access to the supply chain or to digital twin management in the metaverse could resolve this issue for good.
Working in the metaverse will also provide far better project visibility, for both staff members and consumers alike. Given that multiple collaborators will be able to work within the same space, regardless of their physical location, all parties will have access to 3D design representations of how products are designed, built, sold and distributed. Customers may even grow used to tracking their orders throughout the entire cycle, from raw materials through to a finished product. With this added insight, customers will gain full transparency into the entire production process.
Greater supply chain transparency will also give customers greater visibility of lead times. This will offer them a better sense of real-time shipping costs and allow them to better prepare for potential pitfalls (such as shipping delays).
The metaverse will pave the way towards a digital-first approach to manufacturing. This shift will be driven both by consumer preferences and by the new ways of working that operating inside a virtual world will require.
There are valuable steps that manufacturers can take to bring us closer to an ideal metaverse system. For starters, it is critical that they work on harvesting data from their processes — and also that they implement the best interoperability protocols for connecting said data across the entire supply chain.
Recent innovations — such as NVIDIA’s CloudXR platform (which has been configured to work with Google Cloud) — have begun enabling organizations to securely access their data through cloud-based solutions. This will allow creators to access their work and collaborate on projects from anywhere in the world, all while doing so through the lens of an immersive, high-quality user experience.
In all, work in these areas is already underway to disrupt and reshape the concept of supply chains. This is an extremely exciting and innovative time for manufacturing technology – and we look forward to tracking the eventual paradigm shift that’s to come.
In its bid to help creators and businesses connect, NVIDIA launched the beta version of its Omniverse platform last year, offering early access to those interested in signing up. As part of CES 2022 this week, the company announced that the platform is now freely accessible to creators, with no sign-up required. Additionally, several new features are now available.
These new additions include Omniverse Nucleus Cloud, a feature that enables sharing of large Omniverse 3D scenes without transferring massive datasets so that clients can see changes made by creators in real-time. Then there’s Omniverse Machinima where users can remix and recreate their own videogame cinematics using free characters, objects and environments from titles such as Mechwarrior 5 and Shadow Warrior 3.
For those who require facial animations, there’s Omniverse Audio2Face, which NVIDIA describes as an “AI-enabled app that instantly animates a 3D face with just an audio track.” Creators can then export directly to Epic Games’ MetaHuman Creator app.
“We are at the dawn of the next digital frontier. Interconnected 3D virtual worlds … with shops, homes, people, robots, factories, museums … will be built by an expanding number of creators, collaborating across the globe,” said Jeff Fisher, senior vice president of NVIDIA’s GeForce business at CES 2022. “This is the future of 3D content creation and how virtual worlds will be built.”
To help creators even further, 3D asset libraries including TurboSquid by Shutterstock, CGTrader, Sketchfab and Twinbru have added support for the Omniverse ecosystem, all based on the Universal Scene Description (USD) format.
“With this technology, content creators get more than just a fast renderer,” said Zhelong Xu, a digital artist and Omniverse Creator based in Shanghai. “NVIDIA Omniverse and RTX give artists a powerful platform with infinite possibilities.”
NVIDIA Omniverse is free to download for individual users and works with GeForce RTX graphics cards to enhance existing 3D workflows. For businesses, there’s Omniverse Enterprise, a subscription service with a 30-day free trial. For continued updates from NVIDIA, keep reading VRFocus.