NVIDIA Omniverse XR Shows 3D Scenes In VR With Real-Time Ray Tracing

Omniverse XR lets you view 3D scenes in VR with real-time ray tracing.

NVIDIA describes it as “the first full-fidelity, fully ray traced VR”.

Omniverse XR supports importing highly complex USD scenes with tens of millions of polygons. Because the scene is fully ray traced, no preprocessing or compilation is required, unlike with game engines. Inside VR, objects can be repositioned and rescaled, and the lighting adapts in real time.

The application interfaces with SteamVR, so it should theoretically work with any PC headset. However, only the Oculus Touch and HTC Vive controllers are officially supported as input devices.

Of course, real-time ray tracing in virtual reality is no easy feat. NVIDIA lists the minimum required GPU as an RTX 3080 Ti, and the recommended requirement as two RTX 3090s with NVLink. To help with performance, Omniverse XR has built-in static foveated rendering, so only the center of the lens is rendered at full resolution. NVIDIA says the specific technique used here is “built specifically for real-time ray tracing”.
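The basic idea of static foveated rendering can be sketched in a few lines: shading effort is concentrated in a fixed central region of each eye's image, falling off toward the edges. The radii and falloff below are illustrative placeholders, not NVIDIA's actual parameters.

```python
import math

def foveation_scale(x, y, width, height, inner_radius=0.25, outer_radius=0.6):
    """Return a resolution scale for pixel (x, y) under static foveation:
    full resolution near the lens center, coarser toward the periphery.
    inner_radius/outer_radius are illustrative, not NVIDIA's real values."""
    # Normalised distance from the image center (0 at center, ~1.4 at corners)
    dx = (x - width / 2) / (width / 2)
    dy = (y - height / 2) / (height / 2)
    r = math.hypot(dx, dy)
    if r <= inner_radius:
        return 1.0          # full resolution in the central region
    if r >= outer_radius:
        return 0.25         # quarter resolution at the periphery
    # Linear falloff between the two radii
    t = (r - inner_radius) / (outer_radius - inner_radius)
    return 1.0 - 0.75 * t
```

A renderer would use a scale like this to decide how many rays to trace per region; because the falloff is fixed to the lens center rather than tracked to the gaze, no eye tracking hardware is needed.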

Whilst certainly an interesting experiment, those demanding requirements are a good reminder of just how much computing power truly photorealistic, real-time VR will someday need.

Omniverse XR can be downloaded inside the Omniverse launcher.

NVIDIA Researchers Built The Thinnest VR Display System Yet

NVIDIA researchers developed the thinnest VR display system yet, just 2.5 mm thick.

Called Holographic Glasses, this could one day lead to VR headsets as thin as sunglasses. But for now this is research technology with severe practical limitations.


The primary driver of the size and bulk of today’s VR headsets is the optical design. Achieving a relatively wide field of view with widely used optical technology like Fresnel lenses and aspheric lenses requires a long gap between the lens and display. After adding the plastic housing needed to contain this, even the most minimal designs end up above 300 grams. With the need for other components like a battery and cooling, a system like Quest 2 weighs just over 500 grams, for example.
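Why the gap has to be long follows from basic magnifier optics: the display must sit just inside the lens's focal length for the eye to see a comfortable virtual image. A minimal sketch using the thin-lens equation, with illustrative numbers (real headset optics are more complex than a single thin lens):

```python
def display_gap_mm(focal_length_mm, virtual_image_mm):
    """Thin-lens estimate of the lens-to-display gap needed to place the
    virtual image at a given viewing distance. Illustrative only; actual
    headset optics involve multiple elements and distortion correction."""
    # 1/f = 1/d_object + 1/d_image; the image is virtual (negative side),
    # so the display sits just inside the focal length of the lens:
    return 1.0 / (1.0 / focal_length_mm + 1.0 / virtual_image_mm)

# A single-element magnifier with a ~50 mm focal length needs the display
# roughly 48 mm behind the lens to form a virtual image ~1.4 m away, which
# is why Fresnel-lens headsets end up several centimetres deep.
gap = display_gap_mm(50.0, 1400.0)
```

Shortening that gap without losing image quality is exactly the problem pancake lenses, and ultimately holographic optics, set out to solve.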

The recent productization of pancake lenses is leading to a new class of significantly more compact headsets like HTC’s Vive Flow. Pancake lenses require a much shorter gap to the display, but the lens and panel are still very distinct elements and the result is still much thicker than glasses.

NVIDIA’s researchers are presenting a very different type of VR optical system using “a pupil-replicating waveguide, a spatial light modulator, and a geometric phase lens”. It’s a technically detailed but very clearly written paper, so rather than paraphrasing we invite you to read it in the researchers’ own words.

Remarkably, it’s a true holographic display, providing realistic depth cues that mitigate a major flaw of today’s headsets called the vergence-accommodation conflict – the discomfort your eyes feel because they converge at the virtual distance of objects while focusing at the fixed focal distance of the lenses.
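The conflict is easy to quantify: the convergence angle between the two eyes' lines of sight depends on the object's distance, while accommodation in a conventional headset is locked to the lenses' focal distance. A rough sketch with illustrative numbers (63 mm is a commonly cited average interpupillary distance):

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight when converging on a
    point at the given distance. The 63 mm IPD is an illustrative average."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# In a conventional headset the eyes always focus at the lenses' fixed
# focal distance (often ~1-2 m) but converge at the virtual object's
# distance. An object at 0.5 m demands ~7.2 degrees of convergence while
# the eyes accommodate as if it were at 1.5 m (~2.4 degrees). That
# mismatch is the conflict a true holographic display avoids.
near = vergence_angle_deg(0.5)    # converging on an object at 0.5 m
focal = vergence_angle_deg(1.5)   # the fixed distance the eyes focus at
```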

At 2.5mm, it’s even thinner than the 9mm “holographic” glasses Facebook researchers presented two years ago. What makes NVIDIA’s research even more impressive is that Facebook’s prototype was monochrome, not full color, and fixed focus, not actually holographic.

This is groundbreaking research that may one day lead to dramatically thinner VR headsets. But for now, it has severe limitations. Even with eye tracking the eyebox is just 8mm, and the field of view is just 23°. That’s a fraction of current headsets, and narrower than even today’s primitive AR glasses. The researchers claim that by stacking two geometric phase lenses and using a larger spatial light modulator the field of view could reach as wide as 120° diagonal “without significantly increasing the total thickness”. That’s an extraordinary claim we’d need to see demonstrated in a future project to believe.

NVIDIA & Stanford Researchers Use AI to Develop 2.5mm Thick Wearable Display

The never-ending quest for thinner, lighter and more compact virtual reality (VR) and augmented reality (AR) devices starts primarily with display technology. From packing more pixels per inch (PPI) into displays to simplifying and streamlining the optics, it isn’t an easy process, but Stanford University researchers, in conjunction with NVIDIA, have recently showcased their latest project, dubbed Holographic Glasses.

Holographic Glasses

Most of a VR headset’s bulk comes from the distance between its magnifying eyepiece and the display panel; the challenge is folding the light path into as short a space as possible whilst maintaining quality and avoiding distortion. Hence why most VR devices use Fresnel lenses, as they offer a good trade-off between optic weight and light refraction.

As part of SIGGRAPH 2022, which takes place this summer, NVIDIA and Stanford University have unveiled their latest research: glasses that create 3D holographic images from a display just 2.5 millimetres thick. Even thinner than pancake lenses, the design is made possible by a combination of optical elements: the “Holographic Glasses are composed of a pupil-replicating waveguide, a spatial light modulator (SLM), and a geometric phase lens” to create the holographic images.

The SLM creates holograms right in front of the user’s eyes, removing the need for the gap that more traditional VR optics require to produce a suitable image, while the pupil-replicating waveguide and geometric phase lens further reduce the setup depth. To achieve an outcome that suitably balances display quality and display size, the researchers employed an AI-powered algorithm to co-design the optics.


All of this fits in a form factor weighing just 60g.

As these are still early research prototypes, this type of technology is years away from deployment (if it arrives at all), with pancake lenses being the next major step for most VR headsets. It has been rumoured that Meta’s upcoming Project Cambria will utilise pancake optics to give it a slimmer profile.

This isn’t the only VR collaboration between Stanford and NVIDIA for SIGGRAPH 2022, they’re also working on a paper looking at a “computer-generated holography framework that improves image quality while optimizing bandwidth usage.” For continued updates on the latest VR developments, keep reading gmw3.