NVIDIA Omniverse XR Shows 3D Scenes In VR With Real-Time Ray Tracing
Omniverse XR lets you view 3D scenes in VR with real-time ray tracing.
NVIDIA describes it as “the first full-fidelity, fully ray traced VR”.
Omniverse XR supports importing highly complex USD scenes with tens of millions of polygons. Because the scenes are fully ray traced, no preprocessing or compiling is required, unlike in game engines. Inside VR, objects can be repositioned and rescaled, and the lighting adapts in real time.
The application interfaces with SteamVR, so it should theoretically work with any PC headset. However, only Oculus Touch and HTC Vive controllers are officially supported as input devices.
Of course, real-time ray tracing in virtual reality is no easy feat. NVIDIA lists the minimum required GPU as an RTX 3080 Ti, and the recommended configuration as two RTX 3090s linked with NVLink. To help with performance, Omniverse XR has built-in static foveated rendering, so only the center of each lens is rendered at full resolution. NVIDIA says the specific technique used here is “built specifically for real-time ray tracing”.
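The idea behind static foveated rendering can be sketched in a few lines: ray-tracing samples are concentrated where the lens resolves the most detail – its optical center – and thinned out toward the periphery. This toy sample-density map illustrates the principle; the fovea radius and falloff values are illustrative assumptions, not NVIDIA’s actual parameters:

```python
import numpy as np

def foveation_map(width, height, fovea_radius=0.35):
    """Toy sample-allocation map for static foveated rendering.

    Pixels within `fovea_radius` (a fraction of the half-diagonal)
    of the lens center get full ray-tracing sample density (1.0);
    density falls off linearly toward the periphery, floored at 0.2.
    All constants here are illustrative, not from Omniverse XR.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    cx, cy = (width - 1) / 2, (height - 1) / 2
    # Normalized distance of each pixel from the lens center (0..1).
    r = np.hypot(xs - cx, ys - cy) / np.hypot(cx, cy)
    # Full density inside the fovea, linear falloff outside it.
    falloff = 1.0 - (r - fovea_radius) / (1.0 - fovea_radius)
    return np.where(r <= fovea_radius, 1.0, np.clip(falloff, 0.2, 1.0))

dmap = foveation_map(64, 64)
print(dmap[32, 32])  # center pixel: full density, 1.0
```

A renderer would then fire proportionally fewer rays per pixel where the map value is low, which is where the savings for a fully ray traced VR view come from.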
Whilst certainly an interesting experiment, those demanding requirements are a good reminder of just how much computing power will be needed to someday drive truly photorealistic VR in real time.
Omniverse XR can be downloaded from within the Omniverse Launcher.
NVIDIA Researchers Built The Thinnest VR Display System Yet
NVIDIA researchers developed the thinnest VR display system yet, just 2.5 mm thick.
Called Holographic Glasses, this could one day lead to VR headsets as thin as sunglasses. But for now this is research technology with severe practical limitations.
The primary driver of the size and bulk of today’s VR headsets is the optical design. Achieving a relatively wide field of view with widely used optics like Fresnel and aspheric lenses requires a long gap between the lens and the display. After adding the plastic housing needed to contain this gap, even the most minimal designs end up above 300 grams. With the need for other components like a battery and cooling, a system like Quest 2 weighs just over 500 grams, for example.
The recent productization of pancake lenses is leading to a new class of significantly more compact headsets like HTC’s Vive Flow. Pancake lenses require a much shorter gap to the display, but the lens and panel are still very distinct elements and the result is still much thicker than glasses.
NVIDIA’s researchers are presenting a very different type of VR optical system using “a pupil-replicating waveguide, a spatial light modulator, and a geometric phase lens”. It’s a technically detailed but clearly written paper, so rather than paraphrasing we invite you to read it in the researchers’ own words.
Remarkably, it’s a true holographic display, providing realistic depth cues that mitigate a major flaw of today’s headsets: the vergence-accommodation conflict – the eye discomfort caused by converging on the apparent distance of virtual objects while focusing at the fixed focal distance of the lenses.
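The mismatch is easy to quantify: vergence demand depends on the virtual object’s distance, while accommodation stays pinned at the lenses’ focal distance. A minimal sketch – the 63mm IPD and 1.5m focal distance are illustrative assumptions, not values from the paper:

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Angle between the two eyes' lines of sight when converging
    on an object at `distance_m`, for a typical 63 mm IPD."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# In a conventional headset the eyes always focus at the lenses'
# fixed focal distance (illustrative value below), yet converge
# to the virtual object's distance -- that mismatch is the conflict.
FIXED_FOCAL_DISTANCE_M = 1.5  # assumed, varies by headset

for d in [0.5, 1.5, 10.0]:
    print(f"object at {d} m: vergence {vergence_angle_deg(d):.2f} deg, "
          f"accommodation fixed at {FIXED_FOCAL_DISTANCE_M} m")
```

A true holographic display removes the conflict by letting the eye focus at the object’s actual virtual distance, so vergence and accommodation agree again.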
At 2.5mm, it’s even thinner than the 9mm “holographic” glasses Facebook researchers presented two years ago – and NVIDIA’s research goes further, since Facebook’s display was monochrome rather than full color, and fixed focus rather than truly holographic.
This is groundbreaking research that may one day lead to dramatically thinner VR headsets, but for now it has severe limitations. Even with eye tracking, the eyebox is just 8mm, and the field of view is just 23° – a fraction of current headsets’ and narrower than even today’s primitive AR glasses. The researchers claim that by stacking two geometric phase lenses and using a larger spatial light modulator, the field of view could reach as wide as 120° diagonal “without significantly increasing the total thickness”. That’s an extraordinary claim we’d need to see demonstrated in a future project to believe.