Niantic is Bringing Its Large-scale AR Positioning System to WebAR Too

This week Niantic announced Lightship VPS, a system designed to accurately localize AR devices at large scale, enabling location-based AR content that can also be persistent and multi-user. While the first implementation of the system will need to be baked into individual apps, the company says it’s bringing the tech to WebAR too.

With the launch of Lightship VPS (visual positioning system), Niantic is staking its claim in the AR space by offering an underlying map on which developers can build AR apps tied to real-world locations. Localizing AR apps to real-world locations means those apps can have persistent virtual content that always appears in the same place in the world, even for different users at the same time.

The system is built into Niantic’s Lightship ARDK, a set of tools (including VPS) that developers can use to build AR apps. For the time being, VPS can be added to apps that users download onto their phones, but Niantic says it also plans to make a version of VPS that works from a smartphone’s web browser. While it’s not ready just yet, the company showed some live demos of the browser-based VPS in action this week.

WebAR is a collection of technologies that allows AR experiences to run directly from a smartphone’s web browser. Building AR into the web means developers can deploy AR experiences that are easy to share and skip the friction of a trip to an app store to download a dedicated app (you can check out an example of a WebAR experience here).

Image courtesy Niantic

Thanks to Niantic’s recent acquisition of WebAR specialist 8th Wall, the company is now poised to make VPS compatible with 8th Wall’s WebAR tools, bringing the same large-scale AR positioning capabilities to web developers. Though it showed off the first demos this week, the company hasn’t said when the WebAR version of VPS will become available.

Unexpected Update Brings New Content to SteamVR Home

Valve released an update to SteamVR this week that unexpectedly added new content to SteamVR Home.

Valve has long been making small but helpful updates to SteamVR, but in the last year especially there hasn’t been a particularly clear signal that the company was putting much work into the platform.

That’s why it was surprising to see that with the public launch of the SteamVR 1.22 update this week, the company added a surprise—a brand new photogrammetry environment for SteamVR Home, its first-party VR social space.

The new environment is a capture of a portion of the village of Fornalutx on Mallorca, an island in the western Mediterranean Sea.

SteamVR users can download the new environment by subscribing to it in the Steam Workshop. Once downloaded, it will become available as a new place you and friends can visit in SteamVR Home.

Unfortunately it doesn’t seem like this slice of new content means that Valve has renewed motivation to work on its VR platform. As explained in the patch notes, the photos were originally captured back in October 2019—shortly after the release of the company’s Index headset. Given what we know about Valve’s unique corporate structure, it seems likely the timing of this release merely coincided with an individual at the company digging up the old photos and processing them as something of a side project.

So, nice to have a cool new scan to explore, but we don’t think it means Valve is suddenly going to be cranking out big new updates for SteamVR.

The company has continued to work on SteamVR, albeit slowly. A handful of updates have landed over the last year or so that have made improvements to the SteamVR dashboard, added new settings for power users, and made the platform play nicer with Quest headsets using Oculus Link.

Photo by Road to VR

Granted, SteamVR-native versions of core Steam features (friends list, voice chat, a fully functional store and library, achievements, and more) are still long overdue. SteamVR has long fallen back on Steam’s ‘Big Picture’ interface, which is designed for large displays; in VR that interface runs very clunkily and was clearly built for a different input modality.

Beyond adding the new SteamVR Home environment, the SteamVR 1.22 update also brought with it a bunch of bug fixes and technical improvements that were previously released in beta versions of the software; you can see the full patch notes here.

Meta Research Explores a New Solution to One of VR’s Biggest Display Challenges

New research from Kent State University and Meta Reality Labs has demonstrated large dynamic focus liquid crystal lenses which could be used to create varifocal VR headsets.

Vergence-Accommodation Conflict in a Nutshell

In the VR R&D space, one of the hot topics is finding a practical solution to the so-called vergence-accommodation conflict (VAC). All consumer VR headsets on the market to date render imagery using stereoscopy, which supports the vergence reflex of a pair of eyes (when they converge on objects to form a stereo image) but not the accommodation reflex of an individual eye (when the lens of the eye changes shape to focus light from different depths).

In the real world these two reflexes always work in tandem, but in VR they become disconnected: the eyes continue to converge where needed, but their accommodation remains static because all of the light is coming from the same distance (the display). Researchers in the field say VAC can cause eye strain, make it difficult to focus on close imagery, and may even limit visual immersion.
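
To put rough numbers on the mismatch (the fixed focal distance here is an assumption for illustration; it varies by headset): optical power in diopters is the reciprocal of distance in meters, so a virtual object rendered at 0.5m demands vergence as if the eyes were focusing at 2.0 D, while accommodation stays pinned at a display focal plane of, say, 1.5m:

\[
D = \frac{1}{d}
\quad\Longrightarrow\quad
\underbrace{\tfrac{1}{0.5\,\mathrm{m}} = 2.0\,\mathrm{D}}_{\text{vergence demand}}
\quad\text{vs.}\quad
\underbrace{\tfrac{1}{1.5\,\mathrm{m}} \approx 0.67\,\mathrm{D}}_{\text{fixed accommodation}}
\]

a conflict of roughly 1.33 D that the visual system never encounters in the real world.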

Seeking a Solution

There have been plenty of experiments with technologies that could be used in varifocal headsets that correctly support both vergence and accommodation, for instance holographic displays and multiple focal planes. But it seems none has cracked the code on a practical, cost-effective, and mass-producible solution to VAC.

Another potential solution to VAC is dynamic focus liquid crystal (LC) lenses, which change their focal length as their voltage is adjusted. According to a Kent State University graduate student project with funding and participation from Meta Reality Labs, such lenses have been demonstrated previously, but mostly at very small sizes, because switching time (how quickly focus can be changed) grows significantly as lens size increases.

Image courtesy Bhowmick et al., SID Display Week

To reach the size of dynamic focus lens that you’d want if you were to build it into a contemporary VR headset—while keeping switching time low enough—the researchers have devised a large dynamic focus LC lens with a series of ‘phase resets’, which they compare to the rings used in a Fresnel lens. Instead of segmenting the lens in order to reduce its width (as with Fresnel), the phase reset segments are powered separately from one another so the liquid crystals within each segment can still switch quickly enough to be practical for use in a varifocal headset.
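
For intuition on the phase resets: an ideal thin lens imposes a phase profile that is parabolic in the radial coordinate r, and a liquid crystal lens realizes that profile modulo 2π, with each 2π wrap forming one reset ring (this is the standard diffractive-optics relation, not a formula from the paper itself):

\[
\phi(r) = -\frac{\pi r^2}{\lambda f},
\qquad
\phi_{\mathrm{LC}}(r) = \phi(r) \bmod 2\pi
\]

where λ is the wavelength of light and f the focal length. The wider the lens or the stronger its power, the more resets are needed, which is exactly why scaling these lenses up without artifacts is hard.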

A Large, Experimental Lens

In new research presented at the SID Display Week 2022 conference, the researchers characterized a 5cm dynamic focus LC lens to measure its capabilities and identify strengths and weaknesses.

On the ‘strengths’ side, the researchers show the dynamic focus lens achieves high image quality toward the center of the lens while supporting a dynamic focus range from -0.80 D to +0.80 D and a sub-500ms switching time.

For reference, in a 90Hz headset a new frame is shown to the user every 11ms (90 times per second), while a 500ms switching time is the equivalent of 2Hz (twice per second). While that’s much slower than the framerate of the headset, it may be fast enough given the rate at which the eye itself can adjust to a new focal distance. Further, the researchers say the switching speed can be increased by stacking multiple lenses.

Image courtesy Bhowmick et al., SID Display Week

On the ‘weaknesses’ side, the researchers find that the dynamic focus LC lens suffers from a reduction in image quality as the view approaches the edge of the lens due to the phase reset segments—similar in concept to the light scattering due to the ridges in a Fresnel lens. The presented work also explores a masking technique designed to reduce these artifacts.

Figures A–F are captures of images through the dynamic focus LC lens, increasingly off-axis from center, starting with 0° and going to 45° | Image courtesy Bhowmick et al., SID Display Week

Ultimately, the researchers conclude, the experimental dynamic focus LC lens offers “possibly acceptable [image quality] values […] within a gaze angle of about 30°,” which is fairly similar to the image quality falloff of many VR headsets with Fresnel optics today.

To actually build a varifocal headset from this technology, the researchers say the dynamic focus LC lens would be used in conjunction with a traditional lens to achieve the optical pipeline needed in a VR headset. Precise eye-tracking is also necessary so the system knows where the user is looking and thus how to adjust the focus of the lens correctly for that depth.
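
The paper describes hardware rather than software, but the control loop such a headset would need is easy to sketch. Here’s a minimal toy version in Python (all names are hypothetical): it estimates fixation depth as the point of closest approach of the two gaze rays from an eye tracker, then requests an optical power from the LC lens, clamped to the ±0.80 D range characterized above.

    import numpy as np

    LENS_RANGE_D = 0.80   # +/- dynamic focus range reported for the lens (diopters)
    BASE_FOCUS_M = 1.5    # assumed fixed focal plane of the traditional lens

    def vergence_depth(o_l, d_l, o_r, d_r):
        """Fixation depth (m) from the point of closest approach of two gaze rays."""
        d_l, d_r = d_l / np.linalg.norm(d_l), d_r / np.linalg.norm(d_r)
        w = o_l - o_r
        b, d, e = d_l @ d_r, d_l @ w, d_r @ w
        denom = 1.0 - b * b
        if denom < 1e-6:                  # near-parallel gaze -> distant fixation
            return float("inf")
        t = (b * e - d) / denom
        s = (e - b * d) / denom
        fixation = ((o_l + t * d_l) + (o_r + s * d_r)) / 2.0
        return float(np.linalg.norm(fixation - (o_l + o_r) / 2.0))

    def lens_command(depth_m):
        """Relative power (D) to request from the LC lens, clamped to its range."""
        demand = 1.0 / max(depth_m, 0.1) - 1.0 / BASE_FOCUS_M
        return float(np.clip(demand, -LENS_RANGE_D, LENS_RANGE_D))

    # Eyes 64mm apart, both fixated on a point 0.5m straight ahead
    o_l, o_r = np.array([-0.032, 0.0, 0.0]), np.array([0.032, 0.0, 0.0])
    target = np.array([0.0, 0.0, 0.5])
    depth = vergence_depth(o_l, target - o_l, o_r, target - o_r)
    print(round(depth, 2), lens_command(depth))  # ~0.5m -> +1.33 D demand, clamped to +0.80 D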

The work in this paper presents measurement methods and benchmarks showing the performance of the lens which future researchers can use to test their own work against or identify improvements that could be made to the demonstrated design.

The full paper has not yet been published, but it was presented by its lead author, Amit Kumar Bhowmick, at SID Display Week 2022; it further credits Afsoon Jamali, Douglas Bryant, Sandro Pintz, and Philip J. Bos of Kent State University and Meta Reality Labs.

Continue on Page 2: What About Half Dome 3? »

Qualcomm’s Latest AR Glasses Reference Design Drops the Tether, Keeps the Compute

Qualcomm has revealed its latest AR glasses reference design, which it offers up to other companies as a blueprint for building their own AR devices. The reference design, which gives us a strong hint at the specs and capabilities of upcoming products, continues to lean on a smartphone to do the heavy compute, but this time is based on a wireless design.

Qualcomm’s prior AR glasses reference design was based on the Snapdragon XR1 chip and called for a wired connection between a smartphone and the glasses, allowing the system to split rendering tasks between the two devices.

Now the company’s latest design, based on Snapdragon XR2, takes the wire out of the equation. Rather than going fully standalone, however, the new reference design continues to rely on the smartphone to handle most of the heavy rendering, now over a wireless connection between the devices.

Image courtesy Qualcomm

In addition to Snapdragon XR2, the AR glasses include Qualcomm’s FastConnect 6900 chip, which equips them with Wi-Fi 6E and Bluetooth 5.3. The company says the chip is designed for “ultra-low latency” and manages less than 3ms of latency between the headset and the smartphone. The company has also announced XR-specific software for controlling the FastConnect 6900, allowing device makers to tune the wireless traffic between the devices to prioritize the most time-critical data, reducing instances of lag or jitter due to wireless interference.
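
Qualcomm hasn’t detailed how that traffic tuning works under the hood, but the general idea (marking latency-critical packets so the network stack services them ahead of bulk data) can be illustrated with ordinary sockets. A minimal sketch assuming a plain UDP link; the DSCP values are standard (RFC 2474), while the address and the split between pose and asset traffic are made up for illustration:

    import socket

    # DSCP code points (RFC 2474), shifted into the upper 6 bits of the TOS byte:
    # Expedited Forwarding for time-critical pose data, best-effort for bulk assets.
    DSCP_EF = 46 << 2
    DSCP_BE = 0

    def make_socket(dscp):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, dscp)
        return s

    pose_sock = make_socket(DSCP_EF)   # head/controller poses: every millisecond counts
    bulk_sock = make_socket(DSCP_BE)   # textures and audio: can tolerate some jitter

    GLASSES = ("192.168.1.50", 9000)   # hypothetical address of the glasses
    pose_sock.sendto(b"pose-update", GLASSES)
    bulk_sock.sendto(b"texture-chunk", GLASSES)

Wi-Fi access points map these DSCP classes onto WMM priority queues, which is one common way time-critical traffic gets serviced ahead of bulk transfers.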

Though a connected smartphone seems like the most obvious use-case, Qualcomm also says the glasses could just as well be paired to a Windows PC or “processing puck.”

Beyond the extra wireless tech, the company says the latest design is 40% thinner than its previous reference design. The latest version has a 1,920 × 1,080 (2MP) per-eye resolution at 90Hz. The microdisplays include a ‘no-motion-blur’ feature—which sounds like a low persistence mode designed to prevent blurring of the image during head movement. A pair of monochrome cameras are used for 6DOF tracking and an RGB camera for video or photo capture. The company didn’t mention the device’s field-of-view, so it’s unlikely to be any larger than the prior reference design at 45° diagonal.

Like its many prior reference designs, Qualcomm isn’t actually going to make and sell the AR glasses. Instead, it offers up the design and underlying technology for other companies to use as a blueprint to build their own devices (hopefully using Qualcomm’s chips!). Companies that build on Qualcomm’s blueprint usually introduce their own industrial design and custom software offering; some even customize the hardware itself, like using different displays or optics.

That makes this AR glasses reference design a pretty good snapshot of the current state of AR glasses that can be mass produced, and a glimpse of what some companies will be offering in the near future.

Qualcomm says its latest AR glasses reference design is “available for select partners” as of today, and it plans to make the design more widely available “in the coming months.”

VR Attraction Zero Latency Ditches Backpack PCs in Favor of Vive Focus 3 & Wireless Rendering

Zero Latency, one of the longest running VR attractions in the out-of-home VR space, is dropping the backpack PCs that were once the backbone of the platform. Now the company says it’s moving to standalone Vive Focus 3 headsets with wireless delivery of PC-rendered VR content.

Unlike a VR arcade, which lets customers play consumer VR content, Zero Latency is a VR attraction offering totally unique multi-user VR experiences designed to be played in a large, shared arena.

Image courtesy Zero Latency

The company, which offers up its platform and experiences to franchisees, has steadily upgraded its VR tech as the space has developed.

Early on the system relied on a custom backpack PC paired with OSVR HDK 2 headsets and an optical overhead tracking system. Eventually the company moved to purpose-built VR backpacks and first-gen WMR headsets from HP, which allowed it to streamline the system considerably by dropping the overhead tracking in favor of WMR’s inside-out tracking. Later versions of the system moved to the more modern HP Reverb headset.

Now Zero Latency has announced its latest upgrade to the system, which further streamlines the setup by opting for the standalone Vive Focus 3 and streaming PC-rendered content wirelessly to the headsets.

Image courtesy Zero Latency

That means dropping the VR backpacks entirely, which not only reduces the cost of the system, but significantly reduces complexity for both operators and users; operators don’t need to clean, charge, and maintain the backpack units, and it’s one less step during onboarding which means more playtime for users.

And while other standalone headsets like Quest 2 might have been an option, HTC’s Vive Focus 3 has a couple of unique advantages for out-of-home use, especially its swappable battery, which reduces the number of headsets an operator needs on hand since batteries can be charged independently and swapped on the fly.

On the content side, Zero Latency locations continue to offer the same experiences as before, which span cooperative and competitive multiplayer experiences with up to eight simultaneous players. Though, given the company’s knack for innovation in its in-house content, it’ll be interesting to see if the move to a more streamlined system will unlock potential for experiences that wouldn’t quite work with the bulkier setup.

Given today’s announcement, it’ll likely be some time yet before the upgrade rolls out to existing Zero Latency locations, but it seems the company will be offering this upgraded version of the system to new franchisees going forward.

Preview: ‘Shores of Loci’ is a Gorgeous 3D Puzzler Coming to Quest 2 & SteamVR Next Week

First-time VR studio MikeTeevee is soon to release Shores of Loci, a 3D puzzle game backed by gorgeous and fantastical visuals. The game is set for an Early Access release on Quest 2 via App Lab and on SteamVR on May 24th.

Though production company MikeTeevee has been around since 2011, Shores of Loci is the studio’s first VR game. Given that it’s initially releasing on App Lab (and also coming to SteamVR), you might expect this debut project to be rough around the edges. On the contrary, after previewing the game myself I found a polished experience that offers up enjoyable 3D puzzles against a backdrop of sharp, fantastical visuals that are a cut above many games you’d find on Quest 2.

At its most basic, Shores of Loci is like a fictional version of Puzzling Places. While the latter has you snapping together scans of real buildings, Shores of Loci instead slices up totally imagined (and quite beautiful) little dioramas.

A completed puzzle in ‘Shores of Loci’ | Image courtesy MikeTeeVee

Shores of Loci is enhanced by a surrounding environment that’s beautifully rendered and art directed—from the last glimpse of sunlight reflecting at the very edge of the horizon to the towering structures that surround you like silent giants—even on Quest 2 it all looks great.

A completed puzzle in ‘Shores of Loci’ | Image courtesy MikeTeeVee

The game effectively uses VR as a canvas for the imagination and serves up some very striking and creative visuals, like a scene transition that sees the entire world before you enveloped as if being consumed and then regurgitated by a black hole (it’s more peaceful than it sounds, I promise).

Shores of Loci’s puzzling offers a slightly more organic feeling than Puzzling Places, perhaps because of the way that the 3D models you fit together have volume inside of them instead of being hollow textures. In any case, the fundamental gameplay is quite similar in that you’ll need to use a combination of traditional puzzling skills (edge shapes, color matching, etc) with some spatial reasoning to reach the point that you get to snap that final, satisfying piece into place.

Alongside its lovely visual backdrop, Shores of Loci also has some great audio design, with peaceful music and satisfying sonic feedback as you progress through each puzzle.

– – — – –

For anyone that loves puzzles, Shores of Loci is an easy recommendation. You’re getting some fun 3D puzzles and a fantastical visual feast to go along with them. And you won’t need to wait long to try it yourself; Shores of Loci launches on App Lab and SteamVR on May 24th, priced at $15.

The 20 Best Rated & Most Popular Quest Games & Apps – May 2022

While Oculus doesn’t offer much publicly in the way of understanding how well individual games & apps are performing across its Quest 2 storefront, it’s possible to glean some insight by looking at apps relative to each other. Here’s a snapshot of the 20 best rated Oculus Quest games and apps as of May 2022.

Some quick qualifications before we get to the data:

  • Paid and free apps are separated
  • Only apps with more than 100 reviews are represented
  • App Lab apps are not represented (see our latest Quest App Lab report)
  • Rounded ratings may appear to show ‘ties’ in ratings for some applications, but the ranked order remains correct

Best Rated Oculus Quest 2 Games & Apps – Paid

The rating of each application is an aggregate of user reviews and a useful way to understand the general reception of each title by customers.

Rank | Name | Rating (# of ratings) | Rank Change | Price
#1 | Puzzling Places | 4.9 (1,130) | | $15
#2 | The Room VR: A Dark Matter | 4.89 (10,954) | | $30
#3 | I Expect You To Die 2 | 4.86 (2,100) | | $25
#4 | Walkabout Mini Golf | 4.86 (7,013) | | $15
#5 | Swarm | 4.82 (2,002) | | $25
#6 | Ragnarock | 4.82 (844) | | $25
#7 | Moss | 4.81 (5,816) | | $30
#8 | I Expect You To Die | 4.8 (4,667) | | $25
#9 | YUKI | 4.8 (193) | | $20
#10 | Cubism | 4.8 (701) | | $10
#11 | Cosmonious High | 4.79 (267) | | $30
#12 | Pistol Whip | 4.78 (8,478) | | $30
#13 | The Thrill of the Fight | 4.78 (9,168) | | $10
#14 | Five Nights at Freddy’s: Help Wanted | 4.77 (10,142) | | $30
#15 | Little Cities | 4.75 (110) | New | $20
#16 | GORN | 4.75 (6,612) | ↓ 1 | $20
#17 | In Death: Unchained | 4.75 (3,789) | | $30
#18 | Yupitergrad | 4.74 (513) | | $15
#19 | The Tale of Onogoro | 4.73 (212) | ↓ 3 | $30
#20 | Trover Saves the Universe | 4.73 (2,131) | ↓ 1 | $30

Rank change & stats compared to April 2022

Dropouts:
Vermillion

  • Among the 20 best rated Quest apps
    • Average rating (mean): 4.8 out of 5 (±0)
    • Average price (mean): $23 (±$0)
    • Most common price (mode): $30 (±$0)
  • Among all paid Quest apps
    • Average rating (mean): 4.2 out of 5 (±0)
    • Average price (mean): $20 (±$0)
    • Most common price (mode): $20 (±$0)
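
For the curious, aggregates like these are simple to reproduce. A small sketch using Python’s statistics module, with a made-up handful of (rating, price) pairs standing in for the full dataset:

    from statistics import mean, mode

    # (rating, price) pairs; sample values only, not the full ratings data
    apps = [(4.9, 15), (4.89, 30), (4.86, 25), (4.86, 15), (4.82, 25), (4.82, 25)]

    ratings = [r for r, _ in apps]
    prices = [p for _, p in apps]

    print(f"Average rating (mean): {mean(ratings):.1f} out of 5")
    print(f"Average price (mean): ${round(mean(prices))}")
    print(f"Most common price (mode): ${mode(prices)}")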

Continue on Page 2: Most Popular Paid Oculus Quest Apps »

Meta Releases UE4 Graphics Demo to Show What Quest 2 Can Do with Expert Optimization

In an effort to help PC VR developers bring their content to Quest 2, Meta has ported Showdown, an old UE4 VR graphics showcase, to the headset as a case study in optimization best practices.

Showdown is a UE4 PC VR demo originally made by Epic Games back in 2014 to show off high-fidelity VR graphics running at 90Hz on a GTX 980 GPU at a 1,080 × 1,200 (1.3MP) per-eye resolution.

Eight years later, you can now run Showdown on Quest 2 at 90Hz on the headset’s Snapdragon XR2 chip at 1,832 × 1,920 (3.5MP) per-eye resolution.

Meta ported the short demo as a case study in optimizing PC VR content to run on Quest 2.

And while the app has been heavily optimized and doesn’t look as good as its PC VR counterpart—decent anti-aliasing, lighting, and high-res textures are missing—it shows that developers don’t have to shy away from lots of objects, particles, and effects just because they’re targeting Quest 2.

The video above looks slightly worse than the experience in the headset itself due to a low-ish bitrate recording and the visibility of fixed foveated rendering (lower resolution in the corners of the image), which is significantly less visible in the headset itself due to blurring of the lens. Here’s Showdown running on PC if you’d like to see a comparison.

It’s not the best-looking thing we’ve seen on Quest 2, but it’s a good reminder that Quest 2’s low-power mobile chip can achieve something akin to PS2 graphics at 90Hz.

Meta’s Zac Drake published a two-part breakdown of the process of profiling the app’s performance with the company’s App Spacewarp tech, and the process of optimizing the app to run at 90Hz on Quest 2.

The GTX 980 GPU (which Showdown originally targeted on PC) is at least six times more powerful than the GPU in Quest 2… so there was a lot of work to do.

While the guide is specific to projects built with UE4, the overall process, as summarized by Drake, applies to optimizing any project to run on the headset (a toy sketch of the loop follows the list):

  1. Get the project building and running on Quest
  2. Disable performance intensive features
  3. Measure baseline performance
  4. Optimize the stripped down project
  5. Optimize individual features as we re-enable them:
    • Re-enable the feature
    • Measure its performance impact
    • Optimize as needed
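
That re-enable/measure/optimize cycle is easy to express in code. Below is a toy sketch with invented numbers; the Feature class and its costs are stand-ins for real engine profiling and quality settings, not anything from Meta’s guide.

    from dataclasses import dataclass

    TARGET_FRAME_MS = 1000 / 90    # ~11.1ms frame budget at 90Hz

    @dataclass
    class Feature:
        """Toy stand-in for an engine feature with a tunable quality level."""
        name: str
        cost_ms: float             # frame-time cost at full quality
        quality: float = 1.0

        def reduce_quality(self):
            self.quality *= 0.8    # e.g. cheaper shaders, fewer particles

        def cost(self):
            return self.cost_ms * self.quality

    def frame_time(base_ms, enabled):
        return base_ms + sum(f.cost() for f in enabled)

    base_ms = 6.0                  # measured baseline of the stripped-down project
    features = [Feature("particles", 2.0), Feature("dynamic shadows", 4.0)]
    enabled = []

    for feature in features:       # re-enable one feature at a time...
        enabled.append(feature)
        while frame_time(base_ms, enabled) > TARGET_FRAME_MS:
            feature.reduce_quality()   # ...and optimize until back under budget
        print(feature.name, round(frame_time(base_ms, enabled), 1), "ms")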

Although it’s plenty possible to get ambitious PC VR games running on Quest 2, building with the headset in mind from the outset is sure to bring better results, as developer Vertical Robot is hoping to prove with its upcoming Red Matter 2.

Researchers Show Full-body VR Tracking with Controller-mounted Cameras

Researchers from Carnegie Mellon University have demonstrated a practical system for full-body tracking in VR using cameras mounted on the controllers to get a better view of the user’s body.

Although it’s possible to achieve full-body tracking in VR today, it requires extra hardware that needs to be strapped onto your body (for instance, Vive Trackers or IMU trackers). That makes full-body tracking a non-starter for all but hardcore VR enthusiasts who are willing to spend the money and time to strap on extra hardware.

Three Vive Trackers are used here to add torso and foot-tracking | Image courtesy IKinema

Because standalone VR headsets already have cameras on them to track their position in the world and the user’s controllers, in theory it’s also possible to look at the user’s body and use a computer-vision approach to tracking it. Unfortunately the angle of the cameras from the headset is too extreme to get a reliable view of the user’s legs, which is what led Meta to recently conclude that full-body tracking just isn’t viable on a standalone headset (especially as they get smaller).

But researchers from Carnegie Mellon University are challenging that notion with a prototype standalone VR system that adds cameras to the controllers to get a much clearer view of the user’s body, making it possible to extract reliable tracking data for the legs and torso.

What’s especially interesting about this approach is that it seems to align with the direction next-gen VR controllers are already heading; both Meta’s Project Cambria and Magic Leap 2 are using controllers that ditch a headset-dependent tracking system in favor of calculating their position with their own inside-out tracking system.

Image courtesy Carnegie Mellon University

Using a standard Quest 2 headset as the basis for their prototype system, the researchers added two user-facing cameras to the controllers. With the user’s hands in front of them, the cameras can get a much clearer view of the upper and lower body. This view is corrected so a computer-vision system can optimally extract the user’s pose, and that data is then combined with the known position of the head and hands to create a full-body tracking model.

Image courtesy Carnegie Mellon University

Of course, the user’s hands won’t always be in front of them. The researchers say some limited testing showed that VR users have their hands out in front of them around 68% of the time. When the hands aren’t in a good position to capture the body, the system should fall back to an IK estimate of the body position. And though their prototype didn’t go this far, the researchers believe that with an additional camera angle on the controller, it should be possible to capture leg position even when the user’s arms and controllers are resting at their sides.
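
The full pipeline isn’t published yet, but the fallback logic described above fits in a few lines. This is a hedged toy sketch, not the paper’s method: in the real system a learned model would produce the body estimate and a confidence value from the controller-camera images, and the IK fallback would be far more sophisticated.

    import numpy as np

    CONF_THRESHOLD = 0.5   # below this, treat the vision estimate as unusable

    def ik_fallback(head_pos):
        """Crude inverse-kinematics guess: torso under the head, legs straight down."""
        torso = head_pos - np.array([0.0, 0.4, 0.0])
        feet = [torso + np.array([0.1, -0.9, 0.0]),
                torso + np.array([-0.1, -0.9, 0.0])]
        return {"torso": torso, "feet": feet}

    def fuse_body_pose(head_pos, vision_estimate, vision_conf):
        """Prefer the controller-camera estimate when the hands had a good view
        of the body; otherwise fall back to IK, as the researchers describe."""
        if vision_estimate is not None and vision_conf >= CONF_THRESHOLD:
            return vision_estimate
        return ik_fallback(head_pos)

    head = np.array([0.0, 1.7, 0.0])
    # Hands at the user's sides -> no usable view of the body this frame
    print(fuse_body_pose(head, vision_estimate=None, vision_conf=0.0))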

As for accuracy, the researchers, Karan Ahuja, Vivian Shen, Cathy Mengying Fang, Nathan Riopelle, Andy Kong, and Chris Harrison, say that millimeter tracking precision is probably out of the question for this sort of system, but centimeter tracking precision is likely on the table, which may be good enough for many VR use-cases. For their prototype specifically, the system had a “mean 3D joint error of 6.98cm,” though the researchers say this should be “considered the floor of performance, not the ceiling,” given the limited time they spent optimizing the system.

With full-body tracking, legs finally become a viable part of the experience. That’s desirable not just to make your avatar look more realistic to other people, but also because it opens up the option to incorporate your lower body into the experience, adding to immersion and giving players another input to use in gameplay.

Image courtesy Carnegie Mellon University

The researchers not only created a full-blown tracking model for the system, they also made some prototype experiences to show off how tracked legs can add to gameplay. They showed a hockey goalie experience, where players can block the puck with any part of their body; a ‘body shape matching’ experience, where players match the shape of an incoming wall to fit through it; and even a ‘Feet Saber’ game, where players cut blocks with their hands and feet.

– – — – –

So could we see full-body tracking from headsets like Magic Leap 2 and Project Cambria? It’s tough to say at this point; although the controllers appear to do their own inside-out tracking, the cameras on the controllers seem to point mostly away from the user.

But maybe some future headset—or just an upgraded controller—could make it happen.

Regardless of where those headsets land, this research shows that practical, low-friction full-body tracking on standalone VR headsets might not be that far out of reach. Combined with the ability to run highly realistic face-tracking, the standalone headsets of the future will radically increase the embodiment felt in VR.
