New Quest Dev Tools to Add Leg Estimation for More Convincing Avatars

Meta announced that it’s offering new developer tools for Quest headsets to make avatars more realistic. The company also unveiled a Quest 3-exclusive upper body tracking feature that supports a much wider range of body motion.

At Connect 2023 late last week, Meta showed off some new features coming both to Quest 3 and the rest of the Quest platform.

On Quest 3, Meta says it will be able to use inside-out sensor data to optically track wrists, elbows, shoulders, and torso—something the company is calling ‘Inside Out Body Tracking’ (IOBT). The Quest 3-exclusive feature also tracks where your legs are relative to your torso, making avatars capable of bending forward and peering over a cliff.

Image courtesy Meta

By using this upper body data to extrapolate lower body actions, the company says it can make avatars replicate more natural movements than traditional inverse kinematics (IK)-based methods.
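For context, a traditional IK-style approach amounts to guessing leg pose from a handful of tracked points, most often just head height. The Python sketch below shows roughly what such a heuristic might look like; the function name, ratios, and angles are illustrative assumptions, not Meta's actual method.

```python
# Hypothetical sketch of a plain IK-style heuristic: guess leg pose purely from
# how far the head has dropped below standing height. All values are illustrative.

def estimate_legs_from_head(head_height_m: float, standing_height_m: float = 1.75):
    """Return rough hip/knee bend angles (degrees) from how far the head has dropped."""
    leg_length = standing_height_m * 0.48          # crude anthropometric ratio (assumed)
    drop = max(0.0, standing_height_m - head_height_m)
    crouch = min(1.0, drop / (leg_length * 0.6))   # 0 = standing, 1 = deep squat
    return {"hip_deg": 90.0 * crouch, "knee_deg": 120.0 * crouch}

print(estimate_legs_from_head(1.30))  # head dropped ~45 cm -> legs assumed bent
```

A heuristic like this can't tell a squat from a kneel, which is the kind of ambiguity a learned, upper-body-driven model aims to resolve.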

The company also announced a feature called ‘Generative Legs’, which is headed to Quest 2/3/Pro in December. The developer tool is said to create more realistic leg movement using either three-point body tracking or the Quest 3-exclusive IOBT. It’s capable of recreating more natural standing and sitting poses, a more lifelike gait when walking, and also supports jumping, ducking and squatting.

Since it’s essentially guessing where your legs might naturally be in any given situation, Generative Legs won’t account for individual leg movement the way a dedicated tracker might, such as a SteamVR tracking puck or Sony’s Mocopi motion capture device—that means your avatar can’t do karate or breakdance.

Still, it’s pretty impressive how much better the whole system is in comparison to standard IK. Granted, Quest users won’t be able to pull off the fancy footwork CEO Mark Zuckerberg did on the virtual stage at Connect 2022 last year, but it’s starting to look pretty close.

Check out Meta’s Generative Legs and the new Quest 3 upper body tracking feature in action in a Meta-built showcase app called Dodge Arcade:

Sony’s Mocap Device Lets ‘VRChat’ Users Unleash Their Inner Anime Girl

There are a few body tracking solutions on the market to help you on your quest to finally transform into a dancing anime girl, and Sony is now releasing its own previously Japan-only device in the US. 

Mocopi, which gets its name from motion capture (mocap), was initially announced in late 2022, becoming available exclusively in Japan in early 2023. While a ton of vTubers worldwide already jumped the gun and ordered direct from Japan, now Sony is making it officially available in the US, priced at $450.

Mocopi comes with six small and lightweight inertial measurement unit (IMU) sensors that hook into a dedicated smartphone app (iOS and Android), letting you do full-body motion tracking both in and outside of VR.

Image courtesy Sony

And while Mocopi seems to be squarely targeting those would-be vTubers, another use case the company is trumpeting is undoubtedly the device’s ability to give avatars better full-body tracking for things like VRChat. Yes, there’s a built-in VRChat integration, which means you can grab your Quest 2 or PC VR headset, hook up Mocopi to your extremities, and get dancing for all to see.

Like many such IMU-based tracking devices, positional drift is a real concern, although it seems Sony is pitching this more as a way to casually jump into body tracking and not get that 100 percent accuracy you’ll need when doing the [fill in a popular dance] on TikTok.
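For a sense of why drift is an inherent concern with IMU-only tracking, the toy Python sketch below double-integrates a small, constant accelerometer bias into position. The bias value and sample rate are made-up numbers for illustration, not Mocopi specifications.

```python
# Illustrative only: a tiny residual accelerometer bias, double-integrated into
# position, grows quadratically over time without external correction.

dt = 0.01            # assumed 100 Hz sample rate
bias = 0.02          # assumed 0.02 m/s^2 residual accelerometer bias
velocity = 0.0
position = 0.0

for step in range(60 * 100):   # simulate one minute of standing still
    velocity += bias * dt
    position += velocity * dt

print(f"Position drift after 60 s: {position:.2f} m")  # roughly 36 m of drift
```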

Coming from Sony, you might expect some sort of integration with PSVR 2, but that doesn’t appear to be in the cards. The company hasn’t mentioned any such integration since the device initially launched in Japan in January.

Mocopi is available to order exclusively from Sony for $450, and Sony says orders will ship to customers starting July 14th, 2023. Check out Sony’s quick start guide below to see just what you’re signing up for.

Hands-on: HTC’s New Standalone Vive Tracker Effortlessly Brings More of Your Body Into VR

With three versions of SteamVR trackers under its belt, HTC has been a leading enabler of full-body tracking in VR. Now the company’s latest tracker could make it even easier to bring your body into VR.

HTC’s new standalone Vive tracker (still unnamed) has a straightforward goal: work like the company’s existing trackers, but easier and on more platforms.

The ‘easier’ part comes thanks to inside-out tracking—using on-board cameras to allow the device to track its own position, rather than external beacons like those used by the company’s prior trackers.

Photo by Road to VR

On that front, things seem really promising so far. I got to demo the new Vive tracker at GDC 2023 this week and was impressed with how well everything went.

Photo by Road to VR

With two of the new Vive trackers strapped to my feet, I donned a Vive XR Elite headset and jumped into a soccer game. When I looked down at my feet, I saw a pair of virtual soccer shoes. And when I moved my feet in real life, the soccer shoes moved at the same time. It took less than two seconds for my mind to say ‘hey those are my feet!’, and that’s a testament to both the accuracy and latency being very solid with the new tracker.

That kind of performance is not a big deal for older trackers that use SteamVR Tracking, which has long been considered the gold standard for VR tracking. But to replicate it in a completely self-contained device that’s small and robust enough to be worn on your feet… that’s a big deal for those who crave the added immersion that comes with bringing more of your body into VR.

Throughout the course of my demo, my feet were always where I expected to see them. I saw no strange spasms or freezing in place, no desync of coordinate planes between the tracker and the headset, and no drifting of the angle of my feet. That allowed me to easily forget that I was wearing anything special on my feet and simply focus on trying to kick soccer balls into a goal.

While the tracker worked well throughout, the demo had an odd caveat—I had feet but no legs! That makes it kind of weird to try to juggle a soccer ball when you expect to be able to use your shin as a backboard but watch as the ball rolls right over your virtual foot.

Ostensibly this is the very thing that trackers like this should be able to fix; by attaching two more trackers to my knees, I should be able to have a nearly complete representation of my leg movements in VR, making experiences like ‘soccer in VR’ possible when they simply wouldn’t work otherwise.

I’m not sure if the demo app simply wasn’t designed to handle additional tracking points on the knees, or if the trackers are currently limited to just two, but HTC has confirmed the final inside-out Vive tracker will support up to five trackers in addition to the tracked headset and controllers.

Trackers can, of course, be used to track more than just your body, though apps that support these kinds of tracked accessories are rare | Photo by Road to VR

So the inside-out factor is the ‘easier’ part, but what about the other goal of the tracker—to be available on more platforms than just SteamVR Tracking?

Well, the demo I was playing was actually running purely on the standalone Vive XR Elite. To connect the trackers, a small USB-C dongle needs to be connected to the headset to facilitate the proprietary wireless connection between the dongle and the trackers. HTC says the same dongle can plug into a PC and the trackers will work just fine through SteamVR.

The company also says it’s committed to making the trackers OpenXR compatible, which means (in theory) any headset could support them if they wanted.

– – — – –

I only got to use it in one configuration (on my feet) and in one environment (a large office space). So there’s still the question of how robust they will be. For now though, I’m suitably impressed.

If these trackers really work as well as they seem from this first impression, it could open the door to a new wave of people experiencing the added immersion of full-body tracking in VR… but there’s still the lingering question of price, which historically never seems to be quite right for the consumer VR market when it comes to HTC. Until then, our fingers shall remain crossed.

This Developer Made A Massive Outdoor VR Racing Game (On A Race Track)

A new experimental Quest game lets its maker jump on a real bike and race it on an actual sprint track. But, no, you shouldn’t try it yourself.

Content creator Valem is behind the project, which saw him remove the Quest’s Guardian limitations and head outside to a race track. Quest headsets aren’t meant to be used outside and the camera-based tracking can be spotty in very bright conditions (not to mention you risk damaging those cameras in direct sunlight). We tested the kit outdoors when we first got one, too, but the results were pretty mixed.

It goes without saying that you shouldn’t try this yourself given that it’s dangerous to both you and your headset. Nevertheless, you can check out the developer’s work in the video below.

Valem took the race track and imported a map of it to Unity, allowing him to build virtual environments around the course. He then attached a Quest controller to the front of a bike, allowing him to track it in VR with a virtual model. As a final touch, obstacles were added around the course (in VR, not in real life) to avoid. Some of them were even moving, giving his game an assault course style. He then tasked himself with reaching the finish line as fast as possible.

Though a little rough and definitely not something others should try, it’s pretty fun to see in action. In fact, in one moment of tracking troubles, Valem manages to gracefully fall into a bush. All the same, it’s a pretty fascinating experiment, and one that perhaps demonstrates how AR, more than VR, might enhance outdoor activities in fun new ways.

Researchers Show Full-body VR Tracking with Controller-mounted Cameras

Researchers from Carnegie Mellon University have demonstrated a practical system for full-body tracking in VR using cameras mounted on the controllers to get a better view of the user’s body.

Although it’s possible to achieve full-body tracking in VR today, it requires the use of extra hardware that needs to be strapped onto your body (for instance, Vive Trackers or IMU trackers). That makes full-body tracking a non-starter for all but hardcore VR enthusiasts who are willing to spend the money and time to strap on extra hardware.

Three vive trackers are used here to add torso and foot-tracking | Image courtesy IKinema

Because standalone VR headsets already have cameras on them to track their position in the world and the user’s controllers, in theory it’s also possible to look at the user’s body and use a computer-vision approach to tracking it. Unfortunately the angle of the cameras from the headset is too extreme to get a reliable view of the user’s legs, which is what led Meta to recently conclude that full-body tracking just isn’t viable on a standalone headset (especially as they get smaller).

But researchers from Carnegie Mellon University are challenging that notion with a prototype standalone VR system that adds cameras to the controllers to get a much clearer view of the user’s body, making it possible to extract reliable tracking data for the legs and torso.

What’s especially interesting about this approach is that it seems to align with the direction next-gen VR controllers are already heading; both Meta’s Project Cambria and Magic Leap 2 are using controllers that ditch a headset-dependent tracking system in favor of calculating their position with their own inside-out tracking system.

Image courtesy Carnegie Mellon University

Using a standard Quest 2 headset as the basis for their prototype system, the researchers added two cameras to the controller which face the user. With the user’s hands in front of them, the cameras can get a much clearer view of the upper and lower body. This view is corrected so a computer-vision system can optimally extract the user’s pose and then combine that data with the known position of the head and hands to create a full-body tracking model.
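A hedged sketch of that fusion step, assuming the controller's own 6DOF pose is known: joints detected in the controller camera's frame are transformed into world space and merged with the headset-tracked head and hands. Function and field names here are hypothetical and not taken from the researchers' code.

```python
import numpy as np

# Hypothetical fusion step: camera-frame joints -> world frame, merged with the
# head and hand positions the headset already tracks. Names/values are assumptions.

def fuse_body_pose(joints_cam, cam_rotation, cam_position, head_pos, hand_positions):
    """joints_cam: dict of joint name -> 3D point in the controller camera frame."""
    joints_world = {
        name: cam_rotation @ p + cam_position      # camera frame -> world frame
        for name, p in joints_cam.items()
    }
    # Head and hands come straight from the headset's own tracking.
    joints_world["head"] = head_pos
    joints_world["left_hand"], joints_world["right_hand"] = hand_positions
    return joints_world

# Example: identity camera rotation, controller camera ~1.2 m up and 0.6 m forward.
legs = {"left_ankle": np.array([-0.1, -1.1, -0.5])}
pose = fuse_body_pose(legs, np.eye(3), np.array([0.0, 1.2, 0.6]),
                      np.array([0.0, 1.7, 0.0]),
                      (np.array([-0.3, 1.2, 0.4]), np.array([0.3, 1.2, 0.4])))
print(pose["left_ankle"])   # ankle position expressed in world coordinates
```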

Image courtesy Carnegie Mellon University

Of course, the user’s hands won’t always be in front of them. The researchers say some limited testing showed that VR users have their hands out in front of them around 68% of the time. When the hands aren’t in a good position to capture the body, the system should fall back to an IK estimate of the body position. And though their prototype didn’t go this far, the researchers say they believe that with an additional camera angle on the controller, it should be possible to capture the leg position even when the user’s arms and controllers are resting at their side.
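A minimal sketch of that fallback idea, assuming a per-frame confidence score for how well the cameras can see the body; the blending rule below is a simple illustration, not the paper's actual method.

```python
# Hypothetical fallback: blend camera-derived joints with a plain IK guess,
# weighted by how confident the camera view is (0..1). Names are illustrative.

def blend_leg_pose(camera_estimate, ik_estimate, view_confidence):
    """Linearly blend per-joint 3D positions between camera and IK estimates."""
    return {
        joint: tuple(
            view_confidence * c + (1.0 - view_confidence) * k
            for c, k in zip(camera_estimate[joint], ik_estimate[joint])
        )
        for joint in camera_estimate
    }

cam = {"left_knee": (0.10, 0.50, 0.00)}
ik = {"left_knee": (0.00, 0.55, 0.10)}
print(blend_leg_pose(cam, ik, view_confidence=0.2))  # hands at the side -> mostly IK
```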

As for accuracy, the researchers, Karan Ahuja, Vivian Shen, Cathy Mengying Fang, Nathan Riopelle, Andy Kong, and Chris Harrison, say that millimeter tracking precision is probably out of the question for this sort of system, but centimeter tracking precision is likely on the table, which may be good enough for many VR use-cases. For their prototype specifically, the system had a “mean 3D joint error of 6.98cm,” though the researchers say this should be “considered the floor of performance, not the ceiling,” given the limited time they spent optimizing the system.
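For clarity, mean 3D joint error is simply the average Euclidean distance between predicted and ground-truth joint positions, as in the toy example below. The joint values are invented; only the 6.98 cm figure comes from the researchers.

```python
import numpy as np

# Toy illustration of the metric: average per-joint Euclidean distance between
# predicted and ground-truth positions. Coordinates below are made up.

predicted = np.array([[0.00, 0.95, 0.10], [0.02, 0.48, 0.12]])     # e.g. hip, knee
ground_truth = np.array([[0.03, 0.90, 0.05], [0.00, 0.50, 0.05]])

mean_error_m = np.mean(np.linalg.norm(predicted - ground_truth, axis=1))
print(f"Mean 3D joint error: {mean_error_m * 100:.1f} cm")
```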

With full-body tracking, legs are finally a viable part of the experience. That’s desirable not just to make your avatar look more realistic to other people, but also because it opens the option to incorporate your lower body into the experience, adding to immersion and providing another input for players to use in gameplay.

Image courtesy Carnegie Mellon University

The researchers not only created a full-blown tracking model for the system, they also made some prototype experiences to show off how tracked legs can add to gameplay. They showed a hockey goalie experience, where players can block the puck with any part of their body; a ‘body shape matching’ experience, where players match the shape of an incoming wall to fit through it; and even a ‘Feet Saber’ game, where players cut blocks with their hands and feet.

– – — – –

So could we see full-body tracking from headsets like Magic Leap 2 and Project Cambria? It’s tough to say at this point; although the controllers appear to do their own inside-out tracking, the cameras on the controllers seem to point mostly away from the user.

But maybe some future headset—or just an upgraded controller—could make it happen.

Regardless of where those headsets land, this research shows that practical, low-friction full-body tracking on standalone VR headsets might not be that far out of reach. Combined with the ability to run highly realistic face-tracking, the standalone headsets of the future will radically increase the embodiment felt in VR.


“Major improvements” Coming to Quest 2 Hand-tracking with ‘2.0’ Upgrade

Meta today announced “major improvements” coming to Quest 2’s controllerless hand-tracking capability. The ‘re-architected computer vision and machine learning approach’ is said to specifically improve reliability for overlapping or fast moving hands and specific gestures. The SDK and OS update to enable these improved capabilities will begin rolling out today.

Meta first introduced controllerless hand-tracking to the original Quest back in late 2019, where it remained an ‘experimental’ feature until mid-2020, when the company began allowing developers to use the new capability in their apps.

Since then we’ve seen a handful of games incorporate hand-tracking into their apps and even the launch of some games that exclusively rely on hand-tracking, like Hand Physics Lab (2021) and Unplugged: Air Guitar (2021).

Now, a little less than a year later, Meta says it’s bringing “major improvements” to Quest 2’s hand-tracking capability (the company confirmed the original Quest will not receive these improvements).

The improvements come thanks to a ‘re-architected computer vision and machine learning approach’ which improves the robustness of hand-tracking in key ways.

With the 1.0 version of hand-tracking on Quest 2, the system had particular trouble recognizing the user’s hands when they obstructed or touched each other and when moving quickly. From the user’s point of view, their virtual hands would disappear momentarily during these lost tracking moments and then reappear once the system detected them again.

With the 2.0 version of hand-tracking on Quest 2, Meta says the system will handle those obstructed and fast-moving scenarios much better, leading to fewer instances of disappearing hands. The company calls it a “step-function improvement in tracking continuity.”

The update is also said to improve gesture recognition in the hand-tracking system. Gesture recognition looks for specific hand-poses which the system detects as unique and can therefore be used as inputs. For instance, pinching is one such gesture and it’s employed to allow users to ‘click’ on elements in the Quest interface.
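As a rough illustration of what a pinch-style gesture check involves, the sketch below compares the distance between thumb and index fingertips against a threshold. The threshold and coordinates are arbitrary assumptions, and Meta's actual gesture recognizer is a learned system rather than a hard-coded rule like this.

```python
import math

# Illustrative pinch check: fingertip distance under a small threshold counts as
# a pinch. Threshold and coordinates are assumptions, not Meta's implementation.

def is_pinching(thumb_tip, index_tip, threshold_m=0.02):
    return math.dist(thumb_tip, index_tip) < threshold_m

print(is_pinching((0.10, 1.20, 0.30), (0.11, 1.21, 0.30)))  # ~1.4 cm apart -> True
```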

In the demo below, a ‘grab’ gesture is used to hold the virtual object, and the improvement in robustness for clapping is demonstrated as well.

– – — – –

Meta says the hand-tracking 2.0 update on Quest 2 will begin rolling out via an SDK and OS update starting today. The company says developers who have already built hand-tracking into their apps won’t need to change any API calls in order to use the upgraded system, though it won’t be automatically enabled. The company says developers can reference “upcoming documentation” for enabling it in their apps.

The move should bring Quest 2’s hand-tracking a step closer to Ultraleap, which has maintained some of the best hand-tracking in the industry to date, though it isn’t clear yet how the two systems will stack up.


HTC Opens Pre-orders for ‘Mars CamTrack’ Virtual Production Box

HTC teased a new Vive product last week which leverages the company’s Vive Trackers for virtual production. Called Vive Mars CamTrack, the device aims to appeal to filmmakers looking for an easy way of integrating Vive Trackers into their productions by shrinking a complicated workflow into a compact box.

Update (April 26th, 2022): HTC announced its VIVE Mars CamTrack is ready for pre-order, priced at $5,000. The early bird pre-order includes a $200 redemption code for Glassbox Technologies’ professional virtual camera, DragonFly, the company says in a blog post.

Outside of the Mars production box itself, the package includes three camera mounts (named Rover), two SteamVR 2.0 base stations, and two VIVE Trackers (3.0). To learn more, check out the product website here.

HTC also dropped a video showing Mars CamTrack in action, featuring testimonials from Ryan Connolly of Film Riot, Norman Wang from Glassbox Technologies, Sam Gorski of Corridor Digital, and Paul Hamblin of Treehouse Digital.

The company is also demoing VIVE Mars CamTrack at the FMX conference in Stuttgart, Germany, from May 3-5. Check out the official event website here.

Original Article (April 18th, 2022): HTC’s Vive brand has been exclusively focused on XR technologies, but late last week the company teased a new Vive product that is likely more tangential to the XR space than part of it.

The new product was pictured essentially in full, which gives us some strong hints about what it will do.

Image courtesy HTC

The small box clearly has a display which shows that it can detect three Vive Trackers and four SteamVR Tracking base stations. And while the DisplayPort, HDMI, and three USB ports might suggest this is a compact PC that can handle its own VR rendering, other hints point toward a less powerful, purpose-built control device for detecting, capturing, and relaying the position of the trackers.

Why might you want such a thing? Virtual production—using mixed reality-like technologies for shooting film productions—is the obvious answer. This is bolstered by the ‘Timecode’ and ‘Genlock’ readouts on the box’s display, which are commonly used to keep film and audio equipment in perfect sync.

Positional tracking is hugely important in virtual production, especially for tracking cameras, whether you’re shooting entirely against a green-screen or an LED wall.

In the first case, precisely capturing the movement of the camera makes life much easier in post-production when CGI comes into play. Instead of manually aligning the real shot against CGI elements, the virtual camera can be easily aligned to the real camera to keep everything in sync.

If shooting against an LED wall (a large panoramic display that shows background imagery rendered in real-time at the time of filming), you need to know the precise position of the real camera in order to have the background imagery move realistically in real-time.
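In code terms, the core loop is conceptually simple: each frame, copy the tracked physical camera's pose onto the virtual camera rendering the background and stamp the sample with the production timecode so it stays aligned with the footage. The sketch below is a hypothetical illustration; the class and function names are not HTC's or any engine's actual API.

```python
from dataclasses import dataclass

# Hypothetical per-frame sync between a tracked real camera and a virtual camera.
# All class/field names here are assumptions made for illustration.

@dataclass
class Pose:
    position: tuple      # (x, y, z) in meters
    rotation: tuple      # quaternion (x, y, z, w)

@dataclass
class VirtualCamera:
    pose: Pose = None

def sync_frame(tracker_pose: Pose, timecode: str, virtual_cam: VirtualCamera, log: list):
    virtual_cam.pose = tracker_pose        # background parallax follows the real camera
    log.append((timecode, tracker_pose))   # keep a timecoded record for post-production

log = []
cam = VirtualCamera()
sync_frame(Pose((1.0, 1.6, -2.0), (0, 0, 0, 1)), "01:02:03:04", cam, log)
print(cam.pose.position, log[0][0])
```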

Beyond camera tracking, accurate position tracking in production can be used to track props, actors, and more, which makes them more easily replaced or altered in post-production.

Of course, there’s plenty of positional tracking technologies that have been used in the film space for decades at this point… so why would HTC be getting into the game?

Well, compared to most of what’s out there, Valve’s SteamVR Tracking system is easy to set up, fairly precise, and dang cheap. And HTC is the leading provider of SteamVR trackers, small tracking pucks which are compatible with the system.

For around a thousand dollars—excluding the price of HTC’s new… let’s call it ‘Vive Tracker box’—you can have a reasonably sized tracking volume with four SteamVR Tracking base stations to precisely track three trackers (though hopefully the box will support more than just three, which would make the system easily extensible).
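That rough figure checks out if you assume typical list prices of about $149 per SteamVR 2.0 base station and $129 per Vive Tracker 3.0 (assumed figures; prices vary by region and over time):

```python
# Back-of-the-envelope check of the "around a thousand dollars" figure,
# using assumed list prices of $149 per base station and $129 per tracker.

base_stations = 4 * 149
trackers = 3 * 129
print(f"Approximate tracking hardware cost: ${base_stations + trackers}")  # ~$983
```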

Compare that to something more commonly seen in the virtual production space, like OptiTrack, which can do more but starts closer to $10,000 and can easily exceed $100,000 if you want to increase the size of the volume.

Back to HTC’s new Vive product; it’s already possible to use SteamVR Tracking for virtual production use-cases, but it isn’t exactly a straightforward process. Not only do you need a dedicated PC with uncommon (in the film space) software installed (SteamVR), but you also need a USB dongle for each Vive Tracker that you plan to use.

The HTC Vive Tracker box is probably designed to be a turnkey solution that’s ready to go without any software installation or extra dongles, plus the ability to sync the positional tracking data timing precisely with other production equipment on set.

– – — – –

The end goal here is not just for HTC to make money by selling the box, but also by selling more of its Vive Trackers. The company’s tracking pucks are popular among hardcore VR enthusiasts who want to do full-body tracking, but that’s a highly niche audience. General purpose tracking, for virtual production or otherwise, is a much larger potential market for HTC to tap, even if it does mean venturing a bit outside of what has been the usual wheelhouse of the Vive brand.

And while the Vive Tracker box probably doesn’t mean too much for the XR industry itself, it may tell us more about how HTC’s XR arm—Vive—is faring right now.

With its last four major VR headset releases seemingly not generating much traction for the company in the consumer VR space where it once dominated, Vive is veering into new territories in search of business. Beyond the Vive Tracker box here, another recent example is Vive Arts, the company’s effort to stake a claim in the NFT art scene.

So far HTC has not officially announced the Vive Tracker box beyond the teaser photo—so we still don’t have details like when it will launch, how it will be priced, or what specific features it will have—though we expect those details to come soon.


Latest Manus VR Gloves Promise New Levels of Finger Tracking Accuracy

At GDC 2022 this week, VR glove creator Manus revealed its new Quantum Metagloves, which the company says deliver significantly more accurate finger tracking than its prior solutions. Though the gloves are priced for enterprise use, the company says it hopes to one day deliver the tech to consumers.

Manus has been building motion gloves for use in real-time VR and motion capture for years now, with prior offerings being based on IMU and flex-sensor tracking.

The company’s latest product, the Quantum Metagloves, moves to a new magnetic tracking approach which purportedly offers significantly more accurate finger tracking, especially when it comes to self-contact (ie: fingers touching other fingers or the palm of the hand).

Revealed for the first time at GDC 2022, the Quantum Metagloves were shown in a demo using a realistic real-time hand model that mirrored the wearer’s finger movements. Though the gloves are designed to work in conjunction with 6DOF tracking (via a SteamVR tracker or other motion tracking tech), the GDC demo didn’t employ 6DOF (which is why the visualization of the arm rotates in place). The latency reflected in this setup is also purportedly not representative of the actual tracking latency.

The Quantum Metagloves have a magnetic base positioned on the back of the palm while each finger has a module on the tip that is sensed within the magnetic field. Manus says this means the gloves can detect absolute finger length and width (once calibrated), which enables more accurate hand-tracking when combined with an underlying skeletal model of the hand that is scaled dynamically to the user.
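As a rough illustration of what scaling a skeletal model to the user could look like, the sketch below rescales a default finger's bone lengths to match a measured overall finger length. The default values and names are assumptions for illustration, not Manus' actual data model.

```python
# Hypothetical skeletal scaling: once absolute finger length is measured, rescale
# the default bone lengths to match. Default values below are assumptions.

DEFAULT_INDEX_BONES_M = [0.040, 0.025, 0.020]   # proximal, middle, distal phalanges

def scale_finger(measured_length_m: float, default_bones=DEFAULT_INDEX_BONES_M):
    scale = measured_length_m / sum(default_bones)
    return [bone * scale for bone in default_bones]

print(scale_finger(0.094))   # a slightly longer-than-default index finger
```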

Photo by Road to VR

In the video I asked the demonstrator to make a handful of different poses. Indeed, finger-to-finger and finger-to-palm contact looked impressive with no obvious clipping or stuttering. The company told me the demo wasn’t specially programmed to make clipping impossible and that the behavior was purely thanks to the positional data of the sensors which was described as “very clean” compared to alternative approaches to finger tracking.

Manus says the Quantum Metagloves are unique in this way, as other finger tracking technology tends to break down in these sorts of close-contact and self-contact scenarios, especially when both hands are near or touching each other. Even expensive optical tracking systems (with markers on the tips of each finger) can be foiled easily by self-occlusion or one hand occluding the other. Similarly, purely IMU-based finger tracking is prone to drift and requires regular recalibration.

But magnetic tracking is by no means perfect. In other magnetic tracking systems we’ve seen challenges with latency and electromagnetic interference.

Manus admitted that holding metallic or electronic items could throw off the tracking, but says it worked hard to ensure the gloves don’t interfere with each other; up to eight gloves can be active near each other without interference issues, the company says.

While self-contact looked generally quite good with the Quantum Metagloves, other poses didn’t fare quite as well—like a completely clenched fist. The demonstrator suggested this would be improved easily with a more robust calibration process that included similar poses; the calibration used for the demo at GDC, they said, was designed to be quick and easy for purposes of the show.

Photo by Road to VR

While the finger tracking did look great in many of the demos I saw, some of the other demo gloves on display showed much less accuracy. This was chalked up to “calibration,” though a big question for such systems is how much said calibration drifts over time and whether the periods between recalibration are practical for a given use-case.

In any case, use-cases will be deeply constrained by price; Manus says a pair of the Quantum Metagloves will cost $9,000, with pre-orders opening in April and shipments expected by the end of Q3. The company says it also plans to launch a haptic version of the Quantum Metagloves which will include per-finger haptics to enhance immersion in VR.

Manus maintains that it would like to bring its gloves to consumers one day, but says the number of custom parts and the manufacturing involved make it difficult to get the price down to a reasonable level.


Meta Says Full-body Tracking Probably Not Viable with Inside-out Headsets

Much ado has been made about Meta’s latest avatars which juxtapose impressive expression against, well… a complete lack of legs. While full-body tracking is both desirable and achievable today with outside-in tracking systems, Meta doesn’t think it’s viable with inside-out tracking on headsets like Quest 2. However, the company says it’s investigating ‘fake’ legs instead.

Andrew “Boz” Bosworth is the VP of Meta Reality Labs (the company’s XR division), and soon to be CTO of Meta overall. In his role heading the company’s XR division he’s made a habit of doing impromptu Q&A sessions via Instagram where he answers both personal and work-related questions.

In his latest Q&A he was asked about the potential for full-body tracking on future headsets from the company, but dismissed the idea as not viable with inside-out tracking.

While Quest 2 is presently capable of head and hand-tracking (which makes it relatively easy to estimate the position of the arms and chest too), the headset has no concept of where your legs, feet, or hips are, and that’s why Meta’s avatars are essentially chopped off below the waist when you see them in VR.

Image courtesy Meta

On the other hand, other VR systems using cameras external to the headset (known as outside-in tracking) are capable of full-body tracking to enable more lifelike avatars and some use-cases that simply wouldn’t be possible without it.

It’s been suggested that with some computer-vision trickery, perhaps tracking cameras mounted on the headset (known as inside-out tracking) could be used to estimate the position of the user’s feet. Bosworth reasoned that this is not only extremely difficult given the position of the cameras on Quest 2, but will become even harder as headsets shrink.

Body tracking is super tricky. Because from the camera that’s on your face it can’t see your legs very well. And as we want to make the [headset] smaller form-factor, it gets even worse—[the cameras] can’t even see past your cheek sometimes to your upper body. Now we can get away pretty well with the upper body because we can see your arms, elbows, hands, and we kind of have a sense of what the musculoskeletal structure must be doing behind it, but feet are tough.

So [using outside-in tracking as opposed to inside-out] is probably necessary for some of the [full-body tracking] use-cases people have in mind. So that’s one of the things that we’re looking at.

Now there’s been a leaked thing [about full-body tracking on Quest]… I think that’s actually probably a little premature… there’s nothing substantial behind that necessarily. But it’s something that we’re always looking at and have in mind for sure.

The “leaked thing” Bosworth mentioned is likely a hint about body tracking in Quest 2 documentation that was spotted by our friends over at UploadVR.

Later in the Q&A, Bosworth again addressed Meta’s legless avatars and suggested that the company will probably move forward with ‘fake’ legs, which would only be seen from a third-person vantage point, as a stopgap.

Tracking your own legs accurately is super hard and basically not workable just from a physics standpoint with existing headsets [that use inside-out tracking]. You could go outside-in body tracking—but that’s an extra component, an extra cost, an extra setup—it’s a lot. Or you could have no legs, but everyone else when they look at you could see that you have legs and we could fake it, and no one would know the difference. That’s a better direction.

Obviously we’ve read the jokes [about our legless avatars], we appreciate them; they are very funny and very fair. So we are looking at how we can do [some form of legs]… if you look at your own legs and you see them out of position, that is a very bad experience and you feel very dysmorphic, but if you look at someone else’s legs and we just made up their position but they seemed reasonable, you’d just be like ‘yeah that’s probably where [their legs] are.’

Naturally Bosworth understands that the fake legs approach wouldn’t enable use-cases like using your legs as game input or for things like dancing, but at least in simple social VR settings—where avatars are just hanging out and chatting—it would be nice to have avatars that aren’t awkwardly missing their lower half while floating in the air.
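To make the distinction concrete, here is a hypothetical sketch of the asymmetry Bosworth describes: synthesized legs are attached only to avatars as seen by other users, never in your own first-person view. The function and field names are illustrative assumptions, not Meta's implementation.

```python
# Hypothetical "fake legs" rule: show synthesized legs only on avatars viewed by
# other people, so you never catch your own legs out of position. Illustrative only.

def visible_leg_pose(avatar_owner_id, viewer_id, synthesized_legs):
    if avatar_owner_id == viewer_id:
        return None                    # first person: hide legs entirely
    return synthesized_legs            # third person: plausible made-up legs are fine

print(visible_leg_pose("alice", "alice", {"hip": (0, 0.9, 0)}))  # None for yourself
print(visible_leg_pose("alice", "bob",   {"hip": (0, 0.9, 0)}))  # legs shown to others
```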

Elsewhere in the Q&A Bosworth was asked if Meta would consider subscription-based access to the Quest content library (akin to Xbox Game Pass or Viveport Infinity). He said he didn’t think the company “has a catalog yet that could sustain a subscription,” further saying that cross-platform content would probably also be necessary for that business model to work (which is not presently the case with the Quest store). He did indicate that the company is likely to introduce gift cards for the Quest store in the future.


Ultraleap Hand-tracking Update Delivers Improved Two-handed Interactions

The latest version of Ultraleap’s hand-tracking tech is finally available today on Windows for use with the Leap Motion Controller accessory and promises to improve two-handed interactions, speed, and robustness. The release includes a demo experience showcasing how hand-tracking can be used as a primary input for a standalone XR device.

Ultraleap today publicly released ‘Gemini’, the company’s fifth-generation hand-tracking software which was initially made available in a developer preview earlier this year. The improved hand-tracking software has already been deployed to headsets like Varjo’s and been made available for devices based on Qualcomm’s Snapdragon XR2, and now it can be downloaded on Windows to be used with the company’s existing Leap Motion Controller accessory which can be mounted to VR headsets. Support for MacOS and Linux is expected further down the road.

While the Leap Motion Controller is by now quite old, the company has continued to refine the software that underlies it, improving on what is already recognized as some of the best hand-tracking tech available in the industry. More recently, Ultraleap has released improved hand-tracking modules with a wider field-of-view and other improvements, though these aren’t available as a standalone accessory.

Image courtesy Ultraleap

With the Gemini update, Ultraleap says it has improved two-handed interactions, initialization speed, and the robustness of its hand-tracking. Alongside the Windows release of Gemini, the company is also making available an ‘XR Launcher’ demo experience which shows how the hand-tracking tech can be used for a fully functional XR interface.
