China’s Largest Telecom Forms Metaverse Industry Alliance, Including Xiaomi, Huawei, HTC & Unity

China Mobile, the country's largest wireless carrier with over 940 million subscribers, has formed a metaverse industry alliance that includes some of the biggest names in China-based tech.

As reported by Shanghai Securities News (Chinese), China Mobile announced during Mobile World Congress Shanghai what it calls the ‘China Mobile Metaverse Industry Alliance’, something the company says will be “the world’s strongest metaverse circle of friends.”

At MWC Shanghai, state-owned China Mobile announced the first batch of 24 members of the alliance, including Huawei, Xiaomi, HTC Vive, Unity China, NOLO, XREAL (formerly Nreal), AI company iFlytek, video streaming platform MGTV, and cloud streaming platform Haima Cloud.

Image courtesy China Mobile

Main objectives include improving the state of metaverse development in China, sharing resources to deepen cooperation between the companies, and developing a “win-win concept” to share the new dividends of the digital economy. China Mobile additionally announced a member alliance fund that will support outstanding metaverse projects as well as R&D for both hardware and XR content creation.

At the MWC Shanghai press conference, Zhao Dachun, deputy general manager of China Mobile, said that the metaverse represents a new opportunity for trillions of yuan (hundreds of billions of USD) and “an important carrier to accelerate the construction of digital China and realize the digital economy.”

China Mobile isn’t new to the space. In 2018, China Mobile partnered with HTC to “accelerate the proliferation of 5G infrastructure and devices in China” and give HTC a greater push to get its VR devices into more retail channels.

In 2021, the company launched its own XR interoperability standard called GSXR (General Standard for XR), which included support from many of the companies listed above in addition to Pico, Rokid, Oppo, Baidu, Tencent, China Telecom, and Skyworth.

Migu, China Mobile’s streaming content subsidiary, has also recently built a new ‘Metaverse Headquarters’ in Xiamen, China. There, the company says it will leverage 5G and XR technologies to help build Xiamen into a “high-quality, high-value, modern and international” city with digital intelligence, China Daily reports.

Bonelab ‘May End Up’ Using 120 Hz On Quest 2 Via Application SpaceWarp

Bonelab “may end up” using 120 Hz refresh rate on Quest 2 via Application SpaceWarp, one of its developers said.

Quest 2 gives game developers a choice between four refresh rate modes: 72 Hz, 80 Hz, 90 Hz, and 120 Hz. Technically there’s also a 60 Hz mode, but the Store & App Lab only allow it for video content, not immersive apps or games.
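For native apps, the usual route to these modes is the XR_FB_display_refresh_rate OpenXR extension; Unity developers would typically go through the engine's Oculus integration instead. The sketch below shows roughly how an app could query the available rates and request one. The extension functions are from the published spec, but the surrounding setup (a valid XrInstance and XrSession with the extension enabled) is assumed and error handling is minimal.

```cpp
// Hedged sketch: picking a display refresh rate via the XR_FB_display_refresh_rate
// OpenXR extension. Assumes 'instance' and 'session' are already created and that
// the extension was enabled at instance creation.
#include <openxr/openxr.h>
#include <cstdint>
#include <vector>

bool RequestRefreshRate(XrInstance instance, XrSession session, float desiredHz) {
    PFN_xrEnumerateDisplayRefreshRatesFB enumerateRates = nullptr;
    PFN_xrRequestDisplayRefreshRateFB requestRate = nullptr;
    xrGetInstanceProcAddr(instance, "xrEnumerateDisplayRefreshRatesFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&enumerateRates));
    xrGetInstanceProcAddr(instance, "xrRequestDisplayRefreshRateFB",
                          reinterpret_cast<PFN_xrVoidFunction*>(&requestRate));
    if (!enumerateRates || !requestRate)
        return false;  // extension not enabled or not supported by the runtime

    // Ask the runtime which rates it offers (e.g. 72/80/90/120 on Quest 2).
    uint32_t count = 0;
    enumerateRates(session, 0, &count, nullptr);
    std::vector<float> rates(count);
    enumerateRates(session, count, &count, rates.data());

    for (float rate : rates)
        if (rate == desiredHz)
            return XR_SUCCEEDED(requestRate(session, desiredHz));
    return false;  // the requested rate isn't offered by this device/runtime
}
```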

Stress Level Zero’s Brandon Laatsch revealed on Twitter that Bonelab – the Boneworks follow-up revealed at the Meta Quest Gaming Showcase – is currently using 90 Hz for physics, render and display, but “may end up” doing 120 Hz physics, 60 Hz render and 120 Hz display with ASW – Application SpaceWarp. But what does that mean?

Application SpaceWarp is an advanced extrapolation technology on Quest that lets apps render at half frame rate by generating every other frame synthetically. It uses the depth buffer and motion vectors provided by the game engine to extrapolate a plausible next frame for every real frame. The depth buffer is a low resolution version of each frame representing the distance of each pixel from your eye instead of color, while the motion vectors represent the movement of pixels from one frame to the next.
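As a rough mental model of what “extrapolating with motion vectors” means, here is a deliberately simplified, CPU-side toy that shifts each pixel of the previous frame along its motion vector to guess the next one. It is only an illustration of the idea; Meta’s actual ASW runs on the GPU, also uses the depth buffer, and handles gaps and disocclusions far more carefully.

```cpp
// Toy illustration of frame extrapolation: move each pixel of the last real
// frame along its motion vector to synthesize the next frame.
#include <cstdint>
#include <vector>

struct MotionVector { float dx, dy; };  // pixel movement per frame

std::vector<uint32_t> ExtrapolateFrame(const std::vector<uint32_t>& prevFrame,
                                       const std::vector<MotionVector>& motion,
                                       int width, int height) {
    std::vector<uint32_t> nextFrame = prevFrame;  // start from the last real frame
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const MotionVector& mv = motion[y * width + x];
            int nx = x + static_cast<int>(mv.dx);  // where this pixel is headed
            int ny = y + static_cast<int>(mv.dy);
            if (nx >= 0 && nx < width && ny >= 0 && ny < height)
                nextFrame[ny * width + nx] = prevFrame[y * width + x];
        }
    }
    return nextFrame;  // the synthetic in-between frame shown on the display
}
```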

When revealing ASW back in November, Meta claimed that, once the overhead is taken into account, it can give apps roughly 70% more power to work with compared to rendering at full frame rate.

An example of using ASW for 36 FPS rendering and 72 Hz display

Like Boneworks before it, Bonelab will have a heavy focus on physics-based interactions, with almost all objects grabbable and even interacting with the player’s body. Game engines like Unity allow physics calculations to run separately from rendering, so if Stress Level Zero does decide to go with ASW the fidelity and responsiveness of the physics engine wouldn’t be affected, and in fact may even improve.
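To make the physics/render split concrete, the classic fixed-timestep loop below steps a simulation at 120 Hz regardless of how often frames are rendered. This is a generic sketch of the technique with placeholder function names, not Stress Level Zero’s code; in Unity the same separation falls out of FixedUpdate and Time.fixedDeltaTime.

```cpp
// Generic fixed-timestep loop: physics advances in exact 1/120 s steps no
// matter how often frames are rendered.
#include <chrono>

void StepPhysics(double dt) { /* advance rigid bodies, joints, etc. by dt */ }
void RenderFrame()          { /* draw the latest simulation state */ }

int main() {
    using clock = std::chrono::steady_clock;
    const double physicsDt = 1.0 / 120.0;  // 120 Hz simulation
    double accumulator = 0.0;
    auto previous = clock::now();

    for (;;) {  // render loop; with ASW this might only produce ~60 real frames/s
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Consume the elapsed time in fixed physics steps.
        while (accumulator >= physicsDt) {
            StepPhysics(physicsDt);
            accumulator -= physicsDt;
        }
        RenderFrame();
    }
}
```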

It’s unclear how 60 FPS -> 120 Hz ASW would actually feel and perform compared to 90 Hz in practice – that’s probably what Stress Level Zero is experimenting with ahead of Bonelab’s release later this year.

Meta Releases Hand Interaction & Tracked Keyboard SDKs For Quest

Meta just released an Interaction SDK and a Tracked Keyboard SDK for its Quest VR headsets.

Interaction SDK

There are already interaction frameworks available on the Unity Asset Store, but Meta is today releasing its own free alternative as an experimental feature. That experimental descriptor means developers can play around with it or use it in SideQuest apps, but can’t yet ship apps using it to the Store or App Lab.

Interaction SDK supports both hands and controllers. The goal is to let developers easily add high quality hand interactions to VR apps instead of needing to reinvent the wheel.

The SDK supports:

  • Direct grabbing or distance grabbing of virtual objects, including “constrained objects like levers”. Objects can be resized or passed from hand to hand.
  • Custom grab poses so hands can be made to conform to the shape of virtual objects, including tooling that “makes it easy for you to build poses which can often be a labor intensive effort”.
  • Gesture detection including custom gestures based on finger curl and flexion.
  • 2D UI elements for near-field floating interfaces and virtual touchscreens.
  • Pinch scrolling and selection for far-field interfaces similar to the Quest home interface.

Meta says Interaction SDK is already being used in Chess Club VR and ForeVR Darts, and claims the SDK “is more flexible than a traditional interaction framework—you can use just the pieces you need, and integrate them into your existing architecture”.
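To give a sense of what “gestures based on finger curl” might boil down to, the toy sketch below derives a curl value from a finger’s joint positions and thresholds it. This is not the Interaction SDK’s API; the joint data, names and threshold are invented for the example and only illustrate the underlying signal.

```cpp
// Toy curl-based gesture check: sum the bend angles along a finger and
// compare against a threshold.
#include <array>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static float Len(Vec3 a)         { return std::sqrt(Dot(a, a)); }

// Curl = summed bend angle (radians) across a finger's joints,
// with 'joints' ordered from knuckle to fingertip.
float FingerCurl(const std::array<Vec3, 4>& joints) {
    float curl = 0.0f;
    for (size_t i = 0; i + 2 < joints.size(); ++i) {
        Vec3 a = Sub(joints[i + 1], joints[i]);
        Vec3 b = Sub(joints[i + 2], joints[i + 1]);
        curl += std::acos(Dot(a, b) / (Len(a) * Len(b)));
    }
    return curl;
}

int main() {
    // A finger bent roughly 90 degrees at its middle joint (made-up positions, cm).
    std::array<Vec3, 4> bent = {{{0, 0, 0}, {0, 0, 3}, {0, -2, 3}, {0, -4, 3}}};
    float curl = FingerCurl(bent);
    bool curled = curl > 1.2f;  // illustrative threshold, not a tuned value
    std::printf("curl = %.2f rad, curled = %s\n", curl, curled ? "yes" : "no");
}
```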

Tracked Keyboard SDK

Quest is capable of tracking two keyboard models: the Logitech K830 and the Apple Magic Keyboard. If you pair either keyboard over Bluetooth, it will show up as a 3D model in the home environment for 2D apps like Oculus Browser.

Tracked Keyboard SDK allows developers to bring this functionality to their own Unity apps or custom engines. Virtual keyboards are slower to type with and result in more errors, so this could open up new productivity use cases by making text input in VR practical.

The SDK was made available early to vSpatial, and has been used by Meta’s own Horizon Workrooms for months now.

Unity’s Incredible Mixed Reality Demo is Coming to Quest in 2022

Earlier this year the Unity Labs team shared an incredible proof-of-concept mixed reality demo that shows the power of blending the real and virtual worlds together. Now the developers behind the project say you’ll be able to get your hands on the experiment on Quest sometime next year.

At Facebook Connect at the end of October, the Unity Labs team revealed Unity Slices: Table, a proof-of-concept social mixed reality app that seamlessly connects people—both local and remote—into a shared experience centered around a game board.

It’s tough to explain so let’s jump right to a video example:

Take a look at the video above. We can see two other users around the table, and then we see the view transitioning seamlessly between the passthrough view of the real world and the virtual world. But do you notice anything else interesting?

Of the two users we see, one is actually there, and the other is not. As the virtual view is wiped away in favor of the real-world view, the local player’s real body becomes visible, but the virtual player’s body remains as an avatar (because they aren’t actually there in the real room).

As Eric Provencher, one of the developers behind the project, explained in a breakdown of the demo, one goal of Unity Slices: Table is to dissolve the barrier between local and remote users by making either kind of player feel equally present in the experience.

This is why the core of the experience is built around a virtual chess board that serves as a central anchor for everyone in the scene, whether they are actually in the same room or on opposite sides of the world. Beyond bringing everyone together around a shared point in space, the chess board rests atop a real surface, which turns it into a sort of virtual touchscreen with real haptics. Everyone, local or remote, collectively ‘touches’ the same board within the same spatial frame of reference, making it, too, feel like a shared piece of reality.
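A minimal way to picture that shared spatial frame of reference is that each client converts positions into board-local coordinates before sending them, and back into its own world space on receipt. The sketch below shows that round trip with made-up poses; it is a generic illustration of anchor-relative networking, not Unity Labs’ actual implementation.

```cpp
// Generic sketch of anchor-relative coordinates: positions are expressed
// relative to the board before being sent, and converted back into each
// client's own world space on receipt. All values are invented for the example.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Where a given client has placed the board in its own world (position plus a
// yaw rotation around the vertical axis, to keep the sketch short).
struct BoardAnchor { Vec3 position; float yaw; };

// World space -> board-local space (translate, then rotate by -yaw).
Vec3 WorldToBoard(const BoardAnchor& board, Vec3 world) {
    float dx = world.x - board.position.x;
    float dz = world.z - board.position.z;
    float c = std::cos(-board.yaw), s = std::sin(-board.yaw);
    return { dx * c - dz * s, world.y - board.position.y, dx * s + dz * c };
}

// Board-local space -> world space on the receiving client.
Vec3 BoardToWorld(const BoardAnchor& board, Vec3 local) {
    float c = std::cos(board.yaw), s = std::sin(board.yaw);
    return { board.position.x + local.x * c - local.z * s,
             board.position.y + local.y,
             board.position.z + local.x * s + local.z * c };
}

int main() {
    BoardAnchor alice{{1.0f, 0.8f, 2.0f},  0.5f};  // board on Alice's table
    BoardAnchor bob  {{-3.0f, 0.7f, 0.0f}, -1.2f}; // board on Bob's desk

    Vec3 aliceHand{1.2f, 0.9f, 2.1f};              // Alice's fingertip, her world
    Vec3 local  = WorldToBoard(alice, aliceHand);  // what gets sent over the network
    Vec3 onBobs = BoardToWorld(bob, local);        // same spot on Bob's board
    std::printf("Bob sees the touch at (%.2f, %.2f, %.2f)\n",
                onBobs.x, onBobs.y, onBobs.z);
}
```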

“It took us awhile to get to a system that worked smoothly, but the moment we first hopped into a networked session with expressive avatars, and could both see and hear the other person tapping our table over the voice chat as if we were in the same room, was truly mind-blowing,” wrote Provencher. “It felt almost magical to bring this tangible part of our reality into a shared experience.”

This week Provencher affirmed that Unity Slices: Table will be released as a demo on Oculus Quest for anyone to try.

“My team is still hard at work polishing up Unity Slices: Table up for release, look for it on App lab in 2022!”

Beyond being an incredible mixed reality demo, hopefully it’ll also be a fully functional multiplayer chess app. We look forward to mixing reality ourselves next year.


Manus Launches its Free Motion Capture Software Polygon


Manus specialises in building enterprise-level data gloves with precision finger tracking and haptic feedback for a range of use cases including virtual reality (VR). The company is moving beyond pure hardware solutions today by releasing Manus Polygon, motion capture software that’s SteamVR compatible and free to download.


Designed as an entry point for developers looking for a simple motion capture solution without the expense, Polygon Free enables live streaming of body data into Unity or Unreal Engine. When it comes to tracker support, Polygon can be used with any SteamVR compatible device, from the Vive controllers for a basic setup to Manus’ own SteamVR Pro Trackers or the Vive Trackers. And, of course, the software is compatible with the company’s own Prime X series gloves.

For a basic motion tracking setup beyond merely using controllers, developers need enough trackers to cover six points: hands, feet, waist and head. With a VR headset on, that means five extra trackers are required. Polygon can support more though, with additional trackers on the upper arms to refine the digital avatar's movement.

“At Manus, we believe in a future where content creation in the virtual world becomes as integrated as video is currently. Convincing full-body motion capture will play a large part in the adoption and creation of the metaverse,” says Bart Loosman, CEO at Manus, in a statement. “With this release, we invite developers and content creators to dive into full-body motion capture and explore the opportunities this offers for VR, animation, digital avatars, virtual production, and the coming metaverse.”


Manus Polygon Free provides all the software functionality developers might need to get started, with Polygon Pro and Polygon Elite offering further professional features. Polygon Pro adds recording and editing tools within the Manus Core, as well as FBX exporting, timesync and genlock. Pro users also get the Manus Strap Set to attach SteamVR-compatible trackers. Taking that a step further, Polygon Elite includes the Pro bundle, a perpetual license, Manus SteamVR Pro Trackers and a charging station.

The Manus SteamVR Pro Trackers were announced earlier this year, with pre-orders taken for them individually. On the Manus website, they currently only seem to come in a 6-pack retailing for €1,999, available Q4 2021. By comparison, six Vive Trackers would set you back €834.

For continued updates from Manus, keep reading VRFocus.

Facebook Deprecates Proprietary Oculus APIs In Favor Of OpenXR

Facebook will deprecate its proprietary Oculus APIs in favor of industry standard OpenXR.

Facebook says new features “will be delivered via OpenXR extensions” starting with v31, echoing language released by Valve last year regarding new features on SteamVR being connected to OpenXR as well.

According to Facebook, in August of 2022 the existing Oculus Native Mobile and PC APIs will become “unsupported”, meaning that “existing applications will continue to function on Oculus devices” but new applications will be required “to use OpenXR, unless a waiver is provided.” In the interim, Facebook will “help developers build new applications with OpenXR via our Developer Site” and “perform QA testing of OpenXR to ensure features are working.”

Facebook “will be unable to provide access to Oculus Native Mobile and PC APIs but will allow existing applications to continue to use them” and “can provide recommendations for migration of existing applications to OpenXR via guides but are unable to assist with creation of new applications with Oculus Native and PC APIs.”

Broad industry support for OpenXR from Facebook and other major VR players like Valve, Microsoft and HTC — as well as game engines from companies like Unity and Epic Games — should make it easier for developers to make VR apps that run on a wide range of hardware. Microsoft’s Flight Simulator VR is one of the first OpenXR-compatible titles. At the end of 2020 Facebook started recommending game engines use OpenXR.
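Since new capabilities are meant to arrive as OpenXR extensions, applications discover them at runtime by asking the loader what the installed runtime exposes. The sketch below uses the standard xrEnumerateInstanceExtensionProperties call to list extensions and check for one well-known example (hand tracking); it assumes the OpenXR headers and loader are available but no particular headset.

```cpp
// Sketch: discovering which OpenXR extensions the installed runtime exposes.
#include <openxr/openxr.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(nullptr, 0, &count, nullptr);

    XrExtensionProperties blank{};
    blank.type = XR_TYPE_EXTENSION_PROPERTIES;
    std::vector<XrExtensionProperties> props(count, blank);
    xrEnumerateInstanceExtensionProperties(nullptr, count, &count, props.data());

    bool handTracking = false;
    for (const auto& p : props) {
        std::printf("%s (v%u)\n", p.extensionName, p.extensionVersion);
        if (std::strcmp(p.extensionName, XR_EXT_HAND_TRACKING_EXTENSION_NAME) == 0)
            handTracking = true;  // one example of a feature shipped as an extension
    }
    std::printf("hand tracking extension: %s\n", handTracking ? "present" : "absent");
}
```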

VR Industry Ramifications

“This is the right move at the right time,” wrote original Oculus Rift creator Palmer Luckey in a direct message. “One standard to rule them all didn’t make sense in the earlier days of VR given the fundamentally different approaches of different companies on the hardware and software side, to say nothing of the business component – there was a time when SteamVR/OpenVR (which was not actually open) had huge issues and many companies were philosophically opposed to things like reprojection, the pain developers went through supporting various APIs was critical in building industry consensus on what works best and why. HTC is probably going to benefit the most from widespread OpenXR adoption on the corporate side in the near future, but there are some upcoming entrants who also stand to gain a lot. Industry-wide standardizing to the lowest common denominator still has some downsides, but they are almost certainly outweighed by the benefits to developers and gamers.”

While the move should make it easier for developers of new apps to build for multiple hardware platforms, those building with earlier APIs or older versions of game engines may face some pressure to update to ensure their software and their players are supported should bugs arise, or to gain access to new features like the new Passthrough API.

“I will eventually switch to OpenXR but it will take months of work as Virtual Desktop was developed against Oculus’ VrApi over the last 4 years,” Virtual Desktop developer Guy Godin wrote in a tweet and direct message. “Still have months of work to port Virtual Desktop from VrApi to OpenXR. A passthrough environment will not be possible until then.”

“I’ll no longer be able to expect that — if a critical issue arises caused by Facebook’s new software — that it will be fixed. Which will affect every PCVR game built before 2020 in Unity, ie. most of them,” wrote H3VR developer Anton Hand in a direct message.

You can read more about the OpenXR transition over on the Oculus developer blog.