Wolvic Chromium 1.1 Adds WebXR AR To The Open-Source XR Browser

Wolvic Chromium 1.1 adds WebXR AR support to the open-source standalone XR browser, after 1.0 added eye-tracked navigation.

What Is Wolvic?

Wolvic is the open-source successor to Firefox Reality, which was a web browser for early standalone headsets like Oculus Go, the original Oculus Quest, Vive Focus, and HoloLens 2. In 2020 Mozilla laid off 250 employees including staff working on Firefox Reality, and in 2022 handed over the project to "open source software consultancy" Igalia, which relaunched it as Wolvic.

Igalia has continued to develop Wolvic in the two years since, and today Wolvic is available on Meta Quest, Pico, Magic Leap 2, and Huawei VR Glasses. Wolvic still supports Firefox Sync, letting you maintain your bookmarks, tabs, and passwords across devices.

Earlier this year, Igalia launched a Chromium-based version of Wolvic with better performance. Currently the Chromium version is available as an APK to download, while the Gecko version is still the one available on app stores.

Originally, WebXR only supported fully immersive VR, a limitation that still exists on Apple Vision Pro, for example. The WebXR Augmented Reality module lets web apps use passthrough or transparency as the background.
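
For illustration, here's a minimal sketch of what requesting an AR session looks like at the WebXR API level (assuming a browser with WebXR TypeScript typings available; the function name is ours):

```typescript
// Minimal sketch: request an AR session via the WebXR Augmented Reality module.
async function startArSession(): Promise<XRSession | null> {
  const xr = navigator.xr;
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) return null;

  const session = await xr.requestSession("immersive-ar");

  // environmentBlendMode tells you how the real world appears behind your content:
  // "alpha-blend" for camera passthrough, "additive" for transparent optics,
  // "opaque" for plain VR.
  console.log("Blend mode:", session.environmentBlendMode);
  return session;
}
```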

The Horizon OS Browser and Pico Browser both already support the WebXR AR module. Now, the Chromium version of the open-source Wolvic browser does too.

WebXR AR in Wolvic.

Wolvic Chromium 1.1 also adds support for downloadable VR environments, which are shown in flatscreen browsing mode. The update comes just over a month after the 1.0 release, which we missed covering amid the many announcements of September.

Update 1.0's major feature was eye-tracked navigation: on headsets with eye tracking, like Meta Quest Pro, you can select with your eyes and pinch with your fingers, as you would in Safari on Apple Vision Pro. This could make Wolvic the web browser of choice for Quest Pro owners.

The update also brought a number of minor improvements, such as password autofill in private browsing mode and dozens of bug and stability fixes.

The Wolvic builds on app stores are still Gecko-based, so to get the Chromium version of Wolvic you need to download the APK and sideload it.

With visionOS 2, Safari On Apple Vision Pro Supports WebXR By Default

WebXR is enabled by default in Safari with visionOS 2.

Apple's demo of WebXR via Safari on Vision Pro.

WebXR is the open standard API that enables webpages to display immersive content on headsets and support interactivity via controllers or hand tracking. WebXR experiences use WebGL for rendering, which is based on OpenGL ES. You can access WebXR apps near-instantly via the web browser like any other web app. No installation or updates are required, and the developer doesn’t need approval from a central app store authority.
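
To give a sense of how lightweight this is, here's a minimal sketch of entering a VR session and driving a WebGL render loop (standard WebXR/WebGL APIs; assumes WebXR typings and a user gesture to trigger the call):

```typescript
// Minimal sketch: enter an immersive VR session and render with WebGL.
async function enterVr(canvas: HTMLCanvasElement): Promise<void> {
  const gl = canvas.getContext("webgl2", { xrCompatible: true })!;
  const session = await navigator.xr!.requestSession("immersive-vr");

  // Render into a layer backed by the headset's framebuffer.
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: number, frame: XRFrame) => {
    session.requestAnimationFrame(onFrame);
    const pose = frame.getViewerPose(refSpace);
    if (!pose) return;

    const layer = session.renderState.baseLayer!;
    gl.bindFramebuffer(gl.FRAMEBUFFER, layer.framebuffer);
    for (const view of pose.views) {
      const vp = layer.getViewport(view)!;
      gl.viewport(vp.x, vp.y, vp.width, vp.height);
      // Draw the scene for this eye using view.projectionMatrix and view.transform.
    }
  };
  session.requestAnimationFrame(onFrame);
}
```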

Currently, on visionOS 1, WebXR isn't enabled by default; turning it on requires toggling a feature flag in Safari's advanced settings. visionOS 2, which was announced earlier this week at WWDC24, solves this.

This wasn't revealed in the main WWDC keynote, but an Apple engineer detailed it in a developer session available online.

visionOS 2 also allows the Mac Virtual Display to continue showing while in WebXR, making live development significantly more convenient for developers.

Back in March, WebXR got support for Apple Vision Pro's gaze-and-pinch input system, thanks to Apple working with the W3C standards body to add a new transient-pointer input mode to the standard.

However, what visionOS still doesn't support is the WebXR Augmented Reality module. That means WebXR on Apple Vision Pro is still limited to VR only, an issue Niantic ran into when porting their 8th Wall Web AR engine to Apple's platform. This is, of course, somewhat ironic, given the primary focus of Vision Pro is typically considered to be AR, while the primary focus of Meta Quest, which does support AR in WebXR, has traditionally been VR.

Niantic's 8th Wall Web AR Engine Now Supports Apple Vision Pro – But Only With A VR Background

Niantic's 8th Wall WebXR AR engine now supports Apple Vision Pro.

8th Wall is a web-based AR engine, which has a fully in-browser editor and offers cloud hosting for built projects. Niantic is the company behind smartphone AR games like Pokémon GO, Pikmin Bloom, and Monster Hunter Now.

Last month we reported on three 8th Wall demos that support crossplay between smartphones and Quest 3, but noted that none of the demos worked on Apple Vision Pro.

Niantic has now added support for Vision Pro in 8th Wall, but there's a major catch.

Apple Vision Pro does support WebXR, via an advanced Safari flag, but it does not support the WebXR Augmented Reality module. That means web apps can show fully immersive VR content on Vision Pro, but can't display content on top of passthrough for AR.

To work around this, 8th Wall makes it easy for developers to add a virtual environment as the background for their otherwise-AR apps. These apps will run in AR mode on mobile and Meta Quest, and in VR on Vision Pro.
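
At the WebXR level, that fallback pattern looks roughly like the following sketch. This is a generic illustration, not 8th Wall's actual API:

```typescript
// Generic sketch of the AR-to-VR fallback pattern (not 8th Wall's own API):
// use AR where the module is supported, otherwise run in VR and draw a backdrop.
async function startSessionWithFallback(): Promise<{ session: XRSession; needsBackdrop: boolean }> {
  const xr = navigator.xr!;
  if (await xr.isSessionSupported("immersive-ar")) {
    // Passthrough is composited by the system; render nothing behind the scene.
    return { session: await xr.requestSession("immersive-ar"), needsBackdrop: false };
  }
  // No AR module (e.g. Safari on Vision Pro): fall back to VR and render a
  // virtual environment behind the otherwise-AR content.
  return { session: await xr.requestSession("immersive-vr"), needsBackdrop: true };
}
```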

This is of course somewhat ironic, given the primary focus of Vision Pro is AR and the primary focus of Quest has traditionally been VR.

Vision Pro owners can try out Niantic's Basketball Arcade game for an example of an 8th Wall experience that supports the headset.

WebXR In Safari Now Supports Apple Vision Pro's Gaze-Pinch Input System

Safari on Apple Vision Pro now lets WebXR leverage the headset's default input system.

WebXR isn't enabled on Vision Pro by default, but you can turn it on in the advanced settings of Safari. Until now, however, WebXR developers have only been able to use hand tracking, implementing direct touch or wrist-driven interactions, as Vision Pro doesn't include tracked controllers and WebXR doesn't support eye tracking.

Apple has now worked with the W3C standards body to add a new transient-pointer input mode to WebXR, which lets web developers leverage the headset's default interaction system, where you point with your eyes and pinch to click.

WebXR's new transient-pointer in Safari on Apple Vision Pro

Like in the Shared Space for native apps on visionOS, transient-pointer is designed with privacy at the forefront. Developers only receive input when the user is pinching, and it's just a ray relating to where the user is looking and where their wrist is. Developers do not get continuous eye tracking data, and they still have to request skeletal hand tracking data as a permission if they want it.
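
In practice (a rough sketch, assuming WebXR typings; the function name is ours), that means skeletal hand tracking is requested as an optional feature and may simply not be granted:

```typescript
// Sketch: request skeletal hand tracking as an *optional* feature and degrade
// gracefully if the user or platform declines it.
async function requestSessionWithOptionalHands(): Promise<XRSession> {
  const session = await navigator.xr!.requestSession("immersive-vr", {
    optionalFeatures: ["hand-tracking"],
  });

  session.addEventListener("inputsourceschange", () => {
    for (const source of session.inputSources) {
      // source.hand is only populated if skeletal hand tracking was granted;
      // otherwise the app still gets select events while the user pinches.
      console.log(source.targetRayMode, source.hand ? "skeletal hand data" : "no hand data");
    }
  });
  return session;
}
```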

Apple says it's working with popular WebXR frameworks to incorporate transient-pointer to make it easy for web developers to support.

The transient-pointer mode is included in the latest draft of the WebXR specification, meaning it should become standard soon. Future headsets with hand and eye tracking will likely add support for it in their web browsers, and Meta could even potentially support it on the existing Quest Pro.

Apple is Adding Support for Vision Pro’s Input System to WebXR

Apple is adding support for Vision Pro’s unique input system to WebXR, the web standard which allows XR experiences to run right from a web browser.

One of the most unique things about Apple Vision Pro is its input system, which eschews motion controllers in favor of a ‘look and pinch’ system combining eye-tracking with a pinch gesture. On the whole it’s a really useful way to navigate the headset, but because it works so differently than motion controllers, it doesn’t play too well with WebXR.

But Apple is working to fix that. This week the company announced that the latest version of visionOS (1.1) includes a new input mode for Safari’s WebXR capabilities called ‘transient-pointer’. This new mode provides inputs from the headset in a standardized way which developers can use to understand what users are selecting inside a WebXR session running on Vision Pro.

Up to this point, WebXR apps typically expect a headset to report a continuously updated position of each controller. But Apple says it built Vision Pro’s input system to reveal as little information about the user as possible, so it doesn’t report the pose or position of the user’s hands by default. Instead, it only reveals such information at the moment of the user’s pinch (though it’s possible for a WebXR app to ask for full hand tracking info).


With the new transient-pointer option, when a user pinches, the WebXR app will be able to see a ray representing the direction of the user’s gaze and the coordinate position of their pinch. Like in visionOS itself, the app thus looks at the pinch to decide ‘when’ a user is making an input, and looks at the ray to decide ‘where’ they’re making the input.

For the duration of the pinch, the position of the pinch itself is continuously updated, allowing for interactions like dragging, pushing, and pulling objects. But when the pinch is released, the app no longer has access to the direction the user is looking or where their hand is located.
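
In code terms (a sketch against the draft spec; type definitions may not yet list the new enum value, and the function name is ours), a WebXR app consumes this through the standard select events and reads the target ray only while the pinch is held:

```typescript
// Sketch: handle transient-pointer input, which only exists for the duration of a pinch.
function wireTransientPointer(
  session: XRSession,
  refSpace: XRReferenceSpace,
): (frame: XRFrame) => void {
  let activeSource: XRInputSource | null = null;

  session.addEventListener("selectstart", (e: XRInputSourceEvent) => {
    // "transient-pointer" is the new targetRayMode from the draft spec.
    if ((e.inputSource.targetRayMode as string) === "transient-pointer") {
      activeSource = e.inputSource;
    }
  });
  session.addEventListener("selectend", () => {
    activeSource = null; // once the pinch ends, no gaze or hand pose is exposed
  });

  // Call the returned function from your XRFrame callback each frame.
  return (frame: XRFrame) => {
    if (!activeSource) return;
    const rayPose = frame.getPose(activeSource.targetRaySpace, refSpace);
    if (rayPose) {
      // rayPose.transform gives the pointing ray: hit-test your scene here. The pose
      // keeps updating while the pinch is held, enabling drag, push, and pull gestures.
    }
  };
}
```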

With these new capabilities, WebXR apps will be able to adapt their interactions to work correctly with Vision Pro.

However, WebXR on Vision Pro is still experimental. Developers must manually enable WebXR capabilities in Safari’s advanced settings on the headset. Developers can also experiment with WebXR and the transient-pointer mode using the visionOS simulator.

The transient-pointer mode for Vision Pro is being baked into the WebXR standard, and has been added to the most recent draft version of the specification. That means that devices which adopt the same input mode will be able to tap into the same WebXR capabilities.

Check Out These WebXR Mixed Reality Demos With Crossplay Between Quest 3 And Phones

Niantic and Meta partnered to fund three multiplayer mixed reality demo games that run in the browser and support crossplay between Quest 3 and mobile.

The demos were built over a ten week period using Niantic's 8th Wall web-based AR engine, which has a fully in-browser editor and offers cloud hosting for built projects. Niantic is the company behind smartphone AR games like Pokémon GO, Pikmin Bloom, and Harry Potter: Wizards Unite.

The condition of the funding for the three teams was that their projects use 8th Wall's multiplayer module to enable cross-play between headsets and traditional mobile devices like smartphones and tablets.

The three demos are Cardboard Crashers, Magical Forest, and BeeQuest.

Cardboard Crashers and Magical Forest are publicly playable, while we couldn't find a public URL for BeeQuest. I found the performance of the playable games very poor on Quest 3, with visibly low frame rate, which was surprising given they're both relatively graphically simple and ran well on my phone.

Another note: while the multiplayer fully works, it doesn't include colocation, because WebXR anchors currently don't support sharing/networking. That means you'll have to manually align both the position and size of the board between devices to believably play in the same space.
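
For context, here's a minimal sketch of what placing an anchor looks like, assuming the WebXR Anchors module and typings (the function name is ours); the key point is that the anchor exists only on the local device:

```typescript
// Sketch: a WebXR anchor pins content to a real-world point, but only on this device.
// Requires a session requested with { requiredFeatures: ["anchors"] }.
async function placeBoardAnchor(
  frame: XRFrame,
  pose: XRRigidTransform,
  space: XRSpace,
): Promise<XRAnchor | undefined> {
  const anchor = await frame.createAnchor?.(pose, space);
  // anchor?.anchorSpace tracks the point locally; there is no standard way to share
  // it with another player, so colocated boards must be aligned manually.
  return anchor;
}
```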

We also tried to load both games on Apple Vision Pro in Safari, but they seemed to think the Vision Pro was a phone and tried to access the camera, only to obtain a virtual Persona webcam feed. Given the games are designed to be used with tracked controllers, Vision Pro would probably require specific porting anyway.

Cardboard Crashers

Cardboard Crashers is a turn-based game where each player sets the direction and force of their car with the objective of knocking the other player's cars off the board.

The game uses 8th Wall's physics module for the collisions between cars.

You can access Cardboard Crashers at this URL.

Magical Forest

Magical Forest is a realtime exploration game where players each control a character with the goal of uncovering all the insects in the scene.

You have a book with magical spells you can cast to help out.

You can access Magical Forest at this URL.

BeeQuest

BeeQuest is apparently a game where "players are plunged into the charming world of a honey bee, tasked with collecting resources, expanding their colony, and fending off wasp invasion".

Compared to the other games, BeeQuest seems to have divergent core gameplay between headsets and mobile:

"In the real world, players can use their phones to follow the map and uncover AR flowers, where they assign players to harvest pollen and nectar. With these resources in tow, they can either continue playing on mobile or switch to a headset to manage their resources and use the pollen and nectar to create more honey and bees. On Meta Quest 3, players can also enter a battle mode to protect their hive from pesky wasps wanting to steal their honey."

To pull off the real-world map integration on mobile, BeeQuest uses Niantic's Lightship Maps module, the same technology that powers its games like Pokémon GO.

Unlike the other two demos, there doesn't appear to be a public URL for BeeQuest.

The Sea We Breathe Uses VR To Teach Kids About Climate Change

The Sea We Breathe wants to teach children about climate change using VR.

Narrated by Helena Bonham Carter, 'The Sea We Breathe - Virtual Reality Experience' is a WebXR app designed by Unseen Studio and the Blue Marine Foundation, a marine conservation charity. It aims to educate children about how the oceans can fight climate change, teaching kids how marine ecosystems capture and store 'blue carbon' in the seabed to offset emissions. Optimized for Quest, the experience lets you explore marine habitats such as kelp forests and seagrass meadows.

"By creating this VR experience, we hope to educate future decision makers in ocean-climate issues like never before," says Joanna Coumbe, Director of Outreach at the Blue Marine Foundation in a prepared statement.

Recently debuted at Earlsfield Primary School in South West London, The Sea We Breathe is being rolled out to over 800 UK schools. You can access it now through web browsers.

The Power Glove Is So Bad A VR Website Improves It

A WebXR experiment from Brielle Garcia simulates the Power Glove as input for a game on the original Nintendo Entertainment System.

It's so bad.

Power Glove

The hardware accessory from Mattel for the original NES was released in 1989 and immortalized in Fred Savage's The Wizard that year. Based on VR's first generation of tech from the 1980s, the Power Glove is one of gaming history's most famous examples of a cumbersome and short-lived gaming accessory.

“I just love the idea of recreating fun and weird bits of gaming history in VR. As a kid, I was so hyped to use the Power Glove after seeing it in 'The Wizard'," Garcia told me over direct message. "Though it would be several years before I got to try one myself and be completely disappointed. Recreating that feeling with cutting edge VR tech is just really funny."

In the years between The Lawnmower Man and The Matrix, the Power Glove lived on in experiments as a low cost tool to test out gesture-based ideas related to VR. Now, with Garcia's experiment, the Power Glove is simulated in a fully immersive website.

WebXR sites are actually the first destinations cross-compatible with both Apple Vision Pro and Meta Quest 3, with hand tracking supported across both systems. Indeed, Garcia's Power Glove can be donned in the web browsers on both headsets, though so far it only works as intended on Quest 3.

"This is a fun little showcase for what’s possible with WebXR/A-Frame these days," Garcia told me. "Hand tracking is fully supported and the devices are fast enough to even run emulators in the browser."

Don Hopper visited the website on Meta Quest 3 and was able to use basic gestures as a kind of steering wheel for an emulated game on the NES.
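
As a hypothetical illustration (not Garcia's actual code), a steering-style gesture can be derived from standard WebXR hand-joint poses, roughly like this:

```typescript
// Hypothetical sketch: estimate a steering angle from both hands' index fingertips
// using the WebXR Hand Input module.
function steeringAngle(
  frame: XRFrame,
  session: XRSession,
  refSpace: XRReferenceSpace,
): number | null {
  const tips: DOMPointReadOnly[] = [];
  for (const source of session.inputSources) {
    const joint = source.hand?.get("index-finger-tip");
    const pose = joint && frame.getJointPose?.(joint, refSpace);
    if (pose) tips.push(pose.transform.position);
  }
  if (tips.length < 2) return null;
  // Angle (radians) of the line between the two fingertips, like turning a wheel.
  const [a, b] = tips;
  return Math.atan2(b.y - a.y, b.x - a.x);
}
```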

The project is still under development and we'll update this post with the URL when it's complete. You can follow Garcia's progress as @tacolamp on X.

Mozilla is Shutting Down Development on WebXR Social App ‘Hubs’

Mozilla, the company behind the Firefox browser, shuttered most of its web-focused XR development back in 2020. At the time, the company’s web-based social VR app Hubs was spared from the chopping block. Now, after a new organization-wide restructuring, all development on Hubs is set to be wound down in May.

Launched in 2018, Hubs is an XR chatroom that runs directly in a browser, giving both VR headset users and standard monitor and smartphone users a place to connect. It’s an impressive ‘no-install-required’ WebXR social app that never gained the sort of traction that better-backed apps garnered over the years, such as Rec Room, VRChat, or Meta’s Horizon Worlds.

The team responsible for Hubs recently announced its shutdown in a blog post, stating that its last day under Mozilla will be on May 31st, 2024. This includes the shutdown of Hubs’ demo server, managed subscriptions, and community resources.

The team aims to provide a multi-month transition period leading up to shutdown of those services, starting with disabling new subscriptions on March 1st, and concluding all work on Hubs by May 31st. A tool to download user data will be released on April 1st, the company says.

While this means Mozilla won’t be continuing active development or maintenance of Hubs codebases and community resources post-shutdown, since Hubs’ code is open source, anyone can continue independent development. The company emphasizes that its so-called ‘Community Edition’ of Hubs can run on any platform that supports Kubernetes, which includes a majority of cloud services, such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform.

Mozilla Hubs Is Shutting Down, Will Be Handed Off To The Community

WebXR social networking framework Hubs has lost its biggest sponsor, Mozilla, after six years of effort.

The Mozilla Hubs service, pitched as "private, virtual 3D worlds in your browser", is being wound down alongside the team maintaining it, even if the underlying technology lives on with community support as an open-source project on GitHub.

WebXR experiences with hand tracking are some of the first truly cross-platform worlds you can visit from both an Apple Vision Pro and a Meta Quest. WebXR as a technology is still in its infancy, but Mozilla Hubs represented a bright spot for the concept, with web-based rooms you could meet in across different types of devices with very little setup involved.

Mozilla will shut down its demo server residing at hubs.mozilla.com, subscriptions, and community resources on May 31, 2024.

Three members of the Hubs team will oversee the shutdown and the transition to a community-maintained project. In a blog post from Mozilla explaining the shutdown plan, it was noted that the demo server saw the creation of "115,732 custom avatars, 215,923 scenes, and hosted meetups for close to 10 million attendees."

"After the conclusion of the shutdown, Mozilla will not continue active development or maintenance of Hubs codebases and other community resources," the organization noted.

Mozilla encourages people with additional questions to post on the community Discord, where there's been a somber outpouring of support for the effort.

Tributes posted on Discord about Mozilla Hubs

Last year, Microsoft-owned Altspace issued a shutdown plan with a tool to download content before the servers went offline, and Mozilla is following suit with a tool for downloading data, though the tool is still in development. Mozilla noted:

"Our goal is to enable users who have assets hosted on Mozilla-run servers to download their data in a format compatible with Community Edition instances or other webXR platforms. All data on Hubs is associated with user email addresses. We plan to release a tool for you to download all uploaded media associated with your email, including 3D models, audio files, image files, and video files uploaded through Spoke, as well as gLTFs of published spoke scenes and avatars. This tool will also make it possible to retrieve all Hubs URLs, including scene URLs, room URLs, avatar URLs, and spoke project URLs."

Mozilla added that the avatars, scenes and "other assets" published by its Hubs team should be open sourced as well.

"Many former Hubs team members have returned to the Discord server to remind the community that Hubs was built with life outside of Mozilla in mind. The project’s commitment to open source and focus on self-hosted versions of Hubs mean that no one entity can determine Hubs' future; only this community can do that."

Installing the "community edition" of Hubs requires Kubernetes experience.