New Google Stadia Job Listings Suggest VR Cloud-gaming Ambitions

Stadia, Google’s cloud-gaming service, is looking for new hires, with a preference for candidates who have virtual reality development experience.

Stadia is Google’s cloud-gaming service which runs games on powerful computers in the cloud and then streams them to your PC, laptop, or even smartphone. The idea is to allow any device to feel like a high-end gaming PC.

Stadia doesn’t currently offer VR cloud-gaming, but it seems to be a natural fit for the use-case considering that high-end VR gaming is exclusively in the realm of beefy gaming PCs, which create a high barrier to entry. If you could keep the high-powered processing in the cloud and then stream the results down to a headset connected to a low-powered computer (or even a standalone headset), you could make PC VR much more accessible. Indeed, new Stadia job listings from Google suggest the company is exploring the possibility.

Four job listings posted or updated as recently as September 7th seek developers and engineers with virtual reality experience among the “preferred qualifications” of the roles.

Interestingly, the roles are spread across Stadia teams that cover the spectrum from developer features to end-user features like game discovery.


Google isn’t the only company with VR cloud-gaming ambitions. Nvidia has already created its own VR cloud-streaming infrastructure called CloudXR, though it’s designed as a foundation for others to build upon rather than being a user-facing service itself. Still, we may see the company use CloudXR to bring VR cloud-gaming capabilities to its existing consumer-facing cloud-gaming service, GeForce Now.

Similarly, Facebook kicked off its own cloud-gaming service last year. While it doesn’t offer VR gaming yet, the project is led by a former Oculus executive who clearly understands the potential of VR cloud-gaming.

VR cloud-streaming is definitely viable, but the key bottleneck is less about software or bandwidth and much more about latency. As we explored in our article about the ramifications of 5G on VR and AR, the real holdup for such services becoming widely available is the proliferation of edge computing infrastructure.

The post New Google Stadia Job Listings Suggest VR Cloud-gaming Ambitions appeared first on Road to VR.

Google’s VR Studio Reveals Bright & Bubbly Game Coming to Quest & PC VR in 2022

Google’s Owlchemy Labs, the VR studio behind Job Simulator (2016), just revealed a totally new game called Cosmonious High. It’s heading to PC VR and Oculus Quest next year.

First revealed today, Cosmonious High launches you into an alien high school where you unlock powers in order to fix up the place post-disaster, and meet a quirky cast of characters who attend classes with you along the way.

Characters are said to respond to natural gestures such as high fives and fist bumps, which feels a bit like a natural extension of what we saw in Vacation Simulator (2019), which brought more user-to-NPC interaction to the table by way of gestured interactions. Here it seems you’ll also be able to select emotes from a speech bubble.

Image courtesy Owlchemy Labs

And like Vacation Simulator before it, Cosmonious High is the “biggest space” Owlchemy Labs has ever built, making for “one big interactive playground for your powers,” the studio says.

“With Cosmonious High, we’re breaking all the bounds. Players can go anywhere, interact with any character they see, and use their powers to resolve—or cause—as much chaos as they want,” says Chelsea Howe, product director at Owlchemy Labs.

Owlchemy Labs CEO Devin Reimer says it’s all about “interaction, inclusivity, and accessibility. We continue to push the boundaries of VR and we cannot be more excited to launch our first new IP in five years.”

Cosmonious High is said to support SteamVR headsets and Oculus Quest, and will arrive at some point in Spring 2022. There’s still plenty more to learn, and we’ll have our eyes peeled as Owlchemy Labs continues the drip of info leading up to launch.

The post Google’s VR Studio Reveals Bright & Bubbly Game Coming to Quest & PC VR in 2022 appeared first on Road to VR.

Google Studio Owlchemy Labs Affirms Work on New VR Game, Details Expected This Year

Google-owned Owlchemy Labs, the studio behind VR classics like Job Simulator and Vacation Simulator, has confirmed that it’s working on a new VR project after running quiet for much of the year.

Owlchemy Labs has been around since the early days of the modern VR era, with the studio’s first major title, Job Simulator, bundled as a launch title for the HTC Vive in 2016.

After being acquired by Google in 2017, the studio went on to release Rick and Morty: Virtual Rick-ality (2017) and Vacation Simulator (2019), with most of the studio’s games being ported widely across VR platforms.

Since Vacation Simulator, and a few post-launch updates, Owlchemy has been pretty quiet about what might be next. Especially considering Google’s hasty retreat away from VR, it wasn’t necessarily clear that there even would be a ‘next’.

Luckily the lauded studio has affirmed that it’s alive and well and working on its next project, which it has confirmed to Road to VR is definitely a VR game.

“Owlchemy is working on a brand new game and we can’t wait to share details in the coming months!” the studio says. Owlchemy is currently hiring five new positions to support ongoing development.

It remains to be seen whether the studio will continue to build on the success of its Simulator franchise, or branch off into something new. It wouldn’t be surprising if they stick to what works; Owlchemy is one of the only studios to consistently have two titles among the 20 best rated Quest games, and has topped many charts over the years.

Hopefully it won’t be too long before we find out exactly what’s in development. The studio said fans can expect to hear something concrete about its next game some time this year.

The post Google Studio Owlchemy Labs Affirms Work on New VR Game, Details Expected This Year appeared first on Road to VR.

Key Google AR/VR Director Heads To Facebook Reality Labs

Google continues to bleed experts in AR and VR technology as Facebook Reality Labs staffs up for a bigger push into hardware.

The latest move to Facebook is Joshua To, a key Google director who led “a large team of product designers, artists, writers and researchers focused on supporting our wearables and hardware efforts,” according to a LinkedIn profile, which says he started at Facebook Reality Labs this month. A report by Input added that he worked on AR/VR projects like Lens and Daydream and confirmed he’ll be working on AR at Facebook.

To is the latest Googler to move on in recent years after the company shuttered its VR efforts right as it was on the cusp of delivering an all-in-one package similar to Facebook’s Oculus Quest. We’ll also note that Paul Debevec recently left Google for Netflix. Debevec, one of the leading researchers in light field technology, joined Google in 2016 after pioneering work at USC, at the same institute where Palmer Luckey spent time before kickstarting the Oculus Rift. Debevec is now “Director of Research, Creative Algorithms and Technology” at Netflix, where he’s doing “Research in computer graphics, computer vision, and machine learning to improve the creative filmmaking process.” A recent deal between Netflix and a prolific producer could lead to VR content from the company.

The pair join a long list of experts who have moved on from Google as the company shifts its ambitions in hardware and platforms. The company open sourced its art app Tilt Brush and shut down its 3D object hosting service Poly while adding new AR-centric features to its efforts in smartphones. Meanwhile, Facebook continues to hire at an incredible rate with more than 10,000 people working on VR and AR at the company.

Icosa Gallery Beta Launches As Open-Source Replacement For Google Poly

Icosa Gallery, a community-built, open-source replacement for Google Poly, has launched in beta, just seven days before the latter service shuts down for good.

It offers VR artists a way to store their creations online, including environments and models built using Tilt Brush and its new open-source counterpart, Open Brush.

In December last year, Google announced that Poly, its 3D object sharing service, would be shutting down on June 30, 2021. Just over a month later in January 2021, Google then announced that it would also be ending official development of popular VR creation tool Tilt Brush and making it open-source, so that the community could continue to tinker and play with the software in lieu of official support.

Since then, community solutions and replacements for both Poly and Tilt Brush have sprung to life. Open Brush offers an open-source, free version of Tilt Brush for PC VR and Quest users via App Lab. Meanwhile Sketchfab’s CEO encouraged creators to upload 3D models to the successful site in Poly’s absence. Likewise, Psychic VR Lab’s platform Styly added direct uploads for Tilt Brush creations in March, which can be viewed both in VR via a native app or online in browser.

Icosa Gallery is the latest option for Tilt Brush creators, with the ability to upload GLTF and GLB files from Tilt Brush sketches and have them display and animate online in the same way as they would in Tilt Brush. There are also plans for direct integration with Open Brush in the future, as well as support for .tilt sketch files. It’s also possible to import all of your current Poly creations into Icosa Gallery before the service shuts down in a week’s time.

You can view Icosa Gallery’s beta site here and download Open Brush for Quest via App Lab and PC VR via Steam.

Google’s Project Starline is a Light-field Display System for Immersive Video Calls

This week Google revealed Project Starline, a booth-sized experimental system for immersive video chatting, purportedly using a bevy of sensors, a light-field display, spatial audio, and novel compression to make the whole experience possible over the web.

This week during Google I/O, the company revealed an experimental immersive video chatting system it calls Project Starline. Functionally, it’s a large booth with a big screen which volumetrically displays the person on the other end of the line at life size.

Image courtesy Google

The idea is to make the tech seamless enough that it really just looks like you’re seeing someone else sitting a few feet away from you. Though you might imagine the project was inspired by the pandemic, the company says the project has been “years in the making.”

Google isn’t talking much about the tech that makes it all work (the phrase “custom built hardware” has been thrown around), but we can infer what a system like this would require:

  • An immersive display, speakers, and microphone
  • Depth & RGB sensors capable of capturing roughly 180° of the subject
  • Algorithms to fuse the data from multiple sensors into a real-time 3D model of the subject
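To make that last requirement concrete, here’s a minimal sketch of what fusing depth and RGB data into a colored 3D point cloud might look like, assuming calibrated pinhole cameras with known poses; the function names and parameters are illustrative and don’t reflect Google’s actual (undisclosed) implementation:

```python
import numpy as np

def depth_to_point_cloud(depth, rgb, fx, fy, cx, cy):
    """Back-project a depth map (meters) and RGB image into a colored
    point cloud using a pinhole camera model (intrinsics fx, fy, cx, cy)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx            # pinhole back-projection
    y = (v - cy) * z / fy
    valid = z > 0                    # drop pixels with no depth reading
    points = np.stack([x[valid], y[valid], z[valid]], axis=-1)
    colors = rgb[valid]
    return points, colors

def fuse_clouds(clouds, poses):
    """Transform each sensor's point cloud into a shared world frame
    using its 4x4 extrinsic pose, then concatenate them."""
    fused = []
    for (pts, _), pose in zip(clouds, poses):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        fused.append((homo @ pose.T)[:, :3])
    return np.vstack(fused)
```

A production system would additionally need per-frame calibration, temporal filtering, and meshing or splatting of the fused points before display, all running at video rates.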

Google also says that novel data compression and streaming algorithms are an essential part of the system. The company claims that the raw data is “gigabits per second,” and that the compression cuts that down by a factor of 100. According to a preview of Project Starline by Wired, the networking is built atop WebRTC, a popular open-source project for adding real-time communication components to web applications.
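Taking those figures at face value, the arithmetic is easy to sanity-check; the 2 Gbps raw rate below is an assumed value, since Google only says “gigabits per second”:

```python
# Back-of-the-envelope check on Google's stated numbers.
raw_gbps = 2.0              # assumed raw sensor data rate ("gigabits per second")
compression_factor = 100    # Google's claimed ~100x reduction

compressed_mbps = raw_gbps * 1000 / compression_factor
print(f"Compressed stream: {compressed_mbps:.0f} Mbps")  # prints "Compressed stream: 20 Mbps"
```

At roughly 20 Mbps under that assumption, the compressed stream lands in the same ballpark as 4K video streaming, which would make delivery over a good home or office connection plausible.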

As for the display, Google claims it has built a “breakthrough light-field display” for Project Starline. Indeed, from the footage provided, it’s a remarkably high resolution recreation; it isn’t perfect (you can see artifacts here and there), but it’s definitely impressive, especially for real-time.

Granted, it isn’t yet clear exactly how the display works, or whether it fits the genuine definition of a light-field display (which can support both vergence and accommodation), or if Google means something else, like a 3D display showing volumetric content based on eye-tracking input. Hopefully we’ll get more info eventually.

One hint about how the display works comes from the Wired preview of Project Starline, in which reporter Lauren Goode notes that, “[…] some of the surreality faded each time I shifted in my seat. Move to the side just a few inches and the illusion of volume disappears. Suddenly you’re looking at a 2D version of your video chat partner again […].” This suggests the display has a relatively small eye-box (meaning the view is only correct if your eyes are inside a specific area), which is likely a result of the particular display tech being employed. One guess is that the tech is similar to the Looking Glass displays, but Google has traded eye-box size in favor of resolution.

Image courtesy Google

From the info Google has put out so far, the company indicates Project Starline is early and far from productization. But the company plans to continue experimenting with the system and says it will pilot the tech in select large enterprises later this year.

The post Google’s Project Starline is a Light-field Display System for Immersive Video Calls appeared first on Road to VR.