Hugo Barra, Meta’s former Head of VR, recently offered some unique insight into the XR industry with an extensive blog post centered on Apple Vision Pro. Barra warns that, as with the company’s first standalone headset Oculus Go, the novelty of casual content consumption will probably fade fairly quickly.
Looking back at his time at Meta (then Facebook), Barra notes that Oculus Go was “the biggest product failure” he’d ever been attached to, stating that although casual content consumption was the headset’s raison d’être, the hype wore off pretty quickly.
Here’s Barra’s appraisal of the situation:
Watching TV/movies in virtual reality seemed like such an incredibly compelling idea that we (the Oculus team at Meta/Facebook) built an entire product around that idea — Oculus Go.
Launched in 2018, Oculus Go was the biggest product failure I’ve ever been associated with for the simple reason that it had extremely low retention despite strong partnerships with Netflix and YouTube.
Most users who bought Oculus Go completely abandoned the headset after a few weeks. The full story is much more nuanced (including the fact that the Oculus Go failure got us on the path to Oculus Quest very quickly), but it taught us an important lesson.
Barra notes that poor retention for Oculus Go came down to a few common factors, including user comfort, the friction of starting a session when not already wearing the headset, and the social isolation of watching content alone—all of which are true for Vision Pro as well.
Barra concludes that, at least as far as Oculus Go went, traditional media consumption was “not a core ‘daily driver’ pillar but more an ancillary use case that adds some value to other core pillars (such as productivity or gaming).”
Granted, Barra says Vision Pro brings more to the table than previous VR headsets thanks to its better displays, which can create “magical movie experiences on occasion,” but the same challenges that Oculus Go contended with basically remain.
Barra initially moved to Meta (then Facebook) in 2017 from his role as Global VP at the China-based tech giant Xiaomi, becoming head of Oculus and VP of Reality Labs partnerships. Leveraging his experience at Xiaomi, Meta even tapped the Chinese tech giant to manufacture Oculus Go for both the international market and the Chinese domestic market, also branding it under the name ‘Mi VR Standalone’, underscoring just how strongly the company expected Go to resonate.
Only a year after the release of Oculus Go though, the company shifted gears to launch its first room-scale-capable standalone, Oculus Quest, nearly abandoning Oculus Go entirely. Go had largely relied on Samsung Gear VR apps and omitted motion controllers, as the headset was only tracked in three degrees of freedom.
Then again, that’s where the comparisons stop, as Vision Pro has great hand-tracking, millions of apps, and compelling mixed reality passthrough—all of the things Barra says Apple is hoping to use to make Vision Pro “the future of work.”
Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.
Updated – May 2nd, 2023
Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.
With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.
Foveated Rendering
Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.
The region of your vision in which you can see in high detail is actually much smaller than most think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.
Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cutting down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution and field-of-view of XR headsets increase, the power needed to render complex scenes grows quickly.
Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times, quickly and with high precision, in order to pull off foveated rendering. While it’s difficult to pull this off without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.
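To illustrate the idea, a renderer might bucket each pixel into a shading tier based on its angular distance from the gaze point. This is only a sketch; the thresholds and rates below are illustrative, not taken from any shipping headset:

```python
import math

def foveation_tier(pixel_deg, gaze_deg, inner_deg=5.0, mid_deg=15.0):
    """Pick a shading-rate tier for a pixel from its angular distance
    (in degrees) to the user's gaze. Thresholds are illustrative."""
    offset = math.hypot(pixel_deg[0] - gaze_deg[0],
                        pixel_deg[1] - gaze_deg[1])
    if offset <= inner_deg:
        return 1.0    # full resolution where the fovea is looking
    elif offset <= mid_deg:
        return 0.5    # half resolution in the near periphery
    return 0.25       # quarter resolution in the far periphery
```

Real implementations work on screen tiles rather than individual pixels and feed these rates to the GPU’s variable-rate-shading hardware, but the principle is the same: spend detail only where the fovea can actually see it.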
Automatic User Detection & Adjustment
In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.
Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.
With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.
In more advanced headsets, this process can be invisible and automatic—IPD can be measured invisibly, and the headset can have a motorized IPD adjustment that automatically moves the lenses into the correct position without the user needing to be aware of any of it, like on the Varjo Aero, for example.
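In code, that flow is simple once the tracker reports pupil positions. The sketch below assumes a hypothetical tracker that reports pupil centers in millimeters in headset coordinates, plus an illustrative mechanical adjustment range:

```python
def measure_ipd_mm(left_pupil, right_pupil):
    """IPD as the horizontal distance between pupil centers (mm),
    as reported by a hypothetical eye-tracker in headset coordinates."""
    return abs(right_pupil[0] - left_pupil[0])

def ipd_lens_target(ipd_mm, min_mm=58.0, max_mm=72.0):
    """Target position for a motorized lens adjustment; returns None
    when the user's IPD falls outside the (illustrative) supported range."""
    if ipd_mm < min_mm or ipd_mm > max_mm:
        return None  # headset should warn the user instead
    return ipd_mm
```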
Varifocal Displays
The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:
In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.
Vergence
Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.
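The geometry behind vergence is simple: for a point straight ahead at distance d, each eye rotates inward by atan((IPD/2)/d), so the total vergence angle falls off rapidly with distance. A quick sketch, assuming a typical 63 mm IPD:

```python
import math

def vergence_angle_deg(distance_m, ipd_m=0.063):
    """Total vergence angle (degrees) for both eyes fixating a point
    straight ahead at the given distance; 63 mm is a typical IPD."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

near = vergence_angle_deg(0.25)  # ~14.4 degrees at 25 cm: strongly converged
far = vergence_angle_deg(10.0)   # ~0.36 degrees at 10 m: nearly parallel
```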
The Conflict
With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.
In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).
That comes into conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.
But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.
Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There are a number of approaches to varifocal displays, perhaps the simplest of which is an optical system where the display is physically moved back and forth from the lens in order to change focal depth on the fly.
Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
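In practice the two gaze rays rarely intersect exactly in 3D, so implementations typically find their point of closest approach instead. A minimal sketch of that calculation (assuming unit direction vectors and no handling for near-parallel gazes at distant objects):

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def focal_depth(left_origin, left_dir, right_origin, right_dir):
    """Estimate focal distance via the closest approach of the two gaze
    rays. Directions must be unit vectors; this sketch has no handling
    for near-parallel gazes (very distant objects)."""
    w = [p - q for p, q in zip(left_origin, right_origin)]
    b = _dot(left_dir, right_dir)   # cosine of the angle between rays
    denom = 1.0 - b * b             # approaches 0 for parallel rays
    t1 = (b * _dot(right_dir, w) - _dot(left_dir, w)) / denom
    t2 = (_dot(right_dir, w) - b * _dot(left_dir, w)) / denom
    p1 = [o + t1 * d for o, d in zip(left_origin, left_dir)]
    p2 = [o + t2 * d for o, d in zip(right_origin, right_dir)]
    mid = [(a + c) / 2 for a, c in zip(p1, p2)]
    eyes_mid = [(a + c) / 2 for a, c in zip(left_origin, right_origin)]
    return math.dist(mid, eyes_mid)
```

The resulting distance is what would drive the display actuator to set the matching focal depth.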
A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.
And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.
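One common way to approximate that blur is to scale it with the dioptric difference (in 1/meters) between an object and the estimated focal plane, mirroring how defocus grows in a real eye. A sketch with a purely illustrative tuning constant:

```python
def defocus_blur(object_dist_m, focus_dist_m, strength=0.01):
    """Blur amount proportional to the dioptric (1/m) difference between
    an object and the focal plane; 'strength' is an illustrative constant."""
    return strength * abs(1.0 / object_dist_m - 1.0 / focus_dist_m)
```

Objects on the focal plane stay sharp, while something a hand’s length away blurs heavily when the user is focused on a distant mountain, which is exactly the cue static-focus headsets are missing.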
As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.
Foveated Displays
While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.
Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.
Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field-of-view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.
Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.
Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.
Meta has nearly scrubbed all of its products of the Oculus name, however the company today announced its third-party publishing wing is getting a sort of rebrand that will see the Oculus name live on.
Meta announced at the Game Developers Conference (GDC) that it’s naming its third-party publishing arm Oculus Publishing. The company tells us Oculus Studios, its first-party studio, will continue to exist.
To date, Meta’s growing fleet of acquired first-party studios includes Beat Games (Beat Saber), Sanzaru Games (Asgard’s Wrath), Ready at Dawn (Lone Echo & Echo VR), Downpour Interactive (Onward), BigBox VR (Population: One), and Within (Supernatural).
Third-party titles under Oculus Publishing include Among Us VR (Innersloth, Schell Games), Bonelab (Stress Level Zero), The Walking Dead: Saints & Sinners (Skydance Interactive), and Blade & Sorcery: Nomad (Warpfrog).
Notably, there’s little left that sports the Oculus brand since the company made its big metaverse pivot in October 2021. Besides older hardware, the only things most people see with the ‘Oculus’ moniker are the Oculus PC app and Meta’s Oculus web portal, where the company still lists game libraries for Quest, Rift, Go, and Gear VR.
“This year marks a full decade since the inception of the original Oculus Content Team,” the company says in a developer blog post. “From Kickstarter to Quest, Meta has committed hundreds of millions of dollars in third-party content funding and specialized development support to help make the VR games landscape what it is today. Now, we’re excited to unveil an official name for one of the world’s largest VR games programs for developers: Oculus Publishing.”
The company says Oculus Publishing will continue to directly partner with development teams on conceptualization, funding, production, technology advancement, game engineering, promotion and merchandising.
The company says it’s contributed funding to “more than 300 titles,” and that there are another 150 titles in active development today.
Meta CEO Mark Zuckerberg showed off four new headset prototypes a few days ago that, he said, make virtual reality feel almost real.
The goal is to invent displays “that are as vivid and realistic as the physical world and much more advanced than traditional computer screens we use today,” Zuckerberg said in a YouTube video.
One of the prototype headsets had industry-leading resolution, a second had variable focal depth, the third had high dynamic range, and the fourth combined several of these features into a single — but non-functional — lightweight prototype.
The global market for augmented reality and virtual reality headsets grew 92% last year with shipments reaching 11 million units, according to the latest IDC numbers. Meta’s Quest 2 headset was the most popular, according to IDC, with a 78% market share.
One of the reasons is that Meta has been investing significantly in virtual reality technology, starting with its $2 billion acquisition of Oculus in 2014. The company spent more than $10 billion on virtual reality last year and has pledged similar spending over the next ten years. It’s well on its way to meeting that target this year, having spent $3.7 billion in the first quarter, according to the company’s latest earnings report.
But today’s virtual reality headsets leave a lot to be desired. They’re uncomfortable to use, and the graphics quality means that the virtual reality doesn’t feel particularly real.
Meta is working to change that. According to Zuckerberg, the company is investing in improving resolution, increasing focal depth, reducing optical distortion, and improving dynamic range, in addition to making the headsets smaller and easier to wear.
Butterscotch prototype tackles resolution
Resolution refers to the number of pixels on the display. According to Zuckerberg, virtual reality headsets should have at least 60 pixels per degree of vision.
The headset prototype that features this level of resolution is called Butterscotch, he said.
“This prototype… lets you comfortably read the smallest letters on an eye chart,” he said.
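A back-of-the-envelope calculation shows why 60 pixels per degree is so demanding: multiplying field-of-view by the target pixel density gives the per-eye panel resolution required (a linear approximation that ignores lens distortion):

```python
def panel_resolution(h_fov_deg, v_fov_deg, ppd=60):
    """Per-eye pixel dimensions needed to sustain a target
    pixels-per-degree uniformly across the field of view."""
    return int(h_fov_deg * ppd), int(v_fov_deg * ppd)

# A hypothetical 100x100 degree headset at 60 PPD needs roughly a
# 6000x6000 panel per eye, far beyond today's consumer displays.
```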
Half Dome prototype tackles focal depth and distortion
Regular televisions and monitors are stationary. People only need to focus their eyes on one set distance.
But for virtual reality, people need to be able to focus on something close to them — and also to focus on something far away — in order for the environment to feel realistic. That’s hard to do with a static computer screen, but Meta thinks it has a solution.
“With varifocal and eye tracking tech, our Half Dome prototypes let you focus on any object at any distance,” Zuckerberg said.
Another company goal is to build a unit that can fix optical distortions extremely quickly.
Optical distortions are kind of like a fun-house mirror, where the center of the display makes the image look too fat or too skinny. One way to fix the problem is to pancake two lenses — a fattening lens and a skinny lens — on top of each other so that they cancel out.
Zuckerberg says the problem can also be fixed with software, as long as corrections occur so quickly “that it’s imperceptible to the human eye.”
The Half Dome prototype also includes the distortion corrections, he said.
Starburst prototype tackles dynamic range
“Nature is often 10 or 100 times brighter than modern HDTVs,” said Zuckerberg. On a headset, “we need those colors to be just as vivid to feel realistic.”
For this they built Starburst, which, he said, is the world’s first high dynamic range virtual reality system.
All three of those prototypes are still very bulky and impractical, Zuckerberg admitted.
Holocake 2 prototype features sleek, lightweight design
The fourth prototype, called Holocake 2, is a lightweight headset that combines all the best features of the other three. However, currently it is just a mockup and doesn’t actually work.
“There’s still a long way to go,” Zuckerberg said.
After some discussions internally and a few requests from fans, Purple Yonder decided hand tracking support would be the game’s first big piece of post-launch content.
“We really wanted to jump in at the deep end and see what we could do and if we could make it playable with hands and that’s what we’ve managed to do,” he told me. “It was a lot of work to get to that stage, a lot of challenges, but we’re there and it’s working really well.”
To place objects and build roads, you point at an area of the map and use the familiar pinch gesture found in many other hand tracking apps. Movement with your hands is a bit unique in Little Cities, however — you close your hands into fists and drag to move laterally, while moving your fists closer together or further apart lets you zoom. Moving your fists in a steering-wheel motion rotates the map.
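Under the hood, pinch detection in hand-tracking apps typically reduces to a distance check between tracked fingertip joints. A minimal sketch (the 2 cm threshold is illustrative, not taken from Little Cities):

```python
import math

def is_pinching(thumb_tip, index_tip, threshold_m=0.02):
    """True when thumb and index fingertips (3D positions in meters)
    are closer than the threshold; 2 cm is an illustrative value."""
    return math.dist(thumb_tip, index_tip) < threshold_m
```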
Apart from that, a lot of the remaining buttons and actions transferred from controllers to hands without much modification. The wristwatch mechanic, for example, works almost exactly the same as it does with controllers. “That just works really well with hand tracking because you just naturally look at your hand and that all still works the same way.”
“When you’re selecting things, if you haven’t played Little Cities, the way it works is you have like a build bubble and you pop that with your finger. And then you get a section of other bubbles which shows different options you can build. And that just works really well,” he explains. “We didn’t really have to change much to get that working with hand tracking and it just feels really good, this kind of tactile feeling. Cause it’s not only you kind of popping these bubbles to select things, but it feels like your real hands when you’re doing it.”
The Big Hands in Little Cities update is out now on the Quest platform. Both Quest 2 and the original Quest will support hand tracking, with players on the newer headset being able to take advantage of Hand Tracking 2.0.
SideQuest’s new in-headset app for Quest 2 and Quest streamlines the installation of custom home environments and popular community-made VR ports of classics like the original Doom, Quake, and Half-Life games.
The new app even makes it easier to find experimental App Lab projects that are also listed on SideQuest. You still need a PC to install SideQuest onto a Quest headset and sign up as a developer to get that access in the first place, but the SideQuest app now walks Quest owners through that process directly.
SideQuest has been available as a PC and Mac app almost as long as the first Quest headset, giving users a way to connect their Quest to a computer and sideload content that isn’t officially approved for the Quest Store. SideQuest is taking this a step further today by launching a new app that installs the platform directly onto Quest 2 and gives users an easier way to browse and install content entirely in-headset.
Previously, it was possible to install the Android mobile version of SideQuest onto a Quest headset for similar results. However, the interface wasn’t designed for VR and things didn’t always work. With this new version specifically designed for VR, SideQuest can be used in-headset with much less friction.
A computer is still required for first-time installation via USB and to install the core files for classic PC games, like the doom.wad file for the original Doom game from 1993. Once the SideQuest app is installed on Quest it can be launched from the Unknown Sources tab and used to browse and download content like QuestZDoom directly to the headset’s internal storage without using the SideQuest PC app.
There’s also a section in the app for custom home environments. Users can browse from a selection of community-made home environments, download them and swap them out for the default Meta options. SideQuest is also launching new guides and presets for creating custom homes, which should streamline the process of creating and exporting custom environments.
SideQuest can even run with multitasking in Quest 2 if you move it to the side. In the below screenshot I’ve got it running alongside the official Oculus Store after using it to install the Star Trek: The Next Generation bridge as my custom home.
It’s been a pretty decent year for VR so far, but there’s still a huge number of games releasing for Quest, PC VR and PSVR in the second half of 2022.
We’ve compiled a list of every confirmed title below — while some have confirmed release dates or months, there’s quite a few games without a specific date yet. Some just have a season or vague release window, but many others are just scheduled for 2022 without any other specifics.
At the very end, there’s a few games we know are in development, but without any indication of release window. Even if unlikely, a lot of these titles could hypothetically be a surprise release before the end of the year — fingers crossed.
2022 VR Games
Kayak VR: Mirage (June 28) – PC VR
A visually arresting take on kayaking in VR, this physics-driven experience lets you take part in single-player exploration and races across several stunning environments.
Wands Alliances (June 30) – Quest 2
Cortopia Studios follows up on its multiplayer spell-battling game with a new title that features 3v3 matches. Pick your spells and jump into arenas for magical combat with a tactical twist.
Vail VR (Beta, July 1) – PC VR
Competitive VR shooter Vail will be going into beta in July after an extensive alpha testing period.
Moss: Book II (July 21) – Quest 2
While already available on PSVR, this follow-up platformer starring adventurous mouse Quill will come to Quest 2 towards the end of July.
The Twilight Zone VR (July 14) – Quest 2
The Twilight Zone VR will launch with three different tales (or ‘episodes’), each essentially a mini story, spanning different genres and handled by different writers, much like a serialized TV show. A PSVR version will release at a later date — no word on potential PC VR or PSVR 2 releases just yet though.
Nerf: Ultimate Championship (August 25) – Quest 2
Nerf: Ultimate Championship brings foam bullet action into VR as a team-based multiplayer first-person shooter. You’ll be able to choose between different blasters and play across control point and arena modes, with some parkour mechanics thrown in for good measure.
The Chewllers (Summer, Early Access) – Quest
This four-player co-op game will see you stand atop a tower, covering all angles as the horde of Chewllers approaches. Upgrade your weapons and repair your tower between waves to hold out as long as possible. The game will launch in early access for Quest this summer, with PC VR and PSVR releases planned later down the line.
Requisition VR (Early Access in September) – PC VR
NFL Pro Era (Fall) – Quest 2, PSVR
When it launches this fall, NFL Pro Era will be the first officially-licensed NFL VR game, available for Quest 2 and PSVR. It will include all 32 professional NFL teams and will let you embody the quarterback during gameplay.
Espire 2 (November) – Quest 2
This sequel will offer more sandbox stealth with some new features and mechanics, alongside a brand new second campaign designed for co-op multiplayer. It will release in November for Quest 2, but no confirmation for other platforms yet.
Among Us VR (Holiday) – Quest 2, PC VR
Among Us VR brings the viral multiplayer game into VR, where one player embodies the impostor and must murder the other members without arousing suspicion or being discovered. It’s coming to Quest 2 and PC VR during the 2022 holiday period, but there’s no specific date just yet. A PSVR 2 release has also been confirmed for when the headset launches — whenever that may be.
2022 VR Releases – Date TBC
Bonelab – Quest 2, PC VR
This highly anticipated follow-up to 2019’s Boneworks is the next title from Stress Level Zero, launching this year for Quest 2 and PC VR. Bonelab is an action-adventure physics game with a brand new story and “two years of innovation and interaction engine progress” from Boneworks.
Red Matter 2 – Quest 2
Red Matter 2 will pick up right after the first game ended, taking you back to the mysterious planet plagued by horrific anomalies. You’re now on a rescue mission, searching for an old friend, with more environmental storytelling and puzzle solving. While it’s coming to Quest 2 this year, there’s no word on PSVR or PC VR releases just yet.
The Walking Dead: Saints & Sinners – Chapter 2: Retribution – Quest 2, PC VR, PSVR
This Walking Dead follow-up game is set to release on all major headset platforms late this year, giving players a chance to step back into the world with a new map and weapons — including a gore-inducing chainsaw. A PSVR 2 release is also confirmed, but not until next year.
Gambit – Quest 2, PC VR
This co-op VR shooter will see you complete heist-style missions, shooting and looting with your friends through a 20+ hour campaign. It’s coming to Quest 2 and PC VR this year, but no confirmation of other platforms yet.
Killer Frequency – Quest 2
This will be the first VR title developed by Team 17, the acclaimed studio known for the Worms franchise. However, don’t expect a Worms-like game here — instead, this horror-comedy is set in the midwestern US in the 1980s, and casts players as a local radio host who must help the citizens of a small town avoid a mysterious masked killer.
Peaky Blinders: The King’s Ransom – Quest 2, PC VR
Based around the titular characters of Netflix fame, Peaky Blinders: The King’s Ransom is being developed by Doctor Who: Edge of Time studio Maze Theory and set for release later this year on Quest 2 and PC VR. It looks like a PSVR 2 release could be in the works too, but we’ll have to wait a bit longer for full confirmation it seems.
What the Bat – Quest 2, PC VR
What the Bat is a VR follow-up to the flatscreen title What the Golf from Denmark-based studio Triband. You’ll have a bat in either hand, but you won’t be playing baseball — instead, you’ll do just about anything else. The game is coming to Quest 2 and PC VR later this year.
Ziggy’s Cosmic Adventure – Quest 2, PC VR
Ziggy’s Cosmic Adventure is an immersive pilot sim, where you’ll need to balance between ship combat and management while rocketing through space, coming late this year to Quest 2 and PC VR.
Propagation: Paradise Hotel – Quest 2
A follow-up to Propagation VR, this single-player horror sequel will see you fight through new encounters with all-new mechanics. The game will release on “all major VR platforms,” but Quest 2 is specifically confirmed for later this year.
Broken Edge – Quest 2, PC VR
This stylish multiplayer game will see two players go head-to-head in swordfighting combat. Developed by Trebuchet and published by Fast Travel Games, it’s coming to Quest 2 and PC VR later this year.
Hubris – PC VR
This stunning VR shooter is coming to PC VR later this year, with Quest and PSVR versions in the works as well.
Dyschronia: Chronos Alternate – Quest 2
The latest game from Tokyo-based MyDearest will see you play as Hal Scion, who will use his ability to access people’s memories to investigate the murder of a futuristic city’s founder. It’s coming to Quest 2 this year, with no confirmation of other headsets yet. It will be an episodic release split in three parts, but the studio aims to have all episodes release by the end of the year.
Paranormal Hunter – PC VR
You’ll team up with up to four players in this ghost-hunting multiplayer title, set to release in early access for PC VR sometime this year.
Tea for God – PC VR
After a long time available as a work in progress on Itch.io, Tea for God will properly launch for PC VR on Steam later this year. There's no news on whether the Quest version will see a similar full release anytime soon, but keep an eye out.
Trial by Teng – PC VR
Solve puzzles and work off your ‘Karmic debt’ as you try to work your way out of hell in this satirical VR title, coming to PC VR headsets sometime this year.
Ultimechs – PC VR
Ultimechs is a pretty simple concept: it’s soccer, but instead of kicking the ball, you’re firing rockets at it from a giant mech. While the game is coming to “major VR platforms”, it’s only confirmed for release on PC VR later this year.
Ruinsmagus VR – PC VR, Quest 2
Play as a novice wizard who becomes a spell-wielding Magus through 26 narrative-driven quests with full Japanese voice acting. Originally set for a spring release, Ruinsmagus is coming to Quest and PC VR sometime this year.
These games are ones we know about, but have absolutely no release date — not even a rough year window.
It’s hard to say whether most (if any) of these will launch this year, but it’s not out of the realm of possibility, hence why we’ve included them.
Assassin’s Creed VR – Quest
Rumored to be titled Assassin’s Creed Nexus, we’ve not heard much about Ubisoft’s upcoming Quest title that will bring the famed franchise to VR for the first time. It could surprise launch later in the year, but we wouldn’t count on it.
At the earliest, that means a launch sometime this year, but at the latest, it means a launch by April 2023. However, it's still possible the game gets delayed past that — we'll just have to wait and see.
Horizon: Call of the Mountain – PSVR 2
While not a confirmed PSVR 2 launch title, Call of the Mountain’s release date obviously hinges on when PSVR 2 itself will release. And yes, it’s looking increasingly unlikely that PSVR 2 will launch in 2022 — 2023 seems much more likely now.
But hypothetically, Call of the Mountain could be a PSVR 2 launch title if the headset released this year. Don’t hold your breath though.
Grand Theft Auto: San Andreas – Quest 2
Since it was announced last October, we’ve heard nothing about GTA: San Andreas on Quest. There’s a slim chance it launches later this year. Fingers crossed?
HeliSquad: Covert Operations – PC VR, Quest 2, Pico Neo 3 Link
Only recently revealed, there’s no release window for this helicopter game coming from Warplanes studio Home Next Games.
Onward 2
While Mark Zuckerberg seemingly confirmed Onward 2 is in development, we’ve heard nothing since and there’s been no official announcement yet either. There’s a chance it could be announced and launched later this year, perhaps at Connect, but it’s hard to gauge how far development is.
Splinter Cell VR – Quest
All we know about this game is that it’s part of the Splinter Cell series and it’s coming to Quest — nothing else. It’s hard to see this releasing in 2022, given Assassin’s Creed seems likely to come first, but with so little information, it’s hard to know either way.
Resident Evil 8 VR & Other PSVR 2 Titles
As we covered above, it’s unclear when the PSVR 2 headset is launching. While a 2022 window is increasingly unlikely, Sony has yet to comment properly on the exact release.
The Meta Quest Summer Sale has begun, offering discounted bundles of games and sale prices for popular individual titles as well.
As usual, there are a couple of bundled packs that give you a discount across multiple games (and the discount will usually adjust to exclude any games you already own).
The Sports Starter Pack offers a 33% discount on three games — Golf+, The Thrill of the Fight and Eleven Table Tennis, bringing the total price down to $36 from $54. Then there's the Battle It Out pack, offering 28% off Superhot, Gorn and The Walking Dead: Saints & Sinners, bringing them down to $56 from $78.
The Multiplayer Favorites pack gives you 9% off A Township Tale, Demeo and Walkabout Mini Golf, down to $35 from $38.50. As is tradition now, there’s also a Vader Immortal pack that gives you all three episodes for $21 — down 29% from $30.
There are also discounts on individual games, ranging from 20% off up to 40%. Here are some of the highlights:
– Unplugged for $14.99, down 40% from $24.99
– Myst for $17.99, down 40% from $29.99
– Ragnarock for $14.99, down 40% from $24.99
– Virtuoso for $14.99, down 25% from $19.99
– Ultrawings 2 for $17.99, down 28% from $24.99
– Jurassic World Aftermath Part One for $17.99, down 28% from $24.99
– Jurassic World Aftermath Part Two for $10.99, down 26% from $14.99
– Stride for $10.99, down 26% from $14.99
– A Township Tale for $6.99, down 30% from $9.99
– After the Fall for $27.99, down 30% from $39.99
– Demeo for $20.99, down 30% from $29.99
– Walkabout Mini Golf for $10.49, down 30% from $14.99
– Eleven Table Tennis for $13.99, down 30% from $19.99
You can view the full list of discounts here, with the sale running for a week, until June 26. There’s also a new daily deal every day, available for just 24 hours, which you’ll have to check back for each day.
It's not the only VR sale coming up this week either: the Steam Summer Sale begins in just two days. We hope to see some decent deals there as well — stay tuned.
When it comes to hand tracking games on Quest, nothing really comes close to Unplugged.
Developed by Anotherway and published by Vertigo Games in late 2021, Unplugged is an air guitar game, inspired by Guitar Hero and many others, that lets you shred in VR with a virtual guitar and your real hands.
As I’ve said elsewhere, Unplugged leverages Quest’s hand tracking technology to breathe life into the imaginary act of air guitar. In doing so, it takes hand tracking to a whole new conceptual and technological level, surpassing everything else available on Quest.
“From the very beginning, our obsession was to understand how the technology is limited and try to polish that stuff,” says studio director and Unplugged creator Ricardo Acosta. “That was the very first thing. Not the graphics, not even the gameplay.”
After speaking with Acosta in our virtual studio (full video interview embedded above), it’s clear that creating a polished and tangible experience was always the goal. “I think that hand tracking is here for good,” he tells me. “I wanted to create something that worked for real. It wasn’t just another demo.”
Such strong commitment to this new form of input is a big call, especially for Acosta, who spent years as a hand tracking skeptic while working on the HoloLens team at Microsoft. “When I was at Microsoft, I was like an advocate for controllers,” he says with a laugh. “At Microsoft, they are all about hand tracking, but I was like, ‘No guys, we need controllers. Controllers are great.’ And now I’m saying the exact opposite thing.”
“On the first version of the HoloLens … you have hand tracking, but just like the blob. It’s just the hand, not the fingers.” Without full and reliable finger tracking, Acosta came away disappointed and skeptical. “With the HoloLens 2, it was a bit better, but the lag between your movement and the hand was very big, for a lot of technical reasons.”
Even so, Unplugged was first conceptualized in 2015 — well before hand tracking arrived on modern VR headsets. "I remember being in a concert in Prague and I was just like doing air guitar," he recalls. "And at some point I was like, oh, this is an interaction that could work in VR."
“As soon as I went back home, I prototyped something … and it totally worked. It was like, oh, this is good. This is something that we could actually turn into a game.” The original idea developed into something akin to Rock Band but for VR, using controllers and the first Vive headsets and Oculus SDKs. Acosta said he quit his job at Microsoft to work on the prototype, titled Rock the Stage, over the course of four months.
“I think that it was pretty good,” he says of the Rock the Stage prototype, of which videos still exist online. “The best thing it was that it made you feel like you were there.” But Acosta soon ran into a road bump — music games, and particularly the associated licensing, are complicated work. “You need a lot of money. You need a team of people handling all that music licensing. And I didn’t have all that back in the day. So I decided, at some point, to go back to my job.”
After continuing in Microsoft’s VR/AR division for another few years, Acosta revisited the concept in 2020 while bored at home during the pandemic. “Oculus [had] just released the hand tracking system [for Quest] and suddenly it came to me like, ‘Oh my god, I could actually rescue that…prototype and try [see] if it works using hand tracking.'”
Even in the early stages, hand tracking felt like a turning point for the previously controller-only experience. "It worked so well … Back in the day with the controllers was nice, but with hand tracking was exactly what it should be." Acosta adapted his original prototype into something new, ditching controllers for something much more freeing and immersive. "When I put [hand tracking] on the prototype, it wasn't perfect, but it was good enough for me to start polishing the experience. I knew that with a bit of work and a few algorithms on top of the hand tracking, I could make it work."
Acosta created a video showcasing the new prototype game and posted it to social media. It soon exploded and attracted a lot of interest, especially from publishers offering funding. After discussing options with a few different publishers, Acosta signed with Vertigo Games. "They offered the best deal. And also they were bigger, and they really had a super nice vision about what the game should be."
“At first I was a bit scared about it, because it was a super big project. We didn’t have a company together. It was complicated.” What started as a one-man show had to turn into a burgeoning team. Acosta’s wife joined as a project manager and they were then joined by a few others to make up the small studio now known as Anotherway.
“We are six people now, which is not a lot,” he says. “Very recently, we had the opportunity to grow a little bit, but we decided to stay small. I’ve been working in Microsoft for most of my career. That is a very big company and it’s amazing, but I really like working with just a very small amount of people. It’s a very creative environment.”
Working alongside Vertigo, Unplugged quickly developed into a project with bigger ambitions than Acosta had ever imagined. “I’m very conservative in terms of adding features, because I know that anything you add to a project, it will create a lot of problems, a lot of bugs, a lot of things.”
“They pushed for more staff. They wanted more music, they wanted more venues, they wanted more quality on the game and they’ve been always pushing for that. And I think that, in general, the game would have been way smaller without Vertigo,” he says.
In particular, working with Vertigo opened up opportunities when it came to the proposed tracklist. “In the very beginning we were just going for small bands. And then when we signed up with Vertigo they were like ‘No, like indie bands are cool and we will have a few. But we need famous bands.’ And we were like, oh, but that’s going to be super complicated.”
Vertigo sent Anotherway a Spotify playlist and asked them to add any songs they might want in the game. “And we were like ‘Wait, whatever music?'” It was a big mental shift.
The Offspring’s The Kids Aren’t Alright was the first major song that Vertigo and Anotherway secured the rights to. “We were just like jumping, like, ‘Oh my god, we made it.'” The final selection included some massive artists — The Clash, T. Rex, Weezer and Steel Panther, to name a few. “[Music licensing] is a very time-consuming process, and I knew that. So not even in my wildest dreams I would have dreamed about having Weezer or Tenacious D, The Offspring, or Ozzy…”
The inclusion of Tenacious D’s Roadie is particularly special to Acosta — not only is the band one of his favorites, but he had used the song all the way back in 2015 in the very first prototype. However, the song almost didn’t make it into the final game at all.
Vertigo and Anotherway initially struggled to make contact with Tenacious D to secure the rights. However, Vertigo had a trick up its sleeve — Guitar Hero legend Mark Henderson had been brought on board to assist with the game. “He was like, ‘Guys, leave it up to me. I’ll make it happen.’ So somehow he contacted the manager of Tenacious D and started talking to them.”
With Henderson’s help the rights to the song were secured. But another problem emerged — with a PEGI 12 rating, Roadie’s explicit and frequent F-bombs weren’t going to cut it. “So at another point we were like, ‘Okay, we have the song now, but we cannot use it because we are PEGI 12, so we have to take it out from the list.'”
Acosta made his peace with leaving the song off the tracklist but, in his words, “maybe the stars were in a particular position that night.” Henderson was able to get Tenacious D back into the studio to re-record a clean version of Roadie, specifically for Unplugged, excluding all the swearing.
“It was insane,” says Acosta. “Knowing that my favorite band re-recorded a song just for the game. It’s insane. It’s just amazing. And a lot of people have complained about the fact that it’s a different version of the song, without the swearing. But I’m so proud of that. To me, it’s even better because it’s our song.”
With a solid tracklist secured, Acosta and the team at Anotherway set to work on creating an unforgettable and reliable hand tracking experience. “I am a UX designer, so for me, the main important thing on anything is user experience. If the experience is not good, the whole game won’t work, or the whole experience will be shit, and we didn’t want that.”
As a result, the gameplay itself was adapted and designed to work with, not against, hand tracking. Even tiny changes made a big difference — the size of the guitar in Unplugged, for example, is a bit smaller than a regular, real-life guitar, which helps keep your hands in view of the cameras.
"In the beginning, with hand tracking 1.0, we had to be very aware of your movements," he explains. "We had to create the mapping [of] the music charts in a way that is always aware of the limitations of the technology."
That meant that at launch, the mapping in Unplugged didn’t always completely follow the music, leading some players to complain that the music and the notes didn’t always line up. “And we knew why, but we couldn’t do anything about it, because the hand tracking was very limited and you couldn’t move your hand that quickly,” he said.
Nonetheless, Acosta remains proud of the experience offered at launch. “In the first version, it was absolutely playable. Obviously it wasn’t perfect, but it was playable. And I think that we proved that you can actually create a hand tracking game that uses hand tracking very intensively.”
Skip forward a few months after launch and the release of Meta’s Hand Tracking 2.0 software offered huge gains for Unplugged. Not only was the technology more reliable than ever, but it was so good that Anotherway went back and re-mapped the entire tracklist for increased accuracy and challenge. “We want the game to be fully accessible for everyone, obviously. But I think that for 98% of people, the game works very well.”
Nonetheless, Anotherway are still implementing algorithms and workarounds to account for error and improve the experience — the latest being an AI system. “We’re using deep learning in order to see where your hands should be or what’s your pose or what’s your intentions. We made all that stuff so [that] when there is a problem with the hand tracking, there is another layer trying to help and trying to make the experience as smooth as possible.”
There’s more to come too. In the short term, Anotherway just released a new DLC pack — featuring songs by metal band Pantera — and are working on an upcoming accessibility update adding new features and “another thing” that is top secret but will be “really big.”
In terms of song selection, there’s definitely more on the way. “We are working to add more music all the time. We want to add free music [as well], not just DLC. Also, I want to add more indie music because I think that there is a lot of really good indie music out there.”
But what about the long term? What does the next year or more look like for Unplugged? “I cannot talk too much about it because Vertigo will kill me,” Acosta says with a laugh. “But our plans are very big. Unplugged is going to become bigger, at least in terms of features…”
“I would be very excited about Unplugged if I knew what’s going to happen. Probably like in a year, Unplugged will be very different. It will have way more stuff. That’s it. That’s all I can say.”
For a game that has already pioneered a new technology on a cutting edge piece of hardware, there could be a lot of interesting developments in Anotherway’s future.
“Unplugged is going to move forward,” Acosta said. “That is for sure. We are not staying still.”
The advanced combat update adds over 1,000 new animations made possible by extensive motion capture and improvements to enemy AI prediction. While the game has been available on PC VR and PSVR for a while already, the update is available now for all platforms, coinciding with the Quest release.
On Reddit, the developers also addressed some promised features that are missing from the update, or features that are now no longer available. SinnStudio said that features like finishers, executions and grabbing enemies’ weapons were “affected by the new combat mechanics in significant ways and sadly, we were unable to complete them in time for the update.”
It will be interesting to see whether the game picks up interest from the Quest user base, especially given its similarities to other physics-driven combat titles like Blade & Sorcery. We enjoyed parts of Blade & Sorcery, but also found it lacking in some features and encountered performance hitches on Quest 2. It overall felt "still two or three updates away from really escaping its tech demo roots, and … more like a preview of what the finished product will look like."
Could Swordsman VR scratch an itch that Blade & Sorcery: Nomad didn't quite satisfy? Let us know what you think in the comments.