Lushfoil Photography Sim, a serene photography game built on Unreal Engine 5, is expected to get optional PC VR support following its initial release.
In development by solo developer Matt Newell and set to be published by Annapurna Interactive, Lushfoil Photography Sim is a walking simulator and photography game that immerses players in beautiful landscapes rendered in a photorealistic style.
Players are equipped with a camera offering a range of realistic settings, currently including “shutter speed, ISO, aperture, white balance, and different lens types,” and the game includes a “learning tool for newcomers that covers the basics of exposure and other settings.”
Given the game’s emphasis on photorealistic visuals, and its Unreal Engine 5 foundation, the developer doesn’t expect a port to Quest or PSVR 2 to be practical for the game.
Lushfoil Photography Sim’s initial release doesn’t have a firm date yet (officially “coming soon”), and it’s not yet clear how long it will take for the addition of VR support after that.
VisionOS 2 is bringing a range of new development features, but some of the most significant are restricted to enterprise applications.
VisionOS 2 will bring some of the most requested development features to the headset, but Apple says it’s reserving some of them for enterprise applications only.
Developers who want to use these features will need ‘Enterprise’ status, which means having at least 100 employees and being accepted into the Apple Developer Enterprise Program ($300 per year).
Apple says the restriction on the new dev capabilities is to protect privacy and ensure a predictable experience for everyday users.
Up to this point, developers building apps for Vision Pro and VisionOS couldn’t actually ‘see’ the user’s environment through the headset’s cameras, limiting their ability to create apps that directly detect and interact with the world around the user.
With approval from Apple, developers building Vision Pro enterprise apps can now access the headset’s camera feed. This can be used to detect things in the scene, or to stream the view for use elsewhere. This is popular for ‘see what I see’ use-cases, where a remote person can see the video feed of someone at a work site in order to give them help or instruction.
Developers could also use the headset’s camera feed with a computer vision algorithm to detect things in view. This might be used to automatically identify a part, or verify that something was repaired correctly.
Even with Apple’s blessing to use the feature, enterprise apps will need to explicitly ask the user for camera access each time it is used.
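For developers curious what this might look like in practice, here’s a rough sketch of enterprise camera access via ARKit’s new camera frame provider on VisionOS 2. This is based on Apple’s announced enterprise APIs, but treat the exact names and signatures as assumptions subject to change in the final SDK:

```swift
import ARKit

// Hedged sketch of VisionOS 2's enterprise-only main camera access.
// Assumes the app carries the enterprise camera-access entitlement
// and the user grants the per-use camera permission prompt.
let arkitSession = ARKitSession()
let cameraProvider = CameraFrameProvider()

func streamMainCamera() async throws {
    // Running the provider is what triggers the permission prompt.
    try await arkitSession.run([cameraProvider])

    // Pick a supported video format for the left main camera.
    guard let format = CameraVideoFormat
            .supportedVideoFormats(for: .main, cameraPositions: [.left])
            .first,
          let frameUpdates = cameraProvider.cameraFrameUpdates(for: format)
    else { return }

    for await frame in frameUpdates {
        if let sample = frame.sample(for: .left) {
            // sample.pixelBuffer is a CVPixelBuffer that could feed a
            // computer-vision pipeline or a remote 'see what I see' stream.
            process(sample.pixelBuffer)
        }
    }
}

func process(_ buffer: CVPixelBuffer) {
    // Placeholder for app-specific handling (detection, streaming, etc.)
}
```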
Barcode and QR Code Detection
Being able to use the headset’s camera feed naturally opens the door for reading QR codes and barcodes, which allow structured data to be transmitted to the headset visually.
Apple is providing a readymade system for developers to detect, track, and read barcodes using Vision Pro.
The company says this could be useful for workers to retrieve an item in a warehouse and immediately know they’ve found the right thing by looking at a barcode on the box. Or to scan a barcode to easily pull up instructions for assembling something.
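Apple’s readymade system is exposed through ARKit as a barcode detection data provider. Here’s a minimal sketch of the warehouse use-case, again assuming the API shape from Apple’s enterprise API announcement (names and details may differ in the shipping SDK):

```swift
import ARKit

// Hedged sketch: enterprise-only barcode/QR detection on VisionOS 2.
// Assumes the spatial-barcode-scanning enterprise entitlement.
let session = ARKitSession()
let barcodeDetection = BarcodeDetectionProvider(
    symbologies: [.code128, .qr, .upce])

func watchForBarcodes() async throws {
    try await session.run([barcodeDetection])

    // Each detected code arrives as an anchor update carrying its
    // position in space along with the decoded payload.
    for await update in barcodeDetection.anchorUpdates {
        switch update.event {
        case .added:
            // e.g. highlight the box and look up the decoded value
            // (payloadString is assumed from Apple's announcement)
            print(update.anchor.payloadString ?? "unreadable payload")
        case .updated, .removed:
            break
        }
    }
}
```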
Neural Engine Access
Enterprise developers will have the option to tap into Vision Pro’s neural processor to accelerate machine learning tasks. Previously developers could only access the compute resources of the headset’s CPU and GPU.
Object Tracking
Although the new Object Tracking feature is coming to VisionOS 2 more broadly, there are additional enhancements to this feature that will only be available to enterprise developers.
Object Tracking allows apps to include reference models of real-world objects (for instance, a model of a can of soda), which can be detected and tracked once they’re in view of the headset.
Enterprise developers will have greater control over this feature, including the ability to tweak the max number of tracked objects, deciding to track only static or dynamic objects, and changing the object detection rate.
Greater Control Over Vision Pro Performance
Enterprise developers working with VisionOS 2 will have more control over the headset’s performance.
Apple explains that, out of the box, Vision Pro is designed to strike a balance between battery life, performance, and fan noise.
But some specific use-cases might need a different balance of those factors.
Enterprise developers will have the option to increase performance by sacrificing battery life and fan noise. Or perhaps stretch battery life by reducing performance, if that’s best for the given use-case.
There are more new developer features coming to Vision Pro in VisionOS 2, but those above will be restricted to enterprise developers only.
If you watched the WWDC 2024 keynote, you’ll have seen the announcement of VisionOS 2 and some highlights of what’s coming to the Vision Pro headset. But there’s lots more that Apple didn’t actually show on stage. Here’s everything we know so far.
Beyond what Apple showed of VisionOS 2 in the WWDC keynote, there’s a lot more coming when the update launches this Fall.
We’ve scoured everything Apple has released about the forthcoming Vision Pro update to get a comprehensive picture of everything that’s coming. Let’s jump right in.
Photos App Upgrades
The Photos app is getting additional enhancements in VisionOS 2. While the iPhone can already capture spatial photos and videos, doing so requires switching to a special mode during capture. In VisionOS 2, the Photos app will be able to convert existing 2D photos into 3D photos to view on Vision Pro. As far as we know, this will not be an option for videos.
We’ve seen similar 2D to 3D conversion tech from other companies in the past, but with mixed results. It will be interesting to see how well Apple’s approach works.
The Photos app is also getting proper SharePlay support, including Spatial Personas. While it was possible previously to stream the Photos app as a flat window with friends over FaceTime, now the app will support sharing full spatial content (photos and videos) and panoramas with other Vision Pro users.
Videos in the Photos app can also now be trimmed just like on iPhone and iPad.
Mac Virtual Display Upgrades
At launch, one of Vision Pro’s most useful features was Mac Virtual Display, which made it effortless to stream your Mac screen right into the headset. And while you could get a nice big monitor, it was stuck in one aspect ratio.
While many asked for support for multiple virtual monitors, in VisionOS 2 Apple opted to add new ‘wide’ and ‘ultrawide’ monitor options instead.
From a screen real estate standpoint, Apple says the ultrawide mode is “the equivalent of having two 4K displays sitting side by side.”
New Gestures for Easier Navigation
Considering how often you open the app menu and Control Center while using Vision Pro, we were surprised at how clunky these actions were on the headset.
Apple heard that feedback loud and clear, and has added two new gestures to make this easier in VisionOS 2.
To open the home screen users can turn their hand palm-up and pinch their fingers. A palm-down pinch will open the clock and make it easy to get to notifications and the Control Center where many helpful tools (like screen casting and recording) are found.
Customizable App Menu
It’s a simple feature that was missing at launch: VisionOS 2 will allow users to rearrange their app menu, just as they would expect on iPhone and iPad.
You’ll also be able to pull apps out of the ‘Compatible Apps’ folder to put them in the main app menu for easier access.
Immersive Background Improvements
VisionOS shipped with a handful of beautiful immersive background environments, but also several tantalizing “coming soon” bubbles that we’ve been waiting patiently for.
VisionOS 2 will bring a new Bora Bora environment, taking users to an island in the South Pacific.
‘Cinema mode’ puts movies on Vision Pro onto a massive screen that naturally merges with the selected immersive background. But at launch this was restricted to apps like Apple TV.
With VisionOS 2, web videos in Safari will have the option of being presented in the same way. Apple says this will work for YouTube, Netflix, and Amazon videos out of the gate, and presumably web developers will be able to enable this functionality in other players too.
Users will also now be able to tilt videos upwards, while in cinema mode, to make reclined viewing more comfortable.
Keyboard Visible Through Immersive Backgrounds
Speaking of immersive backgrounds, at launch Vision Pro had excellent hand occlusion, which brings your own hands into immersive environments. Unfortunately, if you were using a Magic Keyboard, the keyboard itself would disappear in an immersive environment, even though your hands would show on top of where it should be.
VisionOS 2 makes the natural improvement of showing the full keyboard even while an immersive background is visible.
Support for Third-party Mice
At launch, Vision Pro could only be paired with Apple’s Magic Trackpad; mice weren’t supported (not even the company’s own Magic Mouse). VisionOS 2 will add support for the Magic Mouse, and presumably third-party mice and trackpads as well.
Guest User Enhancements
It’s always fun to show people Vision Pro for the first time. And the built-in Guest User feature makes this easy. Unfortunately it takes a minute or two for the Guest User to calibrate every time they don the headset.
In VisionOS 2, Guest User can remember the last guest’s calibration data for up to 30 days. So if they need to take a break or you want to grab the headset for a minute to navigate them to a new experience, they can put it back on and skip the calibration step.
Watch Multiple Sports Games in Apple TV
In VisionOS 2, Apple TV will allow users to watch up to five different games at the same time—between MLS and MLB.
Persona Enhancements
Apple’s Persona avatars have been steadily getting better since the launch of Vision Pro, and VisionOS 2 promises still more improvements.
Apple says users can expect “more accurate skin tones and vibrant clothing colors.” It also promises more natural hand movements.
And if you’re using your Persona as a ‘webcam’ (ie: a windowed view, not a Spatial Persona), you’ll no longer be stuck with the default background that every Vision Pro user gets. With VisionOS 2, Apple says you’ll be able to choose from “a variety of backgrounds,” hopefully including your own photos.
Look-to-Dictate in Messages
Look-to-Dictate is a convenient part of Safari, where you just look at the microphone icon in the URL bar, and it prompts you for voice input (skipping the keyboard).
VisionOS 2 will bring the same feature to Messages to make it easier to send texts hands free.
AirPlay to Vision Pro
It’s always been possible to cast content from Vision Pro, but not to cast content to Vision Pro from other AirPlay devices.
VisionOS 2 will make this small but obvious improvement, meaning you’ll be able to cast from your iPhone, Mac, and other AirPlay compatible devices, and see the video pop up in your headset.
Mindfulness App Breathing Detection
Mindfulness is Apple’s first-party meditation app for Vision Pro. In VisionOS 2 the app will now be able to detect your breathing rate and use it to match the app’s visualizations. This is presumably done using the headset’s microphones.
Train Support for Travel Mode
Travel Mode in VisionOS 2 will explicitly support use on trains. The mode changes the headset’s tracking behavior to ensure tracking works accurately even while in a moving vehicle.
Panorama Viewing on the Web
In the Photos app, panoramic photos can be stretched around you to experience them in an immersive way. That same feature will now work in Safari for panoramic photos hosted on the web. We don’t know at this point whether the headset will automatically detect panoramic photos or whether webpages will need to declare them as such in order to use this feature.
Apple Music Karaoke
The Apple Music app in VisionOS 2 will get support for collaborative playlists between friends, and Apple Music’s ‘Sing’ feature which works like karaoke. It sounds as if friends will even be able to use Sing together through FaceTime.
Live Captions
The Live Captions feature is also coming to VisionOS 2. This acts like ‘real time subtitles’ by showing in the headset a written version of the words being spoken around you, or being played through the headset in audio or videos. Primarily intended as an accessibility feature for those with hearing disabilities, the feature may one day support real-time translation too.
Improved Quick Look
Quick Look is the easiest way to show realistic 3D models in augmented reality at scale—even directly from a web page. VisionOS 2 will bring improvements to Quick Look, like being able to anchor objects to surfaces and change colors and materials without loading additional models. Apple envisions this as being especially useful for previewing how a piece of furniture might fit in your room.
Communication Safety in VisionOS Apps
VisionOS 2 will bring Apple’s Communication Safety feature to the headset. The feature, intended primarily for children, “gives you the option to blur sensitive photos and videos you receive before you choose whether to view them.”
There’s a bunch of other changes and improvements coming for developers too; stay tuned for more coverage on that front.
Today at WWDC 2024, Apple revealed VisionOS 2, the first major update for its Vision Pro headset. The new version of the software will be available for developers to experiment with starting today.
VisionOS 2 is primarily designed to round out some rough edges from the headset’s release earlier this year, while adding some new features and also expanding development capabilities so developers can take greater advantage of the headset.
While it won’t release publicly until the fall, Apple says developers can get their hands on VisionOS 2 starting today. We haven’t spotted the direct page to the developer preview update yet, but the company says it will be available through its official developer website.
VisionOS 2 is bringing a range of new features and improvements like 2D to 3D photo conversion, ultrawide Mac Virtual Display, new ways to navigate the headset’s core interface, and much more. We’ll be breaking down the full range of features soon; stay tuned to the front page!
The original Quest headset, launched in 2019, has been on its way out for quite some time. Now it’s getting a final farewell from Beat Saber, as the game’s new OST 7 music pack is the last that will reach the headset.
Beat Saber’s newest free music pack, OST 7, is available now on all platforms (though temporarily delayed on PS4). The pack brings with it five new tracks:
F.O.O.L — “Damage”
Camellia — “Lustre”
Teminite x Boom Kitty — “The Master”
Lindsey Stirling — “Untamed”
Nitro Fun — “World Wide Web”
It also includes newly upgraded background visuals:
OST7 comes with a brand-new environment called “Collider,” which builds on the latest lighting tech that was first introduced in the Daft Punk Music Pack. The team also expanded on laser physics, so in addition to colliding, lasers can now reflect from certain surfaces (even multiple times). That adds up to more breathtaking light shows and effects moving forward.
While Beat Saber OST 7 is available on all Quest headsets, it’s the last new music pack that will reach Quest 1. Meta confirmed that no future music packs or content updates will come to the original Quest headset. The company plans to end Beat Saber multiplayer and leaderboards on Quest 1 by November 2, 2024.
Luckily Meta reminds players that Beat Saber is tied to their Meta account, so if they choose to upgrade to a newer Quest headset—or want to play on PC via Quest Link—they’ll be able to jump right back into the latest version of the game, complete with all purchased DLC.
When Quest 1 launched, it was something of a revelation that it could handle Beat Saber at all. Not because the graphics were too heavy or because the game was too big, but because no standalone headset up to that point had motion controllers that were accurate enough to really make the game shine.
Bringing Beat Saber to Quest helped propel the game from ‘big’ to ‘massive’—making it easier and more affordable than ever to play VR’s killer app—and pushed it to become one of VR’s most commercially successful titles.
With the success it saw from Quest 1, Meta quickly doubled down on the standalone headset concept. It was only a little more than a year later that it launched Quest 2—a cheaper and more powerful version of the headset.
And while Quest 2 is still chugging along nearly four years later, the original Quest started to be phased out some time ago.
To Meta’s credit, Beat Saber has supported the aging Quest 1 longer than most. The game still has feature parity with every other platform, despite the headset being more than five years old.
With Quest 1’s birth so intertwined with Beat Saber, the end of support marks a significant milestone in the headset’s epilogue.
Two anticipated VR games, Behemoth and Alien: Rogue Incursion, showed their first glimpse of gameplay during PlayStation’s State of Play presentation today. Both are set to launch across all major VR platforms later this year.
Behemoth First Look at VR Gameplay
Kicking things off with Behemoth: this is the latest game from developer Skydance Interactive, the studio behind The Walking Dead: Saints and Sinners series.
Behemoth is due out “Fall” 2024, and coming to PSVR 2, PC VR, and Quest (Skydance hasn’t confirmed exactly which Quest headsets the game will support, so the aging Quest 2 remains in question).
Over at the PlayStation Blog, the developer shared a description of the gameplay. Here are a few choice bits that caught our eye:
We designed this dark fantasy to stretch the boundaries of what’s possible inside a headset, creating a grim new world teeming with mystery and dangers great and small. When you enter the Forsaken Lands, one thing is clear: You are not welcome here. Everything and everyone wants to kill you. Are you bold enough to forge ahead?
The Forsaken Lands are your gauntlet and your hunting ground. Equipped with a grappling hook and unnatural grip strength, you’ll climb, zip, and reel to ascend or descend walls, towers, and cliffsides. Surmount barriers hindering your path, explore nooks and crannies to claim hard-to-reach items, or grasp onto unsuspecting foes and cast them into the abyss.
Building on our work on The Walking Dead games, we have continued to push our gore-tech and dismemberment systems in a way that players should find truly satisfying.
Your foes are not mindless zombies, they’re skilled, strategic warriors. Stay on your toes. Avail yourself of every weapon you can grasp. Block, parry, hack, and slash to wear their strength down. Then crack skulls, drive blades through hearts, sever limbs and cleave everything in between.
Alien: Rogue Incursion First Look at VR Gameplay
Next we’ve got Alien: Rogue Incursion, the first major VR game in several years from veteran VR studio Survios.
Alien: Rogue Incursion is planned for release “Holiday” 2024, and is coming to PSVR 2, PC VR, and Quest 3 (the only Quest headset that will be supported).
Dynamically spawning and pathing with countless unique possibilities, even we couldn’t tell you exactly where and when each Xenomorph will strike, let alone what strategies it might use or if it’ll bring some friends.
While heart-pounding combat is the core of Alien: Rogue Incursion, grabbing your pulse rifle and blasting your way out of every situation is not always the best strategy. Constantly hunted by unpredictable and resourceful Xenomorphs, players will often find creativity and a level head to be their greatest weapons, especially when it comes to leveraging the environment to their advantage.
Google and Magic Leap today announced a “strategic technology partnership.” The move shows Google seeking to gain ground to keep up with the likes of Meta, Apple, and others in a race to control the AR headset market.
While consumer VR headsets have been around for years now, none of the world’s major technology companies have launched consumer-focused AR glasses.
But behind the scenes, companies like Meta, Apple, and Google are racing each other toward that future—one they hope will see all-day AR glasses become as big as the smartphone.
Google has had several starts and stops in the XR space. Google Cardboard was first out the door way back in 2014, and introduced millions to a very basic VR experience made by slapping a smartphone into a cardboard viewer with simple lenses.
The company got more serious with Google Daydream, making a more streamlined experience with phones that were specially certified to work with more advanced smartphone viewers.
And while Daydream ultimately failed, it wasn’t just Google’s fault. The whole concept of smartphone VR viewers never found a foothold. Meta and Samsung’s similar Gear VR project faced the same fate. And even Oculus Go—a low cost headset that further streamlined the ‘smartphone VR’ experience by building the smartphone components directly into the headset—couldn’t make the formula work.
Higher fidelity VR, with full tracking and motion controllers, seemed to be the way forward as both PC VR headsets and Meta’s Quest line had shown.
But even those headsets are ultimately stepping stones in the eyes of Google, Meta, and Apple. All are gunning for a future where the reality-altering experience of VR can be merged with the real-world through lightweight AR glasses that people can wear all day.
While Google continues to build out its ARCore software platform (which lets Android developers build phone-based AR apps), the company still needs to get a head-worn AR platform off the ground if it hopes to compete with peers like Meta and Apple, which are already gaining a foothold in head-worn AR with devices like Quest 3 and Vision Pro.
If Google wants to leapfrog over the preliminary step of building an MR headset like Meta and Apple, one of the biggest barriers is the challenge of optics. Producing a high-resolution, wide-field of view image through something the size of a pair of glasses is an incredible physics challenge that hasn’t been cracked by anyone just yet.
Optics may be one reason why Google has had starts and stops with its own efforts to build a pair of AR glasses. A major internal project to build such a device, called Project Iris, was reportedly shut down sometime last year.
Such starts and stops have seemingly left Google trailing the others in this race. But the company hasn’t given up yet.
Now it’s tapping AR headset maker Magic Leap in an effort to secure key technology needed to make a compact AR device. The “multi-faceted, strategic technology partnership” with Google was announced today by Magic Leap, and very specifically calls out optics as being a key driver behind the arrangement.
Shahram Izadi, Vice President and General Manager of AR/XR at Google, said, “We look forward to bringing together Magic Leap’s leadership in optics and manufacturing with our technologies to bring a wider range of immersive experiences to market. By combining efforts, we can foster the future of the XR ecosystem with unique and innovative product offerings.”
Magic Leap Chief Technology Officer, Julie Larson-Green said, “This partnership accelerates the transformative power of AR by combining our extensive optics capabilities with Google’s technologies to continue to advance immersive experiences to the developer ecosystem and for customers. We are looking forward to expanding the potential of XR – blending the physical world with valuable, contextually relevant solutions.”
Magic Leap has had starts and stops of its own. But through the course of the company’s wild fundraising ride, near failure, and eventual stabilization, Magic Leap has amassed a large number of patents relating to XR technology. Its latest headset, Magic Leap 2, also has one of the widest fields-of-view of any device its size with transparent optics.
And that’s probably where Google’s interests lie.
Whether adopting a more advanced version of the optics in Magic Leap 2, or leaning into a novel solution that’s been developed and patented by Magic Leap, it seems Google is hoping to buy itself a shortcut to market, at least as far as the optics are concerned.
It’s still unclear at this point exactly how this partnership will manifest, but the limited information we have points in a particular direction: Google may be developing its own headset, using Magic Leap’s optics technology, but ultimately the whole headset may be manufactured by Magic Leap.
This would leave Magic Leap to continue its enterprise-focused business while still earning money from the consumer AR space that Google and others are trying to bring to fruition.
And still other strategic plays are possible… Samsung went completely unmentioned in the Google and Magic Leap announcement—where does it fit into this picture? For now only time will tell.
The Marvel What If…? experience is set to launch tomorrow exclusively on Vision Pro. We got an early look at the experience and found a polished presentation that foreshadows the possibilities and pitfalls of immersive VR games on the headset.
Marvel’s What If…? on Vision Pro is being described as an “interactive Disney+ Original story.” And while it isn’t meant to be a full-blown VR game, it’s the headset’s biggest fully immersive production to date—most other native ‘games’ on the headset feel a lot like board games that float in front of you.
In What If…? players experience a narrative blended with casual gameplay driven by Vision Pro’s hand-tracking.
Made by Marvel and ILM Immersive, the production quality meets the high bar you’d expect from this kind of experience. Though scarcely an hour long, the experience is polished with strong sound, visuals, and voice acting throughout.
Fittingly for the emphasis on the Marvel multiverse, the narrative plays out across several distinct mediums. Players will see Marvel characters between their own living room (thanks to augmented reality), in fully immersive virtual spaces (thanks to virtual reality), and in 3D animated videos that play on floating screens in front of you.
The jump between these distinct contexts is actually handled very smoothly (aside from an awkward lack of environmental occlusion in AR), whether taking a portal from your living room into a fully immersive space, or seeing a 3D video playing back on a crystal shard floating in front of you (as a representation of some vignette from another timeline).
From a gameplay standpoint, What If…? shows us both the potential and likely pitfalls of fully immersive experiences on Vision Pro.
On the one hand (pun intended), the game makes use of hand-tracking to detect a range of gestures. You’ll use different gestures to summon an arm shield, grab distant objects with a force-like power, or shoot lasers from your hands. Some more complex gesture interactions are used effectively too, like a spell where you pinch your fingers together on both hands then pull them apart to draw a line, rotate your hands to twist the line, then finalize the spell by opening your hands and pushing them together, palm-to-palm.
But even though all of these gestures work reasonably well for this casual experience, they ultimately feel just like that—gestures: simple, prescribed hand motions. You’re never actually touching anything in the experience… just making a gesture and watching it cause things to happen.
I’ve said it before: the least interesting interactions in VR are those that happen far away from you. In VR, it’s not the distant bad guy falling over after you’ve shot them with an arrow that makes VR unique. What’s unique about VR is the fact that you can literally reach over your shoulder to pull out an arrow, nock it to your bow, stretch the string back, aim with one eye, and release—like you had an actual bow in your hands.
Now those reading along carefully should stop to think “isn’t the bow scenario you just described really just a series of ‘gestures’?” Sure. If you want to use the broadest definition of that word, you could dump it into that category. But it’s the fidelity of the gestures that we’re concerned about. It’s the difference between wiggling a Wiimote to make the character on screen swing their sword, versus actually swinging the sword yourself—dictating with precision exactly where it should land on the enemy.
There’s no doubt that Vision Pro as a headset is capable of doing the latter. But it has to be said… it’s damn tough without the precision of tracked controllers.
Falling back to hand gestures—and using those hand gestures to cause distant things to happen—isn’t highly engaging. Not being able to reach out and touch the virtual world around you—even to just pick up a rock and throw it with your own hand—diminishes much of the magic of VR.
And that’s where we find What If…?. It’s doing more than I even would have expected with Vision Pro’s hand-tracking. The gestures are varied and detected fairly consistently, but what you do with them feels kind of… detached. I never managed to cross the chasm from ‘I’m doing a hand gesture’ to ‘I’m casting a spell’… and further still from ‘I’m actually interacting with this world’.
Now to be fair, that’s partly due to the experience’s gameplay itself. It’s very casual by the standards of any ‘gamer’; easy enough that most people will be able to figure it out. That’s clearly on purpose; this is supposed to be a narrative as much as an interactive experience. Marvel and ILM Immersive could have made the gameplay deeper and squeezed a bit more engagement out of the hand-tracking, but it’s hard to see it crossing that chasm I talked about above.
But let’s concede that this casual gameplay is ultimately in service of telling a story. Is the narrative any good? Well, my opinion will surely be colored by the fact that I’m not terribly attuned to the Marvel Cinematic Universe (even if I’ve caught the rough strokes over the years). Dropping me into a What If…? experience—that’s expressly about throwaway alternate timelines—certainly doesn’t lend itself to meaningful character development or real stakes. The narrative has the distinct feeling of being a framework for this experimentation in this new medium, rather than a story unto itself.
So in the end, is What If…? worth checking out on Vision Pro? Sure. I can’t say it was terribly fun, but hey, it’s free and well produced. And if not for the entertainment value, it’s a clear lesson in what can and probably can’t be done with the headset’s hand-tracking input. In the end this will go down as an early experiment, but certainly not a killer app.
Asgard’s Wrath 2 is easily Meta’s biggest first-party Quest game to date. But when it launched almost six months ago it was clear it hadn’t been optimized for Quest 3, the company’s flagship headset. A newly released update finally changes that.
The Update
Asgard’s Wrath 2 finally got a meaty Quest 3 graphics update today. The latest version of the game, v5.0.1535650, brings several gigabytes worth of enhanced textures and shaders that make better use of Quest 3’s impressive resolution.
After downloading the upgrade, players can enable the improved assets by going to Settings → User Interface → Enhanced Rendering Features.
While this setting previously existed, it’s now the toggle for the new improvements as well.
Unfortunately the scope of these improvements means Enhanced Rendering Features can no longer be used at the same time as the game’s 90Hz framerate mode. Players will have to choose between the two, effectively giving Asgard’s Wrath 2 a ‘Performance Mode’ and ‘Graphics Mode’, a common way for modern console games to let players choose between favoring framerate or visual fidelity.
Not only does the Quest 3 graphics upgrade for Asgard’s Wrath 2 add improved textures, it also features significantly better specularity—small, bright highlights that give a more realistic look to the textures when seen from various angles. This makes textures look less like cardboard and also makes the game’s lighting overall appear more cohesive.
Quest 2 to Quest 3 Comparison
The Quest 3 graphics update for Asgard’s Wrath 2 shows new levels of detail compared to what players can see on Quest 2. Meta shared a comparison of improvements ranging from small changes in sharpness to completely transformed textures:
The Story
Quest 3 launched in late 2023, just in time to be Meta’s hot holiday product. Alongside the headset, the company prominently advertised its biggest first-party game, Asgard’s Wrath 2, apparently as a headset-seller. The company even bundled the game with Quest 3… or at least promised a free copy once it eventually launched.
It was understandably confusing for those who bought Quest 3 around its mid-October 2023 launch, when it became apparent that Asgard’s Wrath 2 wasn’t actually available to play when the headset hit the market. It would be another two months until the game actually launched, on December 15th, 2023.
But hey, at least it landed in time for Christmas. So those who had already bought it, and anyone lucky enough to open a Quest 3 for the holiday, would be ready to dive into this hot new Quest 3 seller… right?
Well, sure. But there was one problem. When Asgard’s Wrath 2 launched, the game’s visual presentation—though impressive compared to most Quest games—was clearly not tapping the potential of the brand new Quest 3. In fact, it looked very much like the game was primarily optimized with Quest 2 in mind… a headset which at that point was more than three years old.
For a game heavily advertised and directly bundled with Quest 3, this was understandably a letdown to players hoping for a game that could really show what the new headset could do.
And ultimately this makes perfect sense. Marketing aside, Quest 2 remains one of the most popular VR headsets—still more popular than Quest 3 by all accounts—so Meta needed to ensure the game played great on Quest 2 as a baseline experience.
Alas, while first-party Meta studio Sanzaru Games did manage to add some Quest 3-specific improvements to Asgard’s Wrath 2 shortly after launch (like increased draw distance, higher resolution rendering, and a better framerate), it still felt like Quest 3 was nearly an afterthought.
It took two months after Quest 3’s launch for players to actually play Asgard’s Wrath 2, and then another five months for it to get a serious visual enhancement for Quest 3, but the game finally feels at home on the company’s flagship headset.
Beat Saber is the VR rhythm game behemoth that can’t be stopped. Just two months after dropping the Daft Punk music pack, Beat Games is teasing its next music pack, the free OST 7 pack.
Beat Saber launched as a little-known indie game in early access on SteamVR all the way back in May 2018. It was clear then that it was something special, but I don’t think anyone can say they knew just how big it would be—it reportedly surpassed $250 million in lifetime sales as of 2023.
The game has long since reached its 1.0 version and also landed on Quest, PSVR 1, and PSVR 2. Its ‘easy to play, hard to master’ gameplay kept people coming back for more, catching the eye of Meta which ultimately acquired developer Beat Games in 2019. And though that put VR’s killer app solely in the hands of Meta, the company has made good on its promise to keep Beat Saber a multiplatform game, offering feature parity, equal music pack availability, and even cross-play multiplayer between platforms.
Over the years Beat Saber has seen the release of additional music packs, many from major artists. While most are paid, a solid lineup of free packs have been released over the years, and now Beat Saber is getting another one.
Beat Games teased the OST 7 music pack today, but offered little more than “coming soon.”
All the previous ‘OST’ music packs for the game have been free, and it’s unlikely this one will be any different. We have no hints as to how many or which tracks will be included, but most of the ‘OST’ packs have included original and indie music, with a handful of bangers made specifically for the game.
When it does land, you can expect to find the Beat Saber OST 7 music pack on SteamVR, Quest, PSVR, and PSVR 2.
Will Beat Saber still be pumping out music packs another six years from now? As incredible as that would be (a 12-year-old VR game staying relevant enough to continue to update), betting against Beat Saber seems like a bad idea.