Hands-on: CREAL’s Light-field Display Brings a New Layer of Immersion to AR

More than four years after I first caught wind of their tech, CREAL’s light-field display continues to be one of the most interesting and promising solutions for bringing light-fields to immersive headsets. At AWE 2023 I got to check out the company’s latest tech and saw firsthand what light-fields mean for immersion in AR headsets.

More Than One Way to Focus

So first, a quick recap. A light-field is a fundamentally different way of showing light to your eyes compared to the typical displays used in most headsets today. The key difference is about how your eyes can focus on the virtual scene.

Your eyes have two focus methods. The one most people are familiar with is vergence (the basis of stereoscopy), where both eyes rotate to point at the same object, bringing their overlapping views of that object into alignment. This is also what makes things look ‘3D’ to us.

But each individual eye is also capable of focusing in a different way by bending the lens of the eye to focus on objects at different distances—the same way that a camera with only one lens focuses. This is called accommodation.

Vergence-Accommodation Conflict

Most XR headsets today support vergence (stereoscopic focus), but not accommodation (single-eye focus). The resulting mismatch is called the Vergence-Accommodation Conflict, known in the industry as ‘VAC’ because it’s a pervasive challenge for immersive displays.

The reason for the ‘conflict’ is that normally the vergence and accommodation of your eyes work in tandem to achieve optimal focus on the thing you want to look at. But in a headset that supports vergence but not accommodation, your eyes must decouple these normally synchronized functions and operate them independently.

It might not be something you ‘feel’ but it’s the reason why in a headset it’s hard to focus on things very near to you—especially objects in your hands that you want to inspect up close.

The conflict between vergence and accommodation isn’t just uncomfortable for your eyes; in a surprising way, it can also rob the scene of immersion.

CREAL’s Solution

And this is where we get back to CREAL, a company that wants to solve the Vergence-Accommodation Conflict with a light-field display. Light-field displays structure light in the same way that we see it in the real world, allowing both of the focus functions of the eyes—vergence and accommodation—to work in tandem as they normally do.

At AWE 2023 this week, I got to check out the company’s latest light-field display tech, and came away with an added sense of immersion that I haven’t felt in any other AR headset to date.

I’ve seen CREAL’s bench-top demos before, which show static floating imagery through the lens to a single eye, demonstrating that you can indeed focus (accommodate) at different depths. But you won’t really see the magic until you see a light-field with both eyes and head-tracking, which is exactly what I got to do this week at AWE.

Photo by Road to VR

On an admittedly bulky proof-of-concept AR headset, I got to see the company’s light-field display in its natural habitat—floating immersively in front of me. What really impressed me was when I held my hand out and a little virtual turtle came floating over to the palm of my hand. Even though it was semi-transparent, and not exceptionally high resolution or accurately colored, it felt… weirdly real.

I’ve seen all kinds of immersive XR experiences over the years, and holding something in your hand sounds like a banal demo at this point. But there was just something about the way this little turtle looked—thanks to the fact that my eyes could focus on it in the same way they would in the real world—that made it feel more real than anything I’ve experienced in other headsets. Like it was really there in my hand.

Photo by Road to VR

The trick is that, thanks to the light-field, when I focused my eyes on the turtle in my hand, both the turtle (virtual) and my hand (real) were each in proper focus—something that isn’t possible with conventional displays—making both my hand and the turtle feel more like they were inhabiting the same space right in front of me.

It’s frustratingly impossible to explain exactly how it appeared via text alone; this through-the-lens video from CREAL gives some idea of what I saw, but can’t quite show how it adds immersion over other AR headsets:

It’s a subtle thing, and the added immersion probably only meaningfully impacts objects within arm’s reach or closer. But then again, that’s the distance at which things have the greatest potential to feel real to us, because they’re in our carefully watched personal space.

Digital Prescriptions

Beyond just adding a new layer of visual immersion, light-field displays stand to solve another key problem: vision correction. Most XR headsets today don’t support any kind of prescription vision correction, which means that perhaps more than half of the population must either wear their corrective lenses while using these devices, buy some kind of clip-on lens, or just suffer through a blurry image.

But the nature of light-fields means you can apply a ‘digital prescription’ to the virtual content that exactly matches the user’s corrective prescription. And because it’s digital, this can be done on-the-fly, meaning the same headset could change its digital corrective setting from one user to the next. Doing so means the focus of the virtual image can match the real-world image for users with and without glasses.
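To make the idea concrete, here’s a minimal sketch of the diopter arithmetic behind a ‘digital prescription’. This is thin-lens arithmetic only, not CREAL’s actual implementation; the function name and sign convention (negative diopters for myopia) are my own assumptions for illustration:

```python
# Minimal sketch of 'digital prescription' diopter arithmetic (thin-lens
# approximation). Not CREAL's implementation; the function name and sign
# convention (negative diopters = myopia) are assumptions for illustration.

def digitally_corrected_depth(object_distance_m: float,
                              prescription_diopters: float) -> float:
    """Focal distance the display should present so an object appears
    sharp to an uncorrected eye with the given spherical prescription.

    Optical powers add in diopters (1/meters): 1/d' = 1/d - P.
    """
    demand = 1.0 / object_distance_m - prescription_diopters
    if demand <= 0:
        # Object lies beyond the eye's far point; present at optical infinity.
        return float('inf')
    return 1.0 / demand

# A -2 D myope viewing an object 2 m away: present it at 0.4 m so it
# lands within their unaided focus range.
```

With a prescription of 0 the function simply returns the object’s true distance, which is the whole point: one piece of hardware, per-user focus.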

Continue on Page 2: A More Acceptable Form-factor »

5 VR Games We’re Most Excited for From Quest Gaming Showcase

Meta dumped an avalanche of VR news today in its hour-long Quest Gaming Showcase livestream, revealing trailers and info on more than a dozen new games coming to Quest 2, Quest Pro, and probably also the newly unveiled Quest 3 headset.

Here’s what we’re most excited about:

Asgard’s Wrath 2

Image courtesy Sanzaru Games

That’s right, the sequel to hit Rift title Asgard’s Wrath (2019) is coming to Quest this winter, bringing with it a ton of new places to explore and a cast of new companions and puzzles. Meta’s Sanzaru Games says we should expect physics-based melee and a more intuitive combat system altogether, not to mention a brand-new realm to explore which will bring us to a reimagined Ancient Egypt. Catch the full announce here. Also, check out the trailer on YouTube (age-restricted).

I Expect You To Die 3: Cog in the Machine

We’ve known about the upcoming sequel to the hit spy-flavored puzzle game I Expect You To Die, but it seems every new trailer that pops up is just another opportunity to salivate at the Bond-style escape room’s ingenious puzzles and patently dastardly villains. It’s coming to the Quest platform and PSVR 2 “soon,” developer Schell Games says. Catch the full announce here.

Attack on Titan VR: Unbreakable

The trailer for Attack on Titan VR: Unbreakable is admittedly not gameplay, although it’s easy to see where it’s headed, as Japanese studio UNIVRS seems to be heavily suggesting gameplay elements here. It’s bringing both single player and co-op modes to the Titan-killing, swing-tastic game, letting you play in Japanese or English, subs included. It’s coming to the Quest platform in winter 2023, which is a little later than the previously announced summer 2023 launch window, but better late than never. Catch the full announce here.

Dungeons of Eternity

Although it’s hard to get super excited about a roguelike dungeon crawler—there are a few really great ones out there already—Dungeons of Eternity is coming to the Quest platform this year from a studio called Othergate, which was founded by a bunch of ex-Oculus Studios game designers. The 1-3 player co-op dungeon crawling RPG also incorporates physics-based combat, which is pretty refreshing to see since it focuses on melee as well as archery and magic. Catch the full announce here.

Stranger Things VR

TV show game tie-ins are pretty hit and miss (mostly miss), but we can actually vouch for the studio developing this Stranger Things VR game, coming to Quest this fall from VR pioneers Tender Claws. You may know Tender Claws for its games Virtual Virtual Reality 1 and 2 and The Under Presents, three spectacular titles that really just get what makes VR great. Be the bad guy, Vecna. Do bad shit. Catch the full announce here.

– – — – –

Assassin’s Creed: Nexus VR

Image courtesy Ubisoft

Ok, just one more, but it’s technically outside of our lineup since we didn’t exactly get an eyeful of Ubisoft’s upcoming Assassin’s Creed game today like we hoped. The game is now confirmed to be officially called Assassin’s Creed: Nexus VR (the only real Assassin’s Creed news from the showcase).

The actual reveal is said to come during Ubisoft’s Forward livestream event taking place on June 12th though, so we’re closer than ever to learning whether Ubisoft is set to faithfully translate the franchise’s high-flying, time-tripping assassin into VR. Catch the full announce here.


There were a ton of games announced today. Which one are you looking forward to the most? Let us know in the comments below!

The 20 Best Rated & Most Popular Quest Games & Apps – May 2023

While Oculus doesn’t offer much publicly in the way of understanding how well individual games & apps are performing across its Quest 2 storefront, it’s possible to glean some insight by looking at apps relative to each other. Here’s a snapshot of the 20 best rated Oculus Quest games and apps as of May 2023.

Some quick qualifications before we get to the data:

  • Paid and free apps are separated
  • Only apps with more than 100 reviews are represented
  • App Lab apps are not represented (see our latest Quest App Lab report)
  • Rounded ratings may appear to show ‘ties’ in ratings for some applications, but the ranked order remains correct

Best Rated Oculus Quest 2 Games & Apps – Paid

The rating of each application is an aggregate of user reviews and a useful way to understand the general reception of each title by customers.

Rank Name Rating (# of ratings) Rank Change Price
#1 Moss: Book II 4.89 (594) $30
#2 The Room VR: A Dark Matter 4.89 (12,603) $30
#3 Puzzling Places 4.87 (1,770) $15
#4 Walkabout Mini Golf 4.86 (10,195) $15
#5 I Expect You To Die 2 4.85 (2,757) $25
#6 Swarm 4.82 (2,341) ↑ 3 $25
#7 COMPOUND 4.81 (473) $20
#8 PatchWorld – Make Music Worlds 4.81 (160) ↑ 3 $30
#9 I Expect You To Die 4.81 (5,269) ↑ 3 $25
#10 Moss 4.8 (6,534) ↑ 3 $20
#11 DYSCHRONIA: Chronos Alternate 4.8 (368) ↓ 1 $20
#12 Ragnarock 4.8 (1,277) ↑ 4 $25
#13 ARK and ADE 4.8 (139) ↑ 2 $10
#14 Cubism 4.79 (795) ↑ 3 $10
#15 Red Matter 2 4.79 (1,174) ↓ 1 $30
#16 Ancient Dungeon 4.79 (915) ↑ 2 $20
#17 Eye of the Temple 4.79 (144) New $20
#18 GOLF+ 4.79 (18,143) ↑ 4 $30
#19 Into the Radius 4.78 (4,134) $30
#20 Pistol Whip 4.78 (9,508) ↑ 1 $30

Rank change & stats compared to April 2023

Dropouts:
Breachers, Vermillion, The Last Clockwinder

  • Among the 20 best rated Quest apps
    • Average rating (mean): 4.8 out of 5 (±0)
    • Average price (mean): $23 (±$0)
    • Most common price (mode): $30 (±$0)
  • Among all paid Quest apps
    • Average rating (mean): 4.2 out of 5 (±0)
    • Average price (mean): $20 (±$0)
    • Most common price (mode): $20 (±$0)

Continue on Page 2: Most Popular Paid Oculus Quest Apps »

Eye-tracking is a Game Changer for XR That Goes Far Beyond Foveated Rendering

Eye-tracking—the ability to quickly and precisely measure the direction a user is looking while inside of a VR headset—is often talked about within the context of foveated rendering, and how it could reduce the performance requirements of XR headsets. And while foveated rendering is an exciting use-case for eye-tracking in AR and VR headsets, eye-tracking stands to bring much more to the table.

Updated – May 2nd, 2023

Eye-tracking has been talked about with regards to XR as a distant technology for many years, but the hardware is finally becoming increasingly available to developers and customers. PSVR 2 and Quest Pro are the most visible examples of headsets with built-in eye-tracking, along with the likes of Varjo Aero, Vive Pro Eye and more.

With this momentum, in just a few years we could see eye-tracking become a standard part of consumer XR headsets. When that happens, there’s a wide range of features the tech can enable to drastically improve the experience.

Foveated Rendering

Let’s first start with the one that many people are already familiar with. Foveated rendering aims to reduce the computational power required for displaying demanding AR and VR scenes. The name comes from the ‘fovea’—a small pit at the center of the human retina which is densely packed with photoreceptors. It’s the fovea which gives us high resolution vision at the center of our field of view; meanwhile our peripheral vision is actually very poor at picking up detail and color, and is better tuned for spotting motion and contrast than seeing detail. You can think of it like a camera which has a large sensor with just a few megapixels, and another smaller sensor in the middle with lots of megapixels.

The region of your vision in which you can see in high detail is actually much smaller than most think—just a few degrees across the center of your view. The difference in resolving power between the fovea and the rest of the retina is so drastic, that without your fovea, you couldn’t make out the text on this page. You can see this easily for yourself: if you keep your eyes focused on this word and try to read just two sentences below, you’ll find it’s almost impossible to make out what the words say, even though you can see something resembling words. The reason that people overestimate the foveal region of their vision seems to be because the brain does a lot of unconscious interpretation and prediction to build a model of how we believe the world to be.

SEE ALSO
Abrash Spent Most of His F8 Keynote Convincing the Audience That 'Reality' is Constructed in the Brain

Foveated rendering aims to exploit this quirk of our vision by rendering the virtual scene in high resolution only in the region that the fovea sees, and then drastically cutting down the complexity of the scene in our peripheral vision where the detail can’t be resolved anyway. Doing so allows us to focus most of the processing power where it contributes most to detail, while saving processing resources elsewhere. That may not sound like a huge deal, but as the display resolution and field-of-view of XR headsets increase, the power needed to render complex scenes grows quickly.
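As an illustration of the idea (not any engine’s actual API), a foveated renderer might map each screen tile’s angular distance from the gaze direction to a resolution scale. The 5°/25° boundaries and the 0.25 floor below are assumed values for the sketch:

```python
import math

# Illustrative foveated-rendering sketch (not any engine's real API):
# map a screen tile's angular distance from the gaze direction to a
# render-resolution scale. The 5 deg / 25 deg boundaries and the 0.25
# floor are assumed values for the example.

def shading_scale(tile_dir, gaze_dir,
                  inner_deg=5.0, outer_deg=25.0, min_scale=0.25):
    """Full resolution within inner_deg of gaze, falling linearly to
    min_scale at outer_deg and beyond. Directions are unit 3-vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(tile_dir, gaze_dir))))
    ecc = math.degrees(math.acos(dot))  # eccentricity from gaze, in degrees
    if ecc <= inner_deg:
        return 1.0
    if ecc >= outer_deg:
        return min_scale
    t = (ecc - inner_deg) / (outer_deg - inner_deg)
    return 1.0 + t * (min_scale - 1.0)
```

A tile the user is looking straight at renders at full scale, while tiles far into the periphery drop to a quarter of the resolution, roughly mirroring how the fovea’s resolving power falls off.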

Eye-tracking of course comes into play because we need to know where the center of the user’s gaze is at all times quickly and with high precision in order to pull off foveated rendering. While it’s difficult to pull this off without the user noticing, it’s possible and has been demonstrated quite effectively on recent headsets like Quest Pro and PSVR 2.

Automatic User Detection & Adjustment

In addition to detecting movement, eye-tracking can also be used as a biometric identifier. That makes eye-tracking a great candidate for multiple user profiles across a single headset—when I put on the headset, the system can instantly identify me as a unique user and call up my customized environment, content library, game progress, and settings. When a friend puts on the headset, the system can load their preferences and saved data.

Eye-tracking can also be used to precisely measure IPD (the distance between one’s eyes). Knowing your IPD is important in XR because it’s required to move the lenses and displays into the optimal position for both comfort and visual quality. Unfortunately, many people understandably don’t know their IPD off the top of their head.

With eye-tracking, it would be easy to instantly measure each user’s IPD and then have the headset’s software assist the user in adjusting the headset’s IPD to match, or warn users that their IPD is outside the range supported by the headset.
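If eye-tracking reports each pupil’s 3D position, the measurement itself is trivial. A hypothetical sketch (the millimeter coordinate convention and the 58–72 mm mechanical range are assumptions for illustration, not any real headset’s spec):

```python
import math

# Hypothetical sketch: if eye-tracking reports each pupil center in
# headset coordinates (millimeters), IPD is just the distance between
# them. The 58-72 mm mechanical range below is illustrative, not a
# real headset's spec.

def measure_ipd_mm(left_pupil, right_pupil):
    """Distance between the two tracked pupil centers, in mm."""
    return math.dist(left_pupil, right_pupil)

def ipd_guidance(ipd_mm, lens_min_mm=58.0, lens_max_mm=72.0):
    """Suggest a lens adjustment, or warn if the IPD is unsupported."""
    if ipd_mm < lens_min_mm or ipd_mm > lens_max_mm:
        return f"warning: IPD {ipd_mm:.1f} mm is outside the supported range"
    return f"set lens separation to {ipd_mm:.1f} mm"
```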

In more advanced headsets, this process can be invisible and automatic—IPD can be measured invisibly, and the headset can have a motorized IPD adjustment that automatically moves the lenses into the correct position without the user needing to be aware of any of it, like on the Varjo Aero, for example.

Varifocal Displays

A prototype varifocal headset | Image courtesy NVIDIA

The optical systems used in today’s VR headsets work pretty well but they’re actually rather simple and don’t support an important function of human vision: dynamic focus. This is because the display in XR headsets is always the same distance from our eyes, even when the stereoscopic depth suggests otherwise. This leads to an issue called vergence-accommodation conflict. If you want to learn a bit more in depth, check out our primer below:

Accommodation

Accommodation is the bending of the eye’s lens to focus light from objects at different distances. | Photo courtesy Pearson Scott Foresman

In the real world, to focus on a near object the lens of your eye bends to make the light from the object hit the right spot on your retina, giving you a sharp view of the object. For an object that’s further away, the light is traveling at different angles into your eye and the lens again must bend to ensure the light is focused onto your retina. This is why, if you close one eye and focus on your finger a few inches from your face, the world behind your finger is blurry. Conversely, if you focus on the world behind your finger, your finger becomes blurry. This is called accommodation.

Vergence

Vergence is the inward rotation of each eye to overlap each eye’s view into one aligned image. | Photo courtesy Fred Hsu (CC BY-SA 3.0)

Then there’s vergence, which is when each of your eyes rotates inward to ‘converge’ the separate views from each eye into one overlapping image. For very distant objects, your eyes are nearly parallel, because the distance between them is so small in comparison to the distance of the object (meaning each eye sees a nearly identical portion of the object). For very near objects, your eyes must rotate inward to bring each eye’s perspective into alignment. You can see this too with our little finger trick as above: this time, using both eyes, hold your finger a few inches from your face and look at it. Notice that you see double-images of objects far behind your finger. When you then focus on those objects behind your finger, now you see a double finger image.

The Conflict

With precise enough instruments, you could use either vergence or accommodation to know how far away an object is that a person is looking at. But the thing is, both accommodation and vergence happen in your eye together, automatically. And they don’t just happen at the same time—there’s a direct correlation between vergence and accommodation, such that for any given measurement of vergence, there’s a directly corresponding level of accommodation (and vice versa). Since you were a little baby, your brain and eyes have formed muscle memory to make these two things happen together, without thinking, anytime you look at anything.
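The correspondence described above can be made concrete with a little trigonometry: for a fixation at distance d, accommodation demand is 1/d diopters, and the convergence angle follows from the viewer’s IPD. A minimal sketch (the 64 mm default IPD is just a typical adult value, and symmetric fixation is assumed):

```python
import math

# Sketch of the vergence-accommodation correspondence: for a fixation
# at distance d, accommodation demand is 1/d diopters, and the total
# convergence angle follows from the viewer's IPD by trigonometry.
# The 64 mm default IPD is a typical adult value, not a spec.

def accommodation_diopters(distance_m):
    """Accommodation demand for a fixation at distance_m."""
    return 1.0 / distance_m

def vergence_angle_deg(distance_m, ipd_m=0.064):
    """Total angle between the two eyes' lines of sight (symmetric fixation)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# At 0.5 m: about 2.0 diopters and about 7.3 degrees of convergence.
# At 100 m both values are near zero, which is why the brain treats
# the two responses as one coupled reflex.
```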

But when it comes to most of today’s AR and VR headsets, vergence and accommodation are out of sync due to inherent limitations of the optical design.

In a basic AR or VR headset, there’s a display (which is, let’s say, 3″ away from your eye) which shows the virtual scene, and a lens which focuses the light from the display onto your eye (just like the lens in your eye would normally focus the light from the world onto your retina). But since the display is a static distance from your eye, and the lens’ shape is static, the light coming from all objects shown on that display is coming from the same distance. So even if there’s a virtual mountain five miles away and a coffee cup on a table five inches away, the light from both objects enters the eye at the same angle (which means your accommodation—the bending of the lens in your eye—never changes).

That comes in conflict with vergence in such headsets which—because we can show a different image to each eye—is variable. Being able to adjust the image independently for each eye, such that our eyes need to converge on objects at different depths, is essentially what gives today’s AR and VR headsets stereoscopy.

But the most realistic (and arguably, most comfortable) display we could create would eliminate the vergence-accommodation issue and let the two work in sync, just like we’re used to in the real world.

Varifocal displays—those which can dynamically alter their focal depth—are proposed as a solution to this problem. There are a number of approaches to varifocal displays; perhaps the simplest is an optical system where the display is physically moved back and forth relative to the lens in order to change focal depth on the fly.

Achieving such an actuated varifocal display requires eye-tracking because the system needs to know precisely where in the scene the user is looking. By tracing a path into the virtual scene from each of the user’s eyes, the system can find the point that those paths intersect, establishing the proper focal plane that the user is looking at. This information is then sent to the display to adjust accordingly, setting the focal depth to match the virtual distance from the user’s eye to the object.
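The ray-tracing step described above is standard closest-point-between-skew-lines geometry. A sketch of the logic (the function names and the 10 m far-plane fallback are my own assumptions, not any vendor’s API):

```python
import math

# Sketch of the varifocal logic described above: cast a ray from each
# eye along its gaze direction, find the midpoint of the shortest
# segment between the (generally skew) rays, and use its distance as
# the focal depth. Names and the far-plane fallback are illustrative.

def closest_point_between_rays(o1, d1, o2, d2):
    """Midpoint of the shortest segment between two 3D rays, or None
    if the rays are (nearly) parallel."""
    w0 = [a - b for a, b in zip(o1, o2)]
    a = sum(x * x for x in d1)
    b = sum(x * y for x, y in zip(d1, d2))
    c = sum(x * x for x in d2)
    d = sum(x * y for x, y in zip(d1, w0))
    e = sum(x * y for x, y in zip(d2, w0))
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = [o + t1 * v for o, v in zip(o1, d1)]
    p2 = [o + t2 * v for o, v in zip(o2, d2)]
    return [(x + y) / 2.0 for x, y in zip(p1, p2)]

def focal_distance_m(left_eye, left_gaze, right_eye, right_gaze, far=10.0):
    """Distance from the midpoint between the eyes to the fixation point."""
    p = closest_point_between_rays(left_eye, left_gaze, right_eye, right_gaze)
    if p is None:
        return far  # parallel gaze: the user is looking far away
    mid = [(a + b) / 2.0 for a, b in zip(left_eye, right_eye)]
    return math.dist(mid, p)
```

In a real system this raw estimate would be filtered over time, since gaze jitter and tracking noise make the instantaneous intersection point unstable.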

SEE ALSO
Oculus on Half Dome Prototype: 'don't expect to see everything in a product anytime soon'

A well implemented varifocal display could not only eliminate the vergence-accommodation conflict, but also allow users to focus on virtual objects much nearer to them than in existing headsets.

And well before we’re putting varifocal displays into XR headsets, eye-tracking could be used for simulated depth-of-field, which could approximate the blurring of objects outside of the focal plane of the user’s eyes.

As of now, there’s no major headset on the market with varifocal capabilities, but there’s a growing body of research and development trying to figure out how to make the capability compact, reliable, and affordable.

Foveated Displays

While foveated rendering aims to better distribute rendering power between the part of our vision where we can see sharply and our low-detail peripheral vision, something similar can be achieved for the actual pixel count.

Rather than just changing the detail of the rendering on certain parts of the display vs. others, foveated displays are those which are physically moved (or in some cases “steered”) to stay in front of the user’s gaze no matter where they look.

Foveated displays open the door to achieving much higher resolution in AR and VR headsets without brute-forcing the problem by trying to cram pixels at higher resolution across our entire field-of-view. Doing so would not only be costly, but would also run into challenging power and size constraints as the number of pixels approaches retinal resolution. Instead, foveated displays would move a smaller, pixel-dense display to wherever the user is looking based on eye-tracking data. This approach could even lead to higher fields-of-view than could otherwise be achieved with a single flat display.

A rough approximation of how a pixel-dense foveated display looks against a larger, much less pixel-dense display in Varjo’s prototype headset. | Photo by Road to VR, based on images courtesy Varjo

Varjo is one company working on a foveated display system. They use a typical display that covers a wide field of view (but isn’t very pixel dense), and then superimpose a microdisplay that’s much more pixel dense on top of it. The combination of the two means the user gets both a wide field of view for their peripheral vision, and a region of very high resolution for their foveal vision.

Granted, this foveated display is still static (the high resolution area stays in the middle of the display) rather than dynamic, but the company has considered a number of methods for moving the display to ensure the high resolution area is always at the center of your gaze.

Continued on Page 2: Better Social Avatars »

The Hidden Design Behind the Ingenious Room-Scale Gameplay in ‘Eye of the Temple’

Eye of the Temple is one of the rare VR games that focuses not just on pure room-scale movement, but on dynamic room-scale movement. The result is a uniquely immersive experience that required some clever design behind the scenes to make it all work. This guest article by developer Rune Skovbo Johansen explains the approach.

Guest Article by Rune Skovbo Johansen

Rune Skovbo Johansen is a Danish independent game developer based in Turku, Finland. His work spans games and other interactive experiences, focused on tech, wonder, and exploration. After positive reception of the 2016 VR game jam game Chrysalis Pyramid, he started working on a more ambitious spiritual successor, Eye of the Temple, and at the end of 2020 he quit his day job to pursue indie game development full-time.

In Eye of the Temple, you move through a vast environment, not by teleportation or artificial locomotion, but by using your own feet. It makes unique use of room-scale VR to deliver an experience of navigating an expansive space.

In Eye of the Temple you move around large environments using your own feet

But how does it work behind the scenes? To mark the upcoming release of Eye of the Temple on Quest 2, I wanted to take the time to explain these aspects of the game’s design that I’ve never fully gone into detail with before. In this article we’ll go over a variety of the tricks the game uses to make it all work. Let’s start with the basics of keeping the player in the play area.

Keeping the Player in the Play Area

Say you need to go from one tall pillar in the game to another via a moving platform. You step forward onto the platform, the platform moves, and then you step forward onto the next pillar. But now you’re outside your physical play area.

Moving platforms are positioned in a way to keep players inside the play area

If we instead position the moving platform to the side, it goes like this: You sidestep onto the platform, it moves, and you sidestep onto the next pillar. Since you took a step right, and then left, you’re back where you started in the center of the play area. So the game’s tricks are all about how the platforms are positioned relative to each other.
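The cancellation the sidestep example relies on can be modeled as simple vector bookkeeping. A toy sketch (the step offsets and the 1 m half-extent are illustrative values, not the game’s actual numbers):

```python
# Toy model of the bookkeeping described above: each physical step is an
# offset within the play area, and a well-designed platform route sums
# back to (0, 0), the center. Step sizes and the 1 m half-extent are
# illustrative values, not the game's actual numbers.

def player_offset_after(steps):
    """Play-area offset after a sequence of (dx, dy) physical steps."""
    x = y = 0.0
    for dx, dy in steps:
        x += dx
        y += dy
    return (x, y)

def stays_in_play_area(steps, half_extent=1.0):
    """True if no step along the route leaves the play area."""
    x = y = 0.0
    for dx, dy in steps:
        x += dx
        y += dy
        if abs(x) > half_extent or abs(y) > half_extent:
            return False
    return True

# The sidestep route from the text: step right onto the platform, ride
# it (no physical movement), then step left onto the destination.
sidestep_route = [(1.0, 0.0), (0.0, 0.0), (-1.0, 0.0)]
# Two forward steps instead would walk straight out of the play area.
forward_route = [(0.0, 1.0), (0.0, 0.0), (0.0, 1.0)]
```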

Now, to get a better sense for it, let’s look at some mixed reality footage (courtesy of Naysy) where a grid representing the play area is overlaid on top.

Mixed reality footage with a grid overlaid on top which represents the play area

Keeping an Overview in the Level Design

Now that we’ve seen how the trick works, let’s take a look at how I keep track of it all when doing the level design for the game. First things first – I made this pattern, which represents the player’s entire play area – or the part of it the game takes advantage of anyway:

A pattern representing the physical play area

As you can see, there’s a thick white border along the edge, and a thick circle in the center.

Every platform in the game has a designated spot in the play area and a pattern overlay that shows what that spot is. For platforms that are a single tile large, it’s generally one of nine positions. The overlay makes it easy to see if a given platform is positioned in the center of the play area, or at an edge or corner.

The play area pattern overlaid on each platform and its end positions make it easy to see if they are lined up correctly in the level design

Additional overlays show a ghostly version of the pattern at both the start and end positions of a moving platform. This is the real trick of keeping track of how the platforms connect together, because these ghostly overlays at the end positions make it trivial to see if the platforms are lined up correctly in the level design when they touch each other. If the adjacent ghostly patterns are continuous like puzzle pieces that fit together, then the platforms work correctly together.

It still took a lot of ingenuity to work out how to position all the platforms so they both fit correctly together and also take the player where they need to go in the virtual world, but now you know how I kept the complexity of it manageable.

Getting the Player’s Cooperation

The whole premise of getting around the world via these moving platforms is based on an understanding that the player should step from one platform to another when they’re lined up, and not at other times. The most basic way the game establishes this is by telling the player outright in safety instructions displayed prior to starting the game.

One of the safety instructions shown before the game begins

This instruction is shown for two reasons:

One is safety: you should avoid jumping over gaps, since you’d otherwise risk jumping right out of your play area and into a wall, for example.

The other is that the game’s system of traversal only works correctly when stepping from one platform to another when they line up. This is not as critical (I’ll get back to what happens when stepping onto a misaligned platform later), but it still provides the best play experience.

Apart from the explicit instructions, the game also employs more subtle tricks to help ensure the player only steps over when blocks are correctly aligned. Consider the following example of a larger 2 x 2 tile static platform the player can step onto. A moving platform arrives from the side in a way that would allow the player to step off well before the platform has stopped moving, but that would break the game’s traversal logic.

In this room, ‘foot fences’ are used to discourage the player from stepping from one platform to another when they are not correctly aligned

To avoid this, “foot fences” were placed to discourage the player from stepping over onto the static platform (or away from it) at incorrect positions. The fences are purely visual and don’t technically prevent anything. The player can still step over them if they try, or right through them for that matter. However, psychologically it feels like less effort to not step over or through a fence and instead step onto the static platform where there’s a gap in the fence. In this way, a purely non-technical solution is used as part of the game’s arsenal of tricks.

Continued on Page 2: Correcting for Unaligned Platforms »

The Hidden Design Behind the Ingenious Room-Scale Gameplay in ‘Eye of the Temple’

Eye of the Temple is one of the rare VR games that focuses on not just on pure room-scale movement, but dynamic room-scale movement. The result is a uniquely immersive experience that required some clever design behind the scenes to make it all work. This guest article by developer Rune Skovbo Johansen explains the approach.

Guest Article by Rune Skovbo Johansen

Rune Skovbo Johansen is a Danish independent game developer based in Turku, Finland. His work spans games and other interactive experiences, focused on tech, wonder, and exploration. After positive reception of the 2016 VR game jam game Chrysalis Pyramid, he started working on a more ambitious spiritual successor, Eye of the Temple, and at the end of 2020 he quit his day job to pursue indie game development full-time.

In Eye of the Temple, you move through a vast environment, not by teleportation or artificial locomotion, but by using your own feet. It makes unique use of room-scale VR to deliver an experience of navigating an expansive space.

In Eye of the Temple you move around large environments using your own feet

But how does it work behind the scenes? To mark the upcoming release of Eye of the Temple on Quest 2, I wanted to take the time to explain these aspects of the game’s design that I’ve never fully gone into detail with before. In this article we’ll go over a variety of the tricks the game uses to make it all work. Let’s start with the basics of keeping the player in the play area

Keeping the Player in the Play Area

Say you need to go from one tall pillar in the game to another via a moving platform. You step forward onto the platform, the platform moves, and then you step forward onto the next pillar. But now you’re outside your physical play area.

Moving platforms are positioned in a way to keep players inside the play area

If we instead position the moving platform to the side, it goes like this: You sidestep onto the platform, it moves, and you sidestep onto the next pillar. Since you took a step right, and then left, you’re back where you started in the center of the play area. So the game’s tricks are all about how the platforms are positioned relative to each other.

Now, to get a better sense for it, let’s look at some mixed reality footage (courtesy of Naysy) where a grid representing the play area is overlaid on top.

Mixed reality footage with a grid overlaid on top which represents the play area

Keeping an Overview in the Level Design

Now that we’ve seen how the trick works, let’s take a look at how I keep track of it all when doing the level design for the game. First things first – I made this pattern, which represents the player’s entire play area – or the part of it the game takes advantage of anyway:

A pattern representing the physical play area

As you can see, there’s a thick white border along the edge, and a thick circle in the center.

Every platform in the game has a designated spot in the play area and a pattern overlay that shows what that spot is. For platforms that are a single tile large, it’s generally one of nine positions. The overlay makes it easy to see if a given platform is positioned in the center of the play area, or at an edge or corner.

The play area pattern overlaid on each platform and its end positions make it easy to see if they are lined up correctly in the level design

Additional overlays show a ghostly version of the pattern at both the start and end positions of a moving platform. This is the real trick of keeping track of how the platforms connect together, because these ghostly overlays at the end positions make it trivial to see if the platforms are lined up correctly in the level design when they touch each other. If the adjacent ghostly patterns are continuous like puzzle pieces that fit together, then the platforms work correctly together.
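The puzzle-piece rule the overlays make visible can be boiled down to a few lines. What follows is a simplified Python sketch of the idea, not the game’s actual (Unity) code; the tile coordinates and names here are hypothetical:

```python
# Sketch of the alignment rule: two adjacent platforms "fit like puzzle
# pieces" when the offset between their world positions equals the offset
# between their designated spots in the physical play area. If the two
# offsets match, physical and virtual movement stay in sync as the player
# steps across the seam.

from dataclasses import dataclass

@dataclass
class Platform:
    world_pos: tuple[int, int]  # tile coordinates in the virtual world
    play_pos: tuple[int, int]   # designated spot in the play area (3x3 grid)

def aligned(a: Platform, b: Platform) -> bool:
    """True if stepping from a to b keeps the overlay patterns continuous."""
    world_delta = (b.world_pos[0] - a.world_pos[0],
                   b.world_pos[1] - a.world_pos[1])
    play_delta = (b.play_pos[0] - a.play_pos[0],
                  b.play_pos[1] - a.play_pos[1])
    return world_delta == play_delta

# A pillar at the play area's center, and a moving platform that has
# arrived one tile to its right, occupying the play area's right column:
pillar = Platform(world_pos=(10, 5), play_pos=(1, 1))   # center spot
arrived = Platform(world_pos=(11, 5), play_pos=(2, 1))  # right of center
assert aligned(pillar, arrived)
```

In this framing, the ghostly end-position overlays are just a visual rendering of `play_pos` at each platform’s start and end `world_pos`, which is why a discontinuity in the pattern immediately flags a level-design mistake.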

It still took a lot of ingenuity to work out how to position all the platforms so they both fit correctly together and also take the player where they need to go in the virtual world, but now you know how I kept the complexity of it manageable.

Getting the Player’s Cooperation

The whole premise of getting around the world via these moving platforms rests on an understanding that the player should step from one platform to another when they’re lined up, and not at other times. The most basic way the game establishes this is by telling the player outright, in safety instructions displayed prior to starting the game.

One of the safety instructions shown before the game begins

These instructions are shown for two reasons:

One is safety: if you jump over gaps, you risk jumping right out of your play area and into a wall, for example.

The other is that the game’s system of traversal only works correctly when you step from one platform to another while they line up. This is not as critical – I’ll get back later to what happens if you step onto a misaligned platform – but it still provides the best play experience.

Apart from the explicit instructions, the game also employs more subtle tricks to help ensure the player only steps over when blocks are correctly aligned. Consider the following example of a larger 2 x 2 tile static platform the player can step onto. A moving platform arrives from the side in a way that would allow the player to step off well before the platform has stopped moving, but that would break the game’s traversal logic.

In this room, ‘foot fences’ are used to discourage the player from stepping from one platform to another when they are not correctly aligned

To avoid this, “foot fences” were placed to discourage the player from stepping over onto the static platform (or away from it) at incorrect positions. The fences are purely visual and don’t technically prevent anything. The player can still step over them if they try, or right through them for that matter. However, psychologically it feels like less effort to not step over or through a fence and instead step onto the static platform where there’s a gap in the fence. In this way, a purely non-technical solution is used as part of the game’s arsenal of tricks.

Continued on Page 2: Correcting for Unaligned Platforms »

Whether Hit or Flop, Apple’s Entrance Will Be a Pivotal Moment for XR

If the avalanche of recent reports can indicate anything at all, it seems Apple is entering the VR/AR headset market fairly soon, bringing along with it the most inflated expectations the industry has ever seen. It’s probably going to be expensive, but whether it flops or becomes a big hit, the mere existence of Apple in the space is set to change a lot about how things are done.

The iPhone wasn’t the first smartphone. That award goes to an obscure PDA device called the IBM Simon, released in limited numbers in 1994. The Apple Watch wasn’t the first smartwatch either. That was debatably the Seiko Raputer, which was released in 1998 in Japan. Its monochrome LCD wasn’t capable of touch, instead offering up a tiny eight-direction joystick and six function buttons to browse files, play games, and set calendar appointments. Similarly, iPad wasn’t the first tablet. Mac wasn’t the first home computer. iPod wasn’t the first MP3 player. But all of these products have become nothing short of iconic. There’s very little benefit to being first, at least as far as Apple is concerned.

And while it seems the company’s first mixed reality headset could finally debut at its Worldwide Developers Conference (WWDC) in June, like all of its other products, it won’t be the first MR headset. Just the same, like everything else the fruit company makes, it’s going to be the one everyone is talking about—for better or worse.

In case you haven’t noticed, Apple is a big deal. It has an ecosystem of products which connect to each other, design-forward hardware that has helped it maintain brand-name cachet, and a philosophy that puts user-friendliness at the core of its software experience. Oh, and it’s the most valuable company in the world.

And while the irrational exuberance for successive device generations has mostly petered out since its heyday in the early 2000s, reducing its famed long-line launch extravaganzas to more chill online pre-order releases, becoming an Apple apostate is still unthinkable to many. Once you’re in, you’re in. You buy the phone, the laptop, the headphones, and now, maybe you’ll get the newfangled headset too. Maybe.

Let’s put aside the rumors for now. Forget about the spec breakdowns, hardware design leaks, software capabilities, etc. There are plenty of them out there, and you can read about those here. The only thing we know for sure is Apple is… well… Apple. Here’s what you, and probably everyone else, are expecting.

Apple’s BKC Store in Mumbai, India | Image courtesy Apple

For Better: What Should Happen

Unless the company is making a drastic departure here, its first mixed reality headset should be built with the same level of user-friendliness as all of its other devices, which means it should connect to the Apple ecosystem easily and have a simple, intuitive UI. Log in with Apple ID. No muss, no fuss (whatever ‘muss’ is). Privacy should be a giant focus for the headset from the outset, since it will almost certainly pack eye-tracking in addition to a host of cameras peering at your immediate surroundings, messiness and all. Apple has had its fair share of data collection scandals, yet it still inspires enough confidence that privacy has historically been a big selling point for its devices.

If you want to avoid drawing the ire of tech reviewers everywhere though, wearing it should be fairly simple and very comfortable, and the experiences within should be of high enough value to overcome that inherent friction of charging it, putting it on, setting up a tracking volume, and wearing it for extended periods of time—everything we expect from any mixed reality headset at this point. It should fit most people, and offer up a clear picture to people with heads and eyes of all shapes and sizes.

Meta Quest Pro | Image courtesy Meta

An obvious analogue here is Meta Quest Pro, which is relatively low friction, but things like a halo strap that forces too much weight on your brow, or a passthrough that’s just a little too grainy, or a display that doesn’t have enough pixels per degree (PPD) for staring at text—all of these things make it less appealing to users in the day-to-day, introducing what you might call cumulative friction. You use it a bunch at first until you figure out all of the niggles, at which point you may revert to traditional computing standards like using a laptop or smartphone. The thing isn’t really the all-purpose device you hoped it would be, and the company thinks twice about when to send the improved version down the pipeline.

One would hope that Apple’s headset, on the other hand, will have a mature design language and obviously useful features from day one. While there are bound to be some stumbles, like with the first Apple Watch, which was critiqued for its slow software, short battery life, and lack of customization, it should all be there, and not require a ton of feature updates after the big launch day.

It should sell well out of the gate—at least by the standards of the existing XR industry—even if everything isn’t perfect. And it should be so cool that it’s copied. Like, a lot. And it should drag top-level studios into the XR scene to start making innovative and useful apps that aren’t just straight ports of ARKit or ARCore apps made for mobile, but things people need and want to use in-headset. A big win from Apple should not only spur its new mixed reality product category, but kick off a buzz among developers, which would include those who currently work in the XR industry and Apple’s existing cohort of dedicated iOS developers.

But more than merely being the latest shiny new headset within the existing XR industry, Apple’s entrance into the field has a real chance of radically expanding the industry itself, by showing that the world’s most iconic tech company now thinks the medium is worth pursuing. That’s the way it happened when Apple jumped into MP3 players, smartphones, tablets, wireless earbuds, and more.

As the saying goes, a rising tide lifts all boats. The inverse is also true though….

For Worse: What Could Happen

Apple’s headset is reportedly (okay, maybe just one rumor) priced somewhere near $3,000, so it probably won’t be the sort of accessory that initially attracts people to the ecosystem; that would be the job of a peripheral like Apple Watch. It will likely rely on the pool of built-in Apple users. Despite the price, the first iteration very likely won’t offer the sort of power you’d expect from a workhorse like Apple MacBook Pro either.

At the outset, any sustained draw from prosumers will invariably hinge on how well it can manage general computing tasks, like you might have with an iPad or MacBook, and everything else a current mixed reality headset should do too, namely VR and AR stuff. That includes a large swath of things like fitness apps, both AR and VR games and experiences, productivity apps, standard work apps—everything. Basically, it has to be the Quest Pro that Meta wanted to release but didn’t.

AR turn-by-turn directions on an iPhone | Image courtesy Apple

And if not, it leaves Apple in a pretty precarious situation. If its headset can’t find a proper foothold within the ecosystem and attract enough users, it could lead to low adoption rates and a lack of interest in the technology as a whole. Mixed reality is largely seen as a valuable steppingstone to what many consider the true moneymaker: all-day AR glasses. And despite some very glasses-shaped AR headsets out there, we’re still not there yet. Even if Apple is willing to take a hit with a bulky device in service of pushing use cases for its AR glasses yet to come, the short term may not look very bright.

And perhaps most importantly for the industry as a whole are the (metaphorical) optics.

After all, if the iconic Apple can’t manage to make MR something that everybody wants, the rest of the world watching from the sidelines may think the concept just can’t be conquered. In turn, it may mean capital investment in the space will dry up until ‘real’ AR headsets are a thing—the all-day glasses that will let you play Pokémon Go in the park, do turn-by-turn directions, and remind you the name of that person you met last week. The steppingstone of mixed reality may get waterlogged. Those are a lot of ifs, coulds, shoulds, and won’ts though. The only thing truly certain is we’re in for a very interesting few months, which you can of course follow at Road to VR.

Apple’s entrance into XR has the potential to expand the industry by demonstrating its viability, just as Apple has done with previous technologies. It stands a good chance at carving out a sizeable claim in the space, but it’s a gamble that could equally backfire if both sales and public perception aren’t on their side.


Is Apple’s XR headset going to be the “one more thing” we’ve all been waiting for at WWDC this year? Will it live up to the Apple name, or be an expensive dev kit? Let us know in the comments below!

The 20 Best Rated & Most Popular Quest Games & Apps – April 2023

While Oculus doesn’t offer much publicly in the way of understanding how well individual games & apps are performing across its Quest 2 storefront, it’s possible to glean some insight by looking at apps relative to each other. Here’s a snapshot of the 20 best rated Oculus Quest games and apps as of April 2023.

Some quick qualifications before we get to the data:

  • Paid and free apps are separated
  • Only apps with more than 100 reviews are represented
  • App Lab apps are not represented (see our latest Quest App Lab report)
  • Rounded ratings may appear to show ‘ties’ in ratings for some applications, but the ranked order remains correct
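That last point can be made concrete: the ranking is computed on the exact aggregate rating, and only the displayed value is rounded. A quick sketch using hypothetical unrounded values (not the storefront’s real internal figures):

```python
# Two apps whose exact aggregate ratings differ, but whose displayed
# (rounded to two decimals) ratings look identical -- the rank order is
# still driven by the exact values. The numbers below are invented for
# illustration.
apps = [
    ("Moss: Book II", 4.8897),
    ("The Room VR: A Dark Matter", 4.8851),
]

ranked = sorted(apps, key=lambda app: app[1], reverse=True)
for rank, (name, rating) in enumerate(ranked, start=1):
    print(f"#{rank} {name} {rating:.2f}")  # both display as 4.89
```

So two titles showing “4.89” are not actually tied; the list order reflects the unrounded aggregates.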

Best Rated Oculus Quest 2 Games & Apps – Paid

The rating of each application is an aggregate of user reviews and a useful way to understand the general reception of each title by customers.

Rank Name Rating (# of ratings) Rank Change Price
#1 Moss: Book II 4.89 (582) ↑ 1 $30
#2 The Room VR: A Dark Matter 4.89 (12,529) ↓ 1 $30
#3 Puzzling Places 4.88 (1,737) $15
#4 Walkabout Mini Golf 4.86 (10,013) $15
#5 I Expect You To Die 2 4.84 (2,714) $25
#6 Breachers 4.84 (970) New $30
#7 COMPOUND 4.82 (441) ↑ 3 $20
#8 Vermillion 4.82 (665) ↓ 2 $20
#9 Swarm 4.82 (2,313) ↓ 2 $25
#10 DYSCHRONIA: Chronos Alternate 4.81 (364) ↓ 1 $20
#11 PatchWorld – Make Music Worlds 4.81 (158) ↓ 3 $30
#12 I Expect You To Die 4.8 (5,224) $25
#13 Moss 4.8 (6,485) $20
#14 Red Matter 2 4.8 (1,136) $30
#15 ARK and ADE 4.8 (133) ↓ 4 $10
#16 Ragnarock 4.79 (1,246) ↑ 1 $25
#17 Cubism 4.79 (793) ↓ 2 $10
#18 Ancient Dungeon 4.79 (875) ↓ 2 $20
#19 Into the Radius 4.78 (3,878) $30
#20 The Last Clockwinder 4.78 (673) $25

Rank change & stats compared to March 2023

Dropouts:
ALTDEUS: Beyond Chronos, Resident Evil 4, Racket: Nx

  • Among the 20 best rated Quest apps
    • Average rating (mean): 4.8 out of 5 (±0)
    • Average price (mean): $23 (±$0)
    • Most common price (mode): $30 (±$0)
  • Among all paid Quest apps
    • Average rating (mean): 4.2 out of 5 (±0)
    • Average price (mean): $20 (±$0)
    • Most common price (mode): $20 (±$0)

Continue on Page 2: Most Popular Paid Oculus Quest Apps »

Hands-on: Bigscreen Beyond – A Little Headset That Could be a Big Deal

It’s exceedingly rare to see a VR software startup transition to making hardware, let alone decent hardware. But that’s exactly what Bigscreen—creators of the long-running social VR theater app of the same name—has done with its upcoming Beyond headset.

Bigscreen has clearly targeted PC VR enthusiasts who are willing to pay for the best hardware they can get their hands on. And with major players like Meta and HTC focusing heavily on standalone headsets, Bigscreen Beyond could prove to be the best option they’ll find any time soon.

Photo by Road to VR

The company has set out to make a headset that’s not just better than what’s out there, but one that’s much smaller too. And while it remains to be seen if the headset will hit all the right notes, my initial hands-on shows plainly the company knows what it’s doing when it comes to building a VR headset.

Bigscreen Beyond Specs
Resolution 2,560 × 2,560 (6.5MP) per-eye
Display microOLED (2x, RGB stripe)
Pixels Per-degree (claimed) 28
Refresh Rate 75Hz, 90Hz
Lenses Tri-element pancake
Field-of-view (claimed) 93°H × 90°V
Optical Adjustments IPD (fixed, customized per customer)
eye-relief (fixed, customized per facepad)
IPD Adjustment Range 58–72mm (fixed, single IPD value per device)
Connectors DisplayPort 1.4, USB 3.0 (2x)
Accessory Ports USB-C (1x)
Cable Length 5m
Tracking SteamVR Tracking 1.0 or 2.0 (external beacons)
On-board Cameras None
Input SteamVR Tracking controllers
On-board Audio None
Optional Audio Audio Strap accessory, USB-C audio output
Microphone Yes (2x)
Pass-through view No
Weight 170–185g
MSRP $1,000
MSRP (with tracking & controllers) $1,580

Custom-made

Bigscreen is building something unique, quite literally—every Beyond headset comes with a custom-made facepad. And this isn’t a ‘choose one of three options’ situation; Bigscreen has a sleek app that walks buyers through the process of capturing a 3D scan of their face so the company can create a completely unique facepad that conforms to each specific customer.

And it really makes a difference. The first thing that Bigscreen CEO Darshan Shankar showed me during a demo of the Beyond headset was the difference between my personal facepad (which the company created for me prior to our meetup) and someone else’s facepad. The difference was instantly obvious; where mine fit against my face practically like two connected puzzle-pieces, the other facepad awkwardly disagreed with my face in various places. While I’ve recognized for a long time that different facial topology from person-to-person is a real consideration for VR headsets, this made me appreciate even more how significant the differences can be.

The facepad may look rough, but it’s actually made of a soft rubber material | Photo by Road to VR

Shankar says the custom-fit facepad is an essential part of making such a small headset. It ensures not only that the headset is as comfortable as it can be, but also that the user’s eyes are exactly where they’re supposed to be with regard to the lenses. For a headset like Beyond, which uses high-magnification pancake optics with a small sweet spot, this is especially important. And, as Shankar convincingly demonstrated by shining a flashlight all around the headset while I was wearing it, the custom-fit facepad means absolutely no external light can be seen from inside.

And the custom facepad isn’t the only way each headset is dialed in for each specific customer; instead of wasting weight and space with the mechanics for an IPD adjustment, the headset ships with one of 15 fixed IPD distances, ranging from 58–72mm. The company selects the IPD based on the same face scan that allows them to make the custom facepad. And given the size of the Beyond headset, there’s no way that glasses will fit inside; luckily the company will also sell magnetically attached prescription inserts for those who need them, up to −10 diopter.
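For illustration, 15 fixed values spanning 58–72mm works out to 1mm steps (an assumption; Bigscreen hasn’t published the exact list). Selecting a customer’s IPD from that set is then a simple nearest-value pick:

```python
# Sketch of the fixed-IPD selection described above. The 1 mm spacing
# and the clamping behavior are assumptions for illustration, not
# Bigscreen's confirmed process.
AVAILABLE_IPDS_MM = range(58, 73)  # 58, 59, ..., 72 -- fifteen values

def select_ipd(measured_ipd_mm: float) -> int:
    """Return the fixed IPD variant nearest the face-scan measurement."""
    clamped = min(max(measured_ipd_mm, 58), 72)
    return min(AVAILABLE_IPDS_MM, key=lambda v: abs(v - clamped))

print(select_ipd(63.4))  # -> 63
print(select_ipd(75.0))  # -> 72 (out-of-range measurements hit the end stop)
```

The same face scan drives both this selection and the custom facepad geometry, which is how the company avoids any moving IPD mechanism inside the headset.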

Diving In

With my custom facepad easily snapped onto the headset with magnets, it was time to dive into VR.

The baseline version of the $1,000 Bigscreen Beyond headset has a simple soft strap, which I threw over the back of my head and tightened to taste. I felt I had to wear the strap very high on the back of my head for a good hold; Shankar says an optional top-strap will be available, which ought to allow me to wear the rear strap in a lower position.

Photo by Road to VR

As I put on the headset I found myself sitting in a dark Bigscreen theater environment, and the very first thing I noticed was the stellar blacks and rich colors, thanks to the headset’s OLED displays. The second thing I noticed was that there was no sound! That’s because the baseline version of the headset doesn’t have on-board audio, so I still had to put on a pair of headphones after the headset was donned.

While the baseline headset lacks on-board audio, Bigscreen is offering a $100 ‘Audio Strap’, which is a rigid headstrap with built-in speakers. As someone who really values rigid straps and on-board audio, I’m glad to see this as an option—for me it would be the obvious choice. Unfortunately the company wasn’t ready to demo the Audio Strap.

Shankar toured me around a handful of VR environments that showed off the headset’s 2,560 × 2,560 (6.5MP) per-eye displays, which offered a level of clarity similar to that of Varjo’s $2,000 Aero headset, but with a notably smaller field-of-view (Bigscreen claims 93°H × 90°V).

On many current-gen headsets like Quest 2 you can’t quite see the individual lines of the screen-door effect, but it’s still clear that it’s there in aggregate. While the Beyond headset isn’t ‘retina resolution’ there’s essentially no evidence of any screen-door effect. Everything looks really sharp. This was best demonstrated when I ran around in Half-Life: Alyx and the game felt like it had instantly upgraded graphics compared to a headset like Valve’s Index.

There is, however, some persistence blurring and glare. Shankar openly demonstrated how the brightness of the display directly relates to the level of persistence. While there’s some noticeable persistence at the default brightness, overdriving the display’s brightness makes the persistence entirely unbearable. The reverse is true as well; turning the brightness down below the default cuts the persistence noticeably. While it would be nice if the default brightness had less persistence, at least users will be able to trade brightness for lower persistence based on their specific preference.
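The trade-off makes sense once you consider how persistence turns into motion blur: the longer each frame stays lit, the further it smears across your retina while your head is moving. A rough back-of-the-envelope sketch (the persistence and head-speed numbers are illustrative, not Bigscreen’s specs):

```python
# Approximate motion smear on a low-persistence display: blur width in
# pixels scales with how long each frame stays lit, how fast the head
# turns, and the display's angular resolution. Halving persistence
# halves the smear -- but also halves light output, hence the
# brightness/persistence trade-off described above.
def smear_px(persistence_ms: float, head_speed_dps: float, ppd: float) -> float:
    """Blur width in pixels during smooth head motion."""
    return persistence_ms / 1000.0 * head_speed_dps * ppd

# At Beyond's claimed 28 PPD, during a brisk 100 deg/s head turn:
print(smear_px(2.0, 100, 28))  # ~5.6 px of smear at 2 ms persistence
print(smear_px(1.0, 100, 28))  # ~2.8 px at 1 ms
```

Dropping brightness below default effectively shortens the lit interval, which is why the blur visibly tightens up.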

Continue on Page 2: Dialing In

One of VR’s Most Veteran Studios Has Grown to 200 Employees While Continuing to Double-down on VR

Having been exclusively building VR games since 2013, nDreams stands as one of the most veteran VR-exclusive game studios to date. And with more than 200 people, one of the largest too. The studio’s CEO & founder, Patrick O’Luanaigh, continues to bet his company’s future on the success of VR.

Speaking exclusively to Road to VR ahead of a presentation at GDC 2023, Patrick O’Luanaigh talks about the growing success of nDreams and why he’s still doubling down on VR.

Starting in 2013, O’Luanaigh has navigated his company from the earliest days of the modern VR era to now, which he believes is VR’s biggest moment so far—and growing.

Between the company’s own internal data and some external sources, O’Luanaigh estimates that VR’s install base is around 40 million headsets across the major platforms, excluding the recently launched PSVR 2. At least half of that, he estimates, is made up of 20 million Quest headsets.

While it’s been a challenge to keep all those headsets in regular use, O’Luanaigh says the size of the addressable VR market today is bigger than ever.

That’s why he’s bulked up the company to some 200 employees, nearly doubling over the course of 2022 through hiring and studio acquisitions.

O’Luanaigh says, “this is the biggest we’ve ever been and it’s showing no signs of slowing down. […] In a decade of exclusively making VR games, we’ve never seen that growth before.”

O’Luanaigh knows well that content is key for getting players into their headsets, and to that end he says his efforts to scale the company are about building bigger and better VR content to keep up with the growth and expectations of the install base.

“Setting up our fully-remote nDreams studios, Orbital and Elevation, was significant for us in establishing a powerful basis for developing multiple projects in parallel,” he says. “It gives us the specialism to develop the variety of VR titles, across multiple genres, that the growing market now demands.”

O’Luanaigh points to nDreams developed and published titles Phantom: Covert Ops (2020), Shooty Fruity (2020), Fracked (2021), and Little Cities (2022) as some of the most successful VR games the studio has launched thus far, with Phantom: Covert Ops specifically finding “important commercial success” on Quest 2.

With the release of those titles over the years and their ongoing sales, O’Luanaigh shares that nDreams doubled its year-over-year revenue over the last 12 months. And with multiple new projects in the works, including Synapse, Ghostbusters: Rise of the Ghost Lord, and other (unannounced) projects, he believes the company is on track to more than double annual revenue again by 2024.

Phantom: Covert Ops | Image courtesy nDreams

Though he’s leading a company of 200 employees, O’Luanaigh calls himself a “massive VR enthusiast,” and is still very clearly in touch with what makes VR such a unique and compelling medium.

He says his studio aims to build around five key pillars that make for compelling VR content:

  1. Aspirational roleplay – first-person embodiment of appealing roles or characters
  2. High-agency interaction – tactile 1:1 mechanics in a freely explorable world
  3. Empowering wielding – feeling, holding, and using visceral weapons, tools, and abilities
  4. Emotional amplification – immersive situations that provoke strong, diverse feelings
  5. Fictional teleportation – presence within desirable locations inaccessible in real life

And while O’Luanaigh could easily steer this studio away from VR—to chase a larger non-VR market—he continues to double down on VR as the studio’s unique advantage. Far from moving away from VR, his company is actively trying to bring others into the fold; O’Luanaigh says nDreams continues to expand its publishing operations.

“The success of Little Cities, which has just launched its free ‘Little Citizens’ update, has been a great validation of our investments into third-party publishing and we are actively on the lookout for more amazing indie developers to work with.”

With the scale that VR has now reached, O’Luanaigh believes the market is truly viable for indie developers. And that’s why he’s glad to see the rise of VR publishers (and not just his own company). Longstanding expertise in the medium is crucial to shipping a quality VR title, and that’s why O’Luanaigh believes VR-specific publishers like nDreams will play an important role in bringing more developers and great content to VR.


That expertise is increasingly building upon itself in the company’s VR games which have shown impressive mechanical exploration, giving the studio the chance to test lots of VR gameplay to find out what works.

Few in VR have had the audacity to prove out something as seemingly wacky as a ‘VR kayak shooter’ and actually take it to market in a large-scale production like Phantom: Covert Ops. And you can clearly see the lineage of a game like nDreams’ Fracked shining through in upcoming titles like Synapse. Though the game is an entirely new IP with a new visual direction, the unique Fracked cover system is making the leap to Synapse, a clear example of leveraging a now battle-tested mechanic to enhance future titles. But more than just a reskin of a prior shooter, nDreams continues to experiment with unique VR mechanics, this time promising to harness the power of PSVR 2’s eye-tracking to give players compelling telekinetic powers.

Synapse | Image courtesy nDreams

To that end, the studio’s lengthy experience in the medium is clearly an asset—and one that can only be earned rather than bought. Where exactly that experience will take them in the long run is unclear, but even after all the ups and downs the industry has seen, O’Luanaigh and nDreams remain all-in on VR.