Meta Reveals VR Headset Prototypes Designed to Make VR ‘Indistinguishable From Reality’

Meta says its ultimate goal with its VR hardware is to make a comfortable, compact headset with visual fidelity that’s ‘indistinguishable from reality’. Today the company revealed its latest VR headset prototypes which it says represent steps toward that goal.

Meta has made no secret that it’s pouring tens of billions of dollars into its XR efforts, much of which is going to long-term R&D through its Reality Labs Research division. Apparently in an effort to shine a bit of light on what that money is actually accomplishing, the company invited a group of press to sit down for a look at its latest accomplishments in VR hardware R&D.

Reaching the Bar

To start, Meta CEO Mark Zuckerberg spoke alongside Reality Labs Chief Scientist Michael Abrash to explain that the company’s ultimate goal is to build VR hardware that meets all the visual requirements to be accepted as “real” by your visual system.

VR headsets today are impressively immersive, but there’s still no question that what you’re looking at is, well… virtual.

Inside of Meta’s Reality Labs Research division, the company uses the term ‘visual Turing Test’ to represent the bar that needs to be met to convince your visual system that what’s inside the headset is actually real. The concept is borrowed from the original Turing test, which denotes the point at which a human can no longer tell the difference between another human and an artificial intelligence.

To completely convince your visual system that what’s inside the headset is actually real, Meta says a headset needs to pass that “visual Turing Test.”

Four Challenges

Zuckerberg and Abrash outlined what they see as four key visual challenges that VR headsets need to solve before the visual Turing Test can be passed: varifocal, distortion, retina resolution, and HDR.

Briefly, here’s what those mean:

  • Varifocal: the ability to focus on arbitrary depths of the virtual scene, with both essential focus functions of the eyes (vergence and accommodation)
  • Distortion: lenses inherently distort the light that passes through them, often creating artifacts like color separation and pupil swim that make the existence of the lens obvious.
  • Retina resolution: having enough resolution in the display to meet or exceed the resolving power of the human eye, such that there’s no evidence of underlying pixels
  • HDR: also known as high dynamic range, which describes the range of darkness and brightness that we experience in the real world (which almost no display today can properly emulate).
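To put rough numbers on that last point (these figures are illustrative ballparks, not Meta’s), real-world scenes span an enormous luminance range compared to what current displays emit, which is easy to quantify in photographic ‘stops’:

```python
import math

# Illustrative luminance figures (not from Meta): starlight is on the order of
# 0.0001 nits, direct sunlight around 1.6 billion nits, while a typical
# consumer VR display covers very roughly 0.1 to 100 nits.
def stops(lo_nits: float, hi_nits: float) -> float:
    """Dynamic range expressed in photographic stops (doublings of light)."""
    return math.log2(hi_nits / lo_nits)

print(round(stops(1e-4, 1.6e9)))  # real world: ~44 stops
print(round(stops(0.1, 100)))     # typical display: ~10 stops
```

The gap of more than 30 stops is why HDR remains an open hardware problem rather than a software tweak.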

The Display Systems Research team at Reality Labs has built prototypes that function as proof-of-concepts for potential solutions to these challenges.

Varifocal

Image courtesy Meta

To address varifocal, the team developed a series of prototypes which it called ‘Half Dome’. In that series the company first explored a varifocal design which used a mechanically moving display to change the distance between the display and the lens, thus changing the focal depth of the image. Later the team moved to a solid-state electronic system which resulted in varifocal optics that were significantly more compact, reliable, and silent. We’ve covered the Half Dome prototypes in greater detail here if you want to know more.
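As a rough sketch of why eye tracking matters for varifocal (my own back-of-the-envelope geometry, not Meta’s method): the system needs to know where the eyes are converging, and the vergence angle plus the wearer’s IPD gives an approximate focal distance to drive the optics toward.

```python
import math

# Hypothetical helper: estimate the distance at which the eyes converge from
# the interpupillary distance (IPD) and the vergence angle reported by eye
# tracking. Simple isosceles-triangle geometry: d = (ipd / 2) / tan(angle / 2).
def focal_distance_m(ipd_m: float, vergence_deg: float) -> float:
    return (ipd_m / 2) / math.tan(math.radians(vergence_deg) / 2)

# A 63 mm IPD converging at ~3.6 degrees puts the focal plane near 1 m,
# which is roughly where a varifocal display would move its focus.
print(round(focal_distance_m(0.063, 3.6), 2))  # ~1.0
```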

Virtual Reality… For Lenses

As for distortion, Abrash explained that experimenting with lens designs and distortion-correction algorithms that are specific to those lens designs is a cumbersome process. Novel lenses can’t be made quickly, he said, and once they are made they still need to be carefully integrated into a headset.

To allow the Display Systems Research team to work more quickly on the issue, the team built a ‘distortion simulator’, which actually emulates a VR headset using a 3DTV, and simulates lenses (and their corresponding distortion-correction algorithms) in-software.

Image courtesy Meta

Doing so has allowed the team to iterate on the problem much more quickly. The key challenge is to dynamically correct lens distortions as the eye moves, rather than correcting only for what is seen when the eye looks through the exact center of the lens.
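To make the idea concrete, here’s a toy sketch of the kind of correction involved. This uses a generic radial-distortion model, not Meta’s simulator, and the gaze-dependent coefficient lookup is entirely hypothetical: a static corrector bakes in one set of coefficients, while a dynamic one would re-derive them from the current eye position.

```python
# Classic Brown-style radial model: a point at radius r is rescaled by
# (1 + k1*r^2 + k2*r^4). Correcting distortion means applying the inverse
# (or a pre-distortion) of whatever the physical lens does.
def correct_point(x: float, y: float, k1: float, k2: float):
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def coefficients_for_gaze(gaze_x: float, gaze_y: float):
    # Hypothetical: a real system would calibrate this per lens design,
    # varying the coefficients as the pupil moves off the lens center.
    base_k1, base_k2 = -0.25, 0.05
    shift = 0.02 * (gaze_x ** 2 + gaze_y ** 2)
    return base_k1 + shift, base_k2

k1, k2 = coefficients_for_gaze(0.0, 0.0)  # eye looking dead-center
print(correct_point(0.5, 0.0, k1, k2))
```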

Retina Resolution

Image courtesy Meta

On the retina resolution front, Meta revealed a previously unseen headset prototype called Butterscotch, which the company says achieves a retina resolution of 60 pixels per degree, allowing for 20/20 vision. To do so, they used extremely pixel-dense displays and reduced the field-of-view—in order to concentrate the pixels over a smaller area—to about half the size of Quest 2. The company says it also developed a “hybrid lens” that would “fully resolve” the increased resolution, and it shared through-the-lens comparisons between the original Rift, Quest 2, and the Butterscotch prototype.

Image courtesy Meta

While there are already headsets out there today that offer retina resolution, like Varjo’s VR-3, only a small area in the middle of the view (27° × 27°) hits the 60 PPD mark; anything outside of that area drops to 30 PPD or lower. Ostensibly Meta’s Butterscotch prototype has 60 PPD across the entirety of its field-of-view, though the company didn’t explain to what extent resolution is reduced toward the edges of the lens.
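For a sense of what those pixels-per-degree (PPD) figures imply, here’s some rough arithmetic with assumed panel specs (not official numbers): PPD is approximately horizontal pixels divided by horizontal field of view, so wide-FOV retina resolution gets expensive fast.

```python
# Rough first-order model: angular resolution ~= pixels / degrees of FOV.
# Real optics vary PPD across the lens, so treat these as ballpark figures.
def pixels_per_degree(horizontal_pixels: int, fov_deg: float) -> float:
    return horizontal_pixels / fov_deg

def pixels_needed(target_ppd: float, fov_deg: float) -> int:
    return round(target_ppd * fov_deg)

# Quest 2-like panel (assumed): ~1832 px across a ~90 degree view
print(round(pixels_per_degree(1832, 90)))  # ~20 PPD
# Hitting 60 PPD over that same 90 degree view would need, per eye:
print(pixels_needed(60, 90))               # 5400 px horizontally
```

That tripling of linear resolution is why Butterscotch shrinks the field of view instead: concentrating existing pixels over fewer degrees raises PPD without a denser panel.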

Continue on Page 2: High Dynamic Range, Downsizing »

The post Meta Reveals VR Headset Prototypes Designed to Make VR ‘Indistinguishable From Reality’ appeared first on Road to VR.

The Story of Unplugged: Bringing Air Guitar To Life In VR

When it comes to hand tracking games on Quest, nothing really comes close to Unplugged.

Developed by Anotherway and published by Vertigo Games in late 2021, Unplugged is an air guitar game, inspired by Guitar Hero and many others, that lets you shred in VR with a virtual guitar and your real hands.

As I’ve said elsewhere, Unplugged leverages Quest’s hand tracking technology to breathe life into the imaginary act of air guitar. In doing so, it takes hand tracking to a whole new conceptual and technological level, surpassing everything else available on Quest.

“From the very beginning, our obsession was to understand how the technology is limited and try to polish that stuff,” says studio director and Unplugged creator Ricardo Acosta. “That was the very first thing. Not the graphics, not even the gameplay.”

After speaking with Acosta in our virtual studio (full video interview embedded above), it’s clear that creating a polished and tangible experience was always the goal. “I think that hand tracking is here for good,” he tells me. “I wanted to create something that worked for real. It wasn’t just another demo.”

Such strong commitment to this new form of input is a big call, especially for Acosta, who spent years as a hand tracking skeptic while working on the HoloLens team at Microsoft. “When I was at Microsoft, I was like an advocate for controllers,” he says with a laugh. “At Microsoft, they are all about hand tracking, but I was like, ‘No guys, we need controllers. Controllers are great.’ And now I’m saying the exact opposite thing.”

“On the first version of the HoloLens … you have hand tracking, but just like the blob. It’s just the hand, not the fingers.” Without full and reliable finger tracking, Acosta came away disappointed and skeptical. “With the HoloLens 2, it was a bit better, but the lag between your movement and the hand was very big, for a lot of technical reasons.”

Even so, Unplugged was first conceptualized in 2015, well before modern VR headsets offered hand tracking. “I remember being in a concert in Prague and I was just like doing air guitar,” he recalls. “And at some point I was like, oh, this is an interaction that could work in VR.”

“As soon as I went back home, I prototyped something … and it totally worked. It was like, oh, this is good. This is something that we could actually turn into a game.” The original idea developed into something akin to Rock Band but for VR, using controllers and the first Vive headsets and Oculus SDKs. Acosta said he quit his job at Microsoft to work on the prototype, titled Rock the Stage, over the course of four months.

“I think that it was pretty good,” he says of the Rock the Stage prototype, of which videos still exist online.  “The best thing it was that it made you feel like you were there.” But Acosta soon ran into a road bump — music games, and particularly the associated licensing, are complicated work. “You need a lot of money. You need a team of people handling all that music licensing. And I didn’t have all that back in the day. So I decided, at some point, to go back to my job.”

After continuing in Microsoft’s VR/AR division for another few years, Acosta revisited the concept in 2020 while bored at home during the pandemic. “Oculus [had] just released the hand tracking system [for Quest] and suddenly it came to me like, ‘Oh my god, I could actually rescue that…prototype and try [see] if it works using hand tracking.'”

Even in the early stages, hand tracking felt like a turning point for the previously controller-only experience. “It worked so well. . .Back in the day with the controllers was nice, but with hand tracking was exactly what it should be.” Acosta adapted his original prototype into something new, ditching controllers for something much more freeing and immersive. “When I put [hand tracking] on the prototype, it wasn’t perfect, but it was good enough for me to start polishing the experience. I knew that with a bit of work and a few algorithms on top of the hand tracking, I could make it work.”

Acosta created a video showcasing the new prototype game and posted it to social media. It soon exploded, attracting a lot of interest, especially from publishers offering funding. After discussing options with a few different publishers, Acosta signed with Vertigo Games. “They offered the best deal. And also they were bigger, and they really had a super nice vision about what the game should be.”

“At first I was a bit scared about it, because it was a super big project. We didn’t have a company together. It was complicated.” What started as a one-man show had to turn into a burgeoning team. Acosta’s wife joined as a project manager and they were then joined by a few others to make up the small studio now known as Anotherway.

“We are six people now, which is not a lot,” he says. “Very recently, we had the opportunity to grow a little bit, but we decided to stay small. I’ve been working in Microsoft for most of my career. That is a very big company and it’s amazing, but I really like working with just a very small amount of people. It’s a very creative environment.”

Working alongside Vertigo, Unplugged quickly developed into a project with bigger ambitions than Acosta had ever imagined. “I’m very conservative in terms of adding features, because I know that anything you add to a project, it will create a lot of problems, a lot of bugs, a lot of things.”

“They pushed for more staff. They wanted more music, they wanted more venues, they wanted more quality on the game and they’ve been always pushing for that. And I think that, in general, the game would have been way smaller without Vertigo,” he says.

In particular, working with Vertigo opened up opportunities when it came to the proposed tracklist. “In the very beginning we were just going for small bands. And then when we signed up with Vertigo they were like ‘No, like indie bands are cool and we will have a few. But we need famous bands.’ And we were like, oh, but that’s going to be super complicated.”

Vertigo sent Anotherway a Spotify playlist and asked them to add any songs they might want in the game. “And we were like ‘Wait, whatever music?'” It was a big mental shift.

The Offspring’s The Kids Aren’t Alright was the first major song that Vertigo and Anotherway secured the rights to. “We were just like jumping, like, ‘Oh my god, we made it.'” The final selection included some massive artists — The Clash, T. Rex, Weezer and Steel Panther, to name a few. “[Music licensing] is a very time-consuming process, and I knew that. So not even in my wildest dreams I would have dreamed about having Weezer or Tenacious D, The Offspring, or Ozzy…”

The inclusion of Tenacious D’s Roadie is particularly special to Acosta — not only is the band one of his favorites, but he had used the song all the way back in 2015 in the very first prototype. However, the song almost didn’t make it into the final game at all.

Vertigo and Anotherway initially struggled to make contact with Tenacious D to secure the rights. However, Vertigo had a trick up its sleeve — Guitar Hero legend Mark Henderson had been brought on board to assist with the game. “He was like, ‘Guys, leave it up to me. I’ll make it happen.’ So somehow he contacted the manager of Tenacious D and started talking to them.”

With Henderson’s help the rights to the song were secured. But another problem emerged — with a PEGI 12 rating, Roadie’s explicit and frequent F-bombs weren’t going to cut it. “So at another point we were like, ‘Okay, we have the song now, but we cannot use it because we are PEGI 12, so we have to take it out from the list.'”

Acosta made his peace with leaving the song off the tracklist but, in his words, “maybe the stars were in a particular position that night.” Henderson was able to get Tenacious D back into the studio to re-record a clean version of Roadie, specifically for Unplugged, excluding all the swearing.

“It was insane,” says Acosta. “Knowing that my favorite band re-recorded a song just for the game. It’s insane. It’s just amazing. And a lot of people have complained about the fact that it’s a different version of the song, without the swearing. But I’m so proud of that. To me, it’s even better because it’s our song.”

With a solid tracklist secured, Acosta and the team at Anotherway set to work on creating an unforgettable and reliable hand tracking experience. “I am a UX designer, so for me, the main important thing on anything is user experience. If the experience is not good, the whole game won’t work, or the whole experience will be shit, and we didn’t want that.”

As a result, the gameplay itself was adapted and designed to work with, not against, hand tracking. Even tiny changes made a big difference — the size of the guitar in Unplugged, for example, is a bit smaller than a regular, real-life guitar, which helps keep your hands in view of the cameras.

“In the beginning, with hand tracking 1.0, we had to be very aware of your movements,” he explains. “We had to create the mapping [the music charts] in a way that is always aware of the limitations of the technology.”

That meant that at launch, the mapping in Unplugged didn’t always completely follow the music, leading some players to complain that the music and the notes didn’t always line up. “And we knew why, but we couldn’t do anything about it, because the hand tracking was very limited and you couldn’t move your hand that quickly,” he said.

Nonetheless, Acosta remains proud of the experience offered at launch. “In the first version, it was absolutely playable. Obviously it wasn’t perfect, but it was playable. And I think that we proved that you can actually create a hand tracking game that uses hand tracking very intensively.”

Skip forward a few months, and the release of Meta’s Hand Tracking 2.0 software offered huge gains for Unplugged. Not only was the technology more reliable than ever, but it was so good that Anotherway went back and re-mapped the entire tracklist for increased accuracy and challenge. “We want the game to be fully accessible for everyone, obviously. But I think that for 98% of people, the game works very well.”

Nonetheless, Anotherway are still implementing algorithms and workarounds to account for error and improve the experience — the latest being an AI system. “We’re using deep learning in order to see where your hands should be or what’s your pose or what’s your intentions. We made all that stuff so [that] when there is a problem with the hand tracking, there is another layer trying to help and trying to make the experience as smooth as possible.”
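Acosta doesn’t detail the algorithms, but the general shape of such a “layer on top of hand tracking” is familiar: filter the raw samples, and fall back on history or prediction when confidence drops. Here’s a minimal illustrative sketch, entirely my own and not Anotherway’s code:

```python
# Toy per-axis hand filter: smooth noisy samples toward the raw reading, and
# hold the last good value when tracking confidence drops too low.
def filter_sample(prev: float, raw: float, confidence: float,
                  alpha: float = 0.5, min_confidence: float = 0.3) -> float:
    if confidence < min_confidence:
        return prev  # tracking lost: trust history (a real system might predict)
    return prev + alpha * (raw - prev)  # exponential smoothing toward raw

pos = 0.0
for raw, conf in [(1.0, 0.9), (2.0, 0.9), (9.0, 0.1)]:  # last sample is junk
    pos = filter_sample(pos, raw, conf)
print(pos)  # the low-confidence outlier is ignored
```

Anotherway’s deep-learning layer goes further, inferring pose and intent rather than just smoothing, but the goal is the same: paper over tracker error so play feels continuous.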

There’s more to come too. In the short term, Anotherway just released a new DLC pack — featuring songs by metal band Pantera — and are working on an upcoming accessibility update adding new features and “another thing” that is top secret but will be “really big.”

In terms of song selection, there’s definitely more on the way. “We are working to add more music all the time. We want to add free music [as well], not just DLC. Also, I want to add more indie music because I think that there is a lot of really good indie music out there.”

But what about the long term? What does the next year or more look like for Unplugged? “I cannot talk too much about it because Vertigo will kill me,” Acosta says with a laugh. “But our plans are very big. Unplugged is going to become bigger, at least in terms of features…”

“I would be very excited about Unplugged if I knew what’s going to happen. Probably like in a year, Unplugged will be very different. It will have way more stuff. That’s it. That’s all I can say.”

For a game that has already pioneered a new technology on a cutting edge piece of hardware, there could be a lot of interesting developments in Anotherway’s future.

“Unplugged is going to move forward,” Acosta said. “That is for sure. We are not staying still.”


Unplugged is available on Quest headsets and hand tracking-enabled PC VR headsets on Steam. You can read our full and updated 2022 review of the game here.

The 20 Best Rated & Most Popular Quest Games & Apps – June 2022

While Oculus doesn’t offer much publicly in the way of understanding how well individual games & apps are performing across its Quest 2 storefront, it’s possible to glean some insight by looking at apps relative to each other. Here’s a snapshot of the 20 best rated Oculus Quest games and apps as of June 2022.

Some quick qualifications before we get to the data:

  • Paid and free apps are separated
  • Only apps with more than 100 reviews are represented
  • App Lab apps are not represented (see our latest Quest App Lab report)
  • Rounded ratings may appear to show ‘ties’ in ratings for some applications, but the ranked order remains correct

Best Rated Oculus Quest 2 Games & Apps – Paid

The rating of each application is an aggregate of user reviews and a useful way to understand the general reception of each title by customers.

Rank Name Rating (# of ratings) Rank Change Price
#1 Puzzling Places 4.9 (1,161) $15
#2 The Room VR: A Dark Matter 4.89 (11,097) $30
#3 Walkabout Mini Golf 4.86 (7,376) ↑ 1 $15
#4 I Expect You To Die 2 4.86 (2,165) ↓ 1 $25
#5 Swarm 4.82 (2,044) $25
#6 Ragnarock 4.81 (902) $25
#7 The Last Clockwinder 4.81 (199) New $25
#8 Moss 4.81 (5,880) ↓ 1 $30
#9 I Expect You To Die 4.8 (4,742) ↓ 1 $25
#10 YUKI 4.8 (194) ↓ 1 $20
#11 Cubism 4.79 (712) ↓ 1 $10
#12 Pistol Whip 4.78 (8,550) $30
#13 The Thrill of the Fight 4.77 (9,381) $10
#14 Five Nights at Freddy’s: Help Wanted 4.77 (10,284) $30
#15 GORN 4.75 (6,720) ↑ 1 $20
#16 In Death: Unchained 4.75 (3,863) ↑ 1 $30
#17 Cosmonious High 4.74 (325) ↓ 6 $30
#18 Yupitergrad 4.74 (526) $15
#19 Resident Evil 4 4.73 (9,238) ↑ 3 $40
#20 Trover Saves the Universe 4.72 (2,148) $30

Rank change & stats compared to May 2022

Dropouts:
Little Cities, The Tale of Onogoro

  • Among the 20 best rated Quest apps
    • Average rating (mean): 4.8 out of 5 (±0)
    • Average price (mean): $24 (+$1)
    • Most common price (mode): $30 (±$0)
  • Among all paid Quest apps
    • Average rating (mean): 4.2 out of 5 (±0)
    • Average price (mean): $20 (±$0)
    • Most common price (mode): $20 (±$0)
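For the curious, the summary stats above are straightforward to reproduce from the table (prices and ratings transcribed from the top-20 list, in rank order):

```python
from statistics import mean, mode

# Prices and ratings transcribed from the top-20 table above, in rank order.
prices = [15, 30, 15, 25, 25, 25, 25, 30, 25, 20,
          10, 30, 10, 30, 20, 30, 30, 15, 40, 30]
ratings = [4.9, 4.89, 4.86, 4.86, 4.82, 4.81, 4.81, 4.81, 4.8, 4.8,
           4.79, 4.78, 4.77, 4.77, 4.75, 4.75, 4.74, 4.74, 4.73, 4.72]

print(round(mean(prices)))  # 24  -> "average price (mean): $24"
print(mode(prices))         # 30  -> "most common price (mode): $30"
print(mean(ratings))        # ~4.795 -> "average rating (mean): 4.8"
```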

Continue on Page 2: Most Popular Paid Oculus Quest Apps »

The post The 20 Best Rated & Most Popular Quest Games & Apps – June 2022 appeared first on Road to VR.

‘Moss: Book II’ Behind-the-scenes – Insights & Artwork from Polyarc

In an industry as young as VR there’s scarcely any original IP that’s had the time to become truly iconic. But if there’s one today that looks well on its way, the world of Moss is a strong contender. With the release of its debut title, Moss (2018), developer Polyarc entrusted big ambitions to a tiny hero, Quill, a character so recognizable it’s easy to forget that it’s not her name on the cover. Four years after the original, Polyarc doubled down on its diminutive hero with the recent release of Moss: Book II which takes the tale of Quill to new heights, further cementing the studio’s IP as a staple of VR gaming. To learn about how the studio has continued to hone its craft of VR game design, we sat down with Polyarc Design Director Josh Stiksma to get a glimpse behind the scenes at the game’s development and the art that inspired it.

Editor’s Note: The exclusive artwork peppered throughout this article is best viewed on a desktop browser with a large screen or in landscape orientation on your phone. All images courtesy Polyarc; special thanks to artist Darren Quach.

Making a Seamless World

Image courtesy Chris Alderson, Polyarc

One of many things that sets Moss: Book II apart from many other VR titles is how seamless the game is—that is to say, everything that happens in Book II, from combat to narrative to environment, feels like it’s part of a single cohesive, believable, and beautiful fantasy world.

Book II takes players not just on a figurative journey, but a literal one too—you’re there alongside Quill’s every mouse-sized step; her adventure forms a nearly unbroken path both physically through the game world and temporally through the story.

Image courtesy Polyarc

As you can imagine, this didn’t happen by accident. But putting it all together would be a bit like trying to build a puzzle before you know what each piece looks like. Polyarc, however, was up to the challenge.

Design Director Josh Stiksma told me the studio started by drafting a rough, high-level map outlining some of the narrative and gameplay moments the team had in mind.

Image courtesy Josh Stiksma, Polyarc

“With a rough idea of the overworld mapped out visually, we fleshed out ideas for the individual areas and dungeons. With some concepts firmed up, we then went back and forth trying to massage the world pieces together with the narrative and gameplay beats. This took a long time to get all the connections between locations making sense and lined up with our pacing goals,” he said. “Admittedly, a lot of this isn’t unique to VR and came from our previous experiences in AAA [game development]—but there are many nuances along the way. One challenge that I’ll share is the amount of time we spent thinking about the complexity of the map and worrying about if the player will get lost. The design and art team worked together closely to ensure the different dioramas felt connected and the player could make sense of where they were in the world by looking around the beautiful environments.”

Indeed, the environmental art and direction are a highlight of Book II. The studio deftly wove the game’s golden path through a world full of iconic sights that are as beautiful as they are memorable, making them perfect landmarks that help the player keep track of their place within the broader world.

For instance, there’s a huge castle that acts as something of a hub for the game. As you explore beyond its walls you can almost always catch a glimpse of it in the distance to remind you just how far you’ve traveled.

Image courtesy Polyarc

And there’s something to be said about the dichotomy of the game’s two scales: mouse-scale and human scale. The environments in Book II fuse both together in a skillful and meaningful way.

As Quill, you walk through individual blades of grass, tiptoe across branches, and clamber up small rocks. But as the player (who exists in the world at human scale) you see the entire patch of grass, the whole tree the branch is connected to, and the hill where the rocks fell from in the first place.

Image courtesy Mike Jensen, Polyarc

The real magic of this formula comes when the game bridges those two scales. Throughout the game you’ll reach out to Quill to heal her, power her up, or help her change weapons. You’ll also push, pull, and twist various obstacles within the world that would otherwise be impassable by Quill alone.

Book II really hammers this synergy of scales home with what is perhaps its most iconic boss fight. If you haven’t played the game yet, the following includes spoilers!

Continue on Page 2: Forging a Fight »

The post ‘Moss: Book II’ Behind-the-scenes – Insights & Artwork from Polyarc appeared first on Road to VR.

Hands-on: Lumus Prototype AR Glasses Are Smaller & Better Than Ever

Lumus’ latest waveguide, dubbed Maximus, is now even more compact thanks to 2D image expansion. With impressive image quality and a more compact optical engine, the company is poised to have a leading display solution for truly glasses-sized AR headsets.

2D expansion adds an additional light bounce to expand the image, allowing for a smaller optical engine

Lumus has been touting its Maximus waveguide since as far back as 2017, but since then its waveguide display has improved and shrunk considerably, thanks to so-called ‘2D expansion’ which allows the optical engine (the part of the waveguide display which actually creates the image) to be considerably smaller without sacrificing quality or field of view. The improvements have moved the company’s display solution closer than ever to actually looking and working like a pair of glasses.

For comparison, here’s a look at the first time we saw Maximus back in 2017. It had thin optics and a fairly wide field-of-view, but the optical engine was huge, requiring a large overhead structure.

Photo by Road to VR

The company’s latest Maximus waveguide has shrunk things down considerably with 2D image expansion. That means the light is reflected twice to magnify the image vertically and then horizontally before bouncing it into your eye. Doing so allows the optical engine (where the display and light source are housed) to be much smaller and mounted on the side of the glasses while retaining plenty of peripheral vision.

What you’re seeing here is a fully functional display prototype (ie: working images through the lens, but battery and compute are not on-board) that I got to check out at last week’s AWE 2022.

Here’s a look at how the optical engine has been shrunk when moving from 1D expansion to 2D expansion. The difference makes clear how much easier it would be to fit the left one into something you could really call glasses.

Lumus waveguide and optical engine with 2D expansion (left) and 1D expansion (right)

Actually looking through the prototype glasses you can see a reasonably wide 50° field-of-view, but more importantly an impressively uniform image, both in color and clarity. By comparison, similar devices like HoloLens 2 and Magic Leap tend to show color inconsistency across the view, often as a faint rainbow haze from one side to the other. Our friend Karl Guttag captured a great through-the-lens comparison from a similar Lumus prototype:

Image courtesy Karl Guttag

Brightness in the Lumus Maximus glasses is also a major advantage, so much so that these glasses don’t need to dim the incoming light at all, compared to many other AR headsets and glasses that have sunglasses-levels of tinting in order to make the virtual image appear more solid against even ambient indoor light. Lumus says this Maximus prototype goes up to 3,000 nits which is usable in broad daylight.

The lack of heavy tinting also means other people can see your eyes just as easily as if you were wearing regular glasses, which is an important social consideration (wearing sunglasses indoors, or otherwise hiding your eyes, has a connotation of untrustworthiness).

The image through the glasses is also quite crisp; the waveguide is paired with a 1,440 × 1,440 microdisplay which resolves small text fairly well given that it’s packed into a 50° field-of-view. The company says the waveguide in no way limits the potential resolution—all that’s needed is a higher resolution microdisplay. In fact the company has previously shown off a similar version of this prototype with a 2,048 × 2,048 display, which was measured to achieve a retina resolution of 60 pixels per-degree.

Lumus’ waveguide offerings clearly have a lot of advantages compared to contemporaries, especially with overall image quality, brightness, and social acceptability. The big question at this point is… why aren’t we seeing them in consumer products yet?

The answer is multifaceted (if anyone from Lumus is reading this, yes, that’s an intentional pun). For one, what Lumus is showing here is a display prototype, which means the displays are functional, but the glasses themselves have none of the other stuff you need for a pair of standalone AR glasses (ie: battery, compute, and sensors). You can of course offload the compute and battery into a tethered ‘puck’ design, but this significantly reduces the consumer appeal. So those other components still require some miniaturization R&D to be done before everything can fit comfortably into this form-factor.

Another reason is manufacturing cost. Lumus insists that its waveguide solutions can be affordably manufactured at large scale (even for consumer-priced products) and has the backing of major electronics manufacturer Quanta Computer and glass manufacturing specialist Schott. But manufacturing at small scale may not be reasonably affordable for a device priced for the consumer market. That means waiting until a big player is ready to place a big bet on bringing an AR device to consumers.

For Lumus’ part, the company says it has been working closely with several so-called ‘tier-1’ technology companies (a category which would include Facebook, Apple, Google, and others) for years now. Lumus expects to see the first major consumer product incorporating its waveguide solution in 2024.

The post Hands-on: Lumus Prototype AR Glasses Are Smaller & Better Than Ever appeared first on Road to VR.

Tech Secrets Behind ‘Cosmonious High’s’ Cast of Interactive Characters

Cosmonious High contains 18 characters across six species, all created by a team with zero dedicated animators. That means lots and lots of code to create realistic behaviors and Owlchemy-quality interactivity! The ‘character system’ in Cosmonious High is a group of around 150 scripts that together answer many design and animation problems related to characters. Whether it’s how they move around, look at things, interact with objects, or react to the player, it’s all highly modular and almost completely procedural.

This modularity enabled a team of content designers to create and animate every single line of dialogue in the game, and for the characters to feel alive and engaging even when they weren’t in the middle of a conversation. Here’s how it works.

Guest Article by Sean Flanagan & Emma Atkinson

Cosmonious High is a game from veteran VR studio Owlchemy Labs about attending an alien high school that’s definitely completely free of malfunctions! Sean Flanagan, one of Owlchemy’s Technical Artists, created Cosmonious High’s core character system amongst many other endeavors. Emma Atkinson is part of the Content Engineering team, collectively responsible for implementing every narrative sequence you see and hear throughout the game.

The Code Side

Almost all code in the character system is reusable and shared between all the species. The characters in Cosmonious High are a bit like modular puppets—built with many of the same parts underneath, but with unique art and content on top that individualizes them.

From the very top, the character system code can be broken down into modules and drivers.

Modules

Every character in Cosmonious High gets its behavior from its set of character modules. Each character module is responsible for a specific domain of problems, like moving or talking. In code, this means that each type of Character is defined by the modules we assign to it. Characters are not required to implement each module in the same way, or at all (e.g. the Intercom can’t wave.)

Some of our most frequently used modules were:

CharacterLocomotion – Responsible for locomotion. It specifies the high-level locomotion behavior common to all characters. The actual movement comes from each implementation. All of the ‘grounded’ characters—the Bipid and Flan—use CharacterNavLocomotion, which moves them around on the scene Nav Mesh.

CharacterPersonality – Responsible for how characters react to the player. This module has one foot in content design—its main responsibility is housing the responses characters have when players wave at them, along with any conversation options. It also houses a few ‘auto’ responses common across the cast, like auto receive (catching anything you throw) and auto gaze (returning eye contact).

CharacterEmotion – Keeps track of the character’s current emotion. Other components can add and remove emotion requests from an internal stack.

CharacterVision – Keeps track of the character’s current vision target(s). Other components can add and remove vision requests from an internal stack.

CharacterSpeech – How characters talk. This module interfaces with Seret, our internal dialogue tool, directly to queue and play VO audio clips, including any associated captions. It exposes a few events for VO playback, interruption, completion, etc.

It’s important to note that animation is a separate concern. The Emotion module doesn’t make a character smile, and the Vision module doesn’t turn a character’s head—they just store the character’s current emotion and vision targets. Animation scripts reference these modules and are responsible for transforming their data into a visible performance.
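The article doesn’t include source code, and Owlchemy’s actual implementation is C# in Unity, but the pattern described above—modules that store state via request stacks while animation code merely reads it—can be sketched in Python. All class and method names below are illustrative guesses, not Owlchemy’s:

```python
# Minimal sketch of the character-module pattern described above.
# Each module owns one domain of problems; animation scripts read
# module state, but modules themselves never move bones or swap meshes.

class CharacterEmotion:
    """Tracks the character's current emotion via a stack of requests."""
    def __init__(self, default="neutral"):
        self._default = default
        self._requests = []  # list of (requester, emotion); newest wins

    def add_request(self, requester, emotion):
        self._requests.append((requester, emotion))

    def remove_request(self, requester):
        self._requests = [r for r in self._requests if r[0] != requester]

    @property
    def current(self):
        return self._requests[-1][1] if self._requests else self._default


class CharacterVision:
    """Tracks the character's current vision target via the same stack idiom."""
    def __init__(self):
        self._requests = []

    def add_request(self, requester, target):
        self._requests.append((requester, target))

    def remove_request(self, requester):
        self._requests = [r for r in self._requests if r[0] != requester]

    @property
    def current_target(self):
        return self._requests[-1][1] if self._requests else None


class Character:
    """A character type is defined by whichever modules it is assigned."""
    def __init__(self, name, modules):
        self.name = name
        self.modules = modules  # e.g. {"emotion": ..., "vision": ...}

    def get(self, module_name):
        # Characters aren't required to implement every module;
        # missing modules simply return None (the Intercom can't wave).
        return self.modules.get(module_name)


# A dialogue sequence pushes an emotion; a reaction pushes on top of it.
celia = Character("Celia", {"emotion": CharacterEmotion(),
                            "vision": CharacterVision()})
emo = celia.get("emotion")
emo.add_request("dialogue_42", "happy")
emo.add_request("player_splash", "surprised")  # temporarily overrides
emo.remove_request("player_splash")            # back to "happy"
```

The stack idiom means any number of systems can layer requests without coordinating with each other, which matches the article’s description of other components adding and removing emotion and vision requests independently.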

Drivers

The modules that a character uses collectively outline what that character can do, and can even implement that behavior if it is universal enough (such as Speech and Personality.) However, the majority of character behavior is not capturable at such a high level. The dirty work gets handed off to other scripts—collectively known as drivers—which form the real ‘meat’ of the character system.

Despite their more limited focus, drivers are still written to be as reusable as possible. Some of the most important drivers—like CharacterHead and CharacterLimb—invisibly represent some part of a character in a way that is separate from any specific character type. When you grab a character’s head with Telekinesis, have a character throw something, or tell a character to play a mocap clip, those two scripts are doing the actual work of moving and rotating every frame as needed.

Drivers can be loosely divided into logic drivers and animation drivers.

Logic drivers are like head and limb—they don’t do anything visible themselves, but they capture and perform some reusable part of character behavior and expose any important info. Animation drivers reference logic drivers and use their data to create character animation—moving bones, swapping meshes, solving IK, etc.

Animation drivers also tend to be more specific to each character type. For instance, everyone with eyes uses a few instances of CharacterEye (a logic driver), but a Bipid actually animates their eye shader with BipedAnimationEyes, a Flan with FlanAnimationEyes, etc. Splitting the job of ‘an eye’ into two parts like this allows for unique animation per species that is all backed by the same logic.
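To make that logic/animation split concrete, here is a hedged Python sketch of the eye example. The names CharacterEye, BipedAnimationEyes, and FlanAnimationEyes come from the text above, but their contents here are purely illustrative:

```python
# Illustrative split of 'an eye' into a shared logic driver and
# per-species animation drivers, mirroring the structure described above.

class CharacterEye:
    """Logic driver: knows where the eye should look; does nothing visible."""
    def __init__(self):
        self.target = None

    def look_at(self, target):
        self.target = target


class BipedAnimationEyes:
    """Animation driver: would turn eye logic into shader parameters."""
    def __init__(self, eyes):
        self.eyes = eyes

    def update(self):
        # In the real game this would write material properties every
        # frame; here we just report what would be animated.
        return [f"biped eye shader -> {e.target}" for e in self.eyes]


class FlanAnimationEyes:
    """Same logic drivers, species-specific presentation."""
    def __init__(self, eyes):
        self.eyes = eyes

    def update(self):
        return [f"flan eye mesh -> {e.target}" for e in self.eyes]


left, right = CharacterEye(), CharacterEye()
for eye in (left, right):
    eye.look_at("player")

biped_anim = BipedAnimationEyes([left, right])
flan_anim = FlanAnimationEyes([left, right])
```

Because both animation drivers consume the same CharacterEye instances, gaze logic is written once while each species keeps its own look.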

Continue on Page 2: The Content Side »

The post Tech Secrets Behind ‘Cosmonious High’s’ Cast of Interactive Characters appeared first on Road to VR.

Preview: ‘Shores of Loci’ is a Gorgeous 3D Puzzler Coming to Quest 2 & SteamVR Next Week

First-time VR studio MikeTeevee is soon to release Shores of Loci, a 3D puzzle game backed by gorgeous and fantastical visuals. The game is set for an Early Access release on Quest 2 via App Lab and SteamVR on May 24th.

Though production company MikeTeevee has been around since 2011, the studio has never released a VR game before Shores of Loci. Given that this is the studio’s debut VR project, launching initially on App Lab (as well as SteamVR), you might expect it to be rough around the edges. On the contrary, after previewing the game myself I found a polished experience that offers up enjoyable 3D puzzles with a backdrop of sharp and fantastical visuals that are a cut above many games you’d find on Quest 2.

At its most basic, Shores of Loci is like a fictional version of Puzzling Places. While the latter has you snapping together scans of real buildings, Shores of Loci instead slices up totally imagined (and quite beautiful) little dioramas.

A completed puzzle in ‘Shores of Loci’ | Image courtesy MikeTeeVee

Shores of Loci is enhanced by a surrounding environment that’s beautifully rendered and art directed—from the last glimpse of sunlight reflecting at the very edge of the horizon to the towering structures that surround you like silent giants—even on Quest 2 it all looks great.


The game effectively uses VR as a canvas for the imagination and serves up some very striking and creative visuals, like a scene transition that sees the entire world before you enveloped as if being consumed and then regurgitated by a black hole (it’s more peaceful than it sounds, I promise).

Shores of Loci’s puzzling offers a slightly more organic feeling than Puzzling Places, perhaps because of the way that the 3D models you fit together have volume inside of them instead of being hollow textures. In any case, the fundamental gameplay is quite similar in that you’ll need to use a combination of traditional puzzling skills (edge shapes, color matching, etc) with some spatial reasoning to reach the point that you get to snap that final, satisfying piece into place.

Alongside its lovely visual backdrop, Shores of Loci also has some great audio design, with peaceful music and satisfying sonic feedback as you progress through each puzzle.

– – — – –

For anyone who loves puzzles, Shores of Loci is an easy recommendation. You’re getting some fun 3D puzzles and a fantastical visual feast to go along with them. And you won’t need to wait long to try it yourself; Shores of Loci launches on App Lab and SteamVR on May 24th, priced at $15.

The post Preview: ‘Shores of Loci’ is a Gorgeous 3D Puzzler Coming to Quest 2 & SteamVR Next Week appeared first on Road to VR.

The 20 Best Rated & Most Popular Quest Games & Apps – May 2022

While Oculus doesn’t offer much publicly in the way of understanding how well individual games & apps are performing across its Quest 2 storefront, it’s possible to glean some insight by looking at apps relative to each other. Here’s a snapshot of the 20 best rated Oculus Quest games and apps as of May 2022.

Some quick qualifications before we get to the data:

  • Paid and free apps are separated
  • Only apps with more than 100 reviews are represented
  • App Lab apps are not represented (see our latest Quest App Lab report)
  • Rounded ratings may appear to show ‘ties’ in ratings for some applications, but the ranked order remains correct
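The rounding caveat in the last bullet can be illustrated with a quick sketch using hypothetical numbers: entries are ranked by their unrounded aggregate rating and only rounded for display, so adjacent entries can appear tied without actually being so.

```python
# Hypothetical ratings illustrating the rounding caveat: apps are
# ranked by their unrounded aggregate rating, then rounded for display,
# so two adjacent entries can look tied while the order stays correct.
apps = [
    ("App A", 4.861),
    ("App B", 4.857),
    ("App C", 4.823),
]

ranked = sorted(apps, key=lambda a: a[1], reverse=True)
display = [(name, round(score, 2)) for name, score in ranked]
# App A and App B both display as 4.86, but A still ranks above B.
```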

Best Rated Oculus Quest 2 Games & Apps – Paid

The rating of each application is an aggregate of user reviews and a useful way to understand the general reception of each title by customers.

Rank Name Rating (# of ratings) Rank Change Price
#1 Puzzling Places 4.9 (1,130) $15
#2 The Room VR: A Dark Matter 4.89 (10,954) $30
#3 I Expect You To Die 2 4.86 (2,100) $25
#4 Walkabout Mini Golf 4.86 (7,013) $15
#5 Swarm 4.82 (2,002) $25
#6 Ragnarock 4.82 (844) $25
#7 Moss 4.81 (5,816) $30
#8 I Expect You To Die 4.8 (4,667) $25
#9 YUKI 4.8 (193) $20
#10 Cubism 4.8 (701) $10
#11 Cosmonious High 4.79 (267) $30
#12 Pistol Whip 4.78 (8,478) $30
#13 The Thrill of the Fight 4.78 (9,168) $10
#14 Five Nights at Freddy’s: Help Wanted 4.77 (10,142) $30
#15 Little Cities 4.75 (110) New $20
#16 GORN 4.75 (6,612) ↓ 1 $20
#17 In Death: Unchained 4.75 (3,789) $30
#18 Yupitergrad 4.74 (513) $15
#19 The Tale of Onogoro 4.73 (212) ↓ 3 $30
#20 Trover Saves the Universe 4.73 (2,131) ↓ 1 $30

Rank change & stats compared to April 2022

Dropouts:
Vermillion

  • Among the 20 best rated Quest apps
    • Average rating (mean): 4.8 out of 5 (±0)
    • Average price (mean): $23 (±$0)
    • Most common price (mode): $30 (±$0)
  • Among all paid Quest apps
    • Average rating (mean): 4.2 out of 5 (±0)
    • Average price (mean): $20 (±$0)
    • Most common price (mode): $20 (±$0)

Continue on Page 2: Most Popular Paid Oculus Quest Apps »

The post The 20 Best Rated & Most Popular Quest Games & Apps – May 2022 appeared first on Road to VR.

Little Stories, Little Cities – How Purple Yonder Adapted City Simulators For VR

Given the level of polish and detail in Little Cities, you’d be forgiven for thinking this was a Quest release developed by quite a large team of people.

But speaking to Purple Yonder’s James Howard last week revealed to me just how small the studio really is. “[My wife] Kelly and I both worked on the design,” he tells me. “I did all the programming for the game, so I ended up working out the technical systems for the code and stuff. And then she works on a lot of the stuff like the UI design and getting the levels just right — thinking what each item could be, the different buildings you can get, working all those things out.”

James and Kelly Howard make up the indie UK studio, which is the driving force behind Little Cities. They were helped along by some contracted artists across the development cycle, as well as an audio designer and a composer. nDreams also came on board later in the process, for publishing support, but for the most part, it was quite a tight-knit group.

Could such a little team be what helped Little Cities expertly deliver on a VR-first approach to a city simulator game?


Starting Small, Expanding Out

Little Cities had a curious launch. Arriving right alongside Cities VR, it ran an unavoidable risk of being overshadowed on release, perhaps dismissed as a simplified version of that game. As it turned out, the underdog came out on top; Cities VR failed to fully deliver on its expansive vision, marred by under-performing visuals and overwhelming VR design decisions. Little Cities, meanwhile, came away with a focused approach to the genre that puts intelligent VR implementation first.

But how did this tiny indie studio go from inception in 2018 to releasing an nDreams-supported title on the biggest VR headset of the moment just a few years later?

Long before the days of VR, James began his game development journey as a kid, making games in BASIC. A few years and a computer science degree later, he went on to cut his teeth with opportunities at some big name studios — EA, Rockstar and then, Ninja Theory.

“I did a lot of cool stuff there [at Ninja Theory], and that was where I really started to get involved with VR,” he said. “We had a really small team. I think there was about two or three of us, depending on when it was. But we were just concentrating on VR and just exploring VR stuff.”

He ended up working on the VR version of Ninja Theory’s 2017 title Hellblade: Senua’s Sacrifice. “I did most of the work for Hellblade VR, and mapping that to VR, which had a lot of challenges,” he said. “And then, Ninja Theory got acquired by Microsoft and they didn’t really have as much of a focus on VR, but that was something I still wanted to do.”

As that experience came to a close, he began thinking about pursuing his own VR projects. “Working on something like Hellblade VR was interesting, because people weren’t doing third person VR games, and it works. It works quite well. And it left us with a feeling of like, ‘Well, which other genres could work in VR, which no one’s attempted?'”

He had always been a fan of the city simulator genre — one that, at the time, had yet to be tackled properly in VR — and began thinking about how early inspirations, like the original Sim City and Sim City 2000, might be adapted for VR.

It was at this point, around 2018, that James began work on what would become Little Cities. “I had this prototype that I was working on, and Kelly, my wife, was like, ‘This is something really special here. We should actually take this a bit more seriously. What do you think? What do you want to do with this type of thing?'” They presented the prototype to the UK Games Fund, and the resulting government grant allowed them to kick Purple Yonder, and Little Cities, fully into action.

“We just like went and jumped into the deep end, set up a company. We built [the prototype] up a little bit more and we ended up taking it to nDreams and they loved it too. As soon as they saw the game, they just got it. They just saw what we were trying to do and they were able to give us some support on the publishing side.”


Interaction, Intuition, Innovation

Four years later and Little Cities is available on the Quest platform, a masterclass in made-for-VR design that presents an experience equally accessible and enjoyable for newcomers and veterans of both VR and the city simulator genre.

However, early versions of the game weren’t quite as intuitive as the final product. The user interface itself — one of the game’s shining accomplishments — underwent four or five complete redesigns over the course of development. “Really early on, it was just taking concepts like big flat panels, like you’d have in a PC title. It was just like, ‘Well, this isn’t fun. This isn’t really using VR to its best. What else can we do?'”

It wasn’t just clunky PC-to-VR menu translations that were avoided – every traditional aspect of the genre was reassessed and adjusted accordingly for the new medium. “Say you’ve never played VR before and you go buy a Quest and you bought our game,” explains James. “We want you to have a good experience. We don’t want it to be difficult to get into. We wanted it to be really accessible. And the same if maybe you’ve never played a city builder before either [and] you don’t know any of the general rules that you have around these types of games. We just wanted to make it so anyone can pick up and play it.”

Gridlocked Traffic

All vehicles in Little Cities come from seaports or an airport. This includes construction vehicles, which need to arrive at a vacant space of land before construction can begin. “Traditionally in a game like this, you build your road networks, and then if there’s a lot of traffic, you just get like a stat somewhere, like, ‘Oh, the traffic’s bad.’ And you’re like, okay, guess I have to do something about it.”

Little Cities shifts this stat to a visual representation — you can actually see traffic building up and blocking construction vehicles from reaching their destination, slowing your progress. “That’s not affecting the citizens so much as it’s affecting the player. So now they just naturally get that feedback. That’s directly affecting them instead.”

Adapting the simulation language in this way – away from stats and notifications, focusing on a visual-led approach – avoids overwhelming the player with complex menus, budgeting, finance options and the like.

“When you’re building a game based on a simulation, you can go really deep with what you’re simulating — what your citizens are doing and the reasons they’re coming or leaving and stuff like that. But what it comes down to is… If you can’t show the player all those things in the simulation, then it’s sort of wasted.”


Creating Little Moments

When James asked his non-gamer dad if he wanted to try Little Cities, he only expected him to last 10 minutes before taking off the headset and giving some pleasant remarks. “We lost him for like about two and a half hours. He just played until the battery ran out. It was like, ‘Oh, okay, that’s interesting cause he doesn’t play games, so…'”

It’s easy to get lost in Little Cities, and the backbone of its immersion is a plethora of small yet poignant details at every turn. A whale breaching, hot air balloons taking to the sky, or a flock of birds scattering around you – these charming moments add hugely important depth to the experience. “We got traffic in the game, working with the cars driving around. And suddenly that brought a little bit more life to the game. And then from there, the next natural step was police cars, fire engines, stuff like that. The game’s even more alive.”

“And at some point, we were like, ‘Okay, these little details are really cool.’ That sort of became like a bit of a pillar in our development — little stories, little cities. The idea that there’s actually a whole subsystem in the code which is working out like, ‘Okay, what’s another cool thing to show to the player?'”
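Purple Yonder hasn’t described how that subsystem is implemented, but one plausible sketch of a ‘little stories’ picker, entirely hypothetical, weights each ambient event by how long it’s been since it last played, so moments like the whale stay rare and don’t repeat back-to-back:

```python
import random

# Loose, hypothetical sketch of a 'little stories' scheduler: ambient
# events are weighted by the time since each last played, so a freshly
# triggered event is unlikely to be chosen again immediately.

EVENTS = ["whale_breach", "hot_air_balloons", "bird_flock"]

class LittleStories:
    def __init__(self, events, rng=None):
        self.cooldown = {e: 0 for e in events}  # ticks since last play
        self.rng = rng or random.Random()

    def tick(self):
        # Called once per game tick; every event becomes more likely.
        for e in self.cooldown:
            self.cooldown[e] += 1

    def pick(self):
        # Choose an event weighted by its cooldown, then reset it.
        events = list(self.cooldown)
        weights = [self.cooldown[e] for e in events]
        if sum(weights) == 0:
            choice = self.rng.choice(events)
        else:
            choice = self.rng.choices(events, weights=weights)[0]
        self.cooldown[choice] = 0
        return choice
```

The real system presumably also accounts for where the player is looking and what fits the current scene; this sketch only captures the pacing idea.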


A Strong Foundation, Ripe For Expansion

There’s lots on the horizon for Little Cities. Work has already begun on hand tracking support, set to arrive in June, thanks to requests and feedback from players. The game wasn’t designed with hand tracking in mind, but the existing virtual hand-based menu and watch system feels like it was. “That just naturally translates really well to hand tracking,” says James. “It just fits, it’s great.”

Cosmetic and decorative items, arriving in July, will give players more personalization options with new one- and two-tile pieces. “Maybe you can put a statue or a fountain, or there might be things that you can put [like] benches by the roadsides and stuff. Those are the kinds of things we’re thinking. If you’re already building a city, but you want to just make it look a bit nicer.”

Purple Yonder has lots of ideas on where else they can take the game, but they’ll also be listening to the community and shaping support around what they hear. “The one thing about Little Cities is that every time someone plays it, they’ve got ideas,” says James. “There’s just so many ways you can build on it and extend it, and we’ve got a whole host of ideas of how we can do that.”

With such a strong foundation, the only way for Little Cities to go is up.


Little Cities is available now on the Quest platform. You can read our full review here.