Tech Secrets Behind ‘Cosmonious High’s’ Cast of Interactive Characters

Cosmonious High contains 18 characters across six species, all created by a team with zero dedicated animators. That means lots and lots of code to create realistic behaviors and Owlchemy-quality interactivity! The ‘character system’ in Cosmonious High is a group of around 150 scripts that together solve many design and animation problems related to characters. Whether it’s how they move around, look at things, interact with objects, or react to the player, it’s all highly modular and almost completely procedural.

This modularity enabled a team of content designers to create and animate every single line of dialogue in the game, and for the characters to feel alive and engaging even when they weren’t in the middle of a conversation. Here’s how it works.

Guest Article by Sean Flanagan & Emma Atkinson

Cosmonious High is a game from veteran VR studio Owlchemy Labs about attending an alien high school that’s definitely completely free of malfunctions! Sean Flanagan, one of Owlchemy’s Technical Artists, created Cosmonious High’s core character system amongst many other endeavors. Emma Atkinson is part of the Content Engineering team, collectively responsible for implementing every narrative sequence you see and hear throughout the game.

The Code Side

Almost all code in the character system is reusable and shared between all the species. The characters in Cosmonious High are a bit like modular puppets—built with many of the same parts underneath, but with unique art and content on top that individualizes them.

From the very top, the character system code can be broken down into modules and drivers.

Modules

Every character in Cosmonious High gets its behavior from its set of character modules. Each character module is responsible for a specific domain of problems, like moving or talking. In code, this means that each type of Character is defined by the modules we assign to it. Characters are not required to implement each module in the same way, or at all (e.g., the Intercom can’t wave).
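
To make the composition concrete, here’s a minimal sketch of the idea in Python (names and structure are assumptions for illustration; Owlchemy’s actual system is C# in Unity):

```python
# Hypothetical sketch: a character type is defined by the set of modules
# assigned to it, and a module can simply be absent.

class CharacterSpeech: pass
class CharacterPersonality: pass
class CharacterNavLocomotion: pass

class Character:
    def __init__(self, *modules):
        self._modules = {type(m): m for m in modules}

    def module(self, module_type):
        # Returns None for modules this character doesn't implement.
        return self._modules.get(module_type)

bipid = Character(CharacterNavLocomotion(), CharacterPersonality(), CharacterSpeech())
intercom = Character(CharacterSpeech(), CharacterPersonality())  # no locomotion

assert intercom.module(CharacterNavLocomotion) is None  # the Intercom can't walk
```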

Some of our most frequently used modules were:

CharacterLocomotion – Responsible for locomotion. It specifies the high-level locomotion behavior common to all characters. The actual movement comes from each implementation. All of the ‘grounded’ characters—the Bipid and Flan—use CharacterNavLocomotion, which moves them around on the scene Nav Mesh.

CharacterPersonality – Responsible for how characters react to the player. This module has one foot in content design—its main responsibility is housing the responses characters have when players wave at them, along with any conversation options. It also houses a few ‘auto’ responses common across the cast, like auto receive (catching anything you throw) and auto gaze (returning eye contact).

CharacterEmotion – Keeps track of the character’s current emotion. Other components can add and remove emotion requests from an internal stack.

CharacterVision – Keeps track of the character’s current vision target(s). Other components can add and remove vision requests from an internal stack.

CharacterSpeech – How characters talk. This module interfaces directly with Seret, our internal dialogue tool, to queue and play VO audio clips, including any associated captions. It exposes a few events for VO playback, interruption, completion, etc.

It’s important to note that animation is a separate concern. The Emotion module doesn’t make a character smile, and the Vision module doesn’t turn a character’s head—they just store the character’s current emotion and vision targets. Animation scripts reference these modules and are responsible for transforming their data into a visible performance.
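
The stack-based request pattern the Emotion and Vision modules use might look something like the following Python sketch (class and method names are assumptions, not Owlchemy’s actual API):

```python
class EmotionRequest:
    def __init__(self, emotion, priority=0):
        self.emotion = emotion
        self.priority = priority

class CharacterEmotion:
    """Stores the character's current emotion; performs no animation itself."""
    def __init__(self, default_emotion="neutral"):
        self.default_emotion = default_emotion
        self._requests = []  # the internal stack other components push onto

    def add_request(self, request):
        self._requests.append(request)
        return request  # callers keep the handle so they can remove it later

    def remove_request(self, request):
        if request in self._requests:
            self._requests.remove(request)

    @property
    def current(self):
        if not self._requests:
            return self.default_emotion
        # Highest priority wins; the newest request breaks ties.
        return max(reversed(self._requests), key=lambda r: r.priority).emotion

# A dialogue event might push a request while a line plays...
emotion = CharacterEmotion()
req = emotion.add_request(EmotionRequest("happy", priority=1))
assert emotion.current == "happy"
# ...while an animation driver reads `emotion.current` each frame to drive
# the visible performance. Removing the request restores the default.
emotion.remove_request(req)
assert emotion.current == "neutral"
```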

Drivers

The modules that a character uses collectively outline what that character can do, and can even implement that behavior if it is universal enough (such as Speech and Personality). However, the majority of character behavior is not capturable at such a high level. The dirty work gets handed off to other scripts—collectively known as drivers—which form the real ‘meat’ of the character system.

Despite their more limited focus, drivers are still written to be as reusable as possible. Some of the most important drivers—like CharacterHead and CharacterLimb—invisibly represent some part of a character in a way that is separate from any specific character type. When you grab a character’s head with Telekinesis, have a character throw something, or tell a character to play a mocap clip, those two scripts are doing the actual work of moving and rotating every frame as needed.

Drivers can be loosely divided into logic drivers and animation drivers.

Logic drivers are like head and limb—they don’t do anything visible themselves, but they capture and perform some reusable part of character behavior and expose any important info. Animation drivers reference logic drivers and use their data to create character animation—moving bones, swapping meshes, solving IK, etc.

Animation drivers also tend to be more specific to each character type. For instance, everyone with eyes uses a few instances of CharacterEye (a logic driver), but a Bipid actually animates their eye shader with BipedAnimationEyes, a Flan with FlanAnimationEyes, etc. Splitting the job of ‘an eye’ into two parts like this allows for unique animation per species that is all backed by the same logic.
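
Here’s how that eye split might look in simplified form (again a hypothetical Python sketch under the assumptions above, not Owlchemy’s code):

```python
class CharacterEye:
    """Logic driver: stores gaze state for one eye; renders nothing itself."""
    def __init__(self):
        self.gaze_direction = (0.0, 0.0, 1.0)  # forward, in head space
        self.openness = 1.0                    # 1 = open, 0 = closed

class BipedAnimationEyes:
    """Animation driver for one species: reads CharacterEye logic drivers
    every frame and maps their state onto this species' eye shader."""
    def __init__(self, eyes, material):
        self.eyes = eyes          # the shared logic drivers
        self.material = material  # species-specific eye material

    def late_update(self):
        for i, eye in enumerate(self.eyes):
            x, y, _ = eye.gaze_direction
            # Map the gaze direction to a UV offset in the eye texture; a
            # FlanAnimationEyes driver would read the same data but animate
            # its species' eyes in a completely different way.
            self.material.set_vector(f"_Eye{i}Offset", (x * 0.5, y * 0.5))
            self.material.set_float(f"_Eye{i}Openness", eye.openness)
```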



Catch Road to VR Co-founder Ben Lang on the Between Realities Podcast

Road to VR co-founder Ben Lang recently joined the crew of the Between Realities podcast.

Bringing more than a decade of experience in the XR industry as co-founder of Road to VR, Ben Lang joined hosts Alex VR and Skeeva on Season 5 Episode 15 of the Between Realities podcast. The trio spoke about the impetus for founding the publication, Meta’s first retail store, the state of competition in the XR industry, privacy concerns for the metaverse, and even some musings on simulation theory. You can check out the full episode in the Between Realities episode feed on your favorite podcast platform.

In the podcast, Lang mentions a recent article about scientists who believe it’s possible to experimentally test simulation theory.


The Metaverse Saved My Life, Now I’m Using it to Save Others

I’ve spent over 10,000 hours in ‘the metaverse’—or at least the proto-metaverse—virtual worlds inhabited by real people represented as avatars. The experiences and relationships I had there saved me from a dark place and set me on a mission to do the same for others.

Guest Article by Noah Robinson

Noah is founder and CEO of Very Real Help, and a clinical psychology doctoral candidate at Vanderbilt University. Very Real Help has built a clinical research platform called Help Club to explore how social virtual reality environments can be used to deliver transdiagnostic cognitive behavioral peer support. Very Real Help has received funding from the National Institutes of Health, National Science Foundation, and investors to use Help Club for the treatment of substance use disorders and other mental health conditions.

There are real pitfalls and dangers in the metaverse, such as the pedophilia and child grooming recently highlighted by the BBC, or the sexual assault that has occurred on various platforms. But like any technology, the metaverse can be used for both good and bad; it all depends on how each application is built and how we choose to use it. With immersive virtual reality, where the entire environment can be controlled, the potential to help people is nearly limitless.

When I was 13 I escaped into a virtual social game called Runescape. Right when I hit puberty, I realized I was gay. I was overwhelmed with feelings of shame and anxiety. Several years into the burden of being closeted, I felt hopeless for my future and considered ending my life. But one thing kept me going: as I made friends and leveled up alongside them in Runescape, virtual stimuli created real hits of dopamine. These hits are an important treatment target for depression—in therapy, we teach patients to engage in rewarding behaviors to increase motivation and potentially overcome depression.

I spent most of my teenage years, nearly 10,000 hours, living in this virtual world. Inside I could build a virtual identity in a fantasy world where sexual identity was not a factor. As I gained confidence in the virtual world, I eventually created my first clan which steadily grew in size. Although it consisted of 400 ‘strangers’ on the internet, they were my closest friends. Eventually I felt enough social belonging and validation to come out of the closet to them. My friends accepted me, even when they knew my deepest, darkest secret. Going through this process virtually empowered me to come out of the closet in the real world and eventually to overcome my depression.

From that moment on, I knew I wanted to devote my life to building virtual experiences that were as compelling as a videogame but also as effective as therapy.

I know it’s possible because I’m already on that journey. I’ve built a mental-health focused virtual app called Help Club (available on Quest, PC, Mac, and iOS) that allows anyone to improve their well-being and mental health in a virtual setting. You can join as an avatar and attend live groups that are led by trained coaches. A fully realized metaverse has the potential to change millions of lives by making it easy to connect with this kind of virtual support group.

Avatars inside Help Club | Image courtesy Very Real Help

Help Club is just getting started—since launching our beta in October we’ve had thousands of people come into our community—and we’re already starting to see that a safe, supervised environment can quite literally change people’s lives. Help Club is designed from the ground up to support mental health. We’re training everyday, empathic people to become coaches who can lead support groups and teach the scientifically validated tools of an approach we’ve developed called Cognitive Behavioral Immersion.

It’s a place I wish I’d had as a 13-year-old to guide me toward healing rather than entice me into a world of escape.

Building a mental health space that’s ready for the metaverse isn’t easy and we’ve had to use technology to ensure a safe world for all. We screen folks and monitor interactions—although we’re not delivering therapy, we’ve adopted standard practices developed in therapy training clinics such as recording interactions to monitor for quality and prevent trolls from causing psychological harm. Although we only support people who are 18 or over, we’ve also seen demand from minors who have found our platform and want mental health help.

We’re starting to see exciting results from our virtual mental health platform. It’s attracting people who need help; 53% of our users have (self-reported) levels of clinical depression, and 45% have clinical levels of anxiety. And we’re starting to observe decreases in symptoms of depression and anxiety for those who spend time in our application.

While VRChat is the platform the BBC highlighted recently in its story on child grooming, there are examples of safe spaces on the platform. For example, a beautiful transgender community blossomed in VRChat, creating safe spaces for some people who were struggling with the same things I did as a teenager. One person even described having thought about transitioning for 10 years; it took trying on a female avatar in VRChat to finally begin the acceptance process and seek out a gender therapist.

We’ve also seen Alcoholics Anonymous meetings and chronic illness support groups come to fruition in Altspace. These groups find refuge in these virtual spaces—safe places to connect with others in a nonjudgmental space. The spaces are safe because people have the comfort of being anonymous while also feeling the immersive social support of avatars around them. Although these platforms can deliver help, they can also cause harm if there is no moderation or accountability. These platforms also need to protect minors by keeping them in safe, moderated environments.

Help Club is also changing lives. We have Help Club members who started out with severe social anxiety or depression and have now completed our coach training and are leading meetings to help others.

One of our members publicly shared that she had not been able to leave her house in nearly three years. Help Club helped her to feel comfortable leaving the house again, and she reported her experience was “infinitely better than three years of therapy.” Now that she can leave home, she’s able to engage in rewarding real-world activities that help people to overcome depression.

Image courtesy Very Real Help

Another member reported that he was too depressed to go to work and had been lying in bed all day. For nearly two weeks he went to Help Club meetings every day and reported that he was able to go to work for the first time in a long time. He told us he had tears in his eyes after coming home from his first day of work, thinking about how Help Club had gotten him there.

This is just the beginning. More research is needed, including randomized controlled trials, to truly know if the metaverse can deliver on its promise of helping people overcome real-life problems. But even right now I know that there are thousands of other people out there like me, looking to escape into the metaverse to avoid, and maybe even heal, real-life pain.


Cas & Chary Present: Hands-on with the Lynx R-1 Mixed Reality Headset

Lynx is a Paris-based startup building the Lynx R-1, a standalone mixed reality headset that’s small and lightweight. The headset’s Kickstarter reached its goal 15 hours after launch and is still ongoing, having recently doubled its funding goal. You can pledge for a unit starting at $500.

Cas & Chary Present

Cas and Chary VR is a YouTube channel hosted by Netherlands-based duo Casandra Vuong and Chary Keijzer who have been documenting their VR journeys since 2016. They share a curated selection of their content with extra insights for the Road to VR audience.

Since Lynx is a startup that has never released a product before, it’s understandable to be cautious about the project. So we reached out to Lynx to ask for a demo and they invited us to come over. In a recent video, we shared our first impressions and showed footage recorded in the headset. In this article, I’ll share a summary of my impressions.

Introduction to the Lynx R-1

Lynx wants to build the world’s most open and versatile device capable of both VR and AR. Not only can you use it standalone, but you can also stream PC VR content or use it tethered with a PC. The Lynx R-1 headset houses six cameras: two B&W cameras for 6DoF positional tracking, two IR cameras for hand-tracking, and two RGB visible light cameras for the color passthrough AR mode. Aside from this, the headset also has a novel lens design that makes it possible to fit it all in a tiny package (compared to current consumer headsets).

Before we dive into my impressions, here are the headset’s full specs:

Lynx R-1 Specs
Display: 1,600 × 1,600 (2.6MP) per-eye, LCD (2x)
Refresh Rate: 90Hz
Lenses: Four-fold catadioptric freeform prism
Field-of-view (claimed): 90° diagonal
Optical Adjustments: IPD
Processor: Snapdragon XR2
RAM: 6GB LPDDR5
Storage: 128GB (up to 1TB via microSD)
Wireless: Wi-Fi 6 (802.11ax), Bluetooth 5.0
Connectors: USB Type-C
Battery Life: 3 hours
Tracking: Inside-out (no external beacons)
On-board cameras: 2x B&W, 2x IR, 2x RGB
Input: Hand-tracking, controllers
Audio: Stereo speakers, 3.5mm TRRS jack
Microphone: Yes (2x)
Pass-through view: Yes

When we arrived at the office, it was immediately apparent that Lynx is a startup. The office was relatively small but housed a team of about 15 passionate people working on the Lynx R-1 headset. One half of the office had desks where the software was tested; the other half of the room was full of hardware. For example, there was a robot arm for calibrating the displays, and pre-production units were being assembled by hand. I loved seeing this. I imagine this must have been what it was like when Oculus was just starting out.

Office Vibes

Image courtesy Cas & Chary

Before the demos, Stan Larroque, the founder and CEO of Lynx, showed us around and told us that the demo devices were pre-production units that demonstrate basic engineering validation, but not yet the experience you’d expect when you buy the headset.

Design & Comfort

Image courtesy Cas & Chary

The headset has a small form factor: a lightweight front attached to a ‘halo’ headstrap. The front can be flipped up 90 degrees for quickly peeking outside the headset. The forehead holder and backside of the strap have magnetically-attached pads. To balance the front weight, the battery is placed at the back. This design works out well, as the headset is one of the most comfortable I’ve tried so far.

Novel Lens Design

The headset makes use of what the company calls a “four-fold catadioptric freeform prism” lens. It’s a novel design not seen in other headsets before, and this is, according to Lynx, what makes it possible to pack all the components into such a small form factor. You can move the lenses separately to adjust for your IPD.

Image courtesy Cas & Chary

AR + Hand-tracking

The first thing we tried was AR passthrough, which is the primary focus of the headset. The headset is designed to be used this way without any cover on the sides so that you can see as much of the real world as possible (there is a magnetic cover for using the headset in VR mode).

Image courtesy Cas & Chary

I tried an AR color passthrough demo that showed off the hand tracking. The passthrough had a bit of latency, but it was low enough for me to comfortably walk around while looking at the passthrough footage. The colors weren’t 100% fine-tuned yet, so the image sometimes shifted in color temperature, but other than that, it worked well.

The hand-tracking is powered by Ultraleap, and it’s one of my favorite aspects of the headset as it is very precise and responsive, more so than hand-tracking on Oculus Quest 2. It had no trouble understanding which hand was which, even when my fingers were overlapping (which is a tough challenge for hand-tracking systems). It’s not perfect yet, but I think it’s promising.

Image courtesy Cas & Chary

Then I tried another AR app in which a solar system was embedded into the real world; I could walk around it or look at it from up close. While the camera footage of the real world wasn’t as sharp as real life, the solar system looked beautifully sharp.

I was impressed by the AR functionality in combination with the hand tracking. I can imagine the possibilities this opens up for developers.

VR Mode

I also tried the Lynx R-1 in VR mode, which employs the ‘immersive cover’ and shows you a purely virtual world instead of passthrough AR. Lynx had a couple of self-developed VR tech demos available: one was a Beat Saber clone where I used my hands to slash boxes, and another had me casting spells with my hands.

Image courtesy Cas & Chary

In fully immersive VR mode, it took me a while to get used to the lenses, as I needed more time than usual to find the sweet spot; for a couple of moments I found it, but I wasn’t able to keep it. So for most of my time in the VR demos I saw something that looked like ghosting, an artifact of the lens’ four-fold design that is meant to be corrected in software. Visuals became wobbly around the folding areas. This made it hard for me to judge the display. Strangely, this wasn’t noticeable in AR mode.

Image courtesy Cas & Chary

When I asked founder Stan Larroque about this, he told me the lenses and display were not yet precisely calibrated together in some eyepieces, but this should be resolved once they hit production using their manufacturer’s tooling and once the software matures.

Software & Content

The Lynx R-1 is built on Android 10 and will support Unity with an SDK made in collaboration with Qualcomm and Ultraleap. Support for OpenXR is also in the pipeline. Lynx aims to open-source the SDK and make the ecosystem as open as it can be. This should make it easy for developers to release games on it.

However, the software is still in its early stages. All the demos shown to me were launched via the command line, as there’s no in-headset operating system yet. Lynx says that software is what it will be working on from now until release. It will take a few more months, but Lynx expects the maturing software to fix the issues I had during my demos.

Controllers

As for the controllers, Lynx says it will design and manufacture its own. For the tracking technology, it partnered with FinchShift, which designs controllers that can work without any cameras at all, relying on IMUs for tracking. Lynx will add more accurate tracking to its own controllers using the headset’s visible light cameras. This wasn’t ready yet, so I couldn’t test it.

– – — – –

While my demos weren’t all issue-free, we did get a good impression of the hardware and company. Lynx was fully transparent with us, and it was comforting to see how capable the team is. The company is also collaborating closely with partners like Qualcomm and Ultraleap, which I think is a good sign. So I think the device is promising, but it is still too early to say whether Lynx is on track until we see the hardware and software come together in the shipping product.

The software will be distributed online as soon as the first units are sent to developers at the start of 2022. The goal is to start shipping devices to consumers in April 2022. If you’re interested in supporting them, check out their Kickstarter.


Disclosure: Lynx sponsored our trip and accommodation. We were not obligated to post a video.


Cas & Chary Present – Hands-on with the SenseGlove Nova Force-feedback VR Gloves

SenseGlove, a Dutch producer of VR haptic gloves, has revealed an early prototype of its second glove, the SenseGlove Nova. We recently visited the company to see how it works and feels.

Cas & Chary Present

Cas and Chary VR is a YouTube channel hosted by Netherlands-based duo Casandra Vuong and Chary Keijzer who have been documenting their VR journeys since 2016. They share a curated selection of their content with extra insights for the Road to VR audience.

SenseGlove’s first product, the DK1, was a pair of tethered haptic gloves that use an exoskeleton attached to each finger for force-feedback.

SenseGlove DK1 | Image courtesy SenseGlove

The first significant difference you’ll notice in the SenseGlove Nova compared to the prior model is its futuristic design, which is much more glove-like and, according to the company, can be put on in just five seconds. This time it’s designed for ease of use; it’s wireless and is compatible with standalone VR headsets like the Oculus Quest, Pico Neo, and Vive Focus.

The SenseGlove Nova can simulate the feeling of shapes, textures, stiffness, impact, and resistance. This is made possible with the company’s trifecta of touch: force-feedback, vibrotactile feedback, and motion tracking.

Force Feedback

Image courtesy Cas & Chary VR

The Nova uses a passive force-feedback system; instead of actively pulling the fingers back, the gloves let you feel virtual objects by ‘stopping’ your fingers from moving. The gloves use one magnet per finger, attached to pulleys and wires. Once a user grabs a virtual object, the magnets exert force to ‘stop’ the fingers. By varying how much force is applied, the glove lets you feel the difference between hard and soft objects.

Vibrotactile Feedback

Image courtesy Cas & Chary VR

The force-feedback is enhanced by vibrotactile feedback. Both the thumb and the index finger have their own vibrotactile actuator, while an advanced voice coil actuator is located at the bottom of the glove. The voice coil actuator allows the Nova to render the feeling of realistic button clicks and impact simulations. Developers can do this by recording sound, converting it to a vibration waveform, and then bringing it inside the virtual environment to be played back by the glove.
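
As a rough illustration of that pipeline, converting a recorded sound into a playable amplitude envelope could look like the Python sketch below. This is a generic sketch, not SenseGlove’s SDK (which has its own tools and formats); the function name, haptic rate, and 16-bit mono WAV input are all assumptions.

```python
import wave, struct

def audio_to_vibration(path, haptic_rate=200):
    """Convert a mono 16-bit WAV into a per-frame amplitude envelope (0..1)
    that a vibrotactile actuator could play back at `haptic_rate` Hz."""
    with wave.open(path, "rb") as wav:
        sample_rate = wav.getframerate()
        raw = wav.readframes(wav.getnframes())
    samples = struct.unpack(f"<{len(raw) // 2}h", raw)  # 16-bit PCM assumed

    window = max(1, sample_rate // haptic_rate)  # audio samples per haptic frame
    envelope = []
    for i in range(0, len(samples), window):
        chunk = samples[i:i + window]
        peak = max(abs(s) for s in chunk) / 32768.0  # normalize to 0..1
        envelope.append(peak)
    return envelope  # feed one value per haptic frame to the actuator
```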

Motion Tracking

As for motion tracking, the Nova combines sensor-based finger tracking with computer vision hand tracking algorithms. With this combination, there’s no need for external tracking devices on headsets that offer third-party access to the tracking cameras. On a more closed-off system like the Oculus Quest, the controllers are mounted on the gloves to make motion tracking possible.

During my visit, SenseGlove showed us a demo inside a zero-gravity space station where I had to do a simple repair mission. In front of me was a table with objects of different densities that I could touch and grab.

Image courtesy Cas & Chary VR

I have to note here that the Nova gloves didn’t come in the right size for my hands, so the haptic feedback wasn’t optimal. Still, like with the first glove, I was able to feel the difference between, for example, a soft, squeezable ball and a battery made of glass. When I touched an object, I felt vibration. When I grabbed something, I could feel a slight resistance on my fingers, which, combined with seeing the object, made me automatically stop my fingers from moving further. It felt natural and real, and I thought it was impressive, especially given the wireless freedom of movement without much latency. I did get a greater wow-factor when I demoed the DK1, as it felt more precise; the first glove has twice the strength (40 newtons instead of 20).

SenseGlove told us that the Nova will be sold alongside the DK1 as both gloves have their pros and cons. If you’re looking for accuracy, the DK1 is a good option. The Nova is here if ease of use is more important.

The SenseGlove Nova is aimed primarily at commercial use and costs €4,500, which is about $5,300. Consumer plans are not in the works, but I think this is an interesting space to watch, as it gives us a good idea of what could be in store for consumers in the future.


Case Study: The Design Behind ‘Cubism’s’ Hand-tracking

Hand tracking first became available on the Oculus Quest back in late 2019. Out of enthusiasm for this new input method, I published a demo of Cubism to SideQuest with experimental hand tracking support only a few days later. Needless to say, this initial demo had several flaws, and didn’t really take the limitations of the technology into account, which is why I decided to initially omit hand tracking support from the full release of Cubism on the Oculus Store. It took more development, leaning on lessons learned from the work of fellow developers, to build something I was happy to release in the recent Cubism hand-tracking update. Here’s an inside-look at the design process.

Guest Article by Thomas Van Bouwel

Thomas is a Belgian-Brazilian VR developer currently based in Brussels. Although his original background is in architecture, his current work in VR spans from indie games like Cubism to enterprise software for architects and engineers like Resolve.

This update builds on lessons learned from many other games and developers who have been exploring hand tracking over the last year (The Curious Tale of the Stolen Pets, Vacation Simulator, Luca Mefisto, Dennys Kuhnert, and several others).

In this article I’d like to share some things I’ve learned when tackling the challenges specific to Cubism’s hand interactions.

Optimizing for Precise Interactions

Cubism’s interactions revolve around placing small irregular puzzle pieces in a puzzle grid. This meant the main requirement for hand tracking input was precision, both in picking up and placing pieces on to the grid, as well as precisely picking out pieces from a completed puzzle. This informed most of the design decisions regarding hand input.

Ghost Hands

I decided early on to not make the hands physics-based, but instead let them pass through pieces until one is actively grabbed.

This avoided clumsily pushing the floating puzzle pieces away when you are trying to grab them mid-air, but more importantly, it made plucking pieces in the middle of a full puzzle easier since you can just stick your fingers in and grab a piece instead of needing to figure out how to physically pry them out.

Signaled by their transparency, hands are not physical, making it easier to pick out pieces from the middle of a puzzle.

Contact Grabbing

There are several approaches to detecting a user’s intent to grab and release objects, like focusing on finger pinches or total finger joint rotation while checking a general interaction zone in the palm of the hand.

For Cubism’s small and irregular puzzle pieces, however, the approach that seemed to handle the precision requirements best was a contact-based approach, where a piece is grabbed as soon as thumb and index intersect the same piece and are brought together over a small distance, without requiring a full pinch.

Similar to the approach in The Curious Tale of the Stolen Pets, the fingers are locked in place as soon as a grab starts, to help give the impression of a more stable looking grab. The piece is parented to the root of the hand (the wrist) while grabbed. Since this seems to be the most stable tracked joint, it helps produce a steadier grip, and guarantees the piece stays aligned with the locked fingers.

Piece is grabbed when thumb and index intersect it and are brought together slightly. Rotation of index and thumb are then locked in place to help give the impression of a stable grab.

As soon as a piece is grabbed, the distance between thumb and index is saved, and a release margin is calculated based on that distance. Once thumb and index move apart beyond that margin, the piece is released.

Several safeguards try to prevent unintentional releases: we don’t check for release when tracking confidence is below a certain threshold, and after tracking confidence is re-gained, we wait several frames until checking for release again. Fingers are also required to be beyond the release margin for several frames before actually releasing.
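
Pieced together, the grab-and-release logic described above might look roughly like this simplified Python sketch. The names and threshold values are illustrative assumptions, not the game’s actual code (Cubism is written in C# for Unity):

```python
import math

GRAB_CLOSE_DELTA = 0.01      # meters fingers must close after touching a piece
RELEASE_MARGIN = 1.3         # release past 130% of the distance at grab time
FRAMES_BEYOND_MARGIN = 3     # fingers must stay apart this many frames
CONFIDENCE_GRACE_FRAMES = 5  # frames to wait after tracking confidence returns
MIN_CONFIDENCE = 0.5

def distance(a, b):
    return math.dist(a, b)  # fingertip positions as (x, y, z) tuples

class ContactGrabber:
    def __init__(self):
        self.held_piece = None
        self.touch_distance = None  # finger distance when both first touch a piece
        self.grab_distance = 0.0
        self.release_distance = 0.0
        self.frames_apart = 0
        self.grace = 0

    def update(self, thumb, index, piece_under_both, confidence):
        d = distance(thumb, index)
        if self.held_piece is None:
            if piece_under_both is None:
                self.touch_distance = None
                return
            if self.touch_distance is None:
                self.touch_distance = d
            # Grab: both fingertips intersect the same piece and have been
            # brought together slightly -- no full pinch required.
            if d < self.touch_distance - GRAB_CLOSE_DELTA:
                self.held_piece = piece_under_both
                self.grab_distance = d
                self.release_distance = d * RELEASE_MARGIN
                self.frames_apart = 0
            return
        # Safeguards: don't evaluate release while tracking is unreliable,
        # and wait several frames after confidence is regained.
        if confidence < MIN_CONFIDENCE:
            self.grace = CONFIDENCE_GRACE_FRAMES
            return
        if self.grace > 0:
            self.grace -= 1
            return
        # Require fingers to stay beyond the margin for several frames.
        if d > self.release_distance:
            self.frames_apart += 1
            if self.frames_apart >= FRAMES_BEYOND_MARGIN:
                self.held_piece = None  # released
                self.touch_distance = None
        else:
            self.frames_apart = 0
```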

Debug visualization: during a grab, the initial grab distance between fingertips is saved (outer red circle). The piece is released when the real position of the fingertips move beyond a certain margin (blue circle).

There is also a system in place similar to Vacation Simulator’s overgrab method. Due to the lack of haptic feedback when grabbing a piece, it’s not uncommon for fingers to drift closer to one another during a grab. If they close beyond a certain threshold, the release margins are adjusted to make releasing the piece easier.

Try it yourself: to see these debug visualizations in-game, go to ‘Settings > Hand Tracking > Debug visualizations’ and turn on ‘Interactions widgets’.

Debug visualization: If fingers drift to each other during a grab over a certain threshold (inner red circle), the release margins are re-adjusted to make releasing the piece feel less “sticky”.
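
That overgrab adjustment could be expressed as a small extension of the sketch above (same caveats: hypothetical names and values, reusing `distance` and `ContactGrabber` from the earlier sketch):

```python
OVERGRAB_THRESHOLD = 0.5  # fraction of the grab distance that counts as overgrab
OVERGRAB_MARGIN = 1.15    # tighter margin so releasing doesn't feel "sticky"

def update_overgrab(grabber, thumb, index):
    """With no haptic feedback, fingers drift together during a grab; if they
    close well past the distance at grab time, recompute the release margin
    from the new, closer distance so opening slightly releases the piece."""
    if grabber.held_piece is None:
        return
    d = distance(thumb, index)
    if d < grabber.grab_distance * OVERGRAB_THRESHOLD:
        grabber.release_distance = d * OVERGRAB_MARGIN
```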

One limit to this approach is that it makes supporting grabbing with fingers other than the index a bit harder. An earlier implementation also allowed grabbing between middle finger and thumb, but this often led to false positives when grabbing pieces out of a full puzzle grid, since it was hard to evaluate which finger the player was intending to grab a specific piece with.

This would not have been an issue if grabbing revolved around full finger pinches, since that results in a clearer binary input from which to determine user intent (at the cost of a less natural-feeling grab pose).

Midpoint Check

Besides checking which piece the index and thumb are intersecting, an additional check happens at the midpoint between index fingertip and thumb fingertip.

Whatever piece this midpoint hovers over will be prioritized for grabbing, which helps avoid false positives when a player tries to grab a piece in a full grid.

In the example below, if the player intends to grab the green piece by its right edge, they would unintentionally grab the yellow piece if we didn’t do this midpoint check.

Left: thumb, index & midpoint between fingertips are in yellow → grab yellow. Right: thumb & index are in yellow, midpoint is in green → grab green
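
The prioritization might read like this in a hedged Python sketch, where `piece_at` stands in for whatever point-in-collider query the engine provides (a hypothetical helper, not the game’s API):

```python
def piece_to_grab(thumb, index, piece_at):
    """Return the piece a grab should target, prioritizing whatever piece
    the midpoint between the fingertips is inside."""
    midpoint = tuple((t + i) / 2.0 for t, i in zip(thumb, index))
    mid_piece = piece_at(midpoint)
    if mid_piece is not None:
        return mid_piece  # e.g. green in the example above
    # Fall back to a piece both fingertips intersect.
    thumb_piece, index_piece = piece_at(thumb), piece_at(index)
    if thumb_piece is not None and thumb_piece is index_piece:
        return thumb_piece
    return None
```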

Grabbing the Puzzle

Grabbing the puzzle works similarly to grabbing puzzle pieces, except it is initiated by performing a full pinch within the grab zone around the puzzle.

The size of this zone is dynamically increased when switching from controllers to hands. This makes it a bit easier to grab, and helps reduce the likelihood of accidentally grabbing a piece in the grid instead of the grid itself.

The grab zone around the puzzle expands when switching from controllers to hands, making it easier to grab. Although it requires a full pinch, grabbing the puzzle works similarly to grabbing puzzle pieces.

Dynamic Hand Smoothing

The hand tracking data provided by the Oculus Quest can still have a bit of jitter to it, even when tracking confidence is high. This can actually affect gameplay too, since jitter can be much more noticeable when holding the puzzle grid or a long puzzle piece by the edge, making precise placement of pieces on the grid harder.

Smoothing the tracking data can go a long way to produce more stable looking grabs, but needs to be done in moderation since too much smoothing will result in a “laggy” feeling to the hands. To balance this, hand smoothing in Cubism is dynamically adjusted depending on whether your hand is holding something or not.

Try it yourself: to see the impact of hand smoothing, try turning it off under ‘Settings > Hand Tracking > Hand smoothing’.

Increasing the smoothing of hand positions while holding objects helps produce a more stable grip, making precise placement on the grid a bit easier.
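
One common way to implement this kind of dynamic smoothing is frame-rate-independent exponential smoothing whose half-life changes with grab state. The sketch below is an assumption about the technique, with illustrative constants rather than Cubism’s actual values:

```python
def smoothed_position(raw, previous, holding, dt):
    """Blend the raw tracked hand position toward `previous` with a strength
    that depends on whether the hand is currently holding something."""
    # Stronger smoothing while holding, for a steadier grip; lighter when
    # empty-handed, so hands don't feel laggy.
    half_life = 0.06 if holding else 0.02  # seconds to close half the gap
    alpha = 1.0 - 0.5 ** (dt / half_life)
    return tuple(p + (r - p) * alpha for p, r in zip(previous, raw))
```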

Pressing Buttons

One thing I noticed with Cubism’s original hand tracking demo was that most people tried pressing the buttons even though that was not supported at the time. Therefore, one of my goals with this new version of hand tracking was to make the buttons actually pushable.

Buttons can be hovered over when a raycast from the index finger tip hits a collider at the back of the button. If the index finger then intersects with the collider, a press is registered. If the index intersects the collider without first hovering it, no press is registered. This helps prevent false positives when the finger moves from bottom to top.

There are a few more checks in place to prevent false positives: the raycast is disabled when the finger is not facing the button, or when the player is not looking at their finger when pressing.

Try it yourself: to see this debug visualization in-game, go to ‘Settings > Hand Tracking > Debug visualizations’ and turn on ‘Interactions widgets’.

Debug visualization: a raycast from the index tip checks whether the finger is hovering over a button. To help prevent false positives, interaction is disabled when the finger is not facing the button, or when the player is not looking at their finger.
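
The hover-then-press rule can be reduced to a small state machine along the button’s press axis. This is a simplified sketch; the 1-D reduction, names, and default travel distance are mine, not the game’s code:

```python
class PushButton:
    """A button can only be pressed after it has been hovered: the fingertip
    must first be seen in front of the face (the 'raycast hit the collider at
    the back of the button' case) before intersection counts as a press."""
    def __init__(self, travel=0.01):
        self.travel = travel  # meters the button face can be pushed in
        self.hovered = False
        self.depth = 0.0

    def update(self, finger_depth, finger_facing, looking_at_finger):
        # finger_depth: fingertip position along the press axis, negative
        # while still in front of the button face. Both gating checks below
        # guard against false positives.
        if not (finger_facing and looking_at_finger):
            self.hovered, self.depth = False, 0.0
            return False
        if finger_depth < 0:
            self.hovered, self.depth = True, 0.0  # armed by hovering in front
            return False
        if not self.hovered:
            return False  # finger entered from behind or the side: ignore
        # The button face follows the fingertip until fully depressed.
        self.depth = min(finger_depth, self.travel)
        return self.depth >= self.travel  # True == click registered
```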

Guiding Interactions

One of the main challenges of building any interaction for hand tracking is that, in contrast to buttons on a controller which are either pushed or not pushed, there are many different ways people may try to approach an interaction with their hands while expecting the same outcome.

Playtesting with a diverse set of people can help you learn how people are approaching the interactions presented to them, and can help refine the interaction cues that guide them to the expected gestures. Playtesting can also help you learn some of the outliers you may want to catch by adding some interaction redundancy.

Interaction Cues

There are several cues while grabbing a piece. When a user first hovers over a piece, their index and thumb take on the color of that piece, both to indicate it can be grabbed, and to signal which fingers can grab it (inspired by previous work by Luca Mefisto, Barrett Fox, and Martin Schubert). The piece is also highlighted to indicate it can be grabbed.

Several cues also indicate when the grab is successful: the fingertips become solid, the highlights on the piece flash, and a short audio cue is played.

Various cues both on the hand and the puzzle piece guide and confirm the grab interaction.

Buttons have several cues to help indicate that they can be pushed. Much like with puzzle pieces, the index fingertip is highlighted in white once you hover over a button, indicating which finger can interact. Like they did with controllers, buttons extend outward when hovered, but this time the extended button can actually be pressed: once the index touches it, it follows the finger until it is fully pressed down, at which point an audio cue confirms the click.

A subtle drop shadow on the button surface indicates the position and distance of the index finger relative to the button, and helps guide the press interaction.

Various cues guide interactions with buttons: buttons extend outward when hovered, the index fingertip is highlighted, a drop shadow shows where the tip will interact, and the button follows the finger when pushed.

Interaction Redundancy

Since some people may approach some interactions in unintended ways, it can be good to try and account for this where possible by adding some redundancy to the ways people can use their hands to interact. Interaction cues can still guide them to the intended interaction, but redundancy can help avoid them getting unnecessarily stuck.

When it comes to grabbing pieces, a few playtesters would at first try to grab pieces by making a fist instead of using their fingertips. By having the colliders cover the entire finger instead of just the fingertip, a decent amount of these fist grabs will still be registered.

I should note this approach still needs some improvement, since it also introduces some issues producing unintended grabs in cases when there are a lot of pieces floating around the play area. A better approach in the future might be to also perform a check on the total finger rotation to account for fist grabs instead.

Though grabbing is designed around fingertips, colliders on index and thumb cover the entire finger to help catch different forms of grabbing.

With buttons, there were a few playtesters who would try pinching them instead of pushing them. In part this seemed to occur when they previously learned how to pinch buttons in the Oculus home screen, right before launching the game.

For this reason, buttons can also be clicked by pinching once they are hovered, and hopefully cues like the highlighted index and drop shadow will eventually guide them to pressing the buttons instead.

Pinching while hovering over buttons also registers as a click.

The first button players encounter when using hands also explicitly states “Push to Start”, to help transition people from pinching to pushing after coming from the Oculus Home menu.

Teaching Limitations

Although the quality of Quest’s hand tracking has improved over the last year, it still has its limitations — and a player’s awareness of these limitations can have a big impact on how good they perceive their experience to be.

Cubism implements a few ways of teaching players about the current limitations of hand tracking on Quest.

When the player first switches to hand tracking (either at launch or mid-game), a modal informs them of some best practices, like playing in a well-lit space and avoiding crossing hands.

When a user switches to hand tracking, a modal informs them about limitations and best-practices. The “Push to Start” instruction helps teach new users that buttons can be naturally pushed in this game.

It is important to acknowledge that most people are likely to immediately dismiss modals like this or quickly forget their guidelines, so signaling why things go wrong during the experience is also important.

In Cubism, hands turn red to signal when tracking is lost. In some playtests, people would keep one hand on their lap and play with the other, and be puzzled why their lap hand appeared frozen. To help inform cases like this, a message is displayed on the hand to clearly state why the hand is frozen if tracking loss persists. If tracking is lost specifically because the player is crossing their hands, the message changes to inform them not to do that.

Left: hands turn red when tracking is first lost. Middle: when tracking loss persists, a message informs the player about what is going on. Right: if tracking is lost due to occluded hands this is also indicated

For more seasoned players, or players who prefer playing with one hand, this feature can be replaced in the settings by having hands fade out when they lose tracking instead, more closely resembling the behavior in the Oculus home menu.

The red hands and warning messages can be replaced in the settings by fading hands.

Future Work

Hand tracking on Quest still has its limitations, and though Cubism’s support for it is already in its second version, there is still plenty of room for improvement.

Regardless, I’m excited to start exploring and supporting these new input methods. In the short term, I think they can help make experiences like this more accessible and easier to share with new VR users.

Mixed reality footage captured on an iPhone with Fabio Dela Antonio’s app Reality Mixer gives an idea of what it may be like to play Cubism on an AR headset in the future.

In the long term, there seems to be a good chance that hand tracking will be the go-to input for future standalone AR devices, so hopefully this update can be a first small step towards an AR version of Cubism.


If you enjoyed this look at the hand-tracking design in Cubism, be sure to check out Thomas’ prior Guest Article, which overviews the design of the broader game.


Road to VR’s 2020 Game of the Year Awards

It’s the time of the season again for reflection, when we look back at this year’s greatest achievements in VR gaming and remind ourselves just how far we’ve come in the four years since consumers first delved head-first into truly immersive worlds.

Due to the ongoing pandemic, this year was plunged into a global economic cooldown which saw many industries grind to a halt. Comparatively unaffected though was the games industry, which could thankfully continue as developers took to finishing their projects at home from a safe distance.

In our fourth annual Game of the Year Awards, we again put ourselves to the task of celebrating this year’s greats in VR gaming. Moreover, we salute all developers for offering up their hard work and steadfast curiosity in the face of the same personal adversity we’ve all no doubt shared. We’re grateful for having safe places where we can connect and explore, and for lighting a world which at times may have seemed grim and unrelenting.

For many, this steady stream of VR games has been a lifeline to sanity, as physically stepping outside of our homes could mean putting ourselves or our loved ones in danger’s path. We thank you for willing your virtual realities into existence for all of us to enjoy.

Now, our games of the year:


Half-Life: Alyx

Developer: Valve

Available On: Steam

Release Date: March 23rd, 2020

If you had told anyone back in 2016—the year the first consumer PC VR headsets hit the market—that Valve (of all companies) would one day build a AAA Half-Life game (of all franchises) specifically for VR, we’d have said you were crazy. Yet here we are, in 2020, giving Half-Life: Alyx our PC VR Game of the Year Award.

But before the release of Alyx earlier this year, there was still plenty of skepticism to go around. It was Valve’s first full-fledged VR game and the first Half-Life game in more than a decade. Could Valve deliver anything to possibly meet all that hype?

Well, the answer is now resoundingly clear. It turns out that Valve’s old-school, methodical (if sometimes messy) approach to game design works just as well for VR games as it does for non-VR games.

From the very opening scene—where players are, for the first time, truly standing before the monolithic Citadel in the middle of City 17—Alyx is immersive through and through thanks to heaps of detail, an engaging and interactive world, and one of the most memorable sequences seen in any VR game to date… the dreaded ‘Jeff’.

With excellent pacing that weaves together combat, exploration, and puzzles, Alyx takes players on a seamless journey through the well-realized streets, cellars, and rooftops of City 17, all the way to a mysterious conclusion that has serious consequences for the future of the franchise.

Against all odds, one of the most legendary game developers brought one of the most legendary franchises to VR in stunning fashion. Given that the studio stood to make tens of millions (if not hundreds of millions) more in revenue by making a non-VR game, it’s hard to call Alyx anything but a love letter to the VR medium.

Half-Life: Alyx stands as VR’s new benchmark in graphics, immersion, and scope, and I’m sure that Valve itself is as eager as the rest of us to see who will be next to raise the bar.


Iron Man VR

Developer: Camouflaj

Available On: PlayStation VR

Release Date: July 3rd, 2020

While it surely brings ample opportunity, there’s nearly an equal amount of risk in using the likeness of an iconic character like Iron Man. While the character has plenty of backstory to draw from, delivering the experience of actually stepping into the character’s shoes—the experience of actually being Iron Man rather than just watching him—is no trivial task, especially in the still young and often ill-defined medium of VR.

Before Iron Man VR arrived to the rescue, there really were no standout superhero games in VR. There were attempts, certainly, but none that truly planted a flag and said “this is how it’s done.” Developer Camouflaj, however, turned out to be up to the task.

And they did it in a most ambitious way. While choosing to focus their game on a superhero that didn’t fly would have surely avoided plenty of headaches, picking one that did fly forced them to tackle the serious challenge of keeping players comfortable even as they sailed through the sky.

What’s more, the game’s innovative flying system was specially designed around Iron Man’s character—around his palm-mounted repulsor jets specifically—bringing an immersive flair to the way players control themselves in the game by aiming their hands to control thrust. The result was a truly fun and thrilling method of locomotion that balanced high-speed maneuvering with aerial combat.

But more than just coming up with a novel flight system for VR, Iron Man VR contextualized its gameplay with an engaging story that explored the man behind the mask, Tony Stark, nearly as much as his superhero persona. Combined with immersive details sprinkled throughout, Iron Man VR delivered a package that felt whole and delivered the fantasy it promised.


The Walking Dead: Saints & Sinners

Developer: Skydance Interactive

Available On: Oculus Quest, Oculus Rift, Steam, PSVR

Release Date: January 23rd, 2020

It wasn’t clear what to expect from Skydance Interactive’s take on the storied The Walking Dead zombie franchise. The Walking Dead: Saints & Sinners could have easily been a ham-handed attempt at shoehorning a standard first-person shooter into VR. We’ve seen them before, and they weren’t pretty.

As soon as you start the game though, it becomes immediately apparent that Saints & Sinners demands that players invest themselves completely in the experience—it’s a true VR native. In this scaled-down RPG, moral choices meet zombie-killing carnage in a way we simply haven’t seen in VR up to this point. You’re instantly thrust into a world where supplies are scarce, crafting useful items is key, and coming in contact with any zombie is a fight for survival.

It’s a gruesome and realistic experience in all the right ways: a zombie can be hacked to pieces with any manner of sharp object, but you’ll lose precious stamina that you’ll need as you run away from the evening horde. If you’re a decent shot, you can try to stick to headshots the entire way, but as the mob grabs at you, you’re left with very little choice but to look them straight in the eye sockets and brain them with a knife, cleaver, or pointy stick. Complete your mission and get the hell out of dodge, or face the consequences; with each zombie presenting potential death, the horde isn’t something you’ll ever want to face.

Outside of its impressive physics-based melee and gun combat, one of the most frightening parts is navigating the muddy waters of New Orleans gang life, where you choose to side with one faction by stoking blood feuds and personally executing NPCs, or walk your own path as a freelancer. Although the adventure isn’t open-world, its discrete maps are so large and rich in detail that you’ll probably forget that within the first five minutes anyway.

The standalone version of the game on both Quest and Quest 2 is lower-res than its PC VR forebear, but that matters very little. As is, the game is more than the sum of its parts, and shines even with the obligatory knock in visual fidelity for a game of this scale, polish, and depth.


Design Awards


Half-Life: Alyx

Developer: Valve

Available On: Steam

Release Date: March 23rd, 2020

Each year we try to come up with games that shine in specific departments, so we tend to highlight titles that haven’t already won our platform-based awards. This year though, there’s simply no ignoring the titanic effort that went into making Half-Life: Alyx the most immersive VR game of 2020.

From the liquid shaders inside the many errant bottles laying around, to flippable light switches, to the full baby grand piano, every object has been lovingly realized with one thing in mind: immersing the player in the world of Half-Life like never before.

While full, unfettered object interaction is great for immersion, this also lets players get creative with how to use seemingly banal stuff to their advantage, like carrying a basket full of grenades when you run out of space in your inventory.

In Half-Life: Alyx, there are only a few misses in terms of immersion, and they are mostly linked to stylistic choices by Valve. You can’t melee enemies, and the gesture-based menu pulls you out of the action a bit, but even with those minor offenses, Valve has effectively created VR’s most detailed game to date, one that will be difficult to rival in the years to come.


Phantom: Covert Ops

Developer: nDreams

Available On: Oculus Rift, Oculus Quest

Release Date: June 25th, 2020

Building a new car is, for the most part, putting a new spin on a concept that’s largely already been figured out by those that came before. While non-VR game development similarly stands on the shoulders of past giants, in VR almost any step you take is likely to mean breaking fresh ground—right down to rethinking how players will even move around your game world.

Developer nDreams embraced the unknown and built an entire game around a novel locomotion scheme that had players sleuthing through sluices in a tactical kayak.

It might sound a little ridiculous on the surface, but dive a little deeper and you’ll see that it really fits VR well. Not only is paddling a much more immersive and intentional way to get around than using a joystick, the kayak worked great as a sort of ‘inventory’ system for the player thanks to weapon and ammo holsters along its sides.

While a smooth moving and turning kayak could surely prove challenging from a comfort standpoint, nDreams managed to come up with a snap-turn solution that worked seamlessly with the kayak locomotion, allowing more players to enjoy their time on the waterways.

The locomotion innovation of Phantom: Covert Ops makes us excited to see what the studio comes up with next.


Star Wars: Squadrons

Developer: Motive Studios, EA Games

Available On: Steam, Epic Games, Origin, PSVR

Release Date: October 2nd, 2020

Flying an X-Wing in VR has been the dream ever since EA Games released the free X-Wing VR Mission DLC for Star Wars: Battlefront Rogue One in 2016 on PS4. And in a big way, EA’s Motive Studios delivered on that dream with this massive first-person dogfighter, which lets you play through a well-crafted single-player campaign or cross-platform online battles.

Motive Studios took on the mantle of making Star Wars: Squadrons feel like a native VR game while letting you play with a giant pool of players, delivering support for PC VR, traditional PC monitors, PSVR, PS4, and Xbox One players together. And when it comes to dropping in for a casual dogfight, that big cross-platform pool means you never waste time waiting around.

To boot, playing in VR has its clear advantages, as you can naturally track enemies by looking through your cockpit’s canopy windows, all while keeping an eye on your 3D radar. One hope we had for the game was motion controller support for added immersion; however, simulator enthusiasts know that the most immersive way to control a vehicle in VR is a HOTAS setup, which lets you play with a physical throttle and flight stick so you can truly feel like you’ve stepped into your own Star Wars universe spaceship. You can also play with a gamepad, which is fun too, since the game offers up arcade controls instead of pure simulator-style flying like you might find in Elite Dangerous (2014).

The world both inside and outside of your canopy is a visual treat. While cinematic cutscenes are reduced to 2D windows, the game makes up for this by putting you on the deck of each ship to speak face-to-face with some of the most detailed character models we’ve seen in VR. Crafted with motion capture, the game’s NPCs seem to inch very close to the far side of the Uncanny Valley—something you’ll appreciate more from the inside of a VR headset.

In all, Star Wars: Squadrons gives VR gamers everything it has to offer on traditional platforms and more. It also sends a clear message to AAA studios that VR doesn’t have to be a second-class citizen when it can slot in so well.


Cubism

Developer: Thomas Van Bouwel

Available On: Steam, Oculus Rift, Oculus Quest

Release Date: September 17th, 2020

Cubism is a spatial puzzle game that shows that an interface can be beautiful through simplicity. The interface strikes a perfect balance between recognizable affordances and VR-native flourishes, like the use of depth and placement within arm’s reach. Once it has done its job of selecting a level, it gets completely out of the way, allowing the player to directly interact with the puzzle before them.

The interface also hides a little secret which doubles as a subtle but enjoyable means of ‘progression’ in the game. Each puzzle you complete represents a musical chord which you can hear when you select the level. Played one after another, these chords form a complete song which is every bit as beautiful in its simplicity as the interface itself. Once you complete all the puzzles, the song is yours to enjoy.

There’s not much else to say—and that’s the point. Cubism’s interface does exactly what it needs to do and nothing more.


Pixel Ripped 1995

Developer: ARVORE Immersive Experiences

Available On: Steam, Oculus Rift, Oculus Quest, PSVR

Release Date: April 23rd, 2020

Indie studios take risks that larger, more established names in the industry simply won’t. And supporting those indie devs can mean playing some of the most unique and inventive games out there. Granted, there was a tad less risk involved for Pixel Ripped 1995, a retro-inspired VR game that follows in the footsteps of its popular predecessor, Pixel Ripped 1989 (2018). Still, it’s an amazingly creative slice of mid-90s nostalgia that’s expertly interwoven with the pioneering genres that made so many of us fall in love with games in the first place.

Pitching a unique ‘game within a game’ storytelling style, Pixel Ripped 1995 acts as the setting for constant flights of fancy, mashing up the fourth console generation’s pioneering genres into a charming 3D world. Without brushing close enough to infringe on any copyrights, Pixel Ripped 1995 authors a love letter to the generation’s colorful platformers, side-scrolling beat ’em ups, and RPGs.

At five hours of gameplay, it’s short and sweet, but critically doesn’t overextend itself either. Its linear gameplay offers a virtual smorgasbord of variety as you’re always left guessing at what’s next, leaving little room for boredom.


The Under Presents: The Tempest

Developer: Tender Claws

Available On: Oculus Quest, Oculus Rift

Release Date: Available from July 7th – November 15th, 2020

The Under Presents (2019) wasn’t released this year, but it did host a very special limited-time immersive theater show for Oculus Quest and Rift-owning audiences, one that delved into some seriously interesting experimental territory. In a sea of graphical and technical marvels this year, the game’s immersive reinterpretation of William Shakespeare’s play The Tempest took the cake.

In a time when live actors are mostly out of work, The Under Presents invited expert thespians to lead groups of up to eight VR users through a rejuvenated retelling of the popular 17th-century theater piece. Built with user participation in mind, it felt more like acting in a high school theater production, with roles doled out on the fly.

Showing up in the lobby, conveniently placed at the entrance of the game’s main area, participants were greeted with interesting toys and magical objects to play with while hanging out with their fellow amateur actors. Once the show began, everyone was transported to a campfire to meet a live actor who, in the show’s meta-narrative, took on the role of Prospero among many others. The guide wove the story through dreamlike set pieces and got everyone involved in acting out parts of the story. Since players are mute, the guide doubled as a professional voice-over artist, filling in everyone’s lines.

In a time when interacting in large groups can be dangerous, The Under Presents: The Tempest offered up a truly novel and creative experience that, even with its low-poly art style, felt like a tantalizingly real break from reality. We’re hoping to see more from developer Tender Claws in the near future, whether it be encore presentations of the experience or entirely new interactive theater pieces yet to come.


Note: Games eligible for Road to VR‘s Game of the Year Award must be available to the public on or before December 13th, 2020 to allow for ample deliberation. Games must also natively support the target platform to ensure full operability.

The post Road to VR’s 2020 Game of the Year Awards appeared first on Road to VR.

How a Solo Indie Developer Built the Best Rated Game on Oculus Quest

Recently released on the Quest store, Cubism is a spatial puzzle game with a slick, minimal presentation. Designed by solo indie developer Thomas Van Bouwel as a side project, the game impressively holds the highest user rating among all Quest apps with more than 100 reviews, according to our latest ranking. We reached out to Van Bouwel to learn more about his approach to the project and the lessons he learned along the way.

Guest Article by Thomas Van Bouwel

Thomas is a Brazilian VR developer currently based in Brussels, Belgium. Although his original background is in architecture, his current work in VR spans from indie games like Cubism to enterprise software for architects and engineers like Resolve.

In September I launched Cubism, a minimal VR puzzle game about assembling colorful blocks into complex geometric shapes. It was my first commercial release as a game developer.

I developed Cubism on my own in my spare time, all while keeping my job as lead product engineer at Resolve, a Brooklyn-based enterprise VR start-up. I only recently transitioned to part-time work for the sake of the game, in the months leading up to its release.

Bootstrapping your first game alongside a full-time job can be a good way to allow for a flexible development schedule and reduce financial risk—but I think it’s only feasible if you design around your limitations and don’t over-scope your game.

In this article, I want to share some lessons I learned on how to stay on track when bootstrapping your first VR game.

1. Prototype & Playtest as Soon as Possible

The first questions you need to answer when starting any new game are: “is this fun?” and “could this have an audience?” A good way to answer them is to build a vertical slice (a small but fully playable segment of your game idea) as soon as possible, and put it in front of strangers to gauge their reaction.

I built the first functional prototype for Cubism over a weekend back in 2017. The prototype was pretty bare bones, but playable, and was enough for me to test the concept with friends and colleagues and to share the idea with strangers online to see if a game like this could spark interest.

The first prototype of Cubism had 3 puzzles, but no menu or game logic. Two of those three original puzzles ended up in the final game.

2. Scope Within Your Constraints

Choosing the right scope for a game is the best way to ensure you can actually finish it, and that scope will be determined mostly by your constraints (budget, skillset, time, etc.).

For Cubism, I knew I’d have limited time to work on it, I knew I wanted to work solo to keep my schedule flexible, and I knew that things like 3D modelling, graphics programming, and audio design weren’t my strong suit. Cubism’s minimal aesthetic and straightforward gameplay leaned into these constraints and helped inform many design decisions along the way.

For example, the minimal environment removed the need for detailed environment modelling or complex lighting, and helped put the focus on the puzzle in front of the player. This lack of environment also meant that having gravity made no sense, since pieces had no surface to fall on other than the floor—so instead, everything floats. This actually made puzzle solving a bit easier and enabled more complex puzzle shapes which wouldn’t be possible if gravity applied.

The lack of gravity in Cubism allows for more complex puzzle shapes.

Adjusting scope is something that will inevitably happen throughout development as well. One instance where I realized I was over-scoping was with my plans to support hand-tracking in the initial release of the game.

When hand-tracking first became available, I quickly prototyped experimental support for the feature and released it in a demo on SideQuest, since it seemed like hand-tracking could make for a very intuitive way of playing the game. The reality was that hand-tracking at the time still had limitations, and the quality of people’s experience varied widely depending on their lighting conditions and their expectations of the feature. The demo I made did not handle these limitations well.

Linus from Linus Tech Tips struggling with Cubism’s experimental hand tracking input (source).

I realized that properly implementing this feature would require much more work than I originally anticipated, which would delay the release of the actual game. I instead decided to remove the feature from the release scope and plan it for a future update.

This was a difficult call to make, since the SideQuest demo had set the expectation that the full game would support this feature as well, but I think it was the right one: it ensured I could give the development of this feature the time it required to be done properly.

3. Build Tools That Save You Time

When you recognize that an aspect of your game will require a ton of iteration to get right, it’s worth looking into what tools you can buy or build to help make that iteration more efficient.

For Cubism, I realized early on that I would need to iterate a lot on the design of the puzzles in the game, so one of the first things I built was a simple puzzle editor. It was far from release-ready, but as a developer tool it had a huge impact on how quickly I was able to iterate and find interesting puzzle designs.

An early in-VR level editor tool helped me speed up puzzle design and iteration
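To make this concrete, here’s a minimal sketch of the kind of data such an editor might manipulate, written in illustrative Python rather than the Unity C# the game is actually built in. Pieces and target volumes are represented as sets of integer grid cells; every name and file format here is hypothetical, not Cubism’s real code.

import json

# Hypothetical polycube puzzle representation: a piece or target volume
# is a set of (x, y, z) integer grid cells.

def translate(cells, offset):
    # Move a piece's cells by an (x, y, z) offset.
    ox, oy, oz = offset
    return {(x + ox, y + oy, z + oz) for (x, y, z) in cells}

def is_solved(target, placed_pieces):
    # A puzzle is solved when the placed pieces exactly tile the target volume.
    filled = set()
    for cells in placed_pieces:
        if filled & cells:      # overlapping pieces -> invalid placement
            return False
        filled |= cells
    return filled == target

def save_puzzle(path, target, pieces):
    # Serialize a design so the editor can reload and iterate on it later.
    data = {"target": sorted(target), "pieces": [sorted(p) for p in pieces]}
    with open(path, "w") as f:
        json.dump(data, f)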

Another aspect of the game that required a lot of iteration was the audio design. In Cubism, every puzzle piece is associated with a note, meaning every puzzle forms a complete chord once finished. Completed puzzles and their associated chords form a complete song. When a player presses the play button in the menu, it will play this song as it goes through all the levels they’ve solved.

Pressing the play button lets players hear a song composed of chords associated with each puzzle they’ve completed.

In Unity3D, I built a simple editor tool that would let me modify the notes associated with the puzzles and would save these notes in a separate file. This allowed me to test multiple songs for the game in parallel and made it easier for me to keep the notes associated with puzzles up to date while the puzzle designs evolved during development.

This simple puzzle song editor let me modify the notes associated with pieces of each puzzle and preview what this would sound like in sequence in the game.
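As a rough illustration of that workflow, the sketch below (again hypothetical Python, not the actual Unity editor tool) keeps the puzzle-to-chord mapping in its own file, so the song can be retuned without touching the puzzle designs. The file name and note values are invented for the example.

import json

def load_song(path="puzzle_chords.json"):
    # Load the puzzle-id -> chord mapping from its own file, kept separate
    # from the puzzle geometry so the two can evolve independently.
    with open(path) as f:
        return json.load(f)   # e.g. {"puzzle_01": [60, 64, 67], ...} (MIDI notes)

def song_for_progress(chords, solved_ids):
    # Play-button behavior: gather chords, in order, for every solved puzzle.
    return [chords[pid] for pid in solved_ids if pid in chords]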

4. Don’t Playtest Your Game with Gamers (at first)

If you want to make your VR game accessible for newcomers to the medium, take special care to playtest it with non-gamers during development.

Since Cubism was meant to be a casual game, one of my design goals was to make it as pick-up-and-play as possible for newcomers to the medium. However, about a year and a half into development I realized that one of the biggest blockers for newcomers was the game’s control scheme, along with the onboarding tutorial meant to teach it to new players.

Almost every button had a function mapped to it, and the game would start by walking you through each button, which would be very disorienting for people who weren’t used to holding controllers.

Cubism originally used every button on the controller, and the onboarding tutorial would walk users through each one.

It took me a long time to realize this was an issue, because I had mainly been testing the game with other developers, gamers, and VR enthusiasts who would tend to breeze through the controller onboarding. It was only during a more family-oriented game event, where I got a chance to test the game with more non-gamers, that I realized input was a barrier to entry for some folks.

After that insight I focused on simplifying the control scheme, making the entire game playable with just the triggers. This had some design implications as well: the menu moved from being anchored to the player’s hand to being anchored underneath the puzzle, and moving the entire puzzle, which used to be mapped to the grip buttons, now happens by grabbing the puzzle within its bounding box (see the sketch below).

These changes greatly simplified onboarding and made the game much easier to pick up and play. Where some players used to get confused by the original tutorial, they now breeze through it and are solving their first puzzle within 10-20 seconds of launching the game.

Cubism can now fully be played with just the triggers, greatly simplifying and shortening the onboarding tutorial.
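The heart of that trigger-only scheme can be sketched as a simple priority test, shown here in illustrative Python rather than the game’s actual Unity input code; all names are hypothetical. A press grabs a piece if the hand is inside one, and otherwise grabs the whole puzzle if the hand falls within its bounding box.

def point_in_box(p, box_min, box_max):
    # Axis-aligned bounding-box containment test.
    return all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))

def resolve_trigger_press(hand_pos, pieces, puzzle_bounds):
    # Decide what a single trigger press should grab, if anything.
    for piece in pieces:                      # piece grabs take priority
        if point_in_box(hand_pos, *piece["bounds"]):
            return ("piece", piece["id"])
    if point_in_box(hand_pos, *puzzle_bounds):
        return ("puzzle", None)               # move the whole puzzle
    return None                               # press over empty space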

5. Don’t Solo Dev Alone

Even though I made Cubism on my own, I would never have been able to finish the game without the support of various friends and organizations within the VR community. They kept me motivated throughout development and gave me valuable advice when I was stuck.

In most cities I’ve lived in since I started working on Cubism, I’ve been able to find meetup groups for Unity developers, indie game developers, or VR enthusiasts. And even though going to actual meetups is harder these days, many of these groups also have active online communities on Slack or Discord.

If you’re planning on developing on the Oculus platform, I also highly recommend joining their Oculus Start program. Beyond the support Oculus provides to Start developers, they also have a really active and supportive community on Discord.

– – — – –

The choice of whether or not to work solo and/or part-time on a project will likely depend on your circumstances and the nature of the game you’re making. I’ve definitely felt the downsides of solo spare-time development as well: a dev cycle that was probably longer than it needed to be, being confronted with gaps in my own knowledge when it came to actually finishing and publishing a game, or the lack of a creative sparring partner to work through design problems and help make decisions.

But for Cubism, there was a flip-side to each of these downsides as well: not having to compromise between a game I wanted to make and a job I enjoyed doing, being forced to learn new skills, and being incentivised to seek out the wider gamedev community for advice, support and motivation.

In many cases, it will make more sense to work together with others or to seek funding for development, but if you’re planning on solo-bootstrapping your first game, I hope this article will be helpful!

The post How a Solo Indie Developer Built the Best Rated Game on Oculus Quest appeared first on Road to VR.

Cas & Chary Present: Awesome Things to Do in VR Other Than Gaming

When it comes to consumer VR headsets, gaming is clearly one of the most common use-cases. However, there are so many other things that are compelling and fun to do in VR. In this video and article, I share the things I enjoy doing in VR other than gaming.

Cas & Chary Present

Cas and Chary VR is a YouTube channel hosted by Netherlands-based duo Casandra Vuong and Chary Keijzer, who have been documenting their VR journeys since 2016. They share a curated selection of their content with extra insights for the Road to VR audience.

Play Instruments in VR

Some VR apps allow you to spawn instruments, like complete drum sets, out of thin air in front of you. This is great because it removes physical limitations like space and even noise that could annoy your neighbors!

Image courtesy Paradiddle

Paradiddle is a great example for people who want to drum. It’s also pretty cool to see others drum entire covers using it.

If you have an Oculus Quest, there’s even a piano tutorial app available on SideQuest called VRTuos. It uses hand tracking and a real piano to teach you to play songs. If you’ve got a piano, you should try it! There’s also a VRTuos Pro version if you want to unlock more features and support the developer.

Meditate in VR

For those looking for distraction-free relaxation after a heavy workday, try out meditation apps in VR. I’m not usually the type to meditate, as I get distracted far too easily. However, it works better for me in VR, and even if a session only lasts five minutes, I feel rejuvenated afterward.

The apps I recommend checking out are Guided Meditation VR and theBlu, or, for Quest, Nature Treks VR. And check out Road to VR’s expansive list of relaxing games and experiences if you’re looking for even more ways to chill out in VR.

Attend a VR Performance with Live Actors

Image courtesy Tender Claws

Attending a virtual theater show was one of the best things to do this year. In The Under Presents, you could book a ticket to a 40-minute live-actor show based on Shakespeare’s The Tempest. Each show hosted up to eight attendees, but we could only communicate through body language, as we couldn’t speak to each other. The only person who spoke was the live actor, who narrated and improvised the whole show and had the power to spawn objects and put us in different environments. It was a magical experience.

Right now, The Under Presents does not have any shows running (though it still has much to explore), but keep an eye on the immersive theater VR space as I’m expecting to see more of this genre!

Watch Movies with Friends in VR

Image courtesy Bigscreen

In Bigscreen, you can book tickets to 3D and non-3D movies and watch them together with friends in a big cinema. It’s like sitting in your own private theater! This is a great way to catch up on movies or hang out with friends who may be far away.

Attend VR Concerts

Image courtesy Wave

If you haven’t tried attending a VR concert yet, you should check out Wave. ‘Waves’ are live, interactive, and social concerts in a virtual environment where you’re able to connect with the artist and other fans in an immersive way. Currently, there are no shows available, but check out the recap of the Lindsey Stirling show to get a feel of what it’s like. There are also other experiences to explore inside the app, which you can download from Steam.

Work out in VR

Don’t like going to the gym? Then VR might be an excellent motivation to get more exercise, as it’s less boring, at least in my opinion. Subscription-based apps like VZFit or Holofit give you access to an app and a tracker, letting you bring an exercise bike or even an elliptical into your VR experience.

On Oculus Quest, you could try VRWorkout (on SideQuest) and Supernatural VR.

If you’re especially interested in VR fitness, check out our video & article on why working out in VR is a game-changer!

Co-work in VR

Image courtesy Virtual Desktop

More and more VR apps are coming out that offer new ways to co-work. This may not suit every type of meeting, but it can be a nice change of environment for things like brainstorming sessions.

You can try it out with the free VR chatroom Mozilla Hubs. It’s easy to use and doesn’t require an installation; all you need to do is create and share a link for people to join. More apps to check out are Immersed VR and Bigscreen.

Visit Your Old House & Travel Virtually in VR

Image courtesy Google

With Google Earth VR, you can visit the whole world without leaving the comfort of your home. Imagine walking on the Great Wall of China or climbing on top of a pyramid in Egypt. There are some things you can only do in VR. Unfortunately, this app isn’t available on Quest, but there’s something close: Wander.

Design in VR (and even 3D print)

Image courtesy Gravity Sketch, design by James Robbins

It’s much easier to create 3D models when you can use the whole environment and your hands instead of a mouse and keyboard. And if you know someone with a 3D printer, you can even bring some of your creations into the real world. Apps to try are Tilt Brush, Gravity Sketch, or Adobe Medium. After that, export your model to your PC, prep it for 3D printing, and print! There’s something very fascinating about seeing something you made come to life this way.

Learn in VR

Image courtesy Force Field

I believe VR experiences will make a significant impact on the educational sector in the future, as learning in VR seems to stick much better than reading a book. If you want a feel for what this is like, there are great interactive learning apps available, like National Geographic Explore VR and Anne Frank House VR. Or try a documentary, like Traveling While Black.


And that’s just a sample of what’s out there. Know any great non-gaming VR apps? Share your recommendations in the comments below!

The post Cas & Chary Present: Awesome Things to Do in VR Other Than Gaming appeared first on Road to VR.
