On the Hunt for VR’s Killer App with Sony’s Head of PlayStation Magic Lab, Richard Marks

Everyone in the VR industry can envision a world, 10 years from now, that’s radically changed by virtual reality. Across healthcare, education, social, training, cinema, gaming, and more, VR has no shortage of Killer Use-cases. But most of the industry agrees that the Killer App—a single, platform-defining piece of software that compels buyers—has not yet arrived. Sony’s Richard Marks weighs in on where we might find it.

Every day this week leading up to the 2017 Game Developers Conference in San Francisco, we’re featuring insights on the hunt for the killer app from virtual reality’s leading companies. Today we hear from Richard Marks, Senior Research Engineer in R&D at Sony Interactive Entertainment.


Dr. Richard Marks

Marks heads the PlayStation Magic Lab within Sony Interactive Entertainment’s Research & Development group. Magic Lab was founded by Marks to push the boundaries of play by exploring how technology can be used to create new entertainment experiences. Marks joined PlayStation in 1999 to investigate the use of live video input for gaming and to develop new interactive user experiences. He helped create the EyeToy and PlayStation Eye cameras, as well as the PlayStation Move controller. Most recently, Marks and his team have been involved with PlayStation VR, experiments with eye tracking technology, and other innovations. He received a Bachelor of Science in avionics from the Massachusetts Institute of Technology and a doctorate in the field of underwater robotics from Stanford University.


Road to VR:
What traits do you think VR’s Killer App needs to have?

Marks:
I don’t believe there will be a single ‘Killer App’ for VR. It is too broad a medium to have just a single app that defines it. If you enjoy sci-fi, you’ll most likely enjoy sci-fi in VR. If you enjoy history, you’ll most likely enjoy history in VR. Virtual reality truly is a new medium, and the idea that there could be a single Killer App discounts its huge potential scope.

But I do believe there will be a collection of ‘Hero Apps’ that help drive adoption and interest, much like we’ve seen in the past with other mediums. These Hero Apps will successfully leverage the capabilities that are unique to VR, in a content area that has broad appeal. The two most important capabilities of VR are presence (being in a different place) and co-presence (sharing a different place with others).

There are multiple technologies that can be used to achieve these capabilities, and a big challenge for mass adoption of VR is the significant hardware required—so accessible (e.g., low cost, low encumbrance) approaches that retain high quality will have the greatest success at reaching a wide audience. Of course, we believe accessibility is one of the biggest advantages of PlayStation VR; it “just works” with over 53 million PS4 consoles in consumers’ homes, making it easier for someone to jump into VR.

Road to VR:
If you had to make a bet, which sector of VR would you predict as the place where the first Killer App emerges?

Marks at PlayStation’s Magic Lab

Marks:
The entertainment sector, including gaming, is likely to have the earliest success because VR is being added to expand the offerings. Also, game developers are already practiced at creating interesting virtual worlds; I’ve heard many game designers say that VR is the medium that lets them finally realize the visions they’ve been trying to create.

We are already starting to see the emergence of several Hero Apps. Horror is a genre that leverages VR presence, and many people recently have been enjoying the visceral experience of Resident Evil 7 in VR. And I’ve heard many people say that the Star Wars Battlefront Rogue One VR Mission, while not a full game, is their favorite VR experience so far.

I also believe in the near term, social gaming will be the most successful implementation of social VR because it provides an interesting focus, a “raison d’être” for social interaction in VR, rather than something open-ended such as [a VR chat room].

SEE ALSO
Latest Figures Suggest 'Resident Evil 7' Could Have Some 280,000 PSVR Players

Road to VR:
Do you think VR’s Killer App will launch in 2017?

Marks:
I strongly believe 2017 will introduce a Hero App (game) that leverages VR co-presence for a new level of interaction beyond any we’ve ever seen in games before, and that is what I’m looking forward to this year.


Developer Owlchemy Labs Talks Keys to Early VR Indie Success

Owlchemy Labs recently announced that Job Simulator has grossed over $3 million, making it one of the most successful indie VR titles to date, so it’s worth reflecting on some of the design principles of agency and plausibility that have proven to be key affordances of the virtual reality medium. I had a chance to talk to Owlchemy Labs’ Cy Wise at PAX West, where she shared with me some of the guiding principles behind Job Simulator as well as some of the more existential reactions from users questioning the nature of reality.

LISTEN TO THE VOICES OF VR PODCAST

Wise says that one of the key design principles of Job Simulator was to make sure that everything was interactive. The goal was not to make it feel like a game, but to let people get so lost in plausible interactions that they could achieve a deep sense of presence. She cites the example of making tea: the team had to account for the dozens of different ways that people make their tea in order to maintain the level of plausibility they’ve created in their virtual world. When the affordances of the environment match people’s expectations of how it should behave, they’re simply carrying out a task; if an interaction isn’t intuitive, the rules and limitations of the simulation make it feel like a game.

This reflects what Kimberly Voll recently said about having a fidelity contract where affordances match the user’s expectations. It also seems to validate Mel Slater’s theory of presence with the Place Illusion and Plausibility or what Richard Skarbez would describe as immersion and coherence.

Designing for agency and plausibility has been a key theme in my previous interviews with Owlchemy Labs’ Alex Schwartz from GDC 2015, Vision Summit 2016, and PAX West 2016.

Owlchemy Labs has done such a good job of creating a sense of presence in people that Wise said it would often create a bit of an existential crisis, since it blurred their boundaries of reality. VR developers talk about this as the sense of presence in VR, but there isn’t a common language for people who are having a direct experience of VR presence for the first time.

Wise asks, “How do you talk about the ‘not real’ real? Or how do you talk about the imaginary real life?” If people are able to have a direct, lived experience within a virtual simulation, and it feels completely real, then it raises the question of whether or not we’re already living in a simulation. The Atlantic did a profile on people who experienced a post-VR existential crisis that made them question whether actual reality is real or not.

Hassan Karaouni recently told me that if we’re not already in a simulation, then we’re most certainly going to create virtual realities that are indistinguishable from reality, and that these simulations will have intelligent agents within them who will be asking these exact same questions.

Wise has explored the question of “How deep do the layers of inception go?” many times, and I’ve also started to have more conversations with people in the VR community about simulation theory and its implications.

Wise has been on the front lines of these kinds of conversations with users of Owlchemy Labs’ experiences, and it’s only natural that these types of VR experiences start to make people question the balance between fate and free will in their lives, as VR enables new expressions of our agency in what could be classified as an ‘Erlebnis’, a direct lived experience, within an incepted virtual reality.

VR is starting to give us more and more experiences that are impossible to have in reality, and our memories of these experiences can be just as vivid as “real life” experiences, which further blurs the line between the “virtual” and “real.” The long-term implications of this are still unclear, but what is clear is that Owlchemy Labs has been focused on the principles of Plausibility and Agency, which mirrors OSSIC CEO Jason Riggs’ recent declaration that the future is going to be Immersive and Interactive.

If we are in a simulation, then it’s possible that we may never be able to reach base reality. As we continue to experience simulations that are more and more indistinguishable from reality, then perhaps the best that we can do is to strive to reach the deepest sense of presence at each layer of inception that we discover.


Support Voices of VR

Music: Fatality & Summer Trip


VR Interface Design Contest with $10,000 in Cash Prizes Launched by Purple Pill

Immersive content agency Purple Pill has announced a VR interface design competition and is offering $10,000 in cash prizes to those who create the best virtual reality interfaces.

From gaze-based interaction modalities to laser pointer menus to skeuomorphic knobs and buttons, today’s VR interfaces are all over the place. Even from one motion controller to the next, VR interface designs don’t agree on the best way to pick up and hold virtual objects. It’s going to take time before reaching any sort of consensus on VR interface design, but Purple Pill is hoping to spur things along.

Most of today’s VR interfaces are carryovers from screen-based interfaces

The company has announced a VR interface design contest that begins today and runs until March 15th. Entries will be judged on Usability, Design, Creativity, and Performance. The first place prize is $7,500 in cash and the second place prize is $2,500.

“The majority of interfaces we see in the current generation of VR apps are confusing and rather plain. They’re usually not much more than a floating plane with some text on it,” says Purple Pill’s Nick Kraakman. “With this competition we want to stimulate designers and developers from around the world to come up with fresh ideas about UIs in VR and create some innovative designs that push the boundaries of this exciting medium.”

SEE ALSO
VR Interface Design Insights from Mike Alger

What’s the catch? Well, Purple Pill isn’t quite doing this just out of the goodness of their hearts—entries must be based on the company’s Headjack Unity API, a foundation for creating cross-platform VR apps which include 360 video.

The contest’s official rules require that each entry:

• Is created using the Headjack Template API
• Runs smoothly and without frame drops
• Is submitted during the Competition Period
• Is added to the Marketplace as a Public free template

Although not part of the official rules, Purple Pill says entries “Should have support for mobile VR.” The rules further say that entries can be submitted in the following way:

  1. Sign up for a free Headjack account on https://app.headjack.io
  2. Create a VR Template using the Headjack API found at https://headjack.io/docs
  3. Upload the template to the Headjack Template Marketplace at https://app.headjack.io/#/templates/my-templates/add as a Public free template

Starting today, participants can submit any number of entries but are only eligible to win one of the two cash prizes. Purple Pill says that the winners will be announced one month after the March 15th submission deadline.


‘Freedom Locomotion System’ is a Comprehensive Package for VR Movement

Moving around comfortably and immersively in VR remains a hurdle for VR game developers. VR studio Huge Robot has created the Freedom Locomotion System, which brings together a number of VR movement systems into a comprehensive and functional package that allows for comfortable walking, running, and climbing in VR.

Since video games have existed, traversing great distances in large virtual worlds has been part of game design. In games like Halo, players run, drive, and fly across hundreds of virtual miles. But in VR, while driving and flying are usually pretty comfortable, running and walking often aren’t. So many developers have had to experiment with and implement novel locomotion techniques for games that require traversal beyond the player’s available physical space.

There are a bunch of different techniques out there. Many of them are completely comfortable, but not necessarily immersive. The common method of ‘blinking’ from one place to the next makes it hard to maintain a firm mental map of the space around you.

SEE ALSO
'Ninja Run' May Be the Craziest VR Locomotion Technique Yet

In an effort to tackle the challenge of comfortable and immersive VR locomotion, studio Huge Robot has created the Freedom Locomotion System, a comprehensive locomotion package that Director George Kong boldly believes is “as close to solving the issue of immersive VR locomotion as we can get within the current practical limitations of VR.”

The system is underpinned by what Kong calls CAOTS (Controller Assisted On the Spot) movement. It’s a sort of ‘run-in-place’ movement system of Huge Robot’s own design. Kong says it lets players comfortably and immersively move while leaving their hands free for interactions with the virtual world (especially important for games where you might regularly wield a weapon like a gun or sword).
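The article doesn’t detail how CAOTS actually detects on-the-spot movement, but run-in-place systems are commonly driven by signals like head bob or arm swing. Purely as an illustration of the general idea, here’s a minimal sketch, not Huge Robot’s implementation, assuming a head-bob signal and a simple VR rig:

```python
import math

class RunInPlaceLocomotion:
    """Toy run-in-place locomotion driver (illustrative only, not CAOTS itself).

    Estimates jogging intensity from vertical head bob and moves the
    player rig along the direction the headset is facing.
    """

    def __init__(self, bob_to_speed=8.0, max_speed=4.0, smoothing=0.9):
        self.bob_to_speed = bob_to_speed  # gain: virtual speed per unit of head-bob speed
        self.max_speed = max_speed        # cap on virtual movement speed (m/s)
        self.smoothing = smoothing        # exponential smoothing of the bob signal
        self._last_head_y = None
        self._bob_speed = 0.0             # smoothed |vertical head velocity|

    def update(self, head_pos, head_yaw_rad, rig_pos, dt):
        """Return the rig position for this frame given headset pose samples."""
        if self._last_head_y is None:
            self._last_head_y = head_pos[1]
            return rig_pos

        # Smoothed magnitude of vertical head velocity approximates jog intensity.
        raw_bob = abs(head_pos[1] - self._last_head_y) / dt
        self._bob_speed = (self.smoothing * self._bob_speed
                           + (1.0 - self.smoothing) * raw_bob)
        self._last_head_y = head_pos[1]

        speed = min(self._bob_speed * self.bob_to_speed, self.max_speed)

        # Move along the horizontal facing direction of the headset.
        forward = (math.sin(head_yaw_rad), 0.0, math.cos(head_yaw_rad))
        return tuple(p + f * speed * dt for p, f in zip(rig_pos, forward))
```

A real system would also need to ignore head motion caused by simply looking around, and presumably folds controller swing into the estimate, which is likely where the ‘controller assisted’ part of CAOTS comes in.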

In addition to CAOTS, the Freedom Locomotion System also includes a number of subsystems which offer different modes of locomotion and methods of smart interaction between the player’s movement and the virtual world.

For instance, with the Freedom Locomotion System, players will move up or down in elevation along slopes and stairs if they walk along them in their physical space (instead of clipping through the geometry). There’s also a climbing system which detects ‘grabbable’ geometry, providing a procedural way to make models climbable for players. And there’s a smart method for dealing with players clipping into walls and over edges. Kong offers a detailed breakdown of the package and its capabilities:

When combined with the CAOTS system, the VR movement provided by the Freedom Locomotion System looks intuitive and immersive. It isn’t clear yet if or how Huge Robot plans to distribute this system as a foundation for VR developers, but Kong says an extensive VR demo will be available soon on Steam and we’re excited to give it a try.


Making VR Experiences Wheelchair Accessible

Brian Van Buren is a narrative designer at Tomorrow Today Labs, and he’s also a wheelchair user who has been evangelizing how to make virtual reality experiences more accessible. I had a chance to catch up with him at the Intel Buzz Workshop in June, where we talked about some of his accessibility recommendations for other virtual reality developers, some good and bad examples of accessibility in VR, as well as some of the things that VR technologies enable him to do in a virtual world that he can’t do in the real world.

LISTEN TO THE VOICES OF VR PODCAST

One of the primary recommendations Van Buren gives is that you can’t assume the dimensions of your user. Just because he’s 4 feet 6 inches doesn’t mean that he should be automatically assigned a child’s body avatar. Also, because he’s primarily sitting down, he’d still like to be able to participate in games that require you to crouch down and duck. Experiences that handle this well include Hover Junkers, which provides a head model adjustment for people of different heights, and he’s also able to play Space Pirate Trainer. The little human mode in Job Simulator will also raise the head a foot and a half to provide access for both children and people in wheelchairs.

Van Buren recommends against placing objects on the ground, as they’re essentially game-breaking bugs for people in wheelchairs and are generally not ergonomically comfortable for most people either. Placing buttons at standing waist height has the side effect of being fairly comfortable for people who are sitting or in a wheelchair, while highly placed objects are completely out of reach. There are Americans with Disabilities Act (ADA) regulations that most federal and government buildings have to follow, and virtual reality environment developers should keep some of these design constraints in mind.
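To make that concrete, here is a minimal sketch of the kind of placement check a team could run over its interactables. The roughly 15-to-48-inch band follows ADA unobstructed forward-reach guidance, but the helper, thresholds, and data format are illustrative rather than anything from the interview:

```python
# Reach band loosely based on ADA unobstructed forward-reach guidance
# (~15-48 in, i.e. ~0.38-1.22 m above the floor). Illustrative values only.
LOW_REACH_M = 0.38
HIGH_REACH_M = 1.22

def flag_unreachable(interactables):
    """Return the names of interactable objects outside a seated-friendly band.

    `interactables` is a list of (name, height_in_meters) tuples, where the
    height is measured from the floor of the play space.
    """
    return [name for name, height in interactables
            if not (LOW_REACH_M <= height <= HIGH_REACH_M)]

# Example: the floor pickup and the high shelf both get flagged for review.
print(flag_unreachable([
    ("ammo_crate", 0.05),
    ("waist_button", 1.00),
    ("shelf_key", 1.80),
]))
```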

He says that it’s easier to take accessibility into consideration at the design stage rather than afterwards, so the sooner you account for mobility constraints, the better. There are tradeoffs to including kinesthetic gameplay mechanics like crouching, crawling, bending, and reaching up, which may provide a deeper sense of presence for able-bodied people of a certain height, but Van Buren asks developers to consider whether those mechanics are so vital to the game that it’s worth making it inaccessible to a portion of players.

Van Buren had heard my previous interview with Katie Goode about accessibility, which encouraged him that there were other people thinking about making VR more accessible. Katie wrote up a great blog post talking about the accessibility design considerations in Unseen Diplomacy, and Adrienne Hunter also wrote up a great overview of designing VR for people with physical limitations.

For a more in-depth discussion on “Making VR and AR Truly Accessible,” be sure to also check out this Virtual Reality Developers Conference panel discussion featuring Minds + Assembly’s Tracey John, Radial Games’ Andy Moore, Tomorrow Today Labs’ Brian Van Buren, and independent designer Kayla Kinnunen:


Support Voices of VR

Music: Fatality & Summer Trip


Oculus ‘Quill’ is Spectacular, But It’s the Instantly Usable Interface That Surprised Us Most

Quill is Oculus’ ‘VR paint’ app, and it’s nothing short of spectacular. Tucked away inside this creative-focused app, however, is an interface that’s so simple, you might miss its brilliance.

Quill, which is available to every Touch owner for free, is impressive. I thought initially that there would be a lot of overlap between Oculus Medium [our review] and Quill (both art-focused in-house projects from Oculus), but Quill firmly maintains its own style and functionality—Medium is to clay as Quill is to pencil & paper. Quill also feels like a substantial artist’s tool, even more so than Medium, with a powerful layer system, brush styles and opacities, exporting, and capturing functionality.

Sketching is an extremely accessible medium, and that’s recreated by Quill. But just like the same pencil & paper can make a stick figure or a detailed human body, in the right hands, Quill can do amazing things. One of the pre-loaded scenes, First Tuesday by Ric Carrasquillo, is jaw-dropping in its style, skill, and scope. It might be one of the first VR art masterpieces.

And while it’s clear that artists are going to be able to make amazing works inside of Quill, it’s the program’s dead-simple, but somehow highly functional, interface that’s the most unexpected. Graphically, the interface looks charmingly like something out of the early days of the first GUI operating systems, but what’s important is the fact that the interface takes known PC affordances and applies them easily and effectively in VR.

The interface pops up in your off-hand like a tablet, while your main hand becomes a little blue 3D cursor. On your virtual tablet you’ll find familiar buttons, slightly raised in 3D, that are easily pressable with your cursor. Even though the menu crams lots of buttons close together, they remain incredibly easy to hit thanks to the precision of Touch and the controllers’ relatively small size, which makes it easy to get your hands close together. I haven’t once felt like I accidentally hit the wrong button, even when clicking through the robust layer menu.


From the outside, this might look like a placeholder interface (and a bad one for VR at that), but it’s actually impressively functional, and it builds on decades of PC interface design.

‘How do you click?’ Well, the cursor and the height of the buttons make that obvious: just press the cursor to the button you want. ‘How do you scroll?’ Well, everyone knows how to use a scroll bar: just drag it down with the cursor. Crucially, you can even hover over buttons and icons to get a popup with a text description of the button. Like our PC interfaces, Quill’s interface manages high functional density without losing precision.


The ‘back to basics’ interface design is important because it mimics what so many of us already know about using computers. That’s important for adoption, especially for a tool like Quill which, for many digital artists, may be the first time they ever use VR. And they’ll be looking for familiar things, like Quill’s layers menu which uses an easily understood structure, icons, and interactions that are similar to industry-standard digital art tools.

We’ve seen plenty of similar PC-style interfaces in VR—like SteamVR’s menu, for instance—mostly floating in front of you as a large static panel with a laser pointer in your hand. Just like leverage over a length multiplies force, an extended pointing device—like a laser pointer stretching 10 feet from your hand—amplifies the subtle movements of your hands, so that a tiny hand movement makes for a much larger movement at the end of the pointer. And when the ‘click’ action is in the form of a trigger pull, your hand will move (and move the end of the laser pointer by a greater distance) every time you’re trying to press a button or initiate an action. The result is that the interface and its corresponding elements need to be extra large to compensate for this amplified imprecision of our hands, which then necessitates large angular movements of our arms to navigate from one side of the panel to the other.
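The amplification is easy to quantify: for small wobbles, the lateral error at the tip of the pointer grows linearly with the pointer’s length. A quick back-of-the-envelope sketch (the numbers are illustrative):

```python
import math

def tip_displacement(pointer_length_m, jitter_degrees):
    """Approximate lateral movement at the end of a laser pointer caused by a
    small angular wobble of the hand (small-angle approximation)."""
    return pointer_length_m * math.radians(jitter_degrees)

# A 1-degree wobble produces no translation at the hand itself...
print(f"{tip_displacement(0.0, 1.0) * 100:.1f} cm at the hand")
# ...but sweeps the end of a 3 m pointer by about 5 cm,
print(f"{tip_displacement(3.0, 1.0) * 100:.1f} cm at 3 m")
# and the end of a 10 m pointer by about 17 cm, which is why
# laser-pointer UI targets have to be sized so generously.
print(f"{tip_displacement(10.0, 1.0) * 100:.1f} cm at 10 m")
```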

With Quill’s interface, you retain the full dexterity that your hand is capable of at 1:1 scale, since your hand literally becomes the cursor. The confidence of control afforded by the program’s interface makes it much preferable to many other interfaces I’ve seen in VR.

SEE ALSO
VR Interface Design Insights from Mike Alger

Normally I’d say that an interface like Quill’s is too reliant on old paradigms for VR, but the level of effortless functionality is rather remarkable; building on this—on what we already know about using computer interfaces—might be the best direction to head to find out what’s next.


Battlezone PSVR Dev Diary #3: The Art of Battlezone

In our final developer diary from the team behind Rebellion’s PlayStation VR launch title Battlezone, Lead Artist Sun He reflects on how the team developed and executed the artistic ethos behind the game.

Guest Article by Sun He, Lead Artist on Battlezone – performing magic in 3D!

As an art team beginning work on our very first VR game, we knew we were undertaking a challenge with Battlezone, but our goals were always crystal clear. We wanted to fashion an art style that not only wowed in VR, but also retained a strong visual connection to the original 1980s arcade game. A simple goal to understand then, but very complex to deliver! Here’s how we approached it, and the lessons we learned along the way:

1. Rethink the workflow


At the earliest stages of production we conceptualized the game’s look the traditional way.

Metaphorically, we knew we weren’t about to paint on canvas, but we still tried sketching out ideas on paper. Sure enough, this traditional approach produced results that looked rather different in VR compared to how we imagined.

It quickly became clear just how important it was to consider the scale and spacing in VR as early as possible. In VR, artists are working in a truly digital world. In other words, that meant conceptualizing in 3D and indeed in VR right from the off.

The best analogy for this shift I can come up with is that whereas before VR we were artists making nice paintings of houses, we are now actually building the houses and designing their interiors! It’s quite a jump.

We’d take 3D concept models into VR and scale them, move them around in the 3D space and test them again and again in as many different scenarios as possible, essentially trying to break them! Once an outline was set, we could then finally add detail and texturing.

Special effects were a particularly good example of this. In a traditional approach we would simply generate effects with 2D sprites, but in VR this led to effects that lacked an inherent sense of depth and volume. This, interestingly, was particularly noticeable with larger particles.

If you play Battlezone, you’ll notice a lot of the game has a polygonal feel, from the hexes of the campaign map and the in-game surfaces to the polyhedral pieces of data that spawn from defeated enemies. This is certainly part of the game’s retro-futuristic feel. But with our effects, using a polygonal design allowed us to create effects in 3D meshes. By this I mean instead of drawing a 3D sphere, for example, as a texture to put on 2D particle sprites to create what looks like a sphere in-game, you’re actually using a 3D sphere. And you can see that particularly in the explosions: Bright yellow and orange dynamic polygons that look like lots of smaller shapes combined together. It’s a striking look that really resonates with the retro style but is also very well suited for VR art design.

SEE ALSO
Battlezone PSVR Dev Diary #1: The Importance of Feedback in Uncharted Territory

2. Exaggerate the scale


Creating a game for VR, we of course wanted to create a world that people would naturally look around. One of the ways we tried to achieve this is something we’ve mentioned in a previous post: Using very tall, imposing structures in the vertical space that really hammer home the sense of scale. These work both as visual landmarks and orientation tools in VR, much like you’d use the tallest building as a point of reference in a busy metropolis.

In addition to this, we used a combination of “vanishing points” in scenery to make perspectives feel more exaggerated. A vanishing point is, essentially, the point in your perspective where two parallel lines appear to converge. Try imagining a picture of a road leading to the horizon. At some point you see the two sides of the road meet towards the horizon, essentially disappearing. That’s a simple example of a vanishing point, though it can comprise more than just lines, and it’s often used to simulate 3D in 2D art.

By using multiple vanishing points in the Battlezone scenery, we were able to make our 3D perspectives feel more exaggerated in scale; structures would feel taller and environments even bigger. For instance, during the opening launch sequence, the hangar feels incredibly spacious in VR because we’ve exaggerated the draw distance. And then as the tank lifts you out of the level, your eyes are drawn upwards towards that epic landmark in the sky.

SEE ALSO
Battlezone PSVR Dev Diary #2: Building Levels in VR That Welcome Players Into the World

3. Design a VR-friendly art style


I’d probably describe Battlezone’s look as a retro-futuristic style with very neon, chunky and blocky shapes. We wanted to inherit some of the classic elements from the original Battlezone, like the colour, the neon wireframe and the polygonal look, but at the same time give it the kind of makeover players would expect from a next-gen VR experience.

In early development, Battlezone had a litany of thin neon lines and very, very detailed environments. However, it started to look noisy, with elements a little indistinct in the mid-to-far distance. As artists, we found we needed to be a little bit more restrained in VR.

With that in mind, we began to sculpt the buildings and vehicles into big, blocky shapes at first, and then balance things like the level of detail and the thickness of lines. Once again, we were testing assets early and regularly in VR, so we could have a much clearer idea of how much additional detail we could add.

Battlezone’s chunky neon polygons became the basis around which we chose the colour palette. We tinted environmental themes around it – the volcanic theme has primary and secondary colours of brown and grey, which really contrasted against the neon orange and yellow of things like effects to make everything more pronounced in VR. And placing our player in the cockpit meant we could bring back the game’s classic green look into the displays and user interface right at the front of the view. The end result is something that harkens back to the original arcade game and yet feels undeniably modern, digital and virtual – retro-futuristic, classic but modern, familiar and yet in VR.


It’s been so exciting to be a part of this VR journey, and I’m looking forward to further exploring this new area of gaming and finding more solutions for future development. We are really lucky to be one of the few teams to create art content for a brand new platform in such uncharted territory. I really hope you’ll enjoy Battlezone and appreciate the work that has gone into its art style.


Our thanks to Sun He and the rest of the Rebellion team for putting together these developer diaries. Battlezone is out now on PlayStation VR.


Battlezone PSVR Dev Diary #2: Building Levels in VR That Welcome Players Into the World

In the second of our three-part developer diary series from Battlezone developers Rebellion, Game Designer Grant Stewart writes about making levels that don’t just use VR, but also welcome you into the game world.


Guest Article by Grant Stewart

Grant designs games at Rebellion. Nobody seems to have stopped him yet.


At its heart, Battlezone is an arcade game. It’s designed for frantic, quick bursts of tank-combat gameplay. Some developers have opted for shorter experiences or a more leisurely pace, but we want to get your pulse racing and your trigger finger itching. Every facet of Battlezone feeds into providing this feeling and accentuating it with VR. Enemies swoop and careen around you, explosions light up your view and the battlefield feels alive with action.

Creating the levels to house this action in VR has been a unique experience. We built Battlezone in Rebellion’s in-house engine, Asura. Our tech team crafted a selection of tools for the project that enabled us to rapidly prototype. Every level supports every mission, and each mission can play out in a number of ways in each level.

Our first attempts at crafting environments for tank warfare were inspired by the original 80s and 90s Battlezone games. We cautiously attempted undulating terrain that stretched across vast areas, though some on the team weren’t convinced. The glowing vector mountains of the arcade cabinet became faceted rock formations, polygonal trees dispersed around them. The extraterrestrial landscapes of the Activision strategy games allowed us to zip across levels, flowing under and over rolling hillocks. You can see some of these environments in one of our earliest trailers (along with the old cockpit!)

These all seemed like strong ideas at first, but the longer we played in VR the more we saw the problems. We kept snagging on those polygonal trees and the undulations were beginning to make us feel uncomfortable. So, as ever with this project, we experimented and iterated.

We knew combat had to happen at a variety of ranges and heights; what’s the point of having full freedom of movement if it doesn’t translate into variable gameplay? So we flattened out the hills and tied the plateaus together with easily navigable ramps. We also swapped out the smaller objects for rocks, vents and snow drifts, each offering cover in a fight. We kept some of the speed and all of the freedom, while still circumventing those problems.

After extensive playtesting and iteration, we started to apply more dressing to the levels. The campaign in Battlezone sees you perpetually work your way towards The Corporation AI Core’s volcano lair, a nod to the original Battlezone. Across that journey you fight in five distinct settings: Frozen Wastes, Robotic Metropolis, Neon Cities, Industrial Complexes and Volcanic Ridges. Each theme has its own unique palette and style, and all of them are calibrated to complement the enemy designs.


In addition to the traditional aspects of environmental design, VR gave us something new: Scale. Being wowed not just by the world you see, but the world you are in.

So we ensured each level features a unique landmark, a structure towering above you. In VR they are frankly awe-inspiring – you gaze up and appreciate the size of your surroundings. We embrace this with a moment at the start of every mission. Before the action begins you watch as your cockpit comes online. The shutters around your tank gradually come up and the world slowly comes into focus. It’s an opportunity to marvel before plunging into the action.


As well as being stunning, these structures provide an anchor point for orienting yourself. Knowing where you are helps you get into the action that much faster. As you switch targets from close to long range, ground to air, and so on, you’ll always be able to find the horizon, that landmark and your bearings.

This was especially important to us because each level can play out missions in a variety of ways. Our procedurally generated campaign algorithmically connects missions and maps, so you never know exactly what you’re going to get in each level. We wanted to ensure that any combination offers a unique scenario. Even the placement of structures and enemies is chosen by chance! So with all that in mind, having an imposing landmark enhances the readability of this exotic world, which is something you really need in the heady swell of VR.
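Rebellion hasn’t published how its generator actually works, but the basic idea of algorithmically pairing mission types with themed maps from a seed can be sketched in a few lines; everything below is illustrative rather than the game’s real data or code:

```python
import random

# Hypothetical mission types; the themes are the five named in the diary.
MISSIONS = ["escort", "defend", "salvage", "assault"]
THEMES = ["Frozen Wastes", "Robotic Metropolis", "Neon Cities",
          "Industrial Complexes", "Volcanic Ridges"]

def generate_campaign(length, seed=None):
    """Pair a mission type with a themed map for each step of a campaign,
    so two playthroughs rarely see the same sequence."""
    rng = random.Random(seed)
    return [(rng.choice(MISSIONS), rng.choice(THEMES)) for _ in range(length)]

print(generate_campaign(5, seed=2016))
```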

SEE ALSO
Battlezone PSVR Dev Diary #1: The Importance of Feedback in Uncharted Territory

Battlezone and VR provided us with a lot of unique challenges and opportunities. It’s been a real joy to carve out our gameplay niche on unfamiliar ground. Every aspect of development has been influenced by it. So the whole team and I are proud of what Battlezone has shaped up to be. We can’t wait for you to play it!


Our thanks to Grant for penning this diary entry. Battlezone is a PSVR launch title, available to buy alongside PlayStation VR on October 13.


VR Design Best Practices with Google VR’s Alex Faaborg

At the company’s annual I/O conference, Google announced the Daydream VR platform and mobile headset that will be coming to the latest Android phones later this year. There’s a DIY dev kit that you can use today to start developing Daydream-ready apps, and Google has also released a Google VR Unity SDK that includes a number of Daydream Labs Controller Playground examples to demonstrate different user interactions with the 3DOF controller.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to catch up with Google VR’s Alex Faaborg at the Casual Connect conference where he talked about some of the VR design best practices, some of the early survey results from Google showing an average VR play time of 30 minutes per session, what can be learned from Pokémon Go, the differences between Tango and Daydream app design, social norms of using VR around other people, and the future of conversational interfaces.

Here’s the presentation from Google I/O on Designing for Daydream:


Support Voices of VR

Music: Fatality & Summer Trip


Prototype VR Robot Fighter Reminds us of ‘Pacific Rim’ in the Best Possible Way

This footage of a physics-driven, motion-controlled robot brawler not only looks amazing, but also features an extremely clever VR control mechanic we’d love to get our hands on.

Of all the sub-genres I’d ever considered as “things I’d love to play in VR”, one-on-one robot brawling games wasn’t one of them. But a developer on Twitter, snowskin (@snowdehiski), has released footage of a prototype VR experience for the HTC Vive which puts you in direct motion control of your own giant mech and asks you to punch the hell out of a giant mech-monster. It reminds us in more than a few ways of Pacific Rim (2013).

The reason this looks so promising, though, is not just that it features big robots (although, clearly, it’s off to a head start there), but because of its fusion of physics and a neat decoupled motion control system. By that I mean your in-game hands, as represented by the Vive’s controllers, do not directly map to the robot’s arms; rather, you operate virtual controls which do that instead.

This means there’s comfortable visual parity between your own hand movements and the in-game controls, whilst allowing the robot’s arms not to follow your movements 1:1. This helps with in-game clipping, and in general, since your arms are punching thin air in both the virtual and physical worlds, there should (in theory) be less immersion-breaking. It also allows much more believable animated contact between the in-game robots, as they’re not slavishly following your every action. There’s also the opportunity for a sense of momentum, as the robot’s arms can trace a slower path than your own, alluding to their weight without making the user themselves feel sluggish within the virtual world.
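One common way to get that sense of weight is to have the robot’s arm chase the player’s hand target under a speed cap instead of snapping to it every frame. A minimal sketch of the idea (not snowskin’s actual code; positions here are plain 3-tuples in meters):

```python
def follow_with_momentum(arm_pos, hand_target, max_arm_speed, dt):
    """Move a heavy robot arm toward the player's hand target, but never faster
    than `max_arm_speed` (m/s), so the arm lags and reads as weighty while the
    player's own view of their hands stays 1:1."""
    delta = tuple(t - a for t, a in zip(hand_target, arm_pos))
    dist = sum(d * d for d in delta) ** 0.5
    if dist < 1e-6:
        return arm_pos
    step = min(dist, max_arm_speed * dt)
    return tuple(a + d / dist * step for a, d in zip(arm_pos, delta))

# The arm closes a sudden 2 m punch over many 90 Hz frames instead of teleporting.
arm, target = (0.0, 1.5, 0.5), (0.0, 1.5, 2.5)
for _ in range(5):
    arm = follow_with_momentum(arm, target, max_arm_speed=6.0, dt=1 / 90)
print(arm)
```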

This issue of in-game motion not matching physical feedback is one of the trickier ones faced by VR game designers, with the HTC Vive featuring 1:1 tracked controllers out of the box and Oculus’ Touch controllers on their way soon. One game which demonstrated an interesting way to deal with this issue at E3 this year was the Oculus Rift and Touch title Wilson’s Heart. As you can see from the trailer below, segments of the game which require motion control interaction with virtual scenery or props allow a level of ‘drift’ between your in-game hands and the relative position of your real hands – represented in the game as a ghost version.

At any rate, the tweets above garnered quite a bit of attention, not least from well-known virtual reality enthusiast and Adventure Time creator @buenothebear (aka Pendleton Ward) who, seemingly impressed, simply commented:

There’s no indication if this impressive looking experiment, which appears to be under the working name Big Robot, will see the light of day as anything other than a prototype. If not, we can but hope @snowdehiski considers releasing the WIP version, as I really want to try it out.
