‘Lone Echo’ Behind-the-Scenes – Insights & Artwork from Ready At Dawn [Key Scavenger Hunt]

After more than a decade of working on home and portable consoles, Ready At Dawn last year released Lone Echo and its multiplayer spinoff Echo Arena on the Oculus Rift & Touch. We talked with Ready At Dawn Founder & Chief Creative Officer Ru Weerasuriya to understand how the games came to be, the challenges faced during production, and the future of VR.

Update (2/7/18): In celebration of Lone Echo’s win as Rift Game of the Year in Road to VR’s 2017 Game of the Year Awards, Oculus has provided us with five keys, each valid for a full copy of Lone Echo on the Oculus Store. We’ve hidden them in this behind-the-scenes article, originally published at the game’s 2017 launch.

We’re not going to make you sign up for anything to get them (though if you aren’t already following us, feel free to follow us on Facebook or Twitter, or snag our Daily Roundup newsletter); the keys are hidden in the images of this article. Use these instructions to redeem your code through the Oculus desktop app. After you’ve redeemed your code, it would be mighty kind of you to drop it as a reply in the comments below, so your fellow readers don’t waste time trying to redeem codes which have already been claimed.

Original Article (7/24/17): “Literally, from that one conversation, everything happened,” an animated Ru Weerasuriya told us, talking on the eve of Lone Echo’s launch. He’s referring to a conversation he had with Jason Rubin—Head of Content at Oculus—way back in late 2014.

“We were in the last six months of working on The Order (2015) and we had ideas in the team about what we wanted to do next. I happened to meet Jason at a convention and we just took a little time to the side and started talking. He was telling me about why he had joined Oculus and what they wanted to do there, and I shared some ideas about what Ready at Dawn wanted to do.”

The team had their interest in VR piqued around 2012, at an internal conference for Sony developers.

An early prototype of PlayStation VR, codenamed Project Morpheus | Image courtesy Sony

“Shu was there with the tech team,” Weerasuriya said, referring to Shuhei Yoshida, President of Sony’s Worldwide Studios, “they were showing the very first iterations of ‘Morpheus’ which of course became PSVR. We had a chance to play, a chance to experience what VR would feel like in this generation.”

Clearly that early experience had a big impact on the team.

“Every time before that where VR had started and stopped I don’t think any of us felt, as development and content creators, that we were totally sold on VR. This time around, though, there was something different about Presence. Not just the resolution, but we truly felt that we’d be able to make people feel like they’re in another world.”

SEE ALSO
'Robo Recall' Behind-the-scenes – Insights & Artwork from Epic Games

With development on The Order wrapping up, with Ready at Dawn talking about future projects, and with a familiar face from the game development scene in Jason Rubin arriving at Oculus, there was what Weerasuriya calls a perfect storm.

“But we didn’t want to just start making a VR game, we wanted to figure out what about VR we needed to do differently. And that’s how it started: we created a movement model, trying to figure out how to break certain boundaries in VR. We did a very, very small demo. It was just this little room with the movement model with a controller and two balls that were basically your hands. As you pressed the bumper buttons you would basically reach out to the world and move around. From there Oculus decided to sign it on.”
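Weerasuriya’s description maps onto a pattern that has since become familiar in zero-G VR locomotion: while a hand is gripping the world, moving that hand effectively drags the body the opposite way, and letting go preserves whatever momentum was built up. The sketch below is only our own illustration of that idea; the class, variable names, and playspace-relative hand input are assumptions, not Ready At Dawn’s code.

```python
# Minimal zero-G "grab and pull" locomotion sketch (illustrative only).

class ZeroGBody:
    def __init__(self):
        self.position = [0.0, 0.0, 0.0]   # playspace origin in world space
        self.velocity = [0.0, 0.0, 0.0]   # carried momentum, m/s
        self.anchor = None                # world-space point currently gripped

    def begin_grab(self, hand_local):
        # Bumper pressed with the hand over geometry: remember the world point
        # under the hand so it can act as a fixed anchor.
        self.anchor = [p + h for p, h in zip(self.position, hand_local)]

    def end_grab(self):
        # Let go; the velocity captured on the last gripped frame carries on.
        self.anchor = None

    def update(self, hand_local, dt):
        if self.anchor is not None:
            # While gripping, keep the hand pinned to the anchor by moving the
            # body instead; the per-frame delta becomes the release velocity.
            new_position = [a - h for a, h in zip(self.anchor, hand_local)]
            self.velocity = [(n - p) / dt for n, p in zip(new_position, self.position)]
            self.position = new_position
        else:
            # Zero-G drift: no gravity, so momentum simply carries the player.
            self.position = [p + v * dt for p, v in zip(self.position, self.velocity)]
```

Pushing off works the same way in reverse: whatever velocity is captured on the final gripped frame is what carries the player once the grip is released.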

Prototypes & Development

Space suit concepts | Image courtesy Ready At Dawn

And so work began in earnest in May 2015 on the project that was to become Lone Echo. At the time, however, it had a different name. “It was called Ascendant,” Weerasuriya said, “since we were creating zero-G movement and a zero-G experience how better to do it than to do it in space, in this plausible vision of a future for humanity.”

Turning introspective for a moment, he continued. “What we didn’t understand at the start of the project was what VR was gonna be about. So we started off with the idea of this story-driven single player game, same kinda thing that we’ve done in the past. But we had to relearn the basics of building a game to build it in VR.”

This learning extended to team sizes. “At the very beginning it was a handful of guys, four or five guys. Then after the demos the team went from 5 to 15 people very early on.” This is a number that crops up a lot when talking to VR teams working on AA content. It didn’t stop there, though. “We ended up, at the height of the project, at about 60 people on Lone Echo,” he told us. “It’s a pretty big team. A very big team, for VR definitely.”

From one conversation and the demo it spawned, Lone Echo and its evocative setting began to take shape. Before fully committing to a direction, however, another demo was required to explore the idea further.

“The second demo was a space station and a big ball that was Saturn.” And the visual fidelity? “It was all grey-box, very basic.”

Weerasuriya shared stories about people just hanging out in this demo, as simple as it was, and how the team knew they were on the right track. Remembering back to those days, and indeed our own formative experiences with early VR content, we know exactly why this would have been captivating for people. Indeed, in the final game it’s possible to lose a lot of time just soaking up the atmosphere.

On the subject of time, we shared that Road to VR’s Scott Hayden, in reviewing the game, spent around six hours inside. Was that about par for a playthrough? “That’s actually on the shorter side,” he suggested. Of course a lot comes down to the individual; some people race between objectives on a sprint to the finish, while others stop to smell the virtual roses.

The Tech Behind the Magic

A finished corridor aboard the Khronos II space station | Image courtesy Ready At Dawn

With the team in place it was time to move beyond the grey boxes into a world of 90Hz refreshes and forward renderers. Was there a temptation to jump into engines such as Unity or Unreal to jumpstart development? As it turns out, no.

“Everything is proprietary. It’s still the RAD engine. It’s the engine that shipped The Order, it’s the engine that shipped Deformers.”

Perhaps that answer isn’t surprising. Ready At Dawn is famed in the industry for being able to extract every ounce of power out of a host system, notably with its work on the God of War franchise on PSP and PS3, and more recently The Order: 1886 on PS4.

It’s one thing to build for a more traditional console title, however, and something else entirely to build for VR. When quizzed about the pitfalls and challenges in moving the RAD engine into VR, Weerasuriya seems entirely unfazed: “Those are the barriers that we love having, put it that way.” A little bit of revolution to mix in with the usual technological evolution then.

Were they not perturbed at the challenges of making their engine work with the intricacies of VR rendering?

“The reality is that everything is hard. At the end of the day it doesn’t matter whether you work on mobile, or VR, or PS4. Whatever it is, everything is hard,” Weerasuriya said. It’s a very pragmatic view to take, and suggests a fearlessness in the face of the technical hurdles. “It’s a brand new beast. You have no idea what it’s going to be like.”

“The engine actually adapted quite well where the rendering was concerned. Yes: hitting 90 FPS is a must and therefore we needed to find inventive ways to actually get there, but I trust [our tech team].”

The larger challenges, it turns out, were to come in an area that works hand-in-hand with the pure rendering capabilities of an engine: believable human characters driven by AI.

Continued on Page 2: Believable Characters »


Oculus Teams With Crytek to Share VR Locomotion Experiments

Oculus recently published the first installment of their ‘Developer Perspectives’ video playlists, detailed on the developer blog. Crytek’s VR movement research is presented by Julius Carter, Game Designer at the studio behind the award-winning VR games The Climb and Robinson: The Journey.

Artificial VR locomotion—moving the player around the virtual world in ways that take them beyond their available physical space—is an ongoing challenge for VR developers, due to the need to traverse virtual environments larger than the real play space, and its potential to cause disorientation and nausea if done incorrectly.

In order to figure out what would and wouldn’t work in the context of VR locomotion, Crytek has run a vast number of experiments, many of which informed the locomotion design of their first two VR titles, The Climb and Robinson: The Journey. Over the coming weeks, Crytek says it will publish some 40 videos exploring those experiments and the thinking behind them. The videos will be added over time to this YouTube playlist, which presently has seven published:

Some conventional locomotion techniques, established over decades of screen-based game development, such as WASD, joystick inputs, and button sprinting can cause discomfort for some users in VR, whereas others, such as button jumping, don’t seem to be much of a problem. Alternative rotation methods like snap turning or ‘compass rotation’ have proven to be successful at reducing the chances of nausea, and variations of this are found in many VR titles as a ‘comfort mode’ option.

These methods, together with some problematic alternative tests, such as ‘scaling rotation’, where real head rotation is amplified in VR, are explained in the first set of videos. Carter also describes the importance of using an appropriate test environment for the type of experience you’re creating, and the challenges involved in testing so many types of locomotion. Much of this information will be familiar to VR developers, but it’s a useful starting point for those new to the medium.
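Snap turning is one of the simpler comfort techniques to describe in code: instead of rotating the camera smoothly with the stick (which produces the visual flow most likely to trigger nausea), the player’s yaw jumps in fixed increments when the stick is flicked. The sketch below is our own illustration of the idea, not Crytek’s implementation; the angle and deadzone values are assumed.

```python
# Snap ("comfort") turning sketch: rotate the player in fixed steps on a stick
# flick, rather than smoothly. Illustrative only; constants are assumptions.

SNAP_ANGLE = 30.0   # degrees per snap; often exposed as a user setting
DEADZONE = 0.7      # how far the stick must be pushed to count as a flick

class SnapTurner:
    def __init__(self):
        self.yaw = 0.0              # player yaw in degrees
        self.stick_neutral = True   # require re-centering between snaps

    def update(self, stick_x):
        # A flick turns once; holding the stick does not keep spinning the view.
        if self.stick_neutral and abs(stick_x) > DEADZONE:
            self.yaw = (self.yaw + (SNAP_ANGLE if stick_x > 0 else -SNAP_ANGLE)) % 360.0
            self.stick_neutral = False
        elif abs(stick_x) < DEADZONE / 2:
            self.stick_neutral = True
        return self.yaw
```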


Live Mixed Reality Demo Shows Seamless Cross-AR/VR Collaboration

At today’s Microsoft Build 2017 presentation, creators at Cirque du Soleil demonstrated a custom creative toolset on stage using HoloLens technology. The world-class theater production company says the tools can be used to greatly enhance and accelerate the set design process for future shows thanks to real-time, local, collaborative digital design. But, speaking to Microsoft’s self-described Mixed Reality “spectrum,” the demo also showed how a remote VR user could seamlessly collaborate as if they were standing on stage with the others.

As the largest theatrical production company in the world, Cirque du Soleil is famous for its spectacular contemporary circus shows which use huge sets and equipment. When creating a new show, the design process from initial idea to opening performance takes 18 to 24 months, an operation that could significantly benefit from mixed reality tools.

Representing the first ‘deep collaboration’ between Cirque’s innovative C:LAB and Microsoft, the demonstration involved mocking up a prototype set on the Build 2017 stage using a collaborative augmented reality visualisation.

Equipped with HoloLens headsets, Geneviève Pesant, Product Manager at C:LAB, and Carl Fillion, Scenic Designer at Cirque, appeared to rapidly construct a set. Objects were created, moved, and resized with a speed and precision that was clearly scripted – forgivable as there was a lot to explain in a limited time. The rendering itself certainly appeared to be running in real-time.

After blocking out some shapes the pair were briefly joined on stage by the VR avatar of Michel Laprise, a writer and director at Cirque based in Europe, who appeared to be speaking to them using one of the new Mixed Reality headsets and motion controllers. While brief, this showed seamless collaboration between AR and VR users, and is representative of why Microsoft likes to lump both types of device into the “Mixed Reality” category.

Chantal Tremblay, Director of Creation at Cirque, said “we’re really happy to be part of this development of Microsoft HoloLens. Innovation for us at Cirque is our priority. Today, as we’re growing faster than ever, and we’re creating more and more shows, we really think that this technology will help us to become more agile”.


Crytek-Incubated ‘VR First’ Program to Double Number of Academic VR/AR Labs in 2017

VR First, the global initiative to seed academic institutions with VR/AR hardware and software, today announced that it’s nearly doubling the number of its VR labs in universities and science parks across the globe by the end of 2017. With 26 labs currently operating in the United States, Europe, Asia, and Oceania, VR First aims to bring the number to 40 by the end of the year.

Unveiled last January by Crytek, the now fully-independent VR First program is designed to foster innovation by not only creating independent VR labs where they once weren’t, but by also converting PC labs into AR/VR-ready facilities—and it’s doing it in top universities and science parks across 23 countries including Purdue University, University of Florida, University of Southern California Viterbi School of Engineering, Oklahoma State, and Vancouver Film School.

image courtesy VR First

The idea is to stimulate development in VR where questions can be answered best, and in a place where projects can be developed without the heavy startup cost associated with PC VR.

Currently VR First boasts more than 50 projects in development at the labs, and not just games. In fact, games account for only 35% of projects underway; the rest are focused on fields like psychology and neuroscience (12%), education (7%), tourism (7%), and architecture and real estate (6%). Development is divided evenly across the HTC Vive and Oculus Rift platforms, each accounting for 31% of projects created, with other major VR platforms such as Samsung Gear VR, OSVR, and Daydream rounding out the bottom numbers. While the program was principally incubated by Crytek, students working in VR First labs tend to use Unity (48%), followed by Unreal (20%) and then CRYENGINE (14%).

image courtesy VR First

AR headset distribution is much more lopsided: students are mainly developing on Microsoft HoloLens (43%), with Google Glass, Vuzix AR headsets, Epson Moverio, and Meta AR Dev Kits making up the rest of the pie chart. A full 25% of projects, however, are reported to be using ‘other’ hardware platforms; considering the massive number of Google Glass-style headsets out there, it’s no wonder they didn’t name them all.

With a growing reach comes a bigger voice too, so VR First is also helping to push a new Institute of Electrical and Electronics Engineers (IEEE) standard, called IEEE P2048.5, which focuses on setting quality assurance and testing standards for VR/AR hardware in regards to fully immersive environments. To that end, VR First is also partnering with benchmarking company Futuremark to support benchmarking requirements through its standardization efforts. Through its Lab Renovation Program, VR First is promoting the adoption of these new standards by lab partners, governments and science parks.

development at Carleton University, image courtesy VR First

“The progress VR First has made in just its first year, from 26 labs open and more coming soon, to its growing technology partner network and the unveiling of dozens of VR projects developed at VR First Labs, is excellent momentum to democratize the innovation VR/AR landscape,” explains Ferhan Özkan, co-founder, VR First.  “With more than 65% of universities planning dedicated VR labs, plus science parks and governments interest to do the same, continued growth of our efforts is without question. We invite all industry technology providers and stakeholders to join us in this meaningful program.”

The program, which also includes a network of participating schools, has over 500 universities and science parks worldwide. The broader-reaching network is designed to help developers convert ideas into business opportunities and introduce them to established industry partners.

“VR First is fully independent, and supports VR First labs no matter their choice of engines or platforms, and welcomes all hardware and software providers. We believe that if you have a noble vision like ‘Democratization of VR/AR Innovation,’ it deserves to be protected from any competitive sacrifice. This vision could only be achieved with a vendor-neutral approach,” added Özkan.

VR First Institutions (including new openings)

  • Aalborg University Copenhagen, Denmark
  • Academy of Art San Francisco, USA
  • Bahçesehir University (BAU), Turkey
  • California State University Monterey Bay, USA
  • Canterbury University, New Zealand
  • Carleton University, Canada
  • Comenius University Bratislava, Slovakia
  • Dania Academy of Higher Education, Denmark
  • Darmstadt University of Applied Sciences, Germany
  • Doña Ana Community College, USA
  • Graz University of Technology, Austria
  • HTW Berlin, Germany
  • Ilia State University, Georgia
  • Kajaani University of Applied Sciences, Finland
  • LLC Technology Companies Development Center, Ukraine
  • Manchester Metropolitan University, UK
  • Middle East Technical University, Turkey
  • NHTV Breda University of Applied Sciences, Netherlands
  • North Carolina State University, USA
  • North Metropolitan TAFE, Australia
  • Oklahoma State University, USA
  • Paneuropean University Bratislava, Slovakia
  • Purdue University, USA
  • Rochester Institute of Technology, USA
  • RUBIKA, France
  • Sogang University, South Korea
  • South East European University, Macedonia
  • State University of New York at Oswego, USA
  • Tallinn University of Technology, Estonia
  • Universität Hamburg, Germany
  • Universität Heidelberg, Germany
  • University College Cork, Ireland
  • University College London, UK
  • University of Florida, USA
  • University of Southern California (USC), USA
  • University of the Aegean, Greece
  • Vancouver Film School, Canada
  • VIGAMUS Academy, Italy
  • Vilnius Gediminas Technical University, Lithuania
  • Warsaw University of Technology, Poland

Update 04/13/2017: the article was updated to clarify that VR First initially incubated under Crytek, but is now a fully independent program. 


On the Hunt for VR’s Killer App with Vive’s China President, Alvin Wang Graylin

Everyone in the VR industry can envision a world in the next 10 years that’s radically changed by virtual reality. From healthcare, education, social, training, cinema, gaming, and more, VR has a lot of Killer Use-cases. But it seems most of the industry is in agreement that the Killer App—a single, platform-defining piece of software that compels buyers—has not yet arrived. Vive’s Alvin Wang Graylin weighs in on how we might come to find it.

We’re featuring insights on the hunt for the killer app from virtual reality’s leading companies. Today we hear from Alvin Wang Graylin, HTC’s China President of Vive.

Alvin Wang Graylin

Graylin is the China President of Vive at HTC, leading all aspects of the Vive/VR business in the region. He is also currently Vice-Chairman of the Industry of Virtual Reality Alliance, a group of some 300 member companies; President of the $15 Billion Virtual Reality Venture Capital Alliance; and oversees the Vive X VR accelerator in Asia. He has more than 22 years of business management experience in the tech industry, including 15 years operating in Greater China. Prior to HTC, Graylin was a serial entrepreneur, having founded four venture-backed startups in the mobile and internet spaces, covering mobile social, adtech, search, big data, and digital media. Additionally, he has held P&L roles at several public companies.

Road to VR:
What traits do you think VR’s Killer App needs to have?

Graylin:
The concept of killer app applies more to application specific platforms like a defining AAA game (i.e. Halo) for a specific game console or perhaps Lotus 123/Word Perfect for the original business-focused PC. VR may not really fit this categorization, as its application can/should be much broader than a single user group. It’s like asking what’s the killer app for the Internet. It’s true the initial core users of VR today are largely gamers, but that’s going to change very soon as more high quality content/titles of various categories become available.

SEE ALSO
Vive Consumer and Business Headsets Will Become Increasingly Differentiated VR Systems, Vive President Says

Road to VR:
If you had to make a bet, which sector of VR would you predict as the place where the first Killer App emerges?

Graylin:
Although I believe there will be many ‘Killer apps/content’ for VR, I believe the first type of ‘Killer app’ to attract a mass audience will likely be a VR MMO built upon a big IP… it’ll be much more of an experience/discovery content with extremely high replay value vs. a hardcore game. There are a few such projects already in the works and I am very much looking forward to their release. Maybe the best virtual reality app is just an alternate reality.

Additionally, given the passive nature of the mass market today and its acceptance of basic video viewing, a second natural mass adoption VR use case is 360 degree life streaming. It’ll likely start from 1-2 celebrities streaming access to their exclusive lifestyles, and quickly move into any individual streaming their life’s special moments, then soon after their most mundane moments… It’s just an extension of what people do on Facebook and Instagram or WeChat today. Once 360 degree cameras and streaming are built into low cost devices, this use case will explode.

The real killer application of VR I’m more excited about is core curriculum VR Education. It’ll take longer to gain traction as there are many moving parts involved and the education industry is generally a slow adopter, but when it does happen, it will have the biggest long term impact on our world/society as a whole. It will likely happen first in Asia where governments and parents prioritize their children’s education over all else.

The other impactful use case for VR is in collaboration and productivity. If we no longer need to do business travel or commute to work without compromising effectiveness, how cool would that be? It’ll be possible very soon in VR, and when it happens, the cost/time savings and productivity gains it creates will force companies to adopt it in droves. This may even happen faster than consumer mass market, as price is much more elastic for this market.

Road to VR:
Do you think VR’s Killer App will launch in 2017?

Graylin:
Initial versions of the above applications could happen in late 2017 or the first half of 2018. But it’ll likely be the second half of 2018 or 2019 before mass adoption of such apps really takes off. The reason for the timing is a combination of device availability at mass market price points, an install base that needs time to build up, and high quality apps/content that just take time to create.

SEE ALSO
Vive President Says Next-gen VR Headsets Likely to Come in 1 to 3 Year Cycles

More from the ‘On the Hunt for VR’s Killer App’ Series:


Hands-on: ‘VELOCIBEASTS’ Explores Another Promising Form of Experimental VR Locomotion

Moving around in VR isn’t a solved problem. In five years, the best games and experiences won’t be using point-and-click teleportation. At GDC 2017 last week, I saw two promising new types of experimental locomotion in VR. First was Sprint Vector, which greatly surprised me with its comfort, speed, and fun. And then there was VELOCIBEASTS.

The teleporting movement we see in many of today’s VR games is a crutch. We all know how we want to be able to move in VR: quickly, interactively, intuitively, and comfortably. But how is the question. And until we have an answer, many developers are turning to teleportation. But we need experimentation to continue, and thankfully developers like CHURROBOROS are taking on that mantle.

While it’s technically a type of teleportation—which I have issues with as an immersion breaker—the locomotion technique in Velocibeasts moves away from the worst parts of teleportation by adding increased interactivity, more freedom of movement, and direct integration into gameplay.

The game’s method of locomotion doesn’t just get players through a traditional game world; it’s tied directly into how the game is played. In this case, the player’s weapons can be thrown like discs, and the player can teleport to a thrown disc at any time.

This makes Velocibeasts’ locomotion immediately more immersive because it gets your body into the game in a way that lazily pointing your controller and clicking a button does not. Each movement is a conscious choice because you need to do something beyond just deciding where to go—you need to effectively aim your throw in both direction and distance, something you can actually get better at over time with practice, by honing your physical throwing ability.

This method also enhances freedom of player movement in a way that point-and-click locomotion cannot: the ability to move freely along the Z-axis. Point-and-click locomotion requires your cursor to intersect with some part of the game world, which defines the Z height of your teleport. With Velocibeasts’ thrown approach, players can quickly and easily teleport themselves 100 feet forward and 50 feet upward, unrestricted by where the game’s geometry is placed. Importantly, as above, this capability feeds back into the gameplay.
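In code terms, the core of a thrown-teleport scheme is small: the disc is just a projectile, and the teleport snaps the player’s origin to wherever the disc happens to be, with no requirement that the destination lie on walkable ground. The following sketch is our own illustration of that loop under simple ballistic assumptions, not Churroboros’ code.

```python
# Thrown-teleport sketch: the weapon is an ordinary projectile, and the player
# may snap their position to it at any time. Illustrative only.

GRAVITY = (0.0, -9.81, 0.0)

class ThrownDisc:
    def __init__(self, position, velocity):
        self.position = list(position)
        self.velocity = list(velocity)

    def update(self, dt):
        # Simple ballistic flight; a real game would also resolve collisions.
        self.velocity = [v + g * dt for v, g in zip(self.velocity, GRAVITY)]
        self.position = [p + v * dt for p, v in zip(self.position, self.velocity)]

class Player:
    def __init__(self):
        self.position = [0.0, 1.7, 0.0]
        self.disc = None

    def throw(self, hand_velocity):
        # Direction and distance both come from how the arm is actually swung.
        self.disc = ThrownDisc(self.position, hand_velocity)

    def teleport_to_disc(self):
        if self.disc is not None:
            # No clamping to walkable ground: vertical movement comes for free.
            self.position = list(self.disc.position)
            self.disc = None
```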

SEE ALSO
7 Ways to Move Users Around in VR Without Making Them Sick

When talking about VR locomotion, I always feel it’s important to point this out: Churroboros didn’t invent thrown teleportation. As with all VR locomotion, and just like Survios’ Sprint Vector, they’ve borrowed and remixed other methods and experimented to add their own flavor. This experimentation is healthy and should absolutely continue.

The VR games best making use of teleportation locomotion today are doing the same things as Velocibeasts—ensuring that the locomotion is core to the gameplay and is interactive. And if there’s a narrative, it should be part of that too, not just a crutch to get players through an otherwise normal game world.

Take Budget Cuts, for instance. The lore of the game revolves around the fictional company ‘Trans Corp’, which “develops and refines space-time technology to perfection” and is equipping spies with its tech to infiltrate competitors. I previously called Budget Cuts’ locomotion a “lesson for VR developers”:

Navigation is done with a portalling/blinking system, which essentially means you choose where you want to go and you appear there. We’ve seen systems like this before, and they work quite well to allow players to cross large virtual distances without any motion sickness. But without any context, portaling/blinking navigation can seem out of place. Budget Cuts instead makes sense of the system by making it part of the gameplay and the game lore.

It works like this: the player holds two futuristic-looking multi-tools which can be used to grab objects and store inventory items inside of them. The multi-tools can also turn into a teleportation gun by clicking on the thumb-pad of the Vive Pre controller. From here, the player can shoot the portal gun which lobs a blue ball into the world. This ball will bounce off of surfaces until it hits the ground. Once planted, the multi-tool shows a small circular window of the new location on a floating display which can be articulated using your hands. When you want to teleport to the location, you squeeze the side buttons on the Vive controller and the window into the new location envelops you, suddenly transporting you to the new space.

This system is quick and seamless, and smartly ties into the gameplay, grounding it within the world and making it seem sensible rather than out of place. For instance, at certain points in the game you’ll be blocked by a locked door. If, however, you’re able to find a vent which you can fire your portal ball through, then you’ll be able to teleport to the other side even though you couldn’t fit yourself through the vent.

The portal preview window is also useful; you’ll be trying to avoid patrolling robots, and looking through the window to make sure the coast is clear, before actually teleporting, is a must for remaining undetected. In the game you’ll find yourself lobbing the portal ball around corners and then holding the window up to look in every direction before jumping through.
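Stripped to its essentials, the described system is a bouncing projectile plus a commit step: the ball is simulated until it settles, its resting point becomes the previewed destination, and the player only teleports when they explicitly confirm. The sketch below is our own simplification of that flow (a flat floor and made-up constants), not the game’s actual implementation.

```python
# Portal-ball sketch: lob a ball that bounces until it settles, preview the
# destination, then teleport only on an explicit commit. Illustrative only;
# the real game bounces off arbitrary surfaces, not just a flat floor.

GRAVITY = -9.81
FLOOR_Y = 0.0
BOUNCE_DAMPING = 0.5     # assumed fraction of vertical speed kept per bounce

def simulate_portal_ball(position, velocity, dt=0.01, max_time=5.0):
    """Advance a lobbed ball until it comes to rest; the resting point becomes
    the pending teleport destination shown in the preview window."""
    x, y, z = position
    vx, vy, vz = velocity
    t = 0.0
    while t < max_time:
        vy += GRAVITY * dt
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        if y <= FLOOR_Y:
            y = FLOOR_Y
            vy = -vy * BOUNCE_DAMPING
            if vy < 0.5:          # not enough energy left to bounce: settled
                return (x, FLOOR_Y, z)
        t += dt
    return (x, y, z)

# The teleport itself is a separate, deliberate step, taken only after the
# player has checked the preview window and squeezed the commit buttons.
destination = simulate_portal_ball((0.0, 1.5, 0.0), (2.0, 3.0, 0.5))
player_position = destination
```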

As for Velocibeasts, it’s still quite early. Developer Churroboros says the title is in development with a planned release date at the end of the year. And while I’ve praised the game’s approach to locomotion, I had a fair deal of feedback which I gave to a member of the development team after playing it.

The novel locomotion method is a good step in the right direction and a great seed of an idea, but the present attempt to turn what still feels like an experiment into a full blown multiplayer game feels like a misstep to me.

Even in a basic mouse + keyboard FPS, where all the control and locomotion issues have been long worked out, creating compelling, balanced gameplay is still challenging. Because the experimental locomotion in Velocibeasts is core to its gameplay, finding multiplayer balance for compelling gameplay is going to be doubly hard while continuing to try to hone the locomotion system. Every time the locomotion is tweaked, the balance could be thrown off, and vice versa.

My feeling is that—like Budget Cuts—Churroboros may be better off fleshing out this experimental locomotion and the gameplay built around it in a single-player game or demo before committing to the challenges of multiplayer.


Hands-on: ‘Sprint Vector’s’ Breakthrough Locomotion Could Inspire an Entirely New Genre of VR Games

“Adrenaline platformer” is the apt descriptor that Survios is using for their newly announced title, Sprint Vector. The core of the game is what the company is calling the Fluid Locomotion System, a synthesis of VR movement techniques seen elsewhere that together add up to a supremely satisfying way to move around VR worlds at high speed without getting dizzy.

Moving players through virtual spaces is presently one of VR’s biggest challenges. Basic first-person locomotion—the foundation upon which a major part of the last two decades of game design has been built—makes many users nauseous when applied to VR. The industry has been researching and uncovering new techniques to move players across large virtual spaces in ways that are comfortable. Some of the popular methods are putting players in cockpits (largely applicable to vehicle games), blinking/teleporting (where players click where they want to go and instantly appear there), or no virtual locomotion at all (designing the game to not require any virtual movement).

Except for the cockpit method (which doesn’t work thematically with non-vehicle games), few VR locomotion methods discovered so far allow players to move quickly and immersively across large distances.

SEE ALSO
7 Ways to Move Users Around in VR Without Making Them Sick

Enter Sprint Vector’s new approach to VR locomotion, which has players literally racing through virtual environments by means of direct interaction with the game world. At first glance it’s the sort of virtual movement that VR veterans would suspect would lead to instant nausea. And while it’s too early to say if it will work for every VR player (as nausea can be triggered differently from one player to the next), my hands-on time with the Fluid Locomotion System in Sprint Vector astounded me. Not only did it let me race through virtual space with no nausea, it was also incredibly fun.

So how does it work? At its core, the Fluid Locomotion System works by the player pulling a trigger on their controller and then swinging their arm backward as they release the trigger. This propels the player forward with a bit of momentum. Your other arm does the same thing, and using both in a swinging or running motion gets you into a continuous cycle of propulsion that lets you ‘skate’ through the world. Doing so quickly makes you move even faster. Vibrations in the controllers help you feel how much each swing of your arm is contributing to your momentum, which lets you quickly realize if your form is good or needs adjusting.
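Mechanically, that description reduces to a small feedback loop: measure how fast the hand is moving when the trigger is released, convert a slice of that into forward speed, and let drag bleed it off between strokes. Here is a rough sketch of that loop under our own assumed constants; it is not Survios’ implementation.

```python
# Arm-swing ("skating") propulsion sketch: each trigger release converts part
# of the hand's swing speed into forward momentum, and drag bleeds speed off
# between strokes. Illustrative only; all constants are assumptions.

SWING_TO_SPEED = 0.8   # fraction of the swing speed that becomes body speed
DRAG = 0.8             # per-second decay so speed falls off between strokes
MAX_SPEED = 12.0       # cap so alternating both arms can't grow without bound

class Skater:
    def __init__(self):
        self.speed = 0.0        # forward speed, m/s
        self.distance = 0.0

    def stroke(self, hand_swing_speed):
        # Called when a trigger is released mid-swing; controller haptics would
        # scale with the contribution so good form is something you can feel.
        self.speed = min(MAX_SPEED, self.speed + SWING_TO_SPEED * hand_swing_speed)

    def update(self, dt):
        self.speed *= max(0.0, 1.0 - DRAG * dt)   # drag between strokes
        self.distance += self.speed * dt
```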

And while skating or running in this way is the primary method of movement, it gets seamlessly blended with jumping, flying, climbing, and swerving.

To jump, you pull and release another button on the controller, which gives you a little boost upward, more so if you time it just right. You can double jump too, by doing the same with the other controller while already in the air.
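The same pattern extends naturally to the jump: the impulse scales with how well the release is timed, and a second impulse is available from the other controller while airborne. A small sketch of that rule, with assumed values rather than Survios’ tuning:

```python
# Jump / double-jump sketch: a pull-and-release gives an upward impulse, a
# well-timed release gives a bigger one, and one extra jump is allowed while
# airborne. Illustrative only; the values are assumptions.

BASE_BOOST = 3.0     # m/s upward from any release
TIMING_BONUS = 2.0   # extra m/s when the release is close to the ideal moment
IDEAL_HOLD = 0.3     # seconds of button hold treated as "just right"

class Jumper:
    def __init__(self):
        self.vertical_speed = 0.0
        self.airborne = False
        self.double_jump_used = False

    def release_jump(self, hold_time):
        if self.airborne and self.double_jump_used:
            return   # both jumps already spent until the player lands
        # The boost grows as the release timing approaches the ideal hold.
        closeness = max(0.0, 1.0 - abs(hold_time - IDEAL_HOLD) / IDEAL_HOLD)
        self.vertical_speed += BASE_BOOST + TIMING_BONUS * closeness
        if self.airborne:
            self.double_jump_used = True   # second jump comes from the other hand
        self.airborne = True

    def land(self):
        self.airborne = False
        self.double_jump_used = False
        self.vertical_speed = 0.0
```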

Once you’re in the air, you can also have brief moments of flight. You control your flight by pointing both hands out in front of you like Superman, aiming your direction based on where you point your hands.

Climbing works by grabbing onto special hand-holds on walls and then using the controller to fling yourself upward.

Then there’s swerving, which uses a variety of inputs from your head and hands to let you quickly juke side to side, which comes in especially handy for dodging obstacles that would otherwise slow you down.

It’s clear why Survios is calling this the Fluid Locomotion System; all of these different forms of movement work together cohesively in Sprint Vector to add up to a thrilling race through the virtual world. As a player you feel deeply in control of how you’re speeding through the level, with your ability to weave each skill together determining how quickly you can complete each stage.

Another reason the Fluid Locomotion System is compelling is because it keeps you immersed. Up to now, immersion and movement in VR have largely been a tradeoff. Blinking lets players move across large spaces, but over millions of years our brains have evolved a spatial sense that relies partly upon seeing the world move around us to map our surroundings; constant teleportation in VR is an immersion killer because it doesn’t let you map the virtual world in the same way that you do the real world. The Fluid Locomotion System, on the other hand, lets you see the world as you move through it, and asks you to directly interact with it at every move, further reinforcing the realism of the virtual world around you.

SEE ALSO
'Raw Data' Developer Survios Raises $50 Million, Now Top Funded VR Dev Studio

The significance of Sprint Vector and the Fluid Locomotion System should not be underestimated. Doom (1993) didn’t invent the mechanics of the first-person shooter, but it wrapped up the locomotion and control learnings of prior works into a functional and compelling package that inspired widespread adoption of the game itself and an entire genre to come after it. I think Sprint Vector has a good shot at doing the same.

– – — – –

As for Sprint Vector itself, Survios insists that it’s still very early days for the game, and say they still have lots of improvements and refinements they want to make to the Fluid Locomotion System. So far they aren’t committing the game to any particular VR platforms (though it was demonstrated at GDC on the HTC Vive, so that’s a pretty good bet), and (sadly) aren’t ready to talk about a launch date yet.


Survios’ New Title ‘Sprint Vector’ Could be a Watershed Moment for VR Locomotion

When it comes to VR, best practices say that the player’s in-game movement should be static, and if the player needs to move, it should be within their physical space, or through some locomotion method that’s comfortable in VR, like blinking or teleportation. Then along comes Sprint Vector, the latest title from Survios, the developer of Raw Data, which smashes those best practices with surprising success.

While there are generally recommended rules for how to let players move inside of VR to avoid nausea, every once in a while someone comes along and changes our expectations of the limitations of VR locomotion with some inventive thinking.

SEE ALSO
7 Ways to Move Users Around in VR Without Making Them Sick

Lucid Trips is one such app. Using a combination of players physically ‘pulling’ themselves through the world with their hands, and then pushing off the ground for short bouts of flight, the game’s locomotion—that by all accounts seems like it should be a recipe for instant nausea in VR—works surprisingly well, and made fluid navigation across large spaces work in VR.

Now comes Sprint Vector, which takes this idea and throws it into overdrive with a system the developers are calling the Fluid Locomotion System. Instead of pulling your body through the world, you’re swinging your arms quickly to achieve a fast sprinting motion as you dodge, wall jump, veer, and occasionally fly through the air, guiding your motion with outstretched arms like Superman.


The studio calls the game an “adrenaline platformer that merges the physical thrill of high-octane athletic competition with the unhinged energy of zany interactive game shows, all powered by a proprietary motion system that allows for a new level of immersion.”

From what we’ve seen so far, Sprint Vector turns VR locomotion on its head, moving from the norms of static, cockpit, or blink-based movement systems to a full-speed virtual dash. The game didn’t invent this type of locomotion (and for that matter, neither did Lucid Trips), but it certainly looks to have pushed the approach into all new territory and found a way to build gameplay directly around it.

Now, having built up some skepticism about VR locomotion after watching this space for many years, we wouldn’t trust just anyone to tell us that they’d trashed the generally accepted rules of VR locomotion and gotten away with it. However, Survios, the studio behind the popular Raw Data, is one of the most senior and now the top-funded VR studios in the industry. That is to say, we have a bit of faith that Sprint Vector is more than a little indie locomotion experiment.

We’ll be trying the game for ourselves soon and are eager to find out how it feels to move this fast through VR. If Survios has really made it work, it could be a watershed moment for locomotion in VR, opening up new fast-paced VR gameplay opportunities and, self-evidently, entirely new games.


On the Hunt for VR’s Killer App with Unity’s Head of VR & AR, Tony Parisi

Everyone in the VR industry can envision a world in the next 10 years that’s radically changed by virtual reality. From healthcare, education, social, training, cinema, gaming, and more, VR has a lot of Killer Use-cases. But it seems most of the industry is in agreement that the Killer App—a single, platform-defining piece of software that compels buyers—has not yet arrived. Unity’s Tony Parisi weighs in on how we might come to find it.

Every day this week leading up to the 2017 Game Developers Conference in San Francisco, we’re featuring insights on the hunt for the killer app from virtual reality’s leading companies. Today we hear from Tony Parisi, Head of VR & AR at Unity.

Tony Parisi

Parisi is a virtual reality pioneer, serial entrepreneur, and angel investor. He is the co-creator of 3D graphics standards including VRML, X3D, and glTF, the new file format standard for 3D web and mobile applications. Parisi is also the author of O’Reilly Media’s books on virtual reality and WebGL: Learning Virtual Reality (2015), Programming 3D Applications in HTML5 and WebGL (2014), and WebGL Up and Running (2012). He is currently Head of VR and AR at Unity Technologies, where he oversees the company’s strategy for virtual and augmented reality.

Road to VR:
What traits do you think VR’s Killer App needs to have?

Parisi:
The first killer apps have already made their appearances; they are phenomenal, and more will ship in 2017. That said, with each new innovation and upgrade in hardware/software, a new crop of ‘killer apps’ will be created. It’s really just up to the creators and inventors to break new molds using each iteration of technology. As to today? Well, here are some examples in various genres:

Creativity apps – Tilt Brush
This is a foundational app that represents the evolution of painting/illustrating in the digital medium. While it might seem super obvious, it’s a killer app not only because it blurs the lines between sculpting and painting, but also because it gives anyone the freedom to create in a fully three-dimensional space. Add to that the recent updates that allow for animating your creations, and you’re going to see some amazing works of art.

World Building – Job Simulator
It’s impressive and a great early sign that Owlchemy Labs was able to gross $3 million at this stage in the VR life cycle. Why is this a killer app? It’s one of the most creative takes on what it means to build and interact in a 3D virtual world, immersing users in the future’s take of the present. Like another great title, Fantastic Contraption, they make effective and interesting use of a room scale setup.

Storytelling
Sundance had a plethora of amazing experiences that pushed the boundaries of creativity. ASTEROIDS! utilized animation and interactive storytelling in a beautiful and engaging way that would unfold differently depending on whether you were a passive or interactive user. ZeroDays adapted the documentary format to VR effectively and immersively, and was one of the first of its kind. Overall, we hit a watershed, where the medium has moved beyond experimentation into the first natively-designed VR storytelling experiences that truly take advantage of the technology.

You might wonder how I can be talking about killer apps when there are only a few million devices out there. But I don’t judge an app’s ‘killer-ness’ by current scale. You just need to imagine that these kinds of apps will explode in popularity as the systems are adopted in greater numbers.

SEE ALSO
Unity Raises $181 Million Series C in Anticipation of VR/AR Growth

Road to VR:
If you had to make a bet, which sector of VR would you predict as the place where the first Killer App emerges?

Parisi:
I believe there will be some common attributes of killer VR apps including:

  • Engaged Creativity: What I mean by this is that creativity doesn’t just come from the creators. Killer experiences will need to invite viewers/players/experiencers in and offer them the opportunity to continually engage themselves, engage with the experience, and engage with others who may be in it at the same time. We’re a ways out from the ‘choose your own adventure’ style of VR experiences, but it’s coming.
  • High Production Value: Seems like something that’s self evident, but to fully immerse people, to the point where they may even question what ‘reality’ is, requires thoughtful and thorough storytelling, beautiful and compelling graphics, and increasing levels of interactivity, whether that’s conversations, physical interaction, or social constructs that promote engagement.
  • Social Interactions: I mentioned it earlier, but it’s worth reiterating, social will be a huge component to amazing VR experiences. It will make the difference between immersing yourself and losing yourself. To share is human, and this remains true for VR Killer Apps.

Beyond all that, everything else is about fully utilizing the features that the software and hardware allow (motion, gestures, movement, graphics, etc.).

Road to VR:
Do you think VR’s Killer App will launch in 2017?

Parisi:
I don’t think there’s just one ‘killer app’, there will be many, across many different industries, and 2017 promises to reveal a lot more amazing apps. We are seeing Unity being used to develop applications in education, training, healthcare, design, manufacturing, film, and automotive, to name just a few. Most of these applications are still at the stage of early experimentation, with organizations figuring out how to make their 3D models and scenes interactive VR applications using Unity as a platform. Aside from understanding how to tell stories, there is a slew of technical and design work to be done, but the foundation is being laid now and it’s exciting to see.

See Also: 2017 Is Going to be a Watershed Year for Cinematic VR


On the Hunt for VR’s Killer App with Epic Games’ Technical Director of VR & AR, Nick Whiting

Everyone in the VR industry can envision a world in the next 10 years that’s radically changed by virtual reality. From healthcare, education, social, training, cinema, gaming, and more, VR has a lot of Killer Use-cases. But it seems most of the industry is in agreement that the Killer App—a single, platform-defining piece of software that compels buyers—has not yet arrived. Epic’s Nick Whiting weighs in on how we might come to find it.

Every day this week leading up to the 2017 Game Developers Conference in San Francisco, we’re featuring insights on the hunt for the killer app from virtual reality’s leading companies. Today we hear from Nick Whiting, Technical Director of VR & AR at Epic Games.

Nick Whiting

Whiting oversees the development of the award-winning Unreal Engine 4’s virtual reality efforts. In addition to working on Robo Recall, Bullet Train, Thief in the Shadows, Showdown, and Couch Knights for VR platforms, he has also helped ship titles in the blockbuster Gears of War series, including Gears of War 3 and Gears of War: Judgment.

Road to VR:
What traits do you think VR’s Killer App needs to have?

Whiting:
To me, the Killer App has to be something that uniquely justifies the medium. It needs to be an app that brings people into the VR ecosystem because there simply isn’t any other way to get the same experience. That’s the defining characteristic!

Right now, we’re still very early in VR. As with other mediums, we’re in a period that is largely comprised of imitation of other media. For VR games, we’re largely imitating the canon of 3D games that’s been developing since the ’90s. For entertainment, we’re largely using the same techniques of framing and timing from film, but adapted a little bit to make it feel better in VR.

This isn’t anything new! You can see the same pattern in early films, which were largely imitations and recordings of stage plays or common events. It’s easy to forget that the grammar of cinematography we know today took decades to develop! The same was true with games, which imitated sports, comics, and movies for many years before they really started breaking new ground.

Epic’s Unreal Engine 4 is one of the leading tools for VR game development. It also powers ‘Robo Recall’, the company’s first ‘full’ VR title, which is due to launch by the end of Q1 2017.

This wave of consumer VR has only really been around for a little over a year now, so I think we still have a little bit of time to go before we develop those ideas that are unique to the medium. I don’t know what they will specifically be, but I think we can hazard a guess based on the strengths.

Experiences out there right now, if we’re honest with ourselves, largely rely on the novelty of the experience of the hardware. As we all know, it’s magical the first time you put on a headset and can look around! However, without compelling content, that novelty wears off, and those experiences don’t seem quite as compelling. To make something with staying power, we need to identify what makes the medium unique, and figure out how to leverage that.

To me, the most important feature of VR is what I like to call “immersive interaction.” The idea is similar to presence, but centers more around the fact that unlike any other medium, you’re physically represented in the world, as well as your direct actions. You can not only look around, but reach out and grab things in a way that a game with its pre-baked animations can’t really match. You’re part of the action, and that builds the magical sense of presence. Because of that, I believe that the killer app must include interaction with motion controllers. It takes the immersive visuals of VR, and adds immersive interaction, which truly lets you be internalized as part of another world.

I think another very powerful extension of this is the social aspect. Social experiences in VR are so compelling because we track real human motion. So, if I nod at you, all the parts of your brain that are trained to recognize that motion do, and you feel the presence of another human in a shared space. That’s something 2D video can’t match, and something uniquely powerful for the medium. Multiple people sharing the same virtual space with such intimacy can’t be replicated without VR. As tracking technology improves, this could truly be something that is revolutionary.

Road to VR:
If you had to make a bet, which sector of VR would you predict as the place where the first Killer App emerges?

Whiting:
Depending on how you define it, the “killer app” might already be here for enterprise. While it doesn’t move tons of headsets, or have the flash of entertainment applications, we’re already starting to see huge wins in terms of savings and cost reduction in enterprise applications, which is causing steady growth for VR usage.

A simple example is in the architecture and construction industries. When a client orders a multi-million dollar building, the architect has to do his best to give the client an idea of what the finished product will be like, years before it’s even built. While renderings and previsualization can give you a great sense of the style and aesthetics of a building, it is distinctly lacking in some of the “human factors” of how the space feels. Because of this, large-scale projects often spend large amounts of money after construction has begun to redesign and redo work once the client has been able to physically stand in a space. Savvy builders and architects have realized that this can be greatly reduced through putting the client in a VR mockup of the space, which allows them to get a better feeling for the final product, and make those changes while it’s still on paper, rather than already half-built!

SEE ALSO
HoloLens App Envisions Immersive Future for Architecture and Construction

You can easily expand that to many other areas of engineering and design, where ideas and concepts have to go through separate teams in order to bring a product to fruition. Being able to have everyone visualize a product while it’s still in early planning helps ensure that everything from design to construction to training can be accomplished before any fabrication has begun. That’s a huge cost and time savings, and given VR’s proliferation in those industries, I think that deserves to be called a Killer App.

Road to VR:
Do you think VR’s Killer App will launch in 2017?

Whiting:
Of course, because Robo Recall launches Q1 2017! In reality though, I think that 2017 is somewhat optimistic for a killer app. Great content, yes. But, killer apps are built on the shoulders of countless lessons learned from the apps that came before them. While we’re starting to see a lot of great content trickle out (the mainstream market attention of Resident Evil 7 and Rez Infinite are great indicators), you have to remember that great content generally takes two or three years to develop.

SEE ALSO
Latest Figures Suggest 'Resident Evil 7' Could Have Some 280,000 PSVR Players

It was only last year that consumer headsets were first widely commercially available, and it was only a few months before that when the big players announced release dates and pricing. Because of that, many of the traditional funding vehicles that create killer content didn’t kick in until a little over a year ago. That means many projects that took that initial round of funding still have about a year to go before they see the light of day. Because of that, I think 2018 is going to be the year where we start seeing a wider variety of great content from a variety of developers, and hopefully our killer app is somewhere in that batch.


More from the ‘On the Hunt for VR’s Killer App’ Series:
