‘Rick and Morty: Virtual Rick-ality’ Review

You’re dying for season 3 of Rick and Morty (2013) to come out, and the release of episode 1 on April Fool’s Day isn’t helping. You’ve got a fever that only the drunken ramblings of the genius Rick Sanchez and his level-headed, albeit hopelessly outmatched grandson Morty Smith can cure. The good news: Rick and Morty: Virtual Rick-ality (2017) is here to fill the void in your meaningless existence. The less good news: it’s basically Job Simulator (2016) expertly grafted to an episode of Rick and Morty. And you know what? Th-th-th*ugghhb*at’s just fine by me, Jack. Don’t know why I’m calling you Jack all of a sudden. Let’s just get on with the review.


Rick and Morty: Virtual Rick-ality Details:

Official Site

Developer: Owlchemy Labs
Publisher: Adult Swim Games

Available On: Steam (HTC Vive, Oculus Touch), Home (Oculus Touch)
Reviewed On: HTC Vive, Oculus Touch
Release Date: April 20, 2017


Gameplay

In Owlchemy Labs’ new Rick and Morty VR game, you’re lower than the low. Not only are you a Morty, but you’re a Morty-clone who has less purpose (and respect) in life than a butter-fetching robot. The only thing that might be construed as a lower being on the totem pole of galactic intelligence in the game is a Mr. Meeseeks, cleverly renamed Mr. You-seeks for the purpose of the game, of which you have an infinite supply. But all he does is mirror your movements, letting you pick up objects that fall outside your teleportation range, making you basically the lowest life form in the entire multiverse.

image courtesy Adult Swim Games

It all starts one day when Rick, in his infinite wisdom, conjures you up to do the simple task of cleaning his clothes. Open the washer, pop in the suds and dirty clothes, hit a button, and you’re done. Game over. But not quite. From there you take on grander tasks, like retrieving “important parts” (for his spaceship), fixing the toilet, drinking gasoline—you know, menial Morty-tasks that need doing while the real Morty goes with Rick on actual adventures.

I genuinely started to feel jealous of my namesake as he flew away on Rick’s spaceship or hopped through portals while I was stuck in the Smiths’ garage charging micro-verse batteries, ordering parts online to fix more “important things”, or feeding an alien laxatives. If you can get over the fact that you’ll never truly have that free-wheeling Rick and Morty adventure so tantalizingly close to your grasp, and that you will invariably be the butt of every joke, you’ll begin to see the game for what it is: a true glimpse into the Rick and Morty universe, one that’s masterfully stitched into Job Simulator’s object interaction.

image courtesy Adult Swim Games

Even though your tasks are essentially meaningless—and believe me, there’s plenty of plumbus-bopping and bottle-smashing—the patently absurd story arc playing out before you really makes you feel like you’re in an episode of the show, albeit a subplot to a grander adventure waiting behind Rick’s portal. In unmistakable Rick-like fashion though, eventually the old man’s machinations are revealed, giving the inane object bashing that much more importance and authenticity.

Easter eggs are also everywhere, with 13 collectible mix tapes featuring silly songs and ramblings from the show’s characters. The fictional VR game Roy: A Life Well Lived, made famous in the Season 2 episode “Mortynight Run” (2015), also makes an appearance in the guise of a knockoff called TROY, complete with cardboard cut-outs to give it that cheap-o feel.

image courtesy Adult Swim Games

Rick’s sci-fi ‘combining machine’ alone will keep you mixing and matching in an effort to create the weirdest object combinations (think growth hormone + plumbus). I played through with minimal faffing and completed the main story in a little over two hours, but if you’re hunting for every last one of the game’s Easter eggs, it could take you much longer.

Immersion

The brilliance of the Rick and Morty TV show is how it reaches through your television and grabs you by the ears, sometimes directly by breaking the 4th wall, but oftentimes by disarming you with absurdity while delivering powerful messages on mortality, loss—you know, the human condition. The VR game is all of this and more. You only need a few minutes in Purgatory after your first death, listening to the devil’s secretary explain why you shouldn’t reanimate back into the game, to see what I mean.

From Rick’s lovingly recreated garage-lab to all of the interactive items ripped straight from the show (including low-poly 3D versions of Rick, Morty and Summer), there’s a feeling of familiarity that fans will definitely click with. But there’s something more insidious lurking in Rick and Morty: Virtual Rick-ality.

image courtesy Adult Swim Games

The show’s characters get in your head in VR in a way the TV show just can’t. Because you’re physically in front of the almighty Rick (voiced by show creator Justin Roiland), you can’t help but seek his approval, if only so he doesn’t dismiss you as just another stupid Morty-clone. You begin to wear Morty’s persona, the sycophant grandchild who just wants to please his all-powerful grandfather. If you do a job right the first time, you might get a backhanded compliment like “Hey, it looks like this Morty-clone isn’t a complete pile of flaming garbage after all.”

And that’s when I started understanding something about the game: you just aren’t good enough to go on a real adventure with Rick. Hell, the real Morty barely is. Sure, there are action sequences with the promise of multiple deaths around the corner, but these are remarkably few in number, and stink of Rick’s characteristic manipulation. It isn’t a real adventure at all. And yet somehow, all of this is okay given the absurdity of both Job Simulator and the show itself.

getting instructions from Rick via wristwatch, image captured by Road to VR

All of this is done in a beautifully rendered environment that easily mashes up with the show’s hand-drawn feel. It’s like living in your favorite cartoon (if Rick and Morty is your favorite cartoon, that is).

Comfort

Getting to the nitty-gritty, Rick and Morty: Virtual Rick-ality offers many of the same features as Job Simulator, including its ‘smaller person’ mode that scales down your environment so you can access things more easily. Despite this, the game is very much a standing experience that requires at least 2m x 1.5m (about 6.5 x 5 feet) of play space. Object interaction is exactly the same as in Job Simulator; bottles have poppable corks and jars have screwable tops, i.e. almost everything is interactive and articulated enough to seem plausibly real.

There are three nodes you can teleport to, all of them inside the garage. This makes it an ultimately very comfortable experience, one that requires little explanation to master (even a 6-year-old could do it).

Strangely enough, the Oculus Rift version doesn’t offer any form of ‘comfort mode’ snap-turn for people with only a two-sensor setup, which, considering the 360-degree nature of the game, may initially sound like a no-go for anyone without at least three sensors. Despite this, I found most interactions to be forward-facing, so I didn’t have to deal with Touch tracking issues all that often. The HTC Vive’s standard Lighthouse tracking predictably handles all room-scale interactions with ease.

Check out the first 10 minutes of gameplay to get a better idea of just what Rick and Morty: Virtual Rick-ality has to offer.


We partnered with AVA Direct to create the Exemplar 2 Ultimate, our high-end VR hardware reference point against which we perform our tests and reviews. Exemplar 2 is designed to push virtual reality experiences above and beyond what’s possible with systems built to lesser recommended VR specifications.

The post ‘Rick and Morty: Virtual Rick-ality’ Review appeared first on Road to VR.

Facebook is Researching Brain-Computer Interfaces, “Just the Kind of Interface AR Needs”

Regina Dugan, VP of Engineering at Facebook’s Building 8 skunkworks and former head of DARPA, took the stage today at F8, the company’s annual developer conference, to highlight some of the research into brain-computer interfaces going on at the world’s most far-reaching social network. While it’s still early days, Facebook wants to start solving some of the AR input problem today by using tech that will essentially read your mind.

Over the past six months, Facebook has assembled a team of more than 60 scientists, engineers and system integrators specialized in machine learning methods for decoding speech and language, in optical neuroimaging systems, and in “the most advanced neural prosthesis in the world,” all in an effort to crack one question: how do people interact with the greater digital world when they can’t speak and don’t have use of their hands?

Facebook’s Regina Dugan, image courtesy Facebook

At first blush, the question may seem like it’s geared entirely at people without the use of their limbs, like those with locked-in syndrome, a malady that causes full-body paralysis and an inability to produce speech. But in the realm of consumer tech, making even what Dugan calls a simple “brain-mouse for AR”—one that lets you click a binary ‘yes’ or ‘no’—could have big implications for the field. The goal, she says, is direct brain-to-text typing: “it’s just the kind of fluid computer interface needed for AR.”

While research regarding brain-computer interfaces has mainly been in service of these sorts of debilitating conditions, the overall goal of the project, Dugan says, is to create a brain-computer system capable of letting you type 100 words per minute—reportedly 5 times faster than you can type on a smartphone—with words taken straight from the speech center of your brain. And it’s not just for the disabled, but targeted at everyone.

“We’re talking about decoding those words, the ones you’ve already decided to share by sending them to the speech center of your brain: a silent speech-interface with all the flexibility and speed of voice, but with the privacy of typed text,” Dugan says—something that would be invaluable to an always-on wearable like a light, glasses-like AR headset.

image courtesy Facebook

Because the basic systems in use today don’t operate in real-time and require surgery to implant electrodes—a giant barrier we’ve yet to surmount—Facebook’s new team is researching non-invasive sensors based on optical imaging, which Dugan says would need to sample data hundreds of times per second and be precise to millimeters. A tall order, but technically feasible, she says.

This could be done by bombarding the brain with quasi-ballistic photons, light particles that Dugan says can give more accurate readings of the brain than contemporary methods. When designing a non-invasive optical imaging-based system, you need light to go through hair, skull, and all the wibbly bits in between, and then read the brain for activity. Again, it’s early days, but Facebook has determined optical imaging to be the best place to start.

The big picture, Dugan says, is about creating ways for people to connect even across language barriers by reading the semantic meaning of words behind human languages like Mandarin or Spanish.

Check out Facebook’s F8 day-2 keynote here. Regina Dugan’s talk starts at 1:18:00.


Oculus’ New 360 Capture SDK Lets You Create and Share In-Game 360 Videos and Photos

Facebook F8, the company’s annual developer conference, is currently in full swing on its second and final day. Today’s big VR announce from the social platform giant? Facebook’s VR headset subsidiary Oculus is finally integrating a native tool that will let you capture and share your Rift experiences through 360 photos and videos—one that developers can integrate into Unity and Unreal Engines, and across NVIDIA and AMD GPUs—and it’s available today on GitHub.

This hopefully also means we’re getting something Oculus promised as a feature back when Home was initially announced for the consumer Rift in the summer of 2015: 360 video previews of games. While this is playful speculation—and such previews would greatly improve the buying experience for a store almost entirely bereft of demos—the 360 Capture SDK will at the very least let users show off their in-game exploits via Facebook’s News Feed or directly in a VR headset, so that the next Road to VR review can feel more immersive to prospective customers.

According to Oculus software engineers Homin Lee and Chetan Gupta, fleshing out the functionality of the SDK wasn’t as simple as capturing a 360 video and stitching it together, though.

“We solved the problem by rethinking the way 360 content is created,” write Lee and Gupta. “Typically, the process starts by capturing various photos, stitching them together, and then finally encoding them. Previously, we needed to capture the content within a game engine, while ensuring we could produce a high-quality image quickly and on baseline hardware for VR. Now, all that’s possible with the 360 Capture SDK.”

Instead of traditional capture and post-process stitching—a process similar to what 360 cameras do, and one that requires serious specs to accomplish—Oculus’ 360 Capture SDK uses cube mapping, a technique which promises to run “on baseline recommended hardware for VR without compromising the experience,” meaning you can still hit those critical 90 frames per second for VR while capturing 360 videos and photos.

Oculus is touting video resolution of 1080p for News Feed viewing, and 4K for in-headset VR viewing. Video is, however, capped at 30 fps—but considering conventional stitching methods take anywhere from 20 to 40 seconds to capture and stitch 360 photos alone, Oculus’ cube mapping technique seems to be best in class.

The SDK plugin is said to be compatible with multiple game engines, including Unity, Unreal, and even native engines. NVIDIA and AMD GPUs are also natively supported, meaning most, if not all, of the Rift ecosystem can capture 360 photos/videos and share them with the rest of the VR community. You can bet we’ll be checking to see if it works with the HTC Vive through the ReVive hack (Facebook Spaces social VR is confirmed).

According to Lee and Gupta, there are a few inherent benefits of using cube maps besides speed of capture:

  • They don’t have geometry distortion within the faces: each face looks exactly as if you were viewing the object head-on with a perspective camera, rather than warped or transformed along with its surrounding area. This is important because video codecs assume motion vectors are straight lines, which is why cube maps encode better than the bent motion paths of the equirectangular format.
  • Their pixels are well-distributed—each face is equally important. There are no poles as in equirectangular projection, which contains redundant information.
  • They’re easier to project. Each face is mapped only onto the corresponding face of the cube. We realized we could skip the stitching process and instead use the game engine to natively capture a cube map—saving us on performance and speeding up our efforts. As an added bonus, the cube map content was actually higher quality than stitched content, because we didn’t lose quality in stitching and converting to equirectangular.
transforming a capture into a cube map, image courtesy Facebook
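The bullet points above boil down to simple geometry: each cube face is just a 90-degree perspective view along one axis. Here is a minimal sketch of the mapping from a face pixel to a 3D view direction (illustrative only—this is not code from the 360 Capture SDK, and the exact face orientations vary between graphics APIs):

```python
import math

def cubemap_dir(face, u, v):
    """Map a pixel (u, v in [0, 1]) on a cube face to a unit view direction.

    Uses one common convention for the six faces ('+x', '-x', '+y', '-y',
    '+z', '-z'); real APIs differ in axis orientation.
    """
    a, b = 2.0 * u - 1.0, 2.0 * v - 1.0   # face coordinates in [-1, 1]
    x, y, z = {
        '+x': (1.0, -b, -a),
        '-x': (-1.0, -b, a),
        '+y': (a, 1.0, b),
        '-y': (a, -1.0, -b),
        '+z': (a, -b, 1.0),
        '-z': (-a, -b, -1.0),
    }[face]
    n = math.sqrt(x * x + y * y + z * z)
    return (x / n, y / n, z / n)

# The center of each face (u = v = 0.5) looks straight down its axis, and
# because this is a plain perspective projection, straight lines in the
# world stay straight on the face -- the codec-friendly property Lee and
# Gupta describe, with no equirectangular poles or warping.
```

Rendering the six faces with a 90-degree field-of-view camera is all a game engine needs to do to produce a full cube map, which is why no stitching step is required.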

Check the Facebook blog post for more info on how Facebook creates cube maps for 360 video.


Facebook Unveils Two New Volumetric Video ‘Surround360’ Cameras, Coming Later this Year

Facebook today announced two new additions to the Surround360 hardware initiative that are poised to make 360 video more immersive. Unveiled at the company’s yearly developer conference, F8, the so-called x24 and x6 cameras are said to capture 360 video with depth information, giving captured video six degrees of freedom (6DoF). This means that, in addition to rotating your view (pitch, yaw and roll) as before, you can now move your vantage point up/down, left/right, and forwards/backwards while inside a 360 video.
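For the unfamiliar, the jump from rotation-only 360 video to 6DoF is easy to see in code: a pose is three rotations plus three translations, and 6DoF capture simply makes all six meaningful. The sketch below is purely illustrative (an assumed rotation order, not anything from Facebook’s pipeline):

```python
import math

def apply_pose(point, tx, ty, tz, yaw, pitch, roll):
    """Apply a 6DoF pose to a 3D point: three rotations (radians) plus
    three translations. Rotation order here is roll (about z), then
    pitch (about x), then yaw (about y) -- one common convention.

    A rotation-only (3DoF) 360 video effectively fixes tx = ty = tz = 0.
    """
    x, y, z = point
    # roll: rotate about the z axis
    x, y = (x * math.cos(roll) - y * math.sin(roll),
            x * math.sin(roll) + y * math.cos(roll))
    # pitch: rotate about the x axis
    y, z = (y * math.cos(pitch) - z * math.sin(pitch),
            y * math.sin(pitch) + z * math.cos(pitch))
    # yaw: rotate about the y axis
    x, z = (x * math.cos(yaw) + z * math.sin(yaw),
            -x * math.sin(yaw) + z * math.cos(yaw))
    # translation: the three extra degrees of freedom 6DoF capture enables
    return (x + tx, y + ty, z + tz)
```

The depth information the x24 and x6 record is what lets playback software re-project the scene for those three translation terms, instead of only re-orienting a fixed sphere of pixels.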

Even the best stereoscopic 360 videos can’t currently provide this sort of movement, so the possibility of a small, robust camera that can is pretty exciting—because let’s face it, when you’re used to engaging with the digital world thanks to the immersive, positional tracking capabilities of the Oculus Rift, HTC Vive, or PSVR, you really notice when it’s gone. Check out the gif below to see exactly what that means.

Originally announced at last year’s F8 as an open source hardware platform and rendering pipeline for 3D 360 video for VR that anyone could construct or iterate on, Facebook is taking their new Surround360 reference designs in a different direction. While Facebook doesn’t plan on selling the 360 6DoF cameras directly, the company will be licensing the x24 and x6 designs—named to indicate the number of on-board sensors—to a select number of commercial partners. Facebook says a product should emerge sometime later this year.

The rigs are smaller than the original Surround360, now dubbed the Surround360 ‘Open Edition’, and critically smaller than unwieldy volumetric capture setups like HypeVR’s high-end camera/LIDAR rig.

Specs are still thin on the ground, but the x24 appears to be around 10 inches in diameter (257mm at its widest, 252mm at its thinnest), and is said to capture full RGB and depth at every pixel in each of the 24 cameras. It is also said to oversample 4x at every point in full 360, providing “best in-class image quality and full-resolution 6DoF point clouds.”

The x6, although not specified, looks to be about half the diameter at 5 inches, and is said to oversample by 3x. No pricing info has been made public for either camera.

Facebook says depth information is captured for every frame in the video, and because it outputs in 3D, video can be fed into existing visual effects (VFX) software tools to create a mashup of live-action capture and computer-generated imagery (CGI). Take a look at the gif below for an idea of what’s possible.

Creating good-looking 6DoF 360 video is still an imperfect process though, so Facebook is also partnering with a number of post-production companies and VFX studios to help build out workflows and toolchains. Adobe, Otoy, Foundry, Mettle, DXO, Here Be Dragons, Framestore, Magnopus, and The Mill are all working with Facebook in some capacity.

“We’ve designed with Facebook an amazing cloud rendering and publishing solution to make x24’s interactive volumetric video within reach for all,” said Jules Urbach, founder and CEO of Otoy. “Our ORBX ecosystem opens up 28 different authoring and editing tools and interactive light field streaming across all major platforms and browsers. It’s a simple and powerful solution this game-changing camera deserves.”

Keep an eye on this article, as we’ll be updating information as it comes in.


Facebook Launches New Camera Tools as a Foundation for Advanced Augmented Reality

F8, Facebook’s yearly developer conference, is here again for a two-day info blast from the world’s biggest social platform. Founder and CEO Mark Zuckerberg took the stage today during the keynote address to announce a new initiative that will bring about the beginnings of the company’s augmented reality (AR) platform, an “open platform” called Camera Effects that he says will form the basis of Facebook’s AR future.

Zuckerberg maintains that the smartphone will be Facebook’s first big leap into building the future of AR, triumphantly announcing that the camera will be “the first augmented reality platform.” To do this, the company is asking developers to build AR apps to work with the newly announced Camera Effects Platform, a platform that lets developers create AR effects for the Facebook camera like masks, special effects, etc.

A closed beta of the platform is launching today.

For now the AR-focused Camera Effects platform is necessarily a smartphone-only affair, but promises to provide the basis for some of the interactions that will take place in the perfect ‘glasses form-factor’ AR headset of the future.

“I used to think glasses were going to be the first mainstream augmented reality platform, and that maybe 5-10 years from now we’d get the form factor that we all want,” Zuckerberg admits. “But over the last couple of years, we’ve started to see primitive versions of each of these use-cases on our phones and camera.”

image captured by Road to VR

“We look at [these basic applications] and we see something different. We see the beginning of a new platform,” Zuckerberg told the crowd. “We’re not using primitive tools today because we prefer primitive tools. We’re using primitive tools because we’re still early in the journey to create better ones. And in order to create better tools, first we need an open platform where any developer in the world can build for augmented reality without having to first build their own camera and get a lot of people to use it. But when you look around at all of the different cameras that are out there today, no one has built a platform yet.”

According to Facebook’s Camera Effects page, two main tools take center stage on the platform: AR Studio, a collection of tools for 3D artists and developers who want to create effects for photos and videos, including masks, animated frames, and other augmented reality interactions; and Frame Studio, a way of creating interesting, albeit static, overlays.

Zuckerberg maintains that ‘AR building blocks’ like simultaneous localization and mapping (SLAM), 3D effects, and AI-driven object recognition are fundamental to creating apps with the Camera Effects platform going forward. To demonstrate, Zuckerberg revealed a visual concept showing the Facebook camera recognizing a number of objects in real-time and making them interactive—a green plant sprouting flowers, or a real-world bottle of wine displaying extra information like its vintage and place of production.

“Some of these effects are going to be fun, and others are going to be useful,” he said.

“This will all go into the glasses that we want. It’s all the same technology, and this is a step to the way there,” Zuckerberg said.


‘Mindshow’ Creative App Launches Open Beta in Q3 2017, Watch New Trailer Here

Mindshow, the app from Visionary VR that lets you make animated movies in VR with your body and voice, is launching into open beta sometime in Q3 2017.

The news comes via the description of a new promo video released today on the company’s YouTube channel; the makers of Mindshow haven’t nailed down a definite release date yet, but have said the app will launch on the HTC Vive. Although not specified on the company’s website, a TechCrunch report maintains Mindshow will also roll out on the other major VR platforms, including both PSVR and the Oculus Rift, according to a chat with CCO Jonnie Ross and CEO Gil Baron.

image courtesy of Visionary VR

First revealed a year ago at VRLA Summer Expo 2016, the application aims to give anyone access to simple tools to tell stories in virtual reality. The company had raised $6 million in venture capital before coming out of stealth last year.

Mindshow allows users to choose a scene, stage the set with props, drop characters into the scene, and even animate them using motion capture through the HTC Vive. Once everything is set, users can record the scene and capture voice-over as they are acting out the scene.

And if the promo tells us anything about the actual app, it’s going to be awesome.


VR’s Trippiest Social-Music Platform ‘TheWaveVR’ Launches on Steam, Adds $4M in Seed Funds

Social VR platform TheWaveVR is now in open beta and delivering weekly DJ sets in its wild and weird environment filled with psychedelic interactive art. Distributing free of charge on Steam Early Access for Oculus Rift and HTC Vive, the platform lets you watch, host, and socialize in shows while music is mixed live.

The company has been developing TheWaveVR for a year, and while it’s basically a social VR platform that lets you invite friends, chat, and explore—like AltspaceVR, RecRoom, or VRChat—the software centers on hosting weekly virtual concerts where select musicians mix music live. These so-called “Wave Shows” are said to be picking up in frequency over the coming months, likely serving as cornerstones of the platform’s general draw.

To further its development, the company has also garnered an additional $4 million in seed funds led by Upfront Ventures in Los Angeles, bringing total funds to $6.5 million. Other investors in the second seed round include RRE Ventures, KPCB Edge, Greycroft VR Gaming Tracker Fund and The VR Fund.

“We’re hyper focused on creating the most new and engaging social experiences for music content in VR,” said CEO and Co-Founder Adam Arrigo, who spent seven years working for Harmonix, creators of Rock Band and Dance Central. “We think the potential of this new medium isn’t in replicating reality, but amplifying it, so we want to give fans interactive experiences that can only be achieved in virtual reality.”

image by Road to VR

The platform does this via a virtual mixing table, where musicians can grab track samples, represented by actual records, and toss them onto one of two turntables. There’s a fader, volume controls, various filters, and a tempo slider; but in addition to the standard turntable knobs, there are two transparent cubes to the left and right where you can apply effects in real-time, ranging from a basic echo to a lo-fi bitcrush option that pixelates your vision when activated.
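That bitcrush effect, for the curious, is a classic DSP trick: reduce the bit depth and sample rate of the audio until it turns crunchy. Purely as a toy illustration (this is not TheWaveVR’s actual implementation), the idea fits in a few lines:

```python
def bitcrush(samples, bits=4, downsample=4):
    """Toy bitcrusher for float samples in [-1.0, 1.0].

    Two degradations are applied: each sample is quantized to `bits` of
    depth (bit-depth reduction), and each quantized value is held for
    `downsample` frames (sample-rate reduction).
    """
    half_levels = (2 ** bits) / 2.0
    out, held = [], 0.0
    for i, s in enumerate(samples):
        if i % downsample == 0:
            # pick up a fresh sample only every N frames, quantized
            # to the coarse grid -- the source of the "crunch"
            held = round(s * half_levels) / half_levels
        out.append(max(-1.0, min(1.0, held)))  # clamp to valid range
    return out
```

With `bits=1` and `downsample=2`, for example, a smooth input collapses to a stair-stepped square-ish signal, which is exactly the gritty, aliased character DJs reach for with this effect.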

“On the artist side, we’re building out the tools that let musicians and visual artists easily import and distribute their content on our platform,” Arrigo added.

image by Road to VR

I stepped into TheWaveVR right before last night’s inaugural kick-off party and was overwhelmed by the choices presented before me. I could emit a number of effects from my hands: a whole galaxy of multi-colored uppers, downers, screamers, laughers—Hunter S. Thompson references aside—plenty of beautiful and mystifying lights that were nothing short of entrancing. The social lobby, a dark rocky world, was populated with interactive ‘entities’ that would produce either music or a cool visual effect. Add a room of dancing avatars and a DJ standing atop a rocky outcrop—all shooting off disco balls and crackling streamers—and you’ve got the sort of impossible, trippy world only possible in VR.

CCO and co-founder Aaron Lemke, also known for his meditative VR experiences Eden River (2014) and Zen Zone (2015), said that they’ve user tested TheWaveVR with “dozens of music creators and fans,” saying that feedback during the open beta “will directly fuel the development of new features.”


Crytek-Incubated ‘VR First’ Program to Double Number of Academic VR/AR Labs in 2017

VR First, the global initiative to seed academic institutions with VR/AR hardware and software, today announced that it will nearly double the number of its VR labs in universities and science parks across the globe by the end of 2017. With 26 labs currently operating in the United States, Europe, Asia and Oceania, VR First plans to bring the total to 40 by the end of the year.

Unveiled last January by Crytek, the now fully-independent VR First program is designed to foster innovation by not only creating independent VR labs where they once weren’t, but by also converting PC labs into AR/VR-ready facilities—and it’s doing it in top universities and science parks across 23 countries including Purdue University, University of Florida, University of Southern California Viterbi School of Engineering, Oklahoma State, and Vancouver Film School.

image courtesy VR First

The idea is to stimulate development in VR where questions can be answered best, and in a place where projects can be developed without the heavy startup cost associated with PC VR.

Currently VR First is boasting more than 50 projects in development at the labs, and not just games. In fact, games only account for 35% of projects underway, as the rest are focused on fields like psychology and neuroscience (12%), education (7%), tourism (7%), and architecture and real estate (6%). Development is divided evenly across the HTC Vive and Oculus Rift platforms, each accounting for 31% of projects, with other major VR platforms such as Samsung Gear VR, OSVR, and Daydream rounding out the bottom numbers. While the program was principally incubated by Crytek, students working in VR First labs tend to use Unity (48%), followed by Unreal (20%) and then CRYENGINE (14%).

image courtesy VR First

AR headset distribution is much more dramatic: students are mainly developing on Microsoft HoloLens (43%), with Google Glass, Vuzix AR headsets, Epson Moverio, and Meta AR dev kits making up the rest of the pie chart. A full 25% of projects, however, are reported to be using ‘other’ hardware platforms—but considering the massive number of Google Glass-style headsets out there, it’s no wonder they didn’t name them all.

With a growing reach comes a bigger voice too, so VR First is also helping to push a new Institute of Electrical and Electronics Engineers (IEEE) standard, called IEEE P2048.5, which focuses on setting quality assurance and testing standards for VR/AR hardware in fully immersive environments. To that end, VR First is also partnering with benchmarking company Futuremark to support benchmarking requirements through its standardization efforts. Through its Lab Renovation Program, VR First is promoting the adoption of these new standards by lab partners, governments, and science parks.

development at Carleton University, image courtesy VR First

“The progress VR First has made in just its first year—from 26 labs open and more coming soon, to its growing technology partner network and the unveiling of dozens of VR projects developed at VR First Labs—is excellent momentum toward democratizing the VR/AR innovation landscape,” explains Ferhan Özkan, co-founder of VR First. “With more than 65% of universities planning dedicated VR labs, plus science parks and governments interested in doing the same, continued growth of our efforts is without question. We invite all industry technology providers and stakeholders to join us in this meaningful program.”

The program, which also includes a network of participating schools, has over 500 universities and science parks worldwide. The broader-reaching network is designed to help developers convert ideas into business opportunities and introduce them to established industry partners.

“VR First is fully independent, supports VR First labs no matter their choice of engines or platforms, and welcomes all hardware and software providers. We believe that if you have a noble vision like ‘Democratization of VR/AR Innovation,’ it deserves to be protected from any competitive sacrifice. This vision could only be achieved with a vendor-neutral approach,” added Özkan.

VR First Institutions (including new openings)

  • Aalborg University Copenhagen, Denmark
  • Academy of Art San Francisco, USA
  • Bahçesehir University (BAU), Turkey
  • California State University Monterey Bay, USA
  • Canterbury University, New Zealand
  • Carleton University, Canada
  • Comenius University Bratislava, Slovakia
  • Dania Academy of Higher Education, Denmark
  • Darmstadt University of Applied Sciences, Germany
  • Doña Ana Community College, USA
  • Graz University of Technology, Austria
  • HTW Berlin, Germany
  • Ilia State University, Georgia
  • Kajaani University of Applied Sciences, Finland
  • LLC Technology Companies Development Center, Ukraine
  • Manchester Metropolitan University, UK
  • Middle East Technical University, Turkey
  • NHTV Breda University of Applied Sciences, Netherlands
  • North Carolina State University, USA
  • North Metropolitan TAFE, Australia
  • Oklahoma State University, USA
  • Paneuropean University Bratislava, Slovakia
  • Purdue University, USA
  • Rochester Institute of Technology, USA
  • RUBIKA, France
  • Sogang University, South Korea
  • South East European University, Macedonia
  • State University of New York at Oswego, USA
  • Tallinn University of Technology, Estonia
  • Universität Hamburg, Germany
  • Universität Heidelberg, Germany
  • University College Cork, Ireland
  • University College London, UK
  • University of Florida, USA
  • University of Southern California (USC), USA
  • University of the Aegean, Greece
  • Vancouver Film School, Canada
  • VIGAMUS Academy, Italy
  • Vilnius Gediminas Technical University, Lithuania
  • Warsaw University of Technology, Poland

Update 04/13/2017: the article was updated to clarify that VR First initially incubated under Crytek, but is now a fully independent program. 

The post Crytek-Incubated ‘VR First’ Program to Double Number of Academic VR/AR Labs in 2017 appeared first on Road to VR.

‘Rick and Morty’ VR Game is Releasing on Oculus Rift and HTC Vive April 20th

Rick and Morty are finally making their way to VR in Rick and Morty: Virtual Rick-ality. Coming to HTC Vive and Oculus Rift headsets via Steam and Oculus Home, you can take part in the dimension-hopping adventure starting Thursday, April 20th for $29.99.

Created by Adult Swim Games and Owlchemy Labs, makers of the tongue-in-cheek VR game Job Simulator (2016), Rick and Morty: Virtual Rick-ality is said to be a “fast-paced, chaotic VR adventure.” So expect plenty of puzzles and multi-dimensional missions as you, a clone of Morty, navigate and rummage through Rick’s garage and the Smith house, both abounding with interactive items.

Road to VR‘s Michael Glombicki got a hands-on with an early version of the game, saying it’s “full of the same absurdist sci-fi humor that fans of the acclaimed Rick and Morty show know and love.”

In the game, you take control of a Morty clone, ostensibly created for the sole purpose of doing chores for Rick. The first task Rick gives you is to wash his dirty laundry by placing it in the washing machine and turning it on. It’s a very simple task, but everything about it, from placing the dirty underwear in the machine to turning the knobs, felt like an activity in Job Simulator. The reason for the similarity is that Owlchemy built the game using version 2 of their VR interaction system, and so they were able to reuse a lot of the same technology that powered Job Simulator.


image courtesy Adult Swim Games

Show creator and principal voice actor Justin Roiland has already published a VR experience, Accounting, through his newly created studio Squanchtendo; it delivers a mix of his signature brand of the bizarre and absurd.

Roiland has had his eye on VR since at least late summer 2015, however, when he and Owlchemy Labs’ studio head Alex Schwartz exchanged a few choice tweets (via the official Owlchemy Labs Twitter) discussing the possibility of collaboration.

Owlchemy Labs has developed and published over 20 games spanning desktop and mobile, including Aaaaaculus! (2011), one of the first games with Oculus DK1 support on Steam. As a launch title on HTC Vive, PSVR and Oculus Touch, the motion control-focused Job Simulator has not only garnered critical acclaim since release, but reportedly surpassed $3 million in sales earlier this year, making it one of the most financially successful VR games to date.

“We really believe fans are going to lose their minds at what we’ve developed,” says Owlchemy Labs CEO Alex Schwartz. “It’s been an incredible experience to develop for one of our favorite shows and see the joy on players’ faces when they get to explore Rick’s garage in VR, physically step through portals, and interact naturally with their hands in the world they’re already so familiar with. Players are interacting with the world of Rick and Morty in a way only possible in virtual reality, and they love it!”

Check back for a full hands-on with the Rick and Morty: Virtual Rick-ality game on April 20th.


Hands-on: ‘EVE: Valkyrie’ Groundrush Update

Fanfest is upon us again, CCP’s yearly festival in Reykjavik for all things EVE Online, but to go along with the round table discussions and announcements surrounding the massively multiplayer game is a new update to the studio’s sci-fi arcade dogfighter EVE: Valkyrie (2016). Coming April 11th to Oculus Rift, HTC Vive and PSVR is an expansion called Groundrush that promises to deliver even more of that Star Wars-esque spaceship experience with planet-based maps (including one that looks awfully like the planet Hoth) among a host of other updates.

Popping on the PS4 Pro-powered PlayStation VR headset, I found myself back in the game’s familiar launch tubes and sitting in the basic ship, the Banshee. My teammates, two disparate packs of German and Russian journalists, were conjoined for the mission and chattered on through the game’s comm system in their respective languages.

photo by Road to VR

Rocketing out of the carrier at the same blistering speed I’ve come to know from the game’s many space-based maps, I flew out over what I was told would be Valkyrie’s first planet-based map, Solitude. Some sort of derelict research facility sat at the center point, still showing the signs of an apparent attack. Craggy peaks covered in snow jutted out everywhere—a great place to smash into if you fly as badly as I do.

Seeing a horizon where open 3D space once stood was hard to get used to at first. However, the snowy scenery and comparably light skybox helped soften the disorientation I expected when engaging in the game’s wild corkscrew battles. The demo was only 10 minutes long, so I can’t tell how longer sessions would play out with regard to motion sickness, but I felt more or less the same as I do in the topsy-turvy battles of the space-based maps. So while the sensation of flying was mostly the same, I had a new foe to contend with: the ground. And yes, you can smash into it.

photo courtesy CCP

The map itself looks noticeably smaller than the space-based maps, and combat tended to take place near the ground, winding around the structures and geological features of the planet. It’s flanked on one side by the enemy carrier—another hazard to watch out for, lest you want a quick death and a “killed by enemy carrier” message flashing at you before being dropped serendipitously into a fresh clone—and on the other by my own.

Flying with afterburners between the two took about 30-40 seconds. It was faster than I thought, but it seemed to keep combat at a brimming level. The match I played hosted 10 players, two teams of five, so there was plenty of action to go around.

I also noticed a number of interesting tight spots to chase enemies through, something that always proves to be a crowd-pleaser by invoking the iconic Star Wars trench run. Check out the video below to get a better idea of what Groundrush is offering.


Disclosure: CCP Games provided airfare and lodging for Road to VR to attend Fanfest 2017
