Oculus Chief Scientist Predicts the Next 5 Years of VR Technology

The presentation by Michael Abrash, Chief Scientist at Oculus, is always a highlight of the company's annual developer event, projecting a forward-thinking and ever-inspirational look at the future of virtual reality. This time, at Oculus Connect 3, he made some bold, specific predictions about the state of VR in five years.

First, the visuals, which Abrash sees as the most critical area for near-term improvement. Current high-end headsets like the Rift and Vive, with their roughly 100-degree field of view and 1080×1200 display panels, equate to around 15 pixels per degree. Humans are capable of seeing at least 220 degrees of field of view at around 120 pixels per degree (assuming 20/20 vision), Abrash says, and display and optics technologies are far from achieving this (forget 4K or 8K, this is beyond 24K per eye). In five years, he predicts a doubling of the current pixels per degree to 30, with a widening of FoV to 140 degrees, using a resolution of around 4000×4000 per eye. In addition, the fixed depth of focus of current headsets should become variable. Widening the FoV beyond 100 degrees and achieving variable focus both require new advancements in displays and optics, but Abrash believes this will be solved within five years.
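
The arithmetic behind those numbers is easy to sanity-check. Here's a quick back-of-the-envelope calculation in Python using only the figures Abrash quoted (per-eye field of view and lens distortion are simplified away):

```python
# Back-of-the-envelope check of the figures above (numbers from the talk, not measured).
def pixels_per_degree(pixels_across: int, fov_degrees: float) -> float:
    """Average angular resolution across the field of view, ignoring lens distortion."""
    return pixels_across / fov_degrees

# Predicted five-year headset: 4000 pixels spanning a 140-degree field of view.
print(round(pixels_per_degree(4000, 140), 1))   # 28.6 -- roughly the predicted 30 ppd

# Retinal-quality target: 120 ppd across 220 degrees.
print(120 * 220)                                # 26400 pixels across -- beyond "24K" per eye
```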

Rendering 4000×4000 per eye at 90Hz is an order of magnitude more demanding than the current spec, so for this to be achievable in the next five years, foveated rendering is essential, Abrash says. This is a technique where only the tiny portion of the image that lands on the fovea—the only part of the retina that can see significant detail—is rendered at full quality, with the rest blending to a much lower fidelity (massively reducing rendering requirements). Estimating the position of the fovea requires "virtually perfect" eye tracking, which Abrash describes as "not a solved problem at all" due to the variability of pupils, eyelids, and the complexities of building a system that works across the full range of eye motion for a broad userbase. Because it is so critical, Abrash believes it will be tackled within five years, though he admits it carries the highest risk factor among his predictions.
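
To make the idea concrete, here's a minimal illustrative sketch (not Oculus' renderer, and the thresholds are invented for illustration) of how a foveated pipeline might pick a per-tile shading resolution from the tile's angular distance to the tracked gaze point:

```python
import math

def eccentricity_deg(gaze_dir, tile_dir):
    """Angle in degrees between the gaze direction and a screen tile's direction
    (both given as unit vectors in head space)."""
    dot = sum(g * t for g, t in zip(gaze_dir, tile_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def shading_scale(eccentricity: float) -> float:
    """Fraction of full resolution at which to shade a tile. Only the foveal
    region gets full detail; the periphery is rendered coarsely and blended."""
    if eccentricity < 5.0:      # fovea: full quality
        return 1.0
    if eccentricity < 20.0:     # parafovea: quarter resolution
        return 0.25
    return 0.0625               # periphery: 1/16 resolution, upscaled and blurred

# Example: a tile 30 degrees from the gaze point is shaded at 1/16 resolution.
print(shading_scale(eccentricity_deg((0, 0, 1), (0.5, 0, 0.866))))
```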

Next, he moved briefly to audio; personalised head-related transfer functions (HRTFs) will enhance the realism of positional audio. The Rift's current 3D audio solution applies a generic HRTF in real time based on head tracking, but it is the same for all users. HRTFs vary by individual due to the size of the torso and head and the shape of the ears; the creation of personalised HRTFs should significantly improve the audio experience for everyone, Abrash believes. While he didn't go into detail as to how this would be achieved (it typically requires an anechoic chamber), he suggested it could be "quick and easy" to generate one in your own home within the next five years. In addition, he expects advancements in the modelling of reflection, diffraction and interference patterns will bring sound propagation closer to a realistic level. Accurate audio is arguably even more complex than visual improvement due to the discernible effect of the speed of sound; despite these impressive advancements, real-time virtual sound propagation will likely remain simplified well beyond five years, as it is so computationally expensive, Abrash says.
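
For readers unfamiliar with how an HRTF is actually used at runtime, the core operation is just a pair of convolutions: the mono source is filtered through left-ear and right-ear impulse responses measured (or synthesised) for the sound's direction relative to the head. A minimal sketch, assuming you already have such a measured pair:

```python
import numpy as np

def apply_hrtf(mono: np.ndarray, hrir_left: np.ndarray, hrir_right: np.ndarray) -> np.ndarray:
    """Spatialise a mono signal with a head-related impulse response (HRIR) pair.
    The HRIRs encode the direction-dependent filtering of torso, head, and ears;
    a personalised HRTF simply means these arrays are measured for your anatomy
    rather than for a generic dummy head."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])   # 2 x N stereo buffer
```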

On controllers, Abrash believes hand-held motion devices like Oculus Touch could remain the default interaction technology "40 years from now". Ergonomics, functionality and accuracy will no doubt improve over that period, but this style of controller could well become "the mouse of VR." Interestingly, he suggested hand tracking (without the use of any controller or gloves) would become standard within five years, accurate enough to represent precise hand movements in VR, and particularly useful for expressive avatars and for simple interactions that don't require holding a Touch-like controller, such as web browsing or launching a movie. I think there's a parallel with smartphones versus consoles and PCs here; touchscreens are great for casual interaction, but nothing beats physical buttons for typing or intense gaming. It makes sense that no matter how good hand tracking becomes, you'll still want to be holding something with physical resistance in many situations.

Addressing more general improvements, Abrash predicts that despite the complexities of eye tracking and wider-FoV displays and optics, headsets will be lighter in five years, with better weight distribution. Plus, he says, they should handle prescription correction more conveniently (perhaps a bonus improvement that comes with variable depth-of-focus technology). Most significantly, at the high end, VR headsets will become wireless. We've heard many times that existing wireless solutions are simply not up to the task of meeting even the current bandwidth and latency requirements of VR, and Abrash repeated this sentiment, but believes it can be achieved in five years, assuming foveated rendering is part of the equation.

Next he talked about the potential of bringing the real world into the virtual space, something he referred to as "augmented VR"; this would scan your real environment to be rendered convincingly in the headset, or it could place you in another scanned environment. This could serve as the ultimate mixed-reality 'chaperone' system for confidently moving around your real space, picking up real objects, and seeing who just walked in, but it could also make you feel like you were anywhere on the planet, blurring the line between the real world and VR. While we can already create a believable, high-resolution pre-scanned recreation of many environments (see Realities), doing this in real time in a consumer product has significant hurdles to negotiate, but Abrash believes many of them will be solved in five years. He clarified that augmented VR would be very different from AR glasses (e.g. HoloLens) that use displays to overlay the real world, as augmented VR would allow complete control over every pixel in the scene, allowing for much more precise changes, complete transformations of the entire scene, and anything in between.

The real significance of augmented VR is being able to share any environment with other people, locally or across the world. The VR avatars coming soon to Oculus Home are primitive compared to what Abrash expects to be possible in five years. Even with hand tracking approaching the accuracy of retroreflective-studded gloves in a motion capture studio, plus advancements in facial expression capture and reproduction and markerless full-body tracking, the realistic representation of virtual humans is by far the most challenging aspect of VR, Abrash says, due to the way we are so finely tuned to the most subtle changes in expression and body language of the people around us. We can expect a huge number of improvements to the believability of sharing a virtual space with others, but staying on the right side of the uncanny valley will still be the goal in five years, he believes. It could be decades before anyone gets close to feeling they are in the presence of a "true human" in VR.

Finally, Abrash revisited the "dream workspace" he discussed last year, with unlimited whiteboards, monitors, or holographic displays in any size and configuration, instantly switchable depending on the task at hand, for the most productive work environment possible. Add virtual humans, and it becomes an equally powerful group working tool. But for this to be comfortable as an all-day work environment, all of the advancements he covered would be required. For example, the display technology would need to be sharp enough for virtual monitors to replace real monitors, augmented VR would need to be capable of reproducing and sharing the real environment with accuracy, the FoV would need to be wide enough to see everyone in the meeting at once, and spatial audio would need to be accurate enough to pinpoint who is speaking. Not every aspect of this dream will come true in five years, he says, but Abrash believes we will be well along the path.

The prospect of such a giant step forward in so many areas in such a short space of time is exciting, but can it really happen? And will it all be within one generational leap, or can we expect to be on third-generation consumer headsets by then? Well, Abrash made decent predictions about the specifications of consumer VR headsets almost three years ago, so let’s hope he’s been just as accurate this time.

Hands-on: ‘Lone Echo’ Multiplayer is a Totally Surprising Triumph for Competitive Zero-G Locomotion

Flying through an obstacle-filled arena in zero gravity like in the Battle Room scenes of Ender's Game, catching and throwing a disc to score in the opposing team's goal, all while inside a Tron-looking virtual reality: that's probably the best way I can describe Lone Echo's surprisingly successful multiplayer mode in one sentence.

At Oculus Connect 3, I was able to try a singleplayer demo of Lone Echo, an Oculus Touch exclusive from developer Ready at Dawn that lets you grab, pull, and push yourself around in zero gravity as a robot. But I didn't know what to expect when I was told that I'd get to try a multiplayer demo as well, since the singleplayer didn't really have any activities I could imagine doing with someone else in a meaningful way, much less competitively.

Nonetheless, while it didn't seem to have anything to do with the singleplayer, aside from the robot you inhabit in both modes and the zero-g movement style, it was a very different and fresh taste of what could be done with zero-gravity sports in VR. From the outside, this floating, zero-g movement seems like a prime candidate for causing nausea, and yet it managed to be an incredibly effective way of getting around that didn't seem to cause me or the other people I played with any dizziness.

I played in a match of five vs. five. Our team captain—a real person playing in another room—taught us not only how to play the game, but also how to navigate the 3D, zero-g arena. We would move around in zero gravity either with thrusters, or by grabbing, pulling, or pushing ourselves on our way with the help of walls or floating geometry (or even teammates or enemy characters). We'd be vying for a glowing disc in the middle of the Ender's Game Battle Room-style arena (though it wasn't nearly as big). Then we'd have to grab the disc and throw it into the holographic goal at the end of the other team's side.

Photo courtesy Ready at Dawn

A final piece of the puzzle was a punch you could throw only at opponents' heads to briefly stun them, preventing them from being able to move and hold the disc. You could also grab and climb onto bodies, so a common maneuver would be to grab onto a limb, clamber up, punch them in the face, snatch the disc right from their hands, then give yourself a shove off of their stunned body to head toward the goal.

It sounds simple enough, but the mechanics seem like they could allow the complex strategy you might expect in a real sport. Of course, there was nothing to enforce any 'rules' or positions (like a goalie or receiver), but smart players would naturally fall into such roles to beat the opposing team (who were most likely playing the game for the first time and didn't have any strategy other than to all flock to the disc like it was second-grade soccer at recess).

Photo courtesy Ready at Dawn

Speaking of soccer, passing is a huge part of the game, especially when your teammates are careening across the arena in zero-g. It's fun not only to be the thrower, needing to skillfully lead the disc to where your teammate is headed on their trajectory, but equally to be the receiver who has to launch themselves in the right direction at the right time to intercept the disc. More advanced players will see opportunities for bouncing the disc off the arena's angular walls to send it around opponents and land it in a key position in front of the opposing goal.

You can defend the goal, and (in the demo version we were playing) simply guarding it could be enough to decide the match, because the other team couldn't organize well enough in the short time frame of the demo to score. My time playing Lone Echo's multiplayer was a fun, heated battle for the disc that didn't leave me nauseous, despite flying around in every which direction unhindered by gravity.

Every direction indeed; while playing the demo I was constantly spinning around in real life, reaching out with my Touch controllers to try to grab a passing disc or tossing it to a teammate while shoving off a wall to avoid an enemy. Based on what I saw, the game will almost certainly require the two-camera 'opposing' setup for the Rift (cameras opposite each other), or the three-camera setup for full 360-degree coverage. Otherwise it seems like it would be extremely difficult to keep yourself facing forward with the two-camera 'front-facing' setup (as you'd regularly lose tracking on your hands when turned away from the cameras).

Lone Echo's surprisingly successful multiplayer feels like it could be a major addition to the game. Despite not yet having an official release date, what I played seemed very polished and fun already, with huge potential to become something even greater than what we've seen from the still-burgeoning VR e-sports sector.

Oculus on Platform-exclusive VR Content: ‘it’s the only viable way to jumpstart the market’

Earlier this month we saw a slew of new Oculus Touch game announcements. Many of these new VR titles, which benefit from funding or publishing (or both) by Oculus, will launch exclusively for the Rift and Touch. And while the company has come under fire for that practice in the past, Oculus’ Head of Content defends the decision as the right approach to get the VR market moving.

Last month’s Connect 3 conference saw the reveal of ambitious (and exclusive) new VR content for Oculus Touch, like Robo Recall, Arktika.1, and Lone Echo. With $250 million already invested into VR content and another $250 million on the way, the company shows no signs of slowing its strategy of exclusivity deals.

Jason Rubin, Head of Content at Oculus

Jason Rubin is the Head of Content at Oculus and a veteran of the videogame industry. Known for his work on the Crash Bandicoot franchise and for founding studio Naughty Dog, Rubin has been around the block, and he contrasts his experience in the early days of the videogame industry with what's happening today in virtual reality.

Speaking with Road to VR at Gamescom 2016 earlier this year, Rubin laid out the company’s decision to aggressively invest in VR content, often in exchange for exclusivity agreements.

“A lot of the games you see here today are larger than would be practically financeable by developers and publishers at the launch of a hardware system. When you’re talking about VR, you’re talking about a new hardware that has no past analogue, there is nothing that can be ported well onto VR. There are games that work ok like Project Cars for controller. But when you talk about hand-tracking, there’s nowhere those sorts of games can come from,” Rubin said. “By definition you’re shipping into a zero-person install base when you ship this new hardware. For developers to make large leaps of faith—to do multi-million dollar projects—it simply doesn’t happen without the hardware manufacturers believing in their hardware and believing in the ecosystem and helping those developers out with large grants. There’s no other way that Wilson’s Heart, Chronos, or The Climb gets made.”

Rubin speaks to the chicken-and-egg conundrum of getting developers to make content for a platform with no customers, or getting customers to buy a platform with no content.

“Once those games are out there [customers] say, ‘Oh, there’s great stuff out there. I’m going to buy into VR. I get it now, I understand why I want to play.’ Then they buy the systems, and they’re now an addressable market,” he said. “Then the second or third generation of developer doesn’t need our money. They can see there’s an established userbase there built by first generation games, they can build a bigger title than they would have otherwise because there’s now enough consumers to buy their game. That’s the only way these systems work.”

Looking back at his experience in the early days of the videogame industry, Rubin suggests that not aggressively funding VR content through exclusive deals would mean painfully slow growth for the VR space.

“The other way to [create a sustainable customer/developer ecosystem], is to do it the way PC originally did it which started 30 years ago; I was making games when the PC came out. The way you do it is, you put it in a ziplock bag, you put it on the shelf, somebody buys your ziplock game, and the addressable market gets a little bigger. [Then] you make a better game that you put in a very cheap box and over time it gets to $100 million games. It took 30 years. We don’t want [VR] to take 30 years,” Rubin explained. “We want this generation to race forward. Because we don’t have the luxury that the PC market had, where it was the best-looking thing out there. Well, we’re going up against Grand Theft Auto and Call of Duty. [Gamers] have the ability to play these triple-A games. So if we don’t compete visually [and] depth-wise, if we don’t jump out there and give them great games to play right off the bat, we may never have what the PC had. We may never have the stepping stone. What we are doing now is the only way to viably jumpstart the market.”

Rubin points to the Oculus store and its library of games as proof that the approach is working.

“There have been a lot of suppositions from various parties who haven’t wanted to fund [VR games] about how it could happen elsewhere, or their particular methodology for getting [VR] to spread, but the proof of any system is in its results. The proof is there in our system—funding developers—that’s creating next generation content that other systems have not.”

If the exclusive titles Oculus has helped fund weren't any good, there probably wouldn't be much backlash from the community. But it's clear that some of VR's most substantial titles are found on Oculus' platform, even prompting the creation of 'Revive', a hack that allows some Oculus-exclusive games to be played on the HTC Vive. When the hack was patched by Oculus, outrage from the VR community prompted the company to backpedal on its approach to DRM.

Meanwhile, Sony has taken a similar approach with exclusive funding of VR content for PlayStation VR but rarely sees the level of criticism directed at Oculus. That’s likely due to the fact that platform exclusive titles are a norm in the console space, whereas PC has long been seen as an ‘open’ platform.


Additional reporting by Scott Hayden

‘Lone Echo’ Multiplayer Is ‘Tron’ Meets Soccer In Zero Gravity VR

The only thing standing between me and the goal is a floating orange robot from the other team. Calling upon all my strategic cunning I devise a brilliant plan to get by him and score a point for my team. I punch him in the face.

As he lies stunned and sparking in my wake I hurl the Tron-esque frisbee in my hand toward the portal in front of me and watch it sail to victory with a satisfying explosion as it crosses the finish line. I raise my own mechanical hands in the air as the rest of my team rockets over in celebration.

Digitally, I'm a robot floating in a futuristic, zero-gravity sports chamber. Physically, I'm standing in the demo hall of Oculus Connect 3 with a Rift strapped to my face and two Touch controllers in my hands. The experience I'm sampling is a multiplayer version of the upcoming Lone Echo from Ready at Dawn (The Order: 1886). As the game resets itself for the next match all I can think is, "This thing is going to consume my entire life."

Lone Echo multiplayer is essentially zero-gravity Ultimate Frisbee with robots that can punch each other in the face for a few precious seconds of stun. This supplementary mode brilliantly embraces the core mechanics of Lone Echo's single-player mission to create an adrenaline-packed multiplayer experience.

Lone Echo's most innovative feature is its movement system. The game takes place without gravity, so you use the Oculus Touch controllers to grab onto handholds, stationary surfaces, and each other to propel yourself through the environment like an astronaut. You also have a pair of Iron Man-like hand thrusters for powered movement.

If that sounds like a huge amount of fun that’s because it is. Now imagine that with multiple players per side all zipping, propelling, flying and fighting over a single tiny frisbee. Now it’s just insane.

The story mode of Lone Echo is dark, atmospheric, slow-paced and highly narratively driven. The multiplayer mode…isn’t at all.

There's nothing slow-paced or dark about grabbing onto a teammate's shoulder as you rocket out of the start tunnels, shooting through the air at what feels like light speed, punching an enemy straight in the head as you both meet at the frisbee's starting location, and blasting onward, laughing at the disruption you've just caused. It feels like an almost completely different game, but that is not a bad thing.

Many of the greatest multiplayer games of today only came about as spin-offs or experiments. A Warcraft mod led to DOTA and the entire MOBA scene, which is now the most popular multiplayer gaming genre in the world; Team Fortress began as a mod for the realistic Quake engine before its cartoony graphics and interesting characters evolved into Team Fortress 2; and no one will ever be able to convince me that Donkey Kong 64's death matches weren't superior to GoldenEye's.

The best multiplayer happens when a developer focuses on the good mechanics they've built for a game and translates them into an interesting competitive experience. These may have nothing to do with the plot or aesthetic of the original game, and that often leads to wonderfully wacky new use cases for gameplay that's already been discovered. Lone Echo may be coming to a brand new type of gaming platform, but it is one of the purest representations of this tried-and-true multiplayer tradition.

After playing it, I’m thrilled for it to be fully released next year, but I’m also sad. Sad that I won’t be seeing my friends and family for long stretches of time once it’s available. I’m an addict now and I need my fix of frisbees and face punches. No seriously. I need it. Give it to me. GIVE IT TO ME NOW.

Editor's Note: Joe Durbin has been taken to a nice patch of quiet grass under a beautiful oak tree until he stops trying to hit his coworkers in the face and throw their personal effects through windows he keeps calling goals. He appreciates your support in this difficult time.

Watch: 17 Mins of ‘Arktika.1’ Oculus Touch Gameplay

Arktika.1 is the VR debut for 4A Games, the developers behind the Metro series of first-person shooters. The title is built from the ground up for Oculus Touch; here's 17 minutes of the Oculus Connect 3 demo where the title made its debut.

Oculus Connect felt like another milestone marking the maturing of content for virtual reality. VR is beginning to gain support from mainstream, triple-A developers, and the games which have been in gestation are now beginning to filter out, with a step change in the production design and polish that the traditional games market takes for granted.

One such title revealed at the event was Arktika.1 from developers 4A Games. You may know them from the hugely popular (and technically excellent) Metro series of first-person shooter games. Their Malta studio has been dedicated to building a new made-for-VR title, which was unveiled for the first time at Oculus Connect 3. It's a first-person shooter designed around motion controls, specifically Oculus' forthcoming Touch devices, and set in an icy, futuristic wasteland:

Road to VR's Frank He went hands-on with the game at the event, and had this to say about his experience:

The feel of the weapons as they shot, the strong haptics induced in the Touch controllers, and the quality of the sounds, were all satisfying, not to mention the look of the projectiles and the trails in the air left by them. All of this contributed to the high quality AAA feel of the game. Out of the assortment, I picked what looked like a revolver that shot a scattering of bullets made of pure energy, and a handgun that also scattered but with what seemed to be green projectiles leaving light distorting streaks in the air.

The gameplay seen here was shot 'off-screen', so unfortunately there's no in-game audio, but it's well worth a watch to get a handle on how 4A Games have approached gunplay and VR locomotion in the game, which promises a lengthy campaign mode to play through when it releases in 2017. Arktika.1 is exclusive to the Oculus Rift and Touch and is published by Oculus Studios.

Oculus Touch Will Support 360 and Room-scale Tracking With Extra Cameras

After long being a point of contention, Oculus last week announced a welcome new stance: the Rift will officially support 360-degree tracking and room-scale as part of the forthcoming Touch launch.

At the company’s ‘Connect’ developer conference it was finally revealed that Oculus now plans to support four Rift and Touch camera configurations:

  • One camera in front: seated/standing gamepad experiences
  • Two cameras in front: standing front-facing Touch experiences
  • Two cameras opposite: standing 360 degree Touch experiences
  • Three cameras in triangle layout: ‘room-scale’ Touch experiences

Prior to the announcement last week, Oculus had only officially committed to the first two configurations. Though they will officially support these other setups, Oculus says they believe the front-facing setup is the most practical and will be used by the majority of users. Questions remain as to what amount of fragmentation these varying configurations might cause among the Rift userbase.

As the Touch launch approaches in December, support for four tracking cameras has already been baked into the Oculus Home platform (though the company’s recent announcements only made mention of configurations utilizing three cameras). While the $200 Touch price tag will include one additional tracking camera, that leaves users with just two (the first being included with the Rift); Oculus plans to sell individual cameras for $80 to achieve the three-camera room-scale setup.

Oculus' front-facing camera setup gives you a little bit of space to move around; I like to call it 'rug-scale'

Room-scale is, quite literally, a big deal. Much can be accomplished constrained to chairs and Oculus’ prolific demo mats, but VR frequently compels us to stand up, inspect details up close, flail our weapons in all directions, and duck & dodge in ways that demand a big space—something HTC’s Vive has been doing since its launch earlier this year. Officially unlocking these experiences on the Rift will make Home all the richer.

With the release of Touch on December 6th, the Rift platform will have three recommended setups. The fundamental reason for this is occlusion, the possibility that some crucial portion of the tracking markers on the Rift and Touch might be blocked from the camera’s view. Modern VR headsets use highly calibrated IMUs which are capable of determining an object’s rotations accurately at very low latencies, but which rapidly drift out of alignment when used to further judge the object’s movement through space. Both Rift and Touch use infrared LED markers to serve as a corrective frame of reference for positional tracking, but can’t do this when too many of these markers are blocked from view.
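
That drift-plus-correction loop can be sketched in a few lines: integrate the IMU at a high rate for responsiveness, then pull the estimate back toward the camera's slower but drift-free position fix whenever the markers are visible. This is a toy illustration of the general idea, not Oculus' actual tracking code:

```python
import numpy as np

class ToyPositionalTracker:
    """Toy fusion of fast-but-drifting IMU integration with occasional optical fixes."""

    def __init__(self):
        self.position = np.zeros(3)   # metres, in tracking space
        self.velocity = np.zeros(3)   # metres/second

    def imu_step(self, accel: np.ndarray, dt: float):
        # Dead reckoning: double-integrating noisy accelerometer data is responsive
        # but drifts away from the true position within a fraction of a second.
        self.velocity += accel * dt
        self.position += self.velocity * dt

    def camera_fix(self, optical_position: np.ndarray, gain: float = 0.2):
        # When the LED constellation is visible, nudge the estimate toward the
        # camera's absolute measurement; while the markers are occluded, drift
        # accumulates uncorrected.
        self.position += gain * (optical_position - self.position)
```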

Introducing two Touch controllers complicates matters considerably further. Now a player can turn 180° from the camera, hold the controllers near their chest, and the controllers become completely occluded from the camera, eliminating their ability to be tracked through space. When it comes to Oculus’ tracking system, there’s only one solution to the occlusion problem: more cameras.
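
Conceptually, a marker only helps when it both lies inside a camera's field of view and faces that camera. A simplified visibility test (illustrative only, with made-up parameters) shows why a single viewpoint isn't enough once the player turns away:

```python
import numpy as np

def marker_visible(marker_pos, marker_normal, camera_pos, camera_forward,
                   fov_half_angle_deg: float = 50.0) -> bool:
    """Rough visibility test for a single IR LED marker. Real tracking must also
    handle occlusion by the player's own body and arms, which no amount of
    geometry from one viewpoint can fix -- hence the extra cameras."""
    to_marker = np.asarray(marker_pos, float) - np.asarray(camera_pos, float)
    to_marker = to_marker / np.linalg.norm(to_marker)
    in_fov = np.dot(camera_forward, to_marker) > np.cos(np.radians(fov_half_angle_deg))
    faces_camera = np.dot(marker_normal, -to_marker) > 0.0  # LED points back at the camera
    return bool(in_fov and faces_camera)

# A controller held at the chest while facing away: its LEDs point away from the
# front camera, so the test fails and only the (drifting) IMU is left.
print(marker_visible(marker_pos=[0, 1.2, 1.0], marker_normal=[0, 0, 1],
                     camera_pos=[0, 1.5, 0.0], camera_forward=[0, 0, 1]))
```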

Two cameras placed in opposite corners of the room will provide a large playspace, but this risks placing them so far from the headset that they have trouble seeing the markers clearly; this is 'room-scale' capable and a full 360° experience, but not terribly robust, according to Oculus.

With a third camera, robust room-scale interactions in full 360° become possible, and as Owlchemy CTO Devin Reimer has said, this is the point at which a developer can “almost get [presence] for free”, thanks to the added immersion that comes with being able to physically explore the space around you in a significant way.

Oculus’ newfound support for more tracking configurations means that a whole host of previously problematic content (designed for Vive’s room-scale capability) may find itself welcome both on Home and on the hard drives of VR enthusiasts everywhere.

All of Oculus Connect 3’s Videos Are Now Available

If you happened to miss it, Oculus held its third annual conference – Oculus Connect 3 – last week, with the main keynote address taking place on the second day. All the usual figures were there: CEO Brendan Iribe, Facebook CEO Mark Zuckerberg, Chief Scientist Michael Abrash and more. Now Oculus has released all of the videos detailing the announcements made during the event.

The main keynote addresses were all live streamed, so it's likely you've already seen them. These included Zuckerberg revealing the future social plans for Oculus Rift and Oculus Touch, Iribe making the big Touch announcements on price and availability, and Abrash closing the main keynote with an in-depth discussion on VR.

Oculus CTO John Carmack didn't feature in the first keynote; instead he hosted the closing session. His talks are renowned for covering a wide range of VR topics, and for the fact that he can never keep to his allotted time because he has so much he wants to talk about. Needless to say, even with 90 minutes to fit everything in, he still went over and was forced to stop before he had covered everything he planned.

The full playlist below features 32 videos, many of which won't have been seen before as they weren't streamed. These cover everything from Quill and Medium to 360-degree videos and Unreal Engine.

Take a look – there are many hours' worth of content – and for all the latest Oculus news keep reading VRFocus.

Oculus Connect 3 Session Videos Released

The session videos for Oculus Connect 3 have been released, letting developers who couldn’t make it to the event get caught up on all the deep dives into VR technology offered by Oculus over the course of the event last week.

Here’s a playlist with 32 videos from the event included:

We’ll be combing through the videos for individual stories we can single out to save people time, but we wanted to bring these to your attention now.

Robo Recall Design Insights from Developers Epic Games

Epic Games has a long history of releasing new demo content at big gaming and developer conferences to showcase the latest Oculus hardware, and this year was no different. Oculus Studios provided funding to further develop last year's Bullet Train demo into a fully-fledged FPS game called Robo Recall. This demo had some of the most polished and mature game mechanics at the event, expanding the Bullet Train bullet capture-and-throw mechanic into new weapons and up-close hand-to-hand combat with stylized, arcade-style AI robots gone rogue.

I had a chance to talk with Epic Games VR lead Nick Whiting and artist Jerome Platteaux about their design process, deeper intentions, and the overall art style and direction of the game. They debuted a new locomotion technique designed to subtly guide players toward facing the true north of the front-facing cameras, and Nick admitted that there are some design constraints to creating a game for Oculus' recommended front-facing camera arrangement. Jerome also said that new gameplay options open up with a potential third tracking camera, but they didn't give any more specifics as to whether Robo Recall intends to support the optional room-scale type of gameplay.

VR vs. Roculus Balboa

Well, we did it. We all managed to survive the first part of this fortnight of virtual reality (VR) mayhem and Oculus Connect is over for another year. Being the VR industry, which apparently moves at the speed of light, we’ve got to forget about all that now and move on rapidly to the next event on our calendar: PlayStation VR’s launch. A major topic indeed.

For once I actually have a number of topics I could write about, which is something of a first for me on VR vs. I have, however, talked about PlayStation VR extensively in recent months, and precious little has changed to alter my thoughts on the matter. We will get onto those other items soon, but for this week I would like to touch on last week's predictions/suggestions/hopes that I voiced. (Be sure to check out our 'Best of PlayStation VR Launch' series, incidentally.)

Oculus Comes Out Swinging

Going into Oculus Connect 3 (OC3), I challenged Oculus to at the very least come out fighting, and they needed to. Badly. Really badly. Let's not forget that prior to OC3 the company was in a bit of a mess internally and, as I said, perceived as third behind a company that hadn't even released its product yet.

Continuing my boxing analogy of the last fortnight: whilst Oculus has been a VR fan's favourite punching bag, what we got, after a month which saw them knocked for a loop, was a Rocky-style comeback. ("I ain't goin' down no more!") Oculus addressed things, sometimes indirectly but tactfully and sometimes pretty directly indeed. There was a slight feeling of defiance during the opening. 'Look at what we've done. Look at what we're doing. We're doing more – and we're doing this, and this and this. Can you imagine this? It's here today. We're taking you to this future and that's the end of the matter!' They almost dared the VR community to disagree, challenged everyone to say something bad about what they'd shown.

Room scale? – Oculus Rift can do it, thank you very much.
VR is too expensive? – A VR-spec PC is now available for $499 and we're continuing to work with more and more partners to make it cheaper. "PC VR is more affordable than ever."
Games are all that matters? – Well, here's what we're doing with VR and education.
VR needs investment? – We've put in $250 million and we're doubling that.
Publishing is hard? – We'll pay the fees if you use Unreal Engine 4.
Wires suck? – We agree. Let's do something about that.
Software and programming need to be better? – Here's how we're making it better.

They were even very honest, brutally so: John Carmack at one point noted VR was "coasting on novelty" and needed to do more. A wince-inducing statement of honesty, and also of intent.

So. Content? Check.

There was plenty of content announced during the event. New media partnerships, including a potentially juicy one with Disney. New games, like Epic's Bullet Train follow-up Robo Recall and Arktika.1, that look really good – although the latter disappointed somewhat judging by Kevin Joyce's hands-on, which is a bit of a bummer. Games from 'Oculus Studios' were also included, which was good to see.

Santa Cruz (You’re Not That Far)

I did mention that I wanted to see a hint at the future, a CV2, but not the future in practice. Well, move aside Crescent Bay, there's a new prototype in town and its name is Santa Cruz. We got a video package, and members of the press got a hands-on with the tech, which showed off a MacGyver'd Oculus Rift as a standalone headset. We might not have gotten the console connection, the 'universal VR platform' I wanted, but Oculus did set its brand out to be 'VR for All'.

You can read our hands-on with the Santa Cruz here.

Punished Palmer, Zuckerberg Rising

Please mentally insert a picture of an eye-patch-wearing Palmer Luckey here for the purposes of subtitle continuity.

Where was Palmer Luckey? At home. Exactly where he needed to be for Oculus this time around. As Jason Rubin later explained, Luckey didn't want to be a distraction, and he surely would have been. It still took people by surprise, and that he wasn't in any of the video packages was surprising as well. All of this set the stage for someone to step up as the person to do the demonstrations live, but I didn't think it would be Mark Zuckerberg, someone who's had his own share of issues in the last couple of months. However, Zuckerberg took to the stage and, whilst he hammered the idea that VR is the 'new computing platform' into the ground so hard I'm surprised it didn't annihilate half the pipes under the building, he was an engaging presence. The demonstrations he did were really good and showed off some mind-bending uses of VR technology.

In fact let me just write this down so we’re all clear:

Mark Zuckerberg, whilst in VR as an avatar, took a video call (as his avatar) with his wife (who was not an avatar) and they then proceeded to take a selfie of the two of them 'together' in their home, with their dog, who was part of the 360-degree footage of their home that Zuckerberg was standing in… and then he posted it to Facebook from within the same app.

That’s just… crazy.

He did a really good job of showing just how amazing the technology we're all using is, emphasised Facebook's commitment to making VR the best it can be, and came across as relatable and sincere in his excitement about it all.

Suitably Abrashed

I also mentioned it was time to start shining the spotlight a bit more wisely and start making some other 'faces' of the company. We got more Iribe, and John Carmack had his keynote, which I quickly gave up trying to take notes for. Dear God, does the man even breathe? You'd need to create a shorthand for your shorthand. Carmack did however provide my favourite moment of OC3 by telling people who are snobbish about 360 video not being "true VR" to essentially get off their high horse, reminding everyone just how much of it gets viewed compared to everything else.

Here's an appropriate internet-friendly summary:

[image: Carmackman]

And here’s my immediate reaction:

(You can now follow me on Twitter and tell me what a terrible person I am.)

One person I forgot to mention last week, and who wasn't even mentioned to me in the discussions I had in the run-up to OC3, was Michael Abrash. We see so much of Luckey, Carmack, Iribe and Zuckerberg that we tend to forget about Abrash, which, based on his performance at OC3, is a real shame. I've mentioned many times on here that I'm not the most technical person, and whilst Carmack in full technobabble flow had my brains dribbling out of my ears by the end of it, Abrash was relatable, commanded the stage (despite being ill) and was everything you could ask for in making how VR works entertaining and understandable.

I was in the big Reddit discussion thread during the keynote and noted a lot of praise for his performance. One member complained Abrash wasn't "hardcore" enough as a programmer anymore for them to care what he said. (What does that even mean??) Regardless of what they thought, let's hear from Abrash more regularly in the future, please, Oculus. I was genuinely entertained and informed by his delivery.

And deliver Oculus did.

Price details (though no across-the-board realignment as I wanted), dates, info, even things that might have passed you by in the flurry of news – things like Oculus opening up aspects of the headset design so people can create custom additions. The only way Oculus could have staked a louder claim to re-establishing themselves as the leaders of VR's new age would have been if Iribe had marched on stage with an Oculus flag, jammed it into the floor and just yelled at everyone. Looking back, I'm almost surprised he didn't. Oculus didn't ask people to acknowledge what they were doing, they straight up demanded it.

Next stop: the future, and Oculus are going to drag you there by the lapels.