Futuremark Explains Why VR Benchmarking is About More Than Just Numbers

Futuremark have now released the full version of their long awaited, dedicated virtual reality benchmark, VRMark. And, after months of research and development, the company has found itself having to redefine its own views on how the difficult subject of VR performance testing should be tackled.

Futuremark are developers of some of the world’s best known and most widely used performance testing software. In enthusiast PC gaming circles, their visually impressive proprietary synthetic gaming benchmark series 3DMark has been the basis for many a GPU fanboy debate over the years, with every new version offering a glimpse of what the forthcoming generation of PC gaming visuals might deliver, and of what PC hardware fanatics can aspire to achieve.

Therefore, it was inevitable that once virtual reality reached the consumer phase, the company would take an active part in VR’s renaissance. After all, with immersive gaming came lofty initial hardware requirements and a necessary obsession with low latency visuals and minimum frame rates of 90 FPS. So surely a new Futuremark product, one focused purely on the needs of VR users, would be a slam dunk for the company. VRMark is the company’s first foray into the world of consumer VR performance testing and recently launched in full via Steam, offering up a selection of pure performance and experiential ‘benchmarks’, the latter viewable inside a VR headset.

However, as anyone who has experienced enough virtual reality across different platforms will tell you, putting a number on how ‘good’ a VR system performs is anything but simple. With dedicated VR headsets come complex proprietary rendering techniques and specialist dedicated display technology, much of which simply hadn’t been done at a consumer level before. The biggest challenge, however, the biggest set of variables Futuremark had to account for, was human physiology and the full gamut of possible human responses to a VR system.

Futuremark initially approached the issue from a purely analytical perspective, as you might expect. You may remember that we went hands on with a very early version of the software last year, which at the time came complete with some pretty expensive additional hardware. Futuremark’s aim at that time (at least in part) was to measure the much coveted ‘motion to photons’ value – the time it takes for an image to reach the human eye, from render to display. However, you’ll notice that if you’ve popped onto Steam to purchase the newly released VRMark, it does not list ‘USB oscilloscope’ or ‘photo-sensitive sensor’ as requirements. Why is that?

We asked Futuremark’s James Gallagher to enlighten us.

“After many months of testing, we’ve seen that there are more significant factors that affect the user’s experience,” he says, “Simply put, measuring the latency of popular headsets does not provide meaningful insight into the actual VR experience. What’s more, we’ve seen that it can be misleading to infer anything about VR performance based on latency figures alone.” Gallagher continues, “We’ve also found that the concept of ‘VR-ready’ is more subtle than a simple pass or fail. VR headsets use many clever techniques to compensate for latency and missed frames. Techniques like Asynchronous Timewarp, frame reprojection, motion prediction, and image warping are surprisingly effective.”

Gallagher is of course referring to techniques that almost all current consumer VR hardware vendors now employ to help deal with the rigours of hitting those required frame rates and the unpredictable nature of PC (and, in the case of PSVR, console) performance. All these techniques (Oculus has Asynchronous Timewarp and now Asynchronous Spacewarp; Valve’s SteamVR recently introduced Asynchronous Reprojection) work along similar lines to achieve a similar goal: to ensure that the motions you think you’re making in VR (say, when you turn your head) match what your eyes see inside the VR headset. The upshot is minimised judder and stuttering, two effects very likely to induce nausea in VR users.
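To give a rough sense of how these compensation techniques work, here is a deliberately simplified sketch of the reprojection idea in Python. This is a hypothetical, yaw-only illustration; real implementations like Asynchronous Timewarp run on the GPU and handle full 3D orientation, and the pixels-per-degree constant is an assumed value:

```python
# When a fresh frame misses its deadline, the SDK re-displays the last
# rendered frame, shifted to match the head rotation that has happened
# since that frame was drawn. Simplified to one axis (yaw) here.

PIXELS_PER_DEGREE = 10  # assumption: depends on display resolution and FOV


def reproject(last_frame_yaw, current_yaw):
    """Return the horizontal pixel shift to apply to the stale frame."""
    return int((current_yaw - last_frame_yaw) * PIXELS_PER_DEGREE)


# Head turned 2 degrees right since the last rendered frame:
shift = reproject(last_frame_yaw=30.0, current_yaw=32.0)
# shift == 20: slide the old frame 20 pixels so the world appears stable
```

The key property is that the correction uses the very latest head pose, so even a stale frame appears anchored to the world rather than stuck to your face.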

“With VRMark, you can judge the effectiveness of these techniques for yourself,” says Gallagher. “This lets you judge the quality of the VR experience with your own eyes. You can see for yourself if you notice any latency, stuttering, or dropped frames.” And Gallagher shares something surprising from their research: “In our own tests, most people could not identify the under-performing system, even when the frame rate was consistently below the target. You may find that you can get a comfortable VR experience on relatively inexpensive hardware.”

To describe Futuremark’s VR benchmarking methodology for consumers in more detail, here’s James Gallagher explaining it in his own words.


[Futuremark are] recommending a combination of objective benchmark testing and subjective “see for yourself” testing. We think this is the best way to get the whole picture, especially for systems below the recommended spec for the Rift and the Vive.
The reason is that the concept of “VR-ready” is more subtle than a simple pass or fail.
On the one hand, a literal definition would say that to be truly VR-ready a system must be able to achieve a consistent frame rate of 90 FPS on the headset without dropping a single frame. In this case, every frame you see comes from the game or app. You are getting exactly the experience the developer wanted you to have. You would use VRMark benchmarks to test this case.
On the other hand, when a system is unable to maintain 90 FPS on the headset the VR SDK will try to compensate by using Asynchronous Time Warp or frame reprojection or other techniques. In this case, only some of the frames you see on the headset are the real frames from the game. The others are created by the SDK to fill in the gaps caused by missed frames. Now, if the SDK does such a good job of hiding the dropped frames that you cannot tell the difference between it and the pure 90 FPS experience, then you could perhaps say that this second system is VR-ready as well. You can use VRMark experience mode to test this case.
Here’s an example to illustrate:
  • System A: VRMark Orange Room benchmark score 6500, average frame rate 140 FPS
  • System B: VRMark Orange Room benchmark score 5000, average frame rate 109 FPS
  • System C: VRMark Orange Room benchmark score 3500, average frame rate 75 FPS
  • System D: VRMark Orange Room benchmark score 2000, average frame rate 40 FPS
The benchmark results show that System A and System B are both VR-ready for the Rift and the Vive in the pure sense. Both have enough performance to render every frame at 90 FPS when connected to a VR headset. But the difference in scores and average frame rates tells you that System A has more headroom for using higher settings or for running more demanding VR games and apps.
System C and System D did not achieve the target frame rate. So the question is whether the VR SDKs can compensate for the missed frames. For that, you would use VRMark Orange Room experience mode with a connected headset.
You might find that you cannot tell the difference between System C and System B when using experience mode. Even though System C is regularly dropping frames, the SDK is able to compensate and hide the effects from the user. The VR experience is as good as that of a true VR-ready system.
With System D you might find that there are noticeable problems with the VR experience. The SDK is not able to compensate for the low frame rate. You might notice stuttering or other distracting effects.
From this, you would conclude:
  • System A is VR-ready with room to grow for more demanding experiences.
  • System B is VR-ready for games designed for the recommended performance requirements of the HTC Vive and Oculus Rift.
  • System C is technically not VR-ready but is still able to provide a good VR experience thanks to VR software techniques.
  • System D is not VR-ready and cannot provide a good VR experience.
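The decision process Gallagher outlines can be boiled down to a short function. This is a hypothetical illustration, not Futuremark’s code; the 120 FPS ‘headroom’ cutoff and the boolean experience-mode result are assumptions made for the example:

```python
# Sketch of the VR-readiness reasoning described above: an objective
# benchmark number combined with a subjective experience-mode judgement.

TARGET_FPS = 90  # recommended target for the Rift and the Vive


def classify(avg_fps, experience_ok, headroom_fps=120):
    """Classify a system from its average frame rate and a subjective check.

    experience_ok: whether the tester judged the experience-mode run
    comfortable (the subjective 'see for yourself' part of the method).
    """
    if avg_fps >= TARGET_FPS:
        if avg_fps >= headroom_fps:
            return "VR-ready with room to grow"
        return "VR-ready"
    # Below target: readiness depends on how well the SDK compensates,
    # which only the subjective experience-mode test can tell you.
    if experience_ok:
        return "not VR-ready, but a good experience via SDK tricks"
    return "not VR-ready"


# The four example systems from the article:
system_a = classify(140, True)   # "VR-ready with room to grow"
system_b = classify(109, True)   # "VR-ready"
system_c = classify(75, True)    # SDK hides the dropped frames
system_d = classify(40, False)   # SDK cannot compensate
```

Note how the benchmark alone settles the two top categories, while the bottom two can only be separated by putting on the headset.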
I think many gamers will want to know that the system they are considering will be truly VR-ready in the technical and pure sense. You can only get that insight from a benchmark. You also need a benchmark test that runs on your monitor to see how far beyond 90 FPS a system can go. The VRMark Blue Room benchmark is a more demanding test that is ideal for comparing hardware that outperforms the Rift and Vive recommended spec.
At the other end of the scale, price-conscious gamers might be perfectly happy with a cheaper system that can appear to be VR-ready through technical tricks, for example, the new Oculus Rift minimum spec announced at Oculus Connect in October. These systems can be evaluated with the benchmark (how much will the VR SDK have to compensate) and with experience mode (how well does the SDK compensate).

With all of that laid out, I asked Gallagher why, if Futuremark are now recommending people adopt a ‘see for yourselves’ methodology for VR benchmarking, he believes VRMark is needed at all. In theory, any single VR application or game could be used in the above methodology. Why should people invest in VRMark?

“I think the value of VRMark is that it gives you an easy way to make both these objective and subjective assessments using common content in one app,” he says, “The benchmark tests provide a convenient, easily repeated VR workload. They give you a pure test for VR-readiness. Experience mode gives you a way to judge the quality of the user experience on systems that don’t meet the pure definition.”
The latest VRMark is now on sale via Steam for use with the Oculus Rift, HTC Vive and OSVR compatible headsets. Current feedback on the title is mixed, with some criticising the lack of more extensive ‘pure’ benchmark functionality. Purely as a showcase for VR, the price (£14.99 / $19.99) seems perhaps a tad steep right now, especially considering a chunk of that pretty showcase (‘The Orange Room’) is available in the free demo version. That said, VRMark is a sight to behold in VR and, along with the methodology above, there are many who may find the money worthwhile.
We’d love to hear your thoughts on Futuremark’s recommended methodologies, your experiences with VRMark, and your thoughts on how VR benchmarking may evolve over time in the comments below.

The post Futuremark Explains Why VR Benchmarking is About More Than Just Numbers appeared first on Road to VR.

Breaking The Silence: How The VR Industry Mirrors The Early Days of The Silent Film Era


In December of 1895, Georges Méliès, a Parisian stage magician, became obsessed with a new technology. The cinematograph, a device for capturing and projecting moving images, fired his imagination, but its creators, the Lumières, refused to sell him one. So Méliès spent a year designing and building his own camera. He taught himself to develop and print film. He constructed a studio, assembled a production team, and recruited actors. Together they produced over 500 short films, full of creativity and spectacle. They also discovered dozens of new cinematic techniques in the process. Film, they came to realize, was more than “stage plays projected on a wall.” It was a whole new medium, with new constraints and new abilities.

Today’s VR storytellers find themselves in the same challenging position as Méliès. Like film before it, VR is a new medium, demanding new techniques for conveying time compression, simultaneous action, locomotion, scene transition, and more. Much as the early pioneers of film did, early storytellers in VR must work within tight technical constraints, and must frequently construct their own tools in the process.

Creating Inside The Box

The modern cinematic toolkit is full of tricks and techniques which seem obvious in retrospect. Each represents a battle won by some early filmmaker, struggling against the limitations of the medium. Cuts, cross-cuts, and dissolves appeared quickly, introduced by Robert Paul and James Williamson in film’s first few years. Discovery of other techniques took a decade or more. Tracking shots, popularized by Giovanni Pastrone in Cabiria, first appeared in 1912. Clarence Badger pioneered the use of zoom shots in 1927’s It, the film which made Clara Bow the “It Girl,” but the technique wouldn’t find mainstream use until 1960s French New Wave.

VR storytelling, still in its infancy, lives within the constraints of its limited toolkit. Lost and Henry, the first VR shorts from Oculus Story Studio, each play out in a single scene, in a single location, in real-time. These are restrictions Méliès and his contemporaries would have found entirely familiar.

While film can make use of close-ups, pans and zooms to direct a viewer’s attention, VR has little access to these techniques. Forced camera motion can destroy VR’s fragile sense of presence, and also induces nausea in many viewers. VR storytellers have, so far, largely drawn from the tools of stagecraft, using light, sound, and motion to direct the viewer’s gaze.

But more robust, VR-specific techniques are emerging to supplement these older methods. Colosse, directed by Nick Pittom, makes use of gaze-driven triggers to control the flow of the story. If a viewer is looking away when an important event is about to happen, that event will wait to unfold until the viewer’s gaze returns to the proper direction.  “We can also move events from where a viewer is ignoring to where the viewer is paying attention,” Pittom elaborates in an interview, “or alter the events completely.”
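A gaze-driven trigger of the kind Pittom describes can be sketched in a few lines. This is an illustrative Python sketch, not Colosse’s actual implementation; the class name and the cone-angle threshold are assumptions:

```python
import math

# A story event that waits until the viewer is looking toward it.
# Gaze and event positions are expressed as 3D direction vectors;
# "looking at" means the gaze falls within a cone around the event.


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)


class GazeTrigger:
    """Holds a story event until the viewer looks toward it."""

    def __init__(self, event_direction, threshold_degrees=30.0):
        self.event_direction = norm(event_direction)
        # Pre-compute the cosine of the cone half-angle so each frame
        # only needs a cheap dot-product comparison.
        self.cos_threshold = math.cos(math.radians(threshold_degrees))
        self.fired = False

    def update(self, gaze_direction):
        """Call once per frame with the headset's forward vector."""
        if self.fired:
            return False
        if dot(norm(gaze_direction), self.event_direction) >= self.cos_threshold:
            self.fired = True
            return True   # viewer is watching: play the event now
        return False      # viewer is looking away: keep waiting


trigger = GazeTrigger(event_direction=(0.0, 0.0, -1.0))
trigger.update((1.0, 0.0, 0.0))   # looking away: event waits
trigger.update((0.1, 0.0, -1.0))  # looking at the event: fires
```

Moving an event to wherever the viewer is paying attention, as Pittom also mentions, would be the inverse: instead of gating the event on the gaze, reposition the event along the current gaze direction before firing it.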

Read More: Why Personal Space is One of VR’s Most Powerful Storytelling Tools

Pittom’s team is experimenting with other narrative choices unavailable in film. “I often consider how, rather than directing the viewer towards events, I can direct the events around the viewer,” he says. “Maybe it’s okay to have more than one event happening at any one time and allow the viewer to decide what is important to them. As VR storytelling evolves, we’re going to be able to walk the line between interaction and storytelling.”

While much pioneering work on VR’s narrative language is being done in traditional storytelling, game designers must also tackle some of VR’s thornier editing challenges. Building off of camera work each had done previously, Gunfire Games and Oculus Studios collaborated on a new type of transition for Chronos. “The camera system utilized in Chronos breaks many of the traditional rules you would typically find in film,” says Richard Vorodi of Gunfire Games.

In film, if an actor reverses direction when exiting one shot and entering the next, the result is a jarring break in continuity. In Chronos, when a player’s avatar moves between rooms, the player’s 3rd-person perspective jumps from a camera in the first room to a camera in the next. This perspective change can reverse the relative direction of the avatar’s motion.

But in VR, unlike film, the reversal produces no ill effects. Vorodi elaborates: “We found that with interactive VR, as long as the [avatar’s] position remained the same while exiting and entering shots, we could make just about any transition that was required for compelling gameplay.”

Expanding the Palette

Early filmmakers gave up a great deal moving from stage to film: color, sound, depth, and the interplay of energy between audience and actors. But film is also capable of things which are impossible in theatrical work. Location shooting lets filmmakers make the world their stage. Close-up shots bring the audience near enough to see and appreciate nuanced performances. Cross-cutting gives filmmakers a tool to show simultaneous action in multiple locations.

And camera tricks such as double exposures, substitution cuts, and running film backwards (all staples of Méliès films) allow for effects impossible to produce on stage.

Read More: Ultimate Beginner’s Guide to VR Storytelling

VR, in turn, has capabilities that extend beyond what is possible on film. The giant robot in Lost makes jaw-dropping use of VR’s superior sense of scale. In Henry, VR’s shared space and intimacy convert what would be a slapstick fall “on screen” into a moment of sad empathy “in person.” And without a fourth-wall to break, Henry’s eye contact with the audience feels genuine.

“The early days of cinema over a century ago was an era of incredible artistic freedom and technological progress,” says Eugene Chung, Founder & CEO of Penrose Studios. “Established studio systems and conventions did not exist, and therefore, artists and technologists weren’t constrained by the need to sell to a broader audience. They were free to explore. With the emergence of VR, we’re seeing a similar pattern of artistic freedom and rapid technological progress.”

Penrose Studios has done pioneering work in not only allowing the viewer to move within the scene but compelling them to do so in order to follow the narrative. In Penrose’s first film, the 5-minute The Rose & I, the main character lives on a tiny planet suspended in front of the viewer. When the action of the story takes him to the far side of his planet, the viewer must lean or walk around it to see what happens next. The studio takes this even further in its second release, Allumette, a 20-minute animated narrative that is one of the longest VR films of its type. The film held its red carpet World Premiere at the Tribeca Film Festival in April.

“With Allumette, we wanted to craft a full, emotionally-engaging story that was longer than anything we had attempted before,” says Chung. “The first films in the 1890s were only a few seconds long, then a few years later came The Great Train Robbery at 12 minutes. Eventually, over time, creators figured out that feature length films could actually work. In VR, the sweet spot of length of time for an experience is still under question, but we’re making strides towards figuring it out.”

The Road Ahead

Most VR storytellers are aware that the medium is still in its Georges Méliès period. Eugene Chung (who also cofounded Oculus Story Studio) opened the VR Filmmaking panel at the first Oculus Connect with a screening of the Lumière Brothers’ Arrival of a Train at La Ciotat. And, perhaps knowingly, Oculus hosted Connect in a hotel interlocked with a full-scale replica of the Babylon set from D.W. Griffith’s Intolerance (1916).

It took filmmakers a decade to master enough of their craft to attempt the first feature length film (Charles Tait’s The Story of the Kelly Gang, 1906), and it would be many years more before film would mature and achieve greatness. VR may not yet be capable of producing its own Citizen Kane, but the medium’s Lumières, Méliès, and Blachés are working tirelessly to create the tools which will one day support one. And we get to share in their joy of discovery as they each take us on their own Trip to the Moon.


You want Magic Cheap? Support ZapBox for budget-AR!

Great gag! Darn, sorry, it wasn’t mine. I took it from ZappAR’s new Kickstarter campaign about their plans to bring budget AR to the masses! The well-known, London-based AR company ZappAR is running a campaign to fund their new project, titled “ZapBox”. Read about it today to learn how you could get a complete AR experience for just 30 bucks!

Cardboard and paper-controllers

If ZapBox gets its funding (it sure seems so), we will be able to use their cardboard-style headset as the simplest form of head-mounted AR. You may already know AR from ZappAR on your phone, but that is the typical smartphone AR: a window to the AR world, your hands blocked by holding the phone between you and the subject of (augmented) interest.

Now with ZapBox they want to go a step further – and dirt cheap at that! They plan to bring us not only a cardboard headset to put your phone in, but also two paper hand controllers (like a paper version of the HTC Vive ones). To make it happen, they decided to use a wide-field-of-view lens (a clip-on for your smartphone camera) and fiducial markers (like in the good old ARToolKit) they have dubbed “point codes”. With this they want to allow a fully mixed-reality, room-scale setup:

ZapBox combines physical components with advanced software to provide magical Mixed Reality experiences. Insert your smartphone into the ZapBox headset, start the ZapBox app, and step into a whole new world of interactive content.

The ZapBox app displays the live feed from your smartphone’s camera so you can see what’s around you in the room. The magic part is that it can also display virtual objects that appear anchored to the real world, so you can change your viewpoint by simply moving around.

The video below shows their idea in a short clip:

What do we get?

Well, as a supporter you would get the cardboard headset, the lens clip, the paper controllers and the circular markers. The software package for supporters will include the demos shown: painting, mini-golf, a dancing mini-game, xylophone play and the Mars exploration.

Developers have full access to the system to develop their own apps. ZappWorks Studio will support ZapBox nicely, and they also have a JavaScript-based API for ease of use. The whole package is supposed to ship to early backers in March 2017, and in April for the non-beta backers.

So, is it worth investing in cardboard quality at all?

ZappAR definitely did a good job with their Kickstarter campaign, answering all questions and presenting the idea nicely. 30 dollars sure sounds promising, too! But does the whole concept make sense at all?

On a feature-to-money ratio it sure does. They also list this on their page, ending up with the result that ZapBox must be as good as the Hololens (comparing the “checked” features) – the only drawback being that you need a cardboard-ready phone in addition to the 30 bucks. This can still be way cheaper than any commercial Hololens device in 2017. So, why would I want to get a Hololens then?

Obviously their ad and feature listing are quite biased, and nothing has yet been said about the quality of the experience. For starters, the internal sensors, camera, CPU and so on of a regular Android phone on the street (unless it has Daydream-ready specs) will just not reach those of a highly sophisticated Hololens or other thousands-of-dollars devices.

Second, the whole thing is a video see-through system with a mono camera. How can you get a stereoscopic impression (for both augmented content and real-world capture) with a mono lens? They do explain in their FAQ that they use some magic tricks to bend the video feed into two versions that work stereoscopically. This sure is possible, but I’m a bit afraid of the quality.

Third, why use old-school markers again? Didn’t we get past them years ago? Do I really have to put paper points all over my place again? Well, check back again on the title of the post. We want it dirt cheap! I can understand that ZappAR opted for this route. Today’s phones just don’t have depth sensors in them (waiting for Tango!), and doing full SLAM tracking might steal too much performance, especially if you need to combine it with the tracking of the hand controllers (which just wouldn’t work). Markers might be ugly, but they are just the damn cheapest thing to do!

Sure, this setup has a high potential of making you motion-sick, letting augmented objects jitter around, or making interaction with the controllers cumbersome. Also, I’m no fan of video see-through AR, never was. The missing (real) stereo could also get on your nerves. But hey! That said, what I like about their idea is that they, as they say themselves, want to democratize HMD-based AR with it! Everybody can play! It sure is the cheapest way to start doing fully interactive, hand-controller-supported, room-scale MR (you can also switch to a full VR mode!). Every student or young professional without a budget could try this out to make first prototypes of AR until devices get cheaper.

I just hope for them that they don’t become outdated before they ship the packages… Tango is around the corner, and obviously everybody could snap a Tango phone into their cardboard headset next year, probably getting way better results. Maybe then hand interaction will even work, without the need for paper controllers. Room-scale could then also do its job well without paper markers on the walls… So, if they are quick, it might work as an interim solution. But I sure hope they build the SDK / API with these updates in mind, so that their platform can easily support Tango and others next year! ZapBox definitely is a fun project to follow.

So, check out their Kickstarter page, where they go into more detail, and support them if you think this system is for you!

The 9 Best PlayStation VR Games to Play Right Now


The age of consumer grade video game console-powered virtual reality is finally here with the PlayStation VR. Sony’s headset isn’t as powerful as the Oculus Rift or HTC Vive, or as portable as the Samsung Gear VR or Google Daydream View, and it doesn’t really offer fully-featured roomscale tracking, but it brings VR into the living rooms of over 50 million PlayStation 4 owners at an affordable price with a strong lineup of software.

There are dozens of games already available for the PS VR, and it can be overwhelming to look at the PSN Store or gaming store shelves and see so many options. Which games are the best? What if I want a shooter and a music rhythm game, or an adventure title and a horror experience? We’ve compiled our definitive list of the 9 best PS VR games that you can play right now to alleviate those concerns. There’s something for everyone on this list — guaranteed! The Playroom VR, Call of Duty: Infinite Warfare’s Jackal Assault, and other free experiences are not included. You should be downloading those regardless.

The following games are listed in no particular order, and several awesome titles we wanted to include were left off. Make sure you check the footnotes at the bottom of the article for past entries on this list that were retired to make room for newer games. The PS VR has plenty of great games already; these are just what we deemed the very best so far.

RIGS: Mechanized Combat League [Review: 8/10]

If the PlayStation VR is here to prove that capable virtual reality is possible on a home video game console, then RIGS is here to prove that VR gaming can deliver a fast-paced, intense, competitive multiplayer shooter unlike anything else on any of the headsets. It combines the intense action of games like Unreal Tournament with the mechanized combat and spectacle of a futuristic sports league.

What RIGS lacks in non-competitive content it more than makes up for with depth and breadth within its core game modes. The variety of ways to blow other mechs up, as well as the variety of the mechs themselves, make it an addictive experience worth coming back to over and over again. More maps, mechs, abilities, game modes, and features are planned over the coming months ensuring this game will get the support that it deserves.

Until Dawn: Rush of Blood [Review: 7/10]

If you’re looking for something to get your pulse pounding in a slightly different way, then look no further than Until Dawn: Rush of Blood. This fright fest is based loosely on the same world of Until Dawn, a narrative-based adventure horror title for PS4. In this spin-off, you are descending down into the metaphorical, and quite literal, depths of hell fighting off crazed clowns and creepy monsters.

I won’t lie and say that this is anywhere near the best that VR has to offer — there are higher scoring PS VR games I left off of this list, for example — but it scratches a very specific itch so incredibly well. The 360-degree audio does an incredible job of immersing you and the jump scares are so well-timed it becomes addictive to see what’s around that next corner. This is one trippy on-rails shooter that’s unlike anything else out there and it lasts a hefty 3+ hours for a single play through on Normal difficulty, which is great for the price tag.

Battlezone [Review: 8/10]

After you’re done blasting away enemies in rival mechs and wetting yourself from spooky thrills, it’s time to get behind the controls of a futuristic tank in the revamped and revitalized Battlezone. The series made its debut decades ago when games were mere lines and pixels, but now, with the power of VR, Battlezone is back at it again with a fresh coat of 3D immersive paint.

Other shooters in VR will pit you against one another, but Battlezone is instead a celebration of cooperative strategy. If you have a few friends that got PS VR headsets as well, then this should be your go-to buddy game that lets you roll out as a squad of super-powered neon death machines.

 How We Soar [Review: 8/10]

This isn’t a game for everyone and it’s hard to fully appreciate without trying, but for gamers that enjoy a more esoteric gaming experience from time to time — something that evokes a calming sense of relaxation mixed with gorgeous vistas — there is plenty here to enjoy.

It lacks the action-packed gameplay of something like Eagle Flight, instead opting for a more slow-paced and deliberate design that’s beautiful to behold. It feels more in line with something like a walking simulator adapted for the immersive space, but those that find pleasure in these sorts of experiences will discover one of PS VR’s most uplifting journeys.

Resident Evil 7: Biohazard [Review: 9/10]

This is the game that PS VR fans have been waiting on ever since it was announced back at E3 2016. Resident Evil 7: Biohazard does so many things it’s hard to believe Capcom actually pulled it off. First and foremost, it reinvents the stagnating Resident Evil series with a swift kick in the pants moving it from the third-person to the first-person perspective. However, while undergoing that change, this new game also serves as a return to form for the series as its survival horror roots are reintroduced to great effect.

But the most impressive feat by our record is the fact that it delivers a 12+ hour long campaign that’s fully playable in VR with a multitude of comfort options. The atmosphere is haunting, the story is memorable, and the gameplay is rewarding enough to easily make this rank among the top of the pile for the entire horror genre in recent years.

EVE: Valkyrie [Review: 9/10]

In our original review of EVE: Valkyrie from back when it launched on the Oculus Rift, we called it the game that VR headsets were designed to play. Now that it’s available on the HTC Vive and PS VR as well, with fully-operational cross-play multiplayer between all three major devices, that vision is even more fully realized.

With a variety of game modes released for free since launch, as well as new maps and ships, there is a ton of content in the package now. Competitive multiplayer dogfights are still the heart of the experience and this offers some of the most intense multiplayer matches you’ll find in any game.

Thumper [Review: 9/10]

Trust me: you need to play Thumper. This game is so hard to properly articulate that even watching a video won’t do it justice. But, alas, that’s what I have here, along with my words, so that will have to do for now. To put things simply, it’s one of the most viscerally satisfying and visually enthralling experiences you can have inside of any VR headset on the market.

Described as a rhythm-violence game, Thumper pits you on a track and asks you to time your button presses, turns, and evasions across a series of dozens of tracks to the beat of a thumping soundtrack geared to make your face melt. It’s dark, twisted, and at-times infuriating, but it’s also simply fantastic.

Bound [Review: 9/10]

We’d forgive you if you wrote this off as a non-VR title, because at first, that’s what it was. But a free update to the game adds VR support and helps separate this from the pack as one of the most unique and breathtaking experiences you can have on the PS VR.

Bound tells an emotional and intimate story unlike anything else you’ll see in the medium and its beautiful world of bright visuals and evocative dance is worth exploring even for the most jaded of gamers.

Driveclub VR [Review: 7/10]

This is the racing game that you’ve been waiting for. If you already played the original Driveclub on PS4, there isn’t a whole lot here that you haven’t seen before, but it does add a few new tracks, cars, and game modes to give you a bit of a new experience mixed with the old.

For those that missed out the first time around, or simply want a realistic racing game to enjoy on the PS VR, then this is a no-brainer. Paired with a full-size racing wheel, the PS VR can quickly become a portal to the car of your dreams instead of just a hunk of plastic on your face.

1/30/2017 Update: Job Simulator and Robinson: The Journey have been retired from the list to make room for Resident Evil 7: Biohazard and How We Soar.


Testing HoloMaps: viewing real-time & 3D map data on the Hololens sure is fun

Hey everybody,

just yesterday, HoloMaps launched for the Hololens. This holographic map tool lets you view 3D map data and enrich it with real-time data like tweets, annotations, or traffic. Read on for my first impressions and what you can use it for right now.

What can it do?

HoloMaps has been available since November 15th for Windows Holographic only, so you won't be able to try it out without a Hololens. The developing company, taqtile, writes that they have about 200 cities and landmarks available to browse through in 3D. They describe it in short:

HoloMaps displays topography, infrastructure, and buildings in 3D while integrating data sources to bring contextually relevant information into the map. Collaborate and present to colleagues in the same room or remotely.

There are different versions out in the wild. The one you can download from the store uses Bing map data: it shows the 2D version during loading and builds up 3D renderings of cities where available. Quality varies drastically here, with often low-res reconstructions from Microsoft's aerial photographs. Obviously taqtile depends on the input data, but some landmarks are available in hi-res versions to enjoy. They list the following: Empire State Building, Central Park, The Financial District California, Niagara Falls, The Space Needle, La Sagrada Familia Barcelona, The Colosseum, The Magic Kingdom Disney World, Fenway Park and Copenhagen Denmark. So, enough on the cities. What can you see? Check out their video for a quick look:

There are also professional collaborations planned, and the first one already built and used was the HoloMaps version made for the PGA Tour during the 2016 Players Championship of golf. A new video (just posted) shows the demo nicely. Here, the user can scroll around the golf course using the tap-and-hold gesture or fly to points of interest by talking to Cortana ("go to the club house", "go to hole 16"). Different views and layers can be activated, for example (quite interesting) a heatmap of golf ball landings on the course. The 3D scans / 3D models are quite hi-res here and make a very good impression. The software is also multi-user ready, to enjoy together.

So, how does it perform and what can I use it for?

ironmanmap

So, I tried it out myself. It worked out of the box: just place the hologram and off you go! I flew over to Munich ("show me Munich" via Cortana) and scrolled around with tap-and-hold commands. First impression: wow! In almost every sci-fi or near-future movie you can see holographic tactical maps or overview charts projected into the room (be it Ironman, Star Wars, whatever) – and with this app you can finally see it happen!

I like the simple presentation of the circular map section you're presented with, along with the small menus at the edge of the plate. The menu moves around according to your physical movements, giving you access to it from any point of view. The same goes for the weather forecast presented behind the city landscape.

Selected layers for traffic flow or annotations worked nicely, though tweets did not show up (maybe there were none at Marienplatz, Munich?). Doing text annotations myself was a bit cumbersome, since you can only dictate them through Cortana (no typing); with special characters or mixed languages it simply doesn't work. But then again, this piece is surely just a demonstrator, and obviously more could easily be added. The same goes for the 3D ink annotations, where you can use your finger to draw above the 3D buildings. Accuracy is very limited, and I guess you would only use it (for now) to highlight a certain building by scribbling on top of it briefly.

Overall, the demo was great to look at (though sometimes the quality of the 3D buildings is really rough) and fun to pan around the landscape in. You really feel like a giant observing the small human peasants (or at least their tiny houses), and you can easily see that this kind of 3D planning and overview can become a new means of interacting with and looking at information. The movement by tap-holding and waving your hand around was sometimes a bit hard; here taqtile just uses the official Hololens gestures, of course, but more interaction forms would make the experience better. Maybe wave away a map, air-select other cities or sights, and bring in more "haptics", e.g. being able to directly click on a building. That would be nice, but it's limited by the current Hololens sensors / SDK. So, currently interaction is very slim and only shows tweets or traffic as examples. Right now there is not much to do or manipulate – but it's already a lot of fun to fly around the cities. It's definitely a wow effect seeing your real-world neighborhood on your desk and panning around in it!

What will be next?

I can certainly see many use cases and ways this could continue. The above video from the PGA with the golf course is a nice demonstrator; planning city traffic, architectural city planning, or basically any landscape-related planning could profit here. I couldn't try the multi-user mode, but I'm sure this social feature especially will be a wonderful selling point for AR! Just like in a discussion around a clay or cardboard model of a city district (or whatever), you can easily gather around with more people and take a joint look.

Like in the above still from Ironman, we can basically take any movie that uses holographic maps as an example for use cases: planning heists, attacks, killing major holy trees, anything!

Exclusive: ‘ROM: Extraction’ Is The Debut VR Shooter From ‘Call of Duty’ Vets, First Contact


Think of any Call of Duty game released in the past decade and not only has one of the two dozen or so developers at First Contact probably worked on it, but they probably also have the box art framed on one of the walls of their Santa Monica, CA studio.

Beyond Activision’s flagship franchise, with years spent at both Infinity Ward and Treyarch, senior leadership at the company also have backgrounds at companies like Blizzard and Starbreeze. It doesn’t get a whole lot more AAA than World of Warcraft and COD. When we first broke the news about First Contact’s initial investment, Hess Barber, co-founder and president of the company, told us that they wanted to create more “expansive” VR content by taking “proven mechanics and added some new ones to create a very unique and enjoyable gameplay combo.”

After going hands-on with their debut game, ROM: Extraction, I can confirm that they’ve done exactly that.

“We built the original prototype for ROM: Extraction in just a week,” says Barber during my visit. “It’s been polished a ton since then obviously, but the core of its functionality is the same. We came up with this really addictive combination of mechanics in the first week and just ran with it.” Barber was joined by First Contact’s Vice President, Josh Ochoa, and Director of Community Management, Jessica Ward, during our interview and tour of the studio.

Whereas a lot of developers may workshop a world first, then build a game to fit into that setting, the team at First Contact opted for the reverse. With how visceral and immersive VR can be, they decided to hone in and perfect a particular gameplay loop from the outset, then build a concept around that core mechanic. In doing so, everything in the experience feeds into itself to enhance the gameplay.

It’s simple, but incredibly engaging. From the outset, you choose a dominant shooting hand — right hand for me — and are given a pistol that shoots quick laser bullets with a satisfying ‘pew’ sound. My left hand is where the real magic happens. With the press of a button, I can spawn a bright, glowing orb. If I throw the orb, it bounces around walls like a supercharged pinball, and if I shoot it out of the air it emits a massive burst of energy as it explodes. But the real catch here is that, not only can I throw and shoot orbs, I can slow down time as well. Simple to pick up, not too complex, but addictive and fun like a classic arcade shooter. In ROM: Extraction, this is the secret sauce.

“That’s the heart of the game mechanic: spawn, throw, slow, shoot,” says Barber. “You get into a rhythm with it eventually and it has a very distinct look when you’re just watching someone play the game inside the headset. That’s the magic. We found out it was fun as hell and wanted to build something around that specifically.”

“The premise of the universe is that in the next several years humans are on the moon digging for diamonds and other resources,” explains Barber. “They unearth these chambers that look like a nesting ground almost, kind of like where a turtle would lay its eggs, and they find these orbs around the size of baseballs. They discovered that if these things were dropped, or hit, they started to ignite internally, bounce around, and explode. Naturally, us humans decide to weaponize them.”

In ROM: Extraction, everything takes place several decades in the future from the initial mining and humans have established an entire industry around the orbs. Earth’s moon is eventually depleted, so we venture out to other moons to continue scavenging. That is, until the aliens have had enough of us capturing and destroying their eggs.

In the feature image at the top of this article you can get a glimpse of what the aliens might look like in ROM: Extraction. I say might because that isn’t an alien actually — it’s a robot they’ve built. Just like humans create cyborgs and robots that look like other humans, the aliens have created these robots in their likeness as well. So while we don’t get to see the aliens yet, we have a pretty good idea of what they may look like. They deploy these robots to try and stave off our harvest, a sign you’d think would be enough to turn us away.

For my demo, it was a simple level that’s reminiscent of games I’ve already played before, but the orbs add a bombastic twist that’s hard to explain without experiencing it. I’m standing in the center of a sci-fi console platform within a circular room as a timer ticks down. I can’t move around the environment at all, but I can maneuver around my physical room space. Enemies start to emerge, opening fire on me from all around. It’s a multi-tier level: some enemies skulk on the ground floor, peeking out from behind cover, while others take aim from the safety of windows above me. My instincts kick in.

Drawing my right arm up, I open fire and rapidly blast robots with my pistol, only to notice it takes a dozen or more shots to defeat even a single one. As I should have expected, the standard pistol isn’t very effective. So I pull my left arm back, throw an orb, slow time, and blast it just as it nears the head of one of my adversaries — then my jaw falls open. The huge explosion, rendered in slo-mo, sends the robot cartwheeling across the environment with glorious rag doll animations. A smile creeps onto my face as it clicks for me: never mind the fact that I’m exploding alien fetuses to defeat an army of hostile robots, these explosions are incredible.

Some robots are crouching down behind pillars, but no worries, I throw an orb, wait for it to pass the pillar and — BOOM! — dead. Power-ups start to appear around me — some make the orbs stick, some make them split off into other orbs, and some result in deadly streaks behind them as they bounce. Before I know it I feel like a mixture between Gambit from X-Men throwing orbs instead of cards and Deadpool, bending and dodging bullets as I mow down my enemies. After the three minutes are up, my platform descends downward towards what I can only assume is the next phase of the mission: actually extracting orbs.

What I played was a brief and exciting glimpse into the world that First Contact is building. There are plans for an exciting asynchronous cooperative multiplayer experience, various maps, new enemy types, more weapons, unique power-ups, and more. So while what I played was just the beginning, it could grow and expand exponentially from here.

First Contact is planning to release the first phase of ROM: Extraction later this year on Steam, with regular content updates planned every one or two months into next year. It’s a good business model that has been working for other VR studios, as long as the updates are meaningful. While I came away impressed with the polish and quality of the experience I saw, it’s far from the “expansive VR play that mixes real gaming and narrative content” that they originally alluded to when their investment was announced. But Barber assures me they’re working on expanding ROM: Extraction and debuting other, larger projects as well. Unfortunately, First Contact was not ready to show gameplay footage of ROM: Extraction at this time, but a teaser trailer is slated to release in the coming weeks.

ROM: Extraction is scheduled to release later this year for HTC Vive and Oculus Rift with regular updates throughout the next several months. They’re interested in bringing the experience to PS VR as well, as long as they can preserve the 360-degree gameplay with the limited tracking of Sony’s headset.

You can keep up with First Contact and ROM: Extraction through the official website and following the company on Facebook and Twitter.

‘Into The Rift’ Episode 1: A Brief History of Oculus


Hey there friend. If you clicked on this post then you are either already a fan of the scrappy young company known as Oculus, or you’re dying to learn more and become a fan yourself. Either way, I just want you to know that within these digital walls, you’re among friends. This is a show for all of us that witnessed the 2012 Kickstarter, endured the Great Component Shortage of 2016 and are currently engaged in a rousing game of “Where in The World is Palmer Luckey?”

This is a place for curious newcomers to learn more about the company and its products. This is a place for time-tested fans to swap war stories and compare theories. This is a place for all of us to dream about the future of virtual reality. This is “Into The Rift.”

Welcome to the party, pal.

Episode One: A Brief History of Oculus

In this inaugural episode, your hosts Joe Durbin (hey, that’s me), Ashley Whitlach, and Az Balabanian dive into the history of Oculus. We discuss where the company has been, where it is now and where it may go in the future. This is a fantastic starting place for new fans, and a great refresher course for all you longtime believers. Some highlights of the episode include:

One young boy’s journey from tinkerer to tech titan

Our take on Oculus Touch and what it means for the company’s future

Ashley’s strange infatuation with Captain Planet

What Now?

Thanks for listening! Now we want to hear from YOU! Hit the comments below with your burning questions about Oculus and we’ll answer them on the air during next week’s episode! You can also email Joe@uploadvr.com or Tweet @UploadVR.

Next week’s episode features a lively discussion with the creators of Rift launch title Lucky’s Tale. We’ll be talking all about the creative process for that game and what it’s like to work with Oculus on content. Going forward, new episodes will air every Tuesday.

Right now you can find us on SoundCloud, but we’ll be on iTunes, Stitcher and other major podcasting platforms as soon as we can.

Thanks for listening, sound off in the comments, check out next week’s episode and get excited people. The future of Oculus looks bright.


Brave New World: How Crytek Is Using VR To Push Gaming To New Limits


Crytek is one of the most influential studios when it comes to introducing ambitious technology to video games. Since it was founded 17 years ago in 1999, it’s produced some of the most technically impressive titles in the industry, changing first-person shooters in the process.

The first Far Cry game, released in 2004, was lauded for its massive open world setting, long draw distances, seamless transitions between outdoor and indoor areas, and even an advanced rendering system for vegetation. Crytek developed its own game engine, CryEngine, for Far Cry as well. Since then, Far Cry has quickly become one of the premium shooter franchises, and CryEngine has been used in a slew of stunning titles, including Ryse: Son of Rome and Everybody’s Gone to the Rapture.

Crytek undoubtedly has a storied history with technology, one that’s replete with accomplishments like Far Cry. It shouldn’t be a surprise, then, to see the studio embrace virtual reality.

From Far Cry To VR

“Innovation is in Crytek’s DNA and we are always investigating new technology,” said Elijah Freeman, Crytek’s Executive Producer. “VR is a medium that allows us to invent and try out new gameplay ideas, and with CryEngine we have a great foundation that gives us the freedom to experiment and translate our vision to a new platform. We have been developing VR for the last two years and one of our goals is to be one of the leading AAA VR content and technology creators.”

Like Ratchet & Clank developer Insomniac, Crytek is one of the more well known studios that’s been leading the charge on VR from the start. As much as Far Cry, Crysis, and a slew of other titles helped shape what Crytek is and means for fans, its heritage is very much rooted in exploring new technologies. The studio is convinced that VR is here to stay and isn’t just another gimmick that’ll be forgotten a few years from now.

With The Climb [Review: 8/10], a rock climbing simulator, and Robinson: The Journey [Review: 7/10], a first-person exploration game, Crytek already has a decent understanding of what works and what doesn’t work in VR. The studio’s experience making technically advanced shooters has definitely helped.

“The original Far Cry is now over 12 years old, and since then a great deal of work at the studio has been focused around creating technologically groundbreaking first-person games,” said Freeman. “Through that process, the team here has learned a lot about what does and doesn’t work from the player’s perspective, and that knowledge has always been poured back into CRYENGINE too – which has made it a very powerful resource for a project like Robinson.

“On top of that, Crytek has always sought to push the envelope in terms of visual fidelity, so we’ve developed a lot of techniques and tools over the years that are advantageous for VR development. Add in the fact we released Crysis 2 in stereoscopic 3D half a decade ago and I think you have a picture of a studio that was really primed for VR. It has certainly felt like a very natural step for us.”

Rock Climbing Was The Beginning

Crytek had a couple of ideas for what its first two VR games should be. Though it’s capable of delivering enjoyable and engaging gunplay, the studio also has experience making games where platforming is a major part of gameplay. In both Far Cry and Crysis, players often have to traverse mammoth environments and structures. Crytek realized that first-person platforming would work well in VR.

“Well, we experimented with a lot of different gameplay mechanics to find out what would work in VR, what would be compelling, what had potential, and so on,” said Freeman. “The climbing movement through a VR space felt very natural to everyone who tried it and as we iterated on the concept, it became clear that it deserved its own game. Free solo rock climbing is one of the most extreme, exciting, and dangerous sports on the planet. VR allows players to become present in the game world, and free solo rock climbing is the perfect kind of experience to make use of that.”

Crytek started researching rock climbing, and even had expert rock climbers play The Climb for feedback.

“You only need to take a quick look at footage of some of the world’s best free solo rock climbers to see how spectacular the sport is, as they perform death defying moves in amazingly beautiful natural environments,” said Freeman. “We’ve had both amateur and professional climbers play the game, and they’ve told us it captures an essence of the sport and a sense of realism. It’d be cool if non-climbers played the game, and were inspired to try out real life climbing after they get a taste of the sport in The Climb.”

A Love For Dinosaurs

During The Climb’s development, Crytek was coming up with an idea for a more full-fledged triple-A VR title. The studio fell in love with a particular setting — a time in history when dinosaurs roamed the earth and humans were still relatively primitive. Crytek wanted to create a game that allowed players to visit a world they wouldn’t be able to actually experience.

“When we knew that we wanted to do a VR project, we spent time thinking about some of the things we would love to experience in real life, but simply can’t,” said Freeman. “Around that time, we were also working on our ‘Back to Dinosaur Island’ VR demos, and realized how magical it was to encounter prehistoric creatures in VR. We decided to explore this setting further and mix it up with something else we love – space and science fiction.”

Robinson doesn’t feature combat, as Crytek wanted to focus more on exploration and making players feel like interactive tourists. Finding and scanning exotic creatures is the crux of what you’ll be doing in Robinson. As with Far Cry and Crysis, Crytek wants to deliver a stunning title that can be a great primer for even more ambitious VR experiences.

“Almost everything in the game’s universe has a background, and the lore extends far beyond what players will be able to see first-hand,” said Freeman. “We feel like this approach really adds to the believability of the world, and we want players to become more immersed in the story as they explore every inch of their environment and gain an understanding of their place in this fictional universe.”

 

The Climb is available on the Oculus Rift using a gamepad with Touch support arriving later this year, and Robinson: The Journey is now out on PlayStation VR. Have you played either of these titles yet? If so, what did you think?

Alex Gilyadov is a freelance writer with work appearing in multiple publications, such as GameSpot, VICE, Playboy, Polygon, and more. You can follow him on Twitter: @rparampampam


Community Download: Does The HTC Vive Have A Content Problem?


Oh boy, easy there fellah. You came in here pretty fired up from that headline, huh? Well here, take a seat and I’ll put that pitchfork in the closet. Don’t worry, it will still be there if you need it, but for now just hear me out.

The HTC Vive is a miracle of modern engineering. It’s undoubtedly at the absolute pinnacle of performance for modern virtual reality headsets. The entire industry owes it a debt of gratitude for what it’s pioneered with room-scale and hand-tracked controllers. However, for all of its technical prowess, it may have an Achilles heel: content.

Our question for you dear reader today is this: does the Vive have a content problem? 

There is a ton of content available through Steam VR for the Vive. However, a good majority of that content is unfinished, early access, and lacking the level of polish that its competitors at Oculus and Sony are providing for their respective platforms.

It’s not that there are no high-quality, fully developed titles for the Vive; it’s just that for every The Gallery: Call of The Starseed or Raw Data there are tech demos that should never have been released, certainly not with a price tag attached. The signal-to-noise ratio is at best breaking even on the Vive, and that type of distribution philosophy may not pack enough of a punch to motivate the purchase of a $699 headset.

One final thing to consider before we let you loose is the promise of future content. Sony and Oculus both either already have, or are actively building, reliable pipelines for producing high-quality VR content. Sony has its tried and true ecosystem of first-, second-, and third-party studios, and Oculus has already provided $250 million of support money to burgeoning gaming companies through its parent company, Facebook. HTC is starting to ramp up a similar system, but its partner company, Valve, has yet to enter the arena as vehemently.

So now it’s your turn. Let us know in the comments if you think the Vive’s current content lineup and its future prospects are completely fine, or a cause for concern. 

Phab Tango: depth-sensing released to the public soon

We've seen the demonstrators and the tablet-sized mock-ups and dev kits of Google's Project Tango. But now it is finally about to hit the market! The Lenovo Phab 2 Pro is the first consumer device to integrate all the Tango tech and sensors, bringing a Hololens-like experience to our pockets. But is it worth spending the money? Will it define new must-have features for the future?

Project Tango for starters

If you missed it, Project Tango started off at Google's ATAP (Advanced Technology and Projects) team to let smart devices know where they are independently of outside signals. Short version: they use different built-in sensors and computer vision algorithms to learn about the environment and position the device. With this reconstruction of the world you can also do great augmented reality, occluding virtually inserted objects and keeping them glued to the ground pretty well.

The team kicked off their work in 2014 (releasing the first "peanut" phone device for developers, as I wrote back then). The work was led by Johnny Lee, previously well known for his work on Microsoft's Kinect and his Wii hacks (e.g. tracking your head for more immersive renderings that take your perspective into account). After the peanut phone came the "yellowstone" tablet for developers, and the research scene kicked off many fun projects with it – but a consumer device was yet to be released.

Lenovo making the move

Now, Lenovo has stepped up and brings the first consumer device supporting Project Tango to the market. It integrates the depth sensor, a fisheye camera, and a high-res RGB camera on the back to make it happen. This comes at the price of a 6.4-inch size, but also with a nice resolution of 2560 x 1440 pixels on the IPS panel. Pre-orders are now open for the US market (darn, when is Europe getting it?) and shipping is supposed to start in December at $499. Time for developers to build some X-Mas games now!

What to do with it?

First off, the technology of Tango has always been a dream for researchers and developers in AR. It combines all the current technology with the 3D reconstruction and depth-sensing functionality we have all been waiting for in the AR field – if it turns out to work as well in everyday usage as hoped. It will surely continue to be a developer and researcher platform to try out new ideas and mock-up the shit out of it! AR with full knowledge of your space, real-time occlusions, and positional tracking is just the way it was meant to be played.

The consumer ideas and current Play store apps are a mixed bag, ranging from simple measuring tools that let you check furniture in your room before you buy it, to AR games and other augmented shopping experiences. A nice demo is the Woorld game, where you populate your living room with houses, objects, and critters – a fun, noisy, colorful version of Minecraft combined with some Populous or any other random mini-game like fishing. With the integrated physics it is probably good tech-demo fun for a while:

If you like it a bit more scary, Ghostly Mansion could be another gaming option to try. Aaron from Rabbx Inc. gave a post-mortem talk on their development for Project Tango at this year's Vision VR/AR Summit.

Worth the trouble?

Right now, people report long loading times in AR apps, and obviously the number of apps is limited today. Having it as a primary phone won't work for many people (due to its huge size): it is too small for a tablet but too big for your pocket. It will probably remain the developer's phablet for a while, or a rich kid's toy. But if the phone works well and benchmarks on a par with phones like the iPhone or the Galaxy S series, it could become a good new standard for roughly 500 bucks. I hope I can write an official hands-on review to really nail down the answer.

I can certainly see how the additional sensors could become a commodity, included in the basic configuration of any future (Google) smartphone. Then Pokemon and others would really work well with your phone! I'm pretty positive that this integration will happen. Google even just moved their Tango people over to the Daydream (VR) team. Does that mean they will combine forces to finally bring inside-out positional tracking through the Tango sensors to their Daydream VR devices in the future? Johnny Lee does not really deny it; he only talks about the current problem of overheating.

We can imagine how many new games and applications could be built on top of it once AR finally becomes a standard… in our pockets… or on our noses? Is it maybe just another 2-3 year period of trials with advanced pocket devices like the Tango until HMDs like Hololens, Meta, or MagicLeap are ready and we throw away our smartphones? Is it just a trial device on our way to hands-free, head-worn AR? In any case, shut up and take my money! ;-)