The Future of Play: How Intel And The ESL Are Helping Bring Virtual Reality To ESports

While traditional eSports aren’t going away, many companies are building the foundation for a virtual reality subset. The current eSports numbers are staggering. According to Newzoo, of the 1.3 billion gamers worldwide, 256 million are eSports fans today, a number set to grow to 385 million by the end of 2017. ESports generated over $493 million in revenue last year, a figure expected to jump to $696 million this year.

When you look at the gaming landscape, the only thing as “hot” as eSports is virtual reality. Newzoo forecasts that global virtual reality and augmented reality will generate $569 billion by 2025. And gamers will be a big part of that revenue, with projections that they’ll spend $100 billion on VR hardware by 2018.

Record-Breaking Numbers

So it should come as no surprise that the world’s largest eSports company, The Electronic Sports League (ESL), and one of the giants in both the tech and gaming markets, Intel, are already laying the groundwork for virtual reality eSports. The eleventh season of the Intel Extreme Masters (IEM) eSports tournament just concluded in Katowice, Poland over two sold-out weekends this month. Over 173,000 people attended and another 40 million people tuned into the livestreams across Twitch, Twitter, and a dozen television networks globally. IEM Season 12 will kick off its year-long tournament tour schedule on May 6 in Sydney, Australia.

ESL and Intel partnered with Sliver.tv to broadcast IEM Katowice in 360-degree video as part of a 2017 contract that will include seven global eSports events across ESL One and IEM. At Katowice, this included a first-person virtual eSports stadium experience: an immersive 360-degree VR space with live stats, replays, and scores in real time. The VR stream saw 200% growth in peak concurrent viewers compared to IEM’s first virtual reality live stream in Oakland, with 340,000 total unique viewers tuning into the VR broadcast.

The tech companies first experimented with 360-degree broadcasting across Counter-Strike: Global Offensive, League of Legends, and Dota 2 at ESL One New York and IEM Oakland last fall. More than 130,000 unique viewers tuned into the IEM Oakland VR streams. Sliver.tv has a separate deal with DreamHack to bring seven of its global 2017 eSports events to fans in 360. That deal kicked off with DreamHack Vegas last month, which means eSports fans will be able to virtually attend 14 events this year using any VR headset.

Frank Soqui, general manager of the enthusiast desktop group at Intel, told UploadVR that Intel has invested in companies like Voke and Replay because fans of all sports want to look around and enjoy a 360-degree experience.

Redefining How Viewers Enjoy eSports

“We want to bring the audience into the immersive experience from a VR perspective through apps like Sliver.tv,” Soqui said. “Just because existing eSports games like League of Legends and CS:GO aren’t native VR games doesn’t mean we can’t use Sliver.tv to get people inside. We believe eSports will quickly evolve from watching competition from a flat screen perspective and will include virtual reality. I don’t know how many games will start taking existing designs and move to VR, but a lot more games will show up inherently designed with VR in mind.”

Intel actually showcased several of these potential VR eSports titles in Katowice, including Hyper Arena, a game being developed in Warsaw, Poland by HyperVR. While the developer had a 1 vs. 1 version of the TRON-inspired, disc-based HTC Vive game playable at the Intel Showcase at the Katowice International Conference Center, Lukasz Kur, founder and general director at HyperVR, said the studio is creating multiple additional levels that add new locales as well as a variety of weapons to the mix. The ultimate plan, according to Kur, is to release the game in 2018 with 2 vs. 2 gameplay. Beyond that, Kur would like to expand into a 5 vs. 5 experience, which he believes will be perfect for team-based eSports. The game is being designed to allow spectators to sit and watch the physical competition inside the virtual arena.

“Just imagine a full stadium of people who are watching gladiators rumble inside a virtual arena, when not only reflex and concentration matters but also physical muscle strength, agility, balance and creativity to finish off your opponent with style,” Kur said. “Hyper Arena VR is a perfect balance between sport and eSport.”

Intel is also showcasing a number of other eSports titles that could find a place in the tournament this year or beyond. Insomniac’s Oculus Touch spellcasting game, The Unspoken, made its second straight tour stop in Katowice (following its debut at IEM Oakland), alongside Ready at Dawn’s Lone Echo and Croteam’s Serious Sam. These games were featured in eSports tournaments open to the public in Katowice – complete with prizes.

“These games were developed with VR in mind and we’re starting to see the eSports angle emerge,” Soqui said. “Developers are starting to think about what kinds of VR games they should be creating that incorporate eSports fans into the experience. With Sliver.tv we at least have the audience inside of the game, but now we’re starting to see developers create games specifically for VR eSports.”

Intel also hosted VR games throughout the two-weekend event, including a VR Festival Day on March 5. Vertigo Games’ zombie shooter Arizona Sunshine, Ubisoft’s Star Trek: Bridge Crew, and Survios’ cooperative shooter Raw Data were among the titles playable for visiting eSports fans who attended from across Europe.

According to Ralf Reichert, CEO of ESL, two things will happen in the coming years as eSports evolves. One is that almost every game will have a competitive online aspect to it. The other is that the diversity of games will grow.

“There’s a very small number of top games that people play today, but that will grow to include more games,” Reichert said. “And more professional eSports teams will be playing different types of games. Some of those games will be in VR, where you play standing with a controller and other input mechanisms that we haven’t even invented yet. The viewing experience could change as spectators wear VR headsets. It’s going to be fascinating to see how this all develops over the next 20 years. Like everything in gaming, it changes quicker than anything else does.”

Getting Active

Soqui said that to succeed in eSports, VR games have to be truly compelling from a viewing perspective, like the giant tournaments that CS:GO, League of Legends, and Dota 2 attract.

“I expect to see a lot of experiments and small local eSports things spring up,” Soqui said. “How fast it gets to that depends a lot on the fan base and how immersive it is. But you can see developers already interested. The great thing about VR is that it can bring new players into the market, and introduce a new audience to eSports.”

Lee Machen, director of developer relations at Intel, said one role IEM will play moving forward is ensuring that everyone has a chance to experience VR around the world.

“People who try VR are usually blown away by the experience,” Machen said. “There are a few things that have limited the growth of VR to date and one of them is how to get more people to have that first ‘Oh my God’ VR experience.”

Intel showed off its Project Alloy wireless VR head-mounted display, which debuted at CES 2017, to eSports fans. Soqui believes Intel’s WiGig technology, which debuted at Mobile World Congress, will also find a place at IEM moving forward, and that tetherless VR tech could free up more competitive eSports play inside virtual reality in the near future.

As ESL and Intel map out the global stops for the 2017-18 tour schedule, VR will be a mainstay for eSports fans to play games, watch eSports livestreams from the arenas, and potentially witness the future of eSports competition – at least on the smaller tournament stages for now.

7 Things We Can’t Wait To Stick The Vive Tracker On

I sort of see the new Vive Trackers as the Post-it notes of VR, in that I want to stick them to absolutely everything. I often walk down the street looking at things you could virtualize by clamping HTC’s new peripheral to them. In fact, I probably scare people when they see me contemplating a trash can and talking to myself about what it could do for VR.

Okay, that doesn’t really happen, but the Vive Tracker does open up a new world of possibilities for SteamVR. The kit is essentially a puck of plastic whose position your base stations track just like they do your headset and controllers. It’s already shipping out to developers, and we’ve seen a handful of ideas from the community.
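
For the technically curious, here’s a minimal sketch of how software could spot a Tracker once it’s paired: SteamVR simply exposes it as another tracked device with its own device class. This sketch assumes the community pyopenvr bindings and a running SteamVR session.

```python
# Minimal sketch (assumes the pyopenvr bindings and SteamVR running):
# base stations track a Vive Tracker like any other device, and it
# shows up under its own device class.
import openvr

openvr.init(openvr.VRApplication_Other)
system = openvr.VRSystem()

for i in range(openvr.k_unMaxTrackedDeviceCount):
    # Trackers report a dedicated class, distinct from the HMD and wands
    if system.getTrackedDeviceClass(i) == openvr.TrackedDeviceClass_GenericTracker:
        print(f"Vive Tracker found at device index {i}")

openvr.shutdown()
```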

But we’ve had a few ideas of our own. Here’s what we’d like to stick the Vive Tracker to when it starts to roll out later this year.

Instruments

When you think about it, Oculus and Harmonix thought up the basic concept of the Vive Tracker a long time ago, when they stuck a Touch controller to a plastic guitar and gave us Rock Band VR. We’ll find out how that worked out for the pair in a few weeks, but the Tracker could actually bring our real instruments into VR so that we could play them in front of scores of adoring virtual fans. There’s a lot of potential for a Rocksmith-style VR tutoring game here.

Cups

We all know playing VR games can be physically demanding — my shoulders still hurt from my latest round of Paulo’s Wing — so it’s a pretty good idea to stay hydrated throughout. Removing your headset and finding your icy beverage is annoying; why not just have it in the virtual world too? Note that this won’t solve VR’s awkward angles issue where most cups won’t actually be able to tip into your mouth without hitting the bottom of your headset. Forget positional tracking; tell John Carmack this is where the real innovation is needed.

Phones And Tablets

I’ve already seen a pretty cool multiplayer VR game that stuck a Vive Tracker to a phone, but I’m talking about bringing those devices into the virtual world with the user. Smart devices could be incredibly versatile input mechanisms for specific VR experiences. Imagine what one could do for existing experiences like Tilt Brush, allowing you to fine-tune your creations with greater depth. Plus it would be great to quickly flick through your phone in VR.

Toy Lightsabers

Since 1977 humans have dreamed of what it would feel like to hold a real lightsaber and do glorious battle with evil forces. We came pretty darn close in 2016 with Star Wars: Trials on Tatooine but we could come even closer with the Vive Tracker stuck to a plastic sword. Imagine two of these things in the hands of two friends having what felt like a real lightsaber battle with tactile feedback. Okay, better make them out of foam instead. No headshots, either.

Friends

We’ve already used the Vive Trackers to bring our own bodies into VR experiences, but what if we put them on our friends? What types of experiences might be possible? Imagine the kinds of co-op games you could make in which friends could work with you and have a real presence in the virtual world. Again, you’re probably dealing with a lot of health and safety concerns, but this is definitely something that games like Keep Talking And Nobody Explodes should consider for future updates.

Pets

Some readers with a friendly cat or dog will probably be familiar with the stomach-churning guilt that comes with accidentally kicking them while walking around in VR. It’s all too easy to forget that a four-legged creature might be trotting up to say hello while you’re jumping around in your living room. I would happily pay $100 not to have that experience again and attach a VR life jacket to my pets. This might also be good for babies.

Oculus Rifts

The Tracker might not be the perfect answer but it’s certainly one possible way of achieving a larger room-scale setup for your Rift. As Valve’s tracking hardware continues to evolve it would be great to give Rift owners the option to buy some base stations and a few trackers to get larger room-scale tracking on their headsets.

Oculus On Previous Tracking Issues: ‘We Made A Mistake’

Speaking with Oculus’ VP of content, Jason Rubin, at GDC 2017, UploadVR asked him about the tracking issues reported by some owners of the Rift VR headset and Touch VR controllers. Rubin offered a pretty clear statement in response: “We made a mistake.”

“We are part of Facebook. Facebook’s slogan for a long time was ‘move fast and break stuff,'” Rubin said. “We believe we are in the move fast and break stuff mode when it comes to VR. We are pushing as fast as we can to get as much functionality into our systems as we can so that when open standards come, and we start to slow down because we have to support a lot more things, we have the foundation that we need.”

The Oculus tracking issues were most prevalent in users attempting to run “experimental room-scale” setups. These require several Oculus sensors in order to work. Affected users were finding serious glitches in their setups, like virtual floors randomly rising and falling or digital hands simply floating away.

Rubin on stage at DICE 2017

In an attempt to solve these problems, Oculus released update 1.11 — a software patch issued in part to address tracking concerns. However, 1.11 had the opposite effect for some — tracking got worse. Oculus has since released version 1.12, another patch that has fixed the vast majority of tracking problems. Rubin says that the company has learned and grown throughout this process.

“We’ve learned a valuable lesson: don’t move too fast, don’t break too much stuff,” Rubin said. “But [the tracking was] fixable. That’s not permanent, that was a one-time kind of failure.”

1.12 has been out in the wild now for around a week. In that time, user sentiment on major online sounding boards like Reddit seems to have turned from disappointment to adulation, as those running larger setups see their functionality returned to normal.

Let us know in the comments below if your tracking issues have been improved since 1.12.

Farewell, holograms! Looking through the window…

The force is strong with this one!

… we thought. Well, Jedis wouldn’t need holograms or AR… they just see through the Force. But puny little humans on Earth like us need devices for holograms. R2-D2 had some really cool holographic technology that used the air particles themselves as the canvas. Well, while there are some smoke or vaporized-water screens out there, that’s not the real deal. As of today we cannot project holograms in a useful consumer way without some kind of medium. Holograms are a photographic recording of a light field that works in 3D without the need for glasses.

Damn, here and now, reality (real reality) hits us: currently all mixed reality tech is viewed through some piece of gadgetry (DLP, pocket device, glasses…) and we were all crestfallen when we first saw the “holographic lens” by Microsoft. It would have been just too good to be true… but, as we all figured, we cannot avoid the detour through glasses. Time to get down to earth and live with it – and make the best of it (… and to be happy with it! I certainly still am).

Microsoft going for a Windows view

So, Microsoft seems to have realized this as well – maybe their marketing and PR team took the time to look into Wikipedia. Just now an unnamed Microsoft spokesperson gave Tom’s Hardware a quote:

“Microsoft renamed “Windows Holographic” to “Windows Mixed Reality” to be more encompassing of the company’s broader vision for the platform. We’re unifying the mixed reality ecosystem around a platform that enables shared experiences and interoperability between headsets. By opening up the Windows Mixed Reality platform to the industry at large, we anticipate the growth of holographic apps will make for stronger experiences and better devices for everyone.”

Time to find-and-replace all the “holographics” across their websites and Hololens descriptions! At least they picked the suitable term Mixed Reality instead (as discussed multiple times here before). Their range clearly aims wider now – reaching from AR to VR and anything that might fit into Milgram’s mixed reality continuum.

Just In Time Compiled

So, just in time for the release of the Creators Update of Windows 10, Microsoft bothered to rebrand. Since there is more news on Windows Holographic Mixed Reality, I thought it’s worth a shout-out here to sum up what’s going on with the big player.

Recently, Microsoft decided to skip Dev Kit #2 for the Hololens; version 3 isn’t expected before 2019. Why did they opt for that? Honestly, I can totally live with it and am happy – as long as they put all their resources into the next release and spare us another interim solution with a slightly better head strap or 10% better battery. Let’s go all in on a bigger update in 2019! Until then there’s enough time to establish an MR operating system and a multi-device development pipeline.

But you get the impression that it was all a bit overhyped and that the roll-out was just not realistically anticipated by most roadmap creators or marketers. Now we get the Acer MR glasses (VR glasses) that support Microsoft’s holographic Mixed Reality… but then again only for developers, with very limited access at the start. They even dared to tease us with an empty plastic set of glasses last year… Is it really necessary to show things so early? Then again, maybe only the AR-bubble pros saw it anyway and the consumer landscape didn’t take note. The Rift DK1, Crystal Cove, etc. also took a while to get out there.

Anyway, getting the operating system up and running for it now still seems a valid and smart move. The new Insider build of Windows 10, including Mixed Reality features for development, has been out in the wild for a week now and is worth a look. Sean Ong was quicker than me and recorded a quick hands-on review of what you can expect from it. In the end it’s a virtual real, eh, world you can have as an emulator for now. But hey, worth a visit if you want to get a lead in MR software development for Microsoft!

Your real home is (obviously) represented virtually as a 3D environment you can move and teleport within. Then you can add virtual virtual objects (I must not call them holographic images anymore!) to see how the augmented objects would look. You can see your Hololens menu, use Cortana and interact “naturally” with this playground sandbox for development.

It’s worth a try, and worth going in this direction. But some questions need to be answered. How well can we actually use this for development? What if I need to use my hand to interact (like I do today with the Hololens)? How do I simulate that? Do I strap a Leap Motion to my Vive to simulate the Hololens air gestures while teleporting through a virtual representation of my real home? Good fun, it’s getting ridiculous! Hopefully not recursive! ;-)

How accurately will tracking work in the virtual representation of my real world in there? I assume it will work perfectly and the scanned virtual mesh of the real-world representation will be accurate to the millimeter (still with me?)! But hey, within the emulator this could lead the way in showing how things could be with DK2 in 2019, right, Microsoft?

Yeah, and what does Candy Crush look like on the Hololens? I need to try that! :-)

In any case, I do still agree with Alex Kipman (who recently gave an interview to Business Insider Australia addressing this issue) that we will throw away all our screens and smartphones at some point in time and live without screens (well, we’ll carry the screen around with us on our nose). The potential is just so huge, but so many things still need to be answered. Not only battery life and how to shrink all the tech, but also full indoor-scale tracking working seamlessly with full city-scale outdoor tracking. Data transmission via 5G or 6G needs to work, and don’t get me started about regulations on safety… how long will it take to get legal approval to wear “secure” MR smartglasses while driving? Well, I’m getting carried away. Still enough research to blog about for years to come, still enough SDKs and DKs to try from all the different companies and visionaries in the near and far future! Quite awesome!

… unless the computers go wild on their own first and artificial intelligence destroys all us puny humans. Speaking of which, you can now also download a Hololens Terminator T-800 view, including some nice object recognition!

Hasta la vista, chach@s!

The 9 Biggest VR Stories Of GDC 2017

Whew, what a week. Can we sleep now? With each passing year the Game Developers Conference (GDC) becomes more and more important to the VR industry, and 2017’s iteration was no different. There have been a frankly ridiculous number of announcements over the past six days and we wouldn’t blame you if you’d missed a few of them.

So we’ve gathered what we’d consider to be the nine biggest stories of the show, representing the need-to-know information. If you’ve been living under a rock (or in a VR headset) for the past week then look no further! You can start by listening in on our hour-long post-GDC live videocast analysis from yesterday right here:

After that, keep scrolling for the nine most essential headlines of the week.

1080 Ti Supercharges VR

VR requires meaty graphics processing power to run well, and Nvidia continues to push the boundaries with its latest GeForce GTX GPU release, the 1080 Ti. This is said to be around 35% faster than last year’s 1080, in many ways coming close to or surpassing the company’s top-of-the-range Titan X Pascal GPU too. It’s shipping this month for $699 for all VR enthusiasts who want to push their VR experiences that bit further.

Rift’s 2017 Line-Up

Last year at GDC Oculus revealed much of its line-up for the imminent launch of the Rift. One year on and it’s ready to debut its bigger, better 2017 line-up, fueled by the recently-released Touch controllers. We’d seen games like Arktika.1 before, but new titles like From Other Suns and The Mage’s Tale are truly looking like the next generation of VR content.

Vive’s Peripheral Prices Are $99.99

HTC introduced its new add-on Trackers and an integrated audio strap for the Vive at CES in January, and it kicked off GDC with the announcement that both will be available for $99.99 each in the coming months. The Tracker will roll out to developers first and we saw plenty of great examples of what they’ll do with it at the show. This is something to be excited about.

Khronos’ OpenXR Sets VR/AR Standards

If we’re not careful, fragmentation could become a major issue for the VR industry, with so many devices already out there that differ drastically in features, power, and more. Khronos wants to combat that with OpenXR, a standard framework for VR software that will help developers build across multiple headsets and input devices. This could be crucial for social VR and bringing apps to as many platforms as possible.

New Gear VR, New Controller

With Google Daydream launching with motion controls last year, Samsung and Oculus’ Gear VR needs to play a bit of catch up. Fortunately the pair are doing just that; a new Gear VR was announced at Samsung’s MWC press conference last weekend, and will likely be fully unveiled when the company reveals its new S8 smartphone later this year. It’s got a controller very similar to a Daydream remote, which we can’t wait to get hands-on with.

Hands-On With the Microsoft Windows Holographic VR Headset

One of the biggest question marks heading into GDC was Microsoft. We knew it was going to announce at the show when the dev kits for its Windows Holographic VR headsets would start shipping; we just didn’t know what that kit would look like. Well, we got hands-on with it and it has potential. With consumer devices set to launch this year and Project Scorpio on the horizon, there’s a lot more to learn yet.

LG Looks Good With SteamVR Headset

We always knew there would be other SteamVR headsets beyond the HTC Vive, we just didn’t know what they’d be and when we’d see them. Turns out LG is the next company to partner up with Valve and it was at GDC with an early prototype of its headset, which doesn’t have a release date yet. We went hands-on with the device and were quite fond of it. We look forward to more details as the year progresses.

Robo Recall Ready

Epic Games’ Rift-exclusive Oculus Touch showcase, Robo Recall, was always pegged for an early 2017 release, but people were starting to get a little anxious it might not make that window. Well Epic had a perfect surprise for everyone on Wednesday when it actually released the shooter for free. If you’ve got Touch then be sure to go and grab it, it’s not one you should miss.

VR’s First Major Price Cut

Price cuts are a major part of driving adoption of any product, and a tactic we see used often in the console business. It looks like VR will be no different; Oculus this week announced that its Rift was dropping from $599 to $499, and Touch was dropping from $199 to $99. That’s both Rift and Touch for slightly less than the original price of just the headset itself. Game on.

NVIDIA Announces ‘FCAT VR’ Benchmark Tool to Help Demystify VR Performance

The latest version of NVIDIA’s FCAT VR analysis tool is here and it’s equipped with a wealth of impressive features designed to demystify virtual reality performance on the PC.

NVIDIA has announced a VR-specific version of its FCAT (Frame Capture Analysis Tool) at GDC this week, which aims to provide easy access to virtual reality rendering metrics and help enthusiasts and developers demystify VR performance.

Back in the old days of PC gaming, the hardware enthusiast’s world was a simple place ruled by the highest numbers. Benchmarks like 3DMark spat out scores for purchasers of the latest and greatest GPU to wear like a badge of honour. The highest frame rate was the primary measure of gaming performance back then, and most benchmark scores were derived from how quickly a graphics card could chuck out pixels from the framebuffer. However, as anyone who has been into PC gaming for any length of time will tell you, this rarely gives you a complete picture of how a game will actually feel when being played. It was and is perfectly possible to have a beast of a gaming rig that performs admirably in benchmarks, but delivers a substandard user experience when actually playing games.

Over time however, phrases like ‘frame pacing’ and ‘micro stutter’ began creeping into the performance community’s conversations. Enthusiasts started to admit that the consistency of a rendered experience delivered by a set of hardware trumped everything else. The shift in thinking was accompanied (if not driven) by the appearance of new tools and benchmarks which dug a little deeper into the PC performance picture to shed light on how well hardware could deliver that good, consistent experience.

One of those tools was FCAT – short for Frame Capture Analysis Tool. Appearing on the scene in 2013, FCAT aimed to grab snapshots of what the user actually saw on their monitor, measuring frame latency and stuttering caused by dropped frames – outputting that final imagery to captured video with an accompanying stream of rendering metadata right alongside it.

Now, NVIDIA is unveiling what it claims is the product of a further few years of development capturing the underbelly of PC rendering performance. FCAT VR has been officially announced, and brings with it a suite of tools which increases its relevance to a PC gaming landscape now faced with the latest rendering challenge: VR.

What is FCAT VR?

The FCAT VR Capture Tool GUI

At its heart, FCAT VR is a frametime analysis tool which hooks into the rendering pipeline, grabbing performance metrics at a low level. FCAT gathers information on total frametime (the time taken by an app to render a frame), dropped frames (where a frame is rendered too slowly) and performance data on how the VR headset’s native reprojection techniques are operating (see below for a short intro on reprojection).

The original FCAT package was a collection of binaries and scripts which provided the tools to capture data from a session and convert that data into meaningful analysis. With FCAT VR, however, Nvidia have aimed for accessibility, and so the new package is fully wrapped in a GUI. FCAT VR comprises three components: the VR Capture tool, which hooks into the render pipeline and grabs performance metrics, and the VR Analyser tool, which takes data from the Capture tool and parses it into human-readable graphs and metrics. The final element is the VR Overlay, which aims to give a user inside VR a visual reference on application performance from within the headset.

When the FCAT VR Capture tool is fired up, prior to launching a VR game or application, its hooks stand ready to grab performance information. Once FCAT VR is open, benchmarking is activated using a configured hotkey, and the tool then sets to work dumping a stream of metrics as raw data on disk. Once the session is finished, you can then use the supplied scripts (or write your own) to extract human-readable data and output charts, graphs or anything your stat-loving heart desires. As it’s scripted, it’s highly customisable for both capture and extraction.
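
To give a feel for that extraction step, here’s a hypothetical sketch in Python. FCAT VR’s on-disk format hasn’t been published, so this assumes a simple CSV with a frametime_ms column; the 11.1 ms budget simply falls out of the 90 Hz refresh rate (1000 / 90).

```python
# Hypothetical post-processing sketch; the CSV layout and column name
# are assumptions, not FCAT VR's documented output format.
import csv

frametimes_ms = []
with open("fcat_vr_session.csv", newline="") as f:  # hypothetical filename
    for row in csv.DictReader(f):
        frametimes_ms.append(float(row["frametime_ms"]))  # assumed column

budget_ms = 1000.0 / 90.0  # ~11.1 ms per frame at the 90 Hz VSync cap
average = sum(frametimes_ms) / len(frametimes_ms)
over_budget = sum(1 for t in frametimes_ms if t > budget_ms)

print(f"avg {average:.2f} ms, worst {max(frametimes_ms):.2f} ms, "
      f"{over_budget}/{len(frametimes_ms)} frames over {budget_ms:.1f} ms")
```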

So What Does FCAT VR Bring to VR Benchmarking?

In short, a whole bunch – at least in theory. As you probably know, rendering for virtual reality is a challenging prospect, and the main vendors of today’s consumer headsets have had to adopt various special rendering techniques to allow the common-or-garden gaming PC to deliver the sort of low-latency, high-framerate (90 FPS) performance required. These systems are designed as backstops for when performance dips below the desired minimum, something which deviates from the ‘perfect world’ scenario for rendering a VR application. The below diagram illustrates a simplified VR rendering pipeline (broadly analogous to all PC VR systems).

A Simplified VR Rendering Pipeline (Perfect World)

However, given the complexity of the average gaming PC, even the most powerful rigs are prone to performance dips. This may result in the VR application being unable to meet the perfect world scenario above where 90 FPS is delivered without fail every second to the VR headset. Performance dips result in dropped frames, which can in turn result in uncomfortable stuttering when in VR.

VR Application Dropped Frames

Chief among these techniques are the likes of Asynchronous Time Warp (and now Space Warp) and Reprojection. These techniques ensure that what the user sees in their VR headset, be that an Oculus Rift or an HTC Vive, matches that user’s movements in VR as closely as possible. Data sampled at the last possible moment is used to morph frames to match the latest movement data from the headset, filling in the gaps left by inconsistent or under-performing systems or applications by ‘warping’ (producing synthetic) frames to match. Even then, these techniques can only do so much. Below is an illustration of a ‘Warp Miss’, when neither the application nor the runtime could provide an up-to-date frame to the VR headset.

It’s a safety net, but one which has been incredibly important in reducing the sensations of nausea caused by the visual disconnect experienced when frames are dropped and the image stutters and jerks. Oculus in particular are now so confident in their arsenal of reprojection techniques that they lowered their minimum PC specifications upon the launch of their proprietary Asynchronous Spacewarp technique. None of these techniques should be (and indeed none are designed to be) a silver bullet for poor hardware performance. When all’s said and done though, there’s no substitute for a solid frame rate which matches the VR headset’s display.
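
As a purely conceptual sketch (the real Oculus and SteamVR compositors are native code and considerably more involved), the per-refresh safety-net decision described above boils down to something like this:

```python
# Conceptual sketch only; `warp` stands in for the pose-based image
# morph and none of this is real compositor code.
def warp(previous_frame, latest_head_pose):
    # Morph the old image to match the newest head pose (a synthetic frame)
    return previous_frame

def compositor_tick(new_frame, previous_frame, latest_head_pose):
    if new_frame is not None:
        return new_frame  # ideal case: the app made the 90 Hz deadline
    if previous_frame is not None:
        # Reprojection: re-show the last frame, warped to the latest pose
        return warp(previous_frame, latest_head_pose)
    return None  # warp miss: nothing to show, the user sees a stutter
```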

Either way, these techniques are implemented at a low level and are largely transparent to any application sat at the head of the rendering chain. Therefore, metrics gathered from the driver which measure when performance is dipping and when these optimisations are employed are vital to understanding how well a system is running. This is where FCAT VR comes in. Nvidia summarises the new tool’s capabilities below (although there is a lot more under the hood than we can go into here); a rough sketch of how these four metrics fit together follows the list:

Frame Time — Since FCAT VR provides detailed timing, it’s possible to measure the time it takes to render each frame. The lower the frame time, the more likely it is that the app will maintain a frame rate of 90 frames per second needed for a quality VR experience. Measurement of frame time also allows an understanding of the PC’s performance headroom above the 90 fps VSync cap employed by VR headsets.

Dropped Frames — Whenever the frame rendered by the VR game arrives too late for the headset to display, a frame drop occurs. It causes the game to stutter and increases the perceived latency which can result in discomfort.

Warp Misses — A warp miss occurs whenever the runtime fails to produce a new frame (or a re-projected frame) in the current refresh interval. The user experiences this miss as a significant stutter.

Synthesized Frames — Asynchronous Spacewarp (ASW) is a process that applies animation detection from previously rendered frames to synthesize a new, predicted frame. If FCAT VR detects a lot of ASW frames, we know a system is struggling to keep up with the demands of the game. A synthesized frame is better than a dropped frame, but isn’t as good as a rendered frame.
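
Pulling those four definitions together, here’s one way an analysis pass might bucket each refresh interval. The record fields are invented for illustration and may not match FCAT VR’s actual output:

```python
# Illustrative classification of refresh intervals per the four metrics
# above; the input fields are assumptions, not FCAT VR's real schema.
from collections import Counter

def classify(interval):
    if interval["app_frame_ready"]:
        return "rendered"     # the app hit the 90 Hz deadline
    if interval["asw_frame"]:
        return "synthesized"  # ASW predicted a frame from earlier ones
    if interval["reprojected"]:
        return "dropped"      # an old frame was re-shown: a dropped frame
    return "warp_miss"        # no frame at all: a significant stutter

intervals = [
    {"app_frame_ready": True,  "asw_frame": False, "reprojected": False},
    {"app_frame_ready": False, "asw_frame": True,  "reprojected": False},
    {"app_frame_ready": False, "asw_frame": False, "reprojected": True},
]
print(Counter(classify(iv) for iv in intervals))
# Counter({'rendered': 1, 'synthesized': 1, 'dropped': 1})
```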

What Does This All Mean?

In short, and for the first time, enthusiasts will have the ability not only to gauge the high-level performance of their VR system, but crucially to dive down into metrics specific to each technology. We can now analyse how active and how effective each platform’s reprojection techniques are across different applications and hardware configurations. For example, how effective is Oculus’ proprietary Asynchronous Time Warp when compared with OpenVR’s asynchronous reprojection? The tool can also provide enthusiasts with vital information to pinpoint where issues may lie, or give a developer key pointers on where their application could use some performance nips and tucks.

All that said, we’re still playing with the latest FCAT VR package to fully gauge the scope of information it provides and how successfully it’s presented (and indeed how useful the information is). Nevertheless, there’s no doubt that FCAT’s latest incarnation delivers the most comprehensive suite of tools to measure VR performance we’ve yet seen, and goes a long way to finally demystifying what is going on deeper in the rendering pipeline. We look forward to digging in a little deeper with FCAT VR and we’ll report back around the tool’s planned release in mid-March.

Hands-On: From Other Suns is a Procedurally Generated Multiplayer Starship Simulator

From Other Suns, a brand new VR title by Gunfire Games, draws heavy inspiration from several existing concepts and combines them all together, effortlessly, into a fresh and shiny package. It’s got the ship and crew management of games like Star Trek: Bridge Crew and FTL, it’s got the cooperative first-person shooter elements of games like Onward and Borderlands, with tons of loot to gather and gear to acquire, and it’s got the crazy ragtag crew antics of something like Guardians of the Galaxy — you can watch the trailer below to catch all of the vibes.

Even though it’s designed primarily as a three-player cooperative multiplayer game, the first time I played From Other Suns at GDC 2017 this week I was on my own. The other two demo stations were occupied by players also playing alone so I’d be rolling solo for my first mission. It was like my own private trial by galactic fire, as it were.

Everything began aboard my starship as the onsite Gunfire Games developer walked me through the controls and movement systems. On my wrists are a couple of buttons I can press with my opposite hand to pull up things like the options window or a map screen. The Oculus Touch controller face buttons toggle an inventory and an equipment display.

The default, more comfortable movement system was a bit unusual. You start by pressing forward on the left analog stick, and then as you move around you watch your avatar from a third-person perspective.

Once you let go of the analog stick, you immediately teleport back into your body as you’re standing still. It feels almost like an out of body experience, but is a good option for those sensitive to motion sickness. It seemed to be a decent stop-gap solution, but I can’t imagine someone playing the entire game this way. It’s just wonky and feels like an inferior way of experiencing it.

I preferred the full locomotion movement. It worked very similarly to Onward, allowing me to move freely around the world with few issues.

Once I got that down, it was time for my mission briefing. I headed to the bridge and looked down at my star map. After I selected a space station that was in trouble, my commander informed me that robots had overtaken the vessel and killed everyone on board. Because of course they did.

I made my way back to the chamber with the teleportation pad and inspected the guns on the wall. My starting pistol was good, but not great. Each of the guns had different fire rates, magazine sizes, and damage output. One functioned like an energy rifle, another shot lightning bolts, and another was sort of like a short-range shotgun. Plenty of diversity, with options for every situation.

Once my loadout was set I stepped onto the pad and beamed down onto the ship. The developers told me that in the real game, maps like this would be procedurally generated from tilesets, as sketched below. This means that no two missions will ever be the same due to randomization, but the areas won’t feel as lifeless as truly randomized ones.
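
To illustrate the idea (the tile names and layout rules here are invented, not taken from Gunfire’s code): hand-authored tiles get stitched together at random, so layouts vary between missions without ever feeling like pure noise.

```python
# Toy sketch of tileset-based procedural generation; every name and
# rule here is hypothetical, for illustration only.
import random

TILESET = ["corridor", "armory", "reactor", "crew_quarters", "cargo_bay"]

def generate_mission_map(length, seed=None):
    rng = random.Random(seed)
    layout = ["teleport_pad"]  # assume every mission starts at the pad
    layout += [rng.choice(TILESET) for _ in range(length - 2)]
    layout.append("objective_room")  # and ends at the objective
    return layout

print(generate_mission_map(6, seed=42))
# e.g. ['teleport_pad', 'corridor', 'armory', ..., 'objective_room']
```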

Knowing that killer robots were on the loose, I was much more cautious than when freely roaming my own starship. I slowly edged around corners, poked my head out from cover to sneak a look, and made sure to stay mobile.

Eventually I encountered my first enemy, a robotic adversary that resembled the droids from the Star Wars prequels a bit. He was flanked by two similar robots, so I started by poking out from around the corner, gunning for headshots with my pistol. Soon, I swapped to the automatic rifle and peppered the chest of the closest one until it dropped. Once they got too close I switched to the shotgun and blasted their heads off. It felt extremely satisfying, especially with full locomotion.

Upon death, the robots dropped a few glowing items. One was a shield, which I could hold in one hand and squeeze the trigger to activate — if it got hit too many times it’d break and need to recharge. The second glowing item was green and I found out it was a syringe, which I could stab myself with to heal. Stocking up on those saved me a few times later in the mission.

When I came back to the booth later I was able to hop into a multiplayer session with UploadVR’s own Senior Editor, Ian Hamilton. While exploring the starship I quickly realized that I could hear him just fine over voice chat, but he couldn’t hear me. It was just a minor hardware issue. I decided to use this to my advantage.

While this was technically a cooperative multiplayer game, the folks at Gunfire didn’t want to cut any corners. This is a hardcore game about manning a starship and trying to survive. Friendly fire happens. I learned this by opening fire on Ian as he was still trying to find his way around the ship; I could even hear the booth attendees talking to him over the microphone.

One thing led to another and I killed him in cold blood before the mission even started. I didn’t need him slowing me down, but he just respawned and joined me anyway.

While we were down there, team dynamics started to emerge. Whoever had the shield could walk in front, drawing fire and keeping enemies busy, while someone else headed up the rear taking aim with more powerful and precise weapons.

The inclusion of thrown weapons like EMP blasts to stun robots or grenades to blow apart large groups would be a welcome addition if the developers decided to add them. Later on, large robots with rocket launcher weapons could demolish a fully charged shield in a single blow, making it clear this won’t be an easy game when it finally releases.

During our time with the game, Gunfire also mentioned a suite of features that weren’t available in the demo we tried. For starters, while aboard your ship, you can actually engage other ships in combat. Gameplay during these moments would consist of sending crew members to repair parts of the ship and actively rerouting power to shields or guns during a fight.

Ultimately, even though I was more successful and actually beat the mission on my own before joining forces with Ian, playing as a team was rewarding and exciting. Perhaps with a more competent partner things wouldn’t have broken down so quickly.

I ended up killing him again out of pure spite before the demo was over. It didn’t make me feel any better.

From Other Suns is in development by Gunfire Games as an Oculus Rift with Touch exclusive, currently slated for Fall 2017. Even though it’s being built with three player co-op in mind, it’s still playable in single player as well.

[Editor’s Note] – This article was originally published in February during GDC 2017 and has been republished to coincide with the free Open Beta weekend happening at the time of publication.

GDC 2017: Vive Users Can Soon Pay $6.99 for a VR Content Subscription

At GDC 2017, UploadVR had the chance to speak with HTC’s president of Viveport and SVP of virtual reality, Rikard Steiber. During the interview, Steiber revealed the price point for Viveport’s upcoming subscription service for VR content. According to Steiber, the plan will cost $6.99 USD a month and will be available to current Vive owners and new customers “in the next few weeks.”

Steiber also reconfirmed that HTC will be offering a free, month-long trial of the Viveport subscription to all current Vive owners and new users when the service launches.

According to Steiber, with this subscription plan “users will be able to select 5 experiences a month from our pool of Viveport content” and these can be “kept month to month” or swapped out for new selections.

This content pool will be sourced from the content created by Vive Studios and any Viveport developer that chooses to opt in. The entirety of Viveport will not be available automatically to subscribers, although Steiber does expect that “many” studios will choose to be included.

In a previous story, we reported that developers will be getting a 60 percent cut of the subscription revenue, while HTC reportedly takes the remaining 40 percent. Meanwhile, at MWC, Alvin Graylin, China Regional President of Vive, further clarified the financial allocations for developers in the subscription plan.

According to Graylin, studios will get a piece of the revenue if their app is chosen by users. He explained that if developers create content that is “super sticky” and their app is selected by subscribers in a given month, then they’ll get a share of the revenue, making it in developers’ best interests to create content that people will keep coming back to.

“That’s something that I think that we want to encourage,” Graylin said, referencing apps that encourage users to keep coming back like Google’s Tilt Brush and Google Earth.
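
As a back-of-the-envelope illustration only (HTC hasn’t published the exact allocation formula), if the developer pool were split evenly across a subscriber’s five monthly picks, the per-app economics would look like this:

```python
# Hypothetical payout math; assumes an even pro-rata split, which HTC
# has not confirmed.
SUBSCRIPTION_USD = 6.99
DEVELOPER_SHARE = 0.60   # reported developer cut
APPS_PER_MONTH = 5       # titles a subscriber selects each month

pool = SUBSCRIPTION_USD * DEVELOPER_SHARE  # ~$4.19 to developers
per_app = pool / APPS_PER_MONTH            # ~$0.84 per selected app
print(f"~${per_app:.2f} per app, per subscriber, per month")
```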

According to its official website, “Viveport is the app store for virtual reality where customers can explore, create, connect, and experience the content they love and need.”

Tangentially, Steiber described Viveport as a place for the promotion and distribution of VR content that falls outside the category of a game. Experiences built for education, medicine, real estate, enterprise and commerce would all be examples of Viveport-appropriate experiences.

Steiber has been pushing a vision of Viveport as a vital showcase for the less flashy VR experiences for some time now. This subscription model may be his best way yet to direct the masses towards the content they may otherwise miss.

Hands-On: Superhot VR’s Forever Update Makes a Great Game Even Better

Like most people who have played it, I only had one complaint after completing Superhot VR for the Oculus Rift: I wish there was more. Well, get excited, people, because more is almost here.

According to a message from Superhot Team, Superhot VR’s “Forever” update will be available in just a few weeks. The update is slated to hit Oculus Home on March 7th and will add new achievements, new trials, and an endless mode to an already stellar VR experience. The additional content includes:

Endless Mode: Select a battleground and survive for as long as you can.

Headshots Only: Test your aim where only headshots take out enemies.

Time Trial: Race against your best scores in both in-game time (slowed down) and real-time.

Don’t Shoot: Try to complete the game without shooting.

Hardcore Mode: Get hardcore with faster enemies and less reaction time.

10-minute challenge: That’s right, just 10 minutes to complete the game.

Don’t Die: Any death restarts the entire game.

We had the chance to try the new Superhot VR Forever Update for ourselves and I think I can sum it up for you in one word: Wow.


In order to unlock the new modes you’ll have to complete the initial campaign, comprised of multiple adrenaline-pumping levels. Once you’ve done so, you’ll have access to a new startup menu for the game that takes the form of a slightly schizophrenic room filled with 90s-looking computers and enough cryptic Post-it notes to satisfy even the most skeptical conspiracy theorist. On the desk of this nightmare computer room is a series of floppy disks. To access each new game mode you pick up the correct floppy and insert it into the correct terminal.

In general, Forever brings a very welcome UX makeover to a game that, in its original form, offered very little in terms of choice. All you could really do was play the campaign over and over again. In Forever, however, you start every session in this new hub, which provides much more control over how and what you end up playing. There is also a new menu pyramid that pops up between levels and lets you boot back to the main menu or select a different stage.

Most of the new modes test one specific skill: accuracy (headshot only), speed (speedruns), endurance (don’t die), etc. There are also achievements that you can earn for especially skilled, or silly, accomplishments. These are all fun in their own right but the most exciting, and addictive, addition that Forever brings to Superhot VR is endless mode.

In Endless Mode you select one of several battlegrounds, take a deep breath and survive as long as you can. It takes the absolute best part of Superhot VR — making you feel like a complete badass — and turns it into a never-ending experience. You’ll be dodging bullets, grabbing weapons, nailing kills; it never stops feeling good and Endless Mode will never ask you to stop.

Endless mode is also incredibly challenging. Most levels set a goal of at least 80 kills and even after playing for an hour my record was only 27. Laugh all you want. You try it and see how you do.

Forever is one of the more significant post-launch updates we’ve seen for a VR game and it takes a game that was already incredible and makes it unforgettable. The one thing we were desperate for was more of this game, and now they’re giving us an infinite amount.

Life is good.

Superhot VR is available exclusively for the Oculus Rift on Oculus Home and requires Oculus Touch controllers as of today, March 7th.
