Apple Embraces AR & VR, and What That Means

Apple, Inc. has been slow to jump on the augmented reality (AR) and virtual reality (VR) bandwagons, letting its competitors test the water before diving in itself. That changed today, as the company’s Worldwide Developers Conference (WWDC) in San Jose hosted several key announcements for the immersive technology industries.

First up came the reveal that SteamVR would soon support the Mac. This doesn’t mean that every VR title on Steam will automatically work on Apple hardware, but giving developers the option to create videogames and experiences for the Mac is certainly a step in the right direction. This was reinforced by the announcement of Metal 2, the company’s API for high-performance graphics, which promises to deliver a VR-optimised display pipeline.

This new technology was then showcased by Epic Games, who used the VR Editor component of Unreal Engine 4 to demonstrate a real-time build on the HTC Vive, using assets from Industrial Light & Magic (ILM)’s digital Star Wars library and green screen mixed reality (MR) video. That one sentence contains a huge number of high-profile names, and the showcase was as impressive as you would expect. Lauren Ridge, Technical Writer at Epic Games, resized and positioned TIE Fighters and other craft before playing out the scene she had created. A choreographed sequence featuring the infamous Darth Vader drew huge applause from the audience.

Apple then revealed its VR-compatible systems before moving on to AR. An equally remarkable showcase – again utilising Unreal Engine 4 and big names from the film industry – included a LEGO model that could be exploded, Pokémon GO with a Pikachu that appeared to be in direct contact with the ground, and an impressively rendered wild west cyberpunk battle scene. Underpinning the demos, the ARKit development suite offers stable motion tracking, ambient lighting estimation and support for Unity and Unreal Engine, amongst other features.

While the company stopped short of revealing the AR head-mounted display (HMD) many had been expecting, the development tools presented certainly have applications beyond iPads and iPhones. A showcase it was; now it’s up to developers to take the technology further.

Ultimately, Apple coming late to the game is nothing new. What it does mean, however, is that AR and VR will get a publicity injection and brand new opportunities to grow their audiences. This is perhaps more important than the hardware line-up supporting VR or the graphical prowess of the AR presentation: it’s about the experiences you can have with the technology as opposed to the technology itself, after all.

GDC 2017: Epic Games Unreal Engine VR Editor Coming in April With New Features

Epic Games is using the Game Developers Conference (GDC) to give an advance preview of the latest additions to its Unreal Engine VR Editor, which allows creatives to build worlds in a virtual reality environment using the full capabilities of the editor toolset combined with interaction models designed specifically for VR world building. The goal is to officially launch the new VR Editor by April 17.

Mike Fricker, technical director at Epic Games, told UploadVR that working directly in VR provides the proper sense of scale necessary to create realistic, believable worlds, while the use of motion controllers means artists and other non-programmers can build environments with natural motions and interactions.

Epic’s own Robo Recall team used the VR Editor to build out the free pack-in game for the Oculus Rift with Touch, which also makes its complete debut at GDC this week.

“As soon as they started using it, they realized what the most beneficial use cases were to them,” Fricker said. “Inspecting and tweaking was one of them, but sometimes they just want to throw in things really quickly and see it at scale without having to constantly take the headset off and on.”

The Robo Recall team had a direct impact on the new VR Editor that everyone will have access to in April. Fricker said the team needed small power-user features, like the ability to snap objects straight to the ground instantly without having to grab them from a menu and move them down manually.

“They asked us to give them the power to use these additional features so that they can stay in VR longer,” Fricker said. “That’s not to say that we’re trying to replace desktop. If they’re going to go and do blueprint scripting or material editing, you can get to that stuff in VR and you can make some progress if you knew you were going to tweak something or make a quick change to something. If you’re going to develop a function library or a new game system, you’re probably not going to do that in VR today. But the fact that you can go and see it and inspect it without having to leave VR, that’s the feedback that we got from the team.”

Developing inside VR not only opens things to all members of a team, it also speeds up the development process.

“It’s much faster to navigate a scene in VR than it is with the desktop, where you’re constantly using the combinations of the mouse and keyboard and modifier keys to orbit around an object and zoom the camera around,” Fricker said. “In VR, it’s one-to-one. I know exactly where I’ll end up at any point. Once you get used to it, it’s super fast.”

Lauren Ridge, tools programmer at Epic Games, said they’ve put in safeguards to ensure developers don’t get sick working within VR. For example, you can only rotate in one direction at a time. Not a single Epic user has ever had motion sickness problems while in the VR Editor at the studio, where high-end PCs ensure a fast framerate.

“We have various levels of safeguard settings that will do things like turn on a grid for my tracking space or dissolve the sky into grayness,” Ridge said. “For example, in real life, I don’t have the ability to grab the world, turn it like a steering wheel and see the sky change. To some people, that’s instantly not good, so we’ve looked at all the different cases people have and added safeguards for them. You also can’t tip yourself over.”

Ultimately, the VR Editor has been designed to allow creatives to do whatever they want. During its GDC keynote, Epic showcased a complicated scene set on a beautiful beach, which included a surfing mini-game as well as a sea plane flying overhead. Moving the plane to a higher altitude takes seconds: grab the plane and adjust its trajectory.

“We’ve been improving things since last year, which was the equivalent to our early access,” Fricker said. “We know that navigating 3D spaces is really fun and fast in VR, so that’s another cool thing that we’re excited about.”

The GDC beach demo also shows how easy it is to access the Unreal editor UI in VR to change settings or switch which types of plants you’re painting down with the foliage tool. The brush has been improved, making actions like undo and redo accessible with a quick gesture.

Simulate mode allows developers to see how objects behave when physics are enabled. Ridge showed rocks of different sizes accurately falling off a cliff that overlooks the beach.

“This means you can use physics as an art tool,” Ridge said. “When you move the rock around, gravity will act on it. You can also trigger gameplay events.”
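The idea behind using physics to settle props can be sketched outside the engine. This toy Python snippet (not Unreal Engine code; all names here are illustrative) steps a dropped object under gravity with simple Euler integration until it rests on a ground plane, much as the demo’s rocks tumble until they come to rest:

```python
# Toy sketch of physics-assisted placement (not Unreal Engine code):
# an object dropped from a height falls under gravity, frame by frame,
# until it reaches the ground plane, where it is clamped to rest.

GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # one frame at 60 fps

def settle(height, ground=0.0):
    """Step a dropped object until it comes to rest on the ground."""
    z, vz = height, 0.0
    while z > ground:
        vz += GRAVITY * DT  # gravity accelerates the object downward
        z += vz * DT        # position advances by current velocity
    return max(z, ground)   # clamp to the ground plane

print(settle(5.0))  # a prop dropped from 5 m ends resting at 0.0
```

In an editor, the artist simply releases the object and the solver performs these steps; the final resting transform becomes the placed position.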

The demo shows accurately built wooden 2x4s being snapped together into a staircase for a wooden hut on the beach.

“We also added more precise snapping tools,” Fricker said. “That’s about having things look organic and natural, but we also wanted a way to have really precise interactions with objects.”

Epic is taking advantage of VR motion controllers, which offer more degrees of freedom than a traditional mouse and keyboard.

“If I paint using different pressure on the trigger of the motion controllers, it’ll paint different strengths of the rock material down,” Ridge said. “This is cool because the editor already had various painting and fluid creativity features, but then being able to use those with motion control suddenly made them way more accessible. I can instantly get the bird’s eye view and see how it looks all in the scene and then jump down to see the player’s view of it to make any changes.”
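Pressure-sensitive painting of the kind Ridge describes boils down to scaling brush strength by an analog trigger reading in the 0.0–1.0 range. A hypothetical sketch (the function and parameter names are illustrative, not Epic’s API):

```python
# Hypothetical sketch of pressure-sensitive painting: an analog
# trigger reading (0.0-1.0) is mapped to a paint strength, with a
# deadzone so a finger resting on the trigger doesn't paint.

def paint_strength(trigger, max_strength=1.0, deadzone=0.05):
    """Map an analog trigger reading to a paint strength."""
    if trigger < deadzone:
        return 0.0  # resting finger: no paint applied
    # Rescale [deadzone, 1.0] linearly onto [0.0, max_strength].
    return max_strength * (trigger - deadzone) / (1.0 - deadzone)

print(paint_strength(0.02))  # light rest on the trigger: 0.0
print(paint_strength(1.0))   # full squeeze: 1.0
```

Each frame, the editor would sample the trigger axis and feed the result into the existing brush, which is why analog controllers make the old desktop painting tools feel newly accessible.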

Epic has also partnered with Disney’s Pixar Animation Studios to have Unreal Engine 4 and the VR Editor support Pixar’s Universal Scene Description (USD) 3D graphics pipeline. Epic showed the coral reef from Finding Dory and the characters Crush the sea turtle and Mr. Ray the manta ray running in UE4.

“The cool thing here is that we don’t need any other separate tools to go from USD to what you’d see on screen with this demo,” Fricker said. “USD is a pretty big deal to the film industry and other non-gaming uses, but it has some special powers that make it equally awesome for games too.”

Pixar wants to add more plug-ins for creatives beyond Autodesk Maya, so UE4 now opens up new opportunities for companies working in VR.

“As more plug-ins appear, more people will begin using this format,” Fricker said. “USD has a really elegant set-up for just describing a scene in its entirety with all the information you need to uniquely instance specific things along with dealing with complex animation.”

“We know the film industry will like it,” Ridge added. “We will increasingly use USD here. Hopefully, we will keep working with Pixar to make it awesome for every use case we can imagine. Right now we are working on USD import, but at some point we will probably be able to generate USD files as well.”
