You don’t have to be a developer to have heard of Epic Games and Unreal Engine. The studio is well known for titles such as Paragon and its recently released virtual reality (VR) project Robo Recall, both of which were created using Unreal Engine, as were numerous other videogames by developers around the world. This week the company released a preview of the engine’s next update, 4.16, so that studios can start experimenting with the new features and report any issues.
Epic Games continues to advance the VR side of the engine, adding several new tool updates: improvements to VR Mode’s UI and interaction, access to the Sequencer Editor, Smart Snapping and physics simulation.
The studio hasn’t stopped there either. ‘Unified Console Commands across VR platforms’ means that there is now a shared layer for common interfaces that developers can work from, rather than a separate one for each platform, and the mobile multiview path now supports Gear VR.
Back in February Valve launched the Steam Audio SDK as a free beta for developers. The SDK included a range of features such as HRTF-based binaural rendering, occlusion, physics-based reverb, real-time sound propagation and baked reverb and propagation to enhance VR applications. With Unreal Engine 4.16 Preview 1, Epic Games has fully integrated the Steam Audio SDK while utilising the capabilities of the new Unreal Audio Engine.
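True HRTF rendering, as in Steam Audio, convolves each ear's signal with a direction-dependent impulse response. As a rough illustration of the underlying idea only (a toy constant-power pan, not Steam Audio's API or algorithm), direction can be encoded as a level difference between the two ears:

```python
import math

def stereo_gains(azimuth_deg):
    """Toy constant-power pan from source azimuth (0 = front, +90 = right).

    Real HRTF-based binaural rendering filters each ear separately; this
    sketch only models the level difference between the ears.
    """
    az = max(-90.0, min(90.0, azimuth_deg))  # clamp to the frontal arc
    pan = (az + 90.0) / 180.0                # 0 = full left, 1 = full right
    theta = pan * math.pi / 2.0
    return math.cos(theta), math.sin(theta)  # (left_gain, right_gain)

left, right = stereo_gains(0.0)  # a source dead ahead reaches both ears equally
```

A constant-power law keeps perceived loudness steady as the source moves, which is why the gains are cosine/sine rather than linear.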
As with any preview, this is just for developers to test and experiment with. Epic Games always states: “Please be aware that the preview releases are not fully quality tested, that they are still under heavy active development, and that they should be considered as unstable until the final release. Developers should not convert their projects for active development on preview releases.”
For the full summary of updates head to the Unreal Engine forums. VRFocus will continue its coverage of Epic Games and Unreal Engine, reporting back with the latest updates.
Car manufacturer BMW is using technology developed by the videogame industry to assist in the process of designing and building cars, combining virtual reality (VR) and 3D printing for rapid prototyping and testing.
The German car maker has begun using Unreal Engine’s real-time physically-based rendering to explore design options such as interior design and vehicle functionality. Use of VR means that engineers can get an all-round view of the surrounding area and locate potential problems such as blind spots, or switches and displays that may be awkward to reach depending on angle or seat position. The engineers and designers are able to get a sense of what sitting in a real car of that design would be like, without the cost of producing a prototype.
Simon Jones, Director of Unreal Engine Enterprise, explained the growing excitement surrounding the use of real-time VR in automotive design and engineering: “The arrival of relatively low cost, high fidelity VR has coincided with a rapid escalation in the need to do more with less and to do it faster,” he said. “BMW’s new mixed reality system is a great example of what can be achieved with clever thinking.”
The new mixed reality system also means that worldwide collaboration is much easier. Designers from all over the world are able to contribute to a project with reviews, evaluations or revisions, without needing to be in the same country as their colleagues.
“Virtual reality and Unreal Engine are becoming a crucial part of automotive design validation,” Jones added. “Car makers are defining the parameters and the Unreal Engine tools deliver the platforms they need, allowing engineers and stylists much greater freedom to explore different themes in a way that wasn’t previously possible with costly physical prototypes because they take so long to build and update.”
Many other manufacturing companies are using VR and augmented reality (AR) in their processes. VRFocus will keep you up to date on developments within this area.
Virtual reality (VR) can be isolating, but some developers are working on ways to make it a more social experience. In one example, some children in Washington DC have the unique opportunity to board a truly special school bus – one that drives to Mars.
Field Trip To Mars is a VR experience created by Framestore with assistance from Lockheed Martin and using Unreal Engine 4. The experience takes place on what looks like a typical yellow American school bus, but this particular school bus comes fitted with special display screen windows and 3D surround sound speakers that allow the students on board to experience a trip through the Martian landscape.
Framestore have used Unreal Engine to carefully map out every street in Washington DC, which, coupled with GPS and laser sensor technology, means that whatever route the bus takes through the city, the children will see a unique view of Mars from the windows. If the bus turns left on Earth, the bus turns left on Mars. The active rendering means that every bump and pothole can be mapped onto the Martian landscape for a more immersive experience.
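In essence the bus is dead-reckoned: each real-world turn and distance delta is replayed on the virtual vehicle so the Mars view matches the route. A minimal sketch of that idea (the class, names and coordinate convention are hypothetical, not Framestore's implementation):

```python
import math

class VirtualBus:
    """Replays the real bus's heading/distance deltas inside a virtual scene."""

    def __init__(self):
        self.x = self.y = 0.0  # position in the virtual scene (metres)
        self.heading = 0.0     # radians; 0 = scene "north", positive = clockwise

    def apply_delta(self, turn_rad, distance_m):
        # Same turn, same distance: if the bus turns left on Earth,
        # it turns left on Mars.
        self.heading += turn_rad
        self.x += distance_m * math.sin(self.heading)
        self.y += distance_m * math.cos(self.heading)

bus = VirtualBus()
bus.apply_delta(0.0, 10.0)               # drive 10 m straight ahead
bus.apply_delta(math.radians(90), 10.0)  # 90-degree turn, then another 10 m
```

In the real system GPS and laser sensors supply the deltas, and bumps and potholes would feed a vertical channel in the same fashion.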
The specially designed windows in the school bus have an active film that can switch from opaque to clear when an electrical current is passed through it, so the real world can be viewed from the windows, or they can be quickly switched to the Mars projection.
Simon Jones, Director of Unreal Engine Enterprise had this to say: “All of this means that organisations across a range of sectors are increasingly understanding how they can embed VR within their design, development and technical strategies to help them do things faster and more efficiently. So what started life as a high-end computer gaming technology has developed to become an application that accelerates innovation, drives new technology and creates limitless opportunities. Like taking children on an astonishing field trip to Mars.”
Claude Dareau, Senior Developer of Field Trip to Mars, added: “We get the kids on the bus, the screens go dark, Mars pops up and they go crazy. Just seeing their reaction was incredible. I’d only had a few hours’ sleep and was shattered. I definitely felt emotional when I saw that.”
The shared VR experience takes place within a regular school bus that drives a route around Washington DC, and you can watch a detailed video about the experience below.
VRFocus will continue to bring you news about VR experiences as they come in.
A game is a game is a game; in the case of Epic Games’ action-packed virtual reality (VR) first-person shooter (FPS) Robo Recall, it’s a very good one, achieving a full five stars from VRFocus in our review. However, there are times when users can expand a game and come up with new concepts and new levels that take the videogame one step further. Mods add an extra string to a game’s bow: fresh content and fresh ideas from a new perspective. Epic Games included mod compatibility with Robo Recall at launch and will be supporting user-created content through the Robo Recall Mod Kit, much as happened with ARK: Survival Evolved and the ARK Mod Kit, or with NVIDIA’s kit for VR Funhouse.
The Robo Recall Mod Kit comes complete with Robo Recall’s full game source code, available in C++ through UnrealEngine.com. So how do you go about actually creating a mod? And when that is done, how do you share it for other users to try? Helpfully, there is a guide written by the team which we’re able to share with you here as well. This is a multi-step process, so be sure you’ve got everything you’re going to need before you start.
1. Install the Robo Recall Mod Kit
Sign up for an Epic ID
The first step is always seemingly the most obvious, but if you do not have an Epic ID yet, it is something you will need to acquire. After which you can move on, and…
Download and install the Epic Games Launcher
Click Install to download Robo Recall Mod Kit from the Launcher.
Here’s where you can expect to play something of a waiting game, as the Editor isn’t exactly on the light side, clocking in at over 20GB. So if you don’t have that space to spare, you’ll have to do some creative shifting around of titles to make room on your system. The Unreal Engine team actually recommends you spend the time checking out additional help documentation on the use of Blueprints, and the basics of designing levels if you are unfamiliar.
Then once that is completed, launch the Robo Recall Editor
It’s time to get things underway properly…
2. Create A New Mod
So now that you have the Editor installed and open, it’s time to begin!
Select the type of mod you wish to create.
Enter a new mod name, ‘CeramicGuns’ in this demonstration.
Fill out the other fields…
You can also take the opportunity to fill out the Author and Description fields to put your personal stamp on things. This will be shown in the game’s Mod menu. If you change your mind, you can edit these later by accessing your mod through the Editor’s Edit -> Plugins menu.
Click the Create Mod button to generate your mod files.
Wait for the mod to be generated.
The Robo Recall Editor will create your new mod structure; a popup will tell you when it has completed.
Your new mod is automatically focused in the Content Browser.
Locate the Blueprints
You can then double click the Blueprints folder, then the Weapons folder to find your weapon Blueprints.
With your new mod structure set up, you can now start creating your mod. Just because you started with one mod type doesn’t limit you to modifying only the couple of assets you’ll find in your mod’s Content folder; you can modify any number of assets from Robo Recall and have them exist in one mod.
Just remember that any new assets you import need to go into your mod’s Content folder. If you put them elsewhere, they will fail to package.
3. Add a New Weapon Material
Next it’s time to create a Material asset to apply to weapons; continuing with the demo, this will give the weapon a ceramic finish.
Create a new folder.
Right-click in the Content Browser and choose New Folder to create a new folder in your mod’s content directory. Name the folder Materials.
Double-click the Materials folder to open it and click the Add New button to add a new Material asset. Name the Material Mat_Ceramic.
Double-click the Mat_Ceramic Material to edit it in the Material Editor.
Drag a Constant3Vector expression into the graph from the Palette and connect it to the Base Color input on the Material node.
Double-click the black color preview on the expression to open the Color Picker. Set the R, G, and B values to 0.02 to give the surface just a tiny bit of color.
Drag a Constant expression into the graph from the Palette and connect it to the Metallic input on the Material node. Leave the value of the expression at 0. This will make the material have no metallic characteristics in its appearance.
Drag a Constant expression into the graph from the Palette and connect it to the Roughness input on the Material node. Leave the value of the expression at 0. This will cause the material to appear extremely shiny.
Click the Apply button to save the changes to the Mat_Ceramic Material.
Result
The Preview panel shows a dark, shiny surface that looks like ceramic. In the next step, you will apply this Material to your modded weapons to give them a ceramic appearance.
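The graph built above reduces to three physically-based shading inputs. As a sketch of what the ceramic material amounts to (a hypothetical data structure for illustration, not the Unreal API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PbrParams:
    """The three values set in the Mat_Ceramic graph above."""
    base_color: tuple  # linear RGB, 0..1 per channel
    metallic: float    # 0 = dielectric (non-metal), 1 = metal
    roughness: float   # 0 = mirror-smooth, 1 = fully diffuse

MAT_CERAMIC = PbrParams(
    base_color=(0.02, 0.02, 0.02),  # just a tiny bit of color: near-black
    metallic=0.0,                   # ceramic is a dielectric
    roughness=0.0,                  # extremely shiny, tight highlights
)
```

Seen this way, "dark and shiny like ceramic" is simply a dark dielectric with near-zero roughness.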
4. Apply the Weapon Material
It’s time to apply your ceramic Material to your weapons so they look ceramic instead of metallic.
Double-click one of your weapon Blueprints in the Content Browser to edit it in the Blueprint Editor.
Select the WeapMesh Component in the Components panel.
Back in the Content Browser, select the Mat_Ceramic Material.
In the Details panel, find the Materials category and click the Use Selected Asset from Content Browser button to apply the Mat_Ceramic Material to the weapon mesh.
Click the Compile button in the Blueprint Editor toolbar to update the Blueprint with the changes.
Click the Save button in the Blueprint Editor toolbar to save the weapon asset.
Repeat these steps for the other weapon Blueprints in your mod to make them all ceramic.
The Preview panel will show the ceramic weapon.
5. Test Your Mod
Time to make sure the mod is working as intended by enabling it in the game and shooting some targets.
In the Level Editor toolbar, click the Play button to launch into the Hub.
Teleport to the holostation to open up the menu and select MODS.
In the Mods menu, find your Ceramic Guns mod and select it.
Select the option GUN RANGE from the main menu.
Your ceramic guns are now available to be equipped.
And with that done it’s time to shoot some targets with your new ceramic weapons!
The Unity and Unreal development environments have always stood in opposition to each other, leaving game developers with the difficult choice of which engine to adopt. A survey by a British business-analysis firm has now revealed that the majority of VR developers – 59% – work with Unity.
Developers favour Unity
Among the 1,000 highest-reach free-to-play titles, most are based on Unity, and more than half of all Daydream apps were developed with it. But the engine doesn’t only dominate in virtual reality: half of all console and PC games, and almost 70% of all published mobile titles, use the development environment.
This imbalance was not always foreseeable. The Unreal Engine, which launched in 1998 alongside the first-person shooter Unreal and is based on C++, was extremely popular with developers from the start. Not least, the engine from developer studio Epic Games, with its modular structure and raw power, supported the rapid growth of game development in the then booming mobile sector.
Unity, a company originally founded by a Dane, an Icelander and a German, initially focused above all on producing better 2D and 3D graphics for mobile. Since then the engine has seen strong growth in adoption: in 2016, 16 million games built on Unity were installed – an increase of 31% over 2015.
Unreal makes more money
Yet even though Unity, which uses the programming languages C# and UnityScript, is clearly on the rise, developers currently still earn most of their money with the Unreal Engine, which is built on C++. Although Unity can boast more end users, games based on Unreal took in more than 10 billion US dollars in 2016. That is largely because Unreal continues to be used above all for demanding high-end productions, while the broad mass of developers uses the Unity runtime and development environment.
At least for Epic there seems to be no great sense of rivalry between the two engines, judging by the statements of Kim Libreri, Epic’s CTO. Even if Unity were to dominate the games market, there are numerous other areas in which the Unreal Engine is used: from hotel designers to car manufacturers, many different industries rely on Unreal. NASA is also currently working on visualising a Mars mission with Unreal technology.
Mike Fricker, Technical Director at Epic Games, and his colleague Lauren Ridge, Technical Writer, spoke to VRFocus about the changes and improvements that have been made to the Unreal Engine VREditor.
At the Epic Games keynote, a new version of the VR Mode tools was revealed during a video presentation. Added features include: asymmetric hands with a laser pointer for object tracking; a floating user interface (UI) panel with a radial menu for rapid access to options, letting users switch between playing and editing without ever needing to take off the headset; and a new mesh editing mode that allows users to create objects from scratch within Unreal Engine while still wearing a VR headset.
The first version was released over a year ago, and the new version is available now on GitHub to download and compile.
You can watch the interview with Mike Fricker and Lauren Ridge below.
VRFocus will keep you up to date with developments on the VREditor and Unreal Engine.
At the Epic Games keynote at GDC in San Francisco this week, a short video was unveiled to show off the features that would be available in Unreal Engine over the coming year.
Unreal Engine’s original iteration dates back to 1998 with the original Unreal. It has been designed from the very beginning to be easily extensible and simple to mod. The release of Unreal Engine 4 as a free, source-available engine in 2015 took it to new heights of popularity, and alongside Unity it is now one of the most popular game engines in the world. Unreal Engine 4 is being used to develop upcoming games such as Tekken 7, Snake Pass, InSomnia and We Happy Few – as well as many virtual reality (VR) titles of course, on a variety of headsets and platforms.
Unreal Engine’s newest updates for 2017 will incorporate many new and improved features, such as photo-real lighting and post-processing, physics-driven animation, a replay system, high-performance VR at 90FPS, a full VR editing tool, Vulkan API support, Blueprint visual scripting, a visual material editor and GPU accelerated particle animation.
You can watch the Unreal Engine 2017 Feature video below.
VRFocus will continue to bring you information on Unreal Engine updates as well as the rest of the news from GDC 2017.
Epic Games is using the Game Developers Conference (GDC) to give an advanced preview of the latest additions to its Unreal Engine VR Editor, which allows creatives to build worlds in a virtual reality environment using the full capabilities of the editor toolset combined with interaction models designed specifically for VR world building. The goal is to officially launch the new VR Editor by April 17.
Mike Fricker, technical director at Epic Games, told UploadVR that working directly in VR provides the proper sense of scale necessary to create realistic, believable worlds, while the use of motion controllers means artists and other non-programmers can build environments with natural motions and interactions.
Epic’s own Robo Recall team used the VR Editor to build out the free pack-in game for the Oculus Rift with Touch, which also makes its complete debut at GDC this week.
“As soon as they started using it, they realized what the most beneficial use cases were to them,” Fricker said. “Inspecting and tweaking was one of them, but sometimes they just want to throw in things really quickly and see it at scale without having to constantly take the headset off and on.”
The Robo Recall team had a direct impact on the new VR Editor that everyone will have access to in April. Fricker said the team needed little power user features like the ability to snap objects right to the ground instantly without having to go grab them from a menu and move them down to the ground.
“They asked us to give them the power to use these additional features so that they can stay in VR longer,” Fricker said. “That’s not to say that we’re trying to replace desktop. If they’re going to go and do blueprint scripting or material editing, you can get to that stuff in VR and you can make some progress if you knew you were going to tweak something or make a quick change to something. If you’re going to develop a function library or a new game system, you’re probably not going to do that in VR today. But the fact that you can go and see it and inspect it without having to leave VR, that’s the feedback that we got from the team.”
Developing inside VR not only opens things to all members of a team, it also speeds up the development process.
“It’s much faster to navigate a scene in VR than it is with the desktop, where you’re constantly using the combinations of the mouse and keyboard and modifier keys to orbit around an object and zoom the camera around,” Fricker said. “In VR, it’s one-to-one. I know exactly where I’ll end up at any point. Once you get used to it, it’s super fast.”
Lauren Ridge, tools programmer at Epic Games, said they’ve put in safeguards to ensure developers don’t get sick working within VR. For example, you can only rotate in one direction at a time. Not a single Epic user has ever had any motion sickness problems while in the VR Editor at the studio, where high-end PCs ensure a fast framerate.
“We have various levels of safeguard settings that will do things like turn on a grid for my tracking space or dissolve the sky into grayness,” Ridge said. “For example, in real life, I don’t have the ability to grab the world, turn it like a steering wheel and see the sky change. To some people, that’s instantly not good, so we’ve looked at all the different cases people have and added safeguards for them. You also can’t tip yourself over.”
Ultimately, the VR Editor has been designed to allow creatives to do whatever they want. Epic showcased a complicated scene set on a beautiful beach during its GDC Keynote, which includes a surfing mini-game as well as a sea plane flying overhead. Moving the plane to a higher altitude is done in seconds by grabbing the plane and moving its trajectory.
“We’ve been improving things since last year, which was the equivalent to our early access,” Fricker said. “We know that navigating 3D spaces is really fun and fast in VR, so that’s another cool thing that we’re excited about.”
The GDC beach demo also shows how easy it is to access the Unreal editor UI in VR to change settings or change what types of plants you’re painting down for foliage painting. The brush has been improved and makes things like undo and redo more accessible with a quick action.
Simulate mode allows developers to see how objects act when physics are attached. Ridge shows rocks of different sizes accurately falling off a cliff that overlooks the beach.
“This means you can use physics as an art tool,” Ridge said. “When you move the rock around gravity will act on it. You can also trigger gameplay events.”
The demo shows accurately built wooden 2x4s being snapped together into a staircase for a wooden hut on the beach.
“We also added more precise snapping tools,” Fricker said. “That’s about having things look organic and natural, but we also wanted a way to have really precise interactions with objects.”
Epic is taking advantage of VR, which offers more degrees of freedom with motion controllers than when using a traditional mouse and keyboard.
“If I paint using different pressure on the trigger of the motion controllers, it’ll paint different strengths of the rock material down,” Ridge said. “This is cool because the editor already had various painting and fluid creativity features, but then being able to use those with motion control suddenly made them way more accessible. I can instantly get the bird’s eye view and see how it looks all in the scene and then jump down to see the player’s view of it to make any changes.”
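A mapping like the one Ridge describes can be sketched in a few lines; the deadzone and response curve below are illustrative assumptions, not Unreal's actual mapping:

```python
def brush_strength(trigger, max_strength=1.0, deadzone=0.05, gamma=2.0):
    """Map analog trigger pressure (0..1) to paint strength.

    A small deadzone avoids painting from controller noise at rest, and a
    gamma curve gives finer control at light pressure. (Hypothetical values,
    sketched for illustration.)
    """
    if trigger <= deadzone:
        return 0.0
    t = (trigger - deadzone) / (1.0 - deadzone)  # renormalise past the deadzone
    return max_strength * t ** gamma

light = brush_strength(0.3)   # light squeeze -> weak stroke
heavy = brush_strength(0.9)   # firm squeeze -> strong stroke
```

The key design point is simply that the stroke scales continuously with pressure, which a mouse button cannot provide.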
Epic has also partnered with Disney’s Pixar Animation Studio to have Unreal Engine 4 and the VR Editor support Pixar’s Universal Scene Description (USD) 3D graphics pipeline. Epic showed the coral reef from Finding Dory and characters Crush the sea turtle and Mr. Ray the manta ray running in UE4.
“The cool thing here is that we don’t need any other separate tools to go from USD to what you’d see on screen with this demo,” Fricker said. “USD is a pretty big deal to the film industry and other non-gaming uses, but it has some special powers that make it equally awesome for games too.”
Pixar wants to add more plug-ins for creatives beyond Autodesk Maya, so UE4 now opens up new opportunities for companies working in VR.
“As more plug-ins appear, more people will begin using this format,” Fricker said. “USD has a really elegant set-up for just describing a scene in its entirety with all the information you need to uniquely instance specific things along with dealing with complex animation.”
“We know the film industry will like it,” Ridge added. “We will increasingly use USD here. Hopefully, we will keep working with Pixar to make it awesome for every use case we can imagine. Right now we are working on USD import, but at some point we will probably be able to generate USD files as well.”
Chevrolet announced today, on stage at the Epic Games keynote speech at GDC, a real-time rendered advertisement created by visual effects (VFX) company The Mill, using Unreal Engine and The Mill’s dynamic Blackbird car in combination with the new augmented reality (AR) platform Mill Cyclops.
Blackbird is a fully adjustable car rig that can be used in place of a real car in filming situations where it is not possible, for whatever reason, to use a real car.
Cyclops is a new virtual production toolkit that allows you to see CGI objects live on location, tracked in real-time. In combination with the Blackbird, it is possible to use laser tracking and the four cameras on top of the Blackbird to produce an AR image of a Chevrolet Camaro ZL1 that displays in real-time, including live reflections.
The Mill are well known for their award-winning visual effects, particularly for British sci-fi series Doctor Who. They have done previous work in the virtual reality (VR) and augmented reality (AR) arena, as previously reported by VRFocus.
The demo was performed using a Google Tango-enabled tablet-phone.
Alistair Thompson, International Executive Vice President at The Mill, had this to say: “Visual effects should never get in the way of stories; it should help tell stories. The future of filmmaking is real-time.”
VRFocus will continue to bring you news from the Epic Games keynote and the rest of GDC.
Ever since Epic Games opened its Unreal Engine 4 technology to the world, new use cases have been coming in fast and furious. The latest example of just how far real-time video game engine technology has come will be on display at GDC 2017 this week. Epic Games partnered with visual effects and creative content studio The Mill to shoot a new Chevrolet video that utilizes UE4 technology and an augmented reality Blackbird motion tracking vehicle.
That electric car is the brainchild of The Mill. And before the top-secret vehicle was revealed, Epic Games CTO Kim Libreri got an early look under the hood.
“This high-tech car can run any performance envelope of any car, and it has all the electronic equipment that you need to be able to track it, so you know where it was relative to the camera car, and also generates the light and reflection information that you need to be able to light a computer-generated car,” Libreri said.
Libreri met with Vince Baertsoen, the head of R&D at The Mill, last year in Germany at FMX 2016. At the time, the one challenge Baertsoen had was that the director filming the car was still seeing the Blackbird and the transformation into whatever car they wanted it to become occurred in post production. The Holy Grail was for those shooting the sequence to see the final version of the vehicle in real-time.
At GDC, Epic is showcasing a 2017 Camaro ZL1 in a race against the Chevy FNR concept car, except the vehicles are actually photorealistic pixels running in real-time using UE4 technology. To prove the point, the ZL1 can be swapped out for a classic ’69 Camaro.
“Those cars are built like we would do a video game asset,” Libreri said. “Right now, it’s a specialized version of Unreal because we’ve just put the demo together, but these are features that are going to be available in regular Unreal. The only difference between this and a car that you would put in a video game is the amount of polygons in the car. We actually have a couple levels of detail to the car. The one that you see in the video is comprised of millions of polygons. We also have a low resolution version that would be a more normal game-level asset that would run on a PlayStation 4. The materials and lighting and most of the things you see in the video would run on a console in a more regular video game environment.”
These virtual vehicles were super-imposed on top of the Blackbird during a live action shoot on the Angeles Crest Highway in Los Angeles. The Blackbird uses professional grade AR, filming 360-degree video from the center of the vehicle using Red Epic cameras. Everything around the vehicle is captured as if it were panoramic 360-degree photography, and a spinning LiDAR scanner scans the surrounding environment.
“They take the output from these four cameras, stitch it into a panorama, and then beam it to Unreal Engine wirelessly,” Libreri explained. “And then we take that as lighting and reflection information that we can place on top of a car that they’ve tracked with a real-time tracking system developed by partner company Arraiy.”
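Using a stitched panorama for lighting and reflections comes down to the standard lat-long lookup of image-based lighting: a 3D direction is mapped to coordinates in the equirectangular image. A sketch of that lookup (the axis convention here is an assumption, not The Mill's pipeline):

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a 3D direction to (u, v) in an equirectangular panorama.

    u wraps around the horizon (azimuth), v runs pole to pole (elevation).
    Convention assumed: +y is up, -z is forward. A renderer would sample
    the stitched panorama at (u, v) to light or reflect off the CG car.
    """
    n = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / n, y / n, z / n
    u = math.atan2(x, -z) / (2.0 * math.pi) + 0.5  # azimuth -> [0, 1)
    v = math.acos(y) / math.pi                     # elevation -> [0, 1]
    return u, v

u, v = direction_to_latlong_uv(0.0, 0.0, -1.0)  # straight ahead: image centre
```

Every reflection ray off the virtual Camaro resolves to one such lookup, which is why the stitched feed alone is enough to relight the car convincingly.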
Before joining Epic in 2014, Libreri spent 20 years working in the Hollywood visual effects industry at companies like Lucasfilm, Industrial Light & Magic, and Digital Domain. These vehicles are inserted into the live action compositing background plates, which are equivalent to the kinds of images ILM would use.
The director of the 60-second video is sitting inside a customized Mercedes ML that has a Russian Arm that can film the Blackbird from any angle. Inside the Mercedes, he can watch the UE4-generated vehicle in real-time and make filming adjustments on-the-fly. A PC running on a high-end consumer NVIDIA graphics card is set up inside of the Mercedes to transform the Blackbird into the Camaro vehicles.
“We’re using some pretty beefy hardware for the demo right now, but that hardware capability is going to be available in the cloud very, very shortly, so you’ll be able to run these kinds of graphics-on-demand projects from the cloud,” Libreri said.
In addition to handling the augmented reality, UE4 is also handling a lot of information simultaneously.
“Each of these shots is an individual shot like you would have in Premiere or Avid, where you can cut backwards and forwards, and trim, and add the audio tracks,” Libreri said. “It’s all running just like you were doing normal visual effects photography, but inside a game engine.”
The Mill officially revealed the Blackbird on stage at GDC during Epic’s keynote. And Chevy also used that event to debut the final version of the race, which offers a wow factor when the vehicles enter a tunnel and go all TRON-like to showcase the real-time visual effects UE4 opens up.
“At every GDC we like to do some project that not only blows people away and inspires them, but shows that together with a customer we take some of the best people on the planet using our technology and make our engine better,” Libreri said. “We do something that people thought was impossible, so that’s why we went to this next level.”