Isabelle Riva: How The Unity Game Engine Will Democratize Film

Unity Technologies has been known among developers as the maker of a “game engine” since the tool debuted in 2005. But these days, “creation engine” might be a better name for it.

That’s because experiences created with Unity are more often than not films, interactive movies, advertisements, or augmented reality and virtual reality content. The game engine has grown up beyond games, and it is being used to expand the definition of entertainment, according to Isabelle Riva, head of the Made with Unity developer program at the San Francisco-based company.

In advance of the Siggraph computer graphics show in Vancouver, Canada, I spoke with Riva. Disney Television Animation said last week that it is making Baymax Dreams, a trio of animated shorts set in the Big Hero 6 world, using the Unity engine. You can see various films embedded in this post that were all built using Unity. These short films have become showcases for how Unity's universally accessible tools will democratize not only games but also film, Riva said.

Here’s an edited transcript of our interview.

Above: Isabelle Riva is head of Made with Unity.

Image Credit: Unity

GamesBeat: I saw the four films that were attached. The Baymax one is the only Disney one. Is that right?

Isabelle Riva: Right. It’s a partnership with the Big Hero 6 TV series. Disney produced those shorts, the Baymax Dreams, with Unity’s support. Also, if you saw, Disney is releasing their first VR film, Cycles, which will be at Siggraph 2018. That project was made in Unity by Disney. Those were our most recent Disney partnerships.

GamesBeat: Unity-made films are catching some momentum, then?

Riva: Absolutely. These projects validate what people have been working toward for many years, which is having a real-time render engine as part of a mainstream animation pipeline — for all the benefits it brings. It’s definitely catching on now.

GamesBeat: What’s the common thread among the films and newer projects here, as far as how they’re using Unity?

Riva: The Big Hero 6 episodes were created differently from the ones that Neill Blomkamp did with ADAM. These were entirely made in Unity. We had the benefit of this amazing direct link to Autodesk, which makes Maya, the software used for keyframe animation. All of Disney's animation is keyframed in Maya. Apart from that, everything else is made in Unity. We have this connection between animation and engine going, which is really efficient.

We got rid of storyboards. We went straight from scripts into pre-vis. Once all the assets and modeling and texturing were done, once the characters were in there, the director was able to play and decide where the camera goes, what time of day it is. All those changes could be made in real time. It was very empowering as a pipeline for the storytellers.

The director would call for performances from the animators and say, “Hey, can you do one of these but with that goofy squeaking around on the top of a fence?” The animators could provide a performance, and it would appear directly in Unity. Then, the director could set their cameras up, almost as if they were right there on set. That’s how it evolved. It was kind of like game development, where you start rough with pre-vis and gray boxes, and the more you go, the more it becomes refined and polished, the closer you get to your target.

It was a very non-linear way to make an animated story. The scene was built up continually by all of the departments, including lighting and everything. It was like instant compositing. Normally, the director would be in an edit suite, making judgment calls on timing and camera positions. Now, they can change the character's color, the lights, everything in real time. That really shrank our team and sped up our production time. We basically got a new workflow into a steady state. Disney was very pleased.

GamesBeat: How often are film creators doing what you guys have demoed on stage in VR, walking around inside their creations and working from inside?

Riva: This one wasn’t VR. These were made for broadcast. We didn’t have to finish in a stereoscopic environment and test it like you would in a VR setting. We were producing what ended up being a high-res Quicktime movie. But there was a VR element to the early concept art. Our concept artist came from a VR background and, instead of drawing on paper, he painted in Quill. With that, we were able to transfer those files directly into the engine, which meant that we had 3D concept art from the beginning.

Unity is being used by Disney in a number of different ways. Not just for broadcast animation but, obviously, like you saw with Cycles, they’re using it for VR storytelling as well. They’re using it as part of big live-action movies. Certain films use virtual production tools when they’re filming green screen, and a lot of those tools are built on the Unity foundation.

GamesBeat: What advantage would you say Unity has in comparison to something like Unreal for this work? Unreal also pushes the cinematic quality of their engine. How do you compete in this area?

Riva: With respect to film — and in this specific case of episodic animation — our advantage is the direct link to Autodesk. Unity and Autodesk have a collaboration that allows for a greater link between their media and entertainment tools and ours. For example, we have access to the source code for their FBX file format, which makes for a much more streamlined process for moving an asset, such as the modeling and texturing of a character or a prop, between those tools and the engine. The artists are able to improve how they use 3ds Max and Maya thanks to the real-time power of Unity. I feel like that's a big difference, and it gives an advantage to filmmakers in this particular case.

GamesBeat: Is there any real difference in costs between Unity and Unreal? Or, at the very least, is it less expensive than traditional ways of doing this?

Riva: The traditional approach to episodic animation employs so many people because every person is rendering out in the traditional way. They don’t have instant access to their work. They have to — in some cases lighters and compositors have to set up their render for the day, go home, and come back to see the results. That requires a lot more labor and often requires a render farm. You have stacks and stacks of [graphics processing units] crunching data. That goes away with a real-time platform like Unity. Every station has the power to render instantly.

Everybody is actually working on the same scene. You have multiple artists working on the same beats, but they don't step on each other's work. They can see the context of other people's work when they're finishing their own. An animator will see the lighting almost finished in the scene and think, "Oh, the sun is right there. Let me squint this character's eyes as he passes through it." You get a much higher quality scene because they're working together.

You wouldn't get that in the traditional setting. You'd have to go through a stepped, waterfall kind of process before you got to that idea, and then you'd have to go back four steps to get it made. Working concurrently makes for a much more efficient process.

GamesBeat: That ability to work concurrently, I assume you could do that remotely as well? Has that been around for a while, or is that relatively recent?

Riva: Game development has been doing that for a long time because people are able to share a build and work in that same build. That technology is at play here. For animators and filmmakers, it’s newer. The team that Disney put together with our help was distributed from Montreal to Singapore, across many time zones, and they all worked together from their respective cities on the same project.

GamesBeat: I remember that was part of MaxPlay’s pitch from a few years ago, that you could work concurrently in the same file with somebody else.

Riva: Right. That’s definitely happening now but across the whole pipeline. It’s not just between two animators in Max. It’s happening between lighters and animators and editors and music and sound.

GamesBeat: Are you going to show a lot of stuff at Siggraph? What do you have lined up?

Riva: We have some great talks at Siggraph. It’s really exciting. Two of the three episodes are in the can and approved for broadcast, so we’re using those at Siggraph to demonstrate what we learned through the whole experience with Disney. One of the talks is almost an unplugged kind of demonstration, where the director is going to take a scene from one of the episodes in Unity, and they’re going to completely change the scene into another story using the same elements within minutes. That will show the audience, live, just how powerful the engine is.

We have another talk dedicated to tooling, which explains how we were able to do this concurrent, parallel work and roundtrip with Maya and Autodesk really well. We have a talk that focuses more on the lighting and effects we were able to achieve in-engine for the animated episodes. It looks like film quality, beyond broadcast. That’s really exciting to show as well because for animators, especially episodic producers who are making really high-volume animation, changing the color of a character’s costume on the fly is really useful when you want to have asset re-use from episode to episode. You can do that in Unity.
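To give a rough sense of what that kind of on-the-fly change looks like in practice, here is a minimal C# sketch using Unity's MaterialPropertyBlock API to recolor a costume without duplicating the shared asset. The component name, the renderer reference, and the "_Color" shader property are assumptions for illustration, not details of the Disney pipeline.

```csharp
// Minimal sketch: recolor a character's costume per episode by overriding a
// shader color on the renderer, leaving the shared costume asset untouched.
// The "_Color" property name is an assumption; a real project would target
// whatever properties its costume shaders actually expose.
using UnityEngine;

public class CostumeRecolor : MonoBehaviour
{
    public Renderer costumeRenderer;          // the costume mesh's renderer
    public Color episodeColor = Color.red;    // this episode's costume color

    void Start()
    {
        // MaterialPropertyBlock overrides values per renderer without creating
        // a new material instance, so the source asset can be reused as-is.
        var block = new MaterialPropertyBlock();
        costumeRenderer.GetPropertyBlock(block);
        block.SetColor("_Color", episodeColor);
        costumeRenderer.SetPropertyBlock(block);
    }
}
```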

GamesBeat: The Book of the Dead. I recall you showed that at GDC. Was there something new in the latest trailer?

Riva: The Book of the Dead is going to make an appearance at Real Time Live at Siggraph. It's being used by the team at MPC, the Moving Picture Company, who are showing their virtual production tools built on Unity. We also have ADAM 2, the Blomkamp one. That'll be at Real Time Live as well. We also got into the Electronic Theater animation festival with ADAM 2.

We have some other films, like Sonder, which is a gorgeous short film made by an animator at Pixar. We have a lot of episodes coming from France that were built with Unity. Of course, we all know Monsieur Carton. That will be on display at the booth.

GamesBeat: If you made a comparison to past years at Siggraph, what would you say is new? Is there a lot more presence for Unity films?

Riva: There’s absolutely more presence. We’re humbled to be participating at the table with so many brilliant engineers and artists who’ve been preaching the real-time gospel for years now. We feel we bring some contributions this year. We’ve been able to get a real-time workflow into a steady state for animated series with Disney. We want to share a lot of that knowledge with the community. Our relationship with Autodesk will benefit everyone, and we’re excited to celebrate that as well.

GamesBeat: Is some of this promise, then, that film is becoming democratized in the same way as games? How far along would you say that is?

Riva: I think the greatest leap in terms of democratizing animation and filmmaking rests on the shoulders of two features. One of them is Timeline, which is a sequencer that allows people to edit basically like in Final Cut but inside Unity, so it’s perfect for storytellers. The other is called Cinemachine. It’s an intuitive smart camera system that allows you to compose your shots and track your cameras in a cinematic way. It takes all the headaches out. It can track an object like a character across a complex choreography without you having to do hours of hand animation and camera programming.

Those two things have made it so much easier to tell stories in Unity. The rendering is obviously instant and real time, which makes it much faster. That’s the bedrock of democratizing animation. You can tell stories inside of Unity with tools that animators and CG artists understand. They don’t have to be a game developer.
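As a concrete illustration of the two features Riva describes, here is a minimal C# sketch, not taken from the Baymax production, that sets up a Cinemachine virtual camera to follow and frame a character and plays back a Timeline sequence through a PlayableDirector. The class name ShotSetup and the wiring in Start are illustrative assumptions; in practice both features are usually configured in the Unity Editor rather than in code.

```csharp
// Minimal sketch of the two features discussed above: a Cinemachine virtual
// camera that tracks a character, and a Timeline sequence played back through
// a PlayableDirector. Assumes the Cinemachine and Timeline packages are
// installed and a CinemachineBrain sits on the Main Camera.
using UnityEngine;
using UnityEngine.Playables;   // Timeline playback (PlayableDirector)
using Cinemachine;             // Cinemachine smart-camera package

public class ShotSetup : MonoBehaviour
{
    public Transform character;         // the character rig to track
    public PlayableDirector sequence;   // a Timeline asset assigned in the Inspector

    void Start()
    {
        // Create a virtual camera that follows and frames the character;
        // the CinemachineBrain on the Main Camera blends to it automatically.
        var vcam = new GameObject("TrackingVCam").AddComponent<CinemachineVirtualCamera>();
        vcam.Follow = character;   // move with the character
        vcam.LookAt = character;   // keep the character framed in the shot

        // Play the Timeline sequence, the "edit" that lives inside Unity.
        if (sequence != null)
        {
            sequence.Play();
        }
    }
}
```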

Simon Smith, the director of the three Baymax episodes, came from Dreamworks. He was the original head of layout on Antz and Shrek, and then, he directed Bee Movie and Penguins of Madagascar. He came from a very traditional way of making animated film. When we tried to introduce him to how we thought we could get this done, he was a bit skeptical because he comes from layout — the camera department, basically, the department that decides where the cameras go and how the story gets told based on the storyboard.

When we showed him Cinemachine and he started using it, he was completely bowled over by the power of it. He said he never wanted to go back to doing things the old way again. It was almost like god mode, being able to use the tools in the edit bay to affect all of the elements in the scene at the same time. Because Timeline sequences can be stacked and Cinemachine cameras are quick and smart, even if you make a change on the north side of the set, the south side stays coherent with all of its camera positions. He thought this was a revolution.

To have a really established director, a man in his 50s, adopt this new technology for telling stories, we felt that really validated this whole thing for us. It made us feel that this could go mainstream.

Above: Baymax shorts were made with Unity.

Image Credit: Unity

GamesBeat: Is film becoming a category you guys track and gather statistics around?

Riva: It is. It’s one of our new verticals, if you will. It’s a place where we feel our engine can do a lot of good and bring a lot of benefit to creators. We just want to educate people and support them in exploring these tools.

GamesBeat: What are some of the next milestones you’d like to see filmmakers reach?

Riva: It feels like the engine will become the tool of choice for episodic animation, and that will be really powerful. Of course, eventually, what we'd love to see is not only VR filmmakers using us but feature filmmakers as well. We're seeing that on the VFX side already. It would be fantastic to see us meet that bar for features in the future.

GamesBeat: I've been writing about the Academy Software Foundation, the effort by the Academy of Motion Picture Arts and Sciences and the Linux Foundation to pull together a common set of open-source software for film. It seems like that effort is all in the same cause, this democratization?

Riva: I can’t speak to that directly, but I definitely think that the future of film, like we said, is here. The vision is that, as John Riccitiello often says, the world is a better place with more creators in it. It feels like if we can put tools in the hands of more people that normally might not have them, then it’s all the better. A great story, a great animated story, can come from anywhere.

GamesBeat: How far are we from something like a Pixar feature, a two-hour animated film in theaters, made with Unity?

Riva: [Laughs] If it were up to me, I’d love to see it next week. I can’t say more about what’s in the works right now. It’s definitely coming, though. We’re excited to be part of some of those groundbreaking projects. We definitely want to see that in the future.

GamesBeat: Anything else you’d like to call out while we have the chance?

Riva: We’re well positioned to fuel the next generation of filmmakers. Our scriptable render pipeline allows for the kind of control filmmakers need. Our graphics aren’t even in question. The graphics are inspiring people who come from feature animation to produce with our engine. The performance is great. The creativity that comes from the tools and working together makes filmmaking more fun. We have something great here to contribute to the community, and we’re very excited to share some of that at Siggraph.

The Baymax shorts, actually, are going to be premiering on YouTube on September 15. There’s more information to come around that as well.
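For readers curious about the scriptable render pipeline Riva mentions, here is a minimal C# sketch of a custom pipeline that clears each camera's target and draws only the skybox. It is a toy example meant to show the degree of control the API hands to a project, not Unity's shipping High Definition or Universal pipeline, and the class names are illustrative.

```csharp
// Minimal sketch of a custom Scriptable Render Pipeline: the project decides
// exactly what each camera renders. This toy pipeline clears the target and
// draws only the skybox; a film-quality pipeline would add culling, lighting,
// and post-processing passes here.
using UnityEngine;
using UnityEngine.Rendering;

[CreateAssetMenu(menuName = "Rendering/Minimal Pipeline")]
public class MinimalPipelineAsset : RenderPipelineAsset
{
    protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
}

public class MinimalPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);

            // Clear the camera target, then draw the skybox and nothing else.
            var cmd = new CommandBuffer { name = "Clear" };
            cmd.ClearRenderTarget(true, true, Color.black);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            context.DrawSkybox(camera);
            context.Submit();
        }
    }
}
```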

This post by Dean Takahashi originally appeared on VentureBeat.
