HITS Spring: How Real-Time Workflows are Revolutionizing Production

Super Giant Robot Brothers, a fast-paced action-comedy show produced by Reel FX and coming soon to Netflix, uses virtual production and Epic Games’ Unreal Engine from initial story development to final pixel renders. It provides a perfect example of how real-time workflows are revolutionizing animation production.

Noting that his background is in feature film visual effects and visualization, Michael Neely, technical account manager, M&E, at Epic Games, said May 19 at the Hollywood Innovation and Transformation Summit (HITS) that it “sort of helps with the job – and the job is basically to bridge the gap between traditional offline rendering and real-time rendering.”

During the Virtual Production (VP) panel session “The Super Giant Future of Animation: How Real-Time Workflows are Revolutionizing Production,” moderator John Canning, director of developer relations – Creators at Advanced Micro Devices (AMD), asked why Reel FX chose to use the Unreal Engine for its new show.

Responding, Enrico Targetti, director of photography (DP) at Reel FX on its show Super Giant Robot Brothers, explained that, despite it being an animated show, “We approached the overall production like a live-action movie [and] did motion capture of the entire thing…. We basically shot footage like if it was a live-action movie and then all of that went through editorial.”

But the “final animation that people see on screen … was done by hand, by traditional animators of Reel FX,” he said, adding: “We did several takes – several variations – of each performance or camera movement that went into editorial and, once the edit was locked, the animators just started doing their magic.”

In the process, “leveraging the real-time capabilities … we were able to render an entire episode in a day,” he noted. Targetti pointed out that his background is in live action, not animation.
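
Neither panelist detailed Reel FX’s render setup, but for readers curious what rendering an episode as a batch job can look like, here is a minimal sketch that queues a shot through Unreal Engine’s Movie Render Queue via its editor Python scripting. The asset paths, job name and output directory are hypothetical placeholders, not Reel FX’s actual assets.

```python
import unreal

# Grab the Movie Render Queue subsystem from the running editor.
subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()
queue.delete_all_jobs()

# Queue one shot of the episode (asset paths are hypothetical).
job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.job_name = "Ep01_Shot010"
job.map = unreal.SoftObjectPath("/Game/Maps/RobotLair")
job.sequence = unreal.SoftObjectPath("/Game/Episodes/Ep01/Shot010_Seq")

# Decide where frames land and which passes get rendered.
config = job.get_configuration()
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_directory = unreal.DirectoryPath("D:/Renders/Ep01")
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_EXR)

# Render in-editor; a render farm would swap in a different executor.
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```

Scripting the queue this way is what turns a whole-episode render into an unattended batch operation rather than a hand-driven one.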

“The big learning” on this project has been the “usual one,” he said: “Just because you can do something doesn’t mean you have to do something.” Shooting coverage “becomes very, very cheap … because once you have your performance there, you can just go to town and do all the takes that you want because it’s just you and the director there” and not a cast of actors and others, he noted.

“So we just played a lot” and that “turns into an immense amount of footage and data for an editorial pipeline, which in animation is not used to receiving that kind of data,” he said.
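
The panel did not describe how that flood of takes gets tracked, but some form of take manifest is the usual answer. The sketch below is purely illustrative (the schema and field names are assumptions, not Reel FX’s pipeline): each captured variation is logged with enough metadata for editorial tools to sort, filter and trace it back to its Unreal sequence.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Take:
    """One captured performance or camera variation headed to editorial."""
    episode: str
    scene: str
    take_number: int
    camera: str            # which virtual-camera move was recorded
    performer: str         # mocap performer for this pass
    sequence_asset: str    # Unreal Level Sequence holding the capture
    circled: bool = False  # flagged by the director as a preferred take

def write_manifest(takes: list, path: str) -> None:
    """Dump takes as JSON so editorial tools can sort and filter them."""
    with open(path, "w") as f:
        json.dump([asdict(t) for t in takes], f, indent=2)

takes = [
    Take("Ep01", "Sc12", 1, "CamA_dolly", "perf_01",
         "/Game/Episodes/Ep01/Sc12_Take01", circled=True),
    Take("Ep01", "Sc12", 2, "CamB_handheld", "perf_01",
         "/Game/Episodes/Ep01/Sc12_Take02"),
]
write_manifest(takes, "Ep01_Sc12_takes.json")
```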

That “sounds like a producer’s nightmare,” Canning responded, adding it can be kind of “like giving the keys to the Ferrari to the 16-year-old and going, ‘Just have a great time.’”

But Neely said: “One of the things that’s part of my job is to sort of give guidance as soon as possible so that they don’t paint themselves into a corner doing creative things because obviously there’s a ton you can do with the engine.”

What is shot “depends on what it is that you’re doing [and] how you model, how you get your data in, how you track your data, because you do have a ton of data to manage,” Neely said, calling all those things “super important.”

The motion capture teams that were brought in spoke with Epic, explaining: “Here’s what we’re doing. Here’s the gear we’re using. Here’s how we’re doing it,” recalled Neely. “Giving them that guidance as soon as possible helped them,” he said, adding that, when they ran into issues they weren’t sure of, they asked, “Hey, we’re doing this, you know, is there something we’re doing wrong? Is there a button that we need to switch?”

There are also challenges in working with studios, Neely noted, explaining: “Every studio has their own pipeline – even the traditional visual effects pipeline. Every studio is different. So the way that we have to handle that is understanding what they want to do and then say, ‘OK, well, here are some pitfalls’ or ‘Here are some things you might want to think differently about in terms of output, in terms of animation, in terms of managing your data.’”

He compared it to cars, noting a Porsche is different from a Cadillac: “They’re both fairly decent cars but they work differently and you have to work on them differently.”

Canning asked Targetti what kind of cost and time savings he saw working on the new animated show. Targetti noted: “I’m not a producer. I am usually one of the people who spend the money on a production so I’m probably not the most qualified to answer that question.”

Targetti added that his company tackled this “new workflow and this new methodology” with the “main goal [being] to just do something different and do it in a different way,” while giving “more creative agency to the creatives.”

With this new system, “instead of just storyboarding for hours and hours and hours, you can just grab a camera” and experiment to show “what I’m thinking,” according to Targetti.

The new system requires only a small number of people and, “so that, to me, saves time but … in my position, if I have more time, I’m going to do more things,” he said. So “I cannot speak for the economical decision but I know that for the creative decision, it’s the way to go,” he added.

Meanwhile, the “transition to these technologies doesn’t suddenly erase the problems of what we have to do,” said Canning. “It just presents a new set of interesting problems,” he added, asking what “some of the new, interesting problems that we’re jumping into” are.

Neely replied: “The number one issue that I run into is mindset because we have to transition artists who are working in 30 years of development in terms of software, in terms of pipelines, in terms of methodologies. And we have to get them from the offline rendering mentality and those animation tools and those effects tools, and bring them into the real-time side and there are limitations. You’ve got to get into the GPU. You don’t have 24 hours to render one frame.”

Neely conceded: “It’s very challenging for all of us in the beginning because I’m leading them to the technical answers that they’re looking for. They’re showing me things that I need to know and there’s this dance until the production’s well underway. And then, after a while, it’s like, ‘we rendered, you know, episode 10 in 10 hours or whatever,’ and I’m just like, ‘this is happy news’ because then they can iterate – then they can get back into the creative process and the editorial side of things and make colors different – and they have time to do those sorts of things.”
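
The scale behind those render times is easy to make concrete. As a back-of-the-envelope sketch (the episode length, frame rate and per-frame times below are illustrative assumptions, not figures from the panel), seconds per frame rather than hours per frame is what turns an episode render into an overnight job:

```python
# Back-of-the-envelope render budget; all inputs are assumptions.
FPS = 24
EPISODE_MINUTES = 22
frames = EPISODE_MINUTES * 60 * FPS   # 31,680 frames per episode

# Real-time-class rendering at roughly 1 second per final frame:
realtime_hours = frames / 3600        # ~8.8 hours, in the ballpark of
print(f"{realtime_hours:.1f} h")      # 'episode 10 in 10 hours'

# Offline rendering at even 1 hour per frame, before farm parallelism:
offline_years = frames / 24 / 365     # frame-hours -> years
print(f"{offline_years:.1f} years")   # ~3.6 years of machine time
```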

To view the entire presentation, click here.

The Hollywood Innovation and Transformation Summit event was produced by MESA in association with the Hollywood IT Society (HITS), Media & Entertainment Data Center Alliance (MEDCA), presented by ICVR and sponsored by Genpact, MicroStrategy, Whip Media, Convergent Risks, Perforce, Richey May Technology Solutions, Signiant, Softtek, Bluescape, Databricks, KeyCode Media, Metal Toad, Shift, Zendesk, EIDR, Fortinet, Arch Platform Technologies and Amazon Studios.