HITS Spring: The Evolution of Virtual Production Smart Stages
Production technology over the last three years has fundamentally changed the way we work and collaborate with each other, according to industry experts. Cameras now talk to lights and render to the cloud through a complex network of platforms and workflows.
Smart stages, meanwhile, have become a reality as the Internet of Things (IoT) has finally entered the production process.
ARRI works with clients on stage builds and production implementation of the ARRI product line, including its cameras, lenses and lighting. The company is “just trying to expand client education overall [and] show that this technology, while it’s new and very exciting, [is] really getting us to the same end goals,” Cassidy Pearsall, applications engineer at ARRI, said during the Virtual Production session “Orchestrating Innovation on Smart Stages” at the Hollywood Innovation and Transformation Summit (HITS) on May 19.
The company is trying to figure out how to “get people comfortable with that and show them that it’s not so scary,” she noted.
Moderator John Canning, director of developer relations – Creators at Advanced Micro Devices (AMD), responded: “Take the fear out of it. I love it.”
Henrique Kobylko, virtual production supervisor at Fuse Technical Group, noted that his company built and engineered what he said was the industry’s first LED stage, pointing out that Fuse initially used LEDs for advanced lighting.
But the industry is now at an “inflection point,” said Canning, noting that it has used LEDs for lighting for several years.
Agreeing, Kobylko explained to attendees that the inflection point happened “when the tech got to the point that we could start rendering things in a game engine in real time to the quality that it’s cinematic, that it’s believable.”
And “all of these things came together with other technology to allow us to start having things in camera with camera tracking,” he pointed out, adding: “This first stage is when all these technologies merged and then it kind of like started paving the way for the virtual production evolution. So now we’re building a lot of other stages like that. But it was basically when we could start doing real-time rendering.”
The LED, he explained, had a “pixel pitch that was acceptable; then the rendering engine” made it possible for video created this way to no longer look like a game.
Additionally, he said: “We can have higher density meshes. We can have more lighting effects and more in-camera effects now [in] the wall and now we can add camera tracking to all of this, and that culminated into what we’re calling now the LED volumes. So those stages are being called volumes.”
Meanwhile, flexibility is something that productions want, Pearsall pointed out. A production’s budget is also a major issue, she said.
While this virtual production technology has come a long way over the past three years, she said, what remains important now is “understanding the limitations that you will have to impose on your stage because there are limitations” still.
She predicted there will continue to be limitations “until the technology is far more advanced than it is now.”
She told attendees: “You’ll just have to figure out how to deal with those, but you can build in flexibility if you understand those limitations of what your particular hardware and what your particular creative team is looking to accomplish.”
The Hollywood Innovation and Transformation Summit event was produced by MESA in association with the Hollywood IT Society (HITS), Media & Entertainment Data Center Alliance (MEDCA), presented by ICVR and sponsored by Genpact, MicroStrategy, Whip Media, Convergent Risks, Perforce, Richey May Technology Solutions, Signiant, Softtek, Bluescape, Databricks, KeyCode Media, Metal Toad, Shift, Zendesk, EIDR, Fortinet, Arch Platform Technologies and Amazon Studios.