ILM Head Talks AI, Deepfakes and 'Mandalorian' Visual Effects

Rob Bredow, one of THR's Top Hollywood Innovators, also discusses how the AI-based image-doctoring technology behind deepfakes will change Hollywood and why his team's virtual production developments could be "an important part of us being able to get back to work sooner."

As the head of visual effects powerhouse Industrial Light & Magic, Rob Bredow is usually greeted each day by a welcoming statue of Yoda perched atop a fountain at the studio's headquarters in San Francisco's Presidio. But like everyone at the company, Bredow has been working from home since March 17 after a nail-biting race to set up the VFX studio's staff to work remotely during the novel coronavirus outbreak.

"We heard an early report at 11 a.m. [on March 16] that there might be an announcement coming out that day, and it was effective at midnight. So we had to get everybody out and home, and that was quite an amazing sprint," he recalls. "The fact that the very next day we were up and running with everyone working remotely was a miraculous thing."

Bredow, 46, admits that some ILM projects are on hiatus but notes that others are staying in production despite the lockdown. He won't offer specifics, but ILM's slate includes such high-profile titles as Disney's Jungle Cruise, Universal's Jurassic World: Dominion and season two of Jon Favreau's Disney+ series The Mandalorian, which employs cutting-edge virtual production technology to seamlessly meld CG imagery with live-action production techniques.

A Southern California native who grew up in La Habra, Bredow says he envisioned working in the film industry from a very young age: "My first job was at Knott's Berry Farm, where I dreamed of someday working in their video production unit while I was busy selling churros," he says. After getting his start in 1991 as an intern at the now-defunct VFX house VisionArt Design & Animation, Bredow worked his way up the industry ladder, eventually landing at Sony Pictures Imageworks, where he spent nearly 15 years before joining Lucasfilm in 2014 as vp new media and head of its advanced development group. He helped launch the company's immersive entertainment unit, ILMxLAB, in 2015, became Lucasfilm's chief technology officer a year later and was promoted to his current role in 2018, picking up a 2019 Oscar nomination as VFX supervisor on Solo: A Star Wars Story along the way. Bredow now oversees about 2,000 employees across ILM's five locations: San Francisco, London, Singapore, Sydney and Vancouver.

Bredow, who lives with his wife and two daughters in Marin County, talked to THR about the challenges of working remotely, the advantages of virtual production and how the AI-based image-doctoring technology behind deepfakes (President Trump recently tweeted one of Joe Biden) will change Hollywood.

Can you give us an update on how you're expanding your virtual production services?

We're fortunate to get to partner with Jon Favreau on The Mandalorian. He had a huge vision of how to take some of his experience in virtual production on films like The Lion King and use that in live action. … We'd been building up those tools incrementally over the years, but The Mandalorian was really the combination of everything coming together. We're very excited to actually have a [virtual production services] offering [dubbed StageCraft] that we're not only using on The Mandalorian but now making available for other shows in the industry.

How do you see virtual production being used in light of the pandemic?

We think tools like StageCraft will be an important part of us being able to get back to work sooner. We can digitally build a big percentage of sets and can be doing that while we're in this work-from-home situation [so that] the first day we can safely return to shooting, we'll be able to pick up as quickly as possible. We're continuing to collaborate with art departments for upcoming shows and with production designers and directors of photography to build their digital environments. The first couple of weeks, we were making sure everybody was 100 percent productive. Now we're in this phase of, "What does it look like to get back up and running and shooting these films, and how few people can we have on set to keep everyone safe?"

What's the setup that allows employees to work from home like?

We're all in shelter-in-place studios. We're using ILM's on-premises technology to give us the ability to remote things to each individual user. And you still have all the power of working at ILM with all of our thousands of computers and all the same security protocols, but we're just allowing people to get a window into all of that from their homes, where they can work safely in this situation. The security implications are such that we're making sure that no data is actually leaving the [ILM] site, which is unusual, but the way we rolled it out allowed us to keep that part secure. We've involved our clients in each of these details as we've gone along. It's not quite business as usual, but it's amazing how much it is business as usual.
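
The architecture he describes, where only pixels leave the facility while the actual files stay on ILM's machines, resembles classic remote-display streaming. A toy Python sketch of that general pattern (not ILM's actual system; the host, port and Pillow-based screen capture are all assumptions):

```python
# A minimal "pixels out, data stays in" streamer: the server grabs the
# on-premises screen, JPEG-compresses it and sends length-prefixed frames.
# The client only ever receives images; no source files cross the wire.
import io
import socket
import struct

from PIL import ImageGrab  # Pillow screen capture (hypothetical frame source)

def stream_screen(host: str = "0.0.0.0", port: int = 9099) -> None:
    with socket.create_server((host, port)) as server:
        conn, _addr = server.accept()
        with conn:
            while True:
                frame = ImageGrab.grab()                 # capture on-prem display
                buf = io.BytesIO()
                frame.save(buf, format="JPEG", quality=70)
                payload = buf.getvalue()
                # 4-byte big-endian length header, then the JPEG bytes.
                conn.sendall(struct.pack(">I", len(payload)) + payload)

if __name__ == "__main__":
    stream_screen()
```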

ILM has been working on digital humans as well as new de-aging techniques that were used in The Irishman. What's next in this space?

Our goals are really to continue to maintain the highest-quality digital performances, and there's always more work we can do on that front while simultaneously making the actor's and the director's experience as natural and unencumbered as possible.

Where does the technique used for The Irishman go from here? A smaller rig?

Yeah, that's exactly right. More flexibility on the rigging, meaning more ease of use on set, continued improvement to that. That goes for everything from how much the camera weighs to its physical dimensions. But we also want to be getting even higher-quality data from the set so that we can represent the actor's performance in even greater fidelity. And then on the digital side, there's just constant innovation in all the little details that make a digital human look as believable as possible. Everything from the way the skin moves, as it interpolates between the different shapes, to the subtle details of how we continue to shade that skin. Every few months we have new breakthroughs in those areas.
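
The "interpolating between the different shapes" he mentions is, in generic terms, blendshape (morph-target) animation: a neutral face mesh is pushed toward sculpted target expressions by per-shape weights. A minimal NumPy sketch of that widely used idea (not ILM's facial rig; every name and dimension here is an assumption):

```python
# Blendshape interpolation: result = neutral + sum_i w_i * (target_i - neutral)
import numpy as np

def blend_shapes(neutral: np.ndarray,
                 targets: list[np.ndarray],
                 weights: list[float]) -> np.ndarray:
    """Blend a neutral mesh toward sculpted target shapes.

    neutral: (V, 3) rest-pose vertex positions.
    targets: list of (V, 3) sculpted expression meshes.
    weights: per-target activations, e.g. driven by captured performance data.
    """
    result = neutral.astype(float).copy()
    for target, w in zip(targets, weights):
        # Add each target's offset from the neutral pose, scaled by its weight.
        result += w * (target - neutral)
    return result

# Toy usage: a 4-vertex "mesh" with one smile target at half strength.
neutral = np.zeros((4, 3))
smile = neutral + np.array([0.0, 0.01, 0.0])  # every vertex raised slightly
frame = blend_shapes(neutral, [smile], [0.5])
```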

How do you see the potential impact of artificial intelligence on Hollywood?

The popularization of AI and the maturing of the technologies used to create this kind of software have really changed things in the past two years. We've all seen examples of deepfakes and other things online that illustrate the kinds of results that can be achieved without using all the individual steps of a traditional visual effects pipeline, but that still give you most of the result. Now, when you study a deepfake, even the good ones, it's not quite photorealistic enough for us to put straight into a film, but it definitely is pointing to what's going to happen in the future. And it will actually change the way we create visual effects for the big screen, for streaming and for other high-quality forms of entertainment.
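
For readers who want the mechanics: classic face-swap deepfakes train one shared encoder with a separate decoder per identity, then "swap" by decoding one person's frames with the other's decoder. A minimal PyTorch sketch of that publicly documented technique (a generic illustration, not ILM's tooling; all names, sizes and dimensions are assumptions):

```python
import torch
import torch.nn as nn

class FaceSwapAutoencoder(nn.Module):
    """Shared encoder, one decoder per identity (the classic deepfake setup)."""

    def __init__(self, latent_dim: int = 256):
        super().__init__()
        # Shared encoder: 64x64 RGB face crop -> latent code.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
        # Two decoders reading the same latent space, one per person.
        self.decoder_a = self._make_decoder(latent_dim)
        self.decoder_b = self._make_decoder(latent_dim)

    @staticmethod
    def _make_decoder(latent_dim: int) -> nn.Module:
        return nn.Sequential(
            nn.Linear(latent_dim, 128 * 8 * 8),
            nn.Unflatten(1, (128, 8, 8)),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, frame: torch.Tensor, identity: str) -> torch.Tensor:
        latent = self.encoder(frame)
        decoder = self.decoder_a if identity == "a" else self.decoder_b
        return decoder(latent)

# Training reconstructs each person with their own decoder; the swap happens
# at inference time by crossing identities.
model = FaceSwapAutoencoder()
frame_of_a = torch.rand(1, 3, 64, 64)       # a face crop of person A
swapped = model(frame_of_a, identity="b")   # re-rendered as person B
```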

You also currently serve as governing board chair of the Academy Software Foundation. What are some of its most potentially impactful initiatives?

Open Shading Language was just recently adopted into the Academy Software Foundation, and that is one of the very foundational tools in our industry. It’s a programming language for describing how objects should tell us what color they are. So every rendering system in the world has a shading language in it or leverages a different shading language. And Open Shading Language is designed to be an open source solution that multiple rendering systems can leverage. So, the great advantage for a visual effects company is, rather than having to learn multiple shading languages just to be able to control what your pictures look like, if you’re using Open Shading Language you can use the same algorithms to control the colors of all your objects and the textures of all your objects in all of your different systems.
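
To make that portability argument concrete, here is a toy Python analogy of the write-once idea (illustrative only; this is not OSL syntax or any real renderer's API, and every name below is hypothetical): one shading routine that two otherwise unrelated "renderers" both evaluate.

```python
from dataclasses import dataclass

Vec3 = tuple[float, float, float]

@dataclass
class ShadeGlobals:
    """Per-point inputs a renderer hands to a shader (surface normal, light)."""
    normal: Vec3
    light_dir: Vec3

def dot(a: Vec3, b: Vec3) -> float:
    return sum(x * y for x, y in zip(a, b))

def lambert_shader(sg: ShadeGlobals, base_color: Vec3 = (0.8, 0.2, 0.2)) -> Vec3:
    """The shared algorithm that tells us what color the object is."""
    intensity = max(0.0, dot(sg.normal, sg.light_dir))
    return tuple(c * intensity for c in base_color)

# Two fake rendering systems with different internals, both evaluating the
# very same shader -- the cross-renderer reuse OSL is designed to provide.
def scanline_renderer(shader) -> Vec3:
    sg = ShadeGlobals(normal=(0.0, 0.0, 1.0), light_dir=(0.0, 0.0, 1.0))
    return shader(sg)

def path_tracer(shader) -> Vec3:
    sg = ShadeGlobals(normal=(0.0, 1.0, 0.0), light_dir=(0.0, 0.7071, 0.7071))
    return shader(sg)

print(scanline_renderer(lambert_shader))  # one shader ...
print(path_tracer(lambert_shader))        # ... usable from both systems
```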

I think this project is going to attract even more people to want to contribute to and enhance Open Shading Language, which should give us even more capabilities in the future to be able to make pictures that are more descriptive or more easily achieved than we can today.

Interview edited for length and clarity.

A version of this story first appeared in the May 6 issue of The Hollywood Reporter magazine.
