The strongest contenders in the visual-effects awards race have made stunning use of new technologies, enabling next-level artistry that until recently was impossible: modeling digital lions after the actors who voice them, re-imagining superheroes and villains as a CG fusion of human and animation, and de-aging actors to play characters at different stages of life.
Visual-effects supervisor Rob Legato says he and his team on category front-runner “The Lion King” were influenced by footage of real lions and the actors who voiced them. From there they worked with director Jon Favreau to find the animals’ final look.
“The idea is to make a playground for the instinct to take over,” says Legato. “It’s like jazz, you’re all playing together — the actors are giving their performances and the VFX team is taking that performance and trying to create something new. The net result is better movies.”
“This has been something we wanted in VFX, a goal we had, for a very long time,” says Bill Westenhofer, visual-effects supervisor on “Gemini Man,” whose work on the film allowed Will Smith to act opposite a younger version of himself. “You’re trying to stay out of the ‘uncanny valley,’ where the human or animal you make looks fake somehow, and make something that looks right to the eye, which is so hard because there are so many things a face and eyes do when we talk to each other.”
But it’s not just about what happens in post. Visual-effects artists are also working to help actors through the process.
“Martin [Scorsese] said he didn’t think the actors would want to wear headsets and tracking devices,” says Pablo Helman, visual-effects supervisor on “The Irishman.” “That was something we started talking about, I think, four years ago, so we knew that would make things more complicated for us, but then the actor can just give their performance, which we knew would be very freeing for them.”
Visual-effects supervisors have also been an invaluable part of creating the kinds of stylized performances that make comic-book-inspired stories such as “Avengers: Endgame” all the more spectacular.
“With machine learning, you’re teaching the computer to know what an actor’s face is going to do next, adjusting it as you go so it seems as real as possible and working with the actor to make something new, like we did with Thanos,” says Dan DeLeeuw, visual-effects supervisor on the Marvel Studios blockbuster. “I think that’s something we always wanted to do with VFX.”
Eric Saindon, visual-effects supervisor on “Alita: Battle Angel,” and director Robert Rodriguez set out to take the performances in a specific aesthetic direction, influenced by both actors and manga-style animation. “The eyes of Alita break her away a bit from being a human character, but this is the first humanoid character that needed to hold up for the entire movie,” Saindon says. “We’ve had experience with characters like Gollum, but I think we’ve reached a point where we can really capture a performance like the one from Rosa Salazar and make Alita a living character.”