Today’s visual effects are so high-tech that it seems quaint that Oscar nominees are chosen through a process called the Bake-Off. This live event, staged at the Academy’s Beverly Hills theater, is as far from a cooking contest as you can get. The craftspeople behind 10 shortlisted films take the stage to demonstrate how they achieved their effects, until a flashing light on a timer signals that their time is up.
This year, teams were allowed to show making-of clips — useful for understanding effects that look so much like photography even the pros can’t tell. According to ILM’s Rob Bredow, nominated for supervising vfx on “Solo: A Star Wars Story,” “it’s fun when you can pull back the curtain and show the magic tricks.”
This is Bredow’s first experience as an Oscar nominee, but his opinion is shared by Framestore supervisor Chris Lawrence, a winner for “Gravity” and a nominee for “The Martian.” Now nommed for “Christopher Robin,” Lawrence believes the making-of presentation showed why a film with Winnie the Pooh and a gaggle of animated toys should make the cut in a field filled with sci-fi sagas. “It’s quite a complex film,” says Lawrence. “We were able to create a platform where a stuffed bear could give an incredibly emotional performance. People may not believe he’s real, but they certainly forget he’s not real.”
The quality of performances remains crucial, even in an era when many film environments are animated and then peopled with real actors. This year, Marvel’s “Avengers: Infinity War” takes that into the stratosphere, delivering both familiar superheroes and Thanos — a huge digital antihero based on a performance by Josh Brolin. Using motion capture and artificial intelligence software, the filmmakers turned Brolin into a virtuoso villain. As Marvel vfx supervisor Dan DeLeeuw says, “Thanos was uncannily Josh, who played him as a character that thought he was the hero.”
Steven Spielberg’s “Ready Player One” also used motion capture and game technology to transform actors into avatars that roamed virtual worlds. It was another tour-de-force effort by ILM, earning supervisor Roger Guyett his fifth Oscar nomination.
Not for the first time, ILM has two contenders in the Oscar derby. While “Solo” supervisor Bredow could employ all the CG tools in ILM’s armory, he reveled in the opportunities to weave practical elements with simulated ones. “On Pinewood Studios’ 007 stage we set off indoor explosions 30 feet tall, with firetrucks standing by.” Such live-action photography provided tangible foundations for digital set extensions and effects added later.
Today’s mix-and-match blend of effects is made possible by modern software that seamlessly merges vfx into final images. Practical elements also allow actors to react to real things, rather than just performing against greenscreens. Even in the digital universe of “Avengers,” the crew dug a warm-water river for the actors to move through. DeLeeuw, a prior nominee for “Captain America: The Winter Soldier,” recalls, “If they walked on it, we built it.”
The final nominee, which thoroughly embodies this strategy, is “First Man,” recreating the moon landing of astronaut Neil Armstrong. The approach of Paul Lambert, vfx supervisor at effects shop DNeg, might have seemed retro just a few years ago, but it was well suited for this re-creation of ’60s-era events.
Lambert, an Oscar winner for “Blade Runner 2049,” filmed Ryan Gosling inside space vehicles that looked so authentic they fooled NASA veterans. The crew beamed images, including actual archival footage, toward Gosling’s cabin so he could react. The actor also got a palpable sense of what Armstrong had endured when Lambert’s crew physically shook the vehicle instead of adding shake in post. “Ryan spent days being shook,” says Lambert. “By the end, he was a slightly different color!”
(Pictured above: Josh Brolin, in mo-cap garb, playing Thanos in “Avengers: Infinity War”)