From bungee cords to artificial intelligence, the strategies used by this year’s Visual Effects Society nominees ran the gamut. The awards will take place Feb. 5.
In the space saga “First Man,” Ryan Gosling could mimic astronaut Neil Armstrong bouncing on the lunar surface via simple bungees that were erased in post. At the other end of the spectrum, the CG Thanos character dominated “Avengers: Infinity War,” in which performance capture and A.I. software turned Josh Brolin into a giant villain with serious acting chops.
As VES co-founder Craig Barron, an Oscar winner for “The Curious Case of Benjamin Button,” says: “What’s happening now is that the science of A.I. enables a character to ‘learn’ mannerisms that are specific to that character. I think we’ll see this more and more, and it will be undetectable. We’re not going back to recreate Marilyn Monroe. We’re going into the future to create characters that we couldn’t do any other way.”
After watching Thanos battle a phalanx of Marvel superheroes, VES members awarded six nominations to “Avengers: Infinity War,” more than to any other film. Marvel vfx supervisor Dan DeLeeuw enlisted a slew of powerhouses, including Weta Digital, ILM, Digital Domain and Disney Research.
Lest anyone think performance capture might be a passing fad, Disney’s Medusa technology was used for Thanos and three other nominees: “Ready Player One,” “Solo: A Star Wars Story” and “Ant-Man and the Wasp.” Medusa’s impact garnered Sci-Tech honors from the Academy.
DeLeeuw, an Oscar nominee for “Captain America: The Winter Soldier,” wrangled shots from multiple studios. “We played to the strengths of each company. Weta pushed out big environments, and ILM was great at simulations. Marvel’s master database knew where everything was. If that had gone down we’d have been totally screwed!”
It’s increasingly common for vfx films to be made by companies scattered across time zones. “At any point in a day, there was someone working on this film,” DeLeeuw says.
Barron notes: “Assets created at one company are used within environments created at another company. To participate in today’s global effects community, you must be able to share data.”
VES voters also recognized the growing trend of “virtual” productions, in which filmmakers use game engines to manipulate CG characters inside invented worlds. Five nominations went to Steven Spielberg’s “Ready Player One,” for its performance-captured avatars as well as virtual camerawork, which transported audiences through mind-bending locales.
“This technology is becoming more accessible,” Barron says. “It won’t be limited to tentpole movies.”
While ILM’s Roger Guyett supervised Spielberg’s virtual worlds, the studio was also creating “Solo: A Star Wars Story,” an origin tale for Han Solo. “We wanted this movie to feel like it had been shot in the 1970s, before the original ‘Star Wars,’” vfx supervisor Rob Bredow says. “But we had to work at a level that would satisfy modern audiences.” Bredow’s credits include the CG/live-action hybrid “Stuart Little,” and he loved using such practical techniques as rear projection – with a high-tech twist.
Bredow’s team put five projectors behind an LED rear-projection screen that wrapped around a full-size Millennium Falcon spaceship. ILM had rendered 20 minutes of high-quality space imagery that the actors could watch through their cockpit window.
“Plus, we had blaster fire and lightning flashes that we could cue to appear on top of those images,” he says. “It was like a stage production.”
For “First Man,” Paul Lambert, vfx supervisor at DNeg, fashioned a similar blend of practical and digital solutions. He wrapped an LED rear-projection screen around a facsimile of a NASA space capsule instead of shooting against green screen and adding backgrounds later. “An LED screen gives you something more believable,” says Lambert, an Oscar winner for “Blade Runner 2049.” “We had background footage running continuously, which reflected interactive lighting inside the astronauts’ cabin. That gave us little things like reflections on dials, which added to the believability.”
A mix of CG and archival NASA footage was projected onto the LED screen, and the result was an integrated combination that director Damien Chazelle could monitor on the spot. “He could watch it like a play,” Lambert says.
Combining traditional techniques with digital elements was a hallmark of a decidedly different VES nominee: Disney’s “Christopher Robin.” Those effects were designed to make viewers smile at the antics of Winnie the Pooh and his pals. Framestore vfx supervisor Chris Lawrence initially brought puppets on set to give the actors some tangible toys to relate to. Although the puppets were later replaced by computer-animated versions, they were photographed as living animals would be.
“We were chasing ‘perfect imperfection,’” says Lawrence, an Oscar winner for “Gravity.”
Lawrence says real and digital elements will increasingly share scenes in diverse hybrid combinations. “One of the common threads expressed at VES was how to help film crews feel like they’re just making a ‘normal’ movie. For vfx practitioners, that’s an interesting trend to observe.”