Thesps needed to make effects effective
Acting and visual effects sometimes seem hopelessly incompatible.
Acting is an organic, spontaneous craft, while vfx uses technology and thrives on meticulous planning. Acting depends on human beings and their reaction to their environment and each other. Vfx work sometimes strips that environment bare, forcing thesps to react to a bluescreen, a marker on a stick — or to empty air.
But the best vfx pros know actors and effects need each other.
“What makes the visual effects we do work is the actors themselves,” says Stu Maschwitz, co-founder of the Orphanage. “The connection we make at the lizard-brain level between the real people on the screen and the impossible things that people like myself create is still what’s fun about movies.”
So, given that actors are the sine qua non for a movie that touches an audience, how does vfx technology help actors and how does it hinder them?
“They get in your way and make you work harder and in an un-artistic way when it becomes mathematical, and it’s about standing here and doing this and looking there,” says Lena Headey, best known as Queen Gorgo in “300” and soon to be seen as Sarah Connor in the Fox network’s Terminator spinoff “The Sarah Connor Chronicles.”
She says working with virtual sets can be liberating, by contrast, “because you still have your freedom as an actor. But when it’s very precise work, it sort of breaks you down mentally.”
Some of the recent advances in visual effects attack that very problem.
Industrial Light & Magic developed the Imocap system for the “Pirates of the Caribbean” films because the vfx process on the first “Pirates” frustrated everyone.
On “Pirates 1,” explains Hal Hickel, animation supervisor on the last two “Pirates” pics, the fight scenes between actors and skeletal ghost pirates had to be shot twice, once with the actors and stuntmen performing their choreography together, then again with the actors miming alone. The first plate was used for reference, while the second was the “hero” plate for the film.
To get CG skeletons to move like the stuntmen, animators could either hand-copy the reference plate or bring the stuntmen in to shoot again, this time on a motion-capture stage.
“Neither of those options is great,” Hickel says. “So when we went into the new movies, what we wanted was to just shoot it once, with the actors on set, but to be able to extract their motion and apply it to the CG characters.”
Imocap simplified shooting, gave the director better takes and freed up the actors playing those CG characters, notably Bill Nighy as Davy Jones, from long hours in the makeup chair and from face-covering prosthetics.
Contrast that with Andrew Tiernan, who played the hunchback Ephialtes in “300.” Tiernan spent about 10 hours a day in the makeup chair getting painful foam prosthetics applied to his face and body.
Tiernan says, “I suppose when I do see things like Bill Nighy and Andy Serkis I think maybe they had a little easier time than me.”
Virtual sets are a different challenge. Bluescreen environments like those of “300” change movie acting into something much closer to basic stage acting, says Rodrigo Santoro, who played Persian king Xerxes in “300.”
“It was hard, because you do have to use a lot of imagination and concentration,” says Santoro.
“On the other hand it allows the actor to connect with the most important thing, in my opinion, which is the truth. I mean, what you have to say, what you are feeling and what happens between you and the other actor is the real exchange. It’s almost like working with a different part of your brain.”
Visual effects can also help an actor’s performance translate to something even less expressive than Davy Jones.
On “Ghost Rider,” Nicolas Cage worked closely with vfx supervisor Kevin Mack to give the Rider, whose face is a burning skull, a range of emotions even though the skull’s face never changes expression.
For plates where the Rider would be all-CG and Cage omitted from the shot, Mack would even shoot video reference footage of Cage performing, then apply Cage’s emotion to the Rider.
“My premise going in was a perceptual illusion that occurs through context,” Mack says. “The fire, which is moving all the time, became the expressive element. The speed of the fire, the character of the little flames, the way it lights the skull, the heat distortion, those things all combine to create a constantly moving skull face your brain can’t quite get a handle on.”
So the viewer’s mind “sees” an expression on the skull’s face where there isn’t one.
“You would swear that the skull looks mad or it looks happy or these various different expressions,” says Mack.
Lesson in teamwork
The dark side to all this, at least as far as actors are concerned, is that digital tools can be used to alter actors’ performances simply because the director didn’t like a creative choice.
That possibility has not gone unnoticed. The Screen Actors Guild has tried for years to get actors contractual control of such changes — to no avail. Major agencies are wrestling with how to get limitations on such alterations into stars’ deals.
Yet as Mack points out, such changes are arguably not so different from what has always happened to an actor’s performance in editing.
“It would be unusual that they would want to alter a performance digitally, but if that was the only option and the filmmakers think it would make the film better, then it’s just another tool at their disposal.”
And Santoro, who was speechless on seeing the final result in “300,” says his experience working with virtual sets made him a better actor.
“It’s a lesson about respect, about getting together,” he says. “In this case, the actors and the special effects guys are the same. We’re together to try to build something.”