Puppet master makes the most of motion capture
Motion capture technology may still be new enough to intimidate actors and animators (with both groups terrified such technology could eventually render them obsolete), but that hasn’t stopped another group of performers from becoming early adopters — and earning a special Engineering Emmy for their work.
For puppeteers, wearing a bodysuit covered in reflective markers isn’t such a stretch from how they already operate, creating performances in costumes that radically transform their physical appearance.
“Whereas a lot of animation companies use motion capture to capture precisely what an actor does, we’re somewhere in the middle,” explains Brian Henson, co-CEO of the Jim Henson Co. and son of the famed Muppets creator. “We’re always working through some kind of technical interface.”
What caught the TV Academy’s eye was a powerful set of tools, dubbed the Henson Digital Puppetry Studio, that allows the company’s pros to translate their performances into broadcast-quality 3-D animation in real time, resulting in shows such as “Sid the Science Kid.”
At Henson’s Hollywood-based HQ, tucked behind the Kermit statue, the team has transformed a soundstage into a 40-foot-by-60-foot motion capture performance space, or “volume.” It takes two puppeteers acting in sync to create a character’s “live performance”: one dons a custom exoskeleton and acts out body movements onstage, while the second controls the CG head and face from a separate booth on the sidelines. For any given project, as many as six such characters can interact at once, their every move tracked by 36 infrared cameras and played back in real time on one of six huge screens surrounding the stage.
Such immediate feedback is essential to puppeteers. In the primitive sock-puppet days, a pro might operate a character out of view above his head while gazing down at a monitor below. But it was only in the last few years that computer processing speeds caught up with the Henson Co.’s ambitions, giving it real-time control over CG characters.
By partnering with Advanced Micro Devices (AMD) and videocard manufacturer Nvidia, the Henson team was able to develop its Digital Puppetry Studio, which builds on decades of R&D at the company, dating back to experiments with radio-controlled motors and transmitters in the late ’70s. The team could then use the same interface pioneered for its animatronic creatures to steer the new CG characters.
The resulting system not only delivers real-time rendering but also accommodates virtual cinematography, allowing camera operators to sit in front of a videogame-style console and move about the virtual space, “shooting” the action as they would a three-camera sitcom. Since the Maya-rendered output is immediate, editors can get to work on the footage right away (though shows like “Sid” allow time for a final broadcast-quality pass using Mental Ray rendering software for effects such as hair dynamics and complex shaders).
“It’s a very efficient process,” says Henson, who is now considering ways to license the technology. “With ‘Sid the Science Kid,’ we were shooting and creating a real-time animated show every two days.”