The practitioners of the fine arts of visual effects and CGI have always had a Holy Grail: the creation of digital characters who seem so real that audiences will be fooled into believing they’re being played by real human beings.

But that goal has not yet been reached. People have an innate knowledge of how human faces and bodies move and react. Artists who fail to capture even the almost imperceptible qualities of a real person won’t fool anyone.

But progress is being made. Visual effects supervisor Richard Clegg of effects house the Moving Picture Co. — who created a digital Arnold Schwarzenegger fighting his real-world self in 2015’s “Terminator Genisys” — achieved something of a breakthrough with his work on the VFX character Rachel in last year’s “Blade Runner 2049.” She was animated to look like actress Sean Young as she appeared in the original “Blade Runner” in 1982. “We might spend all day changing a pixel value by 0.2%,” says Clegg. “A small change to a highlight in the eye makes a huge difference. That’s where you have to get it right.”

More and more, getting it right requires intuition as much as math. “In the past we would have been beating ourselves up over algorithms that aren’t strong enough, whereas now it’s all subjective decision-making about how we want a character’s face to perform,” says Ben Morris, creative director at Industrial Light & Magic’s London studio, who was instrumental in wrangling the digital characters in “Star Wars: The Last Jedi.”

Still, the art is far from perfect. Facial expressions, eye movements, the way muscles settle and countless other properties of the human body contain so much detail that it’s difficult for computer animation to re-create all of them realistically. “We’re just so used to looking at each other that we know when a digital representation is off,” says Anton Dawson, owner and CEO of Pixelgun Studio, which renders sports stars and celebrities in 3D for animation projects. “We scan, represent the data as dots, then put that back into [the picture] … but it’s still hard to animate a human performance because there’s so much nuance.”

Perhaps the biggest advance in animating characters since the start of the CGI age has been motion capture. As actors perform, their body movements and facial expressions are digitally recorded using telltale dots on their faces and bodysuits. That data is then combined with the on-screen character, and a seemingly living, breathing person — or chimpanzee, murderous robot or swamp monster — emerges.

Motion capture is so successful because it directly records and re-creates the way real people move, smile or furrow their brow, giving the final product all the qualities of authentic human movement.

But as Peter Busch, VP of business development for facial motion capture systems provider Faceware Technologies, points out, hand-tooled computer animation will always be necessary on top of technologies such as motion capture. “Just scanning someone doesn’t look exactly correct,” he says. “We get lifeless characters, and there’s a disconnect with audiences.”

Instead, motion capture is more about giving animators a performance as a basis from which to work. Morris says ILM wanted to maintain the performance Andy Serkis gave as “The Last Jedi’s” Supreme Leader Snoke but that there’s plenty of animation on top. “We don’t hit the magic button and out pops Snoke,” he says. “Performance capture is just a recording device.”

Nonetheless, the characteristics of the performance are sometimes so effective that they can affect the design of the character. For example, when ILM was assembling the data and footage of Serkis’ scenes, it realized the actor’s deep, resonant voice didn’t sound right coming from Snoke’s character design, which had carried over from “The Force Awakens.” “We broadened his shoulders, brought out his chest, changed his jawbone and completely restructured the anatomy of his throat,” Morris says.

Creativity aside, generating photoreal digital characters is also about costs. Think of it this way: If ILM had been given the money (and time) to make villainous Governor Tarkin 100% real, would it have impacted the film’s huge box office gross? (Actor Peter Cushing, who played Tarkin in 1977’s “Star Wars,” died in 1994 and had to be digitally re-created for 2016’s “Rogue One: A Star Wars Story.”) Dawson thinks there’s something of a “good enough” strategy, since the cost of creating digital characters dramatically increases the closer to realism one gets. “It’s that last 10% to 15% that costs the most money and takes the most time,” he says.

Clegg agrees budgets and deadlines are finite, but tricking the audience with a realistic digital human — like every decision on set and in post-production — comes down to the wishes of the director. “On ‘Blade Runner 2049’ there was a desire to do everything so well you couldn’t tell,” he says.

And on “The Last Jedi,” Morris says director Rian Johnson was very concerned with getting Snoke right, wondering whether “we should just put a guy in makeup and wing it.” But as renders of the shots were coming from post-production, Johnson and the animators had what Morris calls a eureka moment. “We showed Rian some close-ups, pushing our cameras right in, and Snoke became more and more believable,” he says. “Our confidence came through, and any concern Rian had just vaporized.”