A comparison of motion-capture footage featuring Andy Serkis and the character he plays in “War for the Planet of the Apes” shows how the actor is transformed into an ape. (Twentieth Century Fox / Weta Digital)

“War for the Planet of the Apes,” the latest installment of the blockbuster movie reboot, is all about revealing the humanity in Caesar and his legions of gene-altered apes – but it takes legions of wizards to make sure that humanity comes through.

Fortunately, there are wizards galore at Weta Digital, the special-effects studio behind film extravaganzas ranging from “The Lord of the Rings” and “The Hobbit” to “Avatar” and the upcoming “Valerian and the City of a Thousand Planets.”

Oscar-winning visual effects supervisor Dan Lemmon said “War for the Planet of the Apes,” opening today, set a new bar for his New Zealand-based team.

“The story was driven from the apes’ point of view, from Caesar’s point of view,” he told GeekWire. “So, in terms of sheer shot count, that meant that every single shot in the movie pretty much had an ape in it … Caesar, primarily. He was in no uncertain terms the star of the movie. Also, compared to the previous movies, he was going through even greater emotional turmoil.”

Reflecting that emotional turmoil was the job of Andy Serkis, who’s been widely recognized as the world’s greatest motion-capture actor since his portrayal of the computer-generated Gollum in “The Lord of the Rings.”

But making sure that filmgoers get a full sense of the emotions flaring on Serkis’ face is Weta’s job. Lemmon estimates that 800 artists, producers and managers worked on the movie, with another 200 employees providing support and information technology services.

“It’s an army,” he said. “It’s a whole lot of people.”

The basic formula for motion-capture animation was established back in the 1990s: As they played their roles in “War,” Serkis and other ape actors were outfitted with special suits and peppered with target spots that could be read from the film footage to generate computerized 3-D avatars.

The avatars are inserted into real-world scenes, and then the scenes are tweaked to produce a seamless blend of animated and live-action imagery.
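To give a rough sense of what that marker-to-avatar step involves, here is a deliberately simplified sketch, not Weta’s actual pipeline, and with made-up helper names and numbers: the tracked 3-D positions of markers on the actor’s body are converted, frame by frame, into joint rotations that drive a character rig.

```python
# Minimal illustration of turning tracked marker positions into a joint rotation.
# This is a toy sketch, not production motion-capture code.
import numpy as np

def joint_rotation_from_markers(parent_marker, child_marker, rest_direction):
    """Estimate one joint's rotation from two tracked markers.

    parent_marker, child_marker: captured 3-D positions (e.g. shoulder, elbow).
    rest_direction: the bone's direction in the character rig's rest pose.
    Returns a 3x3 rotation matrix aligning rest_direction with the captured bone.
    """
    bone = child_marker - parent_marker
    bone = bone / np.linalg.norm(bone)
    rest = rest_direction / np.linalg.norm(rest_direction)

    axis = np.cross(rest, bone)      # rotation axis between rest pose and captured bone
    s = np.linalg.norm(axis)         # sine of the rotation angle
    c = float(np.dot(rest, bone))    # cosine of the rotation angle
    if s < 1e-8:
        return np.eye(3)             # (near-)parallel: leave the joint at rest in this sketch
    axis = axis / s
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    # Rodrigues' rotation formula
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)

# One example frame: shoulder and elbow markers captured on set (invented values).
shoulder = np.array([0.0, 1.5, 0.0])
elbow = np.array([0.3, 1.2, 0.1])
R = joint_rotation_from_markers(shoulder, elbow, rest_direction=np.array([0.0, -1.0, 0.0]))
print(R)  # rotation applied to the rig's upper-arm joint for this frame
```

A production system solves for a whole skeleton at once and layers a separate facial-capture pass on top, but the principle is the same: captured positions go in, rig motion comes out.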

The effect becomes more seamless with each advance in computer processing speed and software sophistication. For “War,” Weta used an active-marker system that made it possible to do the motion-capture part on location, rather than having to rely upon a green-screen environment in a studio.

“That’s a layer of acquisition to film that’s never been done before,” Joe Letteri, Weta Digital’s senior visual effects supervisor and director, told GeekWire.

Weta also went all-in with an in-house software tool called Manuka, which renders computer-generated scenes with realistic lighting. Manuka had an initial tryout in “Dawn of the Planet of the Apes,” and also came into play in “The Jungle Book,” but the technology hit its stride with “War.”
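At its core, a physically based renderer like Manuka works out how much light reaches each surface point by averaging many random samples of the incoming illumination. The snippet below is a generic textbook illustration of that Monte Carlo idea for a single diffuse surface, not Manuka’s code; the environment and numbers are made up.

```python
# Toy Monte Carlo lighting estimate for one diffuse surface point.
import math
import random

def sample_hemisphere():
    """Cosine-weighted random direction above the surface (z is 'up')."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(u1)
    theta = 2.0 * math.pi * u2
    return (r * math.cos(theta), r * math.sin(theta), math.sqrt(1.0 - u1))

def incoming_light(direction):
    """Stand-in environment: a bright patch of sky overhead, dim light elsewhere."""
    return 5.0 if direction[2] > 0.8 else 0.5

def diffuse_radiance(albedo=0.6, samples=4096):
    """Monte Carlo estimate of the light a diffuse surface reflects toward the camera."""
    total = sum(incoming_light(sample_hemisphere()) for _ in range(samples))
    # With cosine-weighted sampling, the cosine term and the 1/pi of the
    # diffuse BRDF cancel, leaving albedo times the average incoming light.
    return albedo * total / samples

print(f"Estimated reflected light: {diffuse_radiance():.3f}")
```

A production renderer runs this kind of estimate for every pixel, tracing light through full scenes with far more sophisticated materials and light transport; the toy version only makes the underlying averaging visible.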

Putting realistic-looking fur on virtual apes is one of the big challenges. “There’s a limit at which the computer says, ‘I don’t want to do it anymore,’” Lemmon said.

Most of the computer-generated apes in “War” have about twice as many individual hairs as the apes in “Dawn.” Key characters have three times as many hairs. And each virtual horse in the new movie has about 15 million hairs, five times the number for the horses in “Dawn.”

Even the CGI forests are higher-fidelity: This time around, animators scattered virtual seeds across a computer-generated terrain, and then let the trees grow up in accordance with biological constraints. “They compete for water, they compete for light, and they do it in a biodynamic way,” Lemmon said.
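As a rough illustration of that seed-and-compete approach (a deliberately crude sketch, not Weta’s vegetation system; all the constants are invented), the snippet below scatters seeds across a flat terrain and lets each tree add height only while it is neither shaded by a taller neighbor nor out of local water.

```python
# Toy procedural forest: scatter seeds, then grow them under simple competition rules.
import math
import random

random.seed(42)

TERRAIN_SIZE = 100.0      # terrain is a 100 x 100 square (made-up units)
NUM_SEEDS = 300
SHADE_RADIUS = 4.0        # a taller tree within this distance stunts growth
WATER_PER_CELL = 30.0     # growth budget shared by all trees in a 10 x 10 cell

trees = [{"x": random.uniform(0.0, TERRAIN_SIZE),
          "y": random.uniform(0.0, TERRAIN_SIZE),
          "height": 0.1} for _ in range(NUM_SEEDS)]

water = {}                # remaining water per terrain cell

def cell(tree):
    return (int(tree["x"] // 10), int(tree["y"] // 10))

for step in range(50):
    for tree in trees:
        # Competition for light: no growth if a taller neighbor shades this tree.
        shaded = any(other is not tree
                     and other["height"] > tree["height"]
                     and math.hypot(other["x"] - tree["x"], other["y"] - tree["y"]) < SHADE_RADIUS
                     for other in trees)
        if shaded:
            continue
        # Competition for water: each cell has a finite growth budget.
        c = cell(tree)
        remaining = water.setdefault(c, WATER_PER_CELL)
        if remaining <= 0.0:
            continue
        growth = min(0.2, remaining)
        tree["height"] += growth
        water[c] = remaining - growth

survivors = [t for t in trees if t["height"] > 1.0]
print(f"{len(survivors)} of {NUM_SEEDS} seeds grew past 1 unit; "
      f"tallest tree is {max(t['height'] for t in trees):.1f} units")
```

Weta’s system, as described above, works over computer-generated terrain with far richer growth rules; the competitive thinning is what keeps a procedural forest from looking like rows of identical trees.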

But the biggest creative challenge is to capture the characters. That’s a feat that can’t simply be left up to a motion-capture rendering program.

To make sure the full impact of Serkis’ expressions will be seen on screen, artists and directors watched footage of his performance side-by-side with the computer-generated view of Caesar.

“Sometimes, you look at it and you say, ‘Well, Andy’s angry, and Caesar’s angry. But Andy’s also a little bit sad, and Caesar’s just angry. He doesn’t have that little bit of sad. What is it about Andy that’s making him sad, and how can we dial that sadness in as well without losing the anger?’” Lemmon said. “That process is really interesting, and it’s something that the director, Matt Reeves, gets involved in.”

As advanced as the process has become, Lemmon said there’s much more to be done at the intersection of technology and moviemaking. “I feel it’s like the tip of the iceberg,” he said.

But he insists that performers like Serkis will be part of the process for the foreseeable future. “I can’t overstate the importance of having actors driving the characters,” Lemmon said. The all-too-human interactions that occur during performances, involving actors as well as directors, provide the magic that Weta Digital’s wizards conjure with.

“When characters get to the point of simulating the behavioral side of things where people totally buy it, I think at that point, you’re talking about implications that have far more ramifications than filmmaking,” Lemmon said.

“At that point, you’re talking about sentient artificial intelligence, where a director can talk to a digital character directly and emotionally,” he added. “If that’s even possible, I think it’d be at least 50 to 100 years away.”
