How Treyarch created realistic human faces in Call of Duty: Black Ops II

As the cinematics lead for Call of Duty: Black Ops II, Adam Rosas has a tough job. Rosas (pictured below) had to use current-generation console technology to create convincing facial expressions and performances for the game’s characters, such as Frank Woods. The end result is a cast of highly realistic animated characters who hold believable face-to-face conversations in the title’s movie-like sequences between gameplay scenes.

In a presentation at the Santa Monica headquarters of Treyarch, the Activision Blizzard game studio that made Black Ops II, Rosas described the process that the team used to create the cinematics in the game. As a division of the largest video game publisher, Treyarch spares no expense in getting the cinematics right since Black Ops II is part of the best-selling series of all time.

It started with motion capture of actors such as James Burns, who plays the character Woods. That takes place in a big soundproof warehouse where actors perform in front of scores of super-high-resolution infrared cameras that pick up every movement. Each actor wears a special suit with reflective markers covering every part of the body, and a mesh of wired sensors on the face captures subtle facial-muscle movements. The system is similar to the one used for Call of Duty: Black Ops, but it is more refined in this second installment.

“We knew we had to bring the fidelity higher, something that takes us higher and lower,” Rosas said.

The actor has microphones on his head that pick up every breath, grunt, and yell that he makes.

“The goal is to pick up natural motion,” Rosas said in an interview with GamesBeat. “This is not an artist’s interpretation of what the performance should be. We work with actors to bring emotion to a scene. The markers on the actor’s face are positioned to capture muscle clusters like the mouth or the sides of the eyes. It adds so much more life to what we are trying to bring in terms of our storytelling abilities.”

The team studied an actress reading three versions of a script. In the scene, the actress is alternately angry, sad, or controlled. They looked at the results of the test and then created a new system to capture the emotion from the infrared camera data.

The performance capture is converted into digital form and then automatically turned into an animated 3D image that the animators can work with. (That process is automatic now, but it used to take a week and a half.) Rosas said the hardest thing to get right is the movement of a character’s mouth. The captured lip-sync is adequate, but fully reproducing the movement of a person’s lips is very hard.
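Pipelines like the one Rosas describes typically retarget captured marker motion onto a digital face rig, often by solving each frame for the blendshape weights that best reproduce the measured marker offsets. The sketch below is purely illustrative — Treyarch has not published its solver, and the marker counts, blendshapes, and least-squares approach here are assumptions for demonstration only.

```python
import numpy as np

def solve_blendshape_weights(B, d):
    """Solve B @ w ~= d for per-frame blendshape weights (least squares),
    then clamp to the conventional [0, 1] range.

    B: (num_marker_coords, num_blendshapes) -- each column holds one
       blendshape's marker displacements from the neutral pose.
    d: (num_marker_coords,) -- captured marker offsets for one frame.
    """
    w, _, _, _ = np.linalg.lstsq(B, d, rcond=None)
    return np.clip(w, 0.0, 1.0)

# Toy setup: 3 facial markers (x, y, z flattened to 9 values) and two
# hypothetical blendshapes, "smile" and "jaw_open".
smile = np.array([0.0, 0.5, 0.0, 0.3, 0.2, 0.0, -0.3, 0.2, 0.0])
jaw_open = np.array([0.0, -0.8, 0.1, 0.0, -0.4, 0.0, 0.0, -0.4, 0.0])
B = np.column_stack([smile, jaw_open])

# Synthetic "captured" frame: 60% smile plus 20% jaw open.
captured = 0.6 * smile + 0.2 * jaw_open
weights = solve_blendshape_weights(B, captured)
```

In a real pipeline the solver runs once per frame over thousands of frames, and the raw solve is exactly where the mouth artifacts Rosas mentions appear — which is why artists still correct the lips by hand.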

“When you look at the raw data, you see issues around the mouth,” Rosas said.

Rosas said that the team could have used reflective makeup like that employed by Mova‘s motion-capture team, but Mova’s technology works only with faces, and Treyarch needed to capture not only an actor’s facial expressions but the entire body movement as well. They also wanted to be able to take a performance and apply it to a character whose face was different from the actor’s.

“When you capture it all together, it looks more authentic,” Rosas said.

There are times when you’re speaking, for instance, when your lips become very thin. Other times, they become puffy or fat. The rest of the facial muscles move in tandem. Your nose may scrunch and your eyes may squint. The artists have to hand-draw the final parts of the lips to make sure they look realistic. That’s how the cinematics team can capture an emotional character scene up close.

Now the new scenes reveal subtle movements, such as a lip quivering, a nose sniffling, or a brow furrowing.

The same motion-capture equipment worked for animating horses. In one scene, the soldiers have to hop onto horses to go after the bad guys. To create the cinematic, Rosas’ team had to capture a performance with an actor riding a real horse.

“It was a very scary proposition,” Rosas said. “Not many game productions have captured a horse.”

But the people were still the toughest problem.

“In terms of emotional connections with the player, we feel we have it now,” Rosas said. “We’re very excited about where we are going with this.”
