
Making realistic female characters, one motion-captured body at a time

Check out this character's eyes in Until Dawn. You have to help eight teens survive the night.

Image Credit: Sony

The stereotypes of women in gaming have been with us for a long time. They’re often depicted as princesses who need to be rescued or overly sexualized objects. But Marla Rausch, the chief executive of motion-capture business Animation Vertigo in Irvine, Calif., has witnessed a change in recent years.

Her company helps create the ultra-realistic bodies and faces in games such as Beyond: Two Souls and Until Dawn. It doesn’t do the motion capture itself, but it takes the data and, using a big team of technicians in the Philippines, cleans it up so that it can be used in state-of-the-art video games. This behind-the-scenes tech is one of the engines of modern video games, and the field is only beginning to become more diverse.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1817340,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']

Rausch has been in the business for 15 years, and she is sometimes still greeted by strangers as if she were a marketing person, not the CEO of her company. She believes she is doing her part in making the industry more hospitable for women. But she is not one of the well-known women in the game business.

Her technology enables game makers to create female game characters who are larger than life, yet faithful to the way that humans actually look and move. I interviewed her recently about her role in gaming. Here’s an edited transcript of our conversation.

Above: Marla Rausch, CEO of Animation Vertigo.

Image Credit: Animation Vertigo

GamesBeat: Can you start by talking about how you got into the industry?

Marla Rausch: When I started, there weren’t a ton of schools out there teaching 3D animation. It was a very new thing. It tied into motion capture. Everyone learned by being involved and being a part of it.

My husband was a part of it, which is why I got involved. He was setting up a studio in the Philippines, and then he went to the U.S. and set up a studio. He was doing motion capture work. I was about six or seven months pregnant, waiting for him to leave, and as I sat there waiting I leaned over and said, “What are you doing, anyway?” That’s where it began for me. He taught me how.

I started doing freelance work for Spectrum Studios and Sony when they needed freelancers during crunch time. I’d work my full-time job and then go over there to work.

GamesBeat: Which part of the business was that?

Rausch: I was doing motion capture cleanup work, which is tracking and making sure that the data captured on stage is going to be workable, that it looks good, that it can go to the animators without any problems. The biggest issue in motion capture is, for the longest time, people would say, “Oh, this is cheaper and easier. You can do walk cycles much faster than on a key frame.” But people need to understand that it’s only faster and easier if you do it right the first time. That’s where things sometimes go off track.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1817340,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']

Say the stage itself isn’t calibrated well. Something shook the stage and all of a sudden the data isn’t clean enough. People let that go. Then you’d have people like me saying, “Why on earth is he just standing there with everything blinking on and off?” That just goes on down the line. If I have trouble with it, the animator’s going to have trouble with it, and ultimately it just gets to be a bigger and bigger problem as it goes through the pipeline. I started working in 2000 or 2001, somewhere in there.

GamesBeat: What was the quality of motion capture at that time? Was it used for the whole body or just certain parts?

Rausch: When we started doing motion capture, it was interesting, because I saw the development of the technology, both hardware and software. The markers got smaller and smaller as the years went by. The cameras themselves were able to pick up the reflections more clearly.

It used to be that if you had two or three people in a volume, you were pushing the data quality. You’d sometimes have a harder time getting all the data in the machine. The occlusion level was quite high. Nowadays, two or three people on a stage is as easy as one person on the stage back then. It’s relatively easy.

[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1817340,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']

With optical data and optical systems, anything that’s occluded is going to be a problem. If you have two people rolling around on the ground, then or now, you’ll still have trouble making sure you get all the motion in your scene. But nowadays we’ve worked with as many as 19 people inside the volume, and we got good quality from all of them. We got very little occlusion, because the hardware is able to handle it.
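As a rough illustration of how that occlusion level might be quantified (not how any particular studio does it), this sketch reports what fraction of frames each marker went missing in a take. The marker names and toy data are invented for the example.

```python
import numpy as np

def occlusion_report(take, marker_names):
    """Summarize how occluded a capture take is, marker by marker.

    take: (frames, markers, 3) array of positions; occluded samples are NaN.
    Returns marker name -> fraction of frames that were occluded, one rough
    way to judge whether a crowded volume produced usable data before cleanup.
    """
    missing = np.isnan(take).any(axis=2)   # (frames, markers) boolean
    rates = missing.mean(axis=0)           # fraction of occluded frames per marker
    return dict(zip(marker_names, rates.tolist()))

# Example: a 3-marker, 100-frame take where one marker drops out for 20 frames.
take = np.zeros((100, 3, 3))
take[30:50, 1] = np.nan
print(occlusion_report(take, ["head", "l_hand", "r_hand"]))
# {'head': 0.0, 'l_hand': 0.2, 'r_hand': 0.0}
```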

Years ago it was mostly just capturing the body. Facial capture wasn’t where it is today. You’re talking about tiny markers on the face and trying to get a camera to identify them without making them turn into one gigantic blob. These days, body cleanup for motion capture — there’s an acceptable standard. We know exactly what goes into it. We know when it’s good enough. We know when there’s nothing more you can do. But we’re not quite there yet with the face.
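To illustrate the “gigantic blob” problem Rausch describes with facial markers, here is a small, hypothetical check that flags marker pairs drifting closer than an assumed camera resolution. The marker names and the 3 mm threshold are made up for the sketch.

```python
import numpy as np

def find_merge_risks(frame_positions, labels, min_separation=3.0):
    """Flag pairs of face markers that drift closer than the camera can resolve.

    frame_positions: (markers, 3) positions for one frame, in millimeters.
    labels: marker names in the same order.
    min_separation: below this distance two reflections tend to be seen as one
    blob, so downstream labeling becomes unreliable.
    """
    risky = []
    for i in range(len(labels)):
        for j in range(i + 1, len(labels)):
            dist = np.linalg.norm(frame_positions[i] - frame_positions[j])
            if dist < min_separation:
                risky.append((labels[i], labels[j], dist))
    return risky

# Example: two lip markers nearly touching during a pucker.
labels = ["lip_upper_l", "lip_lower_l", "brow_l"]
frame = np.array([[10.0, 0.0, 0.0], [11.5, 0.5, 0.0], [40.0, 30.0, 5.0]])
print(find_merge_risks(frame, labels))  # flags the two lip markers (~1.58 mm apart)
```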

Above: Ellen Page’s character in Beyond: Two Souls.

Image Credit: Quantic Dream

With the face we’re still finding out what pipelines and systems would be best utilized, especially with the games coming out now. You have things like Heavy Rain or Beyond, where it’s about facial expressions and the emotions on each character’s face that creates the impact on the audience. Today we’re doing with the face what we were doing with the body a decade ago, figuring out the best practices, the ways we can cut costs, the ways we can exceed quality expectations, and the ways we can shorten the amount of time it takes so we can release our games on time.

GamesBeat: Were you a serious gamer at the time?

[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":1817340,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']

Rausch: I embarrass myself when my husband and I talk about this. Back then, the game I played was Rollercoaster Tycoon. I was about two weeks into that game. I was getting really good at improving parks and having enough people in there to meet goals. After two weeks, I realized, “I’ll never get that time back, and I didn’t actually build a roller coaster.” Then I thought maybe I should focus on my work more. But I do enjoy games. It’s one of the ways I relax.

GamesBeat: You didn’t need mocap for that game.

Rausch: Nope, it did not. But at the time, too, when you’re talking about mocap, you’re talking about war games and wrestling and stuff like that. These are games I liked to watch, but I was never very good at them. I react the way I would in real life — run away when there’s somebody shooting, that kind of thing.

GamesBeat: It sounds like a pretty technical job. Was it mostly technical work, or would you say that it wasn’t something you needed a lot of technical training to learn?

[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":1817340,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']

Rausch: It is pretty technical. When Animation Vertigo started and we were looking for people to work for us, we needed to make sure they were technical people. People who weren’t necessarily looking to be creative or artistic in their work.

More often than not, what you’re dealing with is dots on the screen an

Above: Ellen Page in mocap outfit.

Image Credit: Devindra Hardawar/VentureBeat

d making sure you can recognize what those dots make. You need to make sure the entire scene is taken in. You see the two actors talking or running or shooting each other, and you should be able to make sure that comes out. There’s very little artistic or creative eye needed there. You just need to make sure it’s faithful to what was captured on stage, what the director wanted.

We’ve progressed, though. When Animation Vertigo started we were working on motion capture cleanup, getting the data from the stage and cleaning up the occluded data. About seven years ago, we started moving into what you call retargeting. In motion capture, once you have all your data cleaned up, the data from the markers goes to an actor, which now has the rotational information. You start with translation and you go to rotation, which means the data goes into a standard gray actor of whatever size. From that size, then you put it into a character. That process is called retargeting.

[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":1817340,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']

I try to explain this to my mom sometimes, because it helps me to explain it in a way that’s more interesting and easier to understand. Basically, there isn’t going to be an oversized human out there. But say we capture a regular human, hopefully a bigger, taller, more muscular one, and then have to turn that regular-sized human’s performance into a seven- or eight-foot ogre. You’re going to see that the motions of the regular person don’t quite match what the big ogre would do. Or the other way around: a regular-sized human doesn’t match the movements of a dwarf.

When you do retargeting, you’re trying to make sure that the motion captured on stage is correctly interpreted into the character. At that point, you want to have people with an artistic eye, with an eye for animation. You want to see an ogre that walks and moves naturally, not stiff or robotically. You can tell when the animation isn’t working when you look at the gray version. You can’t quite put a finger on it, but there’s something wrong with how it moves. So while we started with purely technical work initially, we’ve moved on to work that needs a technical person with that artistic eye, someone who can tell when the movement looks natural.

GamesBeat: What was it like being a woman in this part of the business? Were you the only one, or were there others?

Rausch: It’s a tough one. When I first started, there weren’t a lot of women in that side, in the technical side, or in the decision-making side. Most of the people I interacted with, who made decisions on the teams, were men. I’ve been fortunate to have a very good experience. There’s been a lot said in the media about how the industry has been hard for a lot of women, but I’ve been fortunate to deal with people who understand that not only is this a fun industry, it’s a serious business.

It’s unfortunately common that when I do meet someone for the first time, their initial expectation or thought process is that I’m doing sales or marketing for Animation Vertigo. But we’ve moved into a different place these days. More women are executives, people making direct decisions. That hasn’t quite reached motion capture yet, though.

A story I tell people is about the 10th anniversary party for Animation Vertigo. We decided to throw a celebration during GDC. We invited clients, folks we knew in the industry who’d worked with us, software people, hardware people, basically everyone who had a hand in helping Animation Vertigo become the company it is today. As we were going down the guest list, my assistant looked at me and said, “Um, are we inviting any women?” That struck me. It hadn’t occurred to me that everyone we’d directly worked with was a man.

Above: Until Dawn

Image Credit: Sony

GamesBeat: Is this something you’ve tried to work against, finding a more diverse group of people for you to work with?

Rausch: I’m doing my part. I’ve been involved in mentoring and UC Irvine for women in computer science and computer gaming, working with girls in middle school and high school. It’s one way I’m able to bring women into the industry at a young age.

It’s an experience I’ve had personally with my daughter, when she was a sixth-grader. She’s sitting in front of a computer, holding her iPad, and she says, “I’m just not good at technology.” And I look at her and say, “You’re working at a computer and you’ve got an iPad. I’m not sure just what technology you’re talking about.”

I realized that’s where it begins. They’re surrounded by boys who play games and love games and maybe even program or make games. At some point or another girls decide, well, that means they’re better and we’re not. Most games are aimed at boys anyway, and so they lose interest. My way of giving back is mentoring, working with women who want to be in the industry, who have a lot of questions and insecurities about what the industry is like. They’ve heard all about Gamergate and all these horror stories, but what’s it really like?

Above: Call of Duty: Advanced Warfare

Image Credit: Activision

GamesBeat: It’s an interesting dilemma. If you don’t point out some of the inequity in the industry, it just continues, but at the same time that scares people.

Rausch: Very true. It’s hard to explain to someone who’s looking from the outside. There’s a reality to it, but there’s another side as well, which is that there are a lot of decent people out there who want a diverse group of people working in the industry. It gives us different perspectives and points of view about games.

I hear people complain that games are always skewed toward men. I hear that a lot from girls in high school when we do career days. I’ve said, “Well, maybe you need to be in the industry. Maybe you need to pursue computer gaming, computer science, so you can use what you want to do and create a game that every woman can play.” You can’t expect people to do it for you. Why not do it yourself? That’s where I try to focus. We can focus on all the negative things that happen in various industries and be afraid of that, or we can create a different perspective and work toward that.

If you think about it, it’s difficult to change the current situation. We do hire more women in our group. Animation Vertigo is about 25 percent women on our team. I’d love to have more. There’s a balance when it comes to how women and men work, especially in this field. But unfortunately, I just don’t have a lot of women applying. It’s a dilemma. As much as I’d love to have more women around, women need to apply for these positions. They need to want to be in the industry and join the industry.

GamesBeat: You’ve watched the creation of more realistic people in video games. That’s happened quite dramatically over the last decade. Do you notice anything about how women characters have changed in games? Have they become more realistic? Have they become more common?

Rausch: I was once asked, “What do you think about how women are portrayed in video games?” I may not have the most popular way of thinking, but I think of it in the same way — I don’t expect men to look like He-Man, on steroids with giant swords and hammers and amazingly cut abs. It’s the same way. It’s a stereotype that’s been pushed to the max to become a completely fantastical creation.

Above: Marla Rausch, CEO of Animation Vertigo.

Image Credit: Animation Vertigo

There are more women heroes now that are still feminine, but not stereotypical. They’re not overly sexual. They’re more realistic. They’re maybe not as physically strong, but they’re mentally adept. They can create and plan in order to reach their goals. That’s great. As games develop, we’re seeing IP released out there that isn’t just fantasy. We’re seeing stories that are more realistically human, where you’re trying to solve a mystery or get to some other end goal. What matters is that we have more content that features different kinds of heroes and heroines and makes people look at gaming through different eyes, not just fighting in war or fighting dragons or whatever.

Gaming, especially with the technology we have now — we have a ton of things open for everyone to try. We can expand the market. There can be something for everyone.

GamesBeat: The character of Elena in the Uncharted games was an interesting departure, I thought. She’s attractive, but realistic. She’s defined as much by her intelligence and wit as by her looks. It’s interesting to see how characters like that come about in motion capture. You can replicate those aspects of a person, or distort a person in ways that are more real or less real.

Rausch: You see that in games like Beyond, with Ellen Page. I thought she looked amazingly Ellen Page-ey. It’s brilliantly done, great scanning work. We worked on that one. You start with things that look more like 3D animation, about 10 years ago, and come forward to today, where you can say, “That’s this guy or that guy. That’s Willem Dafoe. That’s Ellen Page.” They look real.

There’s also an opposite side of that, too — “Wow, that’s an amazingly huge ogre.” And it all happens through motion capture.

GamesBeat: How did you guys on the tech side get to where you are, working on these big marquee games?

Rausch: We’re fortunate to work with various clients who have various end goals. For Quantic Dream, they were big on directorship, on the artistic, on making sure that people felt the emotion of each character. I thought that was quite a unique experience, working with them.

When we work with other clients, it’s usually more like, “Okay, there will be five people here and a lot of interaction, so we need to make sure we get all this motion and action.” Heavy Rain was the first time where I sat down with the client and we discussed — they wanted us to know what the emotions of the story were, what the suspense was about. We talked less about motion and more about emotion, how that should translate in the data. Their director was very involved in the shooting. We needed to make sure we captured that and brought it back to them. It was pretty unique.

GamesBeat: If you look at the current consoles, the Xbox One and PlayStation 4, do you think we’ve hit a point where we cross the uncanny valley?

We’ve been touching on the uncanny valley a few times over the years, and it really boils down to how the artistic team can present it — finding a way to present it such that people can embrace it and not feel uneasy, if that makes sense.

GamesBeat: It’s always a moving target. One thing may look fake in one game, and in the next game something else comes into the foreground. I played Until Dawn this summer. A lot of the face work looks great. Sometimes, when they open their mouths, you see something that looks a little inhuman, though. Or when you look at the whole body’s movement, the way they move through the environment, that becomes the part that doesn’t look real.

Rausch: Yep. It’s very interesting. When we were working on Until Dawn, they would show me some scenes that they had each time we visited, and I was amazed how I was brought into the game. During the suspenseful moments, I actually jumped. I don’t like telling that to people, because it’s embarrassing, but it worked. And they’d say, “Hey, that’s good. I’m glad you were surprised by it.”

But it’s true. The face, especially, is where we get the most criticism. It’s something that you and I would look at and understand — there are some nuances that we can pick up that are very subtle, subtle enough that they sometimes aren’t conveyed by an actor in a game. It could be the engine, or how it was pushed through the rendering, or whatever. But somehow, sometimes, the face just doesn’t quite look right. Like you said, it can look great, and then there are times where the mouth just seems oddly shaped, for lack of a better term. “Can my mouth do that?”

That’s what they work on. That’s what technology is always trying to improve, trying to keep pushing so we can find the solution to getting those nuances right. The question is, like you said, do you cross the line and have that feeling where it’s too real?

Above: Marla Rausch has been in motion-capture for 15 years.

Image Credit: Animation Vertigo

GamesBeat: How much progress do you think can be made now, either by artists or people like your company or the vendors that are helping you? What parts have to progress in order to make graphics more realistic?

Rausch: It’s all working together. Pipelines work depending on what the outcome might be. Sometimes you create your own pipeline that will be able to establish the outcome you want. Sometimes people might try to create a basis from this game or that engine and put pieces together that may not necessarily fit so well.

It’s not anybody’s fault — technology’s fault, the developer’s fault, the production company’s fault. It’s trying to produce something that the market will want, when they expect it, as fast as they can. That’s all added into the equation. We need to get something out on this date. What can we do? At that point you look at what options you have to make the deadline.

It’s finding that sweet spot, where the preproduction is done well to where you know the pipelines you’re using will produce the product you’re hoping for. You won’t have to fight the engine. You won’t have to fight the character rigs. You won’t have to fight the facial rigs. You’ll produce something everybody will be happy with. These days there are so many steps, so many pieces that follow one after the other, and you have to put them together to produce something amazing. You have facial stuff, body stuff, the modeling and rigging the model. All these things have to work together to produce something that looks real and believable.

GamesBeat: Is it going to take more hardware, a next generation of consoles, or just more learning about how to do humans right?

Rausch: What we have now are the best consoles we’ve ever had as far as showing amazing animation. It’s a matter of putting these pieces of technology together correctly so that they convey what you want. Technology is only as good as the people using it. If the people using it are still trying to find the best way to get the product they need, then things like body data just go through the process until you finally realize, “This is the best way to do it. This is how we can get the most realistic eyes,” or mouths, or emotions on the bottom half of the face, results that look realistic.

Sometimes I’ll look at animation and think, “It doesn’t look like they’re talking so much as it looks like they’re chewing.” It’s figuring out things like that, the sweet spot between technology, talent, and the skills that will use it all. Until the next console, or the next big technological jump. Then everyone will be doing the same learning all over again.

GamesBeat: Maybe we’ll get our perfect woman character by then.

Rausch: Exactly. She’ll be completely badass and smart and rule the world. [laughs]

GamesBeat: Do they want you to airbrush them at all?

Rausch: [laughs] That’s for the scanning people to handle.

GamesBeat: Do you see the kind of work you do in games crossing over and meeting movie production at some point? Is the technology basically the same?

Rausch: We’re seeing a lot more — especially when you look at cinematics in games. They’re basically little movies in the game world now. You can see that the animation and screenwriting and acting are being pushed further and further toward the film world. A lot of films use motion capture and what can be done with it, too. Characters like Smaug in The Hobbit or certain parts of Star Wars. They had that fantastic piece where one of the actresses had her face lit up with facial markers.

With their budgets, it’s easier – although I’d put quotes around that — to push technology more. They have more money and more time to use it and test it. We’ve been involved in a few projects, although unfortunately I can’t mention which ones. It’s pretty interesting, the difference in scope compared to gaming. But the technology and the resources we use are similar.

I’m excited for the future. There’s so much more going on now, when you talk about the potential for integrating what we do with motion capture and 3D animation into mobile gaming, given the kinds of technology that Unity and Unreal are able to push through and how they look. It’ll be interesting to find out what next-generation media can use and how motion capture can be a part of that.

Nowadays, though, we’re starting to see those show up again, and mobile games as well. That’s what the people making this technology want to see. As a gamer, as someone who watches and plays games, when you look at something on the iPad, you want to expect that level of quality and realism now. We want to make sure the market will be able to accept that.

We see so much coming up in the future that will be different and exciting. I’m looking forward to a time when I’ll be able to say, “Hey, we were involved in that, and isn’t it the most amazing thing you’ve ever seen?” I can’t wait.