
How DreamWorks created a visualization system to prototype scenes in 'Dragon 2'

Peter Upson shows off DreamWorks Animation's visualization system.

Image Credit: Dean Takahashi

One of the coolest new tools for making animated movies is a visualization system that shows you what an animated shot would look like in real time — before a film animator goes through the expensive process of creating a fully animated scene.

I had a chance to view one of these systems on a visit to the Redwood City, Calif., campus of DreamWorks Animation, which used the rig to make How to Train Your Dragon 2, the big summer movie that has generated more than $300 million in worldwide box office revenues. The system's real value is in giving a film director an idea of what a scene will look like before committing the resources to create it.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1506523,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,cloud,games,media,","session":"A"}']


Inside that campus is a camera-capture room that resembles a motion-capture studio, with ten state-of-the-art MX-F40 motion-capture cameras from Vicon. The cameras send out infrared light, which bounces off reflective markers on the people or objects in the room and back to the cameras. Tracking software on a Windows machine then computes the positions of all of the markers in the room and can insert those positions into an animated world from the film, said Pete Upson, final layout artist and one of the capture-room experts at DreamWorks Animation, in an interview with VentureBeat.
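To make that step concrete, here is a minimal, purely illustrative Python sketch of the core geometry such tracking software performs. It does not use Vicon's actual SDK, and the two-camera triangulation and room-to-scene mapping below are simplified stand-ins for a fully calibrated multi-camera pipeline.

```python
# Illustrative sketch only -- not Vicon's SDK or DreamWorks' pipeline.
# It shows the core step a marker tracker performs: recovering a marker's
# 3D position from its 2D image coordinates in two calibrated cameras,
# then dropping that position into the film's virtual scene.
import numpy as np

def triangulate_marker(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two cameras.

    P1, P2   : 3x4 camera projection matrices (from calibration)
    uv1, uv2 : the marker's 2D pixel coordinates in each camera
    Returns the marker's 3D position in capture-room coordinates.
    """
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]                     # homogeneous solution (null space of A)
    return X[:3] / X[3]            # homogeneous -> Euclidean coordinates

def room_to_scene(p_room, scale=1.0, offset=(0.0, 0.0, 0.0)):
    """Map a tracked point from capture-room space into the animated world.
    A real pipeline would apply a full calibrated transform; a uniform
    scale and offset stand in for it here."""
    return scale * np.asarray(p_room) + np.asarray(offset)
```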

The function of the room goes beyond capturing images of actors that artists can use as the foundation for 3D-animated characters. The cameras also capture the positions of people and props, which allowed director Dean DeBlois to show both himself and the film crew what a scene would really look like before it was animated.


“This gives the director the idea of blocking out the scenes in real-time,” Upson said. “We can work real-time with the director in the room and remove the back-and-forth process.”

I played around inside the room as a faux director. I had a camera on my shoulders that transferred the images in real time over a cable to a big computing rig. That machine transformed the images I was shooting into animated figures I could see on a display. Other visitors played characters from the film, like Astrid and Hiccup. They moved around, and I could see exactly where they were and how much of the screen they took up at any given time. If I didn’t like where someone was standing, I could ask them to move to another spot.
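For a rough idea of what such a rig does every frame, here is a hypothetical Python sketch: the tracked pose of the physical camera drives a virtual camera that renders the animated characters from the same viewpoint. The tracker and renderer objects and their methods are invented placeholders, not DreamWorks' actual tools; only the view-matrix math is standard.

```python
# Hypothetical sketch of the per-frame virtual-camera loop described above.
# The tracker and renderer objects are invented stand-ins, not DreamWorks'
# actual software; the look-at view matrix is the standard construction.
import numpy as np

def look_at_view_matrix(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a right-handed view matrix from a camera position and target."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)                      # forward axis
    s = np.cross(f, up)
    s /= np.linalg.norm(s)                      # right axis
    u = np.cross(s, f)                          # corrected up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye           # translate world to camera
    return view

def visualization_loop(tracker, renderer, scene):
    """Per-frame loop: tracked physical camera -> virtual camera -> display."""
    while tracker.running():
        eye, target = tracker.read_camera_pose()    # from the mocap markers
        view = look_at_view_matrix(eye, target)
        frame = renderer.render_scene(scene, view)  # low-cost proxies, real time
        renderer.display(frame)                     # the director sees the shot live
```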

“This is what it allows the artists to do,” said Katie Swanborg, director of technical communication and strategic alliances at DreamWorks Animation, in an interview during a tour of the animation studio. “In the moment, I can provide creative feedback.”

Above: Visitors hold props with infrared markers in DreamWorks Animation’s visualization room.

Image Credit: Dean Takahashi

Director James Cameron also created a visualization system for the film Avatar so he could see what his live-action actors would look like once they were converted into the ten-foot-tall, computer-animated warriors on screen.

In the past, rendering animation was so expensive and time-consuming that there was no way to see what something would look like ahead of time. An artist would create a scene and send it off for rendering. After a day or so, it would come back, and the artist could see it. If the director wanted changes, the work loop would start all over again. The process was like taking pictures with Kodak film. You would shoot a roll, send it off to the store, and only then find out if the pictures were good. If they weren't, you'd have to shoot them again.

“We are harnessing the power of the hardware and software to put filmmaking back into our artists’ hands,” Swanborg said.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1506523,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,cloud,games,media,","session":"A"}']

With DreamWorks Animation’s Apollo system, the visualization and animation are now integrated.

The storytellers still do a lot of their work ahead of time using animated storyboards, which are like frames from a comic book. For Dragon, that work took up about two of the five years it took to make the film. But the visualization tool saved a lot of time.

“Five years ago, the process seemed nuts,” Swanborg said. “We sit on top of a computational infrastructure here that can access a tremendous amount of compute power if we want to. Why can’t we use that to make our filmmaking much more like our consumer lives of taking pictures with our cell phones? We are using this to turn cinematography on its ear.”
