Above: How to Train Your Dragon 2 (Image Credit: DreamWorks Animation)

VB: Are there any tools you dream of that you still don’t have, then?

Wallen: The old way, which is to do that in stages, means the animator isn’t really in control of the final process. But when you can compute on demand, essentially, you can embed those processes inside the animator’s workflow. The animator still sits at the end of the process. That’s enormous. You can really put your hands inside the final movie and start moving things around. There’s no reason you can’t. It’s just a question of, “Do you want to do that?” Do you want to do it now, or do you want to do it when Dean’s in the room? It’s a matter of judgment based on your outcomes.

For our artists, it makes a big difference now, because historically, with the legacy software, as powerful as it was, it was legacy. It was becoming very difficult to add new features to. What we’re seeing now is that even in the life span of one movie, Apollo changed radically in the hands of the artists. What we’re seeing over the next year or two — film, film, film — is that what you saw today is version one. We’re in the infancy.

That’s why we talk about a platform. Again, in the visual arts and in many other areas, software typically comes to users in installable packages. It’s about an isolated feature set, where you think about how to get a data set to work on. You do the work and do something with it. As you can see, hopefully, by revealing some of the underlying elements of Torch, we’re really talking about an image creation platform.


What that means is, you’re no longer talking about individual tools installed for individual people. You’re talking about, how does somebody fit within a much wider collaborative process? How can you use that larger resource to make that individual’s work that much more effective? Orders of magnitude more effective.

That’s a very different software paradigm. It’s reminiscent of the consumer paradigm that we’re all using now with apps on cell phones, but that paradigm used with significant compute for value, not commoditized compute where it’s bits of data moving between points. This is a whole design and manufacturing process. That’s why we think it has far wider implications.

Baker: The platform software has a unique advantage, too, in that it’s designed with the customer right there. The folks who are going to use it look over the engineers’ shoulders saying, “No, I want it this way.” That adds so much to the resulting products, and they’re continuously evolving because the customers are demanding high quality. They want to plug it right into their workflow and get to work.

VB: How has it pushed what you’ve been able to do?

Baker: It’s been fascinating for us. Lincoln talked about some of the data constraints and compute constraints they had run up against previously. We look at that as a wonderful opportunity. You guys want to do what? How many MIPS do you need? Let’s go.

Wallen: One of the key things [is that] Intel’s tools around the IA architecture are fantastic. We had one challenge, though. We were scheduling down to the microsecond. And so exposing that data through VTune and other instrumentation tools, it was like, “Okay, guys, this is what we need.”
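
For readers unfamiliar with the tooling: VTune can surface application-level timing through Intel’s Instrumentation and Tracing Technology (ITT) API, which lets an application annotate its own tasks so they appear on VTune’s timeline alongside the hardware data. Below is a minimal sketch of what that looks like in C++; the domain and task names are hypothetical, not DreamWorks code.

```cpp
// Minimal ITT annotation sketch (hypothetical names, not DreamWorks code).
// Build against Intel's ittnotify headers and library; VTune then shows
// each annotated task as a labeled region on its timeline.
#include <ittnotify.h>

static __itt_domain* domain =
    __itt_domain_create("Renderer");              // hypothetical domain
static __itt_string_handle* shade_task =
    __itt_string_handle_create("ShadeTile");      // hypothetical task name

void shade_tile(int tile_id) {
    __itt_task_begin(domain, __itt_null, __itt_null, shade_task);
    // ... per-tile shading work would go here ...
    __itt_task_end(domain);
}
```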

Above: How to Train Your Dragon 2 (Image Credit: DreamWorks Animation)

VB: From Intel’s perspective, are you learning things over time that apply to applications consumers care about? It seems like someday the web might be able to use a lot of this stuff. When we use web applications, we have lots of cores at our command too.

Baker: Yeah, I think so. You asked earlier about applicability to the games environment. Some of the stuff is very similar to how game environments work today. It’s just interactive, right? All this learning we’re developing with pioneer technologists like DreamWorks, we can apply that back to our products that will eventually show up in mainstream users’ hands. It may take a while, but that’s ideally the intent. We need to make better products that are more performance-oriented, more competitive, and cheaper. What we’re talking about in workstations here can end up in your cell phone 10 years from now.

VB: If something did come along and change everything, like virtual reality, could you adapt all of this?

Wallen: We don’t need to adapt it. The beauty here is that we are movie-makers. This is a movie studio. But we make a very particular type of movie, in the sense that we generate our images entirely. We have total control of the image, which means that when you talk about something like virtual reality, where you’re playing with depth and projection, that’s trivial. We do that all the time with stereo. This platform is incredibly well set up to generate images for consumption environments that have all of these characteristics.
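
To make the stereo point concrete: when you generate every image, each eye’s view is just a camera offset plus an asymmetric, off-axis projection, so depth can be retuned per shot or per output device. The sketch below shows the standard textbook construction; the parameter values (interocular distance, convergence distance, field of view) are made up, and this is not DreamWorks code.

```cpp
// Off-axis (asymmetric-frustum) stereo projection: a standard graphics
// technique, sketched with hypothetical parameter values.
#include <cmath>
#include <cstdio>

struct Frustum { double left, right, bottom, top, nearZ, farZ; };

// eye:  +1 for the right eye, -1 for the left eye
// iod:  interocular distance in scene units
// conv: distance to the zero-parallax (convergence) plane
// The camera itself is also translated laterally by the same eye
// offset; only the frustum bounds differ between the two eyes.
Frustum stereoFrustum(int eye, double fovyRad, double aspect,
                      double nearZ, double farZ,
                      double iod, double conv) {
    const double e   = eye * iod * 0.5;                      // lateral eye offset
    const double top = nearZ * std::tan(fovyRad * 0.5);
    const double W   = conv * std::tan(fovyRad * 0.5) * aspect; // half-width at convergence
    return { (-W - e) * nearZ / conv,                        // left
             ( W - e) * nearZ / conv,                        // right
             -top, top, nearZ, farZ };
}

int main() {
    // Hypothetical shot settings: ~40-degree vertical fov, cinema aspect.
    Frustum right = stereoFrustum(+1, 0.7, 2.39, 0.1, 1000.0, 0.065, 6.0);
    std::printf("right-eye frustum: l=%f r=%f\n", right.left, right.right);
}
```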

This is very different from live-action movies, where essentially you’re capturing data and that fixes its characteristics in your media. That’s one reason why a lot of live-action directors are moving toward the image-generation paradigm. It’s just that much more flexible. You can go much further.

We have in fact already done this. These technologies are new, emerging. They’re not really in the consumer’s hands yet. But we used the Oculus equipment and worked with that company, producing a dragon experience to promote the movie. It was used in New York and I think in Europe. We’ve done a couple of publicity stunts where our team, with the characters Dean’s team created, built something that can run up to five minutes. You control the whole thing, flying on a dragon.

Above: DreamWorks Animation (Image Credit: Dean Takahashi)

VB: You integrated the cloud. You integrated multi-core. What about GPU computing? Where are you with that kind of change?

Wallen: All of our machines have a GPU, all of our client machines, because they use the GPU to put images on screen. We do not make extensive use of the GPU architecture for compute. It’s fantastic compute, delivering a hell of a lot of FLOPS, but it comes back to the point I made earlier about the flexibility of the compute model. We’re running an enterprise whose data loads vary wildly from project to project, with a software platform that has to accommodate that type of variation in throughput. The flexibility of the compute model is critical.

Certainly to date, the GPU architecture has not delivered the sort of virtualization, allocation, and orchestration capabilities and instruction set flexibility that we need to do this type of scalable compute. That’s why Xeon Phi is so interesting to us, because it gives us the vectorized benefits that you often get from a GPU, but with exactly the same compute model. We can run software on a CPU. We can migrate it to Xeon Phi and back again. That’s all about a business choice: how much scalable compute for this particular process, at this particular time, for that particular artist.
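
As a concrete illustration of the “same compute model” point, here is a sketch assuming the offload extensions Intel’s compiler provided in the Xeon Phi (Knights Corner) era; it is not DreamWorks code. The same C++ loop runs on the host Xeon, or ships to a Xeon Phi card with a single pragma, so moving work back and forth is a deployment decision rather than a rewrite.

```cpp
// Sketch of Intel's heterogeneous offload model (Language Extensions
// for Offload, Xeon Phi era). With an Intel compiler and a coprocessor
// present, the pragma copies the buffers to the card and runs the block
// there; otherwise the identical loop simply runs on the host CPU.
#include <cstddef>

void scale(float* dst, const float* src, std::size_t n, float k) {
    #pragma offload target(mic) in(src : length(n)) out(dst : length(n))
    {
        #pragma omp parallel for
        for (std::size_t i = 0; i < n; ++i)
            dst[i] = k * src[i];   // the same code vectorizes on both targets
    }
}
```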

VB: How long has DreamWorks been on Linux?

Wallen: Our operating system is Red Hat Linux, that’s right. Part of the cloud platform involves the Red Hat MRG large-scale scheduling solution, which we pioneered with them, helping refine it and bring it into this sort of enterprise. Essentially, in the wave of aerospace moving off proprietary systems and onto commercial off-the-shelf, that was the end of Silicon Graphics and IRIX. The visual effects, advertising, and animation industries were all on SGI machines, all on IRIX. The question was, what are we going to move to? DreamWorks partnered with HP at that time in saying, “The right operating system is Linux.” We worked with HP to qualify and produce the first Linux workstations, which then became standard in the visual design and production industries and have remained so ever since. We’ve been on Linux since then, since IRIX.

VB: For the art style, there are a lot of cartoon faces on the humans. Do you ever want to try out more realistic human faces for any of your work, besides maybe this one? Or do you think it’s not possible?

Wallen: The issue isn’t actually the image. It’s the behavior. It’s the ability to direct performance down to that granularity. One hope I have is that with the sort of immediate craftability of the software, with Premo in particular, we’re able to approach that point from a different perspective. Certainly on the rendering side, we’re able to scan flesh. We’re able to reproduce that as a rendered process. But the behavior is the key.

VB: Do you want viewers to not be able to discern any computer-animated quality? Tangled had this combination of something more like hand-drawn versus computer-animated scenes. You could tell when it switched over. This part looks much more computerized than that part.

Wallen: One of the beauties of DreamWorks is that it’s one of the few, if not the only, scaled animation production facilities and businesses. What I mean by that is that the same business and the same production house makes multiple movies in any given year. Just look at the range of movies that we make, from a stylistic and creative point of view. We don’t have a constraint in terms of the style a particular story needs to be told in. We’re not only doing looks that border on the realistic. We’re doing very stylized movies like Mr. Peabody & Sherman and the upcoming Penguins movie.

The platform is the same, the tools are the same, but the artists have different visions. They can realize those visions as seamlessly as they can think of them.

VB: Were there any times when you felt like you needed something in addition to Apollo, or different from Apollo? Ubisoft is allowing their game developers in different studios to come up with different engines for creating games across the whole company. They could have half a dozen different engines, kind of like maybe having half a dozen different Apollos. It doesn’t seem very efficient.