Pixar’s animated films feature magical characters like Sully, the furry 1,000-pound beast from Monsters University. Building those computer-animated characters takes years of work, but studio engineers showed how they streamlined that process with the latest high-end graphics computers at Nvidia’s GPU Technology Conference in San Jose, Calif.
In a keynote talk on Wednesday, Pixar engineering lead Dirk Van Gelder and technical director Danny Nahmias showed some of the most complex scenes from the film Monsters University and how much more difficult they were to create than the scenes from Monsters, Inc., a dozen years earlier. The talk showed how animation is pushing the high end of the computing business in order to produce entertainment that is enjoyed by hundreds of millions of fans. It highlighted more technical, super-geek details than we covered in our series on the making of Monsters University last year.
“This is the very leading edge of complexity in computing,” said Greg Estes, vice president of marketing at graphics chip maker Nvidia and a longtime follower of technology and entertainment, in an interview. “The GPUs can do the lighting and rendering in real time in ways that were never before possible. They can play around with it more, do more iterations, and get it just right before they render it.”
With Monsters University, which debuted last year and generated worldwide revenues of $743 million, Pixar doubled its computing power and brought in a lot of graphics processing unit (GPU) servers to handle the workload of making and rendering the film. It had 270 people working for as much as four years on the movie, which was the first time Pixar ever made a prequel.
Pixar created its RenderMan rendering system in the 1980s and even made its own Pixar Image Computer hardware in 1987. It made four films, including its debut feature Toy Story, with Silicon Graphics computers. But since 2001, its last 10 films have been made with armies of Nvidia graphics chips, said Van Gelder.
I took a tour of Pixar’s headquarters last June. Inside the building is a data center full of humming servers — double the size that the company used in the past — that would be considered one of the top 25 supercomputers in the world. The 2,000 computers had more than 24,000 cores. The data center was like the beating heart behind the movie’s technology.
Even with all of that computing might, it still took 29 hours to render a single frame of Monsters University, supervising technical director Sanjay Bakshi told me last year.
All told, it took more than 100 million CPU hours to render the film in its final form. A single CPU doing the job alone would have taken about 10,000 years to finish. Even with all of the CPUs Pixar did have, rendering took a couple of years.
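Those figures check out roughly. A quick back-of-the-envelope calculation, assuming a "CPU hour" means one core busy for one hour:

```python
# Back-of-the-envelope check of the render figures above.
# Assumes a "CPU hour" means one core busy for one hour.

CPU_HOURS = 100_000_000      # total reported render time for the film
HOURS_PER_YEAR = 24 * 365    # 8,760 hours

# One CPU grinding away nonstop:
years_single_cpu = CPU_HOURS / HOURS_PER_YEAR
print(f"One CPU: about {years_single_cpu:,.0f} years")   # ~11,416 years

# The farm's 24,000-plus cores, assuming perfect parallelism:
CORES = 24_000
years_farm = years_single_cpu / CORES
print(f"{CORES:,} cores: about {years_farm:.2f} years")  # under half a year
```

The idealized figure comes out to under half a year of pure compute; the couple of years Pixar actually spent presumably reflects that the farm was shared across simulations, test renders, and revised shots rather than dedicated to final frames.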
All of this rendering takes place in a relatively small data center that is cooled by water from the San Francisco Bay, said Estes.
The reason the film took so much more computing power is that audience expectations have grown. Something that looked spectacular 12 years ago, like the fur on the monster Sully, doesn’t look so great today. In his final form, Sully had 5.5 million individual hairs in his fur, about five times as many as in the original film. In Monsters, Inc., it was impressive to create a single simulated garment with realistic, cloth-like behavior: the shirt on the character Boo. The new film had 127 simulated garments, and the hair and cloth simulator had to be re-coded from scratch.
The tools that the technologists have built for the artists and animators are aimed at making it easier to make creative decisions. One example is a storyboarding animation system that takes what once existed only on paper — comic-book-like storyboards — and turns it into a real-time visualization tool. Van Gelder said the artists and storytellers create scenes with animated stick figures to map out action that will be fleshed out later. The storyboarding tools run at interactive speeds because they leave out a lot of detail.
With storyboarding, the artists set up background objects, pose the characters, and slowly add details with each creative pass. The GPU-based workstations render the results in real time. These technologies save many hours of work, but they also allow the artists to do a lot more than they have ever done before, Estes said.
Once the storyboards are approved, the artists take another run at building a more complicated scene with real 3D models of the characters and objects. That is hard to do since Sully himself had so many hairs on him, each with four points of data.
But the number of hairs in this intermediate stage is just about 900,000, far fewer than in the final film. Van Gelder said the visualization tools get the artists a long way toward rendering Sully as a realistic character. He starts to look like a monster who weighs 1,000 pounds, with powerful arms that swing back and forth like an ape’s.
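A rough, hypothetical estimate gives a sense of the raw data involved. Assuming each of the four points per hair is a 3D position stored as three 32-bit floats (an assumption; the article does not say how points are stored):

```python
# Rough, hypothetical estimate of the raw geometry behind the fur.
# Assumption (not from the article): each of the four points per hair
# is a 3D position stored as three 32-bit floats (12 bytes).

BYTES_PER_POINT = 3 * 4   # x, y, z as float32
POINTS_PER_HAIR = 4       # per the article

def fur_megabytes(num_hairs: int) -> float:
    """Raw position data for a single pose of the fur, in megabytes."""
    return num_hairs * POINTS_PER_HAIR * BYTES_PER_POINT / 1e6

print(fur_megabytes(900_000))    # interactive proxy: 43.2 MB per pose
print(fur_megabytes(5_500_000))  # final-render fur: 264.0 MB per pose
```

Under those assumptions, the interactive proxy carries about 43 MB of position data per pose versus roughly 264 MB for the final fur, before counting velocities, hair widths, or shading attributes. The gap is what lets the proxy run at interactive rates.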
“When they push the button and hit render, they pretty much know it’s going to be close to what they want,” Estes said.
Van Gelder showed one scene that was hard to do because it had six deeply detailed characters on screen at once, covered in shadows and moving across the top of a building.
“The toughest scenes are the ones that require a lot of lights, reflections, and shadows to get the right mood,” Estes said. “That’s computationally very intensive because you’re calculating all of the light rays bouncing around.”
When it comes to turning the crude blueprints into an animated scene, Pixar’s crew uses a proprietary animation system dubbed Presto, which was first built for the film Brave and reused for Monsters University. Presto takes extremely complex character models, like the main character Sully, and brings them to life. Artists can manipulate the models in real time to see how to make them move, convey emotion, and tell a story. By clicking on objects, they can make arms and legs move just as they want them to in the final movie.
“They don’t have to wait for a full day to see the results,” Estes said.
As for the lighting, Pixar also described its “global illumination” system, which is based on physical light sources in a scene. Artists can use an interactive lighting preview tool, based on Nvidia’s OptiX framework, to create proper lighting in a film. It is fully integrated within the Katana-based production workflow.
The lighting was important for setting the mood. A dark scene will be more ominous. Shadows can highlight the fear on a character’s face. Soft, brilliant lighting can create a feeling of nostalgia, Nahmias said.
The cool thing about the new system is that the tools are more intuitive for the artists. In the past, they had to look at screen after screen of lighting controls. They had to create their own lights and shine them on particular objects. Now, the physical light sources are things like lightbulbs or the sun itself.
These light sources can accurately calculate where light or shadows fall using ray tracing, or sending out rays from a single point in a scene and capturing how those rays bounce off objects and produce reflected light. Pixar’s ray tracing uses hundreds of millions of rays bouncing off tens of thousands of objects in a scene.
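The core idea can be sketched in a few lines. This is a textbook illustration, not Pixar's renderer: fire a primary ray, find what it hits, then fire a secondary "shadow ray" toward the light to decide whether the point is lit or shadowed.

```python
# A textbook sketch of the ray-tracing idea described above, not Pixar's
# renderer: fire a ray, find what it hits, then fire a secondary "shadow
# ray" toward the light to decide whether the point is lit or in shadow.
import math

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def normalize(v):
    n = math.sqrt(dot(v, v))
    return (v[0] / n, v[1] / n, v[2] / n)

def hit_sphere(origin, direction, center, radius):
    """Distance along a unit-length ray to the nearest hit, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c        # direction is unit length, so a == 1
    if disc < 0:
        return None
    for t in ((-b - math.sqrt(disc)) / 2.0, (-b + math.sqrt(disc)) / 2.0):
        if t > 1e-6:              # ignore hits behind (or at) the origin
            return t
    return None

# One sphere in front of the camera, one light up and to the right.
center, radius = (0.0, 0.0, -3.0), 1.0
light = (5.0, 5.0, 0.0)

eye, ray_dir = (0.0, 0.0, 0.0), (0.0, 0.0, -1.0)   # primary ray
t = hit_sphere(eye, ray_dir, center, radius)        # hits the sphere at t == 2.0
hit = (eye[0] + t * ray_dir[0], eye[1] + t * ray_dir[1], eye[2] + t * ray_dir[2])

# Shadow ray: does anything block the path from the hit point to the light?
blocked = hit_sphere(hit, normalize(sub(light, hit)), center, radius)
print("hit at", hit, "- lit" if blocked is None else "- in shadow")
```

A production renderer repeats this same bounce logic recursively for reflections and indirect lighting, across hundreds of millions of rays per scene, which is where the computational cost Estes described comes from.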
Nahmias showed one scene in which a teapot reflects the entire scene on its own shiny metal surface. That takes an enormous amount of computing power to calculate accurately, as it is effectively like rendering two scenes on the same screen.
The results of the efforts were impressive. Audiences loved the film as an emotional, story-based tale about two compelling characters, Sully and Mike. The film’s budget was estimated at $185 million, and its box office take beat the $535 million generated by Monsters, Inc.
Check out the image gallery and videos below.