But the number of hairs at this intermediate stage is only about 900,000, far fewer than in the final film. Van Gelder said the visualization tools take the artists a long way toward rendering Sully as a realistic character: he starts to look like a monster who weighs 1,000 pounds and has powerful arms that swing back and forth like an ape’s.
“When they push the button and hit render, they pretty much know it’s going to be close to what they want,” Estes said.
Once the character animation is done and the scenes are locked, Pixar’s artists make a final rendering pass with precise lighting, which can change the whole mood of a scene.
Van Gelder showed one scene that was hard to do because it had six highly detailed characters on screen at once, covered in shadows and moving across the top of a building.
“The toughest scenes are where you require a lot of lights, reflections, and shadows to get the right mood,” Estes said. “That’s computationally very intensive because you’re calculating all of the light rays bouncing around.”
When it comes to mapping the crude blueprints into an animated scene, Pixar’s crew uses a proprietary animation system dubbed Presto, which was first built for the film Brave and was reused for Monsters University. Presto takes extremely complex character models, like the main character Sully, and brings them to life. Artists can manipulate the models in real-time to see how they can make them move, convey emotion, and tell a story. By clicking on objects, they can make arms and legs move just as they want them to in the final movie.
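For a sense of what that kind of interactive posing involves under the hood, here is a minimal sketch with invented names, not anything from Presto itself: a character rig modeled as a hierarchy of joints, where rotating one control recomputes every bone downstream via forward kinematics, fast enough to show the new pose immediately.

```python
import math

class Joint:
    """One bone in a simple 2D rig; rotating it moves everything downstream."""
    def __init__(self, name, length, angle=0.0, parent=None):
        self.name = name          # e.g. "shoulder", "elbow" (hypothetical names)
        self.length = length      # bone length in scene units
        self.angle = angle        # rotation relative to the parent, in radians
        self.parent = parent

    def world_angle(self):
        """Sum of this joint's angle and all of its ancestors' angles."""
        total, joint = 0.0, self
        while joint is not None:
            total += joint.angle
            joint = joint.parent
        return total

    def end_position(self):
        """Where the tip of this bone lands after all parent rotations."""
        base = (0.0, 0.0) if self.parent is None else self.parent.end_position()
        a = self.world_angle()
        return (base[0] + self.length * math.cos(a),
                base[1] + self.length * math.sin(a))

# Pose an arm: changing the shoulder control instantly moves the hand,
# which is what lets an artist scrub a value and see the pose update live.
shoulder = Joint("shoulder", length=1.0)
elbow = Joint("elbow", length=0.8, parent=shoulder)

shoulder.angle = math.radians(45)   # artist drags the shoulder control
elbow.angle = math.radians(30)      # then bends the elbow
print("hand lands at:", elbow.end_position())
```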
“They don’t have to wait for a full day to see the results,” Estes said.
As for the lighting, Pixar also described its “global illumination” system, which is based on physical light sources in a scene. Artists can use an interactive lighting preview tool, based on Nvidia’s OptiX framework, to create proper lighting in a film. It is fully integrated within the Katana-based production workflow.
Lighting is important for setting the mood. A dark scene feels more ominous. Shadows can highlight the fear on a character’s face. Soft, brilliant lighting can create a feeling of nostalgia, Nahmias said.
The cool thing about the new system is that the tools are more intuitive for the artists. In the past, they had to look at screen after screen of lighting controls. They had to create their own lights and shine them on particular objects. Now, the physical light sources are things like lightbulbs or the sun itself.
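As a rough illustration of that shift, the difference is between hand-tuning many abstract parameters per light and simply describing a physical emitter. The field names below are made up for the example, not Pixar’s actual light schema.

```python
# Old style: many abstract, hand-tuned parameters per light, spread
# across screen after screen of controls. (Hypothetical fields.)
old_style_light = {
    "intensity": 0.8,
    "falloff": 2.2,
    "cone_spread_deg": 35.0,
    "aimed_at": "sullys_face",
}

# New style: describe a physical emitter and let the global-illumination
# renderer work out where light and shadows actually fall.
sun = {"type": "distant", "color_temp_kelvin": 5800, "angular_size_deg": 0.53}
desk_lamp = {"type": "sphere_bulb", "watts": 60, "radius_cm": 3.0}
```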
With these physical light sources, the renderer can accurately calculate where light and shadows fall using ray tracing: sending out rays from a point in the scene and tracking how those rays bounce off objects to produce reflected light. Pixar’s ray tracing involves hundreds of millions of rays bouncing off tens of thousands of objects in a scene.
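To make the ray-tracing description concrete, here is a toy sketch. It is not Pixar’s renderer, RenderMan, or the Nvidia OptiX API; the single sphere, the point light standing in for a “lightbulb,” and all of the numbers are invented. It only shows the core loop: shoot a ray per pixel, find what it hits, shade it from the light, and follow one mirror bounce to pick up reflections.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the sphere, or None if it misses."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def trace(origin, direction, depth=0):
    """One light path: direct lighting at the hit point plus one mirror bounce."""
    center, radius = (0.0, 0.0, -3.0), 1.0   # a single shiny sphere (made up)
    light_pos = (2.0, 2.0, 0.0)              # a physical point light, the "lightbulb"
    t = hit_sphere(origin, direction, center, radius)
    if t is None:
        return 0.1                            # background brightness
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = normalize(tuple(h - c for h, c in zip(hit, center)))
    to_light = normalize(tuple(l - h for l, h in zip(light_pos, hit)))
    brightness = max(0.0, dot(normal, to_light))           # diffuse shading
    if depth < 1:                                           # one reflected bounce
        refl = tuple(d - 2.0 * dot(direction, normal) * n
                     for d, n in zip(direction, normal))
        brightness += 0.5 * trace(hit, normalize(refl), depth + 1)
    return brightness

# Shoot one ray per pixel of a tiny 4x4 "image" and print the brightness values.
for y in range(4):
    row = []
    for x in range(4):
        direction = normalize((x / 3.0 - 0.5, y / 3.0 - 0.5, -1.0))
        row.append(round(trace((0.0, 0.0, 0.0), direction), 2))
    print(row)
```

A production renderer does this with hundreds of millions of rays, many bounces, and tens of thousands of objects, which is where the computational cost Estes described comes from.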
Nahmias showed one scene in which a teapot reflected the entire surrounding scene on its shiny metal surface. Calculating that accurately takes an enormous amount of computing power, as it is effectively like rendering two scenes in the same frame.
The results of the efforts were impressive. Audiences loved the film as an emotional, story-based tale about two compelling characters, Sully and Mike. The film’s budget was estimated at $185 million, and its box office take beat the $535 million generated by Monsters, Inc.