Detective Pikachu: Bringing Pokémon to the Screen
How do you bring the first-ever live-action Pokémon adventure, Pokémon Detective Pikachu, to the screen?
Here, CG Supervisor Francesco Pinto and Lead Software Developer Curtis Andrus provide insight into some of the challenges of making the world of Pokémon a reality.
Francesco Pinto, CG Supervisor – “Creating the Torterra sequence was quite a challenge; it’s essentially a large moving forest environment on top of a giant turtle. To handle the large moving environment in the TOR sequence we made a couple of decisions.”
“First, to handle the movement of the otherwise static environment, we used our proprietary MPC PropRig system. This system is commonly used to attach smaller props to characters, such as swords and flags, but in our case it was mountains on the Torterra’s shell. The system itself is simple, but we extended it slightly to allow the use of instances to populate trees, plants, grass, etc., and to allow trees to be simulated moving when the environment moved.
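PropRig itself is proprietary, but the core idea described here is rigid parenting: each prop stores only a local offset, and moving or rotating the parent (the shell) carries every attached prop with it. A minimal, hypothetical 2D sketch of that relationship (the function name and values are illustrative, not MPC's API):

```python
import math

def attach(parent_pos, parent_angle, local_offset):
    """World position of a prop rigidly parented to a moving shell.

    The prop stores only a local offset; translating or rotating the
    parent carries the prop (a mountain, a tree instance) along with it.
    """
    c, s = math.cos(parent_angle), math.sin(parent_angle)
    lx, ly = local_offset
    px, py = parent_pos
    return (px + c * lx - s * ly, py + s * lx + c * ly)

# Shell at the origin, prop 2 units ahead; rotate the shell 90 degrees.
print(attach((0.0, 0.0), math.pi / 2, (2.0, 0.0)))  # swings to roughly (0, 2)
```

Extending such a system to instances, as described above, means each tree stores an offset like this against the shell instead of being a free-standing scene object.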
“Second, with the Torterra sequence requiring around 358,000 trees, 355,000 rocks and 7,350,000 grass patches, we had to think about how to optimize rendering and scene-expansion time, so we decided to use InstanceArrays instead of normal Instances, which provided major improvements in terms of speed and versatility.”
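The speed win from instance arrays comes from collapsing many scene-graph locations into one: rather than one full location per tree, a single renderable node carries one shared prototype plus a flat array of per-instance transforms. A hedged, toy illustration of that data layout (the dict structure and counts here are illustrative, not the Katana or RenderMan representation):

```python
# One shared prototype plus a flat array of per-instance transforms,
# instead of one full scene-graph location per tree.
tree_prototype = {"verts": 5000, "faces": 9000}  # hypothetical mesh stats

def instance_array(prototype, transforms):
    """A single renderable location carrying all instances at once."""
    return {"prototype": prototype, "xforms": transforms}

# 358,000 trees collapse into one node holding 358,000 transforms, so
# scene expansion touches 1 location instead of 358,000.
trees = instance_array(
    tree_prototype,
    [(x * 1.5, 0.0, 0.0) for x in range(358_000)],
)
print(len(trees["xforms"]))
```

The prototype geometry is stored once and shared, so memory and expansion time scale with the transform array rather than with full per-instance copies.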
“In addition to that, we also created a new Culling pointCloud node that allowed us to make optimizations directly in Katana and to integrate Pixar’s InstanceArray features into our rendering toolset and RenderMan for Katana.”
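The culling node itself is proprietary, but culling an instance point cloud typically means discarding instance points the camera will never resolve before the renderer ever sees them. A minimal sketch of the distance-based variant of that idea (the function and threshold are illustrative assumptions, not MPC's node):

```python
def cull(points, cam_pos, max_dist):
    """Drop instance points farther than max_dist from the camera, so
    the renderer never expands geometry it could not resolve anyway."""
    md2 = max_dist * max_dist
    cx, cy, cz = cam_pos
    return [
        (x, y, z) for (x, y, z) in points
        if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= md2
    ]

pts = [(0, 0, 0), (10, 0, 0), (200, 0, 0)]
print(cull(pts, (0, 0, 0), 50))  # keeps only the two nearby points
```

A production node would also cull against the camera frustum, but the payoff is the same: fewer instances reach scene expansion and the renderer.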
Curtis Andrus, Lead Software Developer – “The biggest focus for R&D on Detective Pikachu was Pikachu’s fur. Everything about fur is challenging: the dynamics of fluffy fur, complex feathers, long render times, etc.”
“First, we adopted the latest version of our fur and grooming toolset, Furtility. The new version had many things we would benefit from – improvements for driving fur with animated attributes, a new system for building feathers, and significant fur render optimizations.”
Fur changes the look of animation once rendered, and it is important to see this at each review, but fully rendering every animation for every review was both time-consuming and a heavy load on the render farm.
Animators also needed to see the position of the fur when posing and presenting their work. R&D therefore needed to give the animation department a solution that would change the animation workflow as little as possible while providing the best possible results in review.
“After some investigation we decided to implement Pixar’s Hydra technology for previewing fur. Hydra is a high-scalability, multi-pass rendering architecture from Pixar that ships with their USD distribution.”
“Hydra improved things in a number of ways. We built a tool to automatically display grooms in our animation dailies using Hydra, which gave our animators better context when posing characters or presenting their work. We also adapted the tool to let our Technical Animation department quality-check their fur-dynamics results. Using Hydra we were able to get reasonable hair previews (with shadows, etc.) in a matter of seconds, where even a basic PRMan render would have taken an hour.”
“Another new toolset we implemented was something we call Sumo: essentially a Maya deformer that helps simulate fat and flesh on our characters and also cleans up any intersections. We needed a faster way to simulate the jiggling and other secondary motion that the Pokémon characters would have. We estimate Sumo saved our TechAnim artists up to three hours per shot.”
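Sumo is proprietary, but the jiggle described here is the classic damped-spring follower: a simulated "flesh" point chases its animated target, lagging and overshooting to produce secondary motion. A minimal one-dimensional sketch under that assumption (the function, gains, and frame rate are illustrative, not Sumo's actual solver):

```python
def jiggle(targets, stiffness=40.0, damping=6.0, dt=1 / 24):
    """Semi-implicit Euler spring: a flesh point lags behind and
    overshoots its animated target, giving cheap secondary motion."""
    pos, vel = targets[0], 0.0
    out = []
    for t in targets:
        vel += (stiffness * (t - pos) - damping * vel) * dt
        pos += vel * dt
        out.append(pos)
    return out

# A sudden step in the animation: the simulated point eases toward the
# new target instead of snapping, overshoots slightly, then settles.
curve = jiggle([0.0] * 5 + [1.0] * 40)
```

A deformer would run a spring like this per vertex (or per low-resolution lattice point) in three dimensions, which is why it is so much faster than a full flesh simulation.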
Source: MPC R&D