Monday, December 27, 2010

Astro Boy VFX supervisor Yan Chen

For Astro Boy, Imagi Animation Studios' 2009 computer-animated re-invention of Osamu Tezuka's Japanese manga series first published in 1952, and now on DVD, director David Bowers brought a fresh look and feel to the popular boy-robot story. Visual effects supervisor Yan Chen talks about creating this new Astro world.
How did you approach the look of the film, given that Astro Boy is such a well known property in both the east and west?

Chen: One of the things we loved about the east-versus-west mentality on this film was that we could adopt the best of both worlds. We chose cultural and artistic icons to borrow from. From the east, we looked to the woodwork and watercolour paintings of Isamu Noguchi for the cold and sterile scenes in Metro City. This was even represented in the costumes. The Metro City suits are drab and grey, while the clothes people wear on Earth are bright and colourful. They even have patches and are very vibrant and bohemian.

I noticed a very distinct colour palette throughout the film. Can you talk about that?

Chen: We approached it in a fairly traditional way through a colour arc or colour script. The art director, Jake Rowell, looked at the beats of the story and tried to time the colours appropriately. We used this colour palette to try and bring out the correct feel and representation of each scene. For instance, in the escape scene we set up a desaturated white as the warm colour, with the standard purples and blues as the cool colours. Because the warm colour wasn't actually orange or yellow, your mind became accustomed to it reading as warm. So when Astro escaped into the clouds and saw the sunset, the real warmth of the orange and yellow of the sun kicked in as he found his freedom, and it accentuated that fact. That was all explored in 2D with hand-drawn paintings and was then translated into 3D.
 
What kind of modelling and animation effort was involved in the production?

Chen: We had about 20 modellers on staff, and separated our process into characters and environments. One team was solely responsible for characters, largely because the performance of the characters was so important. We used Maya for modelling and animation, with rendering in RenderMan. We set Maya up in a way that mimicked the muscle groups of a real human. Even though the characters are heavily stylised, we still controlled them using real muscle groups. For instance, in our facial system we had 80 muscle shapes. When a character smiled, we'd move 15 to 20 of those muscles. We used a layered approach with blend shapes at the bottom and a higher control to manipulate multiple groups and give us the right expressions. On top of that we had an expression library with director pre-approved expressions like smiles and frowns.
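
For readers curious about the mechanics, here's a minimal Maya Python sketch of that kind of layered blend-shape setup. The shape names, pose values and node names are my own placeholders, not Imagi's actual rig.

```python
import maya.cmds as cmds

# Illustrative 'muscle' targets at the bottom layer - placeholder names only.
MUSCLE_TARGETS = ['browRaise_L', 'browRaise_R', 'cheekRaise_L',
                  'cheekRaise_R', 'lipCornerUp_L', 'lipCornerUp_R']

def build_face_layer(base_mesh, target_meshes):
    """Create the bottom blend-shape layer from sculpted muscle targets."""
    return cmds.blendShape(target_meshes, base_mesh, name='faceMuscles_BS')[0]

def apply_pose(blend_node, pose):
    """The higher control: set several muscle weights at once from a named
    pose, e.g. a director-approved 'smile' from an expression library."""
    for target, weight in pose.items():
        cmds.setAttr('{0}.{1}'.format(blend_node, target), weight)

# A hypothetical library entry - a smile that moves a subset of the muscles.
SMILE = {'cheekRaise_L': 0.8, 'cheekRaise_R': 0.8,
         'lipCornerUp_L': 1.0, 'lipCornerUp_R': 1.0}
```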
 
Then there are the robot characters. Were there any particular challenges with them?

Chen: The robots fell into two categories - Metro City robots and the gladiatorial, or Earth, robots. For the Metro City bots we primarily used ray tracing techniques to capture real-world reflections. On the other hand, the robots in the battle arena on Earth were all built from junk and scrap pieces. Because we didn't have the budget to individually build all these robots, we went for a more Frankenstein system and built them with similar torsos, arms, legs and other junk. I think because they were meant to be more rustic anyway, the robots lent themselves to mashing pieces together.
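
As a purely illustrative sketch of that Frankenstein-style kitbashing, the snippet below assembles robots from shared part pools - the part names and counts are invented for the example, not the studio's asset library.

```python
import random

# Hypothetical shared part pools for the Earth robots.
TORSOS = ['boiler_torso', 'fridge_torso', 'engineBlock_torso']
ARMS = ['crane_arm', 'pipe_arm', 'shovel_arm']
LEGS = ['tread_legs', 'spring_legs', 'girder_legs']
JUNK = ['hubcap', 'exhaust_stack', 'licence_plate', 'kettle']

def kitbash_robot(seed=None):
    """Assemble one junk robot by picking from the shared part pools."""
    rng = random.Random(seed)
    return {
        'torso': rng.choice(TORSOS),
        'arms': [rng.choice(ARMS) for _ in range(2)],
        'legs': rng.choice(LEGS),
        'junk': rng.sample(JUNK, k=2),
    }

print(kitbash_robot(seed=7))
```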

What kind of reference did you look to for the character animation?

Chen: It was primarily the voice actors - there was a camera pointed at them during recording - which was wonderful reference. The second form of reference was an expression sheet or style guide for the particular character. One of the quirks of Astro Boy was that the smile or smirk is very stylised, which obviously comes from Tezuka's original work, and we wanted to capture that.

And how did you approach the animation?

Chen: It was done via keyframe animation. The overarching philosophy was to treat each character with the same number of controls. There was one set for facial animation and one set for the body. That way the animator could dive into any character, from a small dog to a large robot. It made our library of over 150 characters easier to deal with as well. That's not to say we couldn't capture a different performance for each character. It was just that the joints were the same and it was up to the animator to deliver the appropriate performance.
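
A tiny sketch of what a shared control set implies in practice - the control names here are hypothetical, but the point is that every rig, from a small dog to a large robot, exposes the same handles to the animator.

```python
# Placeholder standard control names expected on every character rig.
BODY_CONTROLS = ['root_CTL', 'spine_CTL', 'head_CTL',
                 'arm_L_CTL', 'arm_R_CTL', 'leg_L_CTL', 'leg_R_CTL']
FACE_CONTROLS = ['brows_CTL', 'eyes_CTL', 'mouth_CTL']

def missing_controls(rig_controls):
    """Return any standard controls a character rig has not yet implemented."""
    expected = set(BODY_CONTROLS + FACE_CONTROLS)
    return sorted(expected - set(rig_controls))

# e.g. missing_controls(['root_CTL', 'head_CTL']) lists everything still to build.
```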

What kind of considerations did you have in making Astro fly?

Chen: Well, we were pretty much using an off-the-shelf version of Maya. And one of the considerations we had with using Maya was how far Astro had to fly in the shots. Typically, when you move too far from the origin, things and deformations start to break down, purely because the computer works with limited floating-point precision. What we did was allow the artists to fly anywhere they wanted in the world during previs, but then we would run a technical process after the fact that constrained the action to a sphere around the camera. This was a sphere within which we knew deformations would hold up.
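
Here's a minimal sketch of what such a post-previs step could look like in Maya Python, assuming a simple scene layout with a top-level shot group and a shot camera (both placeholder names, and the safe radius is arbitrary) - the whole group is offset so the framed action sits near the origin.

```python
import maya.cmds as cmds

SAFE_RADIUS = 10000.0  # placeholder: distance beyond which precision degrades

def recenter_action(shot_group, shot_camera, frame):
    """Offset the shot group so the camera, and the action framed around it,
    stays close to the origin, avoiding floating-point precision problems."""
    cmds.currentTime(frame)
    cam_pos = cmds.xform(shot_camera, query=True, worldSpace=True,
                         translation=True)
    distance = sum(c * c for c in cam_pos) ** 0.5
    if distance > SAFE_RADIUS:
        # Shift everything back by the camera's world position.
        cmds.xform(shot_group, relative=True,
                   translation=[-cam_pos[0], -cam_pos[1], -cam_pos[2]])
```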

How did you realise the environments in the film?

Chen: It was really a mix of matte paintings and CG, often with a blend between the two. For the scene where Zog wakes up, which was set out in the wilderness, we wanted a very painterly look. We would create the environment in CG in Maya, render it in RenderMan, then paint over it in Photoshop and re-project the painting onto the CG. We used a customised version of Shake for all the compositing.
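
For those unfamiliar with the re-projection technique, here's a hedged sketch of a generic camera-projection setup in Maya Python - the node names are placeholders and this is not Imagi's pipeline code, just the standard idea of projecting the paint-over back through the shot camera onto the CG.

```python
import maya.cmds as cmds

def project_painting(painting_file, shot_camera_shape, geometry):
    """Project a painted-over frame through the shot camera onto CG geometry."""
    file_node = cmds.shadingNode('file', asTexture=True)
    cmds.setAttr(file_node + '.fileTextureName', painting_file, type='string')

    proj = cmds.shadingNode('projection', asTexture=True)
    cmds.setAttr(proj + '.projType', 8)  # 8 = perspective projection
    cmds.connectAttr(file_node + '.outColor', proj + '.image')
    cmds.connectAttr(shot_camera_shape + '.message', proj + '.linkedCamera')

    shader = cmds.shadingNode('surfaceShader', asShader=True)
    cmds.connectAttr(proj + '.outColor', shader + '.outColor')
    sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                   name=shader + 'SG')
    cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader')
    cmds.sets(geometry, edit=True, forceElement=sg)
    return shader
```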

What about the clouds that Astro flies through?

Chen: Our approach was to use 'Miyazaki' clouds. We stylised our shapes based on the classic 2D cartoons from Studio Ghibli. However, it was all done in 3D: we actually modelled fluid containers and then ran those 3D containers through a fluid sim. This gave us the ultimate control to make things like ice-cream shapes, but still allowed it to feel like Astro was flying through a 3D, or real, bank of clouds.
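
A rough sketch of that kind of setup in Maya Python, assuming the stylised cloud shape already exists as a mesh - the container resolution, emission rate and emitter name are placeholder values, not production settings.

```python
import maya.cmds as cmds
import maya.mel as mel

def cloud_sim_from_shape(cloud_mesh):
    """Fill a 3D fluid container with density emitted from a hand-modelled,
    'Miyazaki'-style cloud shape, so the stylised silhouette drives the sim."""
    # create3DFluid <xRes> <yRes> <zRes> <xSize> <ySize> <zSize>
    fluid = mel.eval('create3DFluid 60 40 60 30 20 30')
    # Emit density from the surface of the modelled cloud shape.
    cmds.fluidEmitter(cloud_mesh, type='surface',
                      densityEmissionRate=2.0, name='cloudEmitter1')
    cmds.connectDynamic(fluid, emitters='cloudEmitter1')
    return fluid
```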

How did you deal with the visual effects in the film, like the explosions and lasers and anti-matter?

Chen: The biggest challenge with the visual effects was to make them art-directable. They had to be somewhat graphic to keep the manga style but also fit into this 3D world we had created. We would test the look of our visual effects by literally painting them first to see if they would work in 3D. We used Maya particles and fluid dynamics for the smoke and fire work, and stylised their motion to make it feel graphic. Any lightning was done with particles, but as sprites with painted textures on top. Again, this gave it that graphic sensibility.
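
As an illustration of sprite particles in Maya Python - names and values here are placeholders, not the production setup - a particle object can be switched to camera-facing sprite cards, onto which painted lightning textures are then mapped via the assigned shader.

```python
import maya.cmds as cmds

def lightning_sprites(positions):
    """Create a particle object rendered as sprites (camera-facing cards)."""
    transform, shape = cmds.particle(position=positions, name='lightning_PT')
    # Render type 5 = sprites: flat cards that always face the camera.
    cmds.setAttr(shape + '.particleRenderType', 5)
    return transform, shape

# e.g. lightning_sprites([(0, 0, 0), (0.3, 2, 0), (1, 4, 0)])
```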

Did you use particles for the red and blue energy?

Chen: The cores for the red and blue energy were multi-layered. The outside was a modelled sphere, but within the sphere were multiple layers of geometry on which we placed effects - some particle effects for the dots of energy, and some 3D textures where you actually see a vortex of energy swirling around in the cores themselves. The rays that came out of that were mostly volumetric. If they interacted with a character they were volumetric; if they were standalone they were composited as 2D effects.
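
To illustrate only the layered-core idea, and not the actual asset, here's a small Maya Python sketch that nests a few shells and adds a particle emitter inside for the dots of energy; all names, radii and rates are placeholders.

```python
import maya.cmds as cmds

def build_energy_core(radii=(1.0, 0.7, 0.4)):
    """Concentric shells for the core, plus a particle system for the 'dots'
    of energy emitted inside them."""
    shells = []
    for i, radius in enumerate(radii):
        shells.append(cmds.polySphere(radius=radius,
                                      name='coreShell{0}'.format(i + 1))[0])
    # Omni emitter at the centre feeding an (initially empty) particle object.
    cmds.emitter(pos=(0, 0, 0), type='omni', rate=200, name='coreDots_EM')
    dots = cmds.particle(name='coreDots_PT')[0]
    cmds.connectDynamic(dots, emitters='coreDots_EM')
    return shells, dots
```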

What were some of the challenges of the final battle sequence?

Chen: One of the largest challenges was just the pure length. There were 70 shots and we had to match continuity across them for colour, smoke and where the sun was coming from. The good thing about it was that it was scheduled for the last three months of production and we knew what it entailed.
 
In that final confrontation the Peacekeeper keeps adding to his body and growing. How did you accomplish those shots?

Chen: The Peacekeeper could absorb and control anything. He absorbs drones, typewriters and even buildings by the end. We had to show this progression from a seven-foot character to a 700-foot character! Although we did model different stages of the growth - seven foot, 20 foot and 700 foot - we had to compromise with the director on some creative editing. Our money shots had to show the absorption of metal pieces and then we implied the rest in editing. We still had to show things welding into him and show portions of him growing, but mostly we pre-built the growth in.

The visual effects of Black Swan

For Black Swan, director Darren Aronofsky turned to Look Effects to help bring his dark New York ballerina story to life. Look's visual effects supervisor Dan Schrecker oversaw 220 shots for the film, from full CG swan wings to prosthetic augmentation, face replacements and other enhancements. Here's my article at fxguide.

Building LEGO together with Click 3X

Stop-motion photography and visual effects by Click 3X helped bring together Build Together, a :30 spot promoting the joys of LEGO construction. Here's my article at fxguide taking a look at how the commercial was created.

The visual effects of The Voyage of the Dawn Treader


Head on over to fxguide for extensive coverage of The Chronicles of Narnia: The Voyage of the Dawn Treader. Here's my article highlighting the VFX by Framestore, MPC, Cinesite, The Senate and The Mill. Then there are two great podcasts with VFX supe Angus Bickerton, an fxpodcast looking at the effects and stereo and a red centre podcast covering the lensing.

Face replacement in TRON: Legacy

fxguide's Mike Seymour has an in-depth article on Digital Domain's face replacement effects for TRON: Legacy. Includes a bunch of great behind-the-scenes pics, too.