    Bringing the Gods to Life

    Method Studios was in charge of bringing the ancient Kronos up to date with cutting-edge 21st-century technology.
    Wrath of the Titans, released this weekend, is a reboot of some very, very old storylines. The movie uses the characters of Greek mythology--popular some 2,500 years ago--to make a thoroughly modern blockbuster film. Method Studios was in charge of bringing these ancient gods up to date with cutting-edge 21st-century technology.

    In addition to building Kronos--the father of Zeus, king of the gods, who deposed him and banished him to the underworld--Method was responsible for creating his Hell-like prison. “In addition to the underworld,” says Olivier Dumont, Method Studios’ Visual Effects Supervisor, “our team in London had the huge task of showing the outside of Tartarus, with all its set extensions.”

    In the movie, the gods carry small weapons that, when activated, grow much larger and have parts that glow. Method was in charge of creating those effects as well, producing Zeus’ famous thunderbolt, Hades’ pitchfork, Poseidon’s trident and Ares’ hammer.

    Another major part of their work was bringing underworld battles to life, making the gods duke it out while they’re surrounded by flying fireballs and lava.

    In all, Method completed 163 shots for the movie in just six months, helped by a team of 110 people.

    The Main Pipeline

    The process began with building models in ZBrush on top of a low-polygon version, generating displacement maps. Method used Maya to set up all the layouts. “We knew from the beginning we would have to move things around to accommodate each framing situation,” Olivier explains. “We developed a specific asset browser and publishing tools on this show to be able to handle thousands of pieces per shot and share these between different applications, as everything was lit in Houdini using RenderMan.”
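    As a rough illustration of that publishing idea (Method’s actual tools aren’t public), a per-shot manifest shared between applications might look something like the Python sketch below; every function name, field, and path in it is hypothetical.

```python
# Hypothetical sketch of a per-shot publish manifest shared between Maya,
# Houdini and Nuke -- names, paths and layout are illustrative, not Method's tools.
import json
import time
from pathlib import Path

PUBLISH_ROOT = Path("/jobs/wrath/publish")  # assumed project root

def publish_asset(shot, asset, version, filepath, app):
    """Record a published layout/geometry piece so any application can find it."""
    manifest = PUBLISH_ROOT / shot / "manifest.json"
    data = json.loads(manifest.read_text()) if manifest.exists() else {}
    data.setdefault(asset, []).append({
        "version": version,
        "path": str(filepath),
        "source_app": app,
        "published_at": time.strftime("%Y-%m-%d %H:%M:%S"),
    })
    manifest.parent.mkdir(parents=True, exist_ok=True)
    manifest.write_text(json.dumps(data, indent=2))

def resolve_shot_assets(shot):
    """Return the latest published version of every asset in a shot."""
    manifest = PUBLISH_ROOT / shot / "manifest.json"
    data = json.loads(manifest.read_text())
    return {name: max(entries, key=lambda e: e["version"])
            for name, entries in data.items()}
```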

    Method also created tools linking Nuke to their main database in Shotgun, so that artists could access all the information directly in Nuke. That way, Olivier says, “the compositor could preview his shots with the latest surrounding ones using the cut length to make sure of the continuity. They could also see the thumbnails of all the approved renders available and import them with a click of a button.” After getting the first turnovers, the comp department began keying all the green screens in bulk, which allowed the lighters to start working.
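    A minimal sketch of that kind of Shotgun-to-Nuke bridge, built on the public shotgun_api3 and Nuke Python APIs, could look like the code below. The server URL, script credentials, the “apr” status code, and fields such as sg_path_to_frames are assumptions that vary per studio; this is not Method’s actual tool.

```python
# Sketch: query approved renders for a shot in Shotgun and bring them into Nuke.
# URL, credentials, status codes and field names are assumed, site-specific values.
import shotgun_api3
import nuke

sg = shotgun_api3.Shotgun("https://studio.shotgunstudio.com",
                          script_name="nuke_browser",
                          api_key="XXXX")

def approved_renders(shot_id):
    """Fetch approved Versions (renders) linked to a shot."""
    return sg.find("Version",
                   [["entity", "is", {"type": "Shot", "id": shot_id}],
                    ["sg_status_list", "is", "apr"]],
                   ["code", "sg_path_to_frames", "image"])

def import_render(version):
    """Create a Read node in Nuke pointing at an approved render."""
    read = nuke.createNode("Read", inpanel=False)
    read["file"].setValue(version["sg_path_to_frames"])
    read["label"].setValue(version["code"])
    return read
```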


    “In search of efficiency,” says Olivier, “I wanted to be able to show and get approvals from the production as quickly as possible, even if we weren’t ready with the CG FX. In order to do that, we mocked up all the shots using the first CG renders for the backgrounds, and 2D elements for the fire, smoke, and so on, putting them on cards. We could then see quickly if the quantity, the speed, or the scale and shape chosen were good.” Once the mockup was given the green light, the team could decide what was working in 2D and what wasn’t, and so what needed to be done in CG. Those elements tended to be ones the cards couldn’t handle because of a strong camera move, or pieces that needed very specific art direction. Overall, the team ended up creating three kinds of elements: 2D elements, generic CG elements too complex for 2D, and shot-specific CG elements that required more precise direction.

    To accommodate the delivery of layers for the stereo conversion, the comp department used templates for their scripts in Nuke, so an automatic process could generate chosen layers and mini comps to check the output.
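    A hypothetical batch sketch of that template approach, written against Nuke’s Python API, is shown below; the template path, node names, and write nodes are illustrative only.

```python
# Sketch: open a templated comp, repoint its plate Read, and execute one Write
# per delivery layer plus a small check comp. All names/paths are illustrative.
import nuke

LAYER_WRITES = ["Write_fg", "Write_bg", "Write_fx", "Write_minicomp"]

def render_stereo_layers(template_script, plate_path, first, last):
    nuke.scriptOpen(template_script)
    nuke.toNode("Read_plate")["file"].setValue(plate_path)  # assumes the template has this node
    for write_name in LAYER_WRITES:
        node = nuke.toNode(write_name)
        if node is not None:
            nuke.execute(node, first, last)

render_stereo_layers("/jobs/wrath/templates/stereo_delivery.nk",
                     "/jobs/wrath/plates/tar010/tar010.%04d.exr", 1001, 1096)
```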

    As for lighting, Olivier says that “the main challenge was to keep the consistency between each lighter, as they always have their own approach. We enforced the continuity between the shots through a robust publishing and communication system to make it as smooth as possible.” That required lots of meetings to make sure everyone was on the same page.

    Throughout the process, Mari was the main texturing software. Since it can handle files up to 32K, it allowed the team to reduce the number of textures they had to output and to keep an overview of them while they were working. “It was really valuable on this because we knew that we couldn’t rely solely on matte paintings due to so many different camera angles,” Olivier adds.