By Tom Russo | © Hearst Communications
CINEMA SCOPE: Behind the lens of this Panavision Genesis camera is an image sensor. It captures 12.4 megapixels of image and color information and turns it into movie magic. (Photograph by Kyle Christy)
Chris Watts has never worked with a wolf before. He and his crew are the designated animal wranglers for a scene in the upcoming movie 300, directed by Zack Snyder. The wolf in question is making an appearance in the epic about Spartan warriors at the battle of Thermopylae in 480 B.C. Watts and company are trying not only to make the creature stalk through the scene convincingly, but also to capture a particularly menacing shine on its teeth. "If you dipped a popsicle stick in maple syrup, that's the look we want for this fang," Watts says to one of his team.
CLIFF DIVING
This scene of Spartan soldiers driving their Persian enemies off a cliff and into the sea was shot 11 times on a blue screen stage in Montreal. The final image in the movie (below) was compiled from eight separate takes to add more soldiers to the finished product. Later, a CGI sky and falling debris were added, along with moody lighting effects. "This scene has a very particular look in the [300] book," Watts says, "and we wanted to reproduce that."
Fortunately, no one has to lubricate a real-life lupine grin to get the shot Watts wants. In 300, the wolf's cuspids are a purely digital construct, as is every hair on its hide, the rocky canyon the wolf is haunting, the wintry nighttime sky overhead, and virtually every other element of the shot save for the young actor playing the animal's Spartan prey. More than a year after the human element of the scene was shot on a blue screen stage with a stand-in mechanical wolf, Watts, the movie's visual-effects supervisor, is filling in the expansive blanks with staffers at Hybride, a Quebec-based effects facility. "One nice thing about doing this on the computer," Watts says, "is that if you decide, ‘Okay, I like the hair and the eyes and everything else,' you can turn off all the other layers, and just highlight the teeth."
The New Normal
Digital effects such as 300's virtual wolf are remarkable not because they are groundbreaking — the use of computer-generated imagery (CGI) in cinema dates back to the 2D pixel-vision of a robotic Yul Brynner in 1973's Westworld — but because this technology is now a standard part of the moviemaking toolkit. The impact of digital technology on Hollywood has been gradual but all-encompassing.
A WOLF IN DIGITAL CLOTHING
In today's moviemaking, the creative work that takes place on a computer can be as important as what goes on in front of the camera. In the big-screen adaptation of Frank Miller's historical graphic novel 300 (above), the future Spartan King Leonidas fends off a wolf. On set, visual-effects supervisor Chris Watts tried using a robotic wolf (top) for the scene, but it was eventually replaced with a computer-generated version of the animal (shown midrender, below). Today, a movie can be shot, edited and distributed — from camera to theater and beyond — without involving a single frame of film. The transformation is at least as sweeping as the introduction of sound or color in the early 20th century, and it is changing both the business and the art form of cinema.
Cinematographers, long resistant to digital image recording, are starting to embrace the use of digital cameras, shooting clean-looking footage that's easier to manipulate than film.
DRAGON RIDING
The fantasy movie Eragon combines traditional on-set filmmaking (above) with digital innovations. About 200 shots were created using a "motion rig" to help the actor simulate riding a dragon. Edward Speleers, who plays the title role, rides a hydraulically driven mechanical saddle that is programmed to mimic the movements of Saphira, a computer-generated dragon rendered in postproduction. Motion-rig scenes are filmed on a sound stage against a blue screen peppered with laser dots that help digital-effects artists align backgrounds and digital characters with the movements of the camera.
Commonly available software allows small special effects shops such as Hybride to render entire virtual worlds and blend them seamlessly with live-action shots. Scenes that would have required elaborate sets 25 years ago can now be shot against a blue or green screen, and the setting can be filled in later — and then tweaked until the director is satisfied.
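How does software pull an actor off a blue screen? At its core, a keyer estimates how strongly blue dominates each pixel, converts that into a transparency matte, and blends the live plate over the digital set. Commercial keyers handle color spill, hair and motion blur far more carefully; the sketch below only illustrates the idea, and its function names and threshold are invented, not any product's API.

```python
# Minimal blue-screen keying sketch (illustrative only; production keyers
# such as those in Nuke or Maya's compositor do far more work).
import numpy as np

def blue_screen_matte(plate, threshold=0.3):
    """Estimate foreground opacity from how strongly blue dominates each pixel."""
    r, g, b = plate[..., 0], plate[..., 1], plate[..., 2]
    blueness = b - np.maximum(r, g)          # high on the screen, low on the actor
    return np.clip(1.0 - blueness / threshold, 0.0, 1.0)[..., None]

def composite(foreground, background, alpha):
    """The standard 'over' operation: fg * a + bg * (1 - a)."""
    return foreground * alpha + background * (1.0 - alpha)

# Stand-ins for the live plate and the rendered set: float RGB in [0, 1]
plate = np.random.rand(270, 480, 3).astype(np.float32)
cg_set = np.random.rand(270, 480, 3).astype(np.float32)
final = composite(plate, cg_set, blue_screen_matte(plate))
```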
Visual effects once were labor-intensive novelties generated for impact at key moments in a movie, but digital cameras and powerful software have changed all that. "Effects used to be an issue of process versus product," says John Dykstra, the visual-effects designer on the first two Spider-Man movies. With film, getting the end result the director wanted tended to slightly degrade the quality of the image. Wizards such as Dykstra had to import footage frame by frame into computers for editing and CGI work, then convert the digital product back into film. "Putting effects on film always meant photochemical generational loss," Dykstra says. That's changed. "With digital, we went from being optics mavens to focusing on what illusion tells the story best, because now you can do anything."
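Dykstra's point about generational loss is easy to quantify: each optical duplication of a film element loses a fraction of the image's fine detail, and the losses compound across passes, while a digital copy is bit-for-bit identical. A back-of-envelope illustration, with invented numbers:

```python
# Back-of-envelope illustration of photochemical generational loss.
# The retention figure is invented; the point is that losses compound.
retained_per_copy = 0.90   # suppose each optical duplication keeps 90% of fine detail
generations = 4            # e.g., interpositive, internegative, effects passes, print
print(f"Detail remaining after {generations} copies: {retained_per_copy ** generations:.0%}")
# -> about 66%. A digital copy, by contrast, is identical at any generation.
```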
There is a powerful recycling effect in Hollywood — as digital techniques for rendering textures such as hair, water and fire are pioneered in films such as Stuart Little and The Perfect Storm, they become part of movieland's collective effects arsenal, eventually being packaged in software such as Autodesk Maya Hair and Maya Fur.
Elements and tools — from digital characters and environments to motion-capture techniques that record actors' movements and facial expressions — now are handled routinely, with confidence rather than crossed fingers. Stefen Fangmeier, an alum of George Lucas's Industrial Light & Magic (ILM), sounds matter-of-fact as he discusses the elaborate work he and his crew have done on his directorial debut, an adventure fantasy called Eragon. "Is there a tremendous amount of new technology in this? No," Fangmeier says. "It's the way we're putting it together and applying it to this character. A dragon has never been done like this."
Unlike most digital creatures, which are created almost entirely in postproduction to react to the movements of a movie's live characters, the movements of Eragon's dragon, a central character, were choreographed by animators before the cameras started rolling. The dragon's motion was uploaded to a high-tech mechanical bull ridden by an actor on a blue screen stage. "The result is that you get more realistic body language from your actor," says Samir Hoon, the movie's visual-effects supervisor for ILM. "Even if the dragon is just waddling along, you're trying to capture as many nuances as possible."
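Conceptually, driving the rig means resampling the animators' keyframed motion curves into a steady stream of commands the hydraulics can follow. ILM's actual rig software is proprietary; the sketch below, with an invented single pitch channel and a placeholder hardware interface, only illustrates the keyframe-interpolation step.

```python
# Keyframe-resampling sketch for driving a motion rig. The channel, rates
# and rig interface are all invented; this is not ILM's software.
from bisect import bisect_right

def sample_channel(keyframes, t):
    """Linearly interpolate a sorted list of (time, value) keyframes at time t."""
    times = [time for time, _ in keyframes]
    i = bisect_right(times, t)
    if i == 0:
        return keyframes[0][1]
    if i == len(keyframes):
        return keyframes[-1][1]
    (t0, v0), (t1, v1) = keyframes[i - 1], keyframes[i]
    return v0 + (t - t0) / (t1 - t0) * (v1 - v0)

# Pitch of the dragon's back, in degrees, as keyed by the animators
pitch_keys = [(0.0, 0.0), (0.5, 8.0), (1.0, -3.0), (1.5, 2.0)]

for frame in range(90):                 # stream commands at 60 Hz
    pitch = sample_channel(pitch_keys, frame / 60.0)
    # rig.set_pitch(pitch)              # placeholder for the hardware interface
```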
DIGITAL BONES
In the new action-comedy Night at the Museum, Ben Stiller is chased through the movie by a dinosaur skeleton come to life. To create the T-rex, effects artists first had to build this digital wire-frame model.
Digital Character Building
New technology also is allowing directors to meld CGI and live action in fresh ways. In last summer's Pirates of the Caribbean: Dead Man's Chest, ILM's image-based motion-capture, or Imocap, software helped animators turn actor Bill Nighy's face into a squiggling mass of octopus tentacles for his role as the villain Davy Jones. Until recently, motion-capture work on characters such as Gollum from the Lord of the Rings trilogy tended to interfere with acting. A performer charged with creating a digital character's movements had to work in a spandex suit on a motion-capture stage with a minimum of 16 cameras sampling his movements.
GETTING INTO CHARACTER
For Pirates of the Caribbean: Dead Man's Chest, ILM's Imocap technology put actor Bill Nighy's on-set performance into the tentacled CG face of the villainous Davy Jones. Unlike traditional motion-capture setups, Imocap let Nighy work on the set with other actors. "We wanted Bill to be able to do his performance opposite Johnny Depp and everyone else, without any constraints or weird processes getting in the way," says Pirates animation supervisor Hal Hickel. The result was a kind of digital makeup that accentuated Nighy's character rather than covering it up — the tentacles moved naturally (or, perhaps, supernaturally) with his facial expressions.
Motion rigs and makeup are old moviemaking standbys that are being reinvented in the new, digital environment. Much the same could be said of 3D effects, which were introduced as a novelty in the 1950s. Today, digital 3D formats such as IMAX 3D and Real D are bringing the funny glasses back as a way to differentiate the theater experience from what's available through increasingly sophisticated home entertainment systems. Moviemakers are using software to take existing 2D footage and reformat it for stereoscopic projectors. For the recent 3D re-release of The Nightmare Before Christmas, all of the puppets in Tim Burton's 1993 film were digitally rendered at a slightly shifted angle compared to the original footage. When the finished product is run through the Real D projector adapter, the viewer's left eye sees the original movie footage, while the right eye takes in the new material.
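Generating that second-eye view amounts to shifting each pixel horizontally by a disparity that depends on its distance from the camera; nearby objects shift more than distant ones. Production 2D-to-3D conversion re-renders or carefully warps the image using per-pixel depth, but a toy version of the core idea, with invented numbers, looks like this:

```python
# Toy second-eye generation: shift pixels by a depth-dependent disparity.
# Real conversion fills the gaps this creates; holes here are left black.
import numpy as np

def right_eye_view(left, depth, max_disparity=8):
    """left: (H, W, 3) image; depth: (H, W) in [0, 1], where 0 is nearest."""
    h, w, _ = left.shape
    disparity = (max_disparity * (1.0 - depth)).astype(int)  # near pixels shift most
    right = np.zeros_like(left)
    for y in range(h):
        for x in range(w):
            nx = x - disparity[y, x]
            if 0 <= nx < w:
                right[y, nx] = left[y, x]
    return right

left = np.random.rand(4, 8, 3)     # tiny stand-in frame
depth = np.random.rand(4, 8)       # stand-in depth map
right = right_eye_view(left, depth)
```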
There are a slew of 3D epics in the works. Movie-tech pioneer James Cameron (Titanic) is working on the big-budget sci-fi features Avatar and Battle Angel; director Robert Zemeckis (Back to the Future) is working on a 3D Beowulf; and George Lucas — ever the digital revisionist — has stated plans to re-release the Star Wars trilogies in 3D.
FACE FACTS
The science of motion capture — gathering data from an actor's performance for use in digital animation — reaches its limits when it comes to the human face. But new technologies may better enable visual-effects gurus to track subtle facial expressions. The Mova Contour system (above, with Mova founder Steve Perlman) uses phosphorescent theatrical makeup, texture and geometry cameras, and fluorescent lights that flash 90 to 120 times per second. A technology from Image Metrics goes further, tracking facial movements without the need for makeup or any special hardware, so a computer character can instantly mimic every element of an actor's face.
Moving Beyond "Cut!"
Cinematographers are the film era's last holdouts. As the people most directly responsible for the color, texture and clarity of the images onscreen, they tend to be conservative. Many still prefer the richness, highlights and grain of film over the cleaner, harsher look of digital image recording. But today other cinematographers say they are drawn to the capabilities the technology provides. Industry veteran Dean Semler, an Oscar winner for Dances With Wolves, has used Panavision's digital Genesis camera on his last three projects: the Mel Gibson-directed Mayan epic Apocalypto, and the two Adam Sandler comedies Click and I Now Pronounce You Chuck and Larry. Cinematographers have long used low-res video playback to check their work on the set, but the images on film often look quite different. Digital moviemaking solves that problem. "There's a huge comfort factor in looking at an image you know is going to look the same way it is on the screen," Semler says.
For directors, less cost pressure means more creative freedom, and compared to film stock, digital tape is almost free. "Sometimes you roll for an hour without cutting, because you can," director Robert Rodriguez (Spy Kids) said at a recent panel discussing Grindhouse, a horror film he is codirecting with Quentin Tarantino. "You find moments there that you might lose otherwise." Rodriguez, who often doubles as his own cinematographer, shot his last two movies digitally. "I feel like I'm wasting film if I mess up a line or if something's not coming together," said Rodriguez's fellow panelist, actress Rose McGowan. But when she voiced that worry on the set of Grindhouse, she said, "the entire crew and Robert started laughing - 'That's old school!'"
Smoke and mirrors ...
The technology breakthroughs that made dinosaurs and big waves a few years ago have eddied into mini-disciplines with ever-rising levels of virtuosity. "Effects have gotten more evolutionary rather than revolutionary as time has gone on," says ILM's Hickel. One area that is seeing continuing incremental advancement is element and particle simulation — rendering water, fire, smoke and dust with greater fidelity. For the February comic book action flick Ghost Rider, lead actor Nicolas Cage's head is replaced by a skull exploding with digital flames designed in a tweaked version of Maya software. "Our hero doesn't have any eyes or lips or a tongue, so he can't form words, and he doesn't have any expression," says director Mark Steven Johnson. "You can't tell when he's sad or vengeful. So I really wanted the fire to have a personality, to make up for what we didn't have."
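Under the hood, the particle systems these packages elaborate on follow a simple loop: emit particles, integrate their positions and velocities under forces such as buoyancy and drag each frame, and retire them when their lifetime expires. A minimal sketch, with invented constants:

```python
# Minimal particle-system sketch: emit, integrate under buoyancy and drag,
# retire particles when their lifetime runs out. All constants are invented.
import random

class Particle:
    def __init__(self):
        self.x, self.y = 0.0, 0.0                   # emitter at the origin
        self.vx = random.uniform(-0.3, 0.3)         # sideways flicker
        self.vy = random.uniform(1.0, 2.0)          # hot gas rises
        self.life = random.uniform(0.5, 1.5)        # seconds to live

def step(particles, dt=1.0 / 24):                   # one film frame
    for p in particles:
        p.vy += 0.8 * dt                            # buoyancy accelerates upward
        p.vx *= 0.98                                # drag damps sideways motion
        p.x += p.vx * dt
        p.y += p.vy * dt
        p.life -= dt
    particles[:] = [p for p in particles if p.life > 0]
    particles.extend(Particle() for _ in range(5))  # fresh emission each frame

flames = [Particle() for _ in range(50)]
for _ in range(48):                                 # simulate two seconds
    step(flames)
print(f"{len(flames)} particles alive")
```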
During Hollywood's Golden Age of the 1930s and '40s, filmmakers used entire guilds of set decorators, matte painters and other artisans to help create movie magic. Today, boutique digital-effects shops perform similar tasks. "God is in the details," Dykstra says. "You get into this business of who's producing the most realistic skin, or the most realistic sky, or the most realistic field of battling armies. The ability to create images that are indistinguishable from reality has truly opened a Pandora's box. In a good way."
Hybride, the outfit simulating wolf drool for 300, is best known for rendering stylized digital backdrops, like those it created for Rodriguez's 2005 movie Sin City. That film's dark comic book atmospheres melded the live action of the movie with the raw visual approach of graphic novelist Frank Miller, who also wrote the book upon which 300 is based.
To design 300's digital backdrops, Hybride artists needed to combine obsessive attention to detail with a deliberately artistic — as opposed to realistic — visual aesthetic. To blend footage of the actors into the surreal backgrounds, the artists employed a complex process. First, 3D tracking was used to map out virtual camera angles corresponding to the real camera's movement. Once the virtual and real camera angles were matched, the computer-generated imagery was "shot" from the right angle and dropped into a scene, ensuring that warriors wouldn't be hidden by the enemy, the terrain or the odd lethal projectile.
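The matchmove idea at the heart of that step: once the real camera's motion has been solved, a virtual camera with the same position, rotation and focal length "shoots" the CG geometry so it lands in the correct pixel of each frame. A hypothetical single-frame sketch, with invented camera values:

```python
# Single-frame matchmove sketch: project a world-space point through a
# solved camera so CG lands in the right pixel. Camera values are invented.
import numpy as np

def project(point, rotation, translation, focal, width, height):
    """Pinhole projection of a 3D world-space point into pixel coordinates."""
    cam = rotation @ point + translation           # world space -> camera space
    x = focal * cam[0] / cam[2] + width / 2        # perspective divide
    y = focal * cam[1] / cam[2] + height / 2
    return x, y

R = np.eye(3)                        # solved rotation for this frame (none here)
t = np.array([0.0, 0.0, 10.0])       # camera sits 10 units back
spear_tip = np.array([1.0, 2.0, 0.0])
print(project(spear_tip, R, t, focal=1800, width=1920, height=1080))  # (1140, 900)
```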
The process was just the first step in a branching and converging stream of CGI work that included modeling sets, modeling characters and rigging them for animation, then adding texture and lighting flourishes to all of it. In a downstairs conference room at Hybride's headquarters, effects artists review a shot that's nearing completion. On a small movie screen, the Spartan king cradles a young war casualty, his somber troops clustered around them. In the background, digitally rendered flames flicker on smoky, expressionistic, combat-ravaged digital hillsides.
"We've asked [Hybride] to give us art, not just reality," says director Zack Snyder. "It's hard because it's subjective. One man's art is another man's screw-up." To that end, work on the digital wolf continues. In a number of shots, the wolf still shows up as a wire-frame construct; in others, he looks like some bizarre alabaster lawn ornament — the fur has been left off for the time being to allow the artists to focus on the movements of the animal's musculature. Watts is on that case, but he's also finessing other fine points, such as making sure the creature's breath is shown, highlighting the scene's frigid conditions. "Wars have been waged over the breath," he jokes. "The stuff that we argue about is so beyond the realm of what normal people ever worry about." In the new digital age, every pixel counts.