Lights, camera, data! – Part I

  • Cinema scope - Behind the lens of this Panavision Genesis camera is an image sensor. It captures 12,4 megapixels of image and colour information and turns it into movie magic.
  • A wolf in digital clothing - In today’s moviemaking, the creative work that takes place on a computer can be as important as what goes on in front of the camera. In the big-screen adaptation of Frank Miller’s historical graphic novel 300, the future Spartan King Leonidas fends off a wolf. On set, visual-effects supervisor Chris Watts tried using a robotic wolf for the scene, but it was eventually covered up by a computer-generated version of the animal (see image).
  • Cliff diving - This scene of Spartan soldiers driving their Persian enemies off a cliff and into the sea was shot 11 times on a blue screen stage in Montreal. The final image in the movie (above) was compiled from eight separate takes to add more soldiers to the finished product. Later, a CGI sky and falling debris were added, along with moody lighting effects. "This scene has a very particular look in the (300) book," Watts says, "and we wanted to reproduce that."
  • Getting into character - For Pirates of the Caribbean: Dead Man’s Chest, ILM’s Imocap technology put actor Bill Nighy’s on-set performance into the tentacled CG face of the villainous Davy Jones.
Date: 31 January 2007

In today’s digital Hollywood, cameras capture scenes in bits, not frames – and computer wizards conjure up everything from impossible beasts to clifftop battlegrounds. Film is dead. Long live the movies.

Chris Watts has never worked with a wolf before. He and his crew are the designated animal wranglers for a scene in the upcoming movie 300, directed by Zack Snyder. The wolf in question is making an appearance in the epic about Spartan warriors at the battle of Thermopylae in 480 BC. Watts and company are trying not only to make the creature stalk through the scene convincingly, but also to capture a particularly menacing shine on its teeth. “If you dipped a sucker stick in syrup, that’s the look we want for this fang,” Watts says to one of his team.

Fortunately, no one has to lubricate a real-life lupine grin to get the shot Watts wants. In 300, the wolf’s cuspids are a purely digital construct, as is every hair on its hide, the rocky canyon the wolf is haunting, the wintry night-time sky overhead, and virtually every other element of the shot save for the young actor playing the animal’s Spartan prey. More than a year after the human element of the scene was shot on a blue screen stage with a stand-in mechanical wolf, Watts, the movie’s visual-effects supervisor, is filling in the expansive blanks with staffers at Hybride, a Quebec-based effects facility. “One nice thing about doing this on the computer,” Watts says, “is that if you decide, ‘Okay, I like the hair and the eyes and everything else’, you can turn off all the other layers, and just highlight the teeth.”

The new normal
Digital effects such as 300’s virtual wolf are remarkable not because they are groundbreaking – the use of computer-generated imagery (CGI) in cinema dates back to the 2D pixel-vision of a robotic Yul Brynner in 1973’s Westworld – but because this technology is now a standard part of the moviemaking toolkit. The impact of digital technology on Hollywood has been gradual but all-encompassing. Today, a movie can be shot, edited and distributed – from camera to theatre and beyond – without involving a single frame of film. The transformation is at least as sweeping as the introduction of sound or colour in the early 20th century, and it is changing both the business and the art form of cinema.

Cinematographers, long resistant to digital image recording, are starting to embrace the use of digital cameras, shooting clean-looking footage that’s easier to manipulate than film. Commonly available software allows small special effects shops such as Hybride to render entire virtual worlds and blend them seamlessly with live-action shots. Scenes that would have required elaborate sets 25 years ago can now be shot against a blue or green screen, and the setting can be filled in later – and then tweaked until the director is satisfied.
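The blue- and green-screen work described above rests on a simple idea: pixels close in colour to the screen are made transparent, and a new background shows through wherever they were. The Python sketch below illustrates that idea under loose assumptions – NumPy images with values in the range 0 to 1 and a single key colour – and is far cruder than the proprietary keyers an effects house would actually use; the function name and parameters are purely illustrative.

```python
import numpy as np

def chroma_key(foreground, background, key_rgb=(0.0, 0.7, 0.1), threshold=0.3):
    """Composite a green-screen foreground over a background plate.

    foreground, background: float arrays of shape (H, W, 3), values in [0, 1].
    key_rgb: approximate colour of the screen to remove (an assumed value;
             production keyers are far more sophisticated).
    """
    key = np.array(key_rgb, dtype=np.float32)
    # Distance of each pixel's colour from the key colour; close pixels
    # become transparent and reveal the background plate.
    distance = np.linalg.norm(foreground - key, axis=-1)
    alpha = np.clip((distance - threshold) / threshold, 0.0, 1.0)[..., None]
    # Standard "over" composite: foreground where opaque, background where keyed out.
    return alpha * foreground + (1.0 - alpha) * background
```

Most of the real craft lies beyond this sketch, in suppressing green spill on hair and skin and softening the matte edge so the join is invisible.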

Visual effects once were labour-intensive novelties generated for impact at key moments in a movie, but digital cameras and powerful software have changed all that. “Effects used to be an issue of process versus product,” says John Dykstra, the visual-effects designer on the first two Spider-Man movies. With film, getting the end result the director wanted tended to slightly degrade the quality of the image. Wizards such as Dykstra had to import footage frame by frame into computers for editing and CGI work, then convert the digital product back into film. “Putting effects on film always meant photochemical generational loss,” Dykstra says. That’s changed. “With digital, we went from being optics mavens to focusing on what illusion tells the story best, because now you can do anything.”

There is a powerful recycling effect in Hollywood – as digital techniques for rendering textures such as hair, water and fire are pioneered on individual productions, they become part of movieland’s collective effects arsenal, eventually being packaged in software such as Autodesk Maya Hair and Maya Fur. Elements and tools – from digital characters and environments to motion-capture techniques that record actors’ movements and facial expressions – are now handled routinely, with confidence rather than crossed fingers. Stefen Fangmeier, an alum of George Lucas’s Industrial Light & Magic (ILM), sounds matter-of-fact as he discusses the elaborate work he and his crew have done on his directorial debut, an adventure fantasy called Eragon. “Is there a tremendous amount of new technology in this? No,” Fangmeier says. “It’s the way we’re putting it together and applying it to this character. A dragon has never been done like this.”

Unlike most digital creatures, which are created almost entirely in postproduction to react to the movements of a movie’s live characters, Eragon’s dragon, a central character, had its movements choreographed by animators before the cameras started rolling. The dragon’s motion was uploaded to a high-tech mechanical bull ridden by an actor on a blue screen stage. “The result is that you get more realistic body language from your actor,” says Samir Hoon, the movie’s visual-effects supervisor for ILM. “Even if the dragon is just waddling along, you’re trying to capture as many nuances as possible.”

Character building
New technology is also allowing directors to blend CGI and live action in fresh ways. In Pirates of the Caribbean: Dead Man’s Chest, ILM’s image-based motion-capture, or Imocap, software helped animators turn actor Bill Nighy’s face into a squiggling mass of octopus tentacles for his role as the villain Davy Jones. Until recently, motion-capture work on characters such as Gollum from the Lord of the Rings trilogy tended to interfere with acting. A performer charged with creating a digital character’s movements had to work in a spandex suit on a motion-capture stage, with a minimum of 16 cameras sampling his movements.

In contrast, Imocap let Nighy work on the set with other actors. “We wanted Bill to be able to do his performance opposite Johnny Depp and everyone else, without any constraints or weird processes getting in the way,” says animation supervisor Hal Hickel. The result was a kind of digital makeup that accentuated Nighy’s character rather than covering it up – the tentacles moved naturally (or, perhaps, supernaturally) with his facial expressions.

Motion rigs and makeup are old moviemaking standbys that are being reinvented in the new, digital environment. Much the same could be said of 3D effects, which were introduced as a novelty in the 1950s. Today, digital 3D formats such as IMAX 3D and Real D are bringing the funny glasses back as a way to differentiate the theatre experience from what’s available through increasingly sophisticated home entertainment systems. Moviemakers are using software to take existing 2D footage and reformat it for stereoscopic projectors. For the recent 3D re-release of The Nightmare Before Christmas, all of the puppets in Tim Burton’s 1993 film were digitally rendered at a slightly shifted angle compared with the original footage. When the finished product is run through the Real D projector adapter, the viewer’s left eye sees the original movie footage, while the right eye takes in the new material.
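To make the left-eye/right-eye mechanics concrete, here is a rough Python sketch of one common way 2D footage is reformatted for stereo: keep the original frame for the left eye and synthesize a right-eye frame by shifting pixels according to a per-pixel depth map. Note that this is a depth-warping shortcut, not the full re-rendering approach used for the Nightmare re-release; the parameter names and the pinhole disparity formula are assumptions for illustration only.

```python
import numpy as np

def synthesize_right_eye(left_image, depth, interocular=0.065,
                         focal_px=1000.0, convergence=5.0):
    """Crude right-eye view from a left-eye frame plus a depth map.

    left_image: (H, W, 3) float array; depth: (H, W) distances in metres
    (all nonzero). Each pixel is shifted horizontally by its screen
    disparity; holes left by disocclusion are filled from the left
    neighbour. Studio conversions re-render or hand-paint these views.
    """
    h, w, _ = left_image.shape
    right = np.zeros_like(left_image)
    filled = np.zeros((h, w), dtype=bool)
    # Disparity in pixels: positive for objects nearer than the convergence plane.
    disparity = focal_px * interocular * (1.0 / depth - 1.0 / convergence)
    for y in range(h):
        for x in range(w):
            nx = int(round(x - disparity[y, x]))
            if 0 <= nx < w:
                right[y, nx] = left_image[y, x]
                filled[y, nx] = True
    # Naive hole filling: copy the nearest valid pixel from the left.
    for y in range(h):
        for x in range(1, w):
            if not filled[y, x]:
                right[y, x] = right[y, x - 1]
    return right
```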

A slew of 3D epics is in the works. Movie-tech pioneer James Cameron is working on the big-budget sci-fi features Avatar and Battle Angel; director Robert Zemeckis is making a 3D Beowulf; and George Lucas – ever the digital revisionist – has stated plans to re-release the Star Wars trilogies in 3D.

Moving beyond ‘Cut!’
Cinematographers are the film era’s last holdouts. As the people most directly responsible for the colour, texture and clarity of the images onscreen, they tend to be conservative. Many still prefer the richness, highlights and grain of film over the cleaner, harsher look of digital image recording. But other cinematographers say they are drawn to the capabilities the technology provides. Industry veteran Dean Semler, an Oscar winner for Dances with Wolves, has used Panavision’s digital Genesis camera on his last three projects: the Mel Gibson-directed Mayan epic Apocalypto and two Adam Sandler comedies. Cinematographers have long used low-res video playback to check their work on the set, but the images on film often look quite different. Digital moviemaking solves that problem. “There’s a huge comfort factor in looking at an image you know is going to look the same way it is on the screen,” Semler says.

For directors, less cost pressure means more creative freedom, and compared with film stock, digital tape is almost free. “Sometimes you roll for an hour without cutting, because you can,” director Robert Rodriguez said at a recent panel discussing Grindhouse, a horror film he is co-directing with Quentin Tarantino. “You find moments there that you might lose otherwise.” Rodriguez, who often doubles as his own cinematographer, shot his last two movies digitally. “I feel like I’m wasting film if I mess up a line or if something’s not coming together,” said Rodriguez’s fellow panellist, actress Rose McGowan. But when she voiced that worry on the set of Grindhouse, she said, “the entire crew and Robert started laughing – ‘That’s old school!’”

Smoke and mirrors
The technology breakthroughs that put dinosaurs and big waves on screen a few years ago have eddied into mini-disciplines with ever-rising levels of virtuosity. “Effects have (become) more evolutionary rather than revolutionary as time has gone on,” says ILM’s Hickel. One area that is seeing continuing incremental advancement is element and particle simulation – rendering water, fire, smoke and dust with greater fidelity. For the February comic-book action flick Ghost Rider, lead actor Nicolas Cage’s head is replaced by a skull exploding with digital flames designed in a tweaked version of Maya software. “Our hero doesn’t have any eyes or lips or a tongue, so he can’t form words, and he doesn’t have any expression,” says director Mark Steven Johnson. “You can’t tell when he’s sad or vengeful. So I really wanted the fire to have a personality, to make up for what we didn’t have.” During Hollywood’s Golden Age of the 1930s and ’40s, filmmakers used entire guilds of set decorators, matte painters and other artisans to help create movie magic. Today, boutique digital-effects shops perform similar tasks.
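At its core, the element and particle simulation described above boils down to emitting, advecting and retiring thousands of small points every frame. The toy Python loop below sketches that cycle for a flame-like plume under made-up constants (buoyancy, drag, lifetimes); production simulators layer turbulence fields, temperature and volumetric shading on top of this basic structure.

```python
import random

class Particle:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx = random.uniform(-0.2, 0.2)   # slight sideways drift
        self.vy = random.uniform(1.0, 2.0)    # hot gas starts by rising
        self.life = random.uniform(0.5, 1.5)  # seconds before the flame fades

def step(particles, dt=1 / 24, buoyancy=0.5, emit_at=(0.0, 0.0), emit_count=50):
    """Advance a toy flame by one film frame (illustrative constants only).

    Emits new particles at the base of the flame, applies buoyancy and drag,
    integrates positions, and retires particles whose lifetime has expired.
    """
    particles.extend(Particle(*emit_at) for _ in range(emit_count))
    survivors = []
    for p in particles:
        p.vy += buoyancy * dt   # hot air keeps accelerating upward
        p.vx *= 0.98            # drag damps sideways motion
        p.x += p.vx * dt
        p.y += p.vy * dt
        p.life -= dt
        if p.life > 0:
            survivors.append(p)
    return survivors
```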

“God is in the details,” Dykstra says. “You get into this business of who’s producing the most realistic skin, or the most realistic sky, or the most realistic field of battling armies. The ability to create images that are indistinguishable from reality has truly opened a Pandora’s box. In a good way.”

Hybride, the outfit simulating wolf drool for 300, is best known for rendering stylised digital backdrops, such as those it created for Rodriguez’s 2005 movie Sin City. That film’s dark comic-book atmospheres blended the live action of the movie with the raw visual approach of graphic novelist Frank Miller, who also wrote the book upon which 300 is based.

To design 300’s digital backdrops, Hybride artists needed to combine obsessive attention to detail with a deliberately artistic – as opposed to realistic – visual aesthetic. To blend footage of the actors into the surreal backgrounds, the artists employed a complex process. First, 3D tracking was used to map out virtual camera angles corresponding to the real camera’s movement. Once the virtual and real camera angles had been matched, the computer-generated imagery was “shot” from the right angle and dropped into the scene, ensuring that warriors wouldn’t be hidden by the enemy, the terrain or the odd lethal projectile.
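The camera-matching step can be pictured as a pinhole projection: once 3D tracking has solved the real camera’s position, rotation and focal length, any virtual object can be projected into the plate at the right spot and depth-tested against the live elements so nothing pokes through where it shouldn’t. The Python sketch below shows that projection with assumed, simplified camera parameters; it is a stand-in for the tracking and rendering packages an effects house would actually use.

```python
import numpy as np

def project_point(point_world, camera_rotation, camera_position,
                  focal_px, principal_point):
    """Project a 3D point through a matchmoved camera to pixel coordinates.

    camera_rotation (3x3) and camera_position (3,) would come from the 3D
    tracking solve; focal_px and principal_point describe an idealised
    pinhole camera. Returns (pixel, depth) or (None, None) if the point is
    behind the camera.
    """
    # Transform from world space into the camera's coordinate frame.
    p_cam = camera_rotation @ (np.asarray(point_world) - np.asarray(camera_position))
    if p_cam[2] <= 0:
        return None, None
    # Pinhole projection onto the image plane.
    u = focal_px * p_cam[0] / p_cam[2] + principal_point[0]
    v = focal_px * p_cam[1] / p_cam[2] + principal_point[1]
    return (u, v), p_cam[2]

def cg_is_visible(pixel, depth, plate_depth):
    """Depth-test a CG sample against the live plate (pixel assumed in range)."""
    u, v = int(round(pixel[0])), int(round(pixel[1]))
    return depth < plate_depth[v, u]  # nearer than the photographed scene: draw it
```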

The process was just the first step in a branching and converging stream of CGI work that included modelling sets, modelling characters and rigging them for animation, then adding texture and lighting flourishes to all of it. In a downstairs conference room at Hybride’s headquarters, effects artists review a shot that’s nearing completion. On a small movie screen, the Spartan king cradles a young war casualty, his sombre troops clustered around them. In the background, digitally rendered flames flicker on smoky, expressionistic, combat-ravaged digital hillsides.

“We’ve asked (Hybride) to give us art, not just reality,” says director Zack Snyder. “It’s hard because it’s subjective. One man’s art is another man’s screw-up.” To that end, work on the digital wolf continues. In a number of shots, the wolf still shows up as a wire-frame construct; in others, he looks like some bizarre alabaster lawn ornament – the fur has been left off for the time being to allow the artists to focus on the movements of the animal’s musculature. Watts is on that case, but he’s also finessing other fine points, such as making sure the creature’s breath is shown, highlighting the scene’s frigid conditions. “Wars have been waged over the breath,” he jokes. “The stuff that we argue about is so beyond the realm of what normal people ever worry about.” In the new digital age, every pixel counts.

10 scenes that changed movie history
Today, many digital effects are so subtle that movie audiences often don’t notice them – but it wasn’t always so. We asked industry insiders to pinpoint the biggest breakthroughs in digital F/X history.
