If it seemed to you five years ago that big-budget moviemakers were suddenly trying to blow you out of your seat, you were right. With movie theaters equipped with new digital sound systems, there was virtually no limit to how loud a movie could be, and a lot of directors, especially of action movies, pumped up the volume.
“Mixers and editors were sometimes being bullied into making movies louder and louder to get more impact,” says Kevin O’Connell, currently mixing the ’97 summer release “Con Air.” But with the noise in theaters sometimes reaching the decibel level of a Rick Derringer comeback concert, audience enjoyment was down and complaints were up. “It was getting too painful,” adds Bruce Stambler, the Oscar-winning sound designer and co-owner of Los Angeles-based SoundStorm.
However, directors and sound designers alike have become more savvy to the possibilities of digital sound of late, and the summer of ’97 should be no season of discontent.
“Digital technology allows you to more clearly define the sound and to be much more specific with it,” says Wylie Stateman, co-founder of Hollywood-based Soundelux, also the head sound designer for “Air Force One,” starring Harrison Ford. “It’s come of age and so have we.”
Sounds of silence
Because of the widespread use of CGI and other special effects, sound is becoming more important than ever, says Stambler, who is currently wrapping sound work on this summer’s Warner Bros. release “Batman and Robin.” It’s not uncommon these days for sound people to be given 10-minute sequences with no recorded sound at all. In “Batman’s” second act, for example, Stambler and his crew had to record and layer sound for a 7-minute motorcycle chase that was shot completely MOS (without sound).
The bikes used by the actors were dressed-up two-cycle dirt bikes that make an annoying, unusable sound — part chainsaw, part model airplane. So Stambler spent a week at the Agua Dulce airpark near Palmdale, Calif., racing six Harley-Davidsons, Hondas and Triumphs up and down the empty runways at speeds approaching 140 mph and recording the resultant sound from various points of view.
Back at the studio, it took a week to organize takes and load them into digital audio workstations and another week for two editors to cut the bike sounds, crashes, gunshots and other sounds to the pictures.
For Gary Rydstrom, sound designer at Skywalker Sound in San Rafael, Calif., the challenge has been even greater. Working on “The Lost World,” Steven Spielberg’s much-anticipated sequel to “Jurassic Park,” he has had not only to fill huge voids of silence but also to create sounds for creatures no one has ever seen or heard.
“What makes doing this sequel harder is that we all feel we have to top what we did in the first movie,” Rydstrom says. But, he adds, the audience won’t be disappointed. T-Rex will sound meaner thanks to some design changes, and there are a lot of new characters to pique audience interest. There is, for example, the Compy, a dinosaur the size of a chicken with the attitude of a Tyrannosaurus. To give it a voice, Rydstrom and his team made numerous trips to Bay Area zoos and wildlife parks to record the chirps of raccoons, the squeals of baby seals and the snuffling of Arctic foxes. By layering the sounds together and processing them through a digital sampler (where a recorded sound can literally be played up and down a piano keyboard, sharps and flats included), the Compy, by itself and in chattering groups, has come to life. “It’s amazing what adding even the smallest piece of sound can do to a scene,” Rydstrom says.
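The sampler trick described above comes down to resampling: play a recording back faster and its pitch rises, slower and it falls, which is how one animal chirp can be mapped across a whole keyboard. A minimal sketch of that idea in plain Python (an illustration only, not the tools actually used on the film):

```python
import math

def pitch_shift(samples, semitones):
    """Resample a recording so it plays back at a new pitch,
    the way a sampler maps one sound across a keyboard.
    Positive semitones = higher pitch, shorter playback."""
    ratio = 2 ** (semitones / 12)  # each semitone is a 2^(1/12) speed change
    n_out = int(len(samples) / ratio)
    out = []
    for i in range(n_out):
        pos = i * ratio                      # where this output sample falls
        lo = int(pos)                        # in the original recording
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        # linear interpolation between neighboring input samples
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out

# Play a stand-in "chirp" (a 440 Hz tone at 8 kHz) a fifth higher:
chirp = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(8000)]
higher = pitch_shift(chirp, 7)  # shorter buffer, higher pitch
```

A real sampler adds anti-aliasing filters and envelope shaping, but the keyboard mapping itself is just this speed/pitch trade.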
Far from being a burden to the industry’s top sound people, huge holes of silence (or, more precisely, minutes of CGI accompanied by nothing but a 60 Hz hum) are a creative challenge sound designers say they love to take on.
“Audiences are much more into sound than they used to be,” says John Leveque, Stambler’s partner at SoundStorm. Leveque often goes to previews of a project he’s working on and sits in the middle of the theater to gauge the audience’s reaction before going back into the studio to make the necessary adjustments. “Specific sound is as important as a specific performance from an actor,” Leveque says.
To create the sound of “Batman and Robin’s” Mr. Freeze (played by Arnold Schwarzenegger) icing an entire city with his freeze gun, Leveque had to go to some extremes. “We needed some big sounds of stuff freezing,” he recalls. Flash-freezing objects in liquid nitrogen and recording it wasn’t doing the trick; the sound was too small. So Leveque took a field trip to Big Bear, 70 miles east of Los Angeles, where a friendly park ranger steered him to a deserted lake. Getting up at the crack of dawn, before the morning sun could melt the new sheets of virgin ice formed overnight, Leveque went for a walk on the frozen lake and recorded himself as his body weight slowly cracked the ice and he fell through.
Luckily, each of the multiple times he did this, he sank only to his waist. “The squeegee sound ice makes as it’s giving way is fantastic,” Leveque says. By reversing the sound, feeding it through a digital processor and then building up about 100 layers, the freezing of Gotham has been made real.
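At the sample level, the reverse-and-layer technique described above is simple: flip the recorded buffer end to end, then sum many processed copies on top of one another. A rough sketch in Python (hypothetical names and toy data, nothing like the actual studio tools):

```python
def reverse(samples):
    """Play a recording backward by flipping the sample buffer."""
    return samples[::-1]

def layer(tracks, gains=None):
    """Mix several takes into one track by summing sample-wise,
    each take scaled by an optional gain."""
    gains = gains or [1.0] * len(tracks)
    length = max(len(t) for t in tracks)
    mix = [0.0] * length
    for track, gain in zip(tracks, gains):
        for i, sample in enumerate(track):
            mix[i] += gain * sample
    return mix

# Flip one toy "ice crack" take and stack it under the original:
crack = [0.0, 0.2, 0.9, 0.4]
bed = layer([crack, reverse(crack)], gains=[1.0, 0.5])
```

A hundred-layer build is the same summation with a hundred differently processed tracks, with gains keeping the mix from clipping.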
The summer of ’97 will see no big breakthroughs in the technology available to movie sound. Rather, industry analysts say, it’s going to be the summer when audiences will be treated to the work of sound people increasingly comfortable with the subtleties of sound design made possible by digital workstations in the studio and the six- to eight-track sound systems in most theaters.
“We can pinpoint sound 360 degrees,” Stateman says. “That allows much more definition in our work.” Most U.S. theaters these days have multiple-channel sound systems — either the eight-track Sony Dynamic Digital Sound (SDDS) system or the six-track DTS (Digital Theater Systems) and Dolby Digital systems.
With two more tracks to work with, many sound mixers prefer to mix for SDDS, but either way, designers can create a sonic environment that puts moviegoers at the center of the action.
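Placing a sound at a point between speakers is, at bottom, a gain calculation per channel. One common approach is a constant-power pan law; here is a small Python sketch of the idea for a single speaker pair (an illustration of the general technique, not how any particular film was mixed):

```python
import math

def constant_power_pan(sample, position):
    """Split one mono sample across a pair of speakers.
    position: 0.0 = all left speaker, 1.0 = all right.
    Using cos/sin gains keeps total power -- and so perceived
    loudness -- constant as the sound sweeps across the room."""
    theta = position * math.pi / 2
    return sample * math.cos(theta), sample * math.sin(theta)

# A sound placed dead center splits equally, about 0.707 per side:
left, right = constant_power_pan(1.0, 0.5)
```

Sweeping a jet from front right to back left in a six- or eight-channel theater chains the same calculation across successive speaker pairs, moving the position value over time.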
Much of this summer’s “Air Force One” and “Con Air” takes place aboard an airplane — the presidential jet in the former and a government airplane used to transport convicts across the country in the latter. To keep the audience from becoming too claustrophobic, sound is used to keep the space feeling dynamic and real.
“Con Air’s” supervising sound editor George Watters spent days flying around in a C-140 cargo plane recording hours of sound in every part of the airplane in almost every possible situation. That gave O’Connell multiple options when it came to the final mix.
When there’s some action in the tail of the plane while the audience is looking forward, they’ll hear it behind them. When there’s a struggle up front to the right of the screen, that’s where the sound will come from. And when a fighter jet flies by the presidential plane from front right to back left, that’s precisely what you’ll hear. “When you sit in the theater, you’ll be sitting in ‘Air Force One,'” O’Connell predicts.