How many times has the following happened to you? You're in a room watching a band (or perhaps a public speaker), and you notice that the sound of an instrument or voice is coming from the P.A., not from the physical location of the source. Sometimes it's subtle, but sometimes you think: "Why am I seeing this guy directly in front of me but hearing his voice to my left?" It's such an unnatural phenomenon that it distracts from the delivery of the musical content or message. We all share the pursuit of a realistic listening experience, whether in studio or onstage. Back in the dark ages we had mono, which led to stereo, quad and X.1 surround (pick your favorite for the day). Unfortunately, listening in the real world is not a 1-, 2-, 5- or 7-channel experience. It's infinitely more complex: the sounds we hear every day (musical or otherwise) emanate from more than just a fixed number of locations. In fact, if you tried to trace the sonic path of a simple sound in a room, you'd find that the additional paths created by reflections from walls, floor and ceiling are far more complex than the sound that reaches you directly from the source.
In addition to simple changes in volume, it's these complex paths that give humans (and most animals) the ability to localize sound. Most of us hear binaurally, i.e., we have two ears. Typically, sound is received by both ears, but it may be louder in one ear than the other, or it may arrive at one ear earlier than the other. Your brain processes these differences based upon the (hopefully not empty) space between your ears, then figures out the direction from which the sound originates. The brain does this for free, oodles of times each day. (You can prove this to a friend with a simple experiment. Have your friend close their eyes and point at you while you say their name from different parts of a room. They will be able to locate you. Then have them stick a finger, drumstick or cactus in one ear, and repeat the process. They'll point in the wrong direction. Really.)
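If you want to put rough numbers on that trick, here's a quick sketch in Python. The ear spacing and the simplified sine model are assumptions for illustration, not measurements of anyone's head; it just turns an interaural time difference into an estimated direction.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, roughly, at room temperature
EAR_SPACING = 0.18       # m, an assumed average distance between the ears

def direction_from_itd(itd_seconds):
    """Estimate where a sound came from (0 = straight ahead, 90 = hard right)
    based on how much earlier it reached the right ear than the left.
    Uses the simplified sine model: ITD = (EAR_SPACING / c) * sin(angle)."""
    max_itd = EAR_SPACING / SPEED_OF_SOUND          # about 0.52 ms
    ratio = max(-1.0, min(1.0, itd_seconds / max_itd))
    return math.degrees(math.asin(ratio))

# A sound that reaches the right ear 0.3 ms before the left:
print(round(direction_from_itd(0.0003), 1), "degrees to the right")
```

Even a fraction of a millisecond of head start is enough for the brain to swing the image well off-center.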
Alas, there is no way to stop your brain from making these sonic calculations, so when faced with the scenario presented in the first paragraph, your brain concludes: "I see the guitar player on my right but I hear her on my left, therefore this is not a realistic experience." If you're lucky enough to be exactly on the centerline between the speakers (be there two, four or more), then maybe you'll hear what you see, but the vast majority of our audience (and often the engineer) does not have the luxury of sitting in the "sweet spot." We've come to accept this limitation, which is a shame.
When people perform on stage we typically use a stereo P.A. system, but the P.A. system itself prevents us from properly localizing a sound source due to what is known as the "Haas Effect." In 1949, Dr. Helmut Haas discovered that when a sound comes from two or more speakers, the listener perceives the sound as coming from the closer speaker–even if the more distant speaker is louder (thus the importance of listening from center). Not only does your brain register this as unnatural, but it becomes a distraction that impedes your enjoyment. What a drag.
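To see how little it takes, the sketch below (with made-up speaker and listener positions) works out how much earlier the near stack's sound arrives for someone sitting well off to one side. A few milliseconds is all the precedence effect needs.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def arrival_ms(speaker_xy, listener_xy):
    """Travel time, in milliseconds, from a speaker to a listener."""
    return math.dist(speaker_xy, listener_xy) / SPEED_OF_SOUND * 1000

# Hypothetical room, in meters: P.A. stacks left and right of the stage,
# and a listener sitting well off to the left.
left_pa, right_pa = (-6.0, 0.0), (6.0, 0.0)
listener = (-5.0, 12.0)

head_start = arrival_ms(right_pa, listener) - arrival_ms(left_pa, listener)
print(f"The left stack arrives {head_start:.1f} ms before the right stack.")
# For this seat the left stack wins the localization race, even if the
# right stack is running a few dB hotter.
```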
Enter Out Board TiMax
Recognizing that the majority of any audience is not on the center line between the speakers, Out Board developed TiMax, a "Source-Oriented Reinforcement" (SOR) system which achieves two distinct goals: 1–even distribution of sound level over a large listening area and 2–accurate directional information for multiple sound sources, so that the sonic position of a performer can match the visual location no matter where you're sitting. The result is a reduction of stress on the brain (seriously), decreased listener fatigue, improved intelligibility, enhanced delivery of the "message" (whatever that may be) and a sweet spot widened to include more than 90 percent of the audience.
TiMax's SOR employs a multichannel audio matrix and a proprietary DSP that gives each audio source a unique level and delay setting with respect to every loudspeaker in the room. This allows every sound source to be independently and identically localized for each audience member, so sound appears to come only from the performer even though it is also coming from a loudspeaker.
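To picture what a level-and-delay matrix does (this is a generic sketch, not a peek inside TiMax's actual DSP), imagine building each loudspeaker feed by giving every input its own gain and delay on the way to that particular box:

```python
import numpy as np

def render_speaker_feed(sources, crosspoints, sample_rate=48000):
    """Mix one loudspeaker feed from several sources.

    sources:     dict of source name -> mono signal (numpy array)
    crosspoints: dict of source name -> (gain, delay_seconds), i.e. one
                 matrix crosspoint per source for this particular output
    """
    longest = max(len(sig) for sig in sources.values())
    feed = np.zeros(longest + sample_rate)  # headroom for up to 1 s of delay
    for name, signal in sources.items():
        gain, delay = crosspoints[name]
        offset = int(round(delay * sample_rate))
        feed[offset:offset + len(signal)] += gain * signal
    return feed

# Hypothetical crosspoints for one speaker: the vocal is sent a little
# louder and later than the guitar, so each source images independently.
inputs = {"vocal": np.random.randn(48000), "guitar": np.random.randn(48000)}
feed = render_speaker_feed(inputs, {"vocal": (0.8, 0.015), "guitar": (0.5, 0.022)})
```

Repeat that for every output in the rig and you have one gain and one delay per source, per loudspeaker, which is the knob set the system is adjusting.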
TiMax Matrix Processor hardware is available in modular and non-nodular formats, both with analog I/O standard and AES/EBU digital I/O as an option. Systems range from 8 in/8 out up to 32 in/32 out. All systems come bundled with ShowControl software and a proprietary PCI card for your PC. Out Board also offers the ShowControlPC (a rackmount PC for touring applications) and the SoundTablet, a cue-driven sound effects editing package.
The basic objective of TiMax is a simple psychoacoustic trick: Make sure that every audience member hears the natural acoustic wavefront from each performer 10 to 20 milliseconds before they hear the amplified sound from the P.A. As long as the P.A.'s delayed arrival falls within this window, the brain will integrate the two sounds, causing the listener to locate the sound in the direction of the earlier arrival.
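In practice that means adding delay to each loudspeaker feed so the amplified sound always trails the acoustic sound from the stage. A back-of-the-envelope version of the arithmetic (assuming straight-line propagation and made-up distances) looks something like this:

```python
SPEED_OF_SOUND = 343.0  # m/s

def pa_delay_ms(performer_to_listener_m, speaker_to_listener_m, lead_ms=15.0):
    """Delay to insert in a loudspeaker feed so the natural wavefront from
    the performer arrives roughly 10-20 ms ahead of the amplified sound.
    lead_ms is the target head start for the acoustic sound."""
    acoustic_ms = performer_to_listener_m / SPEED_OF_SOUND * 1000
    speaker_ms = speaker_to_listener_m / SPEED_OF_SOUND * 1000
    return max(0.0, acoustic_ms - speaker_ms + lead_ms)

# A listener 20 m from the performer but only 8 m from the nearest speaker:
print(round(pa_delay_ms(20.0, 8.0), 1), "ms of delay for that speaker feed")
```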
To achieve this deceptively simple goal, TiMax creates multiple "Image Definitions" or unique delay relationships between every source (e.g. microphone) and every loudspeaker. Of course, the problem is that these delays change every time a performer moves across the stage, so TiMax must recognize these changes and adjust the parameters.
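One way to picture an Image Definition (again, a simplified sketch rather than TiMax's internal format) is as a little table of per-loudspeaker delays stored for each stage zone a performer might occupy, ready to be recalled whenever the performer moves:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

# Hypothetical loudspeaker and stage-zone positions, in meters.
SPEAKERS = {"left": (-6.0, 0.0), "center": (0.0, 0.0), "right": (6.0, 0.0)}
ZONES = {"stage_left": (-4.0, 1.0), "center_stage": (0.0, 1.0), "stage_right": (4.0, 1.0)}

def image_definition(zone_xy, listener_xy=(0.0, 15.0), lead_ms=15.0):
    """Per-loudspeaker delays (ms) that keep every speaker's sound trailing
    the performer's acoustic sound at a reference listening position."""
    performer_ms = math.dist(zone_xy, listener_xy) / SPEED_OF_SOUND * 1000
    return {
        name: round(max(0.0, performer_ms
                        - math.dist(spk_xy, listener_xy) / SPEED_OF_SOUND * 1000
                        + lead_ms), 1)
        for name, spk_xy in SPEAKERS.items()
    }

# A different set of delays gets recalled every time the performer changes zone.
for zone, position in ZONES.items():
    print(zone, image_definition(position))
```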
In earlier TiMax systems, performer movements were rehearsed, a series of TiMax cues was created, and an engineer could scroll through the cues during the show–much like any audio scene change. Recently, Out Board has been working with a Norwegian company called Track The Actors (TTA) to automate the process. Each performer wears a small radio tag that communicates positional data in real time to the TTA software. The software sends MIDI messages to the ShowControl software, which then converts them into level and delay instructions for the TiMax delay matrix, placing the audio image of the performer in the appropriate zones. The net result–whether you are sitting on the far right of the room or directly in front of the performer–is that the audio location of the performer matches the visual location! Initial applications for TiMax have been in arena opera and theatre productions that require sound reinforcement without distraction, but it's easy to see the possibilities in just about any type of live performance.
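As a purely hypothetical illustration of the tracking half of that chain (the actual TTA and ShowControl message formats aren't described here), the core logic amounts to snapping each tag's reported position to the nearest predefined zone and recalling that zone's cue:

```python
import math

# Hypothetical stage zones (meters) and the cue assigned to each one.
ZONE_CUES = {
    (-4.0, 1.0): "cue 101 (stage left)",
    (0.0, 1.0): "cue 102 (center stage)",
    (4.0, 1.0): "cue 103 (stage right)",
}

def cue_for_position(tracked_xy):
    """Snap a performer's tracked position to the nearest zone's cue."""
    nearest = min(ZONE_CUES, key=lambda zone: math.dist(zone, tracked_xy))
    return ZONE_CUES[nearest]

# As the radio tag reports fresh positions, the matching cue (and the level
# and delay settings stored in it) would be recalled automatically.
for position in [(-3.2, 0.8), (0.5, 1.4), (3.9, 1.1)]:
    print(position, "->", cue_for_position(position))
```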
Steve La Cerra is the tour manager and FOH engineer for Blue Oyster Cult. He's trying to figure out how to be in two locations simultaneously and can be reached via e-mail at Woody@fohonline.com