Impedance matching was a prime teaching topic in decades past. The reason for learning it was that many sound sources, signal processors, consoles, and amplifiers were borrowed from fields other than sound reinforcement, so knowing the input and output impedances was critical to getting the whole system to play nicely together.
Back in the 1960s and 1970s, you could have a home hi-fi tape player patched into a borrowed recording studio console driving an install PA power amplifier into shop-made live sound speakers. Each class of gear had its own ad-hoc standards for line levels and input/output impedances.
The formal evolution of sound reinforcement came out of the telephone industry, followed by the “talking” motion picture industry, which adopted telephone industry standards and practices. Back then, everything was referenced to 600-ohm transmission line impedances, with the exception of a few 150-ohm microphone line impedances. Since all the electronics were built with vacuum tubes, input and output transformers were used on all audio circuits to bring things back to 600 ohms for efficient long-wire transmission.
This early era of audio made the most use of impedance matching because vacuum tubes were relatively expensive and the most reliability-challenged part of audio systems. To preserve every bit of audio power, the “power matching” technique of creating 600-ohm source impedances to drive 600-ohm load impedances was standard design practice. The upside to this practice was that everything was compatible with everything else. The downside was that each interface burned half its power in the source impedance and delivered only half the open-circuit voltage, so minimizing circuitry was very important.
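To put a number on that loss: the matched source and load form a simple voltage divider, so the load sees only half the open-circuit voltage and half the power is burned in the source impedance. Here is a minimal sketch of the arithmetic, using an arbitrary 1-volt source:

```python
# Voltage-divider math for a matched 600-ohm interface.
R_SOURCE = 600.0  # ohms, source (output) impedance
R_LOAD = 600.0    # ohms, load (input) impedance
V_OPEN = 1.0      # volts RMS, open-circuit source voltage (arbitrary)

v_load = V_OPEN * R_LOAD / (R_SOURCE + R_LOAD)  # half the voltage
p_load = v_load**2 / R_LOAD                     # power delivered to the load
p_source = (V_OPEN - v_load)**2 / R_SOURCE      # power burned in the source

print(f"load voltage: {v_load:.3f} V")                                  # 0.500 V
print(f"power split:  {p_load/(p_load+p_source):.0%} reaches the load")  # 50%
```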
From dBms to dBus
When lower-cost circuitry began to appear in the 1970s, the idea of not counting the tube or transistor stages, and of getting more flexibility, won out. The concept of impedance matching shifted away from identical impedances for power matching: lower source impedances driving higher input impedances became normal practice. So the 1 milliwatt at 600 ohms that was 0.775 volts RMS (0 dBm) lost its 600-ohm reference and became 0.775 volts RMS unreferenced (dBu). And to keep a 600-ohm input impedance from losing too much of its voltage, the drive circuit had to have a 150-ohm or lower source impedance by decree.
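The 0.775-volt figure falls straight out of the power formula, since 1 milliwatt into 600 ohms gives V = sqrt(0.001 × 600) ≈ 0.7746 volts RMS. A short sketch of both references (these are the standard definitions, nothing gear-specific):

```python
import math

# 0 dBm: 1 milliwatt dissipated in a 600-ohm load (the old telephone reference).
v_ref = math.sqrt(0.001 * 600)                   # V = sqrt(P * R) ~= 0.7746 V RMS
print(f"0 dBm into 600 ohms: {v_ref:.4f} V RMS")

# dBu keeps the same 0.775-volt reference but drops the 600-ohm requirement.
def dbu(v_rms: float) -> float:
    """Level in dBu for a given RMS voltage; impedance no longer matters."""
    return 20 * math.log10(v_rms / 0.7746)

print(f"0.7746 V RMS = {dbu(0.7746):+.1f} dBu")  # +0.0 dBu
print(f"1.228 V RMS  = {dbu(1.228):+.1f} dBu")   # ~+4 dBu, a common pro line level
```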
But matching transformers had become a habit too expensive for high fidelity audio circuits, especially when vacuum tubes and transistors (and integrated circuits) were getting really inexpensive by comparison. As a result, audio impedance standards began to rise to many thousands of ohms, allowing direct coupling of audio gear. In tube equipment, source impedances around 100k ohms driving 470k-ohm or higher input impedances in the next stage became widely accepted. Today, most vacuum tube guitar amplifiers have a standard 1-megohm input impedance jack for the 250k-ohm instruments.
All this high-impedance matching followed the old-school rule of a minimum one-to-four ratio of source impedance to load impedance. In Figure 1, the 220k-ohm plate resistor of the 12AX7 tube amplifier circuit uses a 0.02-microfarad coupling capacitor to couple the output voltage into a 1-megohm grid-level control in the next tube stage. This practice is followed less today, as most equipment is now solid-state. Note that you can always lower the source/drive impedance or raise the load impedance for an even better match that loses less voltage (fewer dBu).
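For the Figure 1 values, the bridging loss and the coupling capacitor's low-frequency rolloff are both quick to estimate. A rough sketch, treating the 220k-ohm plate resistor alone as the source impedance (the tube's own plate resistance would actually lower that figure somewhat):

```python
import math

R_SRC = 220e3  # ohms, 12AX7 plate resistor (ignoring the tube's plate resistance)
R_LOAD = 1e6   # ohms, grid-level control of the next stage
C = 0.02e-6    # farads, coupling capacitor

# Bridging loss of the roughly 1:4.5 divider.
gain = R_LOAD / (R_SRC + R_LOAD)
print(f"divider gain: {gain:.2f} ({20*math.log10(gain):.1f} dB)")  # ~0.82, -1.7 dB

# The capacitor and the total loop resistance form a high-pass filter.
f_corner = 1 / (2 * math.pi * (R_SRC + R_LOAD) * C)
print(f"low-frequency -3 dB corner: {f_corner:.1f} Hz")  # ~6.5 Hz
```

So even the old one-to-four rule only gives up about 1.7 dB of signal, and the coupling capacitor's rolloff sits well below the audio band.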
When transistorized circuitry became the usual practice, the impedance matching technique became even less power matched, with a one-to-ten ratio taught in most technical and engineering schools. Figure 2 shows this with a common PN2222 transistor: a 10k-ohm collector resistor drives a 1-microfarad coupling capacitor into a 100k-ohm volume potentiometer, ten times the collector resistor value. Given that the coupling capacitor's impedance is fairly low across the audio frequency band, nearly all the source voltage makes it into the next stage's input circuits. And this still holds if jacks and a cable sit between the capacitor and the next stage.
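Running the same two formulas on the Figure 2 values shows why nearly all the voltage survives, again with the simplifying assumption that the 10k-ohm collector resistor is the whole source impedance:

```python
import math

R_SRC = 10e3    # ohms, PN2222 collector resistor (taken as the source impedance)
R_LOAD = 100e3  # ohms, volume potentiometer of the next stage
C = 1e-6        # farads, coupling capacitor

gain = R_LOAD / (R_SRC + R_LOAD)
print(f"divider gain: {gain:.2f} ({20*math.log10(gain):.2f} dB)")  # ~0.91, -0.8 dB

f_corner = 1 / (2 * math.pi * (R_SRC + R_LOAD) * C)
print(f"low-frequency -3 dB corner: {f_corner:.2f} Hz")  # ~1.4 Hz
```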
The Modern Era
Today we are spoiled rotten with industry-dedicated gear using XLR or TRS phone connectors, uber-perfect balanced cabling, and low-noise circuitry everywhere. Just about all output impedances are 100 ohms or less, and most input impedances are 6k ohms or more. These standard combinations take advantage of integrated circuits or dedicated drivers to handle all kinds of cable interconnections and provide nearly hum-free (ground-loop-free) audio paths. Figure 3 shows a typical low-cost integrated circuit driving a 47-ohm safety resistor (to prevent cable shorts from blowing up the circuit) and a 10-microfarad coupling capacitor into a cable toward the 10k-ohm input resistor of the next piece of gear.
Looking back from the cable in Figure 3, the output impedance is typically 100 ohms or less: the 47-ohm safety resistor plus roughly 50 ohms of internal output impedance from the TL071 integrated circuit. And since the feedback network around the TL071 (its 22k-ohm feedback resistor) drives the chip's effective output impedance toward zero, by some standards the 47-ohm resistor is the real output impedance. The bottom line is that today's modern circuitry makes impedance matching an often-neglected thought, even for audio grandpappys like me.
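Running those same numbers on the Figure 3 interface, and taking the roughly 50-ohm internal figure above at face value, shows just how little a modern bridged connection gives away:

```python
import math

R_CHIP = 50.0    # ohms, assumed internal output impedance of the op-amp stage
R_SAFETY = 47.0  # ohms, series safety resistor against shorted cables
R_LOAD = 10e3    # ohms, input resistor of the next piece of gear
C = 10e-6        # farads, coupling capacitor

r_source = R_CHIP + R_SAFETY  # ~97 ohms total, under the 100-ohm rule of thumb
gain = R_LOAD / (r_source + R_LOAD)
print(f"source impedance: {r_source:.0f} ohms")
print(f"divider gain: {gain:.3f} ({20*math.log10(gain):.2f} dB)")  # ~0.990, -0.08 dB

f_corner = 1 / (2 * math.pi * (r_source + R_LOAD) * C)
print(f"low-frequency -3 dB corner: {f_corner:.2f} Hz")  # ~1.6 Hz
```

Less than a tenth of a dB lost at the interface, with flat response down to a couple of hertz. No wonder nobody has to think about impedance matching anymore.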