How do you measure a headphone amplifier's output impedance?
Oct 21, 2014 at 7:03 PM Thread Starter Post #1 of 8

miceblue

The title of the thread explains it all.

Do I need expensive measuring equipment to measure it, or can I just use a typical household digital multimeter? I saw a video:

http://youtu.be/ieAhBejHe2M?t=6m43s


But how do I attach known loads to the amplifier through the headphone jack? And how would I measure the output voltage?
 
Oct 21, 2014 at 7:16 PM Post #2 of 8
The minimum you need to do this is a true-RMS digital voltmeter, a low-output-impedance signal generator, and a resistor, typically 10 to 50 ohms.

Use the signal generator to feed a signal through the resistor back into the output of the amplifier. Measure the RMS voltage on both sides of the resistor and calculate the amplifier's output impedance from the two readings.

If the amplifier has a high damping factor (i.e., a very low output impedance), the voltage on the amplifier side will be small, so you will need a signal generator that can put out several watts of power, and the resistor needs to be rated for several watts.
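The arithmetic behind that back-feed method can be sketched in a few lines of Python (a minimal illustration; the function name and sample values are mine, not from the post):

```python
def output_impedance_backfeed(v_gen_side, v_amp_side, r_series):
    """Back-feed method: a signal generator drives a current through
    r_series into the (otherwise idle) amplifier output.
    The current is set by the voltage across the resistor; the voltage
    remaining on the amp side is the drop across the amplifier's
    output impedance, so Zout = V_amp / I."""
    current = (v_gen_side - v_amp_side) / r_series
    return v_amp_side / current

# Example: 1.000 V RMS on the generator side, 0.010 V RMS on the amp
# side, through a 33-ohm resistor:
z = output_impedance_backfeed(1.000, 0.010, 33.0)
print(round(z, 3))  # about 0.333 ohms
```

Note how small the amp-side voltage is for a low-impedance output; that is why the generator and resistor need real power handling.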
 
Oct 21, 2014 at 8:13 PM Post #3 of 8
(quoting post #2 above)

Okay, that sounds pretty straightforward. Would the output impedance be R2 in a typical resistor voltage divider circuit then?

I can get access to resistors, a signal generator, and a voltmeter, but I'm still wondering how I would attach the resistor to the amplifier's 3.5 mm TRS headphone output jack.
 
Oct 21, 2014 at 9:59 PM Post #4 of 8
You can send in a tone (pure sine wave) and measure the RMS voltage at the output with nothing connected. That's the open-circuit voltage. Then connect a resistor across the output and measure the voltage across it. The difference between the open-circuit voltage and the loaded voltage is the drop across the amp's internal (output) impedance, since the output impedance and the load resistor form a voltage divider. So if the voltage across the resistor is 1 V and the open-circuit voltage is 2 V, the internal drop is 1 V, which means the output impedance equals the resistor's value. For the connection, you can use a male plug and cut away part of the cable to get easy access to ground and one of the channels with clips.
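That divider reasoning reduces to one line of arithmetic (a sketch in Python; the function name is mine):

```python
def output_impedance_from_divider(v_open, v_loaded, r_load):
    """Open-circuit vs. loaded measurement: Zout and r_load divide
    v_open, so the internal drop is (v_open - v_loaded) and
    Zout / r_load = (v_open - v_loaded) / v_loaded."""
    return r_load * (v_open - v_loaded) / v_loaded

# The example from the post: 2 V open circuit, 1 V across the load
# resistor means Zout equals the resistor's value.
print(output_impedance_from_divider(2.0, 1.0, 16.0))  # 16.0
```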
 
Oct 22, 2014 at 9:33 AM Post #5 of 8
That is not the standard method of measuring damping factor used in the industry, and it probably gives different answers, especially on high-power amplifiers, where you would have to load the amplifier with a resistance far lower than the intended load.

What you can do instead is use one amplifier channel to drive the other through a resistor, with a sound card as the signal generator. You can even use the sound card to record both the driving and the driven channel. Get a TRS connector and solder wires to it.
 
Oct 23, 2014 at 2:18 PM Post #6 of 8
You could use the 90% method. That way you aren't as likely to overload the amp.
 
It is explained on this page:
 
http://www.sengpielaudio.com/calculator-InputOutputImpedance.htm
 
Internal resistance of a power amplifier ("measuring the output impedance by means of a burden"): Suppose you have a 100 W amplifier with a loudspeaker impedance of 8 ohms. The output voltage at half power (P = 50 W) is V = √(P × R) = √(50 × 8) = 20 volts. (You can also just use 10 V.) Feed a 1 kHz sine into the amplifier input until you get 20 volts at the unloaded output. Now apply the "90% method": connect a load resistance R across the output and adjust it until 90% of the open-circuit voltage remains, in this case 18 volts. The internal resistance is then:

The 90% method
Rinternal = R / 9
 
This is no different from SilverEars' method, just explained differently, and it shows you don't need to halve the open-circuit voltage under load for the measurement to be useful.
 
This way you only need a good multimeter and a resistor to do the test. It may not be the industry-standard way to measure, but it should be plenty close enough for most purposes.
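As a quick sanity check of the R/9 shortcut (a sketch in Python; the values follow the quoted example, scaled to a 9-ohm burden):

```python
# Divider math: R_internal = R * (V_open - V_loaded) / V_loaded.
# With V_loaded fixed at 90% of V_open, the ratio collapses to 1/9:
v_open, r = 20.0, 9.0          # 20 V open circuit, 9-ohm burden
v_loaded = 0.9 * v_open        # the 90% condition: 18 V
r_internal = r * (v_open - v_loaded) / v_loaded
print(r_internal)              # 1.0, i.e. exactly R / 9
```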
 
 
Oct 29, 2014 at 3:42 AM Post #7 of 8
(quoting post #2 above)

How much would that type of voltmeter cost?
 
Oct 29, 2014 at 6:06 AM Post #8 of 8
It does not have to be very accurate in terms of absolute levels, as it is only the ratio of voltages that matters in this case. For the same reason, true RMS is not important either, and you are only going to test with sine waves anyway. For example, I have a cheap and old multimeter that does not support true RMS measurement, and it works decently (maybe ~1% variation) with pure sine waves from about 30 Hz to 2000 Hz, and even from 20 Hz to 20 kHz the frequency response error is within 1 dB. Your mileage may vary, though, and I think cheap true RMS capable multimeters are the ones that tend to have a limited usable frequency range. In any case, as long as you can measure ratios of AC voltages accurately enough, it should be fine.
 
You can also use a sound card or external A/D converter, as these usually have good linearity now, they are just not calibrated for accurate absolute levels, and have limited voltage range. They are more likely to interfere with the measured amplifier's performance, though, for example by having a low input impedance of only a few kiloohms (which should be taken into account when the amplifier output impedance is very high, like 100 ohms or more), or by creating ground loops.
 
For headphone amplifiers, the simpler method suggested by SilverEars can actually be OK, since it is not important to measure <1 ohm output impedances very accurately. In the case of speaker amplifiers, it is more important, as having damping factors in the range of hundreds or higher with 4-8 ohm loads is a marketable feature. But if you do care about accuracy in the milliohm range, you also need to be careful about cable and connector impedance.
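As an illustration of that input-impedance caveat, here is a sketch of the correction when the measuring device loads the amplifier during both readings (assuming an ideal source behind Zout; the function name and sample values are mine, not from the thread):

```python
def z_out_corrected(v_open, v_loaded, r_load, z_in):
    """Output impedance when the meter's or sound card's input
    impedance z_in loads the amplifier during BOTH measurements.
    Model (ideal source Vs behind Zout):
        v_open   = Vs * z_in  / (z_in  + Zout)
        v_loaded = Vs * r_eff / (r_eff + Zout),  r_eff = r_load || z_in
    Solving the ratio v_open / v_loaded for Zout gives the line below.
    As z_in -> infinity this reduces to the simple divider formula."""
    r_eff = r_load * z_in / (r_load + z_in)
    ratio = v_open / v_loaded
    return z_in * r_eff * (ratio - 1.0) / (z_in - ratio * r_eff)

# Round trip: simulate a 100-ohm output driving a 33-ohm test resistor,
# read through a sound card with a 5 kohm input impedance, then recover
# the output impedance from the two (loaded) voltage readings.
z_out, z_in, r = 100.0, 5000.0, 33.0
r_eff = r * z_in / (r + z_in)
v_open = z_in / (z_in + z_out)          # with Vs = 1 V
v_loaded = r_eff / (r_eff + z_out)
print(round(z_out_corrected(v_open, v_loaded, r, z_in), 6))  # 100.0
```

For a headphone amp with Zout well below the sound card's input impedance, the correction is negligible and the simple divider formula is fine.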
 
