El-cheapo measurement of amp distortion
Feb 13, 2020 at 7:46 AM Thread Starter Post #1 of 9

charleski

I'm posting this here to get some feedback on my technique. Please jump in and let me know if I've done anything wrong or made some invalid assumptions.

My source component is an LG G7, which I got late last year and am very happy with. For those not familiar with the phone, its headphone jack is powered by an ES9218P, which marries ESS's Sabre DAC with an on-chip headphone amplifier. Up until recently I've been using it to drive low-impedance IEMs and had no problem at all with the volume output, but I recently decided to resurrect my vintage (25-year-old) HD580 Jubilee cans, which are 300 ohm nominal. The phone correctly switched to high-impedance mode (it auto-switches output gain depending on the impedance sensed at the jack), but I did notice that I was having to turn the volume up higher than when using the IEMs, which left me wondering if I really needed a proper external amp to get the best out of them.

The review of the G7 on audiosciencereview gives a power output of 14mW into 300 ohms. But I've had problems finding a proper spec for the HD580's sensitivity. There's a page on Stereophile's site claiming it's '97dB' with no unit attached, though their other sensitivity specs seem to be in dB/mW, and I found another review site that stated it was '~98dB/mW' (these are all for the standard HD580 model, but AFAIK the Jubilee model should be identical in this respect). If we take it as 97dB/mW, the G7 should be able to drive these phones to a maximum of around 108.5dB, which is reasonably close to the recommendation of 110dB (that figure refers to maximum transitory peaks; obviously the average sane listening level will be 30-40dB lower, and this number just specifies maximum headroom).
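If anyone wants to check my arithmetic, here's the sum as a quick Python snippet (the 97dB/mW sensitivity and the 14mW figure are the assumptions discussed above, not confirmed specs):

Code:
import math

def max_spl(sens_db_per_mw: float, power_mw: float) -> float:
    """Max SPL from a dB/mW sensitivity rating and the available power."""
    return sens_db_per_mw + 10 * math.log10(power_mw)

# Assumed figures: 97 dB/mW (Stereophile, unit inferred) and 14 mW into 300 ohms (ASR)
print(f"{max_spl(97.0, 14.0):.1f} dB SPL")   # ~108.5 dB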

So on paper it looks fine, but I wanted to find some way to check this, if only to prevent myself giving in to audiophilia nervosa and buying a more powerful amp purely to satisfy a baseless anxiety. But I wanted to do this on the cheap, and sadly I've yet to find any bargain APx555s in the SpecialBuy section of my local Aldi. So here's the plan I came up with, which cost me a grand total of $0 for equipment.

Code:
LG G7 ---> Y-splitter ---> HD580J
                    |---> line-in on computer audio interface

I'd expect the line-in input impedance to be around 10kOhms, which shouldn't have much effect on the load seen by the amp when placed in parallel with the 300 ohm cans. I then used REW to generate a wav file with a 10Hz-22kHz sweep at -0.3dB, which should represent the maximum signal level the player might encounter. I checked that the phone was still in high-impedance mode and played this file back through Neutron at 8 different volume levels (Neutron's only fault is that its in-app control forces volume changes in fairly crude 3dB steps; the volume numbers given below are those shown in the app). Each playback sweep was recorded in Audition and saved, and the recorded audio was then imported into REW for analysis.
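As a sanity check on the "line-in barely loads the amp" assumption, here's a rough Python calculation. The 1 ohm output impedance is purely a guess to show the scale of the effect; I haven't measured the G7's actual figure:

Code:
import math

def parallel(a: float, b: float) -> float:
    return a * b / (a + b)

z_cans, z_line_in, z_out = 300.0, 10_000.0, 1.0   # ohms; z_out is a guess
z_load = parallel(z_cans, z_line_in)              # ~291 ohms with the line-in attached
level_change = 20 * math.log10((z_load / (z_load + z_out)) /
                               (z_cans / (z_cans + z_out)))
print(f"load: {z_load:.0f} ohms, level change: {level_change:.4f} dB")  # ~-0.001 dB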

Results:
Code:
Vol       THD at 1kHz
100       11.3%
93        0.0051%
87        0.0039%
80        0.0045%
73        0.0063%
67        0.0053%
60        0.0050%
53        0.0063%

Obviously full volume is completely unusable (this is something I noted before in my testing with IEMs). But anything below that is just fine, and the maximum volume I'd been using was 87, meaning the amp still had at least another 3dB available. There was some increase in THD in the bass, but this peaked at 0.013% at around 50Hz on Vol93.

It's important to note that this experiment was designed to answer a specific question: should I waste money on an external amp? The actual distortion produced by the on-chip amp will be lower than measured here, as these numbers include distortion coming from the computer's recording interface. They merely provide an upper bound to the amp's output distortion, and from Vol93 down are well below audibility.

The answer to my question is clearly No, which leaves my wallet with a smile on its face. Since an extra amp is just going to increase noise and distortion without providing any truly usable increase in headroom (unless I decide I want to commit auditory suicide), skipping it also means I've avoided degrading the sound quality.

As I said at the start, though, I thought I'd run this by you lot to see if I've done anything clearly wrong here which might affect my conclusion.
 
Feb 13, 2020 at 9:12 AM Post #2 of 9
No offense, but it appears you may be making a few assumptions that could be incorrect:
1. The MassDrop HD58X Jubilee is 150 ohms, not 300.
2. In your volume percentage vs. distortion table, what indicates that those volume percentage settings correspond to a linear dB scale? Neutron may say it's 3dB, but that's already an assumption that they're correct - much less the real dB resulting from the headphone.
3. This is a nit-pick, but your choice of maximum volume seems to indicate a setting of 93, yet at 87 you state that you only have 3dB left (leaving alone the issue of whether that really corresponds to dB).

If you were reading this: https://www.audiosciencereview.com/...o-measurement-of-lg-g7-thinq-smartphone.4468/, it appears that at 150 ohms, it can put out 20mW.

Looking at Massdrop, the spec for the HD58X Jubilee's sensitivity is that 1V at 1kHz results in 104dB. That's a roundabout way of stating that 104dB from the headphone requires 6.7mW at 1kHz (at 150 ohms). If you really want that 110dB capability for musical peaks (especially at frequencies far from 1kHz), then doubling the power for every 3dB gives a minimum of 26.7mW to reach 110dB. That's well over the maximum power measured by the review referenced above. The actual usable power is probably much lower at preferred distortion levels - as indicated with your tests.
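Spelling that out in code (this uses the 150 ohm HD58X figures I'm assuming here; exact dB math gives ~26.5mW rather than the 26.7mW you get from doubling twice, same conclusion either way):

Code:
z = 150.0                                  # ohms (HD58X figure assumed in this post)
p_104 = 1.0 ** 2 / z * 1000                # mW delivered by 1 Vrms into 150 ohms ≈ 6.7
p_110 = p_104 * 10 ** ((110 - 104) / 10)   # +6 dB needs roughly 4x the power
print(f"{p_104:.1f} mW -> 104 dB, {p_110:.1f} mW -> 110 dB")   # 6.7 mW -> 26.5 mW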

According to that testing/review above, the voltage may also be severely limited in that phone. High-impedance headphones need more voltage swing from an amp; 150 ohms is less demanding than 300 ohms in that respect, but it may still be choking on peak passages.

JMHO, but I think it's pretty easy to conclude that you need an external amp.
 
Feb 13, 2020 at 2:34 PM Post #3 of 9
No offense, but it appears you may be making a few assumptions that could be incorrect:
1. The MassDrop HD58X Jubilee is 150 ohms, not 300.
No, this is the original HD580 Jubilee, from 1995 (I said it was vintage). This was a modified version of the standard HD580 with a carbon-fibre shell and headband and very similar to the HD600 released a year or two later. There’s a thread on it and it’s clearly a 300ohm set.
2. In your volume percentage vs. distortion table, what indicates that those volume percentage settings correspond to a linear dB scale? Neutron may say it's 3dB, but that's already an assumption that they're correct - much less the real dB resulting from the headphone.
Neutron says nothing about dB and just gives numbers on a 0-100 scale, but I’ve measured the output at the different volume levels and each step is roughly 3dB.
3. This is a nit-pick, but your choice of maximum volume seems to indicate a setting of 93, yet at 87 you state that you only have 3dB left (leaving alone the issue of whether that really corresponds to dB).
A volume setting of 93 is roughly 3dB louder than at 87 (I think the actual figure is 2.9 or so). Again, this has been confirmed by measurement.
The actual usable power is probably much lower at preferred distortion levels - as indicated with your tests.
With monolithic amps like this, clipping is a brick-wall phenomenon. The amp is fine until it suddenly isn’t. There’s nothing to be gained by running below the threshold; in fact, THD+N figures are typically lowest right before clipping sets in (though obviously you want to set the volume to a comfortable level).

According to that testing/review above, the voltage may also be severely limited in that phone. High-impedance headphones need more voltage swing from an amp; 150 ohms is less demanding than 300 ohms in that respect, but it may still be choking on peak passages.
The one problem with that review is that he didn’t account for the phone’s auto-sensing function. If you want the details on that it’s all covered in the LG V30 thread in the source forum, but it’s not really relevant here. The phone is specced to go to a max of 2V, and it can do that with 300ohm phones (as shown by the power he got into 300ohms: sqrt(0.014*300) = 2.05).

JMHO, but I think it's pretty easy to conclude that you need an external amp.
I think you’re just confused about the impedance. But more importantly the numbers I got show that a full-scale signal played at the loudest setting I found comfortable (87) showed no sign of clipping. So no.
 
Feb 14, 2020 at 9:49 AM Post #4 of 9
Yes, I'm familiar with the original HD580 Jubilee. I bought the original HD580 when it was NEW and still have it, along with the 600, 650, and 800. The HD580 Jubilee was basically the HD600 before Sennheiser released it. Many people - back in the day - ordered the metal grilles separately when the HD600 came out and installed them on the HD580. That's essentially the Jubilee and/or the same as the HD600. Like everything else Sennheiser makes, there were small tweaks in drivers and materials over the years with all of those headphones, but those essential differences remained constant.

It's just that the 580 Jubilee was not very common, since the HD600 was released so quickly after that prototype. So, yes - I assumed incorrectly and you definitely have a 300 ohm headphone. As for the rest, you will have trouble finding a true sensitivity rating for it, because Sennheiser altered their spec reporting through the years. The history is not very conclusive regarding sensitivity. The best source is Tyll's testing on Innerfidelity: https://www.innerfidelity.com/images/SennheiserHD580wHD600headband.pdf

He actually measured it as needing 0.09mW to reach 90dB. (He also measured impedance at 330 ohms at 1kHz, but 300 is close enough for your calcs.) If we do some massaging with 3dB increases, doubling the power at each step, going from 90dB to 111dB results in 11.52mW. Referring back to the same AudioScience review, your phone only puts out 14mW at clipping into 300 ohms.
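In code, for anyone following along (0.09mW at 90dB is Tyll's figure; 11.52mW comes from treating each 3dB step as exactly a doubling, exact dB math gives ~11.3mW):

Code:
p_90 = 0.09                              # mW needed for 90 dB SPL (Innerfidelity)
p_111 = p_90 * 10 ** ((111 - 90) / 10)   # +21 dB of headroom
print(f"{p_111:.2f} mW for 111 dB")      # ~11.3 mW (11.52 mW if each 3 dB is exactly x2)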

You have to look at the impedance curve on those Sennheiser tests. Every one of the HD580/600/650 family has a huge impedance spike around 100Hz. On that same Innerfidelity test above, Tyll measured close to 600 ohms at 100Hz. Combined with your phone's power limit of 30mW at 50 ohms, 20mW at 150 ohms, and 14mW at 300 ohms (power reduction appears to be linear at or above 50 ohms), you only have perhaps 7mW at 600 ohms. (Maybe that overall 330 ohm rating is also important, after all.)
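To make that extrapolation explicit, here's a rough sketch assuming the output is simply voltage-limited at these impedances; the ~2.05Vrms figure is back-calculated from the 14mW/300-ohm measurement, not an LG spec:

Code:
import math

v_max = math.sqrt(0.014 * 300)            # ~2.05 Vrms, from 14 mW into 300 ohms
for z in (300, 330, 600):
    p_mw = v_max ** 2 / z * 1000          # constant-voltage power into each load
    print(f"{z} ohms: {p_mw:.1f} mW")     # 14.0, 12.7 and 7.0 mW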

What does all this mean? Your headphone may sound anemic with your smartphone. Now, you may not notice that until listening to an alternative. Few people do until they actually hear the difference. Clipping in the low-to-mid bass region is not always that noticeable as "clipping." In any event, it will certainly be lacking in the warmth and bass slam it might otherwise have, depending on the volume level. Bottom line, you will still do better with a separate amp, period. The Sennheiser HD580/600/650 family has a long heritage of responding noticeably to every step up in amplifier quality.

It has always been so.

P.S. Keep in mind that sensitivity ratings are also usually quoted/measured at 1K. It's more than likely that bass frequencies require more power, period. Combined with the power limitation of your phone at the high impedance spike, it's even more likely bass will suffer. It's just not a good idea to back into all these calculations when most of them are based on assumptions. Compare your phone with a good amplifier that can provide a high-voltage swing. Then see if you agree re: need an amp or not.
 
Feb 15, 2020 at 6:08 AM Post #5 of 9
The best source is Tyll's testing on Innerfidelity: https://www.innerfidelity.com/images/SennheiserHD580wHD600headband.pdf

He actually measured it as needing 0.09mW to reach 90dB. (He also measured impedance at 330 ohms at 1kHz, but 300 is close enough for your calcs.) If we do some massaging with 3dB increases, doubling the power at each step, going from 90dB to 111dB results in 11.52mW. Referring back to the same AudioScience review, your phone only puts out 14mW at clipping into 300 ohms.
Thanks, I missed that and it's a useful set of measurements, as you'd expect from Tyll. Crunching the numbers using the 14mW / 2.05V @ 300 ohms result from ASR gives a maximum SPL of 111.47dB (90 + 20*log10(2.05/0.173)), which is even better than the previous estimate.
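The same sum in code, for the record (2.05Vrms comes from ASR's 14mW into 300 ohms; 0.173Vrms is Tyll's voltage for 90dB):

Code:
import math

v_max = math.sqrt(0.014 * 300)        # ~2.05 Vrms at clipping into 300 ohms (ASR)
v_90 = 0.173                          # Vrms needed for 90 dB SPL (Innerfidelity)
print(f"{90 + 20 * math.log10(v_max / v_90):.2f} dB SPL max")   # ~111.5 dB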

You have to look at the impedance curve on those Sennheiser tests. Every one of the HD580/600/650 family has a huge impedance spike around 100Hz. On that same Innerfidelity test above, Tyll measured close to 600 ohms at 100Hz. Combined with your phone's power limit of 30mW at 50 ohms, 20mW at 150 ohms, and 14mW at 300 ohms (power reduction appears to be linear at or above 50 ohms), you only have perhaps 7mW at 600 ohms. (Maybe that overall 330 ohm rating is also important, after all.)
That's interesting. I certainly did see a rise in distortion in the bass, as I mentioned, though with volume on the '93' setting it still failed to come close to something I'd consider audible. Here's the graph from REW:
[Attachment: Distortion 93.jpg - REW distortion plot at the 93 volume setting]

What does all this mean? Your headphone may sound anemic with your smartphone. Now, you may not notice that until listening to an alternative. Few people do until they actually hear the difference. Clipping in the low-to-mid bass region is not always that noticeable as "clipping." In any event, it will certainly be lacking in the warmth and bass slam it might otherwise have, depending on the volume level. Bottom line, you will still do better with a separate amp, period. The Sennheiser HD580/600/650 family has a long heritage of responding noticeably to every step up in amplifier quality.

It has always been so.

P.S. Keep in mind that sensitivity ratings are also usually quoted/measured at 1K. It's more than likely that bass frequencies require more power, period. Combined with the power limitation of your phone at the high impedance spike, it's even more likely bass will suffer. It's just not a good idea to back into all these calculations when most of them are based on assumptions. Compare your phone with a good amplifier that can provide a high-voltage swing. Then see if you agree re: need an amp or not.
You need to keep in mind that I was using a full-scale digital sweep for the test input. Since this is a digital system, the signal level will never exceed this. If a full-scale input doesn't reveal any clipping then you can be sure the system will handle any signal that's thrown at it.

Also, one advantage of me cheaping out and using the headphones themselves as the load is that it avoids any errors introduced by incorrectly estimating their impedance. In general, of course, doing it this way is a BAD IDEA. If you were testing a powerful amp there'd be a real possibility of damaging the drivers by sending too much power through them, and you avoid that by using a high-power resistor as the load instead. But I was pretty sure that the G7 wasn't capable of outputting enough to cause damage, so I could get away with it in this situation.

So I'm really not convinced I'd notice any difference with an external amp. Even less so, actually, with the numbers from Tyll. But it so happens that I was talking to a friend of mine and he offered to give me an old headphone amp that needs some work. It's a Musical Fidelity X-Cans, another vintage piece of kit that has, predictably, suffered failure of its power-supply electrolytics after a couple of decades. It looks like MF never bothered to release proper specs for this, but from some old posts it seems it was praised as a good match for the HD580, particularly after replacing the cheap caps used as stock. So I might give that a go and see if I notice any difference after living with it for a month or so (short-term subjective impressions are worthless IMO). Since this is a hybrid op amp/valve design there'll probably be a degree of tubey euphony going on that will cloud things, but it's worth a go.
 
Feb 15, 2020 at 9:37 AM Post #7 of 9
You were/are looking for confirmation bias from the beginning. I hope you found it.
Well, I was wondering if there was an obvious flaw in the measurement design that I’d overlooked. But the usual ‘more is better’ arguments don’t really cut it for me, especially since simplicity has a clear virtue as well.

You might find this link interesting:
http://www.rock-grotto.co.uk/Neilxcan2.htm
Yeah, that’s the link I put in my last post. I also found a schematic for the original XCans if anyone is interested.
[Attachment: schematic of the original X-Cans]
 
Mar 4, 2020 at 7:00 AM Post #8 of 9
Well, I've been doing some extra testing and discovered the fatal flaw with my bright idea. :sleepy:

It appears that the line input on my motherboard overloads once the input voltage goes above 1.5Vrms. I tried testing a couple of different headphone outputs and found they all reached 1% THD at around the same output ... Doh! :face_palm: I was actually just measuring the limit of the input buffer on the motherboard. I suppose an answer would be to insert a voltage divider to pad the input down to a level it can accept, though this would affect the noise floor (which is already not great). I might see if an entry-level pro-audio interface can take higher levels (since these can be picked up second hand for not that much).
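If I do try the divider route, the maths is straightforward. Here's a sketch with made-up resistor values (the ~10kOhm line-in impedance is still an assumption, and note the pad itself then becomes the dominant extra load across the cans):

Code:
import math

def pad(v_in, r1, r2, z_line_in=10_000.0):
    """Series R1 into shunt R2, with the line-in impedance across R2."""
    r_shunt = r2 * z_line_in / (r2 + z_line_in)
    v_out = v_in * r_shunt / (r1 + r_shunt)
    return v_out, 20 * math.log10(v_out / v_in), r1 + r_shunt

# Example values only: 1k series, 2k2 shunt, 2.05 Vrms worst case at the source
v_out, atten_db, z_pad = pad(2.05, r1=1_000.0, r2=2_200.0)
print(f"{v_out:.2f} Vrms ({atten_db:.1f} dB), pad input impedance ~{z_pad:.0f} ohms")
# ~1.32 Vrms (-3.8 dB); the ~2.8 kohm pad now sits in parallel with the 300 ohm cans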
 
