The (new) HD800 Impressions Thread
Aug 16, 2016 at 3:17 PM Post #23,326 of 28,989
 
What's interesting is the article I linked precisely uses the HD650 cable as an example, and the only major variance was one channel of one of the two stock cables. The other stock cable essentially matched the others.
 
Would it be safe to assume that the larger the source impedance, the less relevant a cable is?


Yeah, I've actually talked to Tyll about those very measurements, and as he says in the article, he's not sure those are the measurements that would show differences in the cables if they did exist. He said he picked the HD600 cable because, like me, that is the stock cable where he hears the largest (though still subtle) difference when switching cables.  Dynamicism and impulse response (as defined in Tyll's measurements) aren't exactly the same thing, though they are very much intertwined.  Dynamicism has to do with how impulse response varies across different loads, rather than at a single characteristic load (which is what Tyll measures).  A headphone having a similar impulse response to different frequencies at different power levels is what really characterizes a very dynamic headphone/amplifier/cable system.  
 
Two things can impact a cable/headphone's perceived dynamicism:
 
1) a lot of ringing in the impulse response.  The HD600 never really had this problem, to my ears, and that's borne out with Tyll's graphs.  
 
2) different impulse responses for different power loads.  This is something Tyll doesn't currently measure.  To me, this is what brings out micro-detail: a DAC/amp/cable/headphone system delivering a similar impulse-type response across both axes of the frequency/power 2D plot.  If a system jumps hard at a certain high power level but is more sluggish at lower power levels, especially in very dynamic passages, micro-detail is lost and/or smoothed over.  As Doug from ECP likes to say, "micro-detail resides in the first milliwatt."  This is also where small differences in differential resistance can become much bigger deals.  Although obviously subtle, this is often the difference in the last little bit of transparency.
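A rough sketch of how such a level-dependent impulse-response comparison might be done (the arrays here are toy stand-ins for real measurements, not anyone's data):

```python
import numpy as np

# Hypothetical sketch of the comparison described above: capture the same
# headphone's impulse response at two drive levels, normalize each to unit
# peak, and compare shapes. The toy arrays below stand in for real
# measurements (e.g. one taken at 1 mW and one at 100 mW).
t = np.arange(512)
ir_low = np.exp(-t / 40.0) * np.cos(t / 3.0)   # toy impulse response
ir_high = 10.0 * ir_low                        # same shape, 20 dB hotter

def normalized(ir):
    # Scale to unit peak so only the *shape* of the response is compared.
    return ir / np.max(np.abs(ir))

# A perfectly "dynamic" system would leave ~zero residual after
# normalization; a level-dependent one would not.
residual = np.max(np.abs(normalized(ir_low) - normalized(ir_high)))
print(f"peak shape difference: {residual:.2e}")
```

Since the toy "high" response is an exact scaled copy of the "low" one, the residual here is essentially zero; real measurements would show whatever level dependence the system actually has.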
 
Aug 16, 2016 at 3:27 PM Post #23,327 of 28,989
http://www.head-fi.org/t/650510/the-new-hd800-impressions-thread/23295#post_12792295
 
From an engineering-analysis standpoint, the way these guys keep using damping factor to make suggestions is incorrect.  As I was saying earlier, even if you are going to analyse this incorrectly, the output impedance they are using for the amp (calling it "source impedance" is misleading, since a source would usually sit on the input side of an amplifier) makes no sense given the high impedance of the HD800, as that standard figure is generally 120 ohms.
 
If you replace source impedance with load impedance, I would totally agree with your post (I think this is what you meant).  That is what the voltage divider I worked through was showing, and it is really the only way to approach this from an electrical-engineering standpoint, as the headphone amp is really a voltage source.  
 
Even a few pages back, one of these guys showed the correct voltage divider, and it has only the load impedance (i.e. one headphone driver's impedance) and the cable impedance.  That was the correct formula, so I don't know why people keep trying to use damping factor.
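That divider can be sketched numerically; I've included the amp's output impedance term as well, and all component values below are illustrative rather than anyone's measurements:

```python
# The divider everyone seems to agree on: the driver sees
#   V_driver / V_amp = Z_driver / (Z_driver + Z_out + Z_cable)
# Illustrative values only (not measurements from the thread):
Z_DRIVER = 300.0  # nominal HD800-class load, ohms
Z_OUT = 1.0       # a modern low-impedance amp output, ohms
Z_CABLE = 0.1     # a plausible stock-cable resistance, ohms

def divider_ratio(z_driver, z_out, z_cable):
    # Fraction of the amp's output voltage that appears across the driver.
    return z_driver / (z_driver + z_out + z_cable)

print(divider_ratio(Z_DRIVER, Z_OUT, Z_CABLE))  # ≈ 0.9963
```

With a high-impedance load like this, the cable term barely moves the ratio, which is the point both sides keep circling.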
 
Quote:


 
What's interesting is the article I linked precisely uses the HD650 cable as an example, and the only major variance was one channel of one of the two stock cables. The other stock cable essentially matched the others.
 
Would it be safe to assume that the larger the source impedance, the less relevant a cable is?

 
Aug 16, 2016 at 3:56 PM Post #23,328 of 28,989
This bit.

Everyone's been saying "cable impedance is irrelevant to driver impedance!". That was responded to with "it's about source impedance relative to cable impedance, not driver impedance relative to cable impedance!". That was responded to with "Your source impedance example was way off, it's still negligible."

Now we're back to driver impedance. Is it because the source impedance line of thinking's dead in the water?


I should have been clearer. He misinterpreted what I said. By source, I meant the source of the signal relative to the cable, not the impedance of the source component attached to the amplifier.

The loss is the amp's output impedance divided by the amp's output impedance plus the cable impedance; taking 20·log10 of that ratio converts it to dB.

I also asserted that, given the amplifier output impedance tends to be low, adding the cable impedance on top of it has an outsized effect on the damping factor into the HD800, precisely because the output impedance is so small to begin with.

The others argue that this is irrelevant, given the HD800 already has such high impedance.
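One way to put numbers on the kind of loss being discussed: the extra attenuation a small cable resistance adds on top of the output divider, expressed in dB (for voltage ratios the conversion is 20·log10). All values here are illustrative:

```python
import math

# Illustrative numbers only: how much extra loss a 0.1-ohm cable adds on
# top of a low-output-impedance amp driving a 300-ohm load.
Z_DRIVER = 300.0
Z_OUT = 1.0
Z_CABLE = 0.1

v_no_cable = Z_DRIVER / (Z_DRIVER + Z_OUT)            # ideal-cable divider
v_with_cable = Z_DRIVER / (Z_DRIVER + Z_OUT + Z_CABLE)
loss_db = 20 * math.log10(v_with_cable / v_no_cable)   # voltage-ratio dB
print(f"extra loss from cable: {loss_db:.5f} dB")      # about -0.003 dB
```

A few thousandths of a dB, which is the scale the "it's negligible" side keeps pointing at.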

Without the actual ability to perform conclusive scientific tests on our own, we've decided to debate how wavy some wavy lines are.

To our credit, I think this has gone farther than about 99% of the cable debates I have read, and has been mostly civil along the way. I do respect what the other guys are bringing to the table from their side of the issue.
 
Aug 16, 2016 at 3:58 PM Post #23,329 of 28,989
Can you please post a link to a site showing this impulse difference?  Just moving a mic 1/4" in a jig is going to change acoustic measurements.  Also, as was said earlier, even adjusting the headphones on one's head can affect these types of measurements (any air leakage really changes the acoustic properties of the ear/headphone chamber).
 
Also, the noise before an impulse is indicative of distortion.  It could come from the THD of the amp used to perform the test, or from driver distortion.  There is no way it comes from cable distortion, as the other two are far, far more dominant.  It is wrong to think cables add audible THD.  Yes, they can add noise if they aren't shielded, but that's why a factory cable is shielded.
 
When Sennheiser manufactures headphones, driver impedance is not exact either; it varies from unit to unit.  This is why they measure and match drivers for the HD800; you can see this in the factory tour posted on YouTube if you don't believe me.  Since it seems folks are worried about less than .1 ohms, I guess you had all better fly to Germany and hand-pick the drivers with the lowest possible impedance.
 
Also, with regard to your other post, speed is determined by the dielectric constant of the cable, not by the resistance or even the impedance alone.  That only matters in transmission-line theory, which has nothing to do with a short headphone cable passing frequencies in the kHz range and lower over short distances.
 
https://en.wikipedia.org/wiki/Transmission_line
 
"In communications and electronic engineering, a transmission line is a specialized cable or other structure designed to carry alternating current of radio frequency, that is, currents with a frequency high enough that their wave nature must be taken into account."  
 
I promise Wikipedia is right (I'm an EE), and every electromagnetic-theory textbook is correct too; the rest of the time these terms don't matter at all.  Your ears are just plain not that good.  I'm not saying you don't think you hear this stuff (placebo effect, etc...), but it's just not within our range of hearing.
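To put rough numbers on why transmission-line effects are irrelevant here, compare the electrical wavelength at the top of the audio band with a headphone cable's length (the velocity factor below is a ballpark assumption, not a measured value):

```python
# Sanity check: electrical wavelength at 20 kHz vs. headphone cable length.
C = 3.0e8               # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.7   # rough assumption for insulated copper cable
F_MAX_AUDIO = 20_000.0  # top of the audio band, Hz

wavelength = C * VELOCITY_FACTOR / F_MAX_AUDIO
print(f"wavelength at 20 kHz: {wavelength / 1000:.1f} km")  # ~10.5 km

cable_length = 3.0  # meters, a typical headphone cable
print(f"cable is {cable_length / wavelength:.2e} wavelengths long")
```

A cable a few ten-thousandths of a wavelength long is nowhere near the regime where wave behavior matters, which is the Wikipedia quote's point.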
 
Quote:
   
 
Two things can impact a cable/headphone's perceived dynamicism:
 
1) a lot of ringing in the impulse response.  The HD600 never really had this problem, to my ears, and that's borne out with Tyll's graphs.  
 
2) different impulse responses for different power loads.  This is something Tyll doesn't currently measure.  To me, this is what brings out micro-detail: a DAC/amp/cable/headphone system delivering a similar impulse-type response across both axes of the frequency/power 2D plot.  If a system jumps hard at a certain high power level but is more sluggish at lower power levels, especially in very dynamic passages, micro-detail is lost and/or smoothed over.  As Doug from ECP likes to say, "micro-detail resides in the first milliwatt."  This is also where small differences in differential resistance can become much bigger deals.  Although obviously subtle, this is often the difference in the last little bit of transparency.

 
Aug 16, 2016 at 4:01 PM Post #23,330 of 28,989
To our credit, I think this has gone farther than about 99% of the cable debates I have read, and has been mostly civil along the way. I do respect what the other guys are bringing to the table from their side of the issue.

I'm obviously quite skeptical but this is definitely the most enlightening conversation I've had about this. We're actually defining constraints here, which is much more than the USB cable crowd does...
 
Aug 16, 2016 at 4:06 PM Post #23,331 of 28,989
I'm not saying you don't think you hear this stuff (placebo effect, etc...), but it's just not within our range of hearing.

 
Love it. The best cable in the world has the exact same signal on both ends. Anything starkly different sounding, fidelity-wise, means your signal is being killed.
 
Aug 16, 2016 at 4:22 PM Post #23,332 of 28,989
I'm not sure what your short reply means (I agree that for an ideal cable the same signal that goes in comes out).  My post says there will be a small, negligible difference; that is why my earlier post talked about less than .1 ohms (and not zero).  The issue is that you aren't going to hear the difference.
 
Being factual again: a human's perception is easy to trick, and we are all human.  This phenomenon is well known in psychology and to any PhD in the audio-engineering field.  Watch this AES workshop video if you don't believe me (it has some excellent examples):

 
 
Quote:
   
Love it. The best cable in the world has the exact same signal on both ends. Anything starkly different sounding, fidelity-wise, means your signal is being killed.

 
Aug 16, 2016 at 4:28 PM Post #23,333 of 28,989
I'm not sure what your short reply means (I agree that for an ideal cable the same signal that goes in comes out).  My post says there will be a small, negligible difference; that is why my earlier post talked about less than .1 ohms (and not zero).  The issue is that you aren't going to hear the difference.

I'm agreeing, and further making the suggestion that any stark difference one hears (and can perhaps measure) is a sign of a cable attenuating a signal heavily - which is often what people consider to be a poor cable.
 
Aug 16, 2016 at 4:31 PM Post #23,334 of 28,989
To our credit, I think this has gone farther than about 99% of the cable debates I have read, and has been mostly civil along the way. I do respect what the other guys are bringing to the table from their side of the issue.

 
 
Big +1.  I have seen this conversation go way south more than once...
 
Aug 16, 2016 at 4:32 PM Post #23,335 of 28,989
Just gonna respond to a bunch of stuff from etc6849.  For some reason I can't get multi-quote (or even reply) to work on his posts.

 
The 120-ohm output impedance IEC standard is just flat-out wrong, even for high-impedance headphones.  I've never met anybody who actually thinks it's correct.  We can look at it from several standpoints:
 
1) What users actually use: most users use amps well short of 120 ohms output impedance.
 
2) What the headphone is designed for: the HD800 was designed to be used with amps between 0 and 25 ohms, according to Sennheiser engineers (and again, the amp literally designed for the HD800 is 16 ohms).
 
3) Ideal damping factor: 120 ohms of output impedance would give a damping factor of 2.5, which I think almost everybody would agree is pretty poor, unless you're one of the extreme "damping factor doesn't matter past unity" crowd.
 
The 120-ohm standard was likely made to accommodate makers of receivers with headphone jacks in the 80s and 90s, which were often pretty bad.  Professional sound gear typically has an output impedance around 20 ohms for headphones; most consumer gear is around 10 ohms these days.  High-end audiophile gear (which is obviously what the HD800 is aimed at) actually tends to be lower than that now, between 5 and 0 ohms.  There's just no reason whatsoever to use the 120-ohm output impedance IEC standard.  
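The damping factors implied by those output impedances into a nominal 300-ohm HD800 work out as follows (simple load/output ratio, no cable term included):

```python
# Damping factor = load impedance / amp output impedance.
# 300 ohms is the HD800's nominal impedance; the output impedances are
# the ones mentioned in the thread.
Z_HD800 = 300.0

for z_out in (5.0, 10.0, 16.0, 20.0, 120.0):
    print(f"{z_out:5.0f} ohm output -> damping factor {Z_HD800 / z_out:.2f}")
```

Note the 120-ohm row reproduces the damping factor of 2.5 mentioned above, while the 16-ohm amp designed for the HD800 lands at 18.75.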
 
I'm not sure what you are implying about "replacing source impedance with load impedance" (there's some weird quoting going on, and I can't really tell what post, and what words specifically, this was in response to).  Cable impedance absolutely is part of the source impedance, not the load impedance.  I don't think that's actually a point of debate for you, because you obviously understand electrical circuits well enough to know that cables add to the source impedance, but I wasn't really sure what you were saying otherwise.
 
Now, we can debate all day long about what damping factor is audible; I have never seen any test done to establish a true target damping factor (if you have one, I'm all ears).  I've seen three primary, wildly divergent guidelines used: 1) anything past unity is fine, 2) 10:1, 3) 100:1.  With a very low output impedance amplifier, a cable can change the damping factor significantly.  Granted, at that point you're probably already at an extremely high damping factor with the HD800, so it probably doesn't matter, but again there's no set standard and I see people disagree about this all the time.  
 
With regard to "speed", I was speaking of the audiophile sense of the word, not speed of transmission; for headphones, transmission speed is more or less irrelevant.  The audiophile term "speed", unfortunately, is a messy one that loosely embodies impulse response, damping factor, and other factors that vary depending on who is using it.  
 
I'm not aware of a site that measures impulse response at varying power loads and frequencies.  This is one of the shortcomings in the available measurements I was talking about earlier.
 
And while cables do not directly create THD, they do affect how much of it reaches the headphone from the amp, if you believe a cable can differentially attenuate the signal (especially if it attenuates higher frequencies more than lower ones, since by definition harmonic distortion is higher in frequency than its fundamental), and thus change the balance of harmonics present in the signal as it arrives at the headphone.  If you believe a cable attenuates all portions of the signal equally, then obviously this wouldn't matter.  I've not seen that question ever firmly settled with measurements.
 
While a .5dB difference at 8kHz is fairly insignificant if that 8kHz is fundamental signal, if it's the 4th harmonic distortion of a 2kHz fundamental, it can be very significant.  Our ears are much more sensitive to changes in distortion levels relative to the overall signal than to changes in the actual volume of the fundamental.  
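To make that concrete: if a 2 kHz fundamental passes unattenuated but its 4th harmonic at 8 kHz loses 0.5 dB, the harmonic's level relative to the fundamental drops by that same 0.5 dB, roughly a 5.6% reduction in the distortion component's amplitude. A sketch with made-up distortion numbers:

```python
import math

# Made-up example: a 4th-harmonic component sitting 60 dB below a 2 kHz
# fundamental (0.1% distortion), then attenuated by 0.5 dB at 8 kHz while
# the fundamental is untouched.
harmonic_rel_db = -60.0
attenuation_db = 0.5

new_rel_db = harmonic_rel_db - attenuation_db        # -60.5 dB
amplitude_ratio = 10 ** (-attenuation_db / 20)       # voltage-ratio scaling
print(f"harmonic now {new_rel_db:.1f} dB below the fundamental")
print(f"distortion amplitude scaled by {amplitude_ratio:.4f}")  # ~0.9441
```

Whether a 5-6% change in an already tiny distortion component is audible is, of course, exactly what's being debated.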
 
Aug 16, 2016 at 5:30 PM Post #23,336 of 28,989
A blind listening test between a $400 aftermarket cable and $3 worth of cat5 ethernet cable (about 5 meters worth) would be very interesting :evil: just saying.
 
Personally I would rather put my cash into a high-quality source and amplifier.  The gains from moving up the tree can be substantial once you get into quality DACs and amps with decent power supplies, stepped attenuators, and high-quality signal output capacitors.  An expensive aftermarket cable would be the last thing on my list.
 
Aug 16, 2016 at 6:42 PM Post #23,337 of 28,989
Let's stop saying source (any textbook will say output).  Again, the signal source is thought of as a voltage source on the input side.  I only used that terminology because Hansotek was using it.
 
So let's all read this:  http://www.electronics-tutorials.ws/amplifier/input-impedance-of-an-amplifier.html and pledge to use the same terminology (if we are to continue to discuss this topic).
 
And use this figure (where the load in this case is one of the headphone's drivers):

 
Let's stop saying source.  It's too confusing: that's the wrong side of the amp, and we want the output, since that's where we plug in our headphones.


Let's use this circuit, which the link above describes as the output circuit model (as any sophomore in electrical engineering can explain):

 
 
Now let's replace Zout with Rout'+Rcable and do some more basic math:

 
 
Now, please tell me how your ears are more accurate than a cheap volt meter (e.g. .9697 - .9694 = .0003); in other words, you lose .0003*Vout', or .03% of Vout', versus an ideal cable (which doesn't exist)?! 
 
NOTE:  as the disclaimer in my original post said, I was treating the voltage source as ideal (e.g. Rout = 0), since adding .1 ohms to a 320-ohm load is not likely to cause any voltage sag from the source; for comparison purposes the voltage source (e.g. Vout from the amp) can be treated as ideal.  Here I am including Rout' instead of using an ideal voltage source (by popular demand).
 
Now, substitute whatever you like into Rout'.  120 ohms is the standard, but I do agree headphone amps may not follow this standard and that it's also outdated.
 
Please draw out the circuit and do the math; it doesn't lie.  Also note that an ideal cable doesn't exist, but I can lower my copper conductor's resistance just by increasing its thickness (or even paralleling two cables), so everyone can be happy for $50 instead of $500 (even though you can't hear the difference).
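The arithmetic above can be reproduced with the divider formula. The 320-ohm load comes from the post itself; Rout' = 10 ohms is my assumption, since it reproduces the quoted .9697 / .9694 figures:

```python
# Reproducing the divider math: V_load/Vout' = R_load/(R_load + Rout' + Rcable).
# R_load = 320 ohms is stated in the post; Rout' = 10 ohms is an assumed
# value that matches the quoted results.
R_LOAD = 320.0
R_OUT = 10.0

def ratio(r_cable):
    return R_LOAD / (R_LOAD + R_OUT + r_cable)

ideal = ratio(0.0)   # ~0.9697, the "ideal cable" case
real = ratio(0.1)    # ~0.9694, with 0.1 ohm of cable resistance
print(f"{ideal:.4f}  {real:.4f}  loss = {ideal - real:.4f} of Vout'")
```

The difference is about 0.0003 of Vout', i.e. the .03% figure in the post.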
 
Quote:
   
Cable impedance absolutely is a part of source impedance, not load impedance. I don't think debating that's what you meant because you obviously understand electrical circuits well enough to know that cables are added to source impedance, but I wasn't really sure what you were saying otherwise.

 
Aug 16, 2016 at 6:43 PM Post #23,338 of 28,989
A blind listening test between a $400 aftermarket cable and $3 worth of cat5 ethernet cable (about 5 meters worth) would be very interesting :evil: just saying.

Personally I would rather put my cash into a high-quality source and amplifier.  The gains from moving up the tree can be substantial once you get into quality DACs and amps with decent power supplies, stepped attenuators, and high-quality signal output capacitors.  An expensive aftermarket cable would be the last thing on my list.


You should definitely buy those things first. Without an adequately resolving system, it's all a moot point. You'll never hear the difference. Cables cannot add, they can only prevent further subtraction.

Once you do have a very resolving system, however, there's a bit of a catch-22: when you can hear everything, you can hear everything, good and bad. This stuff becomes a lot more important.

Until then, focus on the other parts of your chain.
 
Aug 16, 2016 at 7:27 PM Post #23,339 of 28,989
Now, please tell me how your ears are more accurate than a cheap volt meter (e.g. .9697 - .9694 = .0003); in other words, you lose .0003*Vout', or .03% of Vout', versus an ideal cable (which doesn't exist)?! 
 
NOTE:  as the disclaimer in my original post said, I was treating the voltage source as ideal (e.g. Rout = 0), since adding .1 ohms to a 320-ohm load is not likely to cause any voltage sag from the source; for comparison purposes the voltage source (e.g. Vout from the amp) can be treated as ideal.  Here I am including Rout' instead of using an ideal voltage source (by popular demand).
 
Now, substitute whatever you like into Rout'.  120 ohms is the standard, but I do agree headphone amps may not follow this standard and that it's also outdated.
 
Please draw out the circuit and do the math; it doesn't lie.  Also note that an ideal cable doesn't exist, but I can lower my copper conductor's resistance just by increasing its thickness (or even paralleling two cables), so everyone can be happy for $50 instead of $500 (even though you can't hear the difference).

 
I'm fairly cable-agnostic; the only cable where I've ever personally heard a difference is the HD650 cable, and up until that point I 100% didn't believe cables could make audible differences.  Even then it was a $50 cable that made the difference, one I bought simply because I wanted a shorter 3.5mm cable instead of the longer 1/4" cable.  But to me the difference was outside the bounds of what I would think might be placebo.  It's impossible to reliably blind A/B because I can feel the difference in weight, so unfortunately a "scientific" test is probably impossible.  But given that I was actually expecting no difference, I think I'd be less prone to the placebo effect/expectation bias.  If somehow tomorrow we conclusively proved there was no audible difference, I wouldn't care in the least, other than from a purely hobby/academic standpoint.
 
On the HD800, I haven't personally heard cables make a difference.  I have a $50 Acrolink cable I bought because, again, I wanted a shorter cable with a 3.5mm termination, and a homemade silver balanced cable, because I needed a balanced cable; it cost me $30 to make.  My point earlier was more to clarify what the claims of cable proponents are, in as honest terms as I understand them, and to clarify a few points on the output impedance of the output/source/amp+cable.  I can see the theory of how cables might affect the sound, and also the theories that say it shouldn't be audible, but I haven't seen any tests that satisfactorily resolve the issue one way or the other, and thus I remain pretty agnostic.  
 
Aug 16, 2016 at 7:41 PM Post #23,340 of 28,989
A blind listening test between a $400 aftermarket cable and $3 worth of cat5 ethernet cable (about 5 meters worth) would be very interesting :evil: just saying.

 
Some very experienced old analogue engineers of my acquaintance (who worked on world-class broadcast chains) liked to challenge snake-oil types to hear the difference between exotic unobtanium cable and that horrible bright orange mains cable you use for electric lawnmowers, especially when used as speaker cable :evil:

 
