Estimating channel imbalance of headphone cable

Discussion in 'Sound Science' started by bobbooo, Sep 9, 2019.
  1. bobbooo
    How would I go about estimating the channel imbalance (i.e. the difference in voltage drop on the left and right channels) of a headphone cable, specifically a single-ended 2-pin IEM cable? I have resistance measurements of the left (3.5mm tip to left signal pin), right (3.5mm middle to right signal pin), left ground (3.5mm ground to left ground pin), and right ground (3.5mm ground to right ground pin).

    (I'm aware this would only be an estimation of the channel imbalance heard using the cable, as I'm only measuring DC resistance and not the impedance of the cable for AC audio signals.)
     
    Last edited: Sep 9, 2019
  2. castleofargh Contributor
    There is not much reason for a typical IEM cable to cause a significant (as in remotely audible) imbalance, because there isn't much reason for a cable to have any significant impact at all. That could change if the cable somehow has a big impedance (for a cable) and the IEM, on the other hand, has a very low impedance. Then variations between channels in the cable might reach a point where they could perhaps be significant (most likely a case of a defective cable, with one channel close to breaking, or a butchered soldering job).

    Specifically for your question, the total impact of a difference in resistance between the cable's channels is going to depend on the amp and IEM used with it. It's an electrical circuit, so we can't say much from the cable alone unless there is a clear issue, like one channel having three times the impedance of the other. The same goes for impedance per frequency if we're looking for a more complete approach.
    If I were to do this, I would use one channel of the amp and only one side of the IEM, and I'd measure the output with any available mic and some app that can hopefully show more than 1 dB increments without being a horrible liar. On the amp side I would use some sort of adapter. I started measuring IEMs with a short male-to-male cable and a short wire with crocodile clips at each end, so I could connect any ring of the IEM cable to any side of the amp. You would want something like that unless you're sure your amp is very accurately channel-matched. On the IEM+mic side, any tiny movement could cause a change in measured amplitude bigger than any cable imbalance, so you'd have to come up with a way to switch the cable and still get a very repeatable result. That might not be easy without some practice or some means of holding everything firmly in place while switching the cable side. But still, that's what I would consider the most direct and concrete approach.
     
  3. bobbooo
    Thanks for the reply. Measuring the cables acoustically won't be possible though - I am looking at several different cables that have all had their resistances measured already (not by me), and I would like to estimate their channel imbalance before buying them. One of my sources has a relatively low output impedance of 2.2 ohms, but another (an AVR) has a very high output impedance of 500 ohms at the headphone jack, and my IEMs have a minimum input impedance of 14 ohms, so I believe channel imbalance effects may become audible in my setup (one of the cables also has a high resistance of around 2 ohms). So do you know of a formula or method I could use to estimate the difference in voltage drop on the left and right channels, given the cables' known resistance measurements I mentioned above and the source and headphones' known impedances? I much appreciate the help.
     
    Last edited: Sep 9, 2019
  4. bobbooo
    I think I figured it out. Using the 4-wire circuit diagram here: https://diyaudioheaven.files.wordpress.com/2013/03/resistance-impedance-and-other-issues.pdf, I plugged the measured source impedance plus the resistances of the signal (R1) and ground (R3) wires for each channel into this voltage divider calculator: https://www.rapidtables.com/calc/electric/Voltage_Divider_Calculator.html, using the headphone impedance for R2. This gives the voltage drop over the headphones, V2, for each channel, which I then plugged into the voltage damping formula here: http://www.sengpielaudio.com/calculator-voltagedivider.htm to give the dB drop per channel. I then subtracted these two values to find the relative difference in dB between the two channels. Do you see anything wrong with this analysis as a reasonable estimate?
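    The calculation described above can be sketched in a few lines. This is only a rough sketch: the function name is mine, and the 0.1 ohm signal-wire mismatch and other values below are hypothetical, not taken from any measured cable.

```python
import math

def channel_level_db(z_source, r_signal, r_ground, z_phone):
    """Level at the headphone for one channel, in dB relative to the source
    voltage, modelling source + cable + headphone as a series voltage divider."""
    v_ratio = z_phone / (z_source + r_signal + r_ground + z_phone)
    return 20 * math.log10(v_ratio)

# Hypothetical example: 2.2 ohm source, 14 ohm IEM, and a cable whose right
# signal wire measures 0.1 ohm higher than the left.
left_db = channel_level_db(2.2, 1.0, 1.0, 14.0)
right_db = channel_level_db(2.2, 1.1, 1.0, 14.0)
imbalance_db = left_db - right_db
print(f"estimated channel imbalance: {imbalance_db:.3f} dB")
```

    Subtracting the two per-channel dB figures is exactly the last step described in the post; the divider ratio plays the role of V2/V1 in the linked calculators.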
     
    Last edited: Sep 10, 2019
  5. castleofargh Contributor
    I'm not sure I follow your specific references here, but a simple series circuit with the amp, cable, and coil (headphone) is the right idea. And yes, from the voltage value you get at the headphone, you calculate the dB variation from using each wire with whatever different resistance it has. But I have to reiterate my doubt about the purpose of the whole venture. The amp probably has some amount of channel imbalance (could be negligible depending on the volume system, or it could change as you turn the volume knob). The IEM is definitely going to have some form of imbalance, usually not uniform over the entire frequency range. And it's pretty likely that differences in placement and ear canal shape will also cause yet another imbalance. All hopefully small, but there anyway. Under normal circumstances we wouldn't even consider channel imbalance from cables, because of how small it's expected to be compared to everything else. And if you happen to have some cables showing big differences in resistance between channels, then you'd typically just throw those cables away, suspecting damage or pretty poor manufacturing.

    The other issue here is that a basic case with a series circuit and fixed numbers, as you imagine it, may only give you an estimated result for one frequency. We can expect the cable itself to be mainly resistive and stable over the audible range, but that may not be the case for the amp's output impedance or the IEM's impedance.

    All in all, I'm wondering if you're just vaguely curious about this stuff, if you have a devilish plan to find the worst cable (or use two different cables, one per channel) and use one imbalance to counter another in your IEM or wherever, or if you're just focused on something you wrongly assume to be significant. Out of curiosity I just checked 2 IEM cables, and neither showed even 0.1 ohm of difference between channels on my cheapo multimeter. I could check impedance over frequency, but my setup is nowhere near accurate enough for me to have confidence in the result. The stuff I have is fine for measuring a headphone or an IEM if its impedance doesn't get too low, but to measure a short cable on its own... we get into values that are as small as or smaller than the expected accuracy of the whole setup. :crying_cat_face: Maybe someone with pro gear can handle that confidently, but not me.
     
  6. gregorio
    1. True, because cables don't carry acoustic signals.

    2. I'm not sure I understand. The wires inside the cable carrying the left and right channel signals are exactly the same, so they'll have exactly the same resistance and there will not be any channel imbalance. OK, strictly speaking the wires are not "exactly the same" at the atomic level. At that level there will be some differences, but we're talking about differences that can't be reproduced anyway, that would probably be more than a million times below audibility even if they could be reproduced, and that would require highly specialised laboratory equipment to measure.

    G
     
  7. bobbooo
    Mostly the first reason :wink: Although I was looking at buying some cables that have the occasional bad unit with resistance discrepancies between the channels so was wondering if this would be a big deal if I got one of the bad ones.


    1. This was obviously just shorthand for 'measuring the effect of the cables on the acoustic output of the headphones connected to them' as castleofargh suggested. But I think you knew that.

    2. As I said above, sometimes you get bad cables (manufacturer error, bad soldering etc. - check out the measurements in bold on this thread: https://www.head-fi.org/threads/resistance-of-cables-pics-comments-and-links.907998/) so I was wondering if this would have any audible effect. From my calculations, worst case scenario it looks like it might, at 1dB imbalance. But for the cables I'm looking at it's a max imbalance of around 0.03dB with my setup. If we take 1dB to be the (fairly relaxed) threshold of audible channel imbalance, that's around 30 times below audibility, so your million times below guess is off by about 4 orders of magnitude. It's always best to check these things to be sure, either with a rough theoretical estimate or direct empirical tests, especially in the science section of a forum.
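    As a sanity check on the worst-case figure quoted above, the same series voltage-divider estimate can be run with the thread's numbers. The scenario below is an assumption on my part: a 2.2 ohm source, a 14 ohm IEM, and one channel whose total cable resistance (signal plus ground) is about 2 ohms higher than the other's.

```python
import math

def level_db(z_out, r_cable, z_phone):
    """dB level at the headphone for one channel (series voltage divider)."""
    return 20 * math.log10(z_phone / (z_out + r_cable + z_phone))

# Assumed worst case: one channel's signal + ground resistance is
# ~2 ohms higher than the other's.
worst_case = level_db(2.2, 0.0, 14.0) - level_db(2.2, 2.0, 14.0)
print(f"worst-case imbalance: {worst_case:.2f} dB")
```

    Under these assumptions the estimate lands right around the 1 dB worst case mentioned in the post.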
     
    Last edited: Sep 12, 2019
  8. bigshot
    Whenever I've had a cable with bad connectors, it either cuts out or crackles. It's blatantly obvious. When that happens, I chuck it in the trash and order a new one from Amazon basics.
     
  9. bobbooo
    If only Amazon Basics did 2-pin IEM cables. Bad soldering isn't always that audible though - if a few cores of, say, a 16-core wire aren't soldered on one of the channels, you just get a resistance (and so a channel level) imbalance between them, which, as I've calculated, is usually below audibility.
     
  10. bigshot
    Wires, if you're using the right one for the job, should all sound the same. If they don't, they must be defective in manufacture. If Amazon doesn't have it, try monoprice.
     
  11. bobbooo
    Mostly true, although not always the case. For example, my current (not defective) cable has a resistance of 2 ohms. This, coupled with a source that already has a high output impedance and IEMs that have a low impedance varying wildly with frequency, results in their frequency response being altered for the worse from their ideal response. If I replaced this cable with one of 0.1 ohm resistance or less, the effective output impedance of the source would decrease by nearly 2 ohms, which would probably have an audible effect on the IEMs' frequency response (see here for a full explanation: http://nwavguy.blogspot.com/2011/02/headphone-amp-impedance.html). Some super thin and light cables have an even higher resistance (e.g. the Linum Voice cable at 4.5 ohms), which would definitely have an effect on the frequency response of a lot of balanced armature IEMs, which usually have a low impedance that varies with frequency.

    (Unfortunately Monoprice don't do IEM cables either by the way.)
     
    Last edited: Sep 12, 2019
  12. bigshot
    IEMs can have wildly incompatible specs that limit what you can use with them. That's the fault of the designer of the IEMs, not the cable. If you plug that cable into something that isn't so temperamental, it will work fine. But even with the wonky IEMs, the likelihood that one channel is going to sound different than the other when both channels are the exact same kind of cable is just about nil.
     
    Last edited: Sep 12, 2019
  13. bobbooo
    I think you misunderstood me. I thought we'd moved on from channel differences. When you said cables 'should all sound the same' I assumed you meant in every way, not just channel balance. The fact is, depending on the output impedance of your source and the input impedance of your headphones, using cables with differing resistances will not necessarily result in the headphones sounding the same, even if there are no manufacturing errors in any of your equipment. IEMs with an impedance that varies across the frequency range are not 'wonky', 'temperamental' (there is no variation in their impedance over time, just over the frequency range), or otherwise manufactured poorly. It is an inherent property of balanced armature and multi-driver IEMs that they are likely to have these varying impedances with frequency. To be clear, here is an example - the measured impedance vs frequency of the Ultimate Ears SuperFi 5 Pro IEM:

    [Image: UESuperfi5_impedance.png]

    As you can see, the impedance ranges from a minimum of 12 ohms at 10kHz, up to a high of around 96 ohms at 1kHz, with an average of 37 ohms. Now look what happens to the measured frequency response using sources of varying output impedance:

    [Image: UESperfi5_freqresponsechange.png]

    This shows that the higher the output impedance, the more the frequency response is affected, following the shape of the impedance curve. My point is that the cable's resistance effectively adds to the output impedance of the source, so a high enough cable resistance will audibly affect (by a couple of dB in this case) the frequency response of these IEMs, and many others like them (mostly balanced armature and multi-driver units). Again, please read this excellent article on impedance effects for a full explanation of why this is so (in particular the section 'The Frequency Response Problem'): http://nwavguy.blogspot.com/2011/02/headphone-amp-impedance.html
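    The size of this frequency-response effect can be illustrated with the impedance extremes quoted above (roughly 12 ohms at the dip and 96 ohms at the peak for the SuperFi 5 Pro). The source output impedances in the loop are just example values of my choosing.

```python
import math

def level_db(z_out, z_phone):
    """Level at the IEM, in dB relative to the source voltage."""
    return 20 * math.log10(z_phone / (z_out + z_phone))

# Response variation between the 12-ohm dip and the 96-ohm peak of the
# impedance curve, for a few example source output impedances.
for z_out in (0.1, 10.0, 120.0):
    variation = level_db(z_out, 96.0) - level_db(z_out, 12.0)
    print(f"Zout = {z_out:5.1f} ohm -> FR variation = {variation:.2f} dB")
```

    With a near-zero output impedance the variation is a small fraction of a dB, while a high output impedance produces variation of several dB tracking the impedance curve, which is exactly the behaviour shown in the measurement plots.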
     
    Last edited: Sep 12, 2019
  14. bigshot
    OK. You moved on from channel differences. All the rest of it is dependent on your IEMs, not the cable.
     
  15. gregorio
    1. In which case you'd be measuring the headphones (transducers) not the cable. "But I think you knew that."

    2. I'm not sure why you'd want to measure the effect of broken/faulty cables and not just buy properly functioning ones.
    2a. If the left or right signal wire is faulty/broken and the other isn't, wouldn't the worst case scenario be an imbalance equal to whatever you set your output level to, say 70dB? E.g. 70dB from the functioning wire and 0dB from the broken wire? With a normal, functioning cable (of the appropriate gauge for the job), I've never been able to measure a difference between the two signal wires, hence why I said "probably" (guessed).
    2b. Oh the irony! Taking your conditions of a 1dB threshold of audibility and a faulty cable (one of the two signal wires/connections was faulty) which resulted in a channel imbalance of 0.03dB, that would be roughly 3,400 times below audibility, not 30 times (the dB scale is logarithmic, not linear), so your calculation "is off by about 2 orders of magnitude. It's always best to check these things to be sure ..."! :)

    Obviously though, regardless of whether it's 30 times, 3,400 times or a million times below audibility, they're all well beyond the point of being inaudible and therefore the audible difference is 0dB.

    G
     