I thought it might be worth expanding on this point, particularly for the OP and others not so well versed in the subject:
The headphone, while not being as extreme as some IEMs, has more potential for sound change with a reasonable increase in impedance than most non-portable headphones (high impedance or dead flat planar stuff). I say that because of the low, and apparently non-flat, impedance curve I saw online. I don’t know anything else. So without more information about the cables and gear, I don’t know why the possibility of audible change simply gets rejected?
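As a side note for anyone wondering why a low, non-flat impedance curve matters: the headphone forms a voltage divider with the source impedance (amp output plus cable resistance). A rough sketch, with entirely made-up impedance figures just for illustration:

```python
import math

def level_db(z_load, z_source):
    # Voltage-divider level at the headphone, relative to an ideal 0 Ohm source
    return 20 * math.log10(z_load / (z_load + z_source))

# Hypothetical non-flat headphone impedance curve (Ohms at three frequencies)
headphone_z = {"100 Hz": 60.0, "1 kHz": 32.0, "10 kHz": 38.0}

# Compare a low source impedance vs the same amp plus a higher-resistance cable
for z_source in (0.5, 5.0):
    response = {f: level_db(z, z_source) for f, z in headphone_z.items()}
    deviation = max(response.values()) - min(response.values())
    print(f"source {z_source} Ohms -> response varies by {deviation:.2f} dB")
```

With these assumed numbers the response deviation grows from under 0.1 dB to over half a dB as the source impedance rises, which is the mechanism behind "more potential for sound change": a flat impedance curve would just drop level evenly, a non-flat one changes the frequency response.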
I seem to be one of those rejecting “the possibility of audible change”, and yet I “Liked” castle’s post. Aren’t I disagreeing with castle and contradicting my “Like”? The answer is “No” and here’s why:
I’m not actually rejecting “the possibility of audible change” due to different cables! There are various conditions under which different cables will produce an audible change, which I’ve already mentioned: if a cable is broken, if it’s the wrong cable for the job, or due to user error. That all sounds very obvious of course, but there are in fact some potential subtleties in the last two points that aren’t so obvious in certain specific scenarios. First a little history: In the beginning, the cable industry started due to the telegraph. Relatively little was known and literally millions were wasted trying to get it right by trial and error. The first transatlantic telegraph cable cost millions to make and lay, worked intermittently for about a week and then never worked again!
Then in the 1860s, experiments with electromagnetic telephones advanced (Meucci), which required a greater bandwidth than telegraph and therefore specialised cabling, although telegraph cabling was used in Alexander Graham Bell’s early demonstrations in the late 1870s, and it was his developments that made telephones commercially viable. In the 1880s, the Bell Telephone Company had researchers dedicated to working through Heaviside’s writings, to develop cables which avoided issues that would otherwise have seriously hampered the expansion of telephone usage: skin effect, reflections, power loss in transmission lines, etc.

Jump forward to the 1920s and the introduction of electrical recording: The recording industry adopted what had already been discovered and employed by the telecoms industry, e.g. impedance matching. The input and output of the audio equipment in the chain, including the cabling/patch cables, was matched to a nominal impedance (600 Ohms, the same as telecoms). However, with the introduction of tape recording and then multitrack tape recording, impedance matching was no longer optimal. It was still employed by telecoms because it was great at minimising power loss in transmission lines, but the audio recording industry didn’t use transmission lines, so power loss, skin effect and other issues didn’t apply or were insignificant. Fidelity within the audio band was all that mattered, so the paradigm of impedance matching gradually died out in favour of a low output, high input impedance paradigm, i.e. the input of any audio device should have relatively high impedance and the output should have relatively low impedance. This is still the case today, and the only examples of audio cables making an audible difference are when this paradigm has not been followed.
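The difference between the two paradigms is easy to show with a couple of lines of voltage-divider arithmetic (the 100 Ohm / 10 kOhm figures below are just typical-looking illustrative values, not from any specific gear):

```python
def voltage_fraction(z_out, z_in):
    # Fraction of the source voltage that appears across the input (voltage divider)
    return z_in / (z_out + z_in)

def power_into_load(v_source, z_out, z_in):
    # Power dissipated in the input impedance for a given source voltage
    i = v_source / (z_out + z_in)
    return i * i * z_in

# Telecoms-style matching (600 into 600 Ohms): maximum power transfer,
# but only half the voltage arrives (-6 dB)
print(voltage_fraction(600, 600))       # 0.5
print(power_into_load(1.0, 600, 600))   # maximised when z_in == z_out

# Modern low-out/high-in paradigm: nearly all the voltage arrives
print(voltage_fraction(100, 10_000))    # ~0.99
```

Matching maximises *power* delivered, which is exactly what a long transmission line needs; the modern paradigm instead preserves the *voltage* (i.e. the signal), which is all that matters for fidelity between audio devices.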
For example, in some guitar rigs the output impedance of certain equipment in the chain can be high, some turntables can have high output impedance and, more recently, some IEMs or a few HPs can have very low input impedance (in certain regions of the spectrum). BTW, one could argue that such very low input impedance is a defective design. But if this is the case, it can be relatively easy to use the wrong cable for the job, e.g. one with an insufficient gauge for the task, resulting in an audible difference even though the cable is listed as, say, a headphone cable (or guitar cable, etc.). This is effectively user error, although the headphone or other maker may not publish comprehensive enough specifications, thereby making it difficult for the user to avoid error in these rare cases when using cables other than those supplied. The general exception is speakers, which typically have quite low input impedance: 8 Ohms is typical and 4 or even 2 Ohms isn’t unheard of, so care sometimes has to be taken with gauge choice, especially as speaker cable lengths vary significantly and are typically much longer than interconnects or HP/IEM cables. In the vast majority of cases though, such “wrong cable for the job” or “user error” isn’t a concern: providing it’s the right basic cable type for the job it won’t make any audible difference, and even a completely wrong cable for the job often won’t, the old lamp cord or coat hanger used as speaker cable being obvious examples.
Going back to near the top of this already overly long post: I’m actually agreeing with castleofargh. I’m not completely rejecting the possibility of a different cable causing an audible difference, it is possible, though quite unlikely compared to the numerous alternative potential causes that were not eliminated. However, if this does in fact turn out to be the case, it will be due to a significant difference in gauge or some factor other than that stated in the thread title (silver plate vs copper). The differences over that cable length caused by silver plate vs copper are so minuscule they almost certainly can’t even be resolved into sound, let alone be audible.
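Just to show how minuscule: silver is only about 5% more conductive than copper, so even in the absurd worst case of comparing a solid silver cable with a solid copper one (plating changes the resistance far less than this), the level difference is tiny. A sketch with assumed but plausible dimensions (1.5 m cable, 0.25 mm² conductors, 32 Ohm headphone):

```python
import math

RHO_COPPER = 1.68e-8  # Ohm-metres
RHO_SILVER = 1.59e-8  # Ohm-metres: silver is only ~5% more conductive

def wire_resistance(rho, length_m, cross_section_mm2):
    # Round-trip resistance of a two-conductor run
    return rho * (2 * length_m) / (cross_section_mm2 * 1e-6)

z_hp = 32.0  # assumed headphone impedance
r_cu = wire_resistance(RHO_COPPER, 1.5, 0.25)
r_ag = wire_resistance(RHO_SILVER, 1.5, 0.25)

# Level difference at the headphone between the two cables
diff_db = 20 * math.log10((z_hp + r_cu) / (z_hp + r_ag))
print(f"level difference: {diff_db:.4f} dB")
```

With these numbers the difference comes out around three thousandths of a dB, a couple of orders of magnitude below anything audible, and actual silver *plating* on copper would be smaller still.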
G