A question just occurred to me. From what I gather, a device's output impedance has to be matched to the transducer's input impedance at some ratio I can't remember at the moment. When a cable is used in an audio chain, which end of that equation does it affect, the output impedance or the input impedance?
Kind of both. It's a matter of what you're looking at.
For the amplifier, if you plug in a 4 ohm IEM with a 3 ohm cable, the amp sees a higher total impedance (an easier load), which is usually welcome news for the amp and for how it measures. If the cable is 0.6 ohm, perhaps the amplifier now struggles into that tiny total impedance and starts distorting even at moderate volume levels. Or maybe it will still be fine? It depends on the amplifier.
Obviously, I'm taking an extreme example. With a 300 ohm headphone, the few ohms of difference between the two cables become irrelevant for the amp.
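If it helps, here's a minimal sketch of that amp-side arithmetic in Python, treating the IEM/headphone and the cable as plain resistances in series (a simplification, since real impedances vary with frequency) and using the example numbers above:

```python
# Total load the amplifier sees: roughly the transducer's impedance
# plus the cable's series resistance (treating both as simple resistors).

def total_load(transducer_ohm: float, cable_ohm: float) -> float:
    return transducer_ohm + cable_ohm

for transducer_ohm in (4.0, 300.0):       # 4 ohm IEM vs 300 ohm headphone
    for cable_ohm in (0.6, 3.0):          # the two example cables
        load = total_load(transducer_ohm, cable_ohm)
        print(f"{transducer_ohm:5.1f} ohm transducer + {cable_ohm:.1f} ohm cable"
              f" -> amp sees about {load:.1f} ohm")
```

The 4 ohm IEM goes from roughly 4.6 to 7 ohm depending on the cable, a big relative change, while the 300 ohm headphone barely moves.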
From the load's point of view, it's sort of the same, but this time the cable "goes" with the amp for the analysis of what happens to the IEM or headphone. Take the same examples: with the 4 ohm IEM and, say, a 1 ohm amp, the two cables could cause several dB of signature change (if the impedance curve of the IEM is a little wild, which it probably is if it gets as low as 4 ohm somewhere), or one cable could simply sound noticeably louder than the other.
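To put rough numbers on that, here's a minimal sketch assuming purely resistive impedances, the hypothetical 1 ohm amp, and a made-up IEM that measures 4 ohm at one frequency and 30 ohm at another (illustration values only, not any real product):

```python
import math

def level_db(load_ohm: float, source_ohm: float) -> float:
    # Voltage reaching the load relative to the source voltage, in dB,
    # for a simple resistive voltage divider.
    return 20 * math.log10(load_ohm / (load_ohm + source_ohm))

amp_ohm = 1.0
iem_points = {"4 ohm dip": 4.0, "30 ohm peak": 30.0}  # hypothetical impedance curve points

for cable_ohm in (0.6, 3.0):
    source_ohm = amp_ohm + cable_ohm          # the cable "goes with" the amp here
    dip = level_db(iem_points["4 ohm dip"], source_ohm)
    peak = level_db(iem_points["30 ohm peak"], source_ohm)
    print(f"{cable_ohm:.1f} ohm cable: {dip:+.1f} dB at the dip, "
          f"{peak:+.1f} dB at the peak, tilt {peak - dip:.1f} dB")
```

With these made-up numbers the 3 ohm cable loses about 3 dB more at the 4 ohm dip than the 0.6 ohm cable does, so the two cables end up at a different overall level and with a couple of dB of extra frequency response tilt between them.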
But with our 300 ohm headphone, the electrical damping factor is still more than enough even with a 10 ohm cable, and any frequency response or volume change between our 0.6 and 3 ohm cables should be small enough to be irrelevant.
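Same kind of sketch for the 300 ohm case (still purely resistive, still the hypothetical 1 ohm amp; "damping factor" here just means load impedance divided by total source impedance):

```python
import math

def level_db(load_ohm: float, source_ohm: float) -> float:
    # Same resistive voltage divider as before.
    return 20 * math.log10(load_ohm / (load_ohm + source_ohm))

amp_ohm = 1.0
headphone_ohm = 300.0

for cable_ohm in (0.6, 3.0, 10.0):
    source_ohm = amp_ohm + cable_ohm
    damping_factor = headphone_ohm / source_ohm
    print(f"{cable_ohm:4.1f} ohm cable: damping factor about {damping_factor:5.1f}, "
          f"level at the load {level_db(headphone_ohm, source_ohm):+.2f} dB")
```

Even the 10 ohm cable leaves a damping factor around 27, and the level difference between the 0.6 and 3 ohm cables comes out well under 0.1 dB, so nothing you'd expect to hear.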
Both happen at the same time, but usually we're interested in how the change affects one side in particular, so we pick that side's point of view to "look" at the cable from.