Just how old are you? Do you remember "Pure Perfect Sound, Forever"? Do you know/remember what the first CD players sounded like back in the '80s, before they discovered jitter? Bits-is-bits-is-bits, they told us. So now we have discovered jitter, we understand it, we buy into it, and we accept that bits-wasn't-bits-wasn't-bits. But today, so you tell us, a USB cable can't POSSIBLY affect the sound quality, because we KNOW that the signal is re-clocked at the DAC, so there CAN'T be any jitter, so it CAN'T POSSIBLY affect the sound. And, moreover, there is NO POSSIBILITY WHATSOEVER that there may be other effects that we haven't got round to reliably quantifying/measuring yet.
And I'm so dumb, that I'm just hearing things that I want to hear? Puh-lease!
So tell me: if a signal is playing at (for example) 24/192, just how small must the jitter be in order for it to be incapable of inducing significant artifacts (i.e., artifacts that would cause the digital data stream to be different if it were re-sampled) into a theoretically perfect DAC's output? And how would you set about measuring that? Now, I'm no expert, but I think I can do that calculation. Why don't you give it a go yourself and tell me what you come up with? (Hey, I'm not trying to put you down here - I'm being serious.)
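For what it's worth, here's the back-of-envelope version I had in mind, sketched as a few lines of Python so the arithmetic is easy to check. The assumptions are mine, not gospel: take a full-scale sine at the Nyquist frequency as the worst case, and call an artifact "significant" when the timing error shifts the sampled value by one 24-bit LSB at the steepest point of the waveform.

```python
# Back-of-envelope jitter bound for a 24-bit / 192 kHz stream.
# Assumption: worst case is a full-scale sine at Nyquist (96 kHz),
# where the signal's slew rate is at its maximum.
import math

bits = 24
fs = 192_000            # sample rate, Hz
f = fs / 2              # worst-case tone at Nyquist, Hz
A = 1.0                 # normalised full-scale amplitude (peak)

lsb = 2 * A / 2**bits           # one LSB of the 24-bit span (peak-to-peak = 2A)
max_slew = 2 * math.pi * f * A  # maximum slope of the sine, units per second

# Timing error that moves the sampled value by one LSB at the steepest point:
jitter_limit = lsb / max_slew
print(f"Jitter must stay below ~{jitter_limit * 1e12:.2f} ps "
      f"to keep the error under 1 LSB at {f/1000:.0f} kHz")
# -> roughly 0.2 ps for this hypothetical worst case
```

That sub-picosecond figure is a worst-case bound under my assumptions, not a pass/fail threshold; real music rarely sits at full scale at 96 kHz. But it gives you a feel for the order of magnitude we're arguing about.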