Actually, I am trying to get you to justify that statement, because it simply isn't true. Let me refer you to the Wikipedia page (because it is convenient, not because it is an absolute reference) here. In the section "Sampling jitter" it states: "less than a nanosecond of jitter can reduce the effective bit resolution of a converter with a Nyquist frequency of 22 kHz to 14 bits". I think it means 16 bits, not 14, as it is referring to CD playback. Where does <1 ns come from? Well, when you convert a digital signal to analog, the thing to bear in mind is this:
The right signal at the wrong time is the wrong signal. So, if a digital data stream contains 16-bit data, the data is specified with a resolution of one part in 65,536. It is straightforward to appreciate that the sample therefore has to be converted from the digital domain to the analog domain at a point in time which is accurate to within 1/65,536th of the sampling interval. The sample frequency is 44.1 kHz, so the sampling interval is 22.7 microseconds. Therefore, if the DAC timing signals are off by more than 1/65,536th of 22.7 microseconds, which is 346 picoseconds, then the analog signal will be wrong. This is what we mean by jitter. If the digital data stream is to be systematically accurate to within 346 ps, and we assume that the USB controller at the computer is perfect and presents a source signal with no jitter, this implies that the bandwidth of the signal delivery system (the USB cable) needs to be close to 1 GHz. That is non-trivial. I think it is not unreasonable to postulate that different USB cables can have different transmission characteristics in the GHz frequency range, and thereby can have - in principle - an audible effect on Red Book music.
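For anyone who wants to check the arithmetic, here it is in a few lines of Python. This is just my own sketch of the calculation described above, nothing more:

```python
# Sketch of the Red Book (CD) jitter arithmetic:
# timing tolerance = sampling interval / number of quantization levels.
sample_rate = 44_100                 # Hz, Red Book CD audio
bits = 16
interval = 1 / sample_rate           # sampling interval, ~22.7 microseconds
levels = 2 ** bits                   # 65,536 quantization levels
jitter_limit = interval / levels     # ~346 picoseconds
print(f"sampling interval: {interval * 1e6:.1f} us")
print(f"jitter limit:      {jitter_limit * 1e12:.0f} ps")
```

Run it and you get the same 346 ps figure quoted above.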
Now let's move on to something like 24/192. If we apply the same rationale, the jitter requirement becomes 310 femtoseconds. A femtosecond is a millionth of a billionth of a second. And the bandwidth required to systematically guarantee jitter-free transmission is on the order of a thousand GHz. USB cables do NOT transmit those frequencies. Do you still insist that a USB cable CANNOT POSSIBLY impact the sound of the resultant analog signal? Yes or no?
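Before you answer, here is the 24/192 arithmetic as well, so you can verify it yourself (again my own sketch, under the same one-part-in-2^24 timing assumption):

```python
# Same reasoning applied to 24-bit / 192 kHz audio.
sample_rate = 192_000                  # Hz
bits = 24
interval = 1 / sample_rate             # ~5.2 microseconds
jitter_limit = interval / 2 ** bits    # ~310 femtoseconds
bandwidth = 1 / jitter_limit           # reciprocal: thousands of GHz
print(f"jitter limit: {jitter_limit * 1e15:.0f} fs")
print(f"bandwidth:    {bandwidth / 1e9:.0f} GHz")
```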