When you are talking about USB synchronous transfer, the problems are really all the same as the ones you get with serial digital audio interfaces such as TOSLINK, coaxial S/PDIF, or AES/EBU. Asynchronous USB transport does get around some of these problems *if* your equipment has that capability. But consider the following:
The transfer of "ones and zeros" is an oversimplification of what physically has to happen to convey data, and it is often misunderstood. All information transfer is analog. Digital data travels as an analog signal and is therefore still subject to some (although fewer) analog problems.
For example, on a wire carrying a voltage between 0 and 5 volts, what voltage constitutes a "0"? 0 volts for sure, but what about 0.1 volts, or 0.2 volts? Noise does occur on digital lines. And what would be a "1"? Well, 5.0 volts for sure, but the further down the line you go, the more that signal degrades. If you saw a value of 0.5 volts, would you consider it a "0" or a "1"? Noise on the line can knock a signal up or down. And when do you look at the line to "read" the bit value: at the beginning, the middle, or the end? How do you even know where any of these are when the waveform of the bit is distorted due to noise, cable length, and loss of the required harmonics?
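To make that concrete, here's a toy sketch of the receiver's decision problem. The names and the simple midpoint threshold are made up for illustration; real receivers use more sophisticated slicers, but the ambiguity is the same:

```python
# Toy illustration (hypothetical names/values): classify a noisy sampled
# line voltage as a logical 0 or 1 using a simple midpoint threshold.
V_HIGH = 5.0
THRESHOLD = V_HIGH / 2  # 2.5 V: the receiver's decision point

def read_bit(sampled_voltage):
    """Classify a sampled line voltage as a logical 0 or 1."""
    return 1 if sampled_voltage >= THRESHOLD else 0

# Clean levels decode unambiguously...
assert read_bit(5.0) == 1
assert read_bit(0.1) == 0
# ...but a degraded 0.5 V signal hit by +2.1 V of noise flips to a "1",
# and the receiver has no way to know the reading is wrong.
print(read_bit(0.5 + 2.1))  # -> 1
```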
If a bit is knocked out, or a whole block of bits takes a hit, most digital protocols have error detection (parity bits, checksums) and correction to deal with it. If a receiver gets a section of data that is marked as containing bad information, it can ask the transmitter to re-send it. This is how computer data gets transferred with exact precision.
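Parity is the simplest example of this kind of detection. Here's a minimal sketch (function names are mine; real links use stronger codes such as CRCs, but the principle is the same):

```python
def even_parity_bit(data_bits):
    """Compute the parity bit so that the total count of 1s is even."""
    return sum(data_bits) % 2

def check(data_bits, parity):
    """True if the received word is consistent with its parity bit."""
    return even_parity_bit(data_bits) == parity

word = [1, 0, 1, 1, 0, 0, 1, 0]
p = even_parity_bit(word)       # four 1s -> parity bit is 0
assert check(word, p)           # clean transmission passes

corrupted = word[:]
corrupted[2] ^= 1               # a single bit knocked over by noise
assert not check(corrupted, p)  # parity catches it; receiver requests a re-send
```

Note that a single parity bit only detects the error; it can't say which bit flipped, which is why the receiver falls back on asking for a re-transmission.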
That is "computer data". Audio data is quite a different story: it is all "real time". In general, incoming bits must land EXACTLY in the time slot in which they are expected. Let's go back to the voltage-on-a-wire discussion above.
In our example, the bit stream going out on the wire is basically a train of squared-off waveforms with a value of either 0 or 5 volts. A Fourier analysis of a square wave shows that you need all of the odd harmonics of the fundamental to keep the edges "sharp" (i.e., for the square wave to actually look square). As you lose those harmonics, the square wave distorts into a rounded-off waveform, and once that happens it is hard to know where the "beginning" of a bit is. So here's a little math:
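You can see this directly from the Fourier series of a square wave, f(t) = (4/π) Σ sin(nωt)/n over odd n. A quick sketch (1 kHz chosen arbitrarily for illustration):

```python
import math

def partial_square(t, f0, n_harmonics):
    """Sum the first n_harmonics odd harmonics of a unit square wave at f0 Hz."""
    total = 0.0
    for k in range(n_harmonics):
        n = 2 * k + 1  # odd harmonics: 1, 3, 5, ...
        total += math.sin(2 * math.pi * n * f0 * t) / n
    return 4 / math.pi * total

# Evaluate at the middle of the "high" half-cycle of a 1 kHz square wave.
t_mid = 0.25e-3  # quarter period
# With only the fundamental you get a rounded sine peaking at 4/pi ~ 1.27;
# with many odd harmonics the sum flattens out near the true value of 1.0.
print(partial_square(t_mid, 1000.0, 1))    # fundamental only: ~1.273
print(partial_square(t_mid, 1000.0, 200))  # 200 odd harmonics: close to 1.0
```

Cut off the higher odd harmonics and the flat top (and the sharp edges that mark the bit boundaries) melt away.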
If you are transferring a 24-bit/192 kHz audio file over a digital cable, that is a bit rate of 24 * 192 kHz = about 4.6 megabits per second (per channel; a stereo stream doubles that). In order to maintain a decent signal waveform, let's limit the number of odd harmonics we try to preserve to about 5. This takes you to the 11th harmonic of 4.6 MHz, which is around 51 MHz. Your cable has to pass 51 MHz just to maintain the bulk of the waveform. If you are trying to transfer computer data, this is not necessary. Even if the waveform is totally rounded off by losses in the cable, the computer just looks at the middle of the bit's signal to determine if it is a "0" or a "1", and if it reads wrong, it uses error correction techniques to fix it. A normal full-speed USB link at 12 Mbit/s would be sufficient for that computer data.
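The arithmetic checks out as a back-of-the-envelope calculation, assuming the standard 192 kHz rate and a single channel:

```python
# Back-of-the-envelope version of the numbers above (single channel).
bit_depth = 24
sample_rate = 192_000                 # Hz
bit_rate = bit_depth * sample_rate    # bits per second
harmonic = 11                         # keep the first five odd harmonics: 3,5,7,9,11
bandwidth_needed = harmonic * bit_rate

print(bit_rate)          # 4608000  -> ~4.6 Mbit/s
print(bandwidth_needed)  # 50688000 -> ~51 MHz
```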
However, since audio data is real time, you can't be requesting re-transmission of data when hits occur (there's no time to do it), so the cable needs to be uber-shielded and the DAC has to have some other scheme to eliminate bit errors. Also, at 12 Mbit/s you barely get the first odd harmonic (4.6 MHz * 3 = 13.8 MHz) of the 5 that you need for proper bit positioning. Since the re-clocking of the data is based on when the bits are detected, and since each bit is significantly rounded off and distorted without those odd harmonics, the clock has to continually re-adjust as the bits are fed into the DAC for conversion. This constant jumping around of the clock positioning of the bits is called jitter, and the effects of jitter in the data conversion can be heard, especially on highly resolving amps and transducers.
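Here's a toy simulation of what timing error does at conversion time. The numbers (10 kHz tone, ±2 ns of random timing wander) are made up for illustration, and this models jitter as simple random sample-instant error, not any DAC's actual clock-recovery scheme:

```python
import math
import random

random.seed(0)
FS = 192_000   # sample rate, Hz
F = 10_000.0   # test tone, Hz

def rms_error(jitter_s):
    """RMS error from sampling a sine with sample instants that wander by +/- jitter_s."""
    err = 0.0
    for n in range(FS):  # one second of samples
        t = n / FS
        ideal = math.sin(2 * math.pi * F * t)
        jittered = math.sin(2 * math.pi * F * (t + random.uniform(-jitter_s, jitter_s)))
        err += (jittered - ideal) ** 2
    return math.sqrt(err / FS)

print(rms_error(0.0))   # perfect clock: exactly 0.0
print(rms_error(2e-9))  # +/- 2 ns of wander: a small but nonzero error floor
```

The error is tiny in absolute terms, but it never averages away: it is a permanent noise/distortion floor added at the moment of conversion, which is why it can't be fixed after the fact the way a flipped bit in a file can.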
So, in summary: even though the USB standard can be used for error-free transfer of computer data (assuming that proper error checking and correction is used), under many circumstances USB cables with much higher bandwidth and noise rejection than the USB standard requires will improve audio data transfer (less jitter) and reduce data hits (less noise). How much of that benefit you see on your system depends on many other factors, of course.
where do you get this info from? this stuff is golden and not the kind of thing that comes with user manuals. do you work in the industry?