Someone is going to have to biff me and explain the following to me:
All bus protocols (all of them), from PCI Express to USB, have some level of jitter with respect to the clock syncing the bus. Obviously some bus protocols also allow for retransmits and keep errors in check with special line encodings (8b/10b, as in SAS).
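A back-of-the-envelope on the 8b/10b point, since it comes up a lot: the encoding costs a fixed 20% of the raw line rate (10 line bits per 8 data bits), a known, budgeted-for overhead rather than something that drifts with cable quality. The line rates below are just illustrative numbers.

```c
/* Back-of-the-envelope: payload bandwidth under 8b/10b line coding.
 * Every 8 data bits go over the wire as a 10-bit symbol, so 20% of
 * the raw line rate is coding overhead by design. The line rates
 * below are illustrative only. */
#include <stdio.h>

static double payload_gbps(double line_rate_gbps)
{
    return line_rate_gbps * 8.0 / 10.0;   /* 8 data bits per 10 line bits */
}

int main(void)
{
    printf("3.0 Gbit/s link: %.2f Gbit/s of payload\n", payload_gbps(3.0));
    printf("2.5 Gbit/s link: %.2f Gbit/s of payload\n", payload_gbps(2.5));
    return 0;
}
```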
What folks are not addressing is that any jitter caused by the OS driver not delivering packets out the HCI within the exact time domain should be buffered at the other end. Whether the master clock is at the host or the target, I would *THINK* that all DAC implementations would buffer the input and RECLOCK the bits before feeding them into the DAC. In other words, the Windows driver stuff seems to me like hand-waving over the real issue (no offense to anyone; does it surprise you that a senior Unix kernel developer says that?).
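For what it's worth, here is the shape of what I mean by "buffer and reclock", as a toy C sketch (not any real DAC's firmware, and FIFO_DEPTH is a number I made up): samples go in whenever the bus happens to deliver them, and come out at a steady rate in the converter's own clock domain.

```c
/* Toy sketch of "buffer then reclock" -- not any real DAC's firmware.
 * The bus pushes samples in whenever packets happen to arrive (jittery);
 * the converter side pops them out at a steady rate driven by its own
 * local clock. As long as the FIFO neither underruns nor overruns,
 * delivery jitter never reaches the analog stage. */
#include <stdint.h>
#include <stdbool.h>

#define FIFO_DEPTH 4096                 /* samples; arbitrary for the sketch */

struct sample_fifo {
    int32_t buf[FIFO_DEPTH];
    unsigned head, tail;                /* head = next write, tail = next read */
};

static bool fifo_push(struct sample_fifo *f, int32_t s)
{
    unsigned next = (f->head + 1) % FIFO_DEPTH;
    if (next == f->tail)
        return false;                   /* overrun: host ran ahead of the DAC clock */
    f->buf[f->head] = s;
    f->head = next;
    return true;
}

static bool fifo_pop(struct sample_fifo *f, int32_t *s)
{
    if (f->tail == f->head)
        return false;                   /* underrun: bus fell too far behind */
    *s = f->buf[f->tail];
    f->tail = (f->tail + 1) % FIFO_DEPTH;
    return true;
}

int main(void)
{
    struct sample_fifo f = { .head = 0, .tail = 0 };
    int32_t out;

    /* Producer side: a burst of samples lands whenever the bus gets around to it. */
    for (int32_t s = 0; s < 480; s++)
        (void)fifo_push(&f, s);

    /* Consumer side: the DAC clock domain drains one sample per tick. */
    while (fifo_pop(&f, &out))
        ;                               /* 'out' would feed the converter here */
    return 0;
}
```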
Unless you are losing bits due to the sample size (which you aren't, AFAIK), the USB cable is simply a bit pusher that needs to push the bits in a relatively timely fashion (which it does in spades). Putting driver implementation and interrupt latency aside, why oh why would I be that concerned over small jitter fluctuations due to clocking/sync issues from the initial bits being sent over the wire to the DAC? The DAC is clearly going to buffer and RECLOCK (or resample) the bits against a much better local clock to give me an accurate D/A conversion.
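And to put a number on "relatively timely": assuming a buffer of a few thousand samples (the 4096 below is just a placeholder), the slack it buys at common sample rates is tens of milliseconds, which dwarfs any late packet delivery I'd expect to see on a healthy bus.

```c
/* How much delivery slop does a given buffer absorb? slack = depth / rate.
 * The 4096-sample depth and the rates below are illustrative; the point
 * is that even a modest buffer gives tens of milliseconds of headroom
 * against late packet delivery. */
#include <stdio.h>

int main(void)
{
    const double rates_hz[] = { 44100.0, 96000.0, 192000.0 };
    const unsigned depth = 4096;        /* samples of buffering */

    for (unsigned i = 0; i < sizeof(rates_hz) / sizeof(rates_hz[0]); i++)
        printf("%6.0f Hz: %u samples = %.1f ms of slack\n",
               rates_hz[i], depth, 1000.0 * depth / rates_hz[i]);
    return 0;
}
```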
The only aspect of this I can't verify: if you get so far out of sync over the USB cable that the connection literally has to resync itself while data is in flight, you will suffer bit loss (more likely a full interruption of playback). BTW, this is similar to network cards losing sync on a link due to broken autonegotiation implementations, which drops packets and causes TCP retransmits, etc.
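Building on the toy FIFO above, the failure mode I'd expect in that case looks like this sketch: the consumer side hits an empty buffer and mutes, which is an obvious dropout, not some subtle degradation of the bits that did arrive.

```c
/* Building on the toy FIFO sketch above: what happens when the link really
 * does drop out. The consumer hits an empty buffer and outputs silence
 * until data resumes -- an audible interruption, not subtly "worse" bits.
 * fifo_pop() and struct sample_fifo are the helpers from the earlier sketch. */
static int32_t next_dac_sample(struct sample_fifo *f)
{
    int32_t s;

    if (fifo_pop(f, &s))
        return s;                       /* normal case: play what arrived */
    return 0;                           /* underrun: hard mute, obvious dropout */
}
```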
I don't get it, please help!
