Head-Fi.org › Forums › Equipment Forums › Sound Science › Discussing USB implementations. Why should the quality of the interface and cable matter?

Discussing USB implementations. Why should the quality of the interface and cable matter?

post #1 of 18
Thread Starter 

Preface: I am a computer engineering graduate and work in the software business. I am also a relative newbie with respect to the audiophile scene. That said, my father is an avid audiophile and I have had the pleasure of auditioning many setups with him. Recently, I built him a music server and we've auditioned the Meitner, Invicta, and Weiss DAC202 DACs on his B&W 800 Diamond speakers.

 

Being a computer engineer, I am relatively well versed in the technical workings and interactions of computer interfaces. That being said, I do not have any professional experience with the USB interface. Most of my hardware engineering experience is limited to FPGA development, microcontroller programming, and communication with relatively simple interfaces such as serial interfaces.

 

I am not going to discount the possibility that the quality of the USB interface on the PC or that the quality of the USB cable matters. It very well might. I'm here to find out why.

 

To me, used in the intended asynchronous fashion, USB should be able to deliver perfect 1:1 data, completely free of degradation or timing issues. Take a USB drive, for example. When you transfer data to and from it, nothing happens to the data. Error checking and proper back-and-forth communication between the drive and the host's USB controller ensure that the data arrives perfectly. If there are any flaws in the data, they are detected by the error check and promptly fixed by retransmission. Undetected errors are extremely rare, to the point where I have never encountered a corrupt file sent over USB in my lifetime. They're possible, but the error detection used is quite robust and makes them extremely unlikely.
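For bulk transfers, that detection is a CRC rather than forward error correction; a corrupted packet fails the check and is resent. A minimal sketch of the CRC-16/USB checksum (reflected polynomial 0x8005, init and final XOR 0xFFFF) shows how even a single flipped bit is caught; the packet contents here are made up for illustration:

```python
def crc16_usb(data: bytes) -> int:
    """CRC-16/USB: reflected poly 0x8005 (0xA001), init 0xFFFF, final XOR 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

packet = bytes(range(64))                    # pretend payload of one bulk packet
good = crc16_usb(packet)

corrupted = bytearray(packet)
corrupted[10] ^= 0x04                        # flip one bit "in transit"
assert crc16_usb(bytes(corrupted)) != good   # receiver detects the error and can request a resend
```

A CRC only detects errors; the "fix" is the protocol-level retransmission, which is exactly what isochronous audio streams forgo.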

 

That being said, I realize that the USB interfaces in DACs may not use USB in this ideal fashion. They say they use asynchronous USB, but I'm not sure how it's implemented. Data is buffered into the DAC's USB interface at a rate determined by the DAC's asynchronous USB controller. How big is the buffer? How likely are data errors? I would assume that the PCM audio data arrives in "chunks", each chunk carrying a block of PCM data with attached error-check bits. If a chunk is determined to be erroneous, the DAC would have to request a new one from the PC. If it isn't retrieved correctly in time, you'd have a missing chunk of data, resulting in audio dropouts. After the data makes it through the buffer, it's essentially reclocked by the DAC's master clock and sent on to the DAC chip.
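The request-replace-or-drop flow described above can be sketched as a toy model. Everything here is hypothetical (the chunk size, the 10% error rate, the retry budget standing in for a deadline), not any real DAC's firmware:

```python
import random

def receive_chunk(rng):
    """Hypothetical transfer: returns (payload, crc_ok). ~10% of chunks arrive damaged."""
    payload = bytes(rng.randrange(256) for _ in range(8))
    return payload, rng.random() > 0.10

def fill_buffer(n_chunks, max_retries=3, seed=1):
    """Request chunks; on a bad check, re-request up to max_retries times.
    If the retry budget (our stand-in for the deadline) is exhausted,
    the chunk is dropped: an audible dropout."""
    rng = random.Random(seed)
    buffer, dropouts = [], 0
    for _ in range(n_chunks):
        for _ in range(1 + max_retries):
            payload, ok = receive_chunk(rng)
            if ok:
                buffer.append(payload)
                break
        else:                      # every attempt failed before the deadline
            dropouts += 1
    return buffer, dropouts

buffer, dropouts = fill_buffer(1000)
# With a 10% error rate and 3 retries, a dropout needs 4 failures in a row:
# roughly 0.1**4 per chunk, so dropouts are rare.
```

The design point is that retries trade latency for integrity, which is why the buffer depth question above matters.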

 

I've described what I suspect happens in asynchronous USB implementations. If designed properly, I see absolutely no reason why the quality of the cable or USB port would affect the sound. At least, there's no reason it should affect the bits or how they're processed by the DAC in any way.

 

That being said, there are a few unknowns to me: can the +5 V and ground lines in the USB connection somehow "dirty" the DAC's power? I don't know... I'm not terribly comfortable with electrical engineering subject matter.

 

If someone could enlighten me, I'd be forever grateful.

 

post #2 of 18

Though I'm responding, I don't really have the answers either.  Please correct anything below.

 

 

In standard USB audio implementations--which are isochronous, not asynchronous, I think--data is not buffered much on the DAC side (so lower latency, but lower reliability).  AFAIK it's sent pretty much as it's needed, with no FEC applied.  As far as I can tell, USB has no FEC at all, just a CRC for error detection, and retransmission happens only on the reliable transfer types used for things like storage media.  Hence dropouts occur more often than they really ought to for USB audio, when there are misbehaving drivers eating CPU cycles or the CPU otherwise just doesn't get around to sending the audio data in a timely fashion.

 

USB supplies +5V and GND.  Definitely the power lines will be pretty dirty in any computer system, because of all the fast-switching nearby power hogs like the CPU and potentially GPU, and even some of the I/O all around.  From the computer power supply itself, +5V should have no more than about 50 mV AC ripple on top of the DC (50 mV is ATX spec, but you'll get maybe 10 mV from a high quality supply or possibly lower depending on the load, if it's pretty constant), but the line from the motherboard probably has additional transients under realistic usage.
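To put those ripple numbers in perspective, here is the arithmetic relative to the 5 V rail (the dB figures are just 20·log10 of the ratio; how much of this reaches the analog output depends on the DAC's filtering and PSRR, discussed below):

```python
import math

def ripple_db(ripple_v, rail_v=5.0):
    """Ripple amplitude relative to the supply rail, in dB."""
    return 20 * math.log10(ripple_v / rail_v)

print(round(ripple_db(0.050)))   # -40: the 50 mV ATX limit is 1% of the 5 V rail
print(round(ripple_db(0.010)))   # -54: a good supply at 10 mV
```

Either figure is far above the noise floor of a good DAC's output, which is why the supply rejection in the next paragraph is doing the real work.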

 

However, with reasonable power supply filtering and modern chips which have very high typical PSRR (power supply rejection ratio), it's not such a big deal, and you can still get extraordinarily clean analog outputs even using USB power.

 

I'd expect with cables of reasonable construction and length, etc., bit errors in the data itself should be pretty rare, so data really shouldn't be corrupted, and sound quality should more or less be the same regardless of what is used.

post #3 of 18
Quote:
Originally Posted by mikeaj View Post
However, with reasonable power supply filtering and modern chips which have very high typical PSRR (power supply rejection ratio), it's not such a big deal, and you can still get extraordinarily clean analog outputs even using USB power.


Not really; the power supply is almost everything for a DAC, and a low-ripple external power source is usually necessary (no need for anything extremely exotic, something normal will do). I can't fully make my point here, but this is from what I have read up so far.

 

Also, if I'm not wrong, the problem with some async USB implementations is that the clock crystals are not precise enough, or, in cheaper solutions, that dedicated oscillators aren't used at all. That being said, the clocks (again) also need to be fed from a clean power source. 

 

 

post #4 of 18

For me the issue is how does corruption of the data (which a better 'audiophile' USB cable supposedly reduces) then translate into differences in sound quality?

 

Testing has found that with a properly specified and well-made USB cable, length is the only factor through which data can be corrupted, and when it is, it appears as cracks and dropouts, not the harsher treble or reduced soundstage that many claim.

post #5 of 18
Quote:
Originally Posted by Prog Rock Man View Post

For me the issue is how does corruption of the data (which a better 'audiophile' USB cable supposedly reduces) then translate into differences in sound quality?

 


From a previous USB thread, the cable makes no difference at all; it is the DAC that makes the difference. Even if it does make a difference (which I suspect), I highly doubt that audiophile companies have the equipment to measure the effects (J-Test). The effects of USB data corruption due to jitter can be eliminated with a good clock implementation, though.

post #6 of 18

The point of a proper asynch USB connection with back-and-forth error correction is that jitter is rendered completely irrelevant. The USB clock is rendered completely irrelevant. The DAC receives a packet with error-check bits and can verify 100% that there is no data corruption. If there is, it either reconstructs the packet from redundancy bits in the packet (this may or may not occur... it depends on what packet layout the USB communication protocol uses), or requests a new packet from the PC. When the packet arrives intact at the DAC's USB input and passes the error check, it's stored in a buffer, which is basically solid-state memory. The bits are stored as copies of the 1's and 0's that were in the data stream. When they make it through the buffer, they're essentially retransmitted as new data, and the waveform that carried them over the wire is not transferred in any way.

 

What I'm suggesting is that asynch USB should work the way bulk-mode data transfer does. It ensures that data is transferred perfectly. Obviously, this needs a buffer into which packets can be inserted in place and out of order.

 

I'm interested to know whether current asynchronous USB connections do this. Are they truly ensuring data integrity? I'm assuming so, since the size of the buffer does matter: the smaller the buffer, the less time the USB connection has to fix errors and the more likely dropouts become.
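The buffer-size intuition can be illustrated with a toy simulation: a FIFO capped at N packets, one packet consumed per USB frame, and bursty arrivals averaging one packet per frame. None of the numbers model a real device; the point is only the monotonic effect of depth:

```python
import random

def count_dropouts(buffer_frames, seed=0):
    rng = random.Random(seed)
    level = buffer_frames                       # start with the FIFO full
    dropouts = 0
    for _ in range(10_000):                     # one pass = one 1 ms USB frame
        arrivals = rng.choice([0, 1, 1, 2])     # bursty delivery, 1 packet/frame on average
        level = min(level + arrivals, buffer_frames)
        if level > 0:
            level -= 1                          # DAC consumes one packet per frame
        else:
            dropouts += 1                       # nothing to play: audible dropout
    return dropouts

# A deeper buffer rides out the same arrival pattern with fewer underruns.
assert count_dropouts(2) > count_dropouts(64)
```

Since the arrival sequence is identical for both runs, the comparison isolates buffer depth: more depth means more time to recover before the output starves.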


Edited by canuck525 - 12/10/11 at 7:06am
post #7 of 18
Quote:
Originally Posted by canuck525 View Post

The point of a proper asynch USB connection with back-and-forth error correction is that jitter is rendered completely irrelevant. The USB clock is rendered completely irrelevant. The DAC receives a packet with error-check bits and can verify 100% that there is no data corruption. If there is, it either reconstructs the packet from redundancy bits in the packet (this may or may not occur... it depends on what packet layout the USB communication protocol uses), or requests a new packet from the PC. When the packet arrives intact at the DAC's USB input and passes the error check, it's stored in a buffer, which is basically solid-state memory. The bits are stored as copies of the 1's and 0's that were in the data stream. When they make it through the buffer, they're essentially retransmitted as new data, and the waveform that carried them over the wire is not transferred in any way.

 

What I'm suggesting is that asynch USB should work the way bulk-mode data transfer does. It ensures that data is transferred perfectly. Obviously, this needs a buffer into which packets can be inserted in place and out of order.

 

I'm interested to know whether current asynchronous USB connections do this. Are they truly ensuring data integrity? I'm assuming so, since the size of the buffer does matter: the smaller the buffer, the less time the USB connection has to fix errors and the more likely dropouts become.

 

I say the answer is yes, or else we would notice the resulting clicks, pops and drop outs.
 

 

post #8 of 18

Quote:

Originally Posted by firev1 View Post

 

From a previous USB thread, the cable makes no difference at all; it is the DAC that makes the difference. Even if it does make a difference (which I suspect), I highly doubt that audiophile companies have the equipment to measure the effects (J-Test). The effects of USB data corruption due to jitter can be eliminated with a good clock implementation, though.


How would you get data corruption from digital baseband communications with jitter?  You'd need really, really high jitter for that: jitter almost on the order of the duration of each transmitted bit.

 

For every bit sent, the system is probably doing more or less the mathematical equivalent of integrating the received (differential) voltage over the time interval when that bit was sent.  If it sees the differential value change from J to K or vice versa, it detects that a '0' was sent.  Otherwise, it detects that a '1' was sent.  You'd need lots of noise or huge jitter (making a J bleed far into the time when a K was supposed to be received) to cause an error.
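The J/K scheme described here is NRZI signaling: a change of line state encodes a 0, no change encodes a 1. A toy decoder (ignoring bit stuffing and the actual electrical details):

```python
def nrzi_decode(line_states):
    """Decode NRZI: state change -> 0, same state -> 1.
    Assumes the bus idles in the J state before the first bit."""
    bits = []
    prev = "J"
    for state in line_states:
        bits.append(1 if state == prev else 0)
        prev = state
    return bits

# Decoding the state sequence K J J K K against an idle J:
print(nrzi_decode(list("KJJKK")))   # [0, 0, 1, 0, 1]
```

Note that real USB also inserts a stuffed 0 after six consecutive 1's so the receiver always sees transitions to stay synchronized; this sketch leaves that out.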

post #9 of 18
Quote:
Originally Posted by mikeaj View Post

Quote:


How would you get data corruption from digital baseband communications with jitter?  You'd need really, really high jitter for that: jitter almost on the order of the duration of each transmitted bit.

 

Actually no: jitter is measured in picoseconds, nanoseconds, and microseconds.  With long cables especially, cable loss is very real; with short ones it's probably well below the audible limit. And USB was never jitter-free in the first place. It is more a question of the DAC's sampling clock and oscillator themselves, both of which will be audible if implemented incorrectly. After all, it takes a mere 20 ps of error in the sampling clock for jitter to be audible on a 20 kHz signal.

 

Source: http://www.nanophon.com/audio/jitter92.pdf (We are in the Sound Science section, after all)

http://www.benchmarkmedia.com/sites/default/files/documents/EAN-interview-john.pdf
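The 20 ps at 20 kHz figure can be sanity-checked with the standard slope argument: sampling a full-scale sine of frequency f a time t_j early or late produces an error of at most 2π·f·t_j relative to full scale, since the sine's slope is bounded by 2π·f:

```python
import math

def jitter_error_dbfs(signal_hz, jitter_s):
    """Worst-case error of a full-scale sine sampled t_j early/late,
    relative to full scale: |d/dt sin(2*pi*f*t)| <= 2*pi*f."""
    return 20 * math.log10(2 * math.pi * signal_hz * jitter_s)

print(round(jitter_error_dbfs(20_000, 20e-12)))   # -112 (dBFS)
```

An error floor around -112 dBFS sits near the resolution limits being argued about, which is roughly where the cited Dunn paper places the audibility threshold.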


Edited by firev1 - 12/10/11 at 9:02am
post #10 of 18
Quote:
Originally Posted by firev1 View Post

Actually no: jitter is measured in picoseconds, nanoseconds, and microseconds.  With long cables especially, cable loss is very real; with short ones it's probably well below the audible limit. And USB was never jitter-free in the first place. It is more a question of the DAC's sampling clock and oscillator themselves, both of which will be audible if implemented incorrectly. After all, it takes a mere 20 ps of error in the sampling clock for jitter to be audible on a 20 kHz signal.

 

Source:http://www.nanophon.com/audio/jitter92.pdf (We are in the sound science section after all)

http://www.benchmarkmedia.com/sites/default/files/documents/EAN-interview-john.pdf


That's about jitter in the D/A, not jitter in digital communications.  The USB data communication is (more or less) independent of the clock / jitter / etc. in the D/A process, right?  The D/A is not getting clock information from the USB transmission (edit: never mind, that depends on the implementation...).  Jitter in the D/A is what causes frequency modulation of the analog output, which is definitely bad.

 

 

So when you talk about "USB data corruption" with regard to the physical signaling, it could be sending files over USB, sending audio, or whatever else.  And that's just about receiving 1's and 0's correctly.  Even if the signal is jittery or noisy, as long as you get the proper 1's and 0's, it's all good.  The USB 1.1 (full-speed) signaling rate is 12 Mb/s, so something around 83 ns per symbol (bit).  Show me jitter on the order of 10 ns or more and then maybe I could see data getting corrupted every once in a while.
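For a sense of scale, compare the full-speed bit period with the picosecond-range sampling-clock jitter discussed earlier (the 10 ns figure above is a rough threshold for argument's sake, not a spec value):

```python
bit_period_ns = 1e9 / 12e6           # full-speed USB: 12 Mb/s
sampling_jitter_ns = 20e-12 * 1e9    # the 20 ps D/A sampling-clock figure, in ns

print(round(bit_period_ns, 1))                     # 83.3 ns per bit
print(round(bit_period_ns / sampling_jitter_ns))   # 4167: the two scales differ by ~4000x
```

Jitter big enough to flip bits on the wire is thousands of times larger than the jitter that matters at the D/A conversion step, which is why the two problems are best kept separate.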

 

If the D/A is dependent on the interface like USB, then that's a different matter, and it is important.

 

 

Anyway, the point still stands about USB data corruption.  You won't corrupt the USB data itself with a little bit of jitter.  You just might corrupt whatever is behind the USB interface (unless it's something like a hard drive, which is obviously just interested in the 1's and 0's being successfully transmitted).


Edited by mikeaj - 12/10/11 at 9:44am
post #11 of 18
Quote:
Originally Posted by mikeaj View Post


That's about jitter in the D/A, not jitter in digital communications.  The USB data communication is (more or less) independent of the clock / jitter / etc. in the D/A process, right?  The D/A is not getting clock information from the USB transmission (edit: never mind, that depends on the implementation...).  Jitter in the D/A is what causes frequency modulation of the analog output, which is definitely bad.

 

 

So when you talk about "USB data corruption" with regard to the physical signaling, it could be sending files over USB, sending audio, or whatever else.  And that's just about receiving 1's and 0's correctly.  Even if the signal is jittery or noisy, as long as you get the proper 1's and 0's, it's all good.  The USB 1.1 (full-speed) signaling rate is 12 Mb/s, so something around 83 ns per symbol (bit).  Show me jitter on the order of 10 ns or more and then maybe I could see data getting corrupted every once in a while.

 

If the D/A is dependent on the interface like USB, then that's a different matter, and it is important.

 

 

Anyway, the point still stands about USB data corruption.  You won't corrupt the USB data itself with a little bit of jitter.  You just might corrupt whatever is behind the USB interface (unless it's something like a hard drive, which is obviously just interested in the 1's and 0's being successfully transmitted).

Yes, I agree with your points. But that earlier article also covers line jitter (mainly in digital coax, not USB) and strategies to counter it. It really is quite an informative read. 

 

And to make my stance clear: with proper DAC implementation, "audiophile" USB cables make no difference.

 

post #12 of 18

I think I got hung up on "USB data corruption due to jitter."  I interpreted this as corruption of the USB data by jitter (which hardly ever happens), rather than the USB data causing corruption by introducing jitter elsewhere (which can happen with some implementations, and which may be audible).

 

I'm gonna blame the English language and call it a day. 


Edited by mikeaj - 12/10/11 at 10:35am
post #13 of 18

Far too many threads on USB get hung up on jitter, IMO. Any to-spec DAC/interface designed with audio in mind will work fine; jitter and cables are a complete red herring, and neither is linked in any proven way to variations in sound quality.

post #14 of 18

I'm likewise not convinced there are audible effects from jitter at reasonable levels (which most gear achieves).  If it's really bad, then sure.

 

But there's really little else to talk about, since nothing much else could really make a difference.  Length is the most important parameter of the cable, and that's likely not an issue for most standard installations.

post #15 of 18

I know this thread is pretty much dead, but what are your thoughts on this comment? http://www.benchmarkmedia.com/discuss/forum/general-conversation/thoughts-asynchronous-transfer


Edited by firev1 - 12/10/11 at 8:50pm