USB cable MATTERS!
Nov 27, 2008 at 5:59 AM Post #76 of 102
I was wondering what kind of USB cables people are using. A search on 'USB cable' led me to this thread. 'USB cable matters' - I keep hearing this statement these days and am always amazed by the sensitivity a pair of human ears can supposedly reach. To me, this is by far the most ridiculous statement I've seen on this forum. I don't want to be rude, but please stop passing wrong information on to other people ...

Let me make my point first. USB cable DOES NOT matter, well, at least for audio applications. USB transport is jitter-immune.

Why? I'll try my best to explain. Although USB is a serial bus (using NRZI encoding - an NRZI-encoded SPDIF would be cool), USB communication is actually packet based. There are three basic packet types: handshake (who said USB audio is one-way communication?), token (address) and data (payload with CRC). A simplified picture of USB communication looks like this: the host sends token+data, and the device either accepts the payload and responds with a handshake ACK, or rejects it with a handshake NAK. In the latter case, the host sends the token+data again. So in the USB world, there are absolutely no wrong or partial data packets being passed to the application layer. A data packet either arrives immediately or eventually arrives after retransmissions.

A data packet can carry a payload of up to 1K bytes, that is 8192 bits, or 512 16-bit samples. Using CD audio as an example, one USB data packet can hold a little less than 6ms of audio stream (two channels). Let's calculate how many packets can be exchanged on USB in those 6ms. Although USB 2.0 supports up to 480Mbps, most devices typically work at an average of 24Mbps with a peak transfer rate of 80-160Mbps. To be conservative, we use 24Mbps for our calculation. 24Mbps/1000*6 = 144K bits; that is to say, in 6ms USB can transfer 144K bits, or 9K 16-bit samples, which is roughly 18x the bandwidth required by CD audio. See my point here? If a bad packet is ever detected, there is room for up to 17 retries to get it right without interrupting playback of the audio stream (un-buffered, worst case). I would say any decent 'certified' USB cable has a BER (bit error rate) far better than what this margin requires - otherwise, how did it get certified in the first place?

Now the job of the USB cable is done, and you can see no jitter being introduced at all. The jitter actually kicks in at the following stage, when a USB receiver chip converts the received payload into a bit stream (serialization/re-clocking). I won't go there since I'm not an expert.
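For anyone who wants to check the arithmetic, here is a quick sketch. The 1K payload, the 24Mbps sustained rate and the simple packet/retry model are just the rough assumptions stated above, not figures taken from the USB spec:

```python
# Back-of-the-envelope check of the headroom argument above.
# All constants are the rough assumptions from the post, not spec values.

PAYLOAD_BYTES = 1024          # assumed max payload per data packet
BITS_PER_SAMPLE = 16
SAMPLE_RATE = 44_100          # CD audio
CHANNELS = 2
BUS_RATE_BPS = 24_000_000     # assumed conservative sustained USB 2.0 throughput

samples_per_packet = PAYLOAD_BYTES * 8 // BITS_PER_SAMPLE             # 512 samples
audio_ms_per_packet = samples_per_packet / (SAMPLE_RATE * CHANNELS) * 1000
bits_on_bus_in_window = BUS_RATE_BPS / 1000 * audio_ms_per_packet
headroom = bits_on_bus_in_window / (PAYLOAD_BYTES * 8)

print(f"one packet holds ~{audio_ms_per_packet:.1f} ms of CD audio")
print(f"in that window the bus can move ~{headroom:.0f} packets' worth of bits")
```

Using the exact ~5.8ms window instead of the rounded 6ms gives roughly 17x rather than 18x, but the point stands: the bus has a large margin for retransmissions.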

OK, people may ask: why are there super expensive USB cables out there? Well, every product has its own market. But using these high-class cables for USB audio is definitely overkill, IMO. I can see their use when the application is bandwidth sensitive, or simply requires the maximum throughput out of the USB transport. Mass storage and video are two obvious examples I can think of.

Let me summarize:
1. Lossless transmission is the beauty of digitization (SPDIF is rough and flawed - shame on Sony and Philips).
2. Any USB 2.0 certified cable is a good cable (for audio use).
3. There is no sonic difference between a $10 USB cable and a $300 USB cable.
4. Jitter comes from the USB receiver chip and beyond.

Edit: A cable matters only if the USB receiver is bus-powered, which is a bad design to start with anyway.

lk
 
Nov 27, 2008 at 6:04 AM Post #77 of 102
yeah? well explain to me the fact that i can hear a difference between the usb cable supplied by emu, the other usb cables i have lying around here (printer, etc.), and the kimber usb cable i got.

have you actually sat and compared? or are you just relying on what you think?
 
Nov 27, 2008 at 6:17 AM Post #78 of 102
Quote:

Originally Posted by x_lk
I was wondering what kind of USB cables people are using. A search on 'USB cable' led me to this thread. 'USB cable matters' - I keep hearing this statement these days and am always amazed by the sensitivity a pair of human ears can supposedly reach. To me, this is by far the most ridiculous statement I've seen on this forum. I don't want to be rude, but please stop passing wrong information on to other people ...

Let me make my point first. USB cable DOES NOT matter, well, at least for audio applications. USB transport is jitter-immune.

Why? I'll try my best to explain. Although USB is a serial bus (using NRZI encoding - an NRZI-encoded SPDIF would be cool), USB communication is actually packet based. There are three basic packet types: handshake (who said USB audio is one-way communication?), token (address) and data (payload with CRC). A simplified picture of USB communication looks like this: the host sends token+data, and the device either accepts the payload and responds with a handshake ACK, or rejects it with a handshake NAK. In the latter case, the host sends the token+data again. So in the USB world, there are absolutely no wrong or partial data packets being passed to the application layer. A data packet either arrives immediately or eventually arrives after retransmissions.

A data packet can carry a payload of up to 1K bytes, that is 8192 bits, or 512 16-bit samples. Using CD audio as an example, one USB data packet can hold a little less than 6ms of audio stream (two channels). Let's calculate how many packets can be exchanged on USB in those 6ms. Although USB 2.0 supports up to 480Mbps, most devices typically work at an average of 24Mbps with a peak transfer rate of 80-160Mbps. To be conservative, we use 24Mbps for our calculation. 24Mbps/1000*6 = 144K bits; that is to say, in 6ms USB can transfer 144K bits, or 9K 16-bit samples, which is roughly 18x the bandwidth required by CD audio. See my point here? If a bad packet is ever detected, there is room for up to 17 retries to get it right without interrupting playback of the audio stream (un-buffered, worst case). I would say any decent 'certified' USB cable has a BER (bit error rate) far better than what this margin requires - otherwise, how did it get certified in the first place?

Now the job of the USB cable is done, and you can see no jitter being introduced at all. The jitter actually kicks in at the following stage, when a USB receiver chip converts the received payload into a bit stream (serialization/re-clocking). I won't go there since I'm not an expert.

OK, people may ask: why are there super expensive USB cables out there? Well, every product has its own market. But using these high-class cables for USB audio is definitely overkill, IMO. I can see their use when the application is bandwidth sensitive, or simply requires the maximum throughput out of the USB transport. Mass storage and video are two obvious examples I can think of.

Let me summarize:
1. Lossless transmission is the beauty of digitization (SPDIF is rough and flawed - shame on Sony and Philips).
2. Any USB 2.0 certified cable is a good cable (for audio use).
3. There is no sonic difference between a $10 USB cable and a $300 USB cable.
4. Jitter comes from the USB receiver chip and beyond.

lk



I'm sorry, but you are mistaken. Jitter is added by the USB cable and is audible in most audio playback applications, just as jitter is added by an S/PDIF or AES/EBU cable. There are at least three mechanisms involved, not counting the receiver's sensitivity to edge rate:

1) cable losses
2) cable dielectric absorption
3) cable and connector mismatches in impedance

#1 causes low-pass filtering due to bandwidth limitations (a rough numeric sketch follows below)
#2 causes distortion due to dispersion
#3 causes reflections and inter-symbol interference
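To give a rough feel for #1: a band-limited cable slows the signal edges, and any noise or offset at the receiver's decision threshold then turns into timing error. This is a generic toy calculation with made-up numbers, not a measurement of any actual cable:

```python
# Toy model: cable bandwidth -> slower edges -> noise converts to timing error.
# All numbers are illustrative assumptions, not measurements.

CABLE_BANDWIDTH_HZ = 200e6      # assumed -3 dB bandwidth of a mediocre cable/connector chain
SWING_V = 0.4                   # roughly the USB high-speed differential swing (400 mV)
NOISE_V = 0.01                  # assumed 10 mV of noise/ISI at the receiver threshold

rise_time_s = 0.35 / CABLE_BANDWIDTH_HZ          # classic first-order rise-time estimate
slew_rate_v_per_s = SWING_V / rise_time_s        # how fast the edge crosses the threshold
timing_error_s = NOISE_V / slew_rate_v_per_s     # noise amplitude / slew rate = time error

print(f"rise time ~{rise_time_s*1e9:.2f} ns")
print(f"edge-crossing error ~{timing_error_s*1e12:.0f} ps per {NOISE_V*1e3:.0f} mV of noise")
```

Whether a bus-level timing error of that size survives into the DAC clock is, of course, exactly what the rest of this thread argues about.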

accurate D/A conversion requires 2 things:

1) accurate Data
2) accurate Timing

Steve N.
Empirical Audio
 
Nov 27, 2008 at 6:19 AM Post #79 of 102
Quote:

Originally Posted by [L]es
yeah? well explain to me the fact that i can hear a difference between the usb cable supplied by emu, the other usb cables i have lying around here (printer, etc.), and the kimber usb cable i got.

have you actually sat and compared? or are you just relying on what you think?



I have cables from EMU, M-Audio, and a cheap $10 cable from a local computer store. I hear no difference - or are my ears not golden enough? Well, I have never tried any USB cable with a price tag higher than $100, so maybe those have magic ... I do have a job in the communication protocol business and believe in scientific facts.
 
Nov 27, 2008 at 6:23 AM Post #80 of 102
Quote:

Originally Posted by audioengr
accurate D/A conversion requires 2 things:

1) accurate Data
2) accurate Timing



You are absolutely right on this. USB transport provides guaranteed 'accurate data'. I suppose by 'accurate timing' you mean the clock? The clock is locally generated on the USB receiver side and has nothing to do with your cables. That's why I said the cable doesn't matter, but the receiver chip used DOES MATTER.
 
Nov 27, 2008 at 6:28 AM Post #81 of 102
Well, I am sorry for forgetting one exception: if the USB receiver is bus-powered, the cable matters - simply because of power drift, not because of the USB transport (the computer PSU drifts anyway; a cheap cable could make the situation worse, maybe?). I suppose a jitter-sensitive hi-end system has a very low chance of being bus-powered.
 
Nov 27, 2008 at 6:57 AM Post #82 of 102
Actually, just because it is packet-based does NOT mean it is error-correcting. If the DAC had a direct connection to the PC/Mac and back-and-forth communication to ensure every packet arrives intact prior to processing (e.g., TCP), as is the case with USB hard drives, then sure.

What sucks is that USB audio devices are treated simply as streaming audio, meaning it's no different from SPDIF in that the data is sent and left to fend for itself. The DAC doesn't verify checksums and send ACKs back to the computer to say 'hey, everything is intact!' This is why USB cables matter.

USB cables do NOT matter for things like USB hard drives because upstream protocols handle the error correction and retransmission of bum packets.

The ideal solution would of course be one where data retrieval AND error correction are performed and the result stored in a buffer, which is later reclocked and processed.
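As a rough sketch of that idea - names and sizes are illustrative only, not any real driver or chip API:

```python
# Minimal "buffer then reclock" sketch: the transport side arrives in bursts and can
# retry on errors, while the playback side drains the buffer at a steady local clock.
from collections import deque

class ReclockBuffer:
    def __init__(self, capacity_samples):
        # if the producer runs far ahead, the oldest samples silently fall off (overrun)
        self.fifo = deque(maxlen=capacity_samples)

    def push_packet(self, samples, crc_ok):
        """Transport side: only accept packets that passed the CRC; ask for a resend otherwise."""
        if not crc_ok:
            return False          # caller retransmits
        self.fifo.extend(samples)
        return True

    def pull_frame(self):
        """Playback side: called once per tick of the local, fixed sample clock."""
        if self.fifo:
            return self.fifo.popleft()
        return 0                  # underrun: play silence rather than garbage

# usage sketch
buf = ReclockBuffer(capacity_samples=8192)
buf.push_packet(samples=[0.1, -0.2, 0.05], crc_ok=True)
frame = buf.pull_frame()
```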
 
Nov 27, 2008 at 7:25 AM Post #84 of 102
Short summary of what is to appear in the next episode in this thread:

...And about here, the discussion turns into the pros and cons of asynchronous devices like the 0404 USB... later, it moves on to the idea that maybe other aspects affect the sound more, like the DA chip.... but shortly after that other posts appear about how the DA chip is not as important as the clock... and then the modders join in about all the other electronic junk inside the converter box, like opamps, being more important too....
 
Nov 27, 2008 at 8:47 AM Post #85 of 102
Quote:

Originally Posted by yammy1688
What sucks is that USB audio devices are treated simply as streaming audio, meaning it's no different from SPDIF in that the data is sent and left to fend for itself. The DAC doesn't verify checksums and send ACKs back to the computer to say 'hey, everything is intact!' This is why USB cables matter.


Sorry, this is not entirely true. First, there are USB sound cards/DACs using customized drivers that work in an async mode (with retransmission). Even in isochronous mode, data is still packetized and checksummed. The only difference is the lack of retransmission, which means that when errors are detected, some data will be thrown away. With that, you hear interruptions (dropouts), not coloring (I was told jitter causes coloring).

The USB bus and the audio stream run on different clocks; it's simply not possible for the USB chip to inherit jitter from the USB bus. When audio data is serialized, it has to be re-clocked. For an isochronous-mode USB DAC, the real jitter contributor is the fact that the USB receiver and the DAC cannot be synchronized - which still has nothing to do with the innocent cable.

I have said I'm not an expert in the DAC area, but again, I'll try my best to explain myself. The USB receiver keeps receiving with a limited buffer size, then passes the data to the DAC. Here you face the problem of choosing the right clock. Sure, you can use an accurate 44.1K clock from a local oscillator, but with audio data continuously coming in, how do you handle the case where the incoming data is too fast, or too slow? Either an exhausted or an overflowing buffer is a disaster. I'm not sure how this is handled in the real world, but I can see that accurate serialization/re-clocking only works perfectly with customized USB drivers that provide feedback to the source (asking the PC to speed up or slow down). Again, the cable is not the issue.

Actually, it's quite straightforward that the USB transport does not contribute jitter, because a clock simply cannot be recovered from the USB signal (it's an async protocol after all; isochronous mode only means sending one packet every 1ms, and I cannot imagine an implementation that uses this 1ms interval to generate its master clock). SPDIF is another story: its clock is embedded in the signal itself, meaning it carries jitter around.
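To illustrate the kind of feedback loop I mean, here is a toy simulation - made-up numbers, not how any particular chip or driver actually does it:

```python
# Toy simulation of the feedback idea: the device runs off its own fixed DAC clock
# and periodically tells the host to send slightly more or fewer samples so the
# buffer hovers around a target fill level. All numbers are illustrative.

DAC_RATE = 44_100              # device's local clock, samples per second
HOST_RATE = 44_115             # host's idea of 44.1 kHz, slightly fast (assumed mismatch)
BUFFER_TARGET = 2048           # samples we try to keep in the device buffer
FEEDBACK_GAIN = 0.1            # how aggressively the device corrects the host rate

buffer_fill = BUFFER_TARGET
requested_rate = HOST_RATE

for millisecond in range(1, 5001):
    buffer_fill += requested_rate / 1000      # samples arriving from the host
    buffer_fill -= DAC_RATE / 1000            # samples consumed by the local DAC clock
    if millisecond % 8 == 0:                  # once per feedback interval, nudge the host
        error = buffer_fill - BUFFER_TARGET
        requested_rate = DAC_RATE - FEEDBACK_GAIN * error
    if millisecond % 1000 == 0:
        print(f"t={millisecond} ms  buffer={buffer_fill:7.1f}  requested={requested_rate:8.1f} Hz")
```

The device's own clock drives the DAC the whole time; the host (and the cable) only keeps the buffer topped up.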
 
Nov 28, 2008 at 6:48 AM Post #86 of 102
Quote:

Originally Posted by x_lk
You are absolutely right on this. USB transport provides guaranteed 'accurate data'. I suppose by 'accurate timing' you mean the clock? The clock is locally generated on the USB receiver side and has nothing to do with your cables. That's why I said the cable doesn't matter, but the receiver chip used DOES MATTER.



Yes, I mean the clock, the one inside the PC. It is embedded in the data stream and it definitely matters. There is always a clock generated on the receiving (USB) side, but it is far from immune to incoming jitter. Really good PLL implementations can reject much of the jitter, as my Off-Ramp does, but it is not immune. A better cable makes a difference. Even async implementations have been shown not to be immune to incoming jitter. In theory they should be.
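To give a feel for what "rejects much of the jitter" means, here is a generic first-order toy. The loop bandwidth and jitter frequencies are arbitrary illustrative numbers, not the actual Off-Ramp design:

```python
# A first-order PLL behaves like a low-pass filter on the incoming clock's phase:
# fast jitter is attenuated strongly, slow wander passes almost untouched.
import math

LOOP_BANDWIDTH_HZ = 1_000        # assumed PLL loop bandwidth

for jitter_freq_hz in (10, 100, 1_000, 10_000, 100_000):
    # magnitude of a first-order low-pass at this jitter frequency
    passed = 1.0 / math.sqrt(1.0 + (jitter_freq_hz / LOOP_BANDWIDTH_HZ) ** 2)
    print(f"{jitter_freq_hz:>7} Hz jitter: {passed*100:5.1f}% reaches the recovered clock")
```

Fast jitter gets knocked down heavily, but slow wander below the loop bandwidth passes nearly untouched - which is why "rejects much" is not the same as "immune".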

Steve N.
 
Nov 28, 2008 at 6:50 AM Post #87 of 102
Quote:

Originally Posted by x_lk
Well, I am sorry for forgetting one exception: if the USB receiver is bus-powered, the cable matters - simply because of power drift, not because of the USB transport (the computer PSU drifts anyway; a cheap cable could make the situation worse, maybe?). I suppose a jitter-sensitive hi-end system has a very low chance of being bus-powered.


Decent USB audio implementations are separately powered, not from the cable. Jitter added by the cable is still the problem.

Steve N.
Empirical Audio
 
Nov 28, 2008 at 6:51 AM Post #88 of 102
Quote:

Originally Posted by yammy1688
Actually, just because it is packet-based does NOT mean it is error-correcting. If the DAC had a direct connection to the PC/Mac and back-and-forth communication to ensure every packet arrives intact prior to processing (e.g., TCP), as is the case with USB hard drives, then sure.

What sucks is that USB audio devices are treated simply as streaming audio, meaning it's no different from SPDIF in that the data is sent and left to fend for itself. The DAC doesn't verify checksums and send ACKs back to the computer to say 'hey, everything is intact!' This is why USB cables matter.

USB cables do NOT matter for things like USB hard drives because upstream protocols handle the error correction and retransmission of bum packets.

The ideal solution would of course be one where data retrieval AND error correction are performed and the result stored in a buffer, which is later reclocked and processed.



One USB cable sounding better than another has nothing to do with data errors. It's strictly jitter. Errors are rare.

Steve N.
 
Nov 28, 2008 at 6:57 AM Post #89 of 102
Quote:

Sure, you can use an accurate 44.1K clock from a local oscillator, but with audio data continuously coming in, how do you handle the case where the incoming data is too fast, or too slow? Either an exhausted or an overflowing buffer is a disaster. I'm not sure how this is handled in the real world, but I can see that accurate serialization/re-clocking only works perfectly with customized USB drivers that provide feedback to the source (asking the PC to speed up or slow down).


This is called asynchronous mode. It can be, and is, done in some implementations. The anecdotal data on some implementations indicates that even this is not immune to incoming jitter, including the jitter added by the cable. However, in theory, the right implementation of this should be totally immune to incoming jitter.

Steve N.
 
Nov 28, 2008 at 11:43 AM Post #90 of 102
You are here: (at the X)

...And about here, the discussion turns into the pros and cons of asynchronous devices like the 0404 USB (X)... later, it moves on to the idea that maybe other aspects affect the sound more, like the DA chip.... but shortly after that other posts appear about how the DA chip is not as important as the clock... and then the modders join in about all the other electronic junk inside the converter box, like opamps, being more important too....
 
