Yes Virginia, There is a difference in USB cables
Apr 14, 2009 at 7:11 PM Post #121 of 279
Quote:

Originally Posted by hybris
Synchronisation and reliable transfer of audio are also part of the basis for the USB spec:
"An essential issue in audio is synchronization of the data streams. Indeed, the smallest artifacts are easily
detected by the human ear. Therefore, a robust synchronization scheme on isochronous transfers has been
developed and incorporated in the USB Specification. The Audio Device Class definition adheres to this
synchronization scheme to transport audio data reliably over the bus"



I usually see asynchronous transfer schemes touted as being better for digital audio because they use data "cells" intended to reduce jitter for realtime audio transfer. It's claimed that this is ignored now that speeds have become faster and jitter isn't as rampant. USB does theoretically move data at 480 Mbps, but as I understand it that is usually a burst speed. I would prefer an asynchronous transfer mode scheme for USB; their description of isochronous transfer has a lot of flowery language with little to no explanation of why it's used for audio. Isochronous guarantees bandwidth for unidirectional transfer, while asynchronous uses everything available as it needs it. Kind of reminds me of how Windows keeps bulking up with the excuse that processor speed has increased, so who cares if it's slower. Isochronous seems to be a waste of bandwidth that increases jitter in return for guaranteed bandwidth in the other direction. I'm sure there is more to this that I don't know of.
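For some perspective on the bandwidth side, here's a quick back-of-the-envelope calculation (nothing official, just arithmetic) of how little of the bus a plain PCM stream actually needs:

Code:
# Back-of-the-envelope: CD-quality PCM vs. the nominal USB signalling rates.
# These are raw bus speeds, not sustained throughput.
sample_rate = 44100     # samples per second, per channel
bit_depth = 16          # bits per sample
channels = 2            # stereo

pcm_bps = sample_rate * bit_depth * channels
print(f"CD-quality PCM: {pcm_bps / 1e6:.2f} Mbps")                    # ~1.41 Mbps

usb_full_speed = 12e6   # USB 1.1 full speed
usb_high_speed = 480e6  # USB 2.0 high speed
print(f"Share of a full-speed bus: {pcm_bps / usb_full_speed:.1%}")   # ~11.8%
print(f"Share of a high-speed bus: {pcm_bps / usb_high_speed:.2%}")   # ~0.29%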
 
Apr 14, 2009 at 7:16 PM Post #122 of 279
Quote:

Originally Posted by manaox2
I usually see asynchronous transfer schemes touted as being better for digital audio because they use data "cells" intended to reduce jitter for realtime audio transfer. It's claimed that this is ignored now that speeds have become faster and jitter isn't as rampant. USB does theoretically move data at 480 Mbps, but as I understand it that is usually a burst speed. I would prefer an asynchronous transfer mode scheme for USB; their description of isochronous transfer has a lot of flowery language with little to no explanation of why it's used for audio. Isochronous guarantees bandwidth for unidirectional transfer, while asynchronous uses everything available as it needs it. Kind of reminds me of how Windows keeps bulking up with the excuse that processor speed has increased, so who cares if it's slower. Isochronous seems to be a waste of bandwidth that increases jitter in return for guaranteed bandwidth in the other direction. I'm sure there is more to this that I don't know of.


But jitter in USB transmission doesn't affect the actual audio signal, as explained by the USB developer:

"As for timing/latency concerns, you need to look into the USB audio specification details and how audio signals are digitized in order to see that momentary signal latency in a USB cable has nothing to do with audio timing -- the time base is also carried in a digital way and the audio waveform is not dependent on "now" (generally it's encoded in the signal or derived from the manner in which the digital samples are handled)."
 
Apr 14, 2009 at 7:18 PM Post #123 of 279
Quote:

Originally Posted by hybris
But jitter in USB transmission doesn't affect the actual audio signal, as explained by the USB developer:

"As for timing/latency concerns, you need to look into the USB audio specification details and how audio signals are digitized in order to see that momentary signal latency in a USB cable has nothing to do with audio timing -- the time base is also carried in a digital way and the audio waveform is not dependent on "now" (generally it's encoded in the signal or derived from the manner in which the digital samples are handled)."



I agree the cable has nothing to do with it; a regular old certified cable should be fine. I'm only talking about the transfer scheme, and I mostly want asynchronous for the increased bandwidth. The proper drivers can make it happen if the device is willing to receive. I do not believe the time base is included in the normal USB standard. Isochronous transfer doesn't actually carry the timing data as far as I know; the PLL on the master word clock has to estimate the master clock rate, so it's always under- or over-filling. This seems to be because the variation in when the data is received affects the estimation of the master clock rate. The only digital transfer method I know of that carries the time base is I2S.
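To illustrate what I mean by the PLL estimating the master clock, here's a toy sketch in Python (my own simplification, not actual driver or DAC code): the receiver never sees an explicit clock, so it counts how many samples arrive per 1 ms USB frame and low-pass filters that count into a rate estimate.

Code:
# Toy sketch of adaptive clock recovery: estimate the sender's sample rate
# from the number of samples delivered in each 1 ms USB frame.
def update_rate_estimate(estimate, samples_in_frame, frame_period_s=0.001, alpha=0.001):
    """One step of a first-order, PLL-like rate estimator."""
    instantaneous = samples_in_frame / frame_period_s
    return estimate + alpha * (instantaneous - estimate)

# Pretend the source really runs at 44,099 Hz: most frames deliver 44
# samples, and roughly every tenth frame delivers 45.
true_rate = 44099.0
leftover = 0.0
estimate = 44100.0              # receiver's initial guess
for _ in range(200000):         # 200 seconds of 1 ms frames
    leftover += true_rate * 0.001
    samples = int(leftover)
    leftover -= samples
    estimate = update_rate_estimate(estimate, samples)

print(round(estimate))          # settles within about 1 Hz of 44099

The point is that the recovered clock is only ever an average of packet arrivals, which is exactly why the variation I mentioned feeds into it.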

Quote:

Originally Posted by hybris
In the futile hope of ending this (and similar discussions about HDMI cables and the like) I actually contacted the guys behind the USB standard about this issue.

I got the following reply:

As for timing/latency concerns, you need to look into the USB audio specification details and how audio signals are digitized in order to see that momentary signal latency in a USB cable has nothing to do with audio timing -- the time base is also carried in a digital way and the audio waveform is not dependent on "now" (generally it's encoded in the signal or derived from the manner in which the digital samples are handled).



Maybe the receiving device could have proprietary USB drivers that program the computer to send timing data, plus some system for decoding it for the clock, but that seems like it would also take a lot of bandwidth. I haven't heard of such a device. Some devices I do know of completely reclock the data, I believe, so it's not such a big issue for them. My plan is to just use my M-Audio Transit to optical into the Gamma1 with my laptop at home, and USB on the go where I'm not using higher-end equipment or doing deep listening (because really I don't care about the jitter as much as I care about electrical isolation and the ability to hear the music without resampling), and eventually get a desktop with a sound card that sends an I2S signal to my home DAC. A USB cable is the least of my monetary concerns.

I just found this, which seems like a good rundown of what I'm talking about: http://www.audioasylum.com/forums/pc...ages/7719.html
 
Apr 16, 2009 at 4:13 AM Post #124 of 279
I think the beautiful part of this is that for someone who uses and understands digital audio, this whole thread is like witchcraft talk.
When digital audio gets corrupted, there is no slight loss in soundstage. What you get are very, very nasty digital clicks.
And in practice it just doesn't happen at all. The connection will drop out before the data gets corrupted. I would be very, very scared of using a USB hard drive if it lost even one digital number in my files.
People don't get that corruption in a digital world is not qualitative. Jitter exists, but only in a DAC or ADC. Jitter in the actual USB connection causes only one thing: latency. That's all.
End of rant; never setting foot in this thread again.
 
Apr 17, 2009 at 9:30 AM Post #125 of 279
Quote:

Originally Posted by DistortingJack
I think the beautiful part of this is that for someone who uses and understands digital audio, this whole thread is like witchcraft talk.
When digital audio gets corrupted, there is no slight loss in soundstage. What you get are very, very nasty digital clicks.
And in practice it just doesn't happen at all. The connection will drop out before the data gets corrupted. I would be very, very scared of using a USB hard drive if it lost even one digital number in my files.
People don't get that corruption in a digital world is not qualitative. Jitter exists, but only in a DAC or ADC. Jitter in the actual USB connection causes only one thing: latency. That's all.
End of rant; never setting foot in this thread again.



Actually, that's not entirely true if you have an audio/PCM stream without error correction (I'm amazed and baffled that anyone would design and implement such a thing, but that is the only logical explanation for audible differences in digital cables). According to the USB developer I discussed this with, there are CRC error checks in the protocol, while other sources claim that USB audio does not implement error handling.

If you do random bit changes in a PCM stream (you can simulate this by editing a .wav file with a hex editor and moving a hex value here and there by only one step up or down, i.e. changing 08 to 09 or EE to ED), this won't immediately generate audible clicks or digital noise. So theoretically you could have individual bit errors in a digital stream with no immediately audible effects. Whether such errors actually manifest themselves as reduced soundstage or other audible artifacts is another question.
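If anyone wants to try that experiment without a hex editor, here's a rough Python sketch (file names and the number of flips are just placeholders) that flips the least significant bit of a few random 16-bit samples in a .wav file:

Code:
# Flip the least significant bit of a handful of random 16-bit PCM samples
# in a WAV file, then listen to the result. File names are placeholders.
import random
import wave

with wave.open("input.wav", "rb") as src:
    params = src.getparams()
    frames = bytearray(src.readframes(src.getnframes()))

assert params.sampwidth == 2, "this sketch assumes 16-bit PCM"

for _ in range(100):                               # 100 single-bit errors
    sample_index = random.randrange(len(frames) // 2)
    # Samples are 2 bytes, little-endian, so the low byte holds the LSB.
    frames[sample_index * 2] ^= 0x01

with wave.open("damaged.wav", "wb") as dst:
    dst.setparams(params)
    dst.writeframes(bytes(frames))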
 
Apr 17, 2009 at 1:57 PM Post #126 of 279
Quote:

Originally Posted by hybris
Actually, that's not entirely true if you have an audio/PCM stream without error correction (I'm amazed and baffled that anyone would design and implement such a thing, but that is the only logical explanation for audible differences in digital cables).


If that's true, then they would be better off returning the device and letting other people know about the problems. A digital device not using the CRC values it was given to detect random bit changes would be just plain stupid.

Quote:

According to the USB developer I discussed this with, there are CRC error checks in the protocol, while other sources claim that USB audio does not implement error handling.


Are these the sources that were quoted as experts in a USB cable review? Unless I've missed something, they threw in just enough technical jargon and unqualified absolutisms to leave the discussion "up in the air" and then disappeared before anything could be resolved. Or are there other sources?

Quote:

If you do random bit changes in a PCM stream (you can simulate this by editing a .wav file with a hex editor and moving a hex value here and there by only one step up or down, i.e. changing 08 to 09 or EE to ED), this won't immediately generate audible clicks or digital noise. So theoretically you could have individual bit errors in a digital stream with no immediately audible effects. Whether such errors actually manifest themselves as reduced soundstage or other audible artifacts is another question.


Random bit changes would be random, not always the LSB (as your example implied).

11101101 = ED (original)
11101100 = EC (original, except 8th bit flipped)
11001101 = CD (original, except 3rd bit flipped)
01101101 = 6D (original, except 1st bit flipped)

The reason it's random is that USB is serial, i.e. bits are sent one at a time, and each bit is subject to the same issues (i.e. the same probability of being flipped), regardless of its meaning in the data format.

Moreover, if the numbers represented by the bits use more than 8 bits, then the maximum difference from flipping a single bit will be much greater than (2^8)/2 = 128 (the figure for an 8-bit integer). It'd be (2^16)/2 = 32768 for a 16-bit integer, (2^32)/2 = 2147483648 for a 32-bit integer, or (2^64)/2 = 9223372036854775808 for a 64-bit integer. In other words, the value of the most significant bit.
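To put numbers on that for 16-bit audio, here's a small sketch showing how much one flipped bit moves a signed 16-bit sample depending on which position it hits (the starting value is arbitrary):

Code:
# Error magnitude of a single flipped bit in a signed 16-bit PCM sample,
# depending on the bit position it happens to hit.
import struct

def flip_bit(sample, bit):
    """Flip one bit of a signed 16-bit sample and return the new value."""
    unsigned = int.from_bytes(struct.pack("<h", sample), "little") ^ (1 << bit)
    return struct.unpack("<h", unsigned.to_bytes(2, "little"))[0]

original = 1000                                   # an arbitrary quiet-ish sample
for bit in (0, 7, 14, 15):
    corrupted = flip_bit(original, bit)
    print(f"bit {bit:2d}: {original} -> {corrupted} (error {corrupted - original:+d})")

# bit  0: error of 1, inaudible
# bit 14: error of 16384, a loud click
# bit 15: the sign bit, an error of 32768 in two's complement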
 
Apr 17, 2009 at 2:21 PM Post #127 of 279
Quote:

Originally Posted by hybris
According to the USB developer I discussed with there is CRC error checks in the protocol, while other sources claim that USB audio do not implement error handling.


Quote:

Originally Posted by null_pointer_us
If that's true, then they would be better off returning the device and letting other people know about the problems. A digital device not using the CRC values it was given to detect random bit changes would be just plain stupid.


USB itself only does error detection. As you said, it would make sense for any USB receiver to perform error correction before using the data.
 
Apr 18, 2009 at 8:11 AM Post #128 of 279
Quote:

Originally Posted by manaox2
USB itself only does error detection. As you said, it would make sense for any USB receiver to perform error correction before using the data.


If I've understood the protocol correctly, the receiver can only detect and discard invalid packets; it is up to the sender (that would be the USB driver on your source/PC, I guess) to resend. Either way, it seems only natural that any USB device supports this functionality.
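For anyone curious what the error detection actually looks like, here's a minimal Python sketch of a CRC-16 using the generator polynomial the USB spec lists for data packets (x^16 + x^15 + x^2 + 1). It's only meant to illustrate detection, not to be a USB implementation:

Code:
# Minimal CRC-16 sketch (polynomial x^16 + x^15 + x^2 + 1, reflected form
# 0xA001). Illustrates detection only: a flipped bit changes the checksum,
# so the receiver can discard the packet, but nothing here corrects it.
def crc16(data, poly=0xA001, init=0xFFFF):
    crc = init
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ poly if crc & 1 else crc >> 1
    return crc

payload = bytes(range(64))          # stand-in for a packet's data payload
sent_crc = crc16(payload)

damaged = bytearray(payload)
damaged[10] ^= 0x04                 # one bit flipped "in transit"
print("error detected:", crc16(bytes(damaged)) != sent_crc)   # True

The CRC only tells you the packet is bad; what happens next is up to the transfer type and the driver.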
 
Apr 20, 2009 at 2:20 PM Post #129 of 279
Actually, I was reading again about that AudioTrak Cube soundcard : Welcome to AUDIOTRAK Korea

they say it has a Tenor USB chip : GALAXY FAR EAST CORP.

that uses :
Quote:

2 isochronous input endpoints for recording, 2 isochronous output endpoints for playback, and 1 interrupt endpoint for HID


And if you look for references on Google, it's good stuff : tenor te7022l - Google Search

And it's also found in this DAC : Translated version of http://www.pc-speaker.com/zboard/view.php?id=bestreview&page=1&sn1=&divpage=1&sn=off&ss=on&sc=on&select_arrange=headnum&desc=asc&no=927
 
Apr 22, 2009 at 8:37 PM Post #130 of 279
Onkyo says that regular USB soundcards suffer from terrible jitter:
Translated version of http://www2.jp.onkyo.com/product/products.nsf/wavio/C112D5E10EA1F3C9492574DA00020251?OpenDocument

Well... how can you "hear" jitter? I certainly don't feel like I'm hearing "messed up" waveforms like the ones in their pictures. :rolleyes:
 
Apr 23, 2009 at 9:08 AM Post #131 of 279
Jitter is a problem only past a critical amount; the delay between the reception of the audio stream and its decoding is called latency. Usually this is way bigger than any jitter induced in the digital stream (it doesn't have to be audio, it could be an Excel spreadsheet for all we care), so there shouldn't be any problems. If the jitter in the INCOMING data stream is enough to starve the DA converter, then the connection drops out. This is what happens when you have too many devices on the same hub. The audio stream doesn't lose fidelity; you just get an error on the screen saying the device is not connected or something of the sort, same as if you unplug the cable.
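As a toy illustration of the starvation point (all numbers invented): packets are due once per millisecond but arrive a little late each time, and the DAC only glitches if a packet arrives later than the amount of audio it has buffered ahead.

Code:
# Toy model: packet k is due at k ms but arrives up to max_jitter_ms late.
# The DAC keeps buffer_ms of audio in hand, so a dropout only happens when
# a packet is later than that. All numbers are invented for illustration.
import random

def dropouts(buffer_ms, max_jitter_ms, frames=100000):
    late = 0
    for _ in range(frames):
        jitter = random.uniform(0.0, max_jitter_ms)
        if jitter > buffer_ms:      # data wasn't there when the DAC needed it
            late += 1
    return late

print(dropouts(buffer_ms=4.0, max_jitter_ms=0.5))   # 0: jitter fully absorbed
print(dropouts(buffer_ms=4.0, max_jitter_ms=8.0))   # roughly half the packets too late

Below that threshold the jitter never reaches the analog side; above it you get dropouts, not a subtle loss of fidelity.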
 
Apr 23, 2009 at 8:20 PM Post #134 of 279
I think there is something wrong with the Onkyo translation. My understanding is that the USB protocol is packet-based. In other words, the data is asynchronous. Clock and jitter in the data transfer are not relevant. The timing information is regenerated in the USB DAC. The word "regenerate" kept reappearing in the article. My guess is they have a better technique for regenerating a low-jitter clock.

The Asahi chip, as with any other DAC chip, has high jitter tolerance as a standard feature. This is independent of the quality of the clock that Onkyo regenerates.

Keeping it simple, I don't think Onkyo said anything about USB cable jitter affecting sound quality.
 
Apr 24, 2009 at 10:39 AM Post #135 of 279
Quote:

Originally Posted by spanimal
Time to buy a new inexpensive USB cable and let SPANIMAL's hearing decide the outcome. Can it? Will it? Opinions coming up.



I hope you will at least perform a blind (preferably ABX) test; the results of a sighted test will of course be useless.
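For what it's worth, the point of an ABX run can be put into numbers: you can work out how often pure guessing would reach a given score (a quick sketch using the binomial distribution):

Code:
# Probability of scoring at least `correct` out of `trials` ABX trials by
# pure guessing (each trial is a 50/50 coin flip under the null hypothesis).
from math import comb

def guessing_probability(correct, trials):
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(f"{guessing_probability(12, 16):.3f}")   # ~0.038: 12/16 is hard to get by luck
print(f"{guessing_probability(9, 16):.3f}")    # ~0.402: 9/16 is basically a coin flip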
 
