Digital music transmission via USB
Jul 25, 2013 at 4:49 PM Thread Starter Post #1 of 58

ab initio

500+ Head-Fier
Hi all,
 
I was trying to help a fellow Head-Fier understand digital data transmission and what it means for their concerns about USB cable quality. The original thread is here.
 
The issue came up that digital signals are really analog signals---to which I disagreed and provided a counterargument. Obviously, this conversation remained at a high level without getting into any actual details, and as a consequence it is full of broad, sweeping generalizations.
 
Now, I didn't mean to step on any toes, and I know my humble, limited experience with hifi audio pales in comparison to Currawong's; nor do I know any actual industry engineers. Furthermore, Currawong suggested that there are some experienced DAC designers and other experts around Head-Fi who may be able to assist my understanding.
 
I started this thread to open a dialog with the experts on the topic. I put the thread in Sound Science, because
a) I didn't want to derail the other thread with an emotionally charged debate (nor do I want there to be any emotional debating here!), and
b) I hope it will be acceptable to ask questions and discuss the details at a higher technical level than would be acceptable in other parts of Head-Fi.
 
The topic of digital signal transmission and effects on perceived audio quality has been debated numerous times throughout the head-fi forums and there is an epidemic of misinformation floating about. It seems like it would be extraordinarily beneficial for the community to get the fundamentals straightened out and organized in a way that can be explained to the less technically savvy.
 
Thus, I propose that this conversation be technically slanted and rigorous, with the goal that those with technical backgrounds can follow it fully. Once the underlying principles are established, the basics should be boiled down in a way that is accessible to all.
 
I would have proposed using the ODAC as a case study because there is a freely available schematic; unfortunately, the USB controller datasheet is not publicly available, which leaves a pretty big hole. Any ideas for a DIY or open source asynchronous USB DAC that would make a good case study?
 
We could do a case study on something based on a C-Media CM6631 [datasheet]? Is it necessary to include the I2S link from the USB controller to the actual DAC as part of understanding where things can go wrong because of USB data transfer? As far as asynchronous USB controllers go, how much variability is there between models in their susceptibility to error under different USB data transfer conditions?
 
Basically, how is data communicated from a computer to an external DAC via USB and where can things go awry in between? What conditions are necessary for optimal DAC performance? What conditions are sufficient for optimal DAC performance?
 
Cheers!
 
Jul 25, 2013 at 5:50 PM Post #2 of 58
Someone else can give you the dense version of this, but in a nutshell... the data is pushed through the cable in packets. Modern DACs have a buffer, so the music doesn't have to arrive in real time: if there is a momentary delay, the buffer takes over and the music continues uninterrupted. With a USB cable that performs to spec (like a cheapo Monoprice cable), there is no way the data stream will ever let the buffer run dry.
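To put a rough number on how much delay a buffer can soak up (the buffer size here is a made-up but plausible figure; actual sizes vary by device):

```python
# Rough figures only; actual buffer sizes vary by device.
buffer_samples = 2048                        # hypothetical per-channel buffer
sample_rate = 44_100                         # CD-rate samples per second
cover_ms = 1000 * buffer_samples / sample_rate
print(f"Buffer rides out host hiccups up to ~{cover_ms:.0f} ms")  # ~46 ms
```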

If a cable is defective, it might drop packets, which would result in a nasty click or pop. As long as a USB cable sounds ok, it *is* ok.

There was a thread a couple of weeks ago on USB cables where all the "digital is analogue" stuff was trotted out, explained and dismissed. Here you go. 53 pages of back and forth over nonsense claims about USB cable audibility.

http://www.head-fi.org/t/617026/usb-cable-and-sound-quality
 
Jul 25, 2013 at 6:02 PM Post #3 of 58
To send data from one place to another, you need to move things around in the real world, so yes, there's an analog process involved. For digital communications, the information you want to send is encoded and represented digitally. The transmitter maps this digital information onto an output (analog) waveform, which is corrupted by noise and other factors before it is picked up by the receiver. The receiver filters and processes the received analog waveform and attempts to unmap it to determine the transmitted digital information. Lots of low-level details skipped. For USB, without error correction, the bit error rate is supposed to be at most 10^-12, i.e., bit errors pretty much don't happen. If a cable connects two compliant devices and can't achieve that, it's not compliant with the spec. All the physical-level details about waveforms, noise, interference, synchronization, threshold voltages, filters, phase-locked loops, jitter, intersymbol interference, etc., deal with making that bit error rate low, like 10^-12.
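To get a feel for what 10^-12 means for audio, here's a quick back-of-envelope; the stream parameters are just illustrative:

```python
# Back-of-envelope: expected bit errors for CD-quality stereo audio
# over a link meeting the 10^-12 BER figure mentioned above.
bit_rate = 44_100 * 16 * 2    # samples/s * bits/sample * channels ≈ 1.41 Mbit/s
ber = 1e-12                   # worst-case bit error rate
errors_per_hour = bit_rate * 3600 * ber
print(f"Expected bit errors per hour: {errors_per_hour:.4f}")
# ≈ 0.005, i.e. roughly one flipped bit per ~8 days of continuous playback
```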
 
USB transfer modes determine how the data is packaged, when it is transmitted, how buffering works, and so on. Some USB audio devices use different modes, though most standard ones use isochronous transfers. If the computer for whatever reason doesn't get around to sending the data in time and the USB audio device runs out of audio data, you get a click or a pause.
 
If the USB audio device is powered over the USB bus—or even if not, to some very small extent—the characteristics of the power supply rails supplied over USB may have some small impact on performance. Sure. Hopefully such a design has decent power filtering on board and otherwise has a high power supply rejection ratio.
 
Depending on the design of the USB audio device, what kind of topology and chips are used, components, PCB layout, and so on, there may be some measurable causal relationship between the nature of the USB data transmission and the way the D/A process is handled. If the timing or clocking of the D/A output is affected, then the output signal may be compromised in some way that's probably not audible unless egregious. So depending on USB audio device design, some jitter and other issues on the USB transmission may have some effect on the DAC's output performance even if no bits are flipped in the transmission. In some small way.
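For a sense of scale, here's a rough worst-case bound on what sampling-clock jitter does to a full-scale sine wave; the jitter figure is an assumption, not a measurement of any particular DAC:

```python
import math

# Worst-case amplitude error from sampling-clock jitter on a full-scale
# sine: error <= 2*pi*f * A * dt (max slope times the timing error).
f = 20_000.0                      # worst-case audio frequency, Hz
jitter = 100e-12                  # 100 ps of clock jitter (assumed)
err = 2 * math.pi * f * jitter    # worst-case error, full scale = 1.0
print(f"Worst-case error: {err:.2e} of full scale "
      f"({20 * math.log10(err):.1f} dBFS)")
# ≈ 1.26e-05 of full scale, about -98 dBFS, around the 16-bit quantization floor
```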
 
 
If you want examples, some kind of TI PCM270x-based DAC may have more information available.
 
Jul 25, 2013 at 6:46 PM Post #4 of 58
All the physical-level details about waveforms, noise, interference, synchronization, threshold voltages, filters, phase-locked loops, jitter, intersymbol interference, etc., deal with making that bit error rate low, like 10^-12.


There's the dense version! I told you it was coming!

Now for the real world answer... Don't worry about it.
 
Jul 25, 2013 at 7:12 PM Post #5 of 58
Thanks guys, this is a good start.
 
I'm hoping some of the DAC designers that were alluded to in the post I linked to might show up and add a few comments from their experience.
 
Clearly, I must have said something wrong or at least very misleading in that other thread. And I'd like to figure out where exactly the divide between the believers and cynics is.
 
Cheers
 
Jul 25, 2013 at 7:22 PM Post #6 of 58
Quote:
If the USB audio device is powered over the USB bus—or even if not, to some very small extent—the characteristics of the power supply rails supplied over USB may have some small impact on performance. Sure. Hopefully such a design has decent power filtering on board and otherwise has a high power supply rejection ratio.
 
Depending on the design of the USB audio device, what kind of topology and chips are used, components, PCB layout, and so on, there may be some measurable causal relationship between the nature of the USB data transmission and the way the D/A process is handled. If the timing or clocking of the D/A output is affected, then the output signal may be compromised in some way that's probably not audible unless egregious. So depending on USB audio device design, some jitter and other issues on the USB transmission may have some effect on the DAC's output performance even if no bits are flipped in the transmission. In some small way.
 
 
If you want examples, some kind of TI PCM270x-based DAC may have more information available.

If I understand correctly, a chip in this family (like the PCM2707) is both a USB controller AND a DAC? Is this the "usual" implementation?
 
For instance, I have the Modi, which I know uses a separate asynchronous C-Media CM6631 USB controller with an I2S connection to a standalone DAC chip from AKM. My understanding is that other DACs use a design like this, where the D-to-A conversion happens in a standalone chip (e.g., the ODAC).
 
You mention that isochronous is the standard? I thought just about everything coming out nowadays uses async transfers, where the device signals the computer when it is ready for more data?
 
Cheers
 
Jul 25, 2013 at 7:35 PM Post #7 of 58
The usual implementation for something cheap is to have the USB receiver and DAC in one chip like that. That TI series is pretty old, I think. Almost all of these, and most normal audio products, are isochronous.
 
Most audiophile products have separate chips... two, three, four, or I don't know. Depends. I don't really follow or know the details. I think you'd find some more in the thread bigshot linked. Many of these are isochronous too, but many are not.
 
Actually, some USB audio DACs use PCM270x for the USB interface but then run I2S out of it to a separate DAC chip. Like the DIY AMB gamma1.
 
Jul 25, 2013 at 10:35 PM Post #8 of 58
Cool.
 
The only ones I looked up were the Schiit Modi (which uses the C-Media CM6631 USB controller and a separate AKM AK4396 DAC) and the ODAC (which uses a Tenor TE7022L USB controller with a separate Sabre ES9023 DAC). It seems that most midfi and above implementations use a configuration like this.
 
I'll look into this a bit and try to come up with a case study idea.
 
Cheers
 
Jul 27, 2013 at 4:53 PM Post #9 of 58
The transfer mode, IMO, is not relevant; async and isochronous modes are both non-synchronous. The key is the clock domain, and the only factor that can affect sound quality is clock jitter. There are three distinct clock domains: the USB transmission clock, the I2S transfer clock, and the DAC conversion clock. Are the clocks all synchronous to each other? My last project (a PON device with an integrated USB controller and I2S interface) did not have the I2S clock synchronized to the media clock.
 
My biggest question is how the content clock fits into this. For example, if the content is recorded at 44.2 kHz and playback is at 44.1 kHz, would that be more of an issue? My understanding is that streaming media usually carry an embedded sync signal to sync the playback clock. I have read this in some IEEE standard but haven't seen it in any datasheet. Or maybe this is not an issue.
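To put rough numbers on what an uncorrected mismatch would mean (hypothetical rates, for illustration only):

```python
# What a raw 44.2 kHz -> 44.1 kHz mismatch would mean if nothing
# resampled or re-synced.
content_rate, play_rate = 44_200, 44_100
drift = content_rate - play_rate            # surplus samples per second
pitch_shift = 100 * (1 - play_rate / content_rate)
print(f"Buffer grows by {drift} samples/s; pitch drops by {pitch_shift:.2f}%")
# ≈ 100 samples/s of drift and a 0.23% pitch drop, hence the need for
# resampling or an embedded sync mechanism.
```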
 
Jul 28, 2013 at 4:41 PM Post #10 of 58
I got my answer. It looks like most USB transfers are in isochronous mode. There are basically three different ways of synchronizing, and synchronization is key to the cable/jitter argument in USB digital transmission.
 
1. Asynchronous mode. This is the simplest and cheapest way. The play clock is on-chip. The input data is put into a buffer and then read out with a read clock (the play clock). To prevent buffer overflow and underflow, the write and read pointers are compared; when the margin gets too wide, samples are removed or added (see the toy sketch right after this list). The quality is the lowest of the three.
2. Synchronous mode. An external self-adjusting clock source is used. The packets are put into frame buffers; each frame is 1 ms. The clock frequency is nudged up or down by a few ppm via a frac-N synthesizer to prevent under- or overflow. This gives the best performance; however, it's more expensive because of the need for an accurate external source.
3. Adaptive mode. This is the mode most chips use today; the C-Media chip uses it too. The content is put into a buffer, then resampled to 44.1 kHz and put into an output buffer. The advantage is that it can handle multiple sampling rates; the disadvantage is that it takes more CPU cycles and memory.
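Here's the toy sketch of the pointer-comparison add/drop scheme from mode 1; the buffer size, slack, and function names are all made up for illustration:

```python
from collections import deque

# Toy model: compare write/read margins and drop or repeat a sample
# when they drift too far apart.
buf = deque()
TARGET = 512     # nominal fill level, in samples (hypothetical)
SLACK = 64       # allowed drift before intervening (hypothetical)

def write_samples(samples):
    """Runs at the incoming (media) clock: append new samples."""
    buf.extend(samples)

def read_sample():
    """Runs at the local play clock: one sample per DAC tick."""
    fill = len(buf)
    if fill > TARGET + SLACK:
        buf.popleft()                  # margin too wide: drop a sample
    elif 0 < fill < TARGET - SLACK:
        buf.appendleft(buf[0])         # margin too narrow: repeat a sample
    return buf.popleft() if buf else 0.0   # underrun: emit silence
```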
 
The key to playback has nothing to do with the media (USB) clock; the key is to synchronize the play clock with the sample clock. In all cases the data are stored in a buffer, and the only difference is how the play clock is matched to the sample clock and how jitter in the sampling clock is handled.
 
This is just a high-level view. I welcome comments on any incompleteness.
 
Jul 29, 2013 at 8:57 PM Post #11 of 58
I started this thread because of discussions like this one [link to USB cable thread]. I will try to keep the related threads linked here.
 
I'm hoping to compile a definitive answer so that threads like the one above can be settled with a simple link in the first reply post. I realize this is naively optimistic...
 
Cheers
 
Jul 30, 2013 at 11:29 PM Post #12 of 58
If it's just the cable, then the answer is very simple. I worked 20 years in digital transmission, specifically in digital telephony and Ethernet. From all the claims I've read, the key complaint is that jitter in the transmission causes lower DAC performance. How so? The claim is that the play clock somehow inherits the jitter from the cable. So here are the claims and the reality.
 
1. How much jitter is generated by the cable? Jitter is random electrical behaviour caused by disturbers. To generate any significant jitter, a long loop (maybe with taps) is a far more likely culprit than a short one; at 1 meter, cable-induced jitter is not significant. I have not seen any significant jitter performance measurements for cables. I think the transmitter's output jitter dominates the overall jitter performance.
 
2. How does the play clock's jitter performance relate to the media (USB) clock? AFAIK, all data are buffered, so the play clock has no relationship to the media clock. Of course, one could try to derive the play clock from the media clock, but that would not be good engineering practice. First, there is no way to solve the buffer over/underflow issue. Second, the media clock can be of different frequencies (e.g., USB 1, USB 2, and USB 3) and the line codes differ as well, so the extracted clock might carry additional jitter generated by the clock recovery system.
 
3. Would distortion in the analog waveshape cause any quality issue? The answer is no. Digital is digital. A square wave with a slow rise time does not affect signal integrity; in fact, a slow rise time is preferred because it reduces the high-frequency content, reduces RFI, and makes it easier to pass FCC testing. Any transmission defect that raises the BER will produce a false bit value and will sound like a pop, not "more detail" or a "better soundstage" (a common claim). Many standards do specify a minimum rise time. (A small sketch of the threshold-slicing idea follows this list.)
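To make point 3 concrete, here's a minimal sketch showing that a heavily rounded "square wave" still slices back to exactly the same bits, as long as each level settles past the threshold by the sampling instant; the bit pattern, RC constant, and threshold are arbitrary illustrative values:

```python
import math

# Rounded edges don't change the recovered bits if the level settles
# past the 0.5 threshold by the mid-bit sampling instant.
bits = [1, 0, 1, 1, 0, 0, 1, 0]
tau = 0.2                    # RC time constant rounding the edges
                             # (unit interval normalized to 1.0)

def rounded_level(t):
    """Exponential settling from the previous bit's level to the current one."""
    i = int(t)                                       # which bit interval we're in
    start = float(bits[i - 1]) if i > 0 else 0.0     # level we settle from
    target = float(bits[i])                          # level we settle toward
    return target + (start - target) * math.exp(-(t - i) / tau)

# Sample mid-bit and slice against a 0.5 threshold.
recovered = [1 if rounded_level(i + 0.5) > 0.5 else 0 for i in range(len(bits))]
assert recovered == bits     # same bits out, despite the rounded edges
print("Recovered bits match:", recovered == bits)
```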
 
I have seen the thread you're talking about. I would like to see any kind of evidence for the claims to the contrary. And yes, I do have friends who design DACs.
 
Jul 31, 2013 at 12:06 AM Post #13 of 58
Quote:
3. Would distortion in the analog waveshape cause any quality issue? The answer is no. Digital is digital. A square wave with a slow rise time does not affect signal integrity; in fact, a slow rise time is preferred because it reduces the high-frequency content, reduces RFI, and makes it easier to pass FCC testing. Any transmission defect that raises the BER will produce a false bit value and will sound like a pop, not "more detail" or a "better soundstage" (a common claim). Many standards do specify a minimum rise time.

 
Thanks for sharing. This point about "rounded square waves" in digital transmission causing audible distortions is one of the biggest arguments used to support the need for $450 USB cables. It gets repetitive and frustrating to explain that in digital signaling, a signal is either above the minimum HI voltage or it isn't---and that this only matters at the clock edge.
 
Cheers!
 
Jul 31, 2013 at 12:26 AM Post #14 of 58
Yes, all the signals are rounded, intentionally in some cases. Transformers (in isolated applications) are designed to provide this function. The cable itself, because of its shortness, doesn't really add any "roundness". When I read the other forum, I sometimes get the feeling of a herd mentality: people are looking for peer approval, and anti-facts and misinformation are all around. I have seen someone put up a waveform that was clearly just a propagation delay and call it jitter. Many so-called audiophiles on this board do not even own a speaker system or have the ability to describe the sound of jitter. Try asking what jitter sounds like; I bet these cable experts do not have a clue.
 
Jul 31, 2013 at 4:00 AM Post #15 of 58
Quote:
Try asking what jitter sounds like; I bet these cable experts do not have a clue.

 
Jitter sounds... well... jittery. Like the music is nervous, not comfortable with the cable you've just used.
 
