Async mode breakthrough for USB DACs!
Feb 22, 2008 at 10:15 PM Post #31 of 45
Quote:

Originally Posted by Crowbar
Well, such claims are good business. :D


Of course, the truth is not exactly simple, and the possibility for the cable to make a difference exists. Let me explain: USB audio uses isochronous transfers which, unlike most other USB traffic, are _not_ retransmitted on error, so in some systems it might be possible to get an error rate that is noticeable.

Gordon from Wavelength told me that in his final products he was getting no errors, so it's likely to be a non-issue with most setups. In my case, I thought I was getting some errors, but the way I was checking turned out not to be robust, so I'm going to repeat the test as soon as the redesign of the USB chip->DSP link is done.



Errors? Seems to me that the error rate would have to be quite high to actually be audible.
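As a rough sanity check, here is the arithmetic (the bit-error rates are hypothetical illustrations, not measurements of any real link):

Code:

# Back-of-the-envelope: how often would random bit errors hit a
# 16-bit/44.1 kHz stereo stream?  The BER values are hypothetical.
bits_per_second = 44_100 * 16 * 2  # ~1.41 Mbit/s of audio data

for ber in (1e-6, 1e-9, 1e-12):
    errors_per_hour = bits_per_second * ber * 3600
    print(f"BER {ber:.0e}: ~{errors_per_hour:.2f} bit errors per hour")

# BER 1e-06: ~5080.32 bit errors per hour
# BER 1e-09: ~5.08 bit errors per hour
# BER 1e-12: ~0.01 bit errors per hour

So unless the link is badly out of spec, errors should be rare events rather than a steady audible degradation.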

One way to test this theory would be to use a very short USB cable and a long one of the same type. If errors are the cause, the shorter cable should sound better. I have tried this with my own USB interfaces and they sound identical to me.

I was just reading an ad from a company that is selling USB output connectors for computers and laptops with integral filtering. This could be bad news for audio streaming, given the negative effects of ferrites on some USB cables.

Steve N.
 
Feb 23, 2008 at 1:06 AM Post #32 of 45
Quote:

Originally Posted by audioengr
Seems to me that the error rate would have to be quite high to actually be audible.


Imagine a high-order bit (one of the most significant bit positions in the sample word) getting flipped. That could create a very short but large transient. If the settling time of the DAC and analog stages isn't fantastic, you'll get audibility problems beyond the transient itself.
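To put numbers on it, here is a minimal sketch (assuming plain two's-complement 16-bit PCM):

Code:

import math

# Size of the click from a single flipped bit in a 16-bit sample.
# Assumes two's-complement PCM; bit 15 is the most significant.
FULL_SCALE = 2 ** 15  # 32768 = digital full scale

for bit in (0, 7, 14, 15):
    step = 2 ** bit  # change in sample value from flipping this bit
    db = 20 * math.log10(step / FULL_SCALE)
    print(f"bit {bit:2d}: step {step:5d} ({db:6.1f} dBFS)")

# bit  0: step     1 ( -90.3 dBFS)
# bit  7: step   128 ( -48.2 dBFS)
# bit 14: step 16384 (  -6.0 dBFS)
# bit 15: step 32768 (   0.0 dBFS)

A flipped MSB is a full-scale click; a flipped low-order bit would vanish into the noise floor.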
 
Feb 23, 2008 at 6:48 PM Post #33 of 45
Quote:

Originally Posted by Crowbar
Imagine a high-order bit (one of the most significant bit positions in the sample word) getting flipped. That could create a very short but large transient. If the settling time of the DAC and analog stages isn't fantastic, you'll get audibility problems beyond the transient itself.


Yes, but if you got these kinds of errors, the probability of an error in the packet header would be just as high, and you would get drop-outs, loss of sync, etc. No?

Steve N.
 
Feb 24, 2008 at 3:24 AM Post #34 of 45
Depends on the error frequency and on the decoder's robustness to such errors (which is likely not going to be zero).
 
Mar 31, 2009 at 9:35 AM Post #35 of 45
Hi people,

I've been reading about this subject for the past few days and I'm still confused.

Now, let's assume a DAC gets the source data through any means imaginable, provided you can get bit-perfect transmission. We could even assume the data is transmitted via smoke signals or Morse code audio/light signals. :)
The point being, the data is transmitted bit perfect from A -> B.

Now, if the DAC has a reasonable buffer for the data, isn't the jitter on the data stream totally irrelevant? The DAC would probably reclock the incoming stream, but isn't that what DACs do anyway?

So, in summary:

1) The DAC gets the source file in a bit-perfect fashion.
2) In an extreme case it could buffer the whole song into its memory before playing it. Of course this isn't very practical, but just to make a point.
3) When the DAC has enough data, it starts converting it to analog. The DAC would reclock the data (although I don't totally understand this part).

Now, where does the jitter problem come from? From the above points, I can't see what the problem is. Please enlighten me.

Sorry for bumping an old thread, but I guess this is a good place to ask.
 
Mar 31, 2009 at 3:58 PM Post #36 of 45
Ilkka,

It really depends on how the USB receiver generates its audio clock output (I2S, etc.). If, for example, it uses a frequency synthesizer and adaptive-mode USB (changing this frequency every 1-4 ms), then there will be two components of jitter: one from the adaptive mode and one from the frequency synthesizer.

This is why adaptive-mode designers have used reclockers or upsamplers to fix this problem. But that does not fully work.

We looked at all the adaptive-mode reviews and noticed that with most of them, the alternative SPDIF inputs showed significantly less jitter than the USB input. In testing some of the units, we found that the jitter coming out of the USB link (into the upsampler/reclocker) was more than 4x that of SPDIF.

Most of these companies claim that the upsampler/reclocker will rid both the adaptive-mode USB and the SPDIF inputs of all their jitter, but these tests show that is not the case.

Really, the only way to make sure the jitter is gone is to make sure it was never there in the first place. The only way to do that is with asynchronous USB mode, using a really low-jitter master clock fed to both the DAC and the USB audio receiver.
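In rough pseudocode, the difference between the two control loops looks like this (a toy illustration only, not any shipping product's firmware):

Code:

# Adaptive mode: the DAC has no fixed clock of its own.  It watches
# the incoming packet rate and steers a frequency synthesizer every
# few milliseconds, so the conversion clock itself wobbles.
def adaptive_update(synth_freq, measured_host_rate, gain=0.1):
    error = measured_host_rate - synth_freq
    return synth_freq + gain * error   # moving clock = jitter source

# Asynchronous mode: the DAC runs a fixed low-jitter oscillator and
# instead tells the host, via a feedback endpoint, to deliver more
# or fewer samples per frame.  The conversion clock never moves.
def async_feedback(buffer_fill, target_fill, nominal_rate=44_100):
    return nominal_rate + (target_fill - buffer_fill)  # rate request

In adaptive mode the clock is the control variable; in asynchronous mode the host's delivery rate is, which is why the clock can stay clean.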

Thanks
Gordon
 
Mar 31, 2009 at 4:18 PM Post #37 of 45
Thanks for the reply, but there is still something unclear to me.

If the DAC gets the data in a bit-perfect fashion, and starts converting it when it thinks it has enough data, wouldn't this eliminate all the jitter from the transport?

I mean ... the data would be totally identical, right? I can see that there could be a problem if the DAC needs to start playing it immediately, but if we can tolerate some latency (I know I could), wouldn't this eliminate all the jitter from the transport?

So, is latency the problem after all? People can't tolerate latency, they need playback immediately, and that causes problems?

Another confusing thing:

If the transport causes jitter, does that mean that there are actually errors in the transmission and the data that the DAC receives is not an identical/bit perfect copy of the original data?

Aren't there error-correction methods to ensure that data can be transmitted error-free to the device? There have to be, as digital streams can't just get broken like that, right?
 
Mar 31, 2009 at 4:44 PM Post #38 of 45
So... I guess this is what I'm trying to get at:

Wouldn't all the jitter problems caused by the transport be resolved by a bit of buffering/latency?

But latency in itself is a no-no, and jitter is the lesser evil, and that's why there is jitter? :)
 
Apr 1, 2009 at 1:36 PM Post #39 of 45
Quote:

Originally Posted by ile
So... I guess this is what I'm trying to get at:

Wouldn't all the jitter problems caused by the transport be resolved by a bit of buffering/latency?

But latency in itself is a no-no, and jitter is the lesser evil, and that's why there is jitter? :)



Ilkka,

No, buffering alone will not solve the problem.

Yes, if you changed the way adaptive mode works, you could program a device to slow the rate at which it changes the master clock, which would diminish one aspect of the jitter.

The problem is that you are getting half a million bytes a second, and most of these chips have very little buffering.

But the big deal with jitter is not the interface, it's the clocks and how they are created. Jitter relates directly to what is called phase noise, and phase noise can be affected by any noise modulated into the clock signal.

So if you are trying to make an 11.2896 MHz clock from a 6 MHz system oscillator, the synthesis noise will easily add a factor of 50 to 100x what you would get by putting an 11.2896 MHz oscillator directly into the chip.
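To see why that synthesis is messy, just look at the ratio involved (plain arithmetic, nothing product-specific):

Code:

from fractions import Fraction

# 11.2896 MHz is not an integer multiple of 6 MHz, so a clean
# divider/multiplier won't do; a fractional synthesizer is needed.
print(Fraction(11_289_600, 6_000_000))  # 1176/625

Multiplying by 1176 and dividing by 625 is exactly the kind of fractional synthesis that piles on phase noise.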

But remember, not even that will give you good results. If you go to Digi-Key or Mouser, pick up an off-the-shelf can oscillator, and put it in a circuit, expect the jitter to be 50-100 ps. That's for the master clock, not the word clock, which is the reference for jitter in any audio system. This is because the word clock will always have the worst jitter and is therefore the easiest and most reliable to measure. If the master clock is at 50-100 ps, then the word clock is going to be in the 500 ps area.

You also have to worry about how to power the oscillators. In the Cosecant and Crimson we use sub-20 nV RMS @ 1 Hz discrete regulators on the oscillator circuits to keep the jitter as low as possible. The oscillators are rated at 0.5 ps at 100 Hz, which is not where I would rate them, but on the Wavecrest we have measured them at less than 1.2 ps at 10 Hz, which is really good.

Thanks
Gordon
 
Apr 1, 2009 at 2:45 PM Post #40 of 45
Thanks for bearing with me, Gordon. I got the answer I was looking for.

I was just thinking on a "theoretical" level. I'm trying to get a clear picture of the issue so that I could explain it to a curious 5-year-old if I had to. :)
I'm curious myself, but not a 5-year-old... not sure about that, though.

I can also understand that there are many places where jitter can be introduced. Not sure yet if I want to go deeper into this issue, but maybe I understand one part better now.
 
Apr 4, 2009 at 6:54 PM Post #41 of 45
Quote:

Originally Posted by Wavelength
less than 1.2 ps at 10 Hz, which is really good


It's not at all good for a 24-bit system, unless you only play 16-bit content.
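The usual back-of-the-envelope shows why, using the standard approximation SNR = -20*log10(2*pi*f*tj) for a full-scale sine at frequency f with RMS jitter tj (a sketch, not a measurement of any particular DAC):

Code:

import math

def max_jitter_s(bits, f=20_000.0):
    """RMS jitter that keeps jitter noise below an N-bit noise floor."""
    snr_db = 6.02 * bits + 1.76  # ideal SNR of an N-bit converter
    return 10 ** (-snr_db / 20) / (2 * math.pi * f)

for bits in (16, 24):
    print(f"{bits}-bit at 20 kHz: < {max_jitter_s(bits) * 1e12:.2f} ps RMS")

# 16-bit at 20 kHz: < 99.26 ps RMS
# 24-bit at 20 kHz: < 0.39 ps RMS

So 1.2 ps comfortably clears 16-bit but throws away the last couple of bits of a 24-bit system.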
 
May 4, 2009 at 10:56 PM Post #42 of 45
Slightly shifting direction again, is it clear to anyone how USB cable properties can affect performance in asynchronous mode? It seems that cable properties (as long as the cable meets spec, of course) should be irrelevant, just as they are for the link between a hard drive and the CPU.
 
Nov 25, 2009 at 4:40 PM Post #43 of 45
Quote:

Originally Posted by ile
Thanks for the reply, but there is still something unclear to me.

If the DAC gets the data in a bit-perfect fashion, and starts converting it when it thinks it has enough data, wouldn't this eliminate all the jitter from the transport?

I mean ... the data would be totally identical, right? I can see that there could be a problem if the DAC needs to start playing it immediately, but if we can tolerate some latency (I know I could), wouldn't this eliminate all the jitter from the transport?

So, is latency the problem after all? People can't tolerate latency, they need playback immediately, and that causes problems?



Thought I'd stir up this thread again with new information. The Naim DAC does exactly this, i.e. it buffers the streaming data and plays it when it has enough data in its buffer. It's due to be released soon. With the Naim badge, you know it's a high-end product. I don't think it can be connected to a PC, but you could hook it up to a USB stick. It essentially implements what you had in mind. You should check out how it sounds. ;)
 
Nov 25, 2009 at 5:06 PM Post #44 of 45
Connecting to a PC and inserting a USB stick are two very different things.
 
Jan 25, 2010 at 6:06 PM Post #45 of 45
Quote:

Originally Posted by khewa
The Naim DAC does exactly this, i.e. it buffers the streaming data and plays it when it has enough data in its buffer.


Whatever it actually does, it is not what you say, for that means that it will stop playing when it runs out of data, or skip music when the buffer gets full. What in fact must be the case if it buffers data is that either its clock must synchronize to the source (adaptive or synchronous USB Audio), or the other way around (asynchronous USB audio). Alternatively, it could have an enormous buffer that can be reset during pauses in music (i.e. between songs) and the two clocks allowed to drift, in which case you also have to deal with latency on the order of seconds. No professional product would go that route, though I have seen at least one DIY project do it.
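The drift arithmetic makes the point (the 100 ppm figure is an illustrative crystal tolerance, not a measurement):

Code:

# Two free-running clocks a combined 100 ppm apart drift steadily.
sample_rate = 44_100                  # Hz
drift = sample_rate * 100e-6          # 4.41 samples of drift per second

buffer_s = 1.0                        # a generous one-second buffer
minutes = buffer_s * sample_rate / drift / 60
print(f"{drift:.2f} samples/s -> buffer over/underruns in ~{minutes:.0f} min")

# 4.41 samples/s -> buffer over/underruns in ~167 min

Hence either the clocks synchronize, or you reset the buffer between songs and live with seconds of latency.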
Quote:

With the Naim badge, you know it's a high-end product.


With khewa's posts, you know it's a shill.
 
