Benchmark DAC1 now available with USB
Dec 16, 2008 at 11:43 AM Post #2,116 of 3,058
Quote:

Originally Posted by EliasGwinn
In a USB playback system, unless the driver and the media software are directly linked, the software operates using an internal clock. This data stream is then sent to the audio stack in the operating system.


Ah, yes, I see what you're saying. If I could rephrase it, in order to have only one clock in the playback data path the audio player also needs to respond to the DAC's requests to speed up/slow down data transmission, otherwise you have two clocks and risk buffer under-runs/over-runs somewhere in the system. And we don't know if any media players support that mode (and the default OS drivers should provide the right APIs for the media players to use if you also want to avoid custom drivers and I don't know if they do either).

Quote:

Originally Posted by EliasGwinn
So, even if the computer did not re-sample, and the device was receiving bit-transparent (or otherwise high-precision) audio, the asynchronous method still may not have less jitter.


I don't understand why this might be the case. Under those circumstances there is only one data clock that affects audio output timing, and thus the jitter due to computer generation and data transmission in other methods disappears. Yes, you still have to arrange your software and communications stacks to avoid buffer under-runs and over-runs, but you're not getting jitter due to data transmission between computer and DAC, nor due to the computer's clock. True, you still have other sources of jitter - e.g. due to the DAC's clock - but you have that under all possible circumstances anyway.

Are you saying this method might have more jitter than others because its DAC clock implementation has more jitter than an alternative method's? If so, this is an attribute of the implementation, not of the method. Or do you have some attribute of the method in mind?

I think it's also worth pointing out that - were there truly only one clock affecting data output - sample rate conversion on the computer would be required rather than resampling. The sample rate conversion algorithm (presuming a Nyquist-limited original sampling process and a good quality ADC clock, as for any good recording) is unique, entirely deterministic and independent of the quality of the computer's clock, so if implemented properly it wouldn't introduce any errors. But it sounds like we don't have a truly one-clock world today, so it doesn't really matter...
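
To make that concrete, here's a tiny sketch (purely my own illustration in Python, not anything a real player or DAC actually runs) of band-limited interpolation: each output value is just a fixed weighted sum of input samples, determined entirely by the two rates, so the quality of the computer's clock never enters the arithmetic.

Code:

import numpy as np

def upsample_sinc(x, fs_in, fs_out, half_width=32):
    """Band-limited interpolation of x (sampled at fs_in) to fs_out >= fs_in."""
    n_out = int(len(x) * fs_out / fs_in)
    y = np.zeros(n_out)
    for m in range(n_out):
        t = m * fs_in / fs_out                        # output instant, in input-sample units
        n0 = int(np.floor(t))
        n = np.arange(max(0, n0 - half_width), min(len(x), n0 + half_width + 1))
        # Hann-tapered sinc kernel centred on the (generally fractional) instant t
        taper = 0.5 * (1.0 + np.cos(np.pi * (t - n) / (half_width + 1)))
        y[m] = np.dot(np.sinc(t - n) * taper, x[n])
    return y

# Converting a 1 kHz tone from 44.1 kHz to 96 kHz gives the same answer every run,
# on any machine, regardless of how good (or bad) that machine's clock is.
fs_in, fs_out = 44100, 96000
tone = np.sin(2 * np.pi * 1000 * np.arange(4410) / fs_in)
print(len(upsample_sinc(tone, fs_in, fs_out)))        # -> 9600 output samples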
 
Dec 16, 2008 at 8:27 PM Post #2,117 of 3,058
Hmm, this is definitely a thought-provoking topic (the drawbacks of asynchronous USB support in a DAC). I wonder if the Wavelength and Ayre people have really considered what this means on the computer end.

Meanwhile, as a long-time programmer, I think this problem is being constrained by the hardware that is available to the DAC. I think a much easier approach to this jitter problem (rather than resampling to work around it) would be to create a buffer (perhaps 1 second worth of audio at the current playback rate) that contains not only the data but information (for each 16-bit audio data word) about the time since the previous piece of data was received. The playback code could then sum all these times and base the current playback rate on the average time between each piece of data. This would reduce jitter to ridiculously low levels (if the buffer had 1000 words, the jitter would be reduced to 1/1000 of the level at which it was originally received). This approach would prevent buffer over- and under-runs as well, since you are using data at the "average" rate at which it arrived.
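
Here's a rough sketch of what I mean (hypothetical names, Python only for illustration - this isn't a description of any real DAC firmware): keep the recent inter-arrival times alongside the samples and set the playback period to their running average, so per-word timing noise gets diluted by the depth of the history.

Code:

import random
from collections import deque

class AveragingBuffer:
    """Queue audio words together with their inter-arrival times and derive a
    smoothed playback period from the average of those times."""

    def __init__(self, depth=1000):
        self.samples = deque()                  # queued 16-bit audio words
        self.intervals = deque(maxlen=depth)    # seconds between successive arrivals
        self.last_arrival = None

    def push(self, word, arrival_time):
        if self.last_arrival is not None:
            self.intervals.append(arrival_time - self.last_arrival)
        self.last_arrival = arrival_time
        self.samples.append(word)

    def playback_period(self):
        """Output period: the mean of the recorded inter-arrival times."""
        if not self.intervals:
            return None
        return sum(self.intervals) / len(self.intervals)

# Feed it arrivals that jitter around 1/44100 s and read back a much steadier estimate.
buf = AveragingBuffer(depth=1000)
t = 0.0
for i in range(2000):
    t += 1 / 44100 + random.gauss(0, 1e-7)      # +/- ~100 ns of arrival jitter
    buf.push(i & 0xFFFF, t)
print(buf.playback_period())                    # very close to 1/44100 ≈ 22.68 µs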
 
Dec 16, 2008 at 9:31 PM Post #2,118 of 3,058
Quote:

Originally Posted by G-U-E-S-T
Hi Elias,

I have a question please about the analog inputs on my DAC1 PRE. Do they have any provision at all for interrupting any potential ground loop current or other EMI/RFI flowing through the interconnect cable shield?

I like very much how the digital inputs are isolated, but I worry about the possible lack of decoupling/isolation on the analog RCA inputs. As you know, this can be a problem when interfacing some single-ended equipment - and I would guess (as a layman) that any possible inlet for EMI/RFI (even the analog ins) could possibly also affect the internal D/A circuitry as well.

Thanks in advance for your reply!



G-U-E-S-T,

The shield of the RCA connector is bonded to the chassis (earth), which is isolated from the signal grounds, so any current or noise on the shield can't get into the analog OR digital signal ground.

Thanks,
Elias
 
Dec 16, 2008 at 10:14 PM Post #2,119 of 3,058
Quote:

Originally Posted by Scrith
I think a much easier approach to this jitter problem (rather than resampling to work around it) would be to create a buffer (perhaps 1 second worth of audio at the current playback rate) that contains not only the data but information (for each 16-bit audio data word) about the time since the previous piece of data was received.


Isn't that how the Genesis Digital Lens worked?
 
Dec 16, 2008 at 11:22 PM Post #2,120 of 3,058
Hey Elias,

I've just ordered the Benchmark and I have several questions, if you have the time.

The DAC will be connected as follows:

1) PC > DAC-1 > Bada ph-12 amp > Beyerdynamic DT-880 headphones
2) Cambridge Audio 640C > DAC-1 > Vincent SV-226 amp > Chario Academy Millennium 2 speakers

I will therefore use 2 inputs and 2 outputs. The questions are:

1) Which inputs/outputs would you use in the 2 configs? (e.g. config 1 > input 4 with RCA output, config 2 ...)
2) Which cables would you use (how dependent on cable quality is each input/output)?
3) Any other recommendations regarding the positions of pins inside the unit, or anything else?
4) Recommended players for the PC and their config? (I now use Foobar 2000)

PS: I will only use the DAC-1 for listening to music in WAV, FLAC and APE!

Thanks!
 
Dec 16, 2008 at 11:37 PM Post #2,121 of 3,058
Quote:

Originally Posted by Mazz
Ah, yes, I see what you're saying. If I could rephrase it, in order to have only one clock in the playback data path the audio player also needs to respond to the DAC's requests to speed up/slow down data transmission, otherwise you have two clocks and risk buffer under-runs/over-runs somewhere in the system. And we don't know if any media players support that mode (and the default OS drivers should provide the right APIs for the media players to use if you also want to avoid custom drivers and I don't know if they do either).


Right. In other words, the media player and all software processes in-between are operating at a rate dictated by the computer. A USB audio device is not going to dictate that rate for that whole stack without custom drivers and agreeable software. And even then, the only way it would be driven directly from the DAC's master clock is if there were a copper trace from the DAC's clock to the CPU clock input...not likely.

I suppose this doesn't mean that sample-rate conversion is inevitable. Theoretically, it could fill buffers at the USB port and transmit the data using a different clock. But, to avoid sample-rate conversion, the control clock has to know what the original sample rate of the audio was and control the flow at that original sample rate.

Quote:

Originally Posted by Mazz
Under those circumstances there is only one data clock that affects audio output timing, and thus the jitter due to computer generation and data transmission in other methods disappears. Yes, you still have to arrange your software and communications stacks to avoid buffer under-runs and over-runs, but you're not getting jitter due to data transmission between computer and DAC, nor due to the computer's clock. True, you still have other sources of jitter - e.g. due to the DAC's clock - but you have that under all possible circumstances anyway.

Are you saying this method might have more jitter than others because its DAC clock implementation has more jitter than an alternative method's? If so, this is an attribute of the implementation, not of the method. Or do you have some attribute of the method in mind?



There will be many sources of jitter in asynchronous mode.

1. First of all, the DAC's master clock is not the clock that controls the host. The DAC's master clock is sent to the TAS1020B, which uses a PLL to estimate and regenerate the master clock. That regenerated clock from the TAS1020B is the real master clock coming down the USB cable, and it is not a low-jitter crystal oscillator. I haven't measured the jitter on that clock, but it isn't designed to be a low-jitter source; the data sheet doesn't even give a jitter specification.

2. As the TAS1020B sends a master (L/R) clock 'tick' (which tells the USB port to send a frame), the impedance and capacitance of the USB cable/connections will add jitter to this master clock signal.

3. When the host (computer) receives the clock signal, it must apply another PLL-driven clock to estimate the received clock. The new PLL-driven clock is what pushes the data out the USB port.

4. Then, this data must travel along the impedance/capacitance-ridden USB cable/connectors to get to the DAC.

5. Immediately inside the DAC, the data goes through the TAS1020B before it is delivered to the D/A chip.

As you can see, it isn't as simple as the DAC reaching into the computer and grabbing samples at will. On the surface, it may seem that the DAC's master clock would be best, but it doesn't really get you anywhere. The real reason that asynchronous mode exists is so that a device can serve as master clock for multiple synchronized devices (ADCs, etc.).
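
To put very rough numbers on it (the figures below are invented purely for illustration, not measurements of any real hardware, and root-sum-of-squares addition only applies to uncorrelated random jitter), the point is that every stage in that chain contributes something:

Code:

import math

# hypothetical RMS jitter contributions of the stages described above, in picoseconds
stage_jitter_ps = {
    "TAS1020B regenerated clock": 80,
    "cable/connectors (clock)":   30,
    "host PLL re-clocking":       60,
    "cable/connectors (data)":    30,
}

# uncorrelated random jitter combines roughly as the root-sum-of-squares
total_ps = math.sqrt(sum(j ** 2 for j in stage_jitter_ps.values()))
print(f"combined RMS jitter ≈ {total_ps:.0f} ps")   # ≈ 109 ps with these made-up figures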

Quote:

Originally Posted by Mazz
I think it's also worth pointing out that - were there truly only one clock affecting data output - sample rate conversion on the computer would be required rather than resampling. The sample rate conversion algorithm (presuming a Nyquist-limited original sampling process and a good quality ADC clock, as for any good recording) is unique, entirely deterministic and independent of the quality of the computer's clock, so if implemented properly it wouldn't introduce any errors. But it sounds like we don't have a truly one-clock world today, so it doesn't really matter...


I would say that "sample-rate conversion" is a type of re-sampling. It is true that this process could be done 'neatly' in a computer...in fact, Vista does it very neatly. However, XP and OSX do it very poorly.

Thanks,
Elias
 
Dec 17, 2008 at 11:46 AM Post #2,123 of 3,058
Quote:

Originally Posted by Scrith
The playback code could then sum all these times and base the current playback rate on the average time between each piece of data.


I assume you're referring to the playback code in the DAC, not on the computer? If so, you're basically saying the DAC has to do clock estimation, which all of them do, and that it can use some form of averaging in doing so in order to improve the clock frequency estimate (whilst still tracking slow-moving variations in the incoming clock frequency), which I think almost all of them do already. They just tend to talk about it in different terms - PLLs, low pass filters, etc.
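
As a toy example of the kind of averaging I mean (purely illustrative Python, with no resemblance claimed to any particular DAC's receiver), a first-order low-pass on the measured incoming period rejects fast jitter while still following a slow drift in the source clock:

Code:

import random

class PeriodEstimator:
    """First-order IIR low-pass (exponential average) on the measured sample period."""

    def __init__(self, nominal_period, alpha=0.001):
        self.estimate = nominal_period   # seconds per sample
        self.alpha = alpha               # small alpha = heavy smoothing, slow tracking

    def update(self, measured_period):
        self.estimate += self.alpha * (measured_period - self.estimate)
        return self.estimate

est = PeriodEstimator(1 / 44100)
for k in range(100000):
    true_period = (1 / 44100) * (1 + 50e-6 * k / 100000)   # source drifting by 50 ppm
    est.update(true_period + random.gauss(0, 2e-9))        # plus ~2 ns of jitter per reading
print(est.estimate * 44100)   # ≈ 1.00005: the drift is tracked, the per-sample jitter averaged away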
 
Dec 17, 2008 at 12:10 PM Post #2,124 of 3,058
Quote:

Originally Posted by EliasGwinn
I suppose this doesn't mean that sample-rate conversion is inevitable. Theoretically, it could fill buffers at the USB port and transmit the data using a different clock. But, to avoid sample-rate conversion, the control clock has to know what the original sample rate of the audio was and control the flow at that original sample rate.


Yes, that was what I was getting at. But as you pointed out, the software stack today doesn't seem to support that method.

Quote:

Originally Posted by EliasGwinn
In other words, the media player and all software processes in-between are operating at a rate dictated by the computer. A USB audio device is not going to dictate that rate for that whole stack without custom drivers and agreeable software.


Agreed.

Quote:

Originally Posted by EliasGwinn
And even then, the only way it would be driven directly from the DAC's master clock is if there were a copper trace from the DAC's clock to the CPU clock input...not likely.


This is sufficient, but not necessary (nor likely, as you point out!). Instead you merely need the DAC to be the rate controller for data transmission, and you need the computer software stack to be sufficiently responsive, which isn't too hard to arrange under most circumstances these days. Remember the old dial-up modem flow control protocols, which solved a remarkably similar problem? You can either have the DAC do high-watermark/low-watermark flow control (e.g. telling the computer "send a little faster than you are", "send a little slower", or even "stop sending until I tell you to start again"), or you can have it say something like "send me the next chunk of audio in the next X milliseconds, please"...
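
For instance, a sketch of the watermark variant (the message names are hypothetical, and this isn't describing any real USB audio class or driver API): the DAC drains its FIFO at its own crystal's rate and just nudges the sender whenever the fill level strays out of a comfortable band.

Code:

def flow_control_decision(fifo_fill, low_mark, high_mark):
    """Message the DAC would send back to the host for a given FIFO fill level."""
    if fifo_fill < low_mark:
        return "SEND_FASTER"     # buffer getting empty: risk of under-run
    if fifo_fill > high_mark:
        return "SEND_SLOWER"     # buffer getting full: risk of over-run
    return "KEEP_RATE"           # inside the comfort band: leave the host alone

# e.g. a 1000-sample FIFO with watermarks at 25% and 75%
for fill in (100, 500, 900):
    print(fill, flow_control_decision(fill, 250, 750))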

Quote:

Originally Posted by EliasGwinn
There will be many sources of jitter in asynchronous mode.

1. First of all, the DAC's master clock is not the clock that controls the host. The DAC's master clock is sent to the TAS1020B, which uses a PLL to estimate and regenerate the master clock. That regenerated clock from the TAS1020B is the real master clock coming down the USB cable, and it is not a low-jitter crystal oscillator. I haven't measured the jitter on that clock, but it isn't designed to be a low-jitter source; the data sheet doesn't even give a jitter specification.
[...]



Ah, thanks for this elaboration, but I see we misunderstood each other - which puts your original comments in a different light and probably answers the question I posed in response. I assumed you were talking about (possibly hypothetical) scenarios where the external DAC incorporates the master clock (plus some flow control protocol, as suggested further above). You're writing about cases where the master clock is derived instead from the USB transmission.

Thanks for a very interesting discussion!
 
Dec 17, 2008 at 8:55 PM Post #2,125 of 3,058
Quote:

Originally Posted by Scrith
Meanwhile, as a long-time programmer, I think this problem is being constrained by the hardware that is available to the DAC. I think a much easier approach to this jitter problem (rather than resampling to work around it) would be to create a buffer (perhaps 1 second worth of audio at the current playback rate) that contains not only the data but information (for each 16-bit audio data word) about the time since the previous piece of data was received. The playback code could then sum all these times and base the current playback rate on the average time between each piece of data. This would reduce jitter to ridiculously low levels (if the buffer had 1000 words, the jitter would be reduced to 1/1000 of the level at which it was originally received). This approach would prevent buffer over- and under-runs as well, since you are using data at the "average" rate at which it arrived.


Hey Scrith,

We could employ this method...the technology does exist. You'd be interested in reading this paper.

However, there is more to consider here...there are major benefits of employing an asynchronous sample-rate converter (ASRC). Specifically, it optimizes filter performance.

The most critical element of digital conversion (aside from jitter attenuation) is low-pass (stop-band) filtering. Without a well-designed filter system, the audio will suffer from aliasing, a non-flat frequency response, and high-frequency attenuation.

Here's what the ASRC does for us:

1. The digital filters in DAC chips suffer from the fact that they are sharing a die with analog circuitry. It is difficult to optimize both the digital and analog performance of a single chip. The ASRC (asynchronous sample rate converter) in the DAC1 is a dedicated digital DSP environment, where all resources are optimized towards the heavy math required to build a high-quality filter.

2. By converting the sample rate to 110 kHz, we are optimizing the filter of the DAC chip. As I mentioned before, DAC filters are inherently compromised because they are hybrid digital/analog chips. It requires a lot of DSP horsepower to create a digital filter that works well. So, by pushing the new Nyquist frequency out to 55 kHz, we reduce the risk of the DAC's filter causing aliasing of ultrasonic audio and/or attenuating high-frequency audio. Also, any inaccuracies in the filter are far removed from the audio band (see the rough numbers sketched below).
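
As a rough back-of-the-envelope comparison (my own illustration, using the common "fred harris" rule of thumb N ≈ attenuation_dB / (22 · Δf/fs) for FIR filter length; real DAC-chip filters are more sophisticated than this), here is what the DAC chip's own filter has to achieve before and after the ASRC:

Code:

def approx_fir_taps(passband_hz, stopband_hz, fs_hz, atten_db=100.0):
    """Rule-of-thumb FIR length for a given transition band and stop-band attenuation."""
    transition = (stopband_hz - passband_hz) / fs_hz   # normalized transition width
    return atten_db / (22.0 * transition)

# At 44.1 kHz the filter must get from 20 kHz down to full attenuation by 22.05 kHz;
# after conversion to ~110 kHz it can roll off anywhere between 20 kHz and 55 kHz.
print("44.1 kHz material  :", round(approx_fir_taps(20_000, 22_050, 44_100)), "taps")
print("after ASRC to 110 k:", round(approx_fir_taps(20_000, 55_000, 110_000)), "taps")
# several times fewer taps for the same stop-band attenuation, and whatever ripple
# the second filter does have sits well above the audio band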

We could have implemented a jitter-reduction system that did not convert the sample rate. However, the ASRC method creates harmony amongst all the components.

When we design a product, we look at the strengths and weaknesses of all the components available and design the most complementary combination possible. From the outside, it may seem convoluted, but the results are what matter in the end.

Thanks,
Elias
 
Dec 17, 2008 at 9:39 PM Post #2,126 of 3,058
Quote:

Originally Posted by nae45ro
Hey Elias,

I've just ordered the Benchmark and I have several questions, if you have the time.

The DAC will be connected as follows:

1) PC > DAC-1 > Bada ph-12 amp > Beyerdynamic DT-880 headphones
2) Cambridge Audio 640C > DAC-1 > Vincent SV-226 amp > Chario Academy Millennium 2 speakers

I will therefore use 2 inputs and 2 outputs. The questions are:

1) Which inputs/outputs would you use in the 2 configs? (e.g. config 1 > input 4 with RCA output, config 2 ...)
2) Which cables would you use (how dependent on cable quality is each input/output)?
3) Any other recommendations regarding the positions of pins inside the unit, or anything else?
4) Recommended players for the PC and their config? (I now use Foobar 2000)

PS: I will only use the DAC-1 for listening to music in WAV, FLAC and APE!

Thanks!



nae45ro,

I'm sorry it's taken so long to respond. I've been buried in phone calls, emails, etc.

As far as digital inputs and digital cables go, you don't have to worry too much. The performance of the DAC1 is consistent regardless of connection type. I see the 640C has both optical and coaxial outputs. Feel free to use either one.

Analog cables are a different story. A good quality starquad XLR cable is a great interconnect.

I suggest using the balanced XLR cables for the setup that will require the longest cable run. I would also suggest using XLR Y-adaptors so that you can connect both devices with XLR balanced cables. Don't worry about over-loading the output of the DAC1. It can handle both devices simultaneously.

Regarding pins inside the unit, I would suggest not changing anything until it is apparent that you need to. The only thing you may want to change is the output attenuators, but only if the output is too high or too low for your system.

As for media player, foobar is great. If you enjoy using foobar, don't worry about changing a thing. Here is an article I wrote about setting up foobar...

Foobar2000 for Windows - Setup Guide - Benchmark

Thanks,
Elias
 
Dec 17, 2008 at 9:45 PM Post #2,127 of 3,058
Quote:

Originally Posted by gevorg
Hi Elias,

For the DAC1 USB, what is the V and mA output of the HPA2 headamp?

I'm considering using low-impedance, high-current headphones with the DAC1 USB, so I just wanted to compare the HPA2's specs to standalone amps.

Thank you!



gevorg,

Max Vrms = 8.7 Vrms
Max Vrms w/ 10 dB gain reduction jumper = 2.7 Vrms
Max I = 250 mA

More info on page 44: http://www.benchmarkmedia.com/manual...nual_Rev_C.pdf
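
As a quick worked example from those numbers (idealized arithmetic only - it ignores the gain-reduction jumpers, thermal limits, and the behaviour of real headphone loads): into a given load the output is capped by whichever limit is reached first, the 8.7 Vrms voltage limit or the 250 mA current limit.

Code:

def max_drive(load_ohms, v_limit=8.7, i_limit=0.250):
    """(Vrms, watts) deliverable into a purely resistive load under both limits."""
    v = min(v_limit, i_limit * load_ohms)   # Vrms actually deliverable
    return v, v * v / load_ohms

for r in (16, 32, 300):
    v, p = max_drive(r)
    print(f"{r:>3} ohm load: {v:.1f} Vrms, {p * 1000:.0f} mW")
# 16 ohm: current-limited at 4.0 Vrms (about 1000 mW)
# 32 ohm: current-limited at 8.0 Vrms (about 2000 mW)
# 300 ohm: voltage-limited at 8.7 Vrms (about 252 mW)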

Thanks,
Elias
 
Dec 18, 2008 at 12:48 AM Post #2,128 of 3,058
Cool, thanks for the answers. The distance between the DAC and both amps (speaker amp and headphone amp) is about 0.5 m!
 
Dec 18, 2008 at 1:09 AM Post #2,129 of 3,058
Quote:

Originally Posted by EliasGwinn
We could employ this method...the technology does exist. You'd be interested in reading this paper.


Very interesting. Yes, this is a more technical description of what I was describing: use a buffer to generate an average clock rate for a large sample of data, then base the output rate on that, thereby reducing outgoing jitter to a very small fraction of the incoming jitter without having to worry about buffer over- and under-runs. And no resampling is required.

This paper is from late 2006...where is the Benchmark DAC2 that will be based on it?!?
 
Dec 18, 2008 at 7:10 AM Post #2,130 of 3,058
Quote:

Originally Posted by Scrith
Very interesting. Yes, this is a more technical description of what I was describing: use a buffer to generate an average clock rate for a large sample of data, then base the output rate on that, thereby reducing outgoing jitter to a very small fraction of the incoming jitter without having to worry about buffer over- and under-runs. And no resampling is required.

This paper is from late 2006...where is the Benchmark DAC2 that will be based on it?!?



I am really thinking of getting the DAC1, but if there is going to be a DAC2, I guess it will be worth waiting, no?
 
