Quote:
Originally Posted by mulveling
Actually, DACs like the DAC2 and DAC1 still have to synchronize to the signal clock - they are effective at attenuating jitter, though. Something like the Chord DAC64, which buffers all incoming data and tosses the signal clock in favor of its local clock, is truly independant of transport/interface jitter (unless it's great enough to cause data errors) - though the consequence is that it does run a slight risk of buffer overrun/underrun over a long period of time.
I didn't include the DAC64 in my discussion because, at US$3000, it didn't meet my halfway-reasonably-priced criterion. Seriously... I don't claim sophistication in this stuff; I'm very much a layperson who's trying to understand the complexities of digital audio - simply but accurately.
After seeing your post, I had a look at the Stereophile review of the DAC64; in the introduction, Atkinson states: Quote:
I referred above to the RAM buffer. This is basically arranged as a FIFO (First-In, First Out) store. In theory, the clock accuracy with which the data are clocked into the FIFO doesn't matter, as the data are clocked out with a high-precision local crystal, which in turn should reduce jitter to vanishingly low levels. In practice, there has to be some means of locking the local clock to the long-term-averaged clock of the incoming data, which will mean low-frequency jitter might still propagate to the DAC chip. ...
(emphasis added)
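Atkinson's caveat about locking the local clock to the long-term average of the incoming clock follows from simple arithmetic: a truly free-running local clock would drift relative to the source, so a fixed-size FIFO's fill level creeps steadily up or down until it overruns or underruns. Here's a toy calculation of that drift (the buffer size and ppm figures are illustrative assumptions of mine, not Chord's actual specs):

```python
# Toy model: how long a half-full FIFO survives when the source and
# local clocks disagree by some parts-per-million mismatch.
# All numbers are illustrative, not DAC64 specifications.

def seconds_until_fifo_fails(fifo_samples: int,
                             rate_hz: float,
                             ppm_mismatch: float) -> float:
    """Time until a half-full FIFO over/underruns, given a clock
    mismatch (in ppm) between the source and local clocks."""
    # Net fill-level drift, in samples per second:
    drift_per_sec = rate_hz * ppm_mismatch * 1e-6
    headroom = fifo_samples / 2  # buffer starts half full
    return headroom / drift_per_sec

# e.g. a 4-second buffer at 44.1 kHz with a 10 ppm mismatch:
t = seconds_until_fifo_fails(4 * 44100, 44100.0, 10.0)
print(f"buffer fails after ~{t / 3600:.1f} hours")  # ~55.6 hours
```

So even a multi-second buffer only buys a couple of days with a modest clock mismatch, which is exactly why some slow locking mechanism is needed, and exactly where low-frequency jitter can re-enter.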
My understanding is that any external DAC using an SPDIF interface needs to recover the clock embedded in the incoming signal simply to be able to read the data. Writeups about the Benchmark DAC1 have described the attention to PCB design required to isolate the jittery incoming clock from the resampled, internally clocked signal that the ASRC sends to the DAC chip. Is there, in theory, any difference in the concept of isolating the incoming clock from the internally clocked bitstream depending on whether an ASRC or a buffer serves as the intermediary between the two?
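For what it's worth, the core idea of an ASRC can be sketched in a few lines: input samples are re-timed by interpolating them onto the output clock's sample grid. The linear interpolation below is purely illustrative (real ASRC chips use long polyphase filters to do this cleanly), but it shows how the output becomes a new sample stream on a new clock rather than a pass-through of the input:

```python
# Minimal sketch of the ASRC concept: re-time input samples onto the
# output clock's grid by interpolating between adjacent input samples.
# Linear interpolation is for illustration only; real ASRC hardware
# uses far more sophisticated polyphase filtering.

def asrc_linear(samples: list[float], ratio: float) -> list[float]:
    """Resample `samples` by `ratio` = f_out / f_in."""
    out = []
    pos = 0.0           # current position on the *input* time axis
    step = 1.0 / ratio  # input-time advance per output sample
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append((1 - frac) * samples[i] + frac * samples[i + 1])
        pos += step
    return out

# Doubling the rate of a short ramp:
print(asrc_linear([0.0, 1.0, 2.0, 3.0], 2.0))
# -> [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
```

The point relevant to jitter: the output samples are emitted on the local clock's grid, so whatever timing isolation is achieved depends on how well that local clock (and the ratio estimate) is shielded from the jittery input clock, which sounds like the same engineering problem a buffer-based design faces.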
Rereading my post, it seems I was overstating things when I said, in regard to ASRC DACs, that "the resampled signal... is not affected by timing errors that may have existed in the original signal." But I wonder about your statement, "Something like the Chord DAC64, which buffers all incoming data and tosses the signal clock in favor of its local clock, is truly independant of transport/interface jitter..." Is there some way that the buffer-based design of the DAC64 is "truly independent" in a way that ASRC solutions aren't? Or is it perhaps more correct to say that the extent to which either approach successfully creates a new signal independent of transport/interface jitter depends on how well the engineers have accomplished the isolation from the input clock?
FWIW, given my limited ability to interpret such things, it appears that the DAC1 and DAC64 perform very similarly on Stereophile's jitter measurements.
Always learning...
Best,
Beau