- Coaxial is a particular method of transmitting S/PDIF, and so is optical. What exactly do you want to say?
- It is mathematically proven that jitter, however small, will corrupt the analog signal. Did you even read the first paper I linked? Jitter as low as 150 ps can be reliably measured in practice by analysing the analog output of a DAC. The question is how much jitter (and of which type, signal-correlated or not) is needed to have an audible effect. To get an accurate analog signal, you need the correct sample values at the DAC input AND those values must be fed in with the correct timing.
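To put a number on that: the standard worst-case argument says a timing error Δt on a sine of frequency f produces an amplitude error of roughly 2πf·Δt, so random clock jitter with RMS σ limits the SNR to about −20·log10(2πfσ). Here is a quick sketch (my own toy simulation, not from the linked paper) checking that formula against a brute-force simulation of a jittered clock:

```python
import math
import random

def jitter_error_snr(freq_hz, sigma_jitter_s, n=100_000, fs=192_000):
    """Sample a full-scale sine with Gaussian clock jitter and return
    the resulting signal-to-error ratio in dB."""
    random.seed(0)
    err_sq = 0.0
    sig_sq = 0.0
    for k in range(n):
        t = k / fs
        ideal = math.sin(2 * math.pi * freq_hz * t)
        # Same sample instant, but displaced by a random timing error.
        jittered = math.sin(2 * math.pi * freq_hz * (t + random.gauss(0.0, sigma_jitter_s)))
        err_sq += (jittered - ideal) ** 2
        sig_sq += ideal ** 2
    return 10 * math.log10(sig_sq / err_sq)

# Theoretical jitter-limited SNR for random jitter: -20*log10(2*pi*f*sigma)
f, sigma = 10_000.0, 150e-12      # 10 kHz tone, 150 ps RMS jitter
theory = -20 * math.log10(2 * math.pi * f * sigma)
measured = jitter_error_snr(f, sigma)
print(f"theory   : {theory:.1f} dB")
print(f"simulated: {measured:.1f} dB")
```

With 150 ps of random jitter on a 10 kHz tone, both land around 100 dB, i.e. right at the edge of 16-bit resolution, which is why such small jitter is still measurable at the analog output.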
- You realize that most USB DACs on the market still use USB 1.1 receivers from TI, don't you?
- Don't put words in my mouth. Noise will not corrupt the data, but it will superimpose itself on the output signal and corrupt the analog signal. Example: depending on which PC I connect my USB DAC to (desktop, laptop on batteries, etc.), the noise floor at the analog output varies by as much as 6 dB. With sensitive headphones, that can get pretty audible.
- SATA has error correction and works only with purely digital devices. It doesn't care much about noise. However, you have to realize that USB receivers, S/PDIF receivers and DACs are real-time, mixed-signal devices. They are nothing like SATA. Put noise on the S/PDIF receiver's supply and the (analog) PLL's performance degrades, which in turn reduces the accuracy of the system clock it produces. Put noise on the DAC's supply pins and you'll find part of it on the analog output. It's not a matter of corrupted data.
You could read those too:
It can't be real-time...
DACs don't do bit-by-bit real-time conversion, they use a buffer. Look at this Julia screenshot:
For example, say our buffer holds 32 samples. Once it has received all 32, it converts them into the analog wave. It doesn't care how much jitter there was; the only thing that matters is that all 32 samples arrive within one buffer period.
So it doesn't convert bit by bit in real time, but in those chunks of samples. And it will wait for all the samples before converting them into the analog wave.
Let's say jitter is a random delay on each bit. But the buffer waits until all bits of a given sample have arrived. It doesn't record the actual delay; it just waits until all 32 samples are ready and then converts them. One bit just comes sooner, another later.
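The idea I'm describing can be sketched in a toy model (this is a hypothetical FIFO reclocker I made up to illustrate the point, not how any specific DAC chip works): samples arrive with input-side jitter, a 32-sample buffer absorbs it, and the output side reads at a fixed local-clock period. The output timing then depends only on the local clock, as long as the buffer never runs dry:

```python
import random
from collections import deque

BUFFER = 32     # samples held before output starts (adds fixed latency)
PERIOD = 1.0    # nominal sample period of the local output clock

def reclock(arrival_times):
    """Return output instants on a rigid local-clock grid;
    raise if a sample hasn't arrived by the time it's needed (underrun)."""
    fifo = deque()
    out = []
    start = arrival_times[BUFFER - 1]       # wait for the buffer to fill once
    next_in = 0
    for k in range(len(arrival_times)):
        t_out = start + k * PERIOD          # fixed grid, no input jitter here
        while next_in < len(arrival_times) and arrival_times[next_in] <= t_out:
            fifo.append(next_in)            # accept everything that arrived
            next_in += 1
        if not fifo:
            raise RuntimeError("buffer underrun: input too late")
        fifo.popleft()                      # play exactly one sample per tick
        out.append(t_out)
    return out

random.seed(1)
# Inputs nominally 1.0 apart, each shifted by up to +/-0.2 of jitter.
arrivals = [k * PERIOD + random.uniform(-0.2, 0.2) for k in range(256)]
outs = reclock(arrivals)
gaps = [b - a for a, b in zip(outs, outs[1:])]
print(min(gaps), max(gaps))   # output intervals equal PERIOD (up to fp rounding)
```

In this sketch the ±0.2 of input jitter never reaches the output grid; what the buffer trades for that is latency, and the output is only as good as the local clock driving `t_out`, which is exactly where the other side's point about PLL quality comes back in.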
You are talking about jitter in general, I am talking about how it works with DACs.
And USB 1.1 sucks big time; its full-speed isochronous bandwidth is limited, which is why most USB 1.1 DACs top out around 44.1 or 48 kHz/16-bit.
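For what it's worth, the raw arithmetic (my own back-of-the-envelope check, using the spec's 1023-byte-per-1-ms-frame cap for a single full-speed isochronous endpoint) shows the bus itself could carry more than 44.1 kHz/16-bit; the tighter limits usually come from the receiver chips:

```python
# What PCM rates fit in one USB 1.1 full-speed isochronous pipe?
# Full speed is 12 Mbit/s overall; a single isochronous endpoint is
# capped at 1023 bytes per 1 ms frame, i.e. ~8.18 Mbit/s of payload.
ISO_LIMIT_BPS = 1023 * 8 * 1000   # payload bits per second, one endpoint

def stream_bps(rate_hz, bits, channels=2):
    """Raw PCM payload rate for an interleaved stream."""
    return rate_hz * bits * channels

for rate, bits in [(44_100, 16), (48_000, 16), (96_000, 24), (192_000, 24)]:
    bps = stream_bps(rate, bits)
    verdict = "fits" if bps <= ISO_LIMIT_BPS else "does NOT fit"
    print(f"{rate/1000:g} kHz / {bits}-bit stereo = {bps/1e6:.2f} Mbit/s -> {verdict}")
```

So 96 kHz/24-bit stereo fits with room to spare, while 192 kHz/24-bit is the first common format that genuinely exceeds a single full-speed isochronous endpoint.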
Edited by Drake22 - 3/8/11 at 4:47am