Head-Fi.org › Forums › Equipment Forums › Cables, Power, Tweaks, Speakers, Accessories (DBT-Free Forum) › Optical TOSLINK vs. USB: Which connection is better to connect a DAC?

Optical TOSLINK vs. USB: Which connection is better to connect a DAC? - Page 4

post #46 of 105
Quote:
Originally Posted by 00940 View Post


- Coaxial is a particular method of transmitting SPDIF, and so is optical. What exactly are you trying to say?

 

- It is mathematically proven that jitter, however small, will corrupt the analog signal. Did you even read the first paper I linked? Jitter as low as 150ps can be reliably measured in practice by analysing the analog output of a DAC. The question is how much jitter (and of which type, signal-correlated or not) is needed to have an audible effect. To get an accurate analog signal, you need the correct sample values at the DAC input AND those values must be fed in with the correct timing.
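A toy back-of-the-envelope sketch of this point (my own illustration, not from either poster): sample a sine with an ideal 44.1kHz clock, then with a clock whose edges are displaced by random jitter, and compare the values. The data is bit-perfect in both cases; only the sampling instants differ.

```python
import math
import random

def sample_sine(freq_hz, fs_hz, n, jitter_rms_s=0.0, seed=0):
    """Return n samples of a unit sine, optionally with jittered sample instants."""
    rng = random.Random(seed)
    out = []
    for k in range(n):
        # Jitter shifts *when* the value is taken, not *what* bits are stored.
        t = k / fs_hz + rng.gauss(0.0, jitter_rms_s)
        out.append(math.sin(2 * math.pi * freq_hz * t))
    return out

ideal = sample_sine(10_000, 44_100, 1000)
jittered = sample_sine(10_000, 44_100, 1000, jitter_rms_s=2e-9)  # 2 ns RMS

# The values differ even though every bit was delivered intact: the error is
# roughly slew rate x timing error, so it grows with frequency and jitter.
worst = max(abs(a - b) for a, b in zip(ideal, jittered))
```

The error here is tiny in absolute terms, which is exactly why the real debate is about audibility rather than measurability.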

 

 

- You realize that most USB DACs on the market are still using USB 1.1 receivers from TI, don't you?

 

- Don't put words in my mouth. Noise will not corrupt the data, but it will superimpose itself on the output signal and corrupt the analog signal. Example: depending on which PC I connect my USB DAC to (desktop, laptop on batteries, etc.), the noise floor (at the analog output) can vary by as much as 6dB. With sensitive headphones, that can get pretty audible.

 

- SATA has error correction and works only with purely digital devices. It doesn't care much about noise. However, you have to realize that USB receivers, SPDIF receivers and DACs are real-time, mixed-signal devices. They are nothing like SATA. Put noise on the SPDIF receiver's supply and the (analog) PLL's performance is degraded, which in turn reduces the accuracy of the system clock it's producing. Put noise on the DAC's supply pins and you'll find part of it on the analog output. It's not a matter of corrupted data.

 

 

You could read those too:

http://www.tnt-audio.com/clinica/diginterf1_e.html

http://www.stereophile.com/reference/1093jitter

 

 

 


It can't be real-time...

 

DACs don't do bit-by-bit real-time conversion; they use a buffer. Look at this Julia screenshot:

[attached screenshot: 20110308-ayoe-54kb.jpg]

 

For example, say our buffer is 32 samples. When it gets all 32 samples, it converts them into an analog waveform. It doesn't care how much jitter there was; the only thing that matters is that all 32 samples are received in time.

So it doesn't convert bit by bit in real time, but in these chunks of samples, and it waits for all the samples before converting them into an analog waveform.

Let's say jitter is a random delay on each bit. The buffer waits until all bits of a given sample have arrived. It doesn't record the actual delay; it just waits until all 32 are ready and then converts them. One bit just comes a little sooner, another a little later.

 

You are talking about jitter in general; I am talking about how it works with DACs.

 

And USB 1.1 sucks big time; that's why you can't go higher than 44kHz/16-bit over USB 1.1.


Edited by Drake22 - 3/8/11 at 4:47am
post #47 of 105


 

Yeah, sure, right, whatever. You're mixing up latency buffers and what happens on the DAC side, don't know what SPDIF is, and have no clue about I2S timing at a DAC's inputs. And yet you put your nonsense forward as if it were the Gospel.

 

That latency buffer has nothing to do with jitter. It's there to avoid missing samples.

 

To put it simply... Your data (0s and 1s) is stored on your HDD. It must be transferred towards a digital transmission interface (and sometimes manipulated along the way). This interface will push data out at every tick of a clock. The DAC, on the hardware side, is expecting that data at every tick. See the problem now? How can we guarantee that the digital transmission interface will always have something to send, at every tick? What happens if the HDD slows down, or if your computer is suddenly loaded down and the software manipulation of the data takes more time than expected? That's where the latency buffer comes into play. We don't start the transmission immediately; we wait until we have some samples on hand in a fast buffer next to the digital transmission interface, so that we can preserve the real-time flow needed for the SPDIF (or isochronous USB) connection to work.
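A toy model of the latency buffer as described above (the names and numbers are mine, purely illustrative): the source delivers samples in irregular bursts, the transmission interface takes exactly one sample per clock tick, and a small pre-fill keeps playback from ever starving.

```python
from collections import deque

def run(deliveries, ticks, prefill):
    """deliveries[t] = list of samples arriving just before tick t.
    Returns the number of underruns (ticks where playback had no sample)."""
    fifo = deque()
    underruns = 0
    for t in range(ticks):
        for s in deliveries.get(t, []):
            fifo.append(s)
        if t < prefill:        # still pre-filling: playback hasn't started yet
            continue
        if fifo:
            fifo.popleft()     # exactly one sample out per clock tick
        else:
            underruns += 1     # nothing to send: an audible dropout
    return underruns

# A bursty source that delivers 4 samples every 4 ticks, starting late:
bursty = {t: [0] * 4 for t in range(2, 40, 4)}

no_prefill = run(bursty, 40, prefill=0)    # starves before the first burst
with_prefill = run(bursty, 40, prefill=2)  # small head start: no dropouts
```

Note this buffer solves the missing-samples problem only; it says nothing about the timing precision of the clock on the far side.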

 

This explanation isn't too bad: http://support.apple.com/kb/ht1314

post #48 of 105
Quote:
Originally Posted by 00940 View Post


 

Yeah, sure, right, whatever. You're mixing up latency buffers and what happens on the DAC side, don't know what SPDIF is, and have no clue about I2S timing at a DAC's inputs. And yet you put your nonsense forward as if it were the Gospel.

 

That latency buffer has nothing to do with jitter. It's there to avoid missing samples.

 

To put it simply... Your data (0s and 1s) is stored on your HDD. It must be transferred towards a digital transmission interface (and sometimes manipulated along the way). This interface will push data out at every tick of a clock. The DAC, on the hardware side, is expecting that data at every tick. See the problem now? How can we guarantee that the digital transmission interface will always have something to send, at every tick? What happens if the HDD slows down, or if your computer is suddenly loaded down and the software manipulation of the data takes more time than expected? That's where the latency buffer comes into play. We don't start the transmission immediately; we wait until we have some samples on hand in a fast buffer next to the digital transmission interface, so that we can preserve the real-time flow needed for the SPDIF (or isochronous USB) connection to work.

 

This explanation isn't too bad: http://support.apple.com/kb/ht1314

 


You've just answered yourself in your own statement. What about jitter when the data goes into the buffer with jitter already? And inside the DAC you'll have the same jitter regardless of whether it's received from USB or optical, if we have a buffer at the end after transmission. So jitter won't affect the processing in any case.

 

I know what SPDIF is; in that sentence I meant TOSLINK vs. coaxial.

 

Nonsense is saying that you can hear the difference between different digital carriers.

I'll see my way out :)

 

post #49 of 105
Quote:

Originally Posted by Drake22 View Post

 

What about jitter when the data goes into the buffer with jitter already?


??? Once the data goes into a big enough buffer, there is no jitter to speak about anymore. The samples simply wait for their turn to leave the buffer, at the tick of the clock. Once they leave the buffer... well, go back and read TNT's article; it's decent enough.

 

Quote:

Originally Posted by Drake22 View Post

 

 And inside the DAC you'll have the same jitter regardless of whether it's received from USB or optical, if we have a buffer at the end after transmission.

 

Who told you you had a buffer at the end after the transmission? Because in most cases you don't, or it isn't big enough to remove the jitter. And of course, there's still the question of the system clock on the DAC side. How do you generate it properly, knowing that it must somehow be linked to the source time domain? Because, even if you have a buffer on the receiving side, the data still has to leave it (1) with proper timing so that the D/A conversion is done properly, and (2) in some link with the source clock so that you don't empty your buffer or fill it too quickly.
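A rough sketch of the clock-domain problem in the paragraph above (the numbers are my own assumptions): even with a receive-side buffer, if the DAC's clock free-runs instead of tracking the source, the buffer level drifts until it underruns or overflows.

```python
def buffer_drift(fs_source, fs_sink, seconds, start_level):
    """Buffer occupancy (in samples) after streaming for `seconds`,
    when the source fills at fs_source and the sink drains at fs_sink."""
    return start_level + round((fs_source - fs_sink) * seconds)

# Two nominal 44.1 kHz clocks that disagree by only 100 ppm:
level = buffer_drift(44_100.0, 44_100.0 * (1 + 100e-6), 60.0, 256)
# The sink drains ~265 samples more than arrive in one minute, so a
# 256-sample buffer goes negative (underruns) in under a minute -
# which is why the DAC-side clock must somehow follow the source.
```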

 

 

edit:
 

Quote:

Originally Posted by Drake22 View Post

 

Nonsense is saying that you can hear the difference between different digital carriers.

 

I'm not making the case that you can hear it... I'm making the case that you can objectively measure it and that there are sound engineering reasons for that. Differences between digital carriers are cold, hard facts as far as the SPDIF and USB audio protocols are concerned.

 

Audibility is another thing.

 

 

 

This whole discussion reminds me of this:

 

http://www.aoselectronics.com/jitter_article.html


Edited by 00940 - 3/8/11 at 6:31am
post #50 of 105
Quote:
Originally Posted by 00940 View Post




??? Once the data goes into a big enough buffer, there is no jitter to speak about anymore. The samples simply wait for their turn to leave the buffer, at the tick of the clock. Once they leave the buffer... well, go back and read TNT's article; it's decent enough.

 

Who told you you had a buffer at the end after the transmission? Because in most cases you don't, or it isn't big enough to remove the jitter. And of course, there's still the question of the system clock on the DAC side. How do you generate it properly, knowing that it must somehow be linked to the source time domain? Because, even if you have a buffer on the receiving side, the data still has to leave it (1) with proper timing so that the D/A conversion is done properly, and (2) in some link with the source clock so that you don't empty your buffer or fill it too quickly.


*on the way out*

 

"??? Once the data goes into a big enough buffer, there is no jitter to speak about anymore."

 - that's my point, the one you've been disagreeing with. And it would take some insane jitter to miss the buffer; hence why there are different buffer settings to prevent that.

 

"Who told you had a buffer in the end after the transmission ?"

 - every DAC has a buffer... find me a DAC with no buffer and zero latency that does bit-by-bit real-time conversion.

 

Mate, it seems that you have such a superDAC that has no latency buffer. In that case, you are awesome.

Seriously speaking, you are the one with no clue.

 

post #51 of 105

 

Quote:
Originally Posted by Drake22 View Post


"??? Once the data goes into a big enough buffer, there is no jitter to speak about anymore."

 - that's my point, the one you've been disagreeing with. And it would take some insane jitter to miss the buffer; hence why there are different buffer settings to prevent that.

 

"Who told you had a buffer in the end after the transmission ?"

 - every DAC has a buffer... find me a DAC with no buffer and zero latency that does bit-by-bit real-time conversion.

 

Mate, it seems that you have such a superDAC that has no latency buffer. In that case, you are awesome.

Seriously speaking, you are the one with no clue.

 



Here is a simplified typical digital chain:

 

HDD/CD/DVD - latency buffer - SPDIF emitter - SPDIF cable - SPDIF receiver (DIR9001 for ex.) - I2S lines - DAC chip (PCM1794 for ex.).

 

 

Jitter is a problem that starts at the SPDIF emitter. Explain to me how the latency buffer will change anything.
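To make the distinction in this chain concrete (my framing, a sketch rather than either poster's exact model): a buffer can re-time *delivery* of the data upstream, but the moment each sample becomes a voltage is set by the clock recovered at the SPDIF receiver, so jitter on that recovered clock reaches the conversion directly.

```python
def conversion_instants(n, fs_hz, clock_jitter):
    """Times at which each of n samples is actually converted to a voltage,
    given per-sample timing error on the recovered clock."""
    return [k / fs_hz + clock_jitter[k] for k in range(n)]

# Delivery jitter upstream of the buffer never shows up here - the buffer
# re-times it. Jitter on the recovered clock displaces the instants directly.
jitter = [((-1) ** k) * 1e-9 for k in range(8)]   # toy +/-1 ns pattern
t_ideal = conversion_instants(8, 44_100, [0.0] * 8)
t_real = conversion_instants(8, 44_100, jitter)
```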

post #52 of 105


 

Quote:
Originally Posted by 00940 View Post

 



Here is a simplified typical digital chain:

 

HDD/CD/DVD - latency buffer - spdif emitter - SPDIF cable - spdif receiver (dir 9001 for ex.) - I2S lines - DAC chip (PCM1794 for ex.).

 

 

Jitter is a problem that starts with the spdif emitter. Explain me how the latency buffer will change anything.


Your DAC has a buffer.

After the "spdif receiver (dir 9001 for ex.)" there is a buffer. Your receiver collects the data and packs it into the buffer. It then passes it over the I2S bus to the DAC chip, which converts that buffer into an analog waveform on the way out.

This makes any jitter that happens during the SPDIF transmission irrelevant, simply because of the buffer after it: the buffer waits for the bits and doesn't care whether they come with jitter or not.

 

Take a DAC that has, for example, a TOSLINK input and a coax input; sending the same signal through them will result in the same graph at the output. So if everything is working correctly - no sound difference.

If people say they can hear a difference between them, it's either:

  a) they're lying, for whatever reason (they bought an expensive cable, self-suggestion or whatever)

  b) their device is defective

 

post #53 of 105

Quote:

Originally Posted by Drake22 View Post


Your DAC has a buffer.

After the "spdif receiver (dir 9001 for ex.)" there is a buffer. Your receiver collects the data and packs it into the buffer. It then passes it over the I2S bus to the DAC chip, which converts that buffer into an analog waveform on the way out.

This makes any jitter that happens during the SPDIF transmission irrelevant, simply because of the buffer after it: the buffer waits for the bits and doesn't care whether they come with jitter or not.


It has been fun but now it's time to wake up. Take the blue pill and open your eyes. There is no such buffer. Read TI's datasheets. Read this http://www.tnt-audio.com/clinica/diginterf1_e.html (obviously, you still haven't done it). Everything you're saying is wishful thinking.

 

You know why? Because consumers don't want delays in playback devices, and because IC manufacturers find big buffers too costly. Read this carefully: http://www.eetimes.com/design/audio-design/4009467/The-D-A-diaries-A-personal-memoir-of-engineering-heartache-and-triumph (there are three parts; don't skip the 2nd).

 

If you want, you can read this: http://www.wolfsonmicro.com/documents/uploads/misc/en/A_high_performance_SPDIF_receiver_Oct_2006.pdf   It's maybe the closest thing to what you're describing. But you should know that even Wolfson's current SPDIF receivers don't use that technology, according to Wolfson's representatives.

 

Tell me... why do the engineers at Wolfson bother trying to improve their SPDIF receivers if everything is already perfect?

 

Tell me... why can we measure, through the analog outputs of a DAC, differences in jitter when it is connected through different methods? Logically speaking, it's only because there are differences showing up in the analog waveform, isn't it? Or are the guys from MSB wrong? http://www.msbtech.com/support/JitterPaper.pdf

 

 

You might be right that the differences in sound between digital cables are inaudible (the noise question aside, obviously). You are completely wrong about the reasons why it might be so.

post #54 of 105

Interesting debate, but surely the OP's question is which sounds better, USB or optical, not which one works better? My impression is that when they work, they work as well as each other, and that without the audibility testing that is banned from much of the forum, you cannot actually tell which, if any, sounds better.

post #55 of 105

For those questions, I stand by my post #8.

post #56 of 105
Quote:
Originally Posted by 00940 View Post

The V-DAC uses an asynchronous sample rate converter (hence the claim of ultra-low jitter). The sonic differences between digital inputs should be zero.

The arguments for optical over USB would be isolation from the computer ground (which can induce some noise in some designs) and the fact that your soundcard's optical output is probably not as dependent on the CPU as USB is. USB can be laggy when there is a heavy load on the CPU (not so much of a problem with modern CPUs). The drivers for your soundcard might also be better optimized for gaming than the generic USB audio drivers.


Post No. 8. It contains suggestions of why there may be a difference, and instances where a fault could be present. That is not meant as a criticism. I am just making the point that suggestion is used a lot to explain supposed differences in sound that then vanish when subjected to an audibility test. And if there is a fault, such as a ground loop, that is not so much a reason why one type of cable is better than another; it is just an issue that can arise with certain types of connection.

 

post #57 of 105

My limited experience agrees with the people who've said it depends on the individual DAC's implementation of the two interfaces. I just bought a V-Link USB-S/PDIF converter. The sound I'm getting out of the same DAC through its coax input is noticeably better than the sound I was getting out of its USB input. Admittedly, the converter is using asynchronous mode to get the USB signal, so that may play into the equation.

post #58 of 105

@Prog Rock Man: I see your point. However, as I was rather making the point that, in this case, there should theoretically be no difference in sound, I would be very pleased if an audibility test proved me right on this.

 

Of the three potential problems, two (CPU load and drivers) should be easily apparent in games just by checking the FPS. The last (ground isolation) is trickier.


Edited by 00940 - 3/8/11 at 9:13am
post #59 of 105

Optical from my MacBook is the lowest-performing connection method to my DAC. Your performance from your sound card might be better. I'm willing to bet a V-Link is going to be much better, as in no comparison, than the optical from your PC. It'll also probably be better than going straight USB into the DAC.

post #60 of 105

 

Quote:
Originally Posted by Drake22 View Post


lol, 5k posts you have, are they all worthless like this?

 

What about latency? It's just a delay, the sound itself doesn't change.

 

Evidently you are incapable of learning and have to make things personal. To quote from one of the links that you evidently know better than:

 

"Instead of impulses, usually the sequence of numbers update the analogue voltage at uniform sampling intervals.

These numbers are written to the DAC, typically with a clock signal that causes each number to be latched in sequence, at which time the DAC output voltage changes rapidly from the previous value to the value represented by the currently latched number. The effect of this is that the output voltage is held in time at the current value until the next input number is latched, resulting in a piecewise-constant or 'staircase'-shaped output. This is equivalent to a zero-order hold operation and has an effect on the frequency response of the reconstructed signal."
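A minimal illustration of the zero-order hold in the quoted passage (my own sketch, not from the quoted source): each latched value is simply held until the next one arrives, which is what produces the 'staircase' output.

```python
def zero_order_hold(samples, hold_factor):
    """Repeat each latched value hold_factor times: the 'staircase' waveform."""
    out = []
    for v in samples:
        out.extend([v] * hold_factor)  # hold the voltage until the next latch
    return out

stair = zero_order_hold([0.0, 1.0, -1.0], 4)
# -> [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, -1.0, -1.0, -1.0, -1.0]
```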

 

 

 

 


Edited by grokit - 3/8/11 at 2:20pm