USB quality - does it make any difference?
Nov 22, 2009 at 6:26 PM Post #31 of 46
Quote:

Originally Posted by Duggeh
I didn't make conclusions based on testing. I stated facts. The USB spec sheets are freely available here. All devices made to work to those specifications are USB compatible. Those not made to a compatible spec may have issues. I will not reiterate my previous points regarding compatibility. The issue relating to single bus power, as was patently obvious from my post, came from my own personal experience with a particular situation.

Additionally, I am not a MOT in the business of selling cables. Empirical Audio are. As such, claims made should be substantiated; if there is no evidence for claims made by manufacturers, then those claims, from any company, scientific though they may often sound, are no different from claiming that their product is superior because it uses magic dust harvested from the wings of organically grown pixies.

Please don't misconstrue me. I'm not trying to slander what has been claimed. But it is a claim that is best served by some means of verification.



There are a plethora of measurements and graphs on my website for my cables, as well as simulation results.

There are also some measurements for my products, such as my Overdrive DAC done with Audio Precision equipment. Other measurements for jitter on my USB devices and reclockers are forthcoming.

Steve N.
Empirical Audio
 
Nov 22, 2009 at 8:08 PM Post #32 of 46
The only factors relevant in a serial cable for data transmission are its performance with respect to packet drops and latency. These are trivial to measure with a loopback device and any number of free tools available online. You would think that companies selling USB "audio" cables for hundreds of dollars could spend ten minutes getting some actual performance metrics.
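
For the record, here is the sort of ten-minute test I mean: a rough sketch in Python using pyserial against a USB-serial adapter with its TX and RX pins bridged. The port name and packet sizes are just examples, not a recommendation of any particular setup.

Code:

import time
import serial  # pyserial

PORT = "/dev/ttyUSB0"              # example name; COM3 etc. on Windows
N_PACKETS = 1000
PAYLOAD = b"0123456789ABCDEF" * 4  # 64-byte test packet

# Loopback test: every byte written should come straight back.
with serial.Serial(PORT, baudrate=115200, timeout=0.5) as link:
    drops = 0
    latencies = []
    for _ in range(N_PACKETS):
        t0 = time.perf_counter()
        link.write(PAYLOAD)
        echoed = link.read(len(PAYLOAD))
        t1 = time.perf_counter()
        if echoed != PAYLOAD:
            drops += 1             # short read or corrupted data
        else:
            latencies.append(t1 - t0)

print(f"drops: {drops}/{N_PACKETS}")
if latencies:
    print(f"mean round-trip: {1e3 * sum(latencies) / len(latencies):.3f} ms")

Run it once per cable and compare the numbers; any cable that meets spec should show zero drops.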
 
Nov 22, 2009 at 9:12 PM Post #33 of 46
Quote:

Originally Posted by ert
The only factors relevant in a serial cable for data transmission are its performance with respect to packet drops and latency. These are trivial to measure with a loopback device and any number of free tools available online. You would think that companies selling USB "audio" cables for hundreds of dollars could spend ten minutes getting some actual performance metrics.


Dropouts and hiccups are a non-issue. Jitter is the only thing worth characterizing, and this requires frequency sweeps and modulation of the jitter stimulus. Non-trivial.
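
To make "non-trivial" concrete: the quantity of interest is the time interval error (TIE) of each clock edge and its spectrum, not an error count. Below is a toy Python sketch of the analysis stage only, run on simulated timestamps; the hard part, capturing real edges with picosecond resolution, is exactly what needs the expensive hardware.

Code:

import numpy as np

fs_clock = 44_100 * 64    # nominal edge rate (e.g., a bit clock), Hz
n_edges = 1 << 16
t_ideal = np.arange(n_edges) / fs_clock

# Simulate 200 ps of jitter modulated at 1 kHz plus 50 ps of random noise.
jitter = 200e-12 * np.sin(2 * np.pi * 1e3 * t_ideal)
jitter += 50e-12 * np.random.randn(n_edges)
t_measured = t_ideal + jitter

# Time interval error (TIE) = measured edge time minus ideal edge time.
tie = t_measured - t_ideal
spectrum = np.abs(np.fft.rfft(tie * np.hanning(n_edges)))
freqs = np.fft.rfftfreq(n_edges, d=1 / fs_clock)

peak = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
print(f"dominant jitter component near {peak:.0f} Hz")
print(f"RMS jitter: {np.std(tie) * 1e12:.1f} ps")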

Steve N.
Empirical Audio
 
Nov 22, 2009 at 9:17 PM Post #34 of 46
Duggeh, no offense meant, and I was surely not backing up a MOT.

My point is that SPDIF is also built to spec in all consumer devices, yet those devices can still sound widely different.

There is no point in simplifying discussions by citing facts and measurements when those do not guarantee that all other possible factors (identified or not) are equal. Have scientists been able to explain, with facts and measurements, the origin of the universe? Of course not; still, they can observe its consequences.

I feel an open mind is necessary when seeking the best possible sound, and that we should give credit to others' opinions even when they differ from our own.
 
Nov 22, 2009 at 10:14 PM Post #35 of 46
Quote:

Originally Posted by audioengr
Dropouts and hiccups are a non-issue. Jitter is the only thing worth characterizing, and this requires frequency sweeps and modulation of the jitter stimulus. Non-trivial.

Steve N.
Empirical Audio



Where are you measuring jitter? Why go to such extremes when a trivial USB packet analyzer can compare the performance of two cables?
 
Nov 23, 2009 at 1:58 AM Post #36 of 46
Quote:

Originally Posted by ert
Where are you measuring jitter? Why go to such extremes when a trivial USB packet analyzer can compare the performance of two cables?



A USB analyzer cannot measure picoseconds of jitter. I have to send the equipment off to a professional test engineer with an Audio Precision system. That system rents for more than $2K per month; the cost is astronomical. A $100K scope would do the measurements too, but not automated.

How many businesses do you think would spend $100K on a scope just so they can publish some jitter measurements?

Steve N.
 
Nov 23, 2009 at 4:45 AM Post #37 of 46
Hi,

There will always be jitter of some sort on a USB interface, as there isn't a dedicated clock connection. The USB cable has four wires: two for power and two for data; there is no clock line.

Therefore the sending clock (the USB bus on the computer) and the receiving clock (the DAC) can and will drift. Most USB DACs I've come across use some sort of PLL (phase-locked loop) to mitigate this effect.
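
For anyone curious what the PLL is doing, here is a toy software model in Python. A real DAC implements this in hardware, and the loop gains below are arbitrary illustration values, not taken from any actual chip.

Code:

import numpy as np

n = 5000
# The source clock runs 100 ppm fast relative to the local clock.
source_phase = np.cumsum(np.full(n, 1.0 + 100e-6))

kp, ki = 0.05, 0.001        # proportional / integral loop gains
local_phase, integ = 0.0, 0.0
errors = []

for k in range(n):
    err = source_phase[k] - local_phase    # phase error at the detector
    errors.append(err)
    integ += ki * err                      # integrator learns the offset
    local_phase += 1.0 + kp * err + integ  # steer the local oscillator

print(f"initial error: {errors[0]:+.6f} cycles, final: {errors[-1]:+.6f} cycles")

After a short settling period the local clock tracks the fast source and the phase error goes to zero, which is the kind of correction the DAC's PLL performs.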

You can read more about how the designer of the PCM2702 USB DAC went about minimizing this here:
The D/A diaries: A personal memoir of engineering heartache and triumph

So to answer the original question:
Is there any difference in quality between the digital audio coming out of a cheap motherboard's USB sockets vs. a high-quality USB out?

Yes, there is. It depends on how well matched the clocks on the sending and receiving devices are, and how good your receiving device (DAC) is at correcting any differences.
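
Some back-of-the-envelope numbers on the clock-matching point (the ppm figure is a plausible worst case for cheap crystals, not a measurement of any particular device):

Code:

# How fast a clock mismatch over/underruns a FIFO if nothing corrects it.
sample_rate = 44_100    # samples per second
mismatch_ppm = 100      # e.g., +/-50 ppm on each side, worst case
fifo_depth = 256        # samples of buffering in the DAC

surplus_per_s = sample_rate * mismatch_ppm * 1e-6  # extra samples per second
seconds_to_overrun = (fifo_depth / 2) / surplus_per_s

print(f"surplus: {surplus_per_s:.2f} samples/s")
print(f"a half-full {fifo_depth}-sample FIFO over/underruns in "
      f"{seconds_to_overrun:.0f} s without correction")

That works out to a glitch roughly every half minute unless the receiver actively corrects for the drift, which is why the PLL (or similar) matters.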

Note: This isn't applicable to devices like printers, where the printer can simply ask for the data again if something doesn't make sense. If the DAC did this, it would result in a pause in the sound.
 
Nov 23, 2009 at 6:57 AM Post #38 of 46
Quote:

Originally Posted by Rooford
There will always be jitter of some sort on a USB interface, as there isn't a dedicated clock connection.


First, methodologically correct and well-designed studies should prove whether jitter at the levels observed in audio equipment has any real effect on subjective sound quality. Otherwise, people get fooled by companies selling them this unconfirmed "jitter is bad" slogan.
 
Nov 23, 2009 at 4:20 PM Post #40 of 46
Yep, in total agreement with you, AdamWysokinski.

It doesn't matter how much cash you spend on a dedicated USB out; if the clocks on either side differ, the problem remains.

In practical terms this problem can be all but eliminated using clever hardware techniques, as the article goes on to explain.

I'm sure all USB DACs now use these techniques, which I guess was the point I was trying to make, instead of being a smart arse and saying there's always going to be jitter.

To minimize noise introduced by the USB cable (which I doubt you'd notice), always use the shortest length possible and ensure it's shielded.
 
Nov 23, 2009 at 6:21 PM Post #41 of 46
Quote:

Originally Posted by Rooford
Yep, in total agreement with you, AdamWysokinski.

It doesn't matter how much cash you spend on a dedicated USB out; if the clocks on either side differ, the problem remains.

In practical terms this problem can be all but eliminated using clever hardware techniques, as the article goes on to explain.

I'm sure all USB DACs now use these techniques, which I guess was the point I was trying to make, instead of being a smart arse and saying there's always going to be jitter.

To minimize noise introduced by the USB cable (which I doubt you'd notice), always use the shortest length possible and ensure it's shielded.



Shorter is not always better. If you don't go short enough, you will likely increase jitter, because the cable reflection hits the receiver just as it is detecting the edge. I recommend 1m. If you go too long, then you are adding jitter due to low-pass filtering and dispersion. See this paper:

spdif
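
The timing argument in round numbers, sketched in Python (the 0.66 velocity factor is a typical figure for cable dielectrics, and the bit-cell rate assumes S/PDIF at 44.1 kHz; neither number is taken from the paper above):

Code:

C = 3e8                  # speed of light, m/s
velocity_factor = 0.66   # typical for coax / twisted pair

def reflection_delay_ns(length_m: float) -> float:
    """Time for a reflection to travel to the far end and back."""
    return 2 * length_m / (C * velocity_factor) * 1e9

# S/PDIF biphase-mark cell rate at 44.1 kHz: 44100 frames/s * 32 bits
# * 2 subframes * 2 cells per bit = 44100 * 128 cells per second.
bit_cell_ns = 1e9 / (44_100 * 128)

for length in (0.15, 0.5, 1.0, 2.0):
    d = reflection_delay_ns(length)
    print(f"{length:4.2f} m cable: reflection returns after {d:6.1f} ns "
          f"({d / bit_cell_ns:.2f} of a {bit_cell_ns:.0f} ns cell)")

The shorter the cable, the sooner the reflection returns, potentially while the receiver is still resolving the edge; a somewhat longer cable pushes it past that window, which is the trade-off behind the 1m recommendation.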
 
Nov 24, 2009 at 3:14 AM Post #42 of 46
Quote:

Originally Posted by audioengr
A USB analyzer cannot measure picoseconds of jitter. I have to send the equipment off to a professional test engineer with an Audio Precision system. That system rents for more than $2K per month; the cost is astronomical. A $100K scope would do the measurements too, but not automated.

How many businesses do you think would spend $100K on a scope just so they can publish some jitter measurements?

Steve N.



I understand the cost fully (I've been in this position myself). When I say "trivial" to measure, I mean there is an existing technology (a scope) somewhere that can be used, in contrast to a "non-trivial" measurement where you basically have to invent a better technology (i.e., R&D). This is slightly beside the point, though. The data should already be available. If low jitter is the most desirable quality of a so-called "audiophile" USB cable, then the jitter measurements of the cable should already be known relative to a "standard" USB cable. Otherwise I don't see how the cable manufacturer can honestly design a cable that claims to be superior.
 
Nov 24, 2009 at 7:01 AM Post #43 of 46
Quote:

Originally Posted by ert
I understand the cost fully (I've been in this position myself).


I am aware that DBT/ABX discussions are not welcome on this forum, but let me say that there is no point in buying expensive equipment to test jitter levels until it is proven that people are able to distinguish normal and voodoo cables (later, if they could, we would test which they prefer).

BTW, in every science lab that I've worked in or visited, very expensive, precise and sensitive equipment is attached to computers using normal cables, and jitter is never a problem. Pretty strange, huh?

I suggest that manufacturers of fancy Ethernet/USB/etc. cables contact these guys - thousands of digital connections are in use there. I'd like to see their reaction to the suggestion that their Higgs boson measurements are incorrect because of the "significant" influence of jitter on data transmission through USB/Ethernet cables.
 
Nov 24, 2009 at 2:42 PM Post #44 of 46
Quote:

Originally Posted by shamu144
My point is that SPDIF is also built to spec in all consumer devices, yet those devices can still sound widely different.


My understanding is that the majority of consumer equipment does not conform to the SPDIF spec, especially with respect to proper termination.
 
Nov 25, 2009 at 5:08 AM Post #45 of 46
Quote:

Originally Posted by AdamWysokinski
BTW, in every science lab that I've worked in or visited, very expensive, precise and sensitive equipment is attached to computers using normal cables, and jitter is never a problem. Pretty strange, huh?


Not strange at all, if the signal or data being transmitted doesn't require accurate timing. We're talking about signals carrying audio data and equipment that relies on accurate timing.

Interestingly, since you mention other fields: I had a friend come to visit some months ago, and while he was trying out my rig, I was telling him about how my DAC uses an ASIC most commonly used for networking, and the supposed benefits. His comment (he is a software engineer for a telco) was that it makes sense, as you're dealing with similar issues in network data transmission, if not exactly the same ones (e.g., I don't think picosecond timing accuracy matters in network gear).
 
