Don't get why "Audiophile" USB Cable would improve sound quality
May 21, 2011 at 9:12 PM Post #167 of 835


Quote:
My only caution with these tests would be: I'd want the recording equipment to have an equal, or lower THD overall than the equipment I am measuring, otherwise I'd be wasting my time.  I'm sure I could do the same tests with my computer, but I'm also quite sure that its analogue audio system has inferior specifications compared to my audio gear.


In the test I did, the THD equals the THD that is "delivered" with the recording, as the signal never passes through any analog circuitry. 
 
The question raised in this thread was why an "audiophile" USB cable would improve the sound quality. As this question also mattered to me, I asked myself how it could be validated, went out, bought a cable, and checked it in a situation in which the only critical part is the USB cable, not the converters or any analog stage within the signal chain. I performed this test with the "red" cable recommended in a review and by the salesperson in the HiFi shop. 
 
I also want to emphasize that I went into this test unbiased. I also want to point out that I am absolutely certain that there are sound differences between certain converters, amplifiers, and cables for time-discrete digital and analog audio connections.
 
I am sorry that the outcome of my little test might not confirm the "group opinion" about esoteric USB cables, and maybe the "red", even though advertised as improving the listening experience, does not work as advertised. Maybe your precious USB cable makes a difference, and even if it only makes a difference to your interpretation of the sound, it's a personal win.
 
I am here because I like to listen to great music recordings in the best possible way my budget allows, and I just wanted to contribute my experience to this thread, which comes out of a small empirical (and reproducible) experiment. I would be more than happy to report that there are differences between "red" and "grey", and then investigate further. 
 
 
 
 
May 21, 2011 at 9:36 PM Post #168 of 835
Just to explain how "inverse summing" works inside a digital audio workstation, I prepared a couple of spectral analysis jpeg uploads:
 
these are:
 
1. original 96 kHz 24 bit material:
 

 
2. original 96 kHz 24 bit material converted to 48 kHz 24 bit:
 

 
3. original 96 kHz 24 bit material minus (= inverse summing) 48 kHz 24 bit material (upsampled to 96 kHz again to perform the summation):
 

 
4. original (but different to 1. / 2. / 3.) source audio material for "red" vs. "grey":
 

 
5. "red" minus "grey"
 

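For anyone who wants to reproduce the inverse summing without a DAW, here is a minimal numeric sketch of the same idea (assuming numpy; the arrays are simple stand-ins, not my actual captures): polarity-invert one track, sum it with the other, and look at what remains.

```python
# Minimal sketch of "inverse summing" (a null test). Assumes numpy and two
# mono float tracks at the same sample rate; these arrays are stand-ins.
import numpy as np

fs = 96000                                    # sample rate in Hz
t = np.arange(fs) / fs                        # one second
take_a = 0.5 * np.sin(2 * np.pi * 1000 * t)   # stand-in for the "red" capture
take_b = take_a.copy()                        # stand-in for the "grey" capture

residual = take_a + (-take_b)                 # invert one track, sum with the other

peak = np.max(np.abs(residual))
if peak == 0.0:
    print("complete cancellation: the two captures are bit-identical")
else:
    print(f"residual peak: {20 * np.log10(peak):.1f} dBFS")
```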
 
May 21, 2011 at 9:59 PM Post #169 of 835
Quote:
A difference in jitter would definitely be recognized, because the amount of jitter is defined by how the USB stream is received by the DAC. I can't seem to understand how noise levels should, in any way, interfere when we have coherent, 100% similar signals. If one of the USB streams included interference, "noise" or "jitter", "red" and "grey" wouldn't be identical. 
 
I think you are mixing up problems with the transportation of time-discrete (S/PDiF, AES/EBU) digital signals, which are totally different to the USB stream feeding the audio interface. This is one of the reasons why a hardware "buffer" exists. Of course, when talking about analog signals, you are totally right with induced noise, interferences etc.
 
These terms also come into play when we're talking about the quality of a DAC and the related analog output stage, but definitely not in regards to USB streams.
 

 
1/ Did you forget that you have no DAC in your fully digital chain? I maintain that a difference in jitter in the audio clock built upon the USB stream wouldn't be recognized. Think about it again: the USB receiver receives the data stream, converts it to spdif (in terms of bits, this conversion can be absolutely bit-perfect; it depends on your drivers and hardware) and then sends it out at a speed derived from a clock recovered through a PLL from the timing of the packets of the USB stream (in isochronous and adaptive modes). Two cases now:
- If the spdif stream goes into a digital input and is kept in the digital realm, the quality of that clock is pretty much irrelevant, as long as we can read all the bits in the right order and put them in a big buffer (your HDD in the end). Which is what you demonstrated.
- If it goes into a spdif to I2S receiver and the I2S into a DAC IC (typical use), the quality of that clock will in turn affect the quality of the clock recovered by the spdif receiver and thus finally the digital to analog conversion.
 
The same reasoning applies if the USB receiver is linked straight to a DAC, minus the passage through SPDIF. If the USB audio stream is in asynchronous mode, however, then the jitter of the recovered clock is reduced to the inherent jitter of the clock inside the USB peripheral, and the USB cable cannot influence it.
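To make that concrete, here is a rough sketch (my own illustrative numbers, not a measurement of any particular device). The stored sample values are identical in both cases; only the instants at which they are converted differ, and that is exactly the part a digital loopback never sees.

```python
# Same bits, different conversion clock. Illustrative numbers only.
import numpy as np

fs = 96000
f = 10000.0                                    # 10 kHz tone
n = np.arange(4096)
rng = np.random.default_rng(0)

ideal_times = n / fs                           # perfect conversion instants
jittered_times = ideal_times + rng.normal(0.0, 3e-9, n.size)  # ~3 ns RMS jitter (assumed)

ideal_out = np.sin(2 * np.pi * f * ideal_times)        # what the DAC should output
jittered_out = np.sin(2 * np.pi * f * jittered_times)  # what a jittered clock produces

err = jittered_out - ideal_out
print(f"RMS error caused by jitter: {20 * np.log10(np.sqrt(np.mean(err ** 2))):.1f} dB re full scale")
# The stored sample values are unchanged either way, which is why a purely
# digital capture (the USB loopback test) cannot show this effect.
```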
 
2/ As for noise and interference: a fully digital system like yours is quite robust in that respect; you'd need extreme levels to get an error. But a mixed-signal system like a DAC is much more sensitive to noise. Don't forget that your USB cable is a two-function cable: data and power. If external HF interference can get into the cable ground or power lines, if 1 kHz spikes can couple from the data lines onto those same lines, etc., then this noise in the power lines will affect (to an extent defined by the design and implementation of the USB device) the proper operation of the DAC IC and of the following analog stage. Your test cannot account for those factors.
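As a purely illustrative model (made-up figures; what actually happens depends on the device's supply rejection, regulators and layout), here is one way ripple reaching the reference/supply of a DAC can end up in the analog output even though every bit arrives intact:

```python
# Illustrative model only: a DAC output scaled by a noisy reference voltage.
import numpy as np

fs = 96000
t = np.arange(fs) / fs
signal = 0.5 * np.sin(2 * np.pi * 1000 * t)     # intended 1 kHz output

# Assume 1 mV of 8 kHz ripple riding on a 3.3 V reference (made-up figures).
ripple = 1e-3 * np.sin(2 * np.pi * 8000 * t)
dirty = signal * (3.3 + ripple) / 3.3           # reference modulated by the ripple

residual = dirty - signal
print(f"spur level: {20 * np.log10(np.max(np.abs(residual))):.1f} dBFS")
# Roughly -76 dBFS with these numbers; whether anything like this happens in a
# given device depends entirely on its supply rejection and layout.
```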
 
3/ Actually, you don't have much in terms of hardware buffer in your typical USB audio playback devices. Not much more than in SPDIF receivers. Thank accountants who don't like expensive die area tied up for buffers and engineers who don't like latency for that. 
 
May 21, 2011 at 10:03 PM Post #170 of 835
A great test you did there!  Now, I wonder how this would work if the audio was converted to analog, then recorded again with professional recording equipment?  I'm guessing it'd be exactly the same, but I think the test should be done regardless.  Still, this doesn't seem like it belongs in the cable forum anymore, but I guess that's up to the mods.
 
May 21, 2011 at 10:09 PM Post #171 of 835
Man, all this testing just wrinkles my nose hairs! No matter the facts, I am keeping my fancy USB cable, as a plain grey cable (without pin-stripes, even) would simply destroy the color scheme of my audio system.

 
(...and from the great beyond, “Dandy” Don Meredith is heard singing: “Turn Off The Lights, The Party’s Over.”)
 
May 21, 2011 at 10:45 PM Post #172 of 835


Quote:
No test tone was sent; I used complex waveforms (jazz music, to be more precise: a piece from a David Sanborn album).
 
The S/PDIF in of my computer wasn't used, signal went from S/PDIF Mbox2 out -> S/PDIF Mbox2 in, USB used both ways.
 
re: 1. - I do not know what you mean by "resolving", as the signal never leaves the digital domain.
re: 2. - "red" is marketed as audiophile cable, reported in the review to "improve the listening experience".
re: 3. - explained
re: 4. - The coaxial cable is a rather expensive, quality cable used in studios all over the world. Even IF the coaxial cable were of lower quality, chances are that any "degradation" would affect both signals the same way, so a difference between "red" and "grey" would still show up.
 
 
 


Very interesting experiment!
I can somehow understand, psychologically, why people want to purchase seemingly way-overpriced digital cables (USB, HDMI): they might think a super-cheap no-name cable simply does not match their $$$$$ investment in the system. It seems like a placebo effect to me. You FEEL that the high-end USB cable makes a difference because you KNOW it carries a big price tag. Maybe a blind A-B test is the most reliable and convincing approach.
 
 
May 21, 2011 at 10:58 PM Post #174 of 835
Well, I'll stick to my view that aftermarket USB cables do make a difference in sound quality, not by improving the sound itself but by eliminating issues like crackling and drop-outs, for which the (double) ferrites do seem to work wonders. So I find that said cables work for correction, not improvement. But it's merely my opinion, it's what works for me and what I've seen for a fact, and everyone else is entitled to their own opinion.
 
May 21, 2011 at 11:21 PM Post #175 of 835
Alright, now you really made me go down to the control room and do it DAC - ADC.
 
The reason I wanted to avoid this is that it might be a bit misleading for some people, but I guess the measurements speak for themselves. Please do not forget that this now happens in the analog realm as well as in the digital one: while the digital realm can be sample-precise with complete phase cancellation, as mentioned above, in the analog realm a summed peak level of, let's say, -55 to -62 dB can already be considered "phase cancellation". This means that in the analog realm a tiny bit of very low-level (barely audible) information will always remain, while in the digital realm nothing remains from the same process. 
 
To demonstrate that neither the cheapo USB cable "grey" nor the "improved listening experience" cable "red" has any advantage or disadvantage over the other, I now did the following:
 
a. Connect high quality analog audio cable to LINE OUT of the audio interface
b. Connect other end of high quality analog cable to LINE IN of the audio interface
c. Connect computer & audio interface via USB cable "red"
d. Play back a complex waveform (Jazz Music, David Sanborn - what a great artist) through DAC, re-record at the same time via ADC, repeat with "grey"
e. Repeat again with "red" to demonstrate slight deviation of converters and analog circuitry
f. compare all recordings via phase inverse summing 
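One note on step f.: unlike the pure USB captures, the analog round trips may come back shifted by a few samples of converter latency, so the takes have to be lined up before the inverse sum. A minimal sketch of that comparison (numpy only; the arrays below are synthetic stand-ins for my actual recordings, and the helper name is my own):

```python
# Sketch of step f. for the analog loopback, assuming two mono float arrays
# at the same sample rate (the names are my own, not from any library).
import numpy as np

def align_and_null(a, b, probe=8192):
    """Line up b with a via cross-correlation, then return the null-test
    residual peak in dBFS (phase-inverse summing = subtraction)."""
    corr = np.correlate(a[:probe], b[:probe], mode="full")
    lag = int(np.argmax(corr)) - (probe - 1)     # how far a lags behind b
    b_aligned = np.roll(b, lag)
    n = min(len(a), len(b_aligned))
    residual = a[:n] - b_aligned[:n]
    return 20 * np.log10(np.max(np.abs(residual)) + 1e-12)

# Synthetic stand-ins for the two recorded takes:
fs = 48000
t = np.arange(2 * fs) / fs
take_red = 0.4 * np.sin(2 * np.pi * 440 * t)
take_grey = np.roll(take_red, 37)                # same audio, 37 samples later
take_grey = take_grey + 1e-4 * np.random.default_rng(1).normal(size=t.size)

print(f"null residual peak: {align_and_null(take_red, take_grey):.1f} dBFS")
```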
 
This leads to the following results via spectral analysis:
 
1. "red" take 01 minus "grey"
 

 
2. "red" take 02 minus "grey"
 

 
3. "red" take 01 minus "red" take 02
 

 
 
My conclusion:
 
- Neither "red" nor "grey" is the better USB audio cable; "red" still looks better, and "grey" is still exactly twice as long and way thinner than "red".
 
- The Mbox2 is really not the best audio interface in the world, but I won't compare it to my Nuforce (in the sense of a review).
 
Please stop thinking that your computer or the USB stream clocks your DAC. I'll give you an example: I can set my NuForce uDAC2 to any sample rate I want and it will still play back the music without any drops, clicks or pops. For example, I can put on a 96 kHz 24 bit WAV, switch the NuForce to 44.1 kHz and 16 bit, and have the unique experience of how it would sound back in the 90s on CD. Try the same thing with a 48 kHz 24 bit DAT via a digital S/PDIF connection to a device that only talks 44.1 kHz 16 bit and experience what real digital errors sound like (for example, clicks & plops @ 0 dBFS).
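Roughly, this is the job the sample rate converter somewhere in the playback chain has to do when I force 96 kHz / 24 bit material down to 44.1 kHz / 16 bit. A minimal sketch (assuming scipy's polyphase resampler, with a test tone standing in for the WAV; a real converter would also dither):

```python
# Sketch: reduce 96 kHz / 24 bit material to 44.1 kHz / 16 bit, roughly what a
# sample rate converter in the playback chain has to do. Test tone as stand-in.
import numpy as np
from scipy.signal import resample_poly

fs_in = 96000
t = np.arange(fs_in) / fs_in
x96 = 0.5 * np.sin(2 * np.pi * 1000 * t)         # stand-in for the 96/24 WAV

# 44100 / 96000 reduces to 147 / 320, so a polyphase resampler handles it exactly.
x44 = resample_poly(x96, 147, 320)

# Requantize to 16 bit (plain rounding here; a real converter would add dither).
x44_16 = np.round(x44 * 32767.0) / 32767.0

print(f"{len(x96)} samples at 96 kHz -> {len(x44_16)} samples at 44.1 kHz")
```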
 
Cheers!
 
 
 
 
May 21, 2011 at 11:28 PM Post #176 of 835

Quote:
A great test you did there!  Now, I wonder how this would work if the audio was converted to analog, then recorded again with professional recording equipment?  I'm guessing it'd be exactly the same, but I think the test should be done regardless.  Still, this doesn't seem like it belongs in the cable forum anymore, but I guess that's up to the mods.


Quote:
Very interesting experiment!
I can somehow understand, psychologically, why people want to purchase seemingly way-overpriced digital cables (USB, HDMI): they might think a super-cheap no-name cable simply does not match their $$$$$ investment in the system. It seems like a placebo effect to me. You FEEL that the high-end USB cable makes a difference because you KNOW it carries a big price tag. Maybe a blind A-B test is the most reliable and convincing approach.
 


Yep, I quickly did that test. Blind A-B ain't sufficient, as chances are always 50-50. I know people who won the lottery (not me though). One could get it "wrong" or "right" by pure luck, at least with regard to "red" 

 
 
May 22, 2011 at 12:24 AM Post #177 of 835
Please remember that DBT/blind test discussion isn't permitted in the cable forum (due to the result inevitably being that sane conversation becomes impossible soon after it is brought up).

Back to the discussion, I can only imagine a different USB cable being of benefit where the receiving DAC has a digital input circuit that is sensitive to differences in the waveform of the signal it receives. However, in such a case (and for myself recently) I'm far more inclined to spend the money either on a better DAC without these issues or on a high-quality USB to S/PDIF converter.

Vandaven: Good on you for taking some measurements. My earlier post was to point out that people here often tend to assume that a single measurement is conclusive in what it shows, and don't consider (as a genuine scientist would) all the possible factors that may influence the results.
 
May 22, 2011 at 6:27 AM Post #178 of 835
First, thanks for taking the time to do another round of tests. You've nicely shown that any artifact due to the cable in your system is below -80 dB or so (having had a quick look at the specs of your equipment) and thus likely inaudible.
 
Quote:
Originally Posted by vandaven
 
Please stop thinking that your computer or the USB stream clocks your DAC. I'll give you an example: I can set my NuForce uDAC2 to any sample rate I want and it will still play back the music without any drops, clicks or pops. For example, I can put on a 96 kHz 24 bit WAV, switch the NuForce to 44.1 kHz and 16 bit, and have the unique experience of how it would sound back in the 90s on CD. Try the same thing with a 48 kHz 24 bit DAT via a digital S/PDIF connection to a device that only talks 44.1 kHz 16 bit and experience what real digital errors sound like (for example, clicks & plops @ 0 dBFS).
 

 
What you suggest doesn't prove anything; it depends on your whole DAC implementation. According to the literature I've found on the uDAC2, it includes a sample rate converter (buried in the ESS DAC chip they use; it's their "jitter reduction" feature). The USB receiver gets the 24/96 USB stream from the computer and outputs a 24/96 I2S stream, and the SRC transforms it into 16/44.1 I2S before feeding the DAC section. If the sample rate converter is asynchronous (very likely, as it's the current fashion), the clock feeding the DAC has nothing left to do with the USB stream. I can do the same with S/PDIF; ASRCs are wonderful little beasts.
 
If you take a simpler DAC, however, you don't have that SRC. I should know: I've built USB DACs based on both clocking schemes (and not from kits). Unlike S/PDIF, you have many different ways to implement USB audio.
 
Please read http://www.audioasylum.com/forums/pcaudio/messages/7719.html and http://www.eetimes.com/design/audio-design/4009467/The-D-A-diaries-A-personal-memoir-of-engineering-heartache-and-triumph
 
To put things in perspective: a commonly measured jitter figure for the audio clock of the widespread PCM2704-7 series of adaptive USB receivers is around 3 ns (it seems it can go down to 300 ps when special care is taken in the implementation).
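Some back-of-the-envelope arithmetic to place those figures (my own rough numbers, not measurements): for a full-scale sine the worst-case amplitude error caused by a timing error t_j is about 2·pi·f·t_j.

```python
# Back-of-the-envelope: worst-case amplitude error from clock jitter on a
# full-scale sine is about 2*pi*f*tj (relative to full scale).
import math

f = 20000.0                                   # worst case in the audio band, Hz
for tj in (3e-9, 300e-12):                    # the 3 ns and 300 ps figures above
    err = 2 * math.pi * f * tj
    print(f"tj = {tj * 1e12:6.0f} ps -> {20 * math.log10(err):6.1f} dB below full scale")
# About -68 dB and -88 dB; an ideal 16-bit converter's quantization noise sits near -98 dB.
```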
 
May 22, 2011 at 2:12 PM Post #180 of 835


Quote:
First, thanks for taking the time to do another round of tests. You've nicely shown that any artifact due to the cable in your system is below -80 dB or so (having had a quick look at the specs of your equipment) and thus likely inaudible.
 
 
What you suggest doesn't prove anything; it depends on your whole DAC implementation. According to the literature I've found on the uDAC2, it includes a sample rate converter (buried in the ESS DAC chip they use; it's their "jitter reduction" feature). The USB receiver gets the 24/96 USB stream from the computer and outputs a 24/96 I2S stream, and the SRC transforms it into 16/44.1 I2S before feeding the DAC section. If the sample rate converter is asynchronous (very likely, as it's the current fashion), the clock feeding the DAC has nothing left to do with the USB stream. I can do the same with S/PDIF; ASRCs are wonderful little beasts.
 
If you take a simpler DAC, however, you don't have that SRC. I should know: I've built USB DACs based on both clocking schemes (and not from kits). Unlike S/PDIF, you have many different ways to implement USB audio.
 
Please read http://www.audioasylum.com/forums/pcaudio/messages/7719.html and http://www.eetimes.com/design/audio-design/4009467/The-D-A-diaries-A-personal-memoir-of-engineering-heartache-and-triumph
 
To put things in perspective: a commonly measured jitter figure for the audio clock of the widespread PCM2704-7 series of adaptive USB receivers is around 3 ns (it seems it can go down to 300 ps when special care is taken in the implementation).




This is about the alleged effects of the cable; you are talking about the sender and receiver and how they interact. By criticising the cable you are shooting the messenger, which has nothing to do with what else is going on.
 
