jackmccabe
500+ Head-Fier · Joined Feb 10, 2010 · Posts: 557 · Likes: 18
I will perform a few of these tests with my EMU 1820m when I can.
My only caution with these tests would be: I'd want the recording equipment to have an equal or lower THD overall than the equipment I am measuring, otherwise I'd be wasting my time. I'm sure I could do the same tests with my computer, but I'm also quite sure that its analog audio system has inferior specifications compared to my audio gear.
A difference in jitter would definitely show up, because the amount of jitter is determined by how the USB stream is received by the DAC. I can't see how noise levels could interfere in any way when we have coherent, 100% identical signals. If one of the USB streams had picked up interference, "noise", or "jitter", "red" and "grey" would not be identical.
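If the two transfers really delivered identical bits, a sample-level comparison of the two captures settles it. A minimal sketch, assuming the "red" and "grey" captures were saved as WAV files (the file format and paths are my assumptions, not part of the original test):

```python
import wave

def bit_identical(path_a: str, path_b: str) -> bool:
    """Compare two WAV captures sample-for-sample in the digital domain.

    Returns True only if format (channels, sample width, rate, frame
    count) and every audio byte match exactly.
    """
    with wave.open(path_a, "rb") as a, wave.open(path_b, "rb") as b:
        # getparams(): (nchannels, sampwidth, framerate, nframes, ...)
        if a.getparams()[:4] != b.getparams()[:4]:
            return False
        return a.readframes(a.getnframes()) == b.readframes(b.getnframes())
```

A single flipped bit anywhere in the stream makes this return False, which is the whole point: digital captures are either identical or they are not.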
I think you are mixing this up with problems in the transport of time-discrete digital signals (S/PDIF, AES/EBU), which are totally different from the USB stream feeding the audio interface. This is one of the reasons a hardware buffer exists. Of course, when talking about analog signals, you are completely right about induced noise, interference, etc.
These terms also come into play when we're talking about the quality of a DAC and the related analog output stage, but definitely not in regards to USB streams.
No test tone was sent; I used complex waveforms (jazz music, to be precise: a piece from a David Sanborn album).
The S/PDIF in of my computer wasn't used, signal went from S/PDIF Mbox2 out -> S/PDIF Mbox2 in, USB used both ways.
re: 1. - I do not know what you mean by "resolving", as the signal never leaves the digital domain.
re: 2. - "red" is marketed as audiophile cable, reported in the review to "improve the listening experience".
re: 3. - explained
re: 4. - The coaxial cable is a rather expensive, quality cable used in studios all over the world. Even if the coaxial cable were of lower quality, chances are that any "degradation" would affect both signals the same way, so there would still be a difference between "red" and "grey".
A great test you did there! Now, I wonder how this would work if the audio was converted to analog, then recorded again with professional recording equipment? I'm guessing it'd be exactly the same, but I think the test should be done regardless. Still, this doesn't seem like it belongs in the cable forum anymore, but I guess that's up to the mods.
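For the analog round-trip version suggested above, the usual approach is a null test: time-align the two recordings, subtract one from the other, and measure the residual level. A rough sketch, assuming both captures are already sample-aligned and normalized to floats in [-1, 1] (alignment and normalization are left out here):

```python
import math

def residual_dbfs(ref, test):
    """Null test: subtract two aligned captures and report the RMS of
    the difference relative to digital full scale, in dBFS.

    ref, test: equal-length sequences of floats in [-1.0, 1.0].
    Identical captures null perfectly and return -inf.
    """
    diff = [r - t for r, t in zip(ref, test)]
    rms = math.sqrt(sum(d * d for d in diff) / len(diff))
    return -math.inf if rms == 0.0 else 20 * math.log10(rms)
```

A residual around -80 dBFS or lower would sit near the noise floor of typical prosumer converters, so any cable-induced difference above that would stand out clearly.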
Very interesting experiment!
I can somehow understand, psychologically, why people buy seemingly way-overpriced digital cables (USB, HDMI): they might feel that a super-cheap no-name cable simply doesn't match their $$$$$ investment in the system. It seems like a placebo effect to me. You FEEL that the high-end USB cables make a difference because you KNOW they carry a big price tag. Maybe a blind A/B test is the most reliable and convincing approach.
Originally Posted by vandaven
Please stop thinking that your computer or the USB stream clocks your DAC. An example: I can set my NuForce uDAC2 to any sample rate I want and it will still play back the music without any drops, clicks or pops. For instance, I can put on a 96 kHz / 24-bit WAV, switch the NuForce to 44.1 kHz / 16-bit, and get the unique experience of how it would have sounded back in the 90s on CD. Try the same thing with a 48 kHz / 24-bit DAT via digital S/PDIF connected to a device that only speaks 44.1 kHz / 16-bit, and hear what real digital errors sound like (clicks and pops at 0 dBFS, for example).
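For what it's worth, here is the arithmetic of what would happen if the DAC clock naively followed the set rate with no sample rate conversion anywhere in the path. This is a hypothetical illustration, not what the uDAC2 actually does (it resamples):

```python
import math

# A 96 kHz stream clocked out at 44.1 kHz with no resampling would
# simply play every sample slower: longer duration, lower pitch.
source_rate = 96_000   # Hz, rate the file was recorded at
dac_rate = 44_100      # Hz, rate the converter is actually running at

stretch = source_rate / dac_rate          # how much longer playback takes
pitch_drop = 12 * math.log2(stretch)      # pitch shift in semitones

print(f"{stretch:.2f}x slower, {pitch_drop:.1f} semitones lower")
```

The fact that none of this happens on the uDAC2 is exactly the evidence that a rate converter sits between the USB receiver and the converter clock.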
Please remember that DBT/blind test discussion isn't permitted in the cable forum (due to the result inevitably being that sane conversation becomes impossible soon after it is brought up).
First, thanks for taking the time to do another round of tests. You've nicely shown that any artifact due to the cable in your system is below -80 dB or so (after a quick look at the specs of your equipment) and thus likely inaudible.
What you suggest doesn't prove anything; it depends on your whole DAC implementation. According to the literature I've found on the uDAC2, it includes a sample rate converter (buried in the ESS DAC chip they use; it's their "jitter reduction" feature). The USB receiver takes the 24/96 USB stream from the computer and outputs a 24/96 I2S stream, and the SRC transforms it into 16/44.1 I2S before feeding the DAC section. If the sample rate converter is asynchronous (very likely, as that's the current fashion), the clock feeding the DAC no longer has anything to do with the USB stream. I can do the same with S/PDIF; ASRCs are wonderful little beasts.
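The decoupling described above can be sketched with a toy asynchronous resampler. Real ASRC chips use polyphase filter banks rather than linear interpolation, so this is only an illustration of the principle: the output clock is free-running and owes nothing to the incoming stream's clock.

```python
def asrc_linear(samples, in_rate, out_rate):
    """Toy asynchronous sample rate converter via linear interpolation.

    The output side advances at its own rate (out_rate) and simply
    interpolates between whatever input samples (at in_rate) are
    available -- the two clocks never need to be related.
    """
    out = []
    step = in_rate / out_rate   # input samples consumed per output sample
    pos = 0.0                   # fractional read position in the input
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += step
    return out
```

Upsampling a ramp by 2x interpolates midpoints exactly, which makes the behavior easy to eyeball.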
If you take a simpler DAC, however, you don't have that SRC. I should know; I've built USB DACs based on both clocking schemes (and not from kits). Unlike S/PDIF, there are many different ways to implement USB audio.
Please read http://www.audioasylum.com/forums/pcaudio/messages/7719.html and http://www.eetimes.com/design/audio-design/4009467/The-D-A-diaries-A-personal-memoir-of-engineering-heartache-and-triumph
To put things in perspective: a commonly measured jitter figure for the audio clock of the widespread PCM2704-7 series of adaptive USB receivers is around 3 ns (it seems it can go down to 300 ps when special care is taken in the implementation).
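Those jitter figures can be put in audible terms with the standard rule of thumb for sampling a full-scale sine at frequency f with RMS clock jitter tj: SNR = -20·log10(2π·f·tj). This is a worst-case bound, not a measurement of any particular device:

```python
import math

def jitter_snr_db(f_hz, jitter_s):
    """Worst-case SNR limit imposed by RMS sampling-clock jitter on a
    full-scale sinusoid at f_hz: SNR = -20*log10(2*pi*f*tj)."""
    return -20 * math.log10(2 * math.pi * f_hz * jitter_s)

# 3 ns at 20 kHz vs. the 300 ps best case quoted above
print(f"3 ns:   {jitter_snr_db(20_000, 3e-9):.1f} dB")
print(f"300 ps: {jitter_snr_db(20_000, 300e-12):.1f} dB")
```

At 3 ns the jitter floor lands around 68 dB at 20 kHz, which is comfortably below 16-bit resolution (~96 dB) only at lower frequencies; at 300 ps it improves by 20 dB, which is why implementation care matters.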