Oh, Cardas says resonance is a problem in audio cables. I'd better believe them and disregard the truth then, since they know everything about cables and would never hide evidence showing that their cables have no meaningful advantage over other cables. They supposedly ring like a bell...
http://www.audioholics.com/education/cables/debunking-the-myth-of-speaker-cable-resonance
Let's do a little LC calculation (R is negligible) of the resonance of an interconnect. Let's say you're using
this one, since they give the capacitance and inductance per foot. Let's also say we're using a long, four-foot interconnect. Run the numbers through the LC resonance frequency equation, f = 1/(2π√(LC)), for a four-foot length, assuming the effect of the plugs is negligible since there's no mention of their performance (perhaps not an entirely sound assumption, but the plugs are tiny compared to four feet of cable). You'll find that the resonance frequency for that particular interconnect is about 25 MHz... Not even close to audible frequencies.
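Here's a minimal sketch of that calculation in Python. Since the linked cable's actual specs aren't reproduced here, the per-foot values below are illustrative assumptions in the typical range for interconnects; plug in the real datasheet numbers to reproduce the exact figure.

```python
import math

# Illustrative per-foot values for a typical interconnect (assumptions,
# not the linked cable's actual datasheet figures).
L_PER_FT = 0.15e-6   # henries per foot
C_PER_FT = 17e-12    # farads per foot
LENGTH_FT = 4

L = L_PER_FT * LENGTH_FT
C = C_PER_FT * LENGTH_FT

# Self-resonant frequency of a (lossless) LC section: f = 1 / (2*pi*sqrt(L*C))
f_res = 1 / (2 * math.pi * math.sqrt(L * C))
print(f"Resonance: {f_res / 1e6:.1f} MHz")   # ~25 MHz, far above the audio band
```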
About that Cardas link - that's a really lackluster description of the procedure (there's no way anyone could exactly duplicate what he did, which is absolutely vital if you expect anyone to take you seriously) - and is he really comparing the noise response of two cables fed with different input signals? Someone who sells and "tests" cables should at least be able to output the same signal into two different cables at the same time (heck, send a mono signal down one of each cable) and record the exact same waveform for both. He doesn't show any sort of scale either, so it's impossible to tell how large the effect is or whether it means anything at all. Of course, this is coming from a cable manufacturer, so who knows how biased the procedure and results are...
As for measuring the SNR/THD/FR of cables by running the cable from the analog output of a DAC to the input of an ADC - explain how that is an invalid method of testing. You're subjecting the cable to the exact same conditions it sees in actual use, assuming the input impedance of the ADC is similar to that of amps - a fair assumption, since every analog input today (preamp, amp, or ADC) has a high-impedance input to facilitate voltage bridging. The performance parameters will therefore be the same as in real usage with an amp.
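For what it's worth, here's a minimal sketch of the analysis end of such a loopback test (Python with numpy). The capture - say, a 1 kHz tone played through the cable and recorded by the ADC - is assumed to already be loaded as an array; the function name and the synthetic example are mine, not from any particular measurement suite.

```python
import numpy as np

def thd_n_db(capture, fs, f0):
    """THD+N of a looped-back sine: everything outside the fundamental
    (and DC) is counted as distortion plus noise."""
    win = np.hanning(len(capture))
    power = np.abs(np.fft.rfft(capture * win)) ** 2
    freqs = np.fft.rfftfreq(len(capture), 1.0 / fs)

    bin_width = fs / len(capture)
    is_fund = np.abs(freqs - f0) < 4 * bin_width  # allow for window leakage
    not_dc = freqs > 2 * bin_width                # skip DC and its skirt

    fund_power = power[is_fund & not_dc].sum()
    resid_power = power[~is_fund & not_dc].sum()
    return 10 * np.log10(resid_power / fund_power)

# Example: a synthetic 1 kHz tone plus a little noise, 5 s at 48 kHz.
fs = 48_000
t = np.arange(5 * fs) / fs
tone = np.sin(2 * np.pi * 1000 * t) + 1e-4 * np.random.randn(len(t))
print(f"THD+N: {thd_n_db(tone, fs, 1000):.1f} dB")  # roughly -77 dB for this input
```

Run the same analysis on the capture with and without the cable under test in the chain, and any degradation the cable causes shows up directly in the numbers.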
Quote:
Then you offer to capture the analog outputs of your DAC on a cheap USB soundcard (your Roland thingie?) in order to prove that jitter doesn't exist? Did I get this right? Many recent DACs use ASRC because resampling lowers jitter drastically... this is exactly what you intend to do. This is not gonna work, and you will tell me that they all sound the same to you... yada yada
Anyway, you guys seem pretty happy w/ the gear you own, and all the cables sound the same too...how is that not a good thing?
Do you understand how measurement works? Let's assume his ADC produces a whopping 1 ns of jitter. Let's also say he were to test artificially generated jitter ranging from 1000 ns down to 50 ns, including blind testing between 100 ns and 50 ns of simulated jitter.
Now I'm not an expert on summing noise, but for argument's sake let's say that when you run a signal through successive sources of jitter (the simulation, then the DAC), you simply add them together to get the total. So the 50 ns sample would carry 50 ns + 1 ns = 51 ns of jitter, and the 100 ns sample 101 ns. A control sample would carry just 1 ns of jitter, of course, since nothing is synthesized. (In reality, uncorrelated jitter sources combine in quadrature - √(50² + 1²) ≈ 50.01 ns - so straight addition is actually the worst case for this argument.)
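Here's that back-of-the-envelope summation in code (Python), showing both the straight addition used above and the quadrature (RMS) sum that uncorrelated sources actually follow:

```python
import math

ADC_JITTER_NS = 1.0  # the hypothetical ADC's own jitter

for simulated_ns in (0.0, 50.0, 100.0):            # control, low, high
    linear = simulated_ns + ADC_JITTER_NS          # worst case: straight sum
    rms = math.hypot(simulated_ns, ADC_JITTER_NS)  # uncorrelated sources
    print(f"simulated {simulated_ns:5.1f} ns -> "
          f"linear {linear:6.2f} ns, quadrature {rms:6.2f} ns")
```

Either way, the 1 ns contribution barely moves the totals, and it shifts both test conditions by the same amount, so it can't mask the 50 ns gap between them.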
If a DAC with 1 ns of jitter would impact the results of this test in any meaningful way - please explain how. If the DAC's jitter is somehow covering up the induced jitter, explain how.
Quote:
And to get back to jitter audibility:
http://www.avguide.com/forums/jitter-audibility-robert-harley-and-keith-johnson-comment
"Quote from the text linked above "Experiments were carried out in the listening booth or studio that each listener had offered. The examiner only brought there a personal computer with a digital audio interface and a mouse and each listener provided his or her favorite DAC, amplifiers and loudspeakers."
Knowing what we know about USB DACs and interfaces between portable computers and DACs, it's no wonder that jitter differences were masked by this inferior and highly variable interface methodology. Also these tests were done in 2005 when the quality of USB DACs was far inferior to today."
Do the same studies w/ the 46ps spec'ed $189 TC Konnekt 6, and I'll be more willing to take their results into serious consideration.
Well, that's all fine and dandy that some random guy from a forum says that. But how exactly did the USB-to-S/PDIF interface impact the performance? Was its jitter too high? What was its jitter? Of course, they don't say what interface was used, but are there even any USB-to-S/PDIF converters in existence with jitter high enough to make the difference between 500 ns and 250 ns insignificant? What was the jitter of the DACs involved? I'd hope that professionals in the industry are using DACs with jitter low enough not to mask the difference between 500 ns and 250 ns of jitter - and again, are there any DACs that bad in commercial production? I feel like half the "audiophiles" on the planet would have a heart attack if there were a DAC sold as a professional or hi-fi product with, say, 100 ns of jitter...

Yes, I agree that more details of that study would help establish whether its conclusions are valid. But if jitter at those levels was undetectable, I don't see how any even remotely competent USB-to-S/PDIF converter or DAC could possibly undermine the study. The difference between, say, 46 ps and 1 ns (or even a bad 14 ns) in a remotely competent DAC is entirely irrelevant when you're comparing 500 ns to 250 ns, and will have no impact on the results.
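To put those numbers in perspective, the standard rule of thumb for the SNR ceiling imposed by RMS sampling jitter on a full-scale sine is SNR = -20·log10(2π·f·t_j). A quick sketch (Python) using the figures from this thread; the labels are mine:

```python
import math

def jitter_limited_snr_db(f_hz, t_jitter_s):
    """Worst-case SNR of a full-scale sine at f_hz given RMS clock jitter."""
    return -20 * math.log10(2 * math.pi * f_hz * t_jitter_s)

F_TEST = 20_000  # Hz, worst case at the top of the audio band
for label, tj in [("46 ps (TC Konnekt 6 spec)", 46e-12),
                  ("1 ns (hypothetical ADC)", 1e-9),
                  ("250 ns (test condition)", 250e-9),
                  ("500 ns (test condition)", 500e-9)]:
    print(f"{label:28s} -> {jitter_limited_snr_db(F_TEST, tj):6.1f} dB")
```

The 46 ps and 1 ns cases come out around 105 dB and 78 dB respectively, while the 250 ns and 500 ns test conditions sit down around 30 dB and 24 dB - some 50 to 80 dB worse. The measurement chain's own jitter floor is nowhere near large enough to mask the difference being tested.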