Global clocks and jitter are all about marketing.
^This!
Originally Posted by Roseval
[1] Some say 20 ps is the audible threshold.
[2] I do think it is obvious that improving the clocking improves jitter performance.
1. Yes, people say all sorts of things, but I've never heard of a reliable test on actual commercial content that demonstrates an audible threshold below 20 ns.
2. Not really, not if you are talking about the clocks themselves. Processing of the clock signal has advanced significantly in the last 20 years or so, and while manufacturers employ various methods of processing this signal, they all, AFAIK, employ at least one. Improving the clock itself is therefore largely irrelevant; an ADC or DAC could easily have a relatively poor clock compared to another ADC or DAC but still have better jitter performance, depending on how that timing signal is processed after it leaves the clock.
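To put some numbers on point 1: the worst-case error from sampling-clock jitter on a sine wave follows from a first-order Taylor expansion, e(t) = A·sin(2πf(t+Δt)) − A·sin(2πft) ≈ A·2πf·Δt, so the jitter noise relative to a full-scale tone is roughly 2πf·σ_jitter. A quick back-of-envelope sketch (the 10 kHz test tone is my choice, picked as a near-worst case within the audible band):

```python
import math

def jitter_noise_db(freq_hz, rms_jitter_s):
    """Approximate jitter-induced noise level in dB relative to a
    full-scale sine at freq_hz, using e_rms/A ~= 2*pi*f*sigma_jitter."""
    return 20 * math.log10(2 * math.pi * freq_hz * rms_jitter_s)

# 20 ps of RMS jitter on a 10 kHz tone:
print(round(jitter_noise_db(10_000, 20e-12)))  # about -118 dB
# 20 ns of RMS jitter on the same tone:
print(round(jitter_noise_db(10_000, 20e-9)))   # about -58 dB
```

In other words, 20 ps puts the jitter products around -118 dB, way below anything audible on commercial content, while 20 ns lands near -58 dB, which is at least in the region where audibility becomes plausible. That is consistent with the thresholds being argued about above.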
I was instantly informed how science couldn't be discussed in the thread.
Yep, there's no point using logic like: jitter is defined by science, so any rational discussion about it must include science. The problem, as ralphp effectively mentioned, is that the science does not support the marketing or those duped by it. The only solution to this problem is therefore to eliminate science!
So is there any science that supports audible differences based on global clocks?
There is certainly evidence that using an external clock can create audible differences. We are talking about very low level differences though, near the threshold of audibility. Crucial to any discussion about global or master clocks is the nature of those differences, and this is where the marketing and the science head off in completely opposite directions! Given optimal processing of the external clock signal by the receiving ADC or DAC, there will be NO difference between running it on its own internal clock signal or an external one. However, few ADCs and DACs process the external clock signal optimally, in which case its performance will be poorer with an external clock signal, possibly audibly poorer.
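The reason the internal clock usually wins is that the receiving device's PLL acts as a low-pass filter on the incoming clock's jitter: jitter inside the loop bandwidth passes through, faster jitter gets averaged out. Here's a toy simulation of that idea; the one-pole filter, the 100 ps input jitter, and the 1 kHz loop bandwidth are all made-up illustrative figures, not any particular chip's design:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1_000_000        # notional rate of clock edges (one jitter sample per edge)
sigma_in = 100e-12    # assumed 100 ps RMS white jitter on the external clock
jitter_in = rng.normal(0, sigma_in, 50_000)

# One-pole low-pass as a crude stand-in for the PLL's jitter transfer:
# slow wander tracks through, fast edge-to-edge jitter is smoothed away.
fc = 1_000            # assumed loop bandwidth, Hz
alpha = 1 - np.exp(-2 * np.pi * fc / fs)
jitter_out = np.empty_like(jitter_in)
acc = 0.0
for i, x in enumerate(jitter_in):
    acc += alpha * (x - acc)   # first-order IIR low-pass
    jitter_out[i] = acc

print(f"in:  {jitter_in.std() / 1e-12:.1f} ps RMS")
print(f"out: {jitter_out.std() / 1e-12:.1f} ps RMS")
```

Run it and the recovered clock carries only a few ps of the original 100 ps, which is why the quality of the jitter *processing* matters far more than the quality of the clock that feeds it. It also shows the flip side: jitter below the loop bandwidth sails straight through, so a poorly chosen loop design can make an external clock worse than the internal one.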
If an external clock will never provide any improvement, why do they even exist? Many studios use them; in fact, I use one myself in my own studio. However, I don't use it to improve audio fidelity, I use it because I often have to synchronise and lock various simultaneous signals together from different sources, and the tiny potential loss in fidelity caused by using an external masterclock is insignificant compared to the inaccurate synchronisation of not using one. When I don't need to synchronise signals from different sources, I bypass my masterclock and run on the internal clock, to maximise fidelity. In a consumer environment there is no synchronisation of simultaneous signals from different sources, so a global clock will at absolute best make no difference and at worst audibly degrade the signal.
If you're interested, here's an article, including tests (from a highly reputable source) on the subject:
Can an external clock signal really improve the performance of your digital audio devices?
G