@pompom...
The most generalized and rigorous way to express phase jitter in clocks is x ps RMS, integrated from m Hz to n Hz... for example, 15 ps from 10 Hz to 100 kHz. Specifying the bandwidth is critically important to make a valid comparison. These numbers are derived from measurements made in the frequency domain, using devices known as phase noise analyzers; the FFT is the key measurement technique. The $40K Symmetricom 5120A is an example; the $90K Agilent E5052 is another. These prices give you a clue as to why not every company has these numbers readily at hand. But these instruments are commonly available, and the best manufacturers use them on a daily basis. Many designers do not understand the importance of phase noise and cannot be expected to bring out state-of-the-art products as a result.
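To make the "x ps RMS from m Hz to n Hz" idea concrete, here is a minimal sketch of the underlying arithmetic: integrate the single-sideband phase noise L(f), in dBc/Hz, over the offset range, and convert the resulting RMS phase error to time. The function name and the L(f) points are invented for illustration, not measurements of any real clock.

```python
import math

def integrated_jitter_ps(offsets_hz, l_dbc_hz, f_carrier_hz):
    """Integrate single-sideband phase noise L(f) (dBc/Hz) over the
    measured offset range and convert to RMS jitter in picoseconds."""
    # Convert dBc/Hz to a linear power ratio (1/Hz)
    lin = [10 ** (l / 10.0) for l in l_dbc_hz]
    # Trapezoidal integration over offset frequency
    area = 0.0
    for i in range(1, len(offsets_hz)):
        df = offsets_hz[i] - offsets_hz[i - 1]
        area += 0.5 * (lin[i] + lin[i - 1]) * df
    # Factor of 2 accounts for both sidebands; result is radians RMS
    phi_rms = math.sqrt(2.0 * area)
    return phi_rms / (2 * math.pi * f_carrier_hz) * 1e12  # seconds -> ps

# Hypothetical L(f) points for a 12.288 MHz clock, 10 Hz to 100 kHz
offsets = [10, 100, 1e3, 1e4, 1e5]
l_vals = [-90, -110, -130, -145, -150]   # dBc/Hz (invented numbers)
print(f"{integrated_jitter_ps(offsets, l_vals, 12.288e6):.1f} ps RMS")
```

A real analyzer does this on a much finer frequency grid, but the bandwidth dependence is obvious from the integral: change the limits and you change the answer, which is exactly why quoting jitter without the bandwidth is meaningless.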
Think of phase noise as the degree to which energy is present at farther and farther offsets from the intended frequency. For example, if one product shows -50 dBc at 1000 kHz away from a 12.2896 MHz carrier, but another shows only -130 dBc, the second has much lower phase noise and, other things being equal, will be the better clock.
Accurate clocks are, as we all know, critical for optimal A/D and D/A conversion. Furthermore, it is the opinion of some of the leading designers and recording engineers that the very low frequency offsets, meaning 0.1 Hz to, say, 100 Hz, are extremely critical with respect to realism, presence, air, and soundstaging.
My own experience with reference-grade gear is that this is certainly true. Some DACs allow an external word clock, such as the Grimm CC-1, to be used, and with some, but not all, DACs the increase in realism is extraordinary. The Grimm has very low phase noise in all parts of the spectrum. To get better phase noise still, one has to go to very expensive but capable DACs such as the $20K TAD C-2000, which uses ovenized crystal-controlled oscillators.
Another type of jitter measurement relates to the clock period and is made in the time domain. Think of this as determining when the edge of a clock signal actually takes place, compared to when it should take place. The overall jitter measured has various underlying subtypes (random, deterministic, data-driven, etc.), which all, unfortunately, add up in the wrong way. A less expensive measurement device, such as a WaveCrest DTS (various models are available used for $1-2K), can make reasonably repeatable and sensitive measurements. The results are expressed as x ps RMS period jitter (y ps peak-peak). Devices under 50 ps RMS jitter should sound very good; the best devices, according to recording engineers and other folk with oversized ears and a lot of experience, are down around 2-3 ps jitter (the MSB DACs, for example).
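The period-jitter statistics above are simple to compute once you have edge timestamps. A sketch, with an invented function name and a simulated clock rather than real captures: period jitter is the deviation of each period from the mean, reported as RMS and peak-to-peak. Note that because a period is the difference of two edges, independent edge noise of sigma shows up as roughly sqrt(2)*sigma of period jitter.

```python
import math
import random

def period_jitter(edge_times_s):
    """RMS and peak-to-peak period jitter from rising-edge timestamps
    (seconds): deviation of each clock period from the mean period."""
    periods = [b - a for a, b in zip(edge_times_s, edge_times_s[1:])]
    mean = sum(periods) / len(periods)
    devs = [p - mean for p in periods]
    rms = math.sqrt(sum(d * d for d in devs) / len(devs))
    p2p = max(devs) - min(devs)
    return rms, p2p

# Simulate a 12.288 MHz clock with 5 ps RMS Gaussian edge noise (invented)
nominal = 1 / 12.288e6
random.seed(0)
edges = [i * nominal + random.gauss(0, 5e-12) for i in range(10001)]
rms, p2p = period_jitter(edges)
print(f"{rms*1e12:.1f} ps RMS ({p2p*1e12:.1f} ps peak-peak)")
```

An instrument like the DTS does essentially this over millions of edges, which is also why the peak-to-peak number keeps growing with record length while the RMS number converges.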
The time-domain jitter specs for S/PDIF are complicated by the fact that this signalling format mixes clock and data in the same serial bit stream. Thus, one must first extract the clock, which in turn requires a very high quality phase-locked loop (PLL), and then measure the jitter. Building PLLs that are suitable for this is very difficult.
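To give a feel for what the clock-recovery PLL is doing, here is a toy second-order digital PLL: it predicts each incoming edge, measures the phase error, and nudges its phase and period estimates toward the input. This is purely illustrative (invented function name and loop gains, and it skips the biphase-mark decoding a real S/PDIF receiver must do), but it shows the essential trade: the loop smooths high-frequency jitter while still tracking the source's frequency.

```python
import math
import random

def recover_clock(edge_times, nominal_period, kp=0.05, ki=0.002):
    """Toy second-order digital PLL (a sketch, not any product's design).
    kp trims phase, ki trims frequency; both gains are illustrative."""
    phase = edge_times[0]      # initial phase estimate
    period = nominal_period    # initial period (frequency) estimate
    recovered = []
    for t in edge_times:
        err = t - phase        # phase detector: early/late vs prediction
        phase += kp * err      # proportional correction (smoothed edge)
        recovered.append(phase)
        period += ki * err     # integral path tracks frequency offset
        phase += period        # predict the next edge
    return recovered

# A 12.288 MHz clock with 10 ps RMS edge noise (invented numbers)
nominal = 1 / 12.288e6
random.seed(1)
raw = [i * nominal + random.gauss(0, 10e-12) for i in range(20000)]
out = recover_clock(raw, nominal)

def rms_dev(edges):
    """RMS deviation from the ideal grid, skipping the lock transient."""
    devs = [t - i * nominal for i, t in enumerate(edges)][5000:]
    m = sum(devs) / len(devs)
    return math.sqrt(sum((d - m) ** 2 for d in devs) / len(devs))

print(f"raw {rms_dev(raw)*1e12:.1f} ps -> recovered {rms_dev(out)*1e12:.1f} ps")
```

The catch, and the reason good S/PDIF receivers are hard: lowering the loop bandwidth rejects more incoming jitter but makes the loop slower to lock and more demanding of its own local oscillator, whose noise passes straight through below the loop bandwidth.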
In either the phase noise or the period jitter technique, it's important that the measurements be taken at the S/PDIF output connector, because there's many a slip between the clocks and the outside world. Advertising copy that talks about "theoretical jitter" of such and such, well, draw your own conclusions.
Take a look at the phase noise plot... The Trimble (green) has integrated phase noise of only 16 femtoseconds, or 0.016 ps, from 10 Hz to 100 kHz. This clock is a special-purpose, single-frequency device that is designed for GPS applications, so, sorry, you can't put one in your DAC. The Grimm CC-1 is the blue trace, and it shows 260 femtoseconds over the same range; the Audiophilleo shows 610 femtoseconds.
So, sure, one can measure jitter down to handfuls of femtoseconds. What James was referring to in his excellent survey of jitter measurements on the Anedio Web site was more along the lines of eyeballing FFT spectra, measuring the jitter spurs, and then doing a simple calculation to get an idea of the jitter. The noise floor of the D/A process pretty much limits this approach to 1 ps, and coarser values in practice. Some audio analyzers can do jitter measurements, but if you read the fine print, their accuracy is typically 1-2 nanoseconds (1000-2000 picoseconds).
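The "simple calculation" from an FFT spur works roughly like this (a sketch with an invented function name and invented levels, using the standard small-angle FM approximation): a symmetric pair of jitter sidebands, each at some dBc below the tone, implies a peak phase deviation, which converts to time through the tone frequency.

```python
import math

def spur_to_jitter_ps(sideband_dbc, tone_hz):
    """Estimate RMS jitter implied by a symmetric pair of sidebands,
    each `sideband_dbc` below a tone at `tone_hz`. Small-angle FM
    approximation: each sideband sits at beta/2 of the carrier, so
    beta = 2 * 10^(dBc/20), and jitter = (beta/sqrt(2)) / (2*pi*f)."""
    beta = 2 * 10 ** (sideband_dbc / 20.0)   # peak phase deviation, rad
    return beta / math.sqrt(2) / (2 * math.pi * tone_hz) * 1e12

# A -120 dB sideband pair around an 11.025 kHz test tone (invented level)
print(f"{spur_to_jitter_ps(-120, 11025):.2f} ps RMS")
```

Run the numbers and you can see the limit mentioned above: to resolve 1 ps on an 11 kHz tone, the sidebands have to be visible below roughly -145 dB, which is about where the noise floor of the D/A process lives.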
And that, boys and girls, is that.