Audiophile by birth, Head-Fier by choice. I am new to Head-Fi discussions, so I might have missed lots of prior discussion, but this seems like a fun thread to post to. I am an EE by profession. Part of my daily routine is clock measurement for gigabit communication systems, which means frequencies of 10^9 Hz and higher. So when I see picoseconds and ppm talked about in these threads, my mouth starts to water.
What I want to say is that at the frequencies I work with, where bit times themselves are on the order of a few tens of picoseconds, every picosecond counts. We go further and break down the types of jitter in a clock in order to understand the causes and possibly mitigate the problem. There are numerous ways to measure jitter, and there is no industry-wide agreement on how it should be measured or which parameters are critical for a specific application. For audio DAC applications, I would imagine the more relevant metrics are cycle-to-cycle jitter and/or period jitter. The former is the timing variation between adjacent clock cycles, whereas period jitter is the deviation of any clock cycle from the desired (zero-jitter) clock period. The two are interrelated through a simple difference equation: cycle-to-cycle jitter is the first difference of period jitter.

Jitter is also broken down into random and bounded (deterministic) components. The random component is interesting because, theoretically, there is no limit to the amount of jitter as more measurements are made over time. In other words, there is always some low but non-zero probability of a clock transition landing ever further before or after the ideal clock time. Clock jitter numbers therefore always need to be quoted with a qualifier, such as the number of clock cycles observed or a confidence level for a specific probability. The point I am trying to make is that a description like "5 ps peak-to-peak" has little value on its own, because a key piece of information is missing. The short simulation below illustrates both points.
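Here is a minimal sketch (my own illustration, not anything from a datasheet) of the two claims above: cycle-to-cycle jitter is the first difference of period jitter, and the peak-to-peak of Gaussian random jitter keeps growing with the number of cycles observed while the RMS stays put. The clock frequency and jitter amplitude are arbitrary assumptions chosen for the example.

```python
# Sketch: period jitter vs. cycle-to-cycle jitter, and why "ps p-p"
# needs a sample count or confidence level attached. Numbers are assumed.
import numpy as np

rng = np.random.default_rng(0)

T_NOM = 1.0 / 24.576e6   # assumed nominal period: a 24.576 MHz audio master clock
RJ_RMS = 5e-12           # assumed 5 ps RMS random period jitter

def measure(n_cycles):
    periods = T_NOM + rng.normal(0.0, RJ_RMS, n_cycles)   # jittered periods
    period_jitter = periods - T_NOM          # deviation from the ideal period
    c2c_jitter = np.diff(periods)            # difference of adjacent periods
    # cycle-to-cycle jitter is exactly the first difference of period jitter:
    assert np.allclose(c2c_jitter, np.diff(period_jitter))
    return period_jitter

for n in (10**3, 10**5, 10**7):
    pj = measure(n)
    print(f"{n:>9} cycles: RMS = {pj.std()/1e-12:5.2f} ps, "
          f"peak-to-peak = {np.ptp(pj)/1e-12:6.2f} ps")
# RMS stays ~5 ps at any length, but peak-to-peak grows with the observation
# window -- which is why a bare "5 ps peak-to-peak" spec is incomplete.
```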
I'd also like to comment on frequency deviation from nominal, typically quoted as the ppm rating of a clock source. This is usually defined as a static frequency offset with respect to the desired frequency. As we all know, ppm stands for parts per million, with 1 ppm being a 10^-6 deviation. As a simple example, take an ideal 1000 hertz tone out of a DAC sourced by a +/-100 ppm clock. A maximum deviation of 100 ppm will generate a tone somewhere between 999.9 and 1000.1 hertz. In other words, the deviation simply manifests itself as a minute shift in pitch. I don't know how many people would be able to distinguish a pitch shift at that level, but I am personally skeptical that even a fairly loose frequency deviation like this one creates an audible difference. You can run a simple tone test yourself on your setup and see if you can pick out the difference in pitch; the quick calculation below puts a number on it.
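A quick back-of-the-envelope check of the example, with the same offset expressed in musical cents (one cent is 1/100 of a semitone). The just-noticeable-difference figure in the comment is a commonly cited ballpark, not a measurement of mine.

```python
# ppm offset -> frequency range and worst-case pitch shift in cents.
import math

F_TONE = 1000.0   # ideal tone, Hz
PPM = 100         # clock offset from the example above

delta_f = F_TONE * PPM * 1e-6             # 0.1 Hz
cents = 1200 * math.log2(1 + PPM * 1e-6)  # ~0.17 cents

print(f"Tone lands between {F_TONE - delta_f:.1f} Hz and {F_TONE + delta_f:.1f} Hz")
print(f"Worst-case pitch shift: {cents:.3f} cents")
# Trained listeners are typically credited with a pitch JND of a few cents,
# so a ~0.17-cent shift sits far below that.
```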
I am not going to get into the spectral characteristics of jitter here, which is a fairly complex topic in itself and often a crucial piece of information about the jitter. I suspect it is relevant to audio applications, but off the top of my head it's not immediately obvious what the impact would be. I would imagine the resulting waveform is a function of the jitter's spectral characteristics, the sampling rate, and the conversion technology (PCM vs. the sigma-delta modulation of DSD). The sketch below shows one simple mechanism by which the jitter spectrum shows up in the audio.
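As one concrete illustration of why the jitter spectrum matters, here is a minimal simulation: sampling an ideal tone at instants perturbed by sinusoidal (deterministic) jitter produces sidebands around the tone at +/- the jitter frequency. All numbers are made-up assumptions, and the jitter amplitude is wildly exaggerated (10 ns) just to lift the sidebands well clear of the numerical floor.

```python
# Sketch: sinusoidal jitter on the sampling instants -> sidebands at
# F_TONE +/- F_JIT. Parameters are assumptions for illustration only.
import numpy as np

FS = 48_000        # assumed sample rate, Hz
F_TONE = 1_000.0   # tone frequency, Hz
F_JIT = 200.0      # jitter modulation frequency, Hz
A_JIT = 10e-9      # 10 ns peak jitter (exaggerated to make sidebands visible)
N = 1 << 16

n = np.arange(N)
t_ideal = n / FS
t_jit = t_ideal + A_JIT * np.sin(2 * np.pi * F_JIT * t_ideal)  # jittered instants
x = np.sin(2 * np.pi * F_TONE * t_jit)

spec = np.abs(np.fft.rfft(x * np.hanning(N))) / N
freqs = np.fft.rfftfreq(N, 1 / FS)

# Report the tone and the two expected sidebands (800 Hz and 1200 Hz),
# taking the local peak to sidestep FFT bin-leakage effects.
for f in (F_TONE - F_JIT, F_TONE, F_TONE + F_JIT):
    k = np.argmin(np.abs(freqs - f))
    peak = spec[k - 2:k + 3].max()
    print(f"{f:7.1f} Hz: {20 * np.log10(peak):7.1f} dB")
# The sidebands land roughly 90 dB below the tone for these numbers; their
# position tracks the jitter frequency, which is the spectral dependence
# alluded to above.
```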