Transports do matter!!!
Sep 22, 2007 at 4:01 PM Post #61 of 63
Quote:

Originally Posted by ezkcdude
Your comment leads me to believe you still don't understand the problem we are discussing: Jitter. I think you are referring to error correction. Do you understand the difference?


Skip my remark about CDs not being re-read. Of course they are...


Now let's talk about jitter. Jitter occurs when the receiving end doesn't receive the data at the right time. Right?
How is it possible that the WAV file resulting from a rip and the one from just playing in real time (which is slower) are always the same? That implies that the receiving end *does* receive all the bits in time; otherwise the WAV would be different! Hence my conclusion that jitter is a non-issue. Maybe I'm being too simplistic?

Question: if synchronization is important, then there must be a mechanism to keep it in sync. How does that work? Shouldn't that avoid jitter?
 
Sep 22, 2007 at 7:50 PM Post #62 of 63
Quote:

Originally Posted by sejarzo
What I was asking about is really error concealment, and how often that occurs with a good transport and disc.


Under the given conditions, not at all.


Quote:

Originally Posted by sejarzo
I'd be very interested in reading the info if you would kindly post the link!


After having a look at the pages, I realised that the link I mentioned may be of limited use, since the tests were about the behaviour when handling uncorrectable errors that were caused intentionally. Still, it is pretty interesting:

http://pageperso.aol.fr/lyonpio2001/dae/dae.htm

Technical details about error correction:

http://pageperso.aol.fr/lyonpio2001/...et/chipset.htm

In a German forum there was a discussion about exactly what you're asking for. Some people managed to attach LEDs to the C1/C2 test points inside a CD player to count the errors encountered. There is professional equipment for this purpose; the economical way, however, is to monitor the S/PDIF output of the CD player. Uncorrectable errors are indicated by the "valid" flag, which is part of the S/PDIF specification.
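To make the monitoring idea concrete, here is a minimal Python sketch of counting error-concealed samples via the validity flag. It assumes each captured S/PDIF subframe is available as a 32-bit integer with the validity bit at bit position 28 (V = 1 meaning "unreliable"), matching the IEC 60958 layout; the capture mechanism and the sample values below are hypothetical.

```python
# Sketch: counting "invalid" samples in a captured stream of S/PDIF
# subframes. Assumes each subframe is a 32-bit int with the validity
# bit at bit 28 (V = 1 means the sample is unreliable). The stream
# below is made up for illustration.

VALIDITY_BIT = 1 << 28

def count_invalid(subframes):
    """Return how many subframes carry V = 1 (error-concealed samples)."""
    return sum(1 for sf in subframes if sf & VALIDITY_BIT)

# Example: three clean subframes, one flagged as unreliable.
stream = [0x00012345, 0x00054321, VALIDITY_BIT | 0x00098765, 0x00011111]
print(count_invalid(stream))  # -> 1
```

On a good transport and disc, as discussed above, you would expect this counter to stay at zero for the whole playing time.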

This article tries to figure out the cause of perceived differences when using various CD tweaks. Definitely one of the best write-ups I have found so far on this topic. Amazingly, even here (despite the inclination towards "voodoo") it is pointed out that uncorrectable errors, and thus error concealment, are not the reason for any reported sonic differences.

Some further interesting arguments. Note that the term "jitter" also has a second meaning: with regard to the pits and lands, it describes the deviation of their measured lengths from the predefined values (the multiples 3T–11T of the channel clock reference).
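As a small illustration of that second meaning, the Python sketch below (with made-up measurements) quantizes a measured pit length to the nearest legal multiple of the channel clock T and reports the deviation:

```python
# Sketch of the second meaning of "jitter": measured pit/land lengths
# deviate from the ideal multiples of the channel clock T (3T..11T).
# The measurements below are invented for illustration.

def pit_jitter(length_in_T):
    """Deviation of a measured pit length from the nearest legal nT."""
    nearest = min(range(3, 12), key=lambda n: abs(length_in_T - n))
    return length_in_T - nearest

print(pit_jitter(3.08))   # ~0.08 T too long for a 3T pit
print(pit_jitter(6.95))   # ~0.05 T short of a 7T pit
```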

Quote:

Originally Posted by infinitesymphony
Where is jitter created? It's been attributed to a lot of different sources: clock quality, cable quality, DAC quality, drive/lens quality, etc.


To my understanding, this phenomenon is inherent, simply because there is no perfect timing. Power supply, clock, cables: in a world of picoseconds, everything has an influence. That's why I think trying to reduce this jitter at the pickup or transport is of no avail, since it will be added again later anyway.

Quote:

Originally Posted by AS1
Jitter occurs when the receiving end doesn't receive the data at the right time. Right?


Correct. In the case of samples, you could say that they are received correctly and at the correct nominal rate, but with a varying "distance" between the samples.
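A toy Python sketch of that distinction: every sample value arrives bit-exactly, yet the arrival instants wobble around the nominal 1/44100 s grid. The 200 ps RMS figure and the payload values are just assumptions for the demo, not measurements.

```python
import random

# Toy model (not a real capture): the receiver gets every sample value
# bit-exactly, but the arrival instants wobble around the nominal
# 1/44100 s grid. The data are perfect; only the timing jitters.

FS = 44100
NOMINAL = 1.0 / FS                  # nominal sample period in seconds
random.seed(0)

sent = list(range(10))              # stand-in for the sample values
received = list(sent)               # every bit arrives intact...
arrivals = [n * NOMINAL + random.gauss(0, 200e-12)  # ...at jittered times
            for n in range(10)]

gaps = [b - a for a, b in zip(arrivals, arrivals[1:])]
print(received == sent)                          # True: no bit errors
print(max(abs(g - NOMINAL) for g in gaps) > 0)   # True: spacing wobbles
```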

Quote:

Originally Posted by AS1
How is it possible that the WAV file resulting from a rip and the one from just playing in real time (which is slower) are always the same? That implies that the receiving end *does* receive all the bits in time; otherwise the WAV would be different! Hence my conclusion that jitter is a non-issue. Maybe I'm being too simplistic?


This is possible because all devices that conform to the S/PDIF standard have to be able to capture the data correctly despite the jitter that is always present. Please refer to this paper, where the section "Jitter specifications of AES/EBU interface" may be of special interest to you.
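A toy illustration of why jitter within the interface tolerance doesn't corrupt data: an S/PDIF-style biphase-mark decoder only has to tell "short" gaps between transitions from "long" ones, so timing errors well below that decision threshold can never flip a bit. The Python sketch below uses invented timings, not a real capture.

```python
# Sketch of why jitter below the decision threshold does not corrupt
# data: a biphase-mark decoder only has to tell "half-cell" gaps from
# "full-cell" gaps between transitions. Timings here are invented.

HALF = 1.0   # half a bit cell (arbitrary units)

def decode_biphase_mark(gaps, threshold=1.5 * HALF):
    """gaps: intervals between successive signal transitions."""
    bits, i = [], 0
    while i < len(gaps):
        if gaps[i] < threshold:      # two short gaps -> bit 1
            bits.append(1)
            i += 2
        else:                        # one long gap -> bit 0
            bits.append(0)
            i += 1
    return bits

# Gaps with up to 20% timing error still decode correctly:
jittered = [0.9, 1.1, 2.2, 1.05, 0.8, 1.9]
print(decode_biphase_mark(jittered))  # -> [1, 0, 1, 0]
```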
 
Sep 23, 2007 at 3:43 AM Post #63 of 63
Quote:

Originally Posted by AS1

Question: if synchronization is important, then there must be a mechanism to keep it in sync. How does that work? Shouldn't that avoid jitter?



It's called a PLL (Phase-Locked Loop). Unfortunately, it does not stand for "Perfectly Locked Loop".
When we talk about timing issues, you have to understand that jitter is orders of magnitude smaller than the time between clock transitions (e.g., picoseconds vs. nanoseconds). Just imagine that it takes some time for the clock to go from zero to one; any change in this timing introduces some jitter. By generating a clean clock at the DAC (for example, a crystal oscillator with less than 1 ps of jitter) and asynchronously resampling the input signal, jitter can be reduced substantially. So the issue is not whether jitter leads to bit errors. Just think of jitter as analog noise that can creep into an otherwise digital signal. There are ways to reduce jitter, but without a perfect clock (for example, one that can go from zero to one in an infinitesimal amount of time), jitter cannot be eliminated entirely.
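A first-order PLL behaves like a low-pass filter on the incoming phase, which is why it attenuates, but never eliminates, jitter. Here is a toy Python sketch with an invented loop gain and invented noise figures; it is a model of the filtering idea, not of any real receiver chip.

```python
import math
import random

# Toy first-order PLL: the recovered clock phase is steered toward the
# incoming (jittery) phase by a small gain, so fast phase wobble is
# low-pass filtered. Gain and jitter figures are invented for the demo.

random.seed(1)
GAIN = 0.05                      # loop gain: smaller = stronger filtering
jitter_in = [random.gauss(0, 1.0) for _ in range(5000)]  # input phase noise

phase, jitter_out = 0.0, []
for e in jitter_in:
    phase += GAIN * (e - phase)  # steer recovered phase toward input
    jitter_out.append(phase)

def rms(xs):
    return math.sqrt(sum(x * x for x in xs) / len(xs))

print(rms(jitter_in) > 3 * rms(jitter_out))  # True: the loop filters wobble
```

Lowering the gain filters harder but makes the loop slower to lock onto the incoming clock, which is the usual trade-off in such a loop.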
 
