Quote:
Originally posted by DustyChalk
Holy crap, that's not a test! A - it's not realtime, as CD playback is. [...] Sorry, you've just completely undermined my confidence in your statements that you have never had an error. |
We're NOT talking about CD playback here -- we're talking about the integrity of the data on a CD-R as compared to the original. What is it that you don't understand here? I'm comparing the copy to the source. No difference in the bits. Period.
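To make that concrete, here is a minimal sketch (standard-library Python only; the file names are hypothetical) of the kind of bit-for-bit check I mean: hash the extracted data from the source and from the copy, and compare the digests. If the digests match, every one and zero is identical.

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def bits_identical(source_path, copy_path):
    """True when the two files contain exactly the same bytes."""
    return file_digest(source_path) == file_digest(copy_path)

# Hypothetical usage -- e.g. tracks extracted from the original CD
# and from the burned CD-R:
#   bits_identical("source_track01.wav", "cdr_track01.wav")
```

Matching digests settle the data question; any jitter on playback is a separate, timing-domain issue.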
Your concern about whether the comparison is done on-the-fly is irrelevant to the question: Is the data the same? The answer is "yes." Now, the CD-R's playback may exhibit more (or less) jitter than the source, but the data -- the ones and zeros -- are the same. Remember that once the CD-R is burned, there's no going back to fix questionable sectors. Quote:
I said:
Most DACs derive their clocks from the data...
to which DustyChalk replied:
Huh?!?!? No they don't... |
No, you've got this wrong too. The VAST MAJORITY of DACs on the market do exactly that. Most articles about jitter point this out early on. For example, let's look at the very first article Tim D posted a link to, which contains the following statement: "A typical D to A converter derives its system clock (the clock that controls the sample and hold circuit) from the incoming digital signal." (Emphasis mine.) Here's a link if you want to check it yourself:
http://www.digido.com/jitteressay.html
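A quick back-of-the-envelope simulation shows why that matters (plain Python; the numbers -- 48 kHz sample rate, 1 kHz tone, 2 ns RMS jitter -- are illustrative assumptions, not measurements of any real DAC). It models jitter as a deviation in the conversion instants: the stored sample values never change, but a jittery clock converts them at slightly wrong times, producing an error signal; exact (reclocked) conversion instants produce no error at all.

```python
import math
import random

FS = 48_000        # sample rate, Hz (illustrative)
F0 = 1_000         # test tone, Hz (illustrative)
N = 4_800          # 0.1 s of samples
RMS_JITTER = 2e-9  # 2 ns RMS clock jitter (assumed figure)

random.seed(0)

# The data on the disc: ideal samples of a sine. These bits never change.
data = [math.sin(2 * math.pi * F0 * (n / FS)) for n in range(N)]

def dac_output(n, timing_error):
    """Value a DAC emits for sample n when its clock edge lands at
    n/FS + timing_error instead of exactly n/FS."""
    return math.sin(2 * math.pi * F0 * (n / FS + timing_error))

# Jittery clock: each conversion instant is off by a random amount.
jittery = [dac_output(n, random.gauss(0.0, RMS_JITTER)) for n in range(N)]
rms_error = math.sqrt(sum((j - d) ** 2 for j, d in zip(jittery, data)) / N)

# Reclocked: conversion instants are exact, so the error vanishes --
# same bits, no jitter artifact.
reclocked = [dac_output(n, 0.0) for n in range(N)]
max_reclocked_error = max(abs(r - d) for r, d in zip(reclocked, data))
```

The error lives entirely in the timing of conversion, not in the stored ones and zeros -- which is exactly why reclocking immediately before the DAC can make jitter a non-issue.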
Note too that jitter could be made a non-issue if only manufacturers would re-clock the data before feeding it to the DAC. As you correctly pointed out, there are manufacturers that make the correct parts to do this, but sadly they are seldom used except in high-end gear. Of course, you confuse the issue with your very next paragraph, in which you basically agree with what I said: Quote:
Now, I admit that badly designed DACs can be affected by badly clocked data -- that's the crux of my argument, but the converse is also true. You can have all the jitter you want in the digital domain: as long as (a) you don't lose data and (b) you "fix" it all in the last stage, the rest of it is unimportant. Unfortunately, (b) is rarely true. |
See? How is that any different from what I said? Quote:
The definition of jitter is "a deviation of the digitized signal from its optimal value" -- in this case, the actual signal it's supposed to represent. Seems to me that that has everything to do with the A/D and D/A processes. |
Why do you insist on reinterpreting the definition? "A deviation of the digitized signal..." -- i.e., the bitstream. Yes, it will affect the sound once converted -- I never said it wouldn't -- but "jittery" describes the mistimings of the 1s and 0s, not the reconstructed waveform on the output side of the DAC. If jitter were a property of the sound itself, reclocking the data before D/A conversion could not possibly fix the problem. Ergo it is a property of the digital bitstream. We keep going back and forth on this particular point, and I have no idea why. Quote:
Horse hockey, I say! This is complete gibberish. [Referring to the "computer screen" analogy.] The only way that jitter is analogous to screen brightness is if there is some sort of analog reconstruction, which there isn't. The timing problems -- and effects -- caused by jitter would be more analogous to streaming video, and the effects of timing problems on the motion described by that video. |
Look, it wasn't my analogy, it was aos's. And unless I'm mistaken, he's arguing the same side of this as you are. I was working with what I was given. Besides, the statement I highlighted in blue above is completely wrong -- a standard television screen is absolutely an analog reconstruction of (in some cases, such as with DVD) digital data. I don't think the analogy is worth pursuing further anyway. But your criticism of it as "not applicable" should be aimed at aos, not me. Quote:
Don't discount what Joe has to say just because he hasn't read the entire thread. I do believe his experience with CD burning is more valid than this entire academic discussion (including mine!). |
I wasn't discounting what he had to say -- I was simply pointing out that his concerns had already been discussed. Heck, he actually supported my main point -- that random errors in the data can't produce a non-random result after conversion. I'm not discounting him! Speaking of which, I would like to quote another point from the first jitter reference: "The audible effect of this jitter is a possible loss of low level resolution caused by added noise, spurious (phantom) tones, or distortion added to the signal." Again, no mention of any systematic EQ change, such as brightening compared with the source.
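That point can be sketched numerically too (plain Python; the error model -- a small fraction of samples replaced by unrelated random values -- is an assumption chosen for illustration). The resulting error signal is essentially uncorrelated with the music: it behaves like added noise or clicks, not like a systematic treble boost.

```python
import math
import random

random.seed(1)

FS, F0, N = 44_100, 440, 10_000  # illustrative tone and length
signal = [math.sin(2 * math.pi * F0 * n / FS) for n in range(N)]

# Model uncorrected random read errors: 0.1% of samples replaced
# by unrelated random values (assumed error model).
corrupted = list(signal)
for i in random.sample(range(N), N // 1000):
    corrupted[i] = random.uniform(-1.0, 1.0)

error = [c - s for c, s in zip(corrupted, signal)]

# Normalized correlation between the error and the signal: near zero,
# so the damage adds noise rather than reshaping the tonal balance.
sig_power = sum(s * s for s in signal)
corr = sum(e * s for e, s in zip(error, signal)) / sig_power
```

A systematic brightening would show up as an error component correlated with the program material; random errors simply don't produce one.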
Once again, I find myself awake fairly late (hey, I'm a father of three young kids). So I'm going to address one more point and then head off to bed. Tim D wrote (in part): Quote:
Brighter is a subjective term. It is NOT an objective term describing the rise of higher frequencies. |
Sorry, Tim -- but it is EXACTLY a term describing a rise in energy in the treble range. Whether you find that rise objectionable is a subjective determination, but "brighter" has described a rise in treble energy for as long as I can remember. If you don't like that definition, then kindly substitute "increased treble energy" for "brightened" everywhere I used the term. The points are still valid.
And with that, I bid you all "goodnight."