(I'm replying to RRod's replies to me; I can't seem to get the quote mechanism to handle it correctly. RRod's replies will be in regular text, and my replies to them after, flagged with >>'s.)
Originally Posted by
RRod
It's fine to want overwhelming evidence, but what counts for that is up to you, and I don't think it's unreasonable for someone to ask what your standards are. For instance, a one-off 9/10 that someone reports online hardly convinces me. Someone getting 23/25 in an actual controlled study gets my attention.
>> I agree, and what I consider sufficient depends on the circumstances. If I'm buying a bottle of wine, I'm perfectly willing to go with a majority of reviews on a wine blog; after all, it'll be gone tomorrow anyway. However, when someone tells me that "there is absolutely no difference between a high-res file and a CD", here's what that means to me: if they're right, I get to save $5 by buying the CD instead of the high-res download. If they're wrong, I save $5 but end up enjoying that song just a little less than I might have. Even worse, I'll have to buy it all over again when I discover my mistake or, worse still, the good copy might no longer be available by then. Paying $5 more today for the high-res version that just might possibly be better seems like the safest bet to me.
I don't know enough about old ADCs/DACs to know how well they performed. Anyone care to chime in? As far as modern stuff, I've seen plenty of things that were purty-near flat from 20-20k (I would post a link but it's from a "he-who-shall-not-be-named"). That *some* early CDs sounded bad doesn't mean they all did. I just picked up a copy of the famous 1979 Telarc 1812 overture and the sonics are great. So are they from various other early ventures I have lying around.
>> A lot of early CDs sounded rather bad, probably for a variety of reasons. However, the requirements of "properly" converting analog to digital include absolutely filtering out ALL content above the Nyquist frequency before doing the conversion. That means that, in order to make a CD without really bad distortion, you must use a filter that is flat to 20 kHz, yet is down at least 80 dB by 22 kHz. In the days of analog filters this was virtually impossible to achieve, so some compromises were always involved, which kept the real-world performance well short of the theoretical performance. (Oversampling avoids this requirement, but oversampling wasn't available as a technology when the Red Book standard was written.)
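>> To put a rough number on how steep that brick-wall filter has to be, here's a back-of-the-envelope sketch in Python. It uses the figures above (flat to 20 kHz, -80 dB by 22 kHz) and the standard approximation that each analog filter pole buys you about 6 dB/octave of roll-off; treat it as an order-of-magnitude estimate, not a filter design:

```python
import math

# Red Book anti-aliasing target: flat to 20 kHz, down >= 80 dB by ~22 kHz.
passband_hz = 20_000
stopband_hz = 22_000
required_db = 80

# Width of the transition band, measured in octaves.
octaves = math.log2(stopband_hz / passband_hz)      # ~0.14 octave

# Roll-off slope the filter must achieve across that band.
slope_db_per_octave = required_db / octaves          # ~580 dB/octave

# Each pole contributes roughly 20*log10(2) ~= 6 dB/octave,
# so estimate the filter order (number of poles) needed.
db_per_pole = 20 * math.log10(2)
order = math.ceil(slope_db_per_octave / db_per_pole)

print(f"Transition band: {octaves:.3f} octaves")
print(f"Required slope:  {slope_db_per_octave:.0f} dB/octave")
print(f"Estimated order: {order} poles")
```

>> That comes out to something on the order of a hundred analog poles, which is hopeless in practice (component tolerances and phase behavior alone would wreck it). That's exactly why oversampling won: it moves the steep filtering into the digital domain and lets the analog filter be gentle.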
I've never found a reliable source on the LvB 9th anecdote. Besides, the 44.1 standard was around just a bit before the CD. The Wiki on 44100 gives some alternative theories, but I guess that's not a reliable source either.
>> Yeah, I've heard a lot of variations on the story. However, I'm pretty sure that sample rates significantly above 48 kHz weren't readily available at the time. This means that when they tested "whether Red Book standard was audibly identical to the original", what they were really testing was whether it was audibly identical to an analog master tape. However, today we have "originals" that are far better than analog master tapes, and most people I know don't believe that analog master tape is "audibly identical to the original". (And the fact that a CD could be made to sound indistinguishable from 1970's-vintage analog master tapes, when played through 1970's-vintage amplifiers and speakers, really doesn't convince me that they're "audibly indistinguishable FROM THE ORIGINAL".) In short, I'm not convinced that "the best equipment in 1976 was audibly perfect, and any and all improvements claimed since then are either snake oil or wishful thinking".