Quote:
Originally Posted by CSMR
Yes this is a fact based on properly conducted tests gauging the limits of audibility.
"Audibility of a CD-Standard A/D/A Loop Inserted into High-Resolution Audio Playback". E. Brad Meyer and David R. Moran. JAES 55(9) September 2007.
Abstract:
Claims both published and anecdotal are regularly made for audibly superior sound quality for two-channel audio encoded with longer word lengths and/or at higher sampling rates than the 16-bit/44.1-kHz CD standard. The authors report on a series of double-blind tests comparing the analog output of high-resolution players playing high-resolution recordings with the same signal passed through a 16-bit/44.1-kHz "bottleneck." The tests were conducted for over a year using different systems and a variety of subjects. The systems included expensive professional monitors and one high-end system with electrostatic loudspeakers and expensive components and cables. The subjects included professional recording engineers, students in a university recording program, and dedicated audiophiles. The test results show that the CD-quality A/D/A loop was undetectable at normal-to-loud listening levels, by any of the subjects, on any of the playback systems. The noise of the CD-quality loop was audible only at very elevated levels.
Your PROOF by citing this paper is not convincing and does not help your claim one bit. This test is not about DAC differences between high-quality and cheap DACs, since most of the players used were decent, midfi units: a Yamaha DVD player, a Pioneer CD player, and an HHB digital recorder. The loop test was made with the HHB, which is not well known for using highly resolving converters. And the tests were run to determine the audibility of higher-resolution formats, NOT DAC quality. The results can hardly be used to infer that DAC quality is inaudible, merely that the last significant bits of a higher-resolution format are not audible at normal listening levels by pedestrian listeners.
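To put a rough number on those "last significant bits" (my own illustration, not anything from the paper): truncating a signal to a 16-bit word length leaves a quantization error floor around 100 dB below full scale, which is exactly the kind of thing you would only expect to hear at very elevated levels.

```python
import numpy as np

# Sketch: requantize a tone to 16-bit steps and measure what the
# truncation throws away. Signal and levels here are arbitrary choices.
fs = 44100
t = np.arange(fs) / fs
x = 0.5 * np.sin(2 * np.pi * 1000 * t)   # 1 kHz tone at -6 dBFS

x16 = np.round(x * 32767) / 32767        # reduce word length to 16 bits
err = x - x16                            # the discarded "last bits"

err_dbfs = 20 * np.log10(np.sqrt(np.mean(err**2)))
print(f"RMS quantization error: {err_dbfs:.1f} dBFS")
```

The printed figure lands near the textbook value of roughly -100 dBFS for undithered 16-bit quantization, far below normal playback noise floors.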
I can guarantee you that if you threw a bunch of audiophiles into a room to listen to unfamiliar music on a system they had never heard, testing only the last few bits of resolution, the outcome would be similar to what that test reports. This is not about hearing differences between a decent cheap DAC, like the one found in a good CD player, and a more expensive standalone DAC.
These tests remind me of studies in which witnesses watch simulated crime films and are then asked to describe what the criminal was wearing. The results are laughable: the descriptions are all over the map, yet each viewer saw all the facts. But train them on what to look for, and the results change.
Listening for differences takes experience. Harman International is just now figuring that out and is using events to educate people on what to listen for. A more educated listener is able to pick out, by preference, things that happen to line up with higher accuracy in their tests. The conspiracy theorists will now come in and claim Harman is only doing this to sell more high-end gear. Well, duh! Yes, 20 or so people at a time will certainly shoot sales through the roof. But if they can educate some folks on what music might sound like on a good system, versus the boom-tiss of some Bose minicubes, and get us closer to the illusion of real live music, then this is perhaps a good thing.
I'll get off my soapbox now, as I have no further desire to debate our differences, nor to look at half-a..ed tests that have nothing to do with proving your claim. I agree to disagree.
cheers
EDIT PS: Some anecdotal info about the test, from forums where the folks actually read the whole paper and not just the abstract.
"Though our tests failed to substantiate the claimed advantages of high-resolution encoding for two-channel audio, one trend became obvious very quickly and held up throughout our testing: virtually all of the SACD and DVD-A recordings sounded better than most CDs—sometimes much better. Had we not “degraded” the sound to CD quality and blind-tested for audible differences, we would have been tempted to ascribe this sonic superiority to the recording processes used to make them."
They were trying to show that 16/44.1 is a sufficient bit depth and sampling rate, and that even when starting from high-resolution masters, the CD-quality loop would not degrade the sound.
From the block diagram they are going A/D/A: using a high-res SACD/DVD-A as the 'master', converting it to analog, resampling the analog signal at 16/44.1 on the fly, then converting back to analog for playback. This resulted in no discernible difference from the same material at its original resolution.
If you read carefully, they are trying to 'prove' that 16/44.1 is sufficient to fully capture the original analog signal. In their test they do that by taking a high-res master captured at a higher bit depth/sampling rate and resampling it at 16/44.1, on the theory that if the higher bit depth/sampling rate were truly superior, resampling at the lower bit depth/sampling rate would clearly show differences. It did not.
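That round trip can be mimicked entirely in the digital domain (a toy model of my own, not the authors' analog loop): take a signal at a higher sample rate, band-limit it down to 44.1 kHz, requantize to 16 bits, and bring it back up. For any content below ~20 kHz, the residual left by the round trip sits down near the 16-bit noise floor.

```python
import numpy as np

# Sketch of a 16/44.1 "bottleneck" round trip using ideal FFT-based
# resampling. The 88.2 kHz rate and 1 kHz test tone are arbitrary choices.
fs_hi, fs_cd = 88200, 44100
t = np.arange(fs_hi) / fs_hi
x = 0.5 * np.sin(2 * np.pi * 1000 * t)    # 1 kHz tone, well below Nyquist

# downsample 88.2k -> 44.1k by cropping the spectrum (ideal band-limiting)
down = np.fft.irfft(np.fft.rfft(x), fs_cd) * (fs_cd / fs_hi)
down16 = np.round(down * 32767) / 32767   # reduce word length to 16 bits
# upsample 44.1k -> 88.2k by zero-padding the spectrum
back = np.fft.irfft(np.fft.rfft(down16), fs_hi) * (fs_hi / fs_cd)

resid = x - back
resid_db = 20 * np.log10(np.sqrt(np.mean(resid**2)) / np.sqrt(np.mean(x**2)))
print(f"round-trip residual: {resid_db:.1f} dB below the signal")
```

The residual here is essentially just 16-bit quantization noise, roughly 90 dB below the tone, which is consistent with the paper finding the loop inaudible at normal levels.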
They do, however, add the caveat that an original analog signal sampled at a higher bit depth/sampling rate could very well sound better when played back at that same bit depth/sampling rate.
I think the intent was to show that 16/44.1 is sufficient to fully capture the original analog recording, and their tests show that even if the original was sampled at a higher rate, that adds nothing once it is converted to analog and resampled at 16/44.1. In other words, they believe 16/44.1 is sufficient to fully capture the original analog signal.