Quote:
You would hope they record at 24/192; labels that record popular genres of music generally aren't that particular, except for rare "indie" labels. So someone has actually said 24/192 is overkill and just a huge waste of space, huh? With that kind of attitude, it really isn't much of a surprise that so many recordings have given "mediocrity" a whole new definition (as in garbage). It wasn't a so-called "professional" that made this statement, was it? (Though it wouldn't surprise me, as there doesn't appear to be any shortage of "pros" in any field who are so clueless they give the real pros a bad name! This is only my non-professional opinion, of course, and I generally don't care for lawyers either! Hehehe)
It is in fact a giant waste of space.
http://www.benchmarkmedia.com/discuss/sites/default/files/Upsampling-to-110kHz.pdf
Also, although that white paper doesn't cover it, even the best 24-bit A/D and D/A converters cannot actually resolve the entire dynamic range afforded by 24 bits of quantization: the thermal noise of the analog electronics sets the practical noise floor, so at best you get around 20 bits of real resolution. That is, unless you cryogenically cool all your electronics.
So the absolute best that recording technology can possibly capture is roughly 20-bit, 110 kHz, give or take a few kHz depending on the chip and implementation. Anything higher than that is a total waste of storage space. There are definite exceptions where higher bit depths (24 or possibly 32 bits) are useful when processing audio files, such as when mastering, mixing, or even applying software volume control. But as far as the capture and storage of audio goes, that's the best that electronics can possibly do.
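To put rough numbers on that: each bit of quantization buys about 6 dB of dynamic range (SNR of an ideal N-bit quantizer is about 6.02 * N + 1.76 dB). Here's a quick back-of-the-envelope sketch in Python; the ~123 dB converter spec used at the end is just an illustrative figure, not something from the white paper:

```python
# Theoretical dynamic range of an ideal N-bit quantizer:
#   SNR ~= 6.02 * N + 1.76 dB (full-scale sine vs. quantization noise)
def ideal_dynamic_range_db(bits):
    return 6.02 * bits + 1.76

def effective_bits(dynamic_range_db):
    # Invert the rule of thumb to see how many bits a real converter resolves.
    return (dynamic_range_db - 1.76) / 6.02

for bits in (16, 20, 24):
    print(f"{bits}-bit ideal: {ideal_dynamic_range_db(bits):.0f} dB")

# A top-flight converter spec of roughly 123 dB dynamic range (illustrative
# figure) works out to about 20 effective bits:
print(f"123 dB spec -> {effective_bits(123):.1f} effective bits")
```

In other words, even a converter that stores 24-bit samples only fills about 20 of those bits with signal; the rest is analog noise.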
And that's before we even look at audibility. In short, the only difference between 24-bit and 16-bit audio is quantization distortion and quantization noise. Quantization distortion is nasty and can easily be audible, because it distorts waveforms in distinct, signal-correlated patterns. However, quantization distortion is easily eliminated through the use of dither. Proper noise-shaping dither replaces the audible distortion patterns with very, very low-level noise (entirely inaudible at normal listening volumes) added to the signal.
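For a concrete picture of what dither does, here is a minimal sketch (assuming NumPy is installed): a very quiet 1 kHz tone is rounded to 16 bits once without dither and once with TPDF dither. Without dither the rounding error is correlated with the signal and piles up at its harmonics; with dither it becomes a flat, signal-independent noise floor.

```python
import numpy as np

fs = 44100
t = np.arange(fs) / fs                                    # one second of audio
signal = 10 ** (-95 / 20) * np.sin(2 * np.pi * 1000 * t)  # 1 kHz tone at -95 dBFS

lsb = 1 / 2 ** 15            # one 16-bit LSB (full scale = +/- 1.0)
rng = np.random.default_rng(0)

# Round to 16 bits with no dither: the rounding error is correlated with the
# signal and shows up as harmonic distortion.
undithered = np.round(signal / lsb) * lsb

# TPDF dither: add triangular noise spanning +/- 1 LSB before rounding, which
# turns the error into uncorrelated, signal-independent noise.
tpdf = (rng.uniform(-0.5, 0.5, fs) + rng.uniform(-0.5, 0.5, fs)) * lsb
dithered = np.round((signal + tpdf) / lsb) * lsb

def level_db(x, freq):
    """Spectrum magnitude (dBFS) at a given frequency."""
    win = np.hanning(len(x))
    spec = np.abs(np.fft.rfft(x * win)) * 2 / win.sum()
    return 20 * np.log10(max(spec[int(round(freq * len(x) / fs))], 1e-12))

# Without dither the 3rd harmonic (3 kHz) is a distinct distortion product;
# with dither it is buried in a low, featureless noise floor (the exact
# dithered reading varies a little from run to run).
print(f"3 kHz, undithered: {level_db(undithered, 3000):.1f} dBFS")
print(f"3 kHz, dithered:   {level_db(dithered, 3000):.1f} dBFS")
```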
It's possible to hear this quantization noise if you listen at what would be extremely high, ear-damaging volumes with normal music - that is, by turning very quiet passages up to very loud levels. But at normal listening volumes, with level matching and proper dither (not always a given), the difference between 24-bit and 16-bit audio is for all intents and purposes inaudible. Also, intermodulation distortion in playback equipment can occasionally create audible artifacts at higher sample rates (e.g. 96 kHz vs. 44.1 kHz), but that has nothing to do with the higher sample rates themselves being audible.
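As a quick sanity check on the numbers (the playback calibration here is an assumed figure, purely for illustration): even if digital full scale is played back at a painful 105 dB SPL, the dithered 16-bit noise floor sits roughly 95 dB lower, well under the ambient noise of a quiet room.

```python
# Quick arithmetic, assuming an illustrative calibration where digital full
# scale plays back at 105 dB SPL (already very loud).
peak_spl = 105.0
ideal_16bit_range = 6.02 * 16 + 1.76    # ~98 dB for an ideal 16-bit quantizer
dither_penalty = 3.0                    # flat TPDF dither raises the floor a few dB
noise_floor_spl = peak_spl - (ideal_16bit_range - dither_penalty)
print(f"16-bit noise floor: about {noise_floor_spl:.0f} dB SPL")
# A quiet listening room has an ambient noise level of around 30 dB SPL, so
# the quantization noise is far below anything you could actually hear.
```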
Similarly, different masters can very easily sound different. This is why high-resolution audio sounds better most of the time - not because it is high resolution, but because it was better mastered to begin with. That alone is a good enough reason to buy high-resolution releases, but if the same master is available at 16/44.1 (or even 24/96) as well as 24/192, it's safe to go with the 16/44.1 files.
This is a good overview of the topic:
http://www.head-fi.org/t/415361/24bit-vs-16bit-the-myth-exploded
And hydrogenaudio extensively covers ABX'ing high resolution versus CD resolution audio:
http://www.hydrogenaudio.org/forums/index.php?showtopic=49843