24bit vs 16bit, the myth exploded! -CONCLUSIONS-
Apr 23, 2015 at 4:46 PM Thread Starter Post #1 of 3

cdsa35000
500+ Head-Fier · Joined Oct 14, 2014 · Posts: 579 · Likes: 146
Conclusions:

In a naturalistic survey of 140 respondents using high-quality musical samples sourced from high-resolution 24/96 digital audio, collected over 2 months, there was no evidence that 24-bit audio could be appreciably differentiated from the same music dithered down to 16 bits using a basic algorithm (Adobe Audition 3, flat triangular dither, 0.5 bits).

http://archimago.blogspot.ca/2014/06/24-bit-vs-16-bit-audio-test-part-i.html

http://archimago.blogspot.nl/2014/06/24-bit-vs-16-bit-audio-test-part-ii.html
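
For reference, the "flat triangular dither" mentioned above is TPDF (triangular probability density function) dither added before requantization. Here is a minimal sketch of the idea in Python, assuming a textbook +/-1 LSB TPDF amplitude rather than Audition 3's exact 0.5-bit setting; the function and variable names are illustrative, not Audition's:

    import numpy as np

    # Sketch of requantizing 24-bit samples to 16 bits with TPDF
    # (triangular) dither. The +/-1 LSB dither amplitude is the
    # textbook choice, not necessarily Audition 3's 0.5-bit setting.
    def tpdf_dither_to_16bit(samples_24bit):
        step = 256.0  # one 16-bit LSB in 24-bit units: 2^(24-16)
        rng = np.random.default_rng()
        # Sum of two independent uniform variables -> triangular PDF
        # spanning +/-1 LSB of the 16-bit target grid.
        dither = (rng.uniform(-0.5, 0.5, samples_24bit.shape) +
                  rng.uniform(-0.5, 0.5, samples_24bit.shape)) * step
        # Round to the nearest 16-bit step, then clip to int16 range.
        q = np.round((samples_24bit + dither) / step)
        return np.clip(q, -32768, 32767).astype(np.int16)

The point of summing two uniform variables is that the resulting quantization error is decorrelated from the signal, so truncating to 16 bits leaves a benign, noise-like floor instead of correlated distortion.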
 
Apr 24, 2015 at 3:43 AM Post #2 of 3
This test does not actually prove that 16-bit resolution is transparent at a 44.1 kHz sample rate (where it is most often used). That is because at 96 kHz the dithered 16-bit file has a lower noise density in the audio band, making it roughly equivalent to 16.56 bits at 44.1 kHz. It is not a major difference, though, and 44.1/16 would likely have been transparent as well, but for those who want to nitpick, a 3.38 dB lower noise floor in the audio band could in theory make the difference between barely audible and inaudible.
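
For anyone checking the arithmetic: with dither, the quantization noise power is spread flat from DC to half the sample rate, so at 96 kHz only a fraction of it falls inside the 0-22.05 kHz audio band. A quick sketch in Python (the 6.02 dB-per-bit figure is the standard quantization SNR rule of thumb):

    import math

    # Back-of-the-envelope check, assuming ideal dithered quantization
    # noise spread flat from DC to fs/2 and an audio band of 22.05 kHz
    # (half of 44.1 kHz).
    fs_hi, fs_cd = 96_000, 44_100
    audio_band = fs_cd / 2  # 22.05 kHz

    # At 96 kHz the same total noise power is spread over a wider band,
    # so less of it lands inside the audio band:
    advantage_db = 10 * math.log10((fs_hi / 2) / audio_band)
    print(round(advantage_db, 2))              # ~3.38 dB

    # At ~6.02 dB per bit, that is equivalent to:
    print(round(16 + advantage_db / 6.02, 2))  # ~16.56 bits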
 
Apr 26, 2015 at 2:16 PM Post #3 of 3
Originally Posted by stv014:
 
for those who want to nitpick, a 3.38 dB lower noise floor in the audio band could in theory make the difference between barely audible and inaudible.

 
Not bloody likely! If someone wants to nitpick over a difference that minute, that far down in level, they probably aren't worth listening to, because they are obviously clueless about how the measurements relate to what we actually hear.
 
