
24bit vs 16bit, the myth exploded!

Discussion in 'Sound Science' started by gregorio, Mar 19, 2009.
  1. stonesfan129
    Next thing you know, people will be saying 24 bits isn't enough, we need to have 32-bit music.
     
  2. 71 dB
    With shaped dither that's 200 dB of dynamic range! Rocket launches and heartbeats at natural levels in the same recording! With efficient horn speakers all it takes is a few gigawatts of power to play such recordings and kill humans and animals within 100 yards. Yeah, definitely 32 bit music is needed! :muscle:
     
    Avi, mr.karmalicious and shwaz like this.
  3. stonesfan129
    It's just silly how the people making these outlandish claims, that anything beyond CD quality makes an audible difference, can never prove it.
     
  4. miksu8
    In theory 16 bits is enough, but what about the real world? A bad 16-bit transfer from a 24-bit master, whether through incompetence or intentionally to help sell the HD version. Has anyone done blind tests or analyses?

    Edit: Sorry looks like this is discussed here already.
     
    Last edited: Jan 14, 2019
  5. castleofargh Contributor
    To go from a 24-bit file to a 16-bit file there is nothing to do but decide if, and which type of, dither you wish to use (a choice that under most conditions you will not notice as sounding any different). Any time a hi-res version is made to sound different from the CD release, it is of course intentional, be it malpractice or simply that a different master was created because somebody asked for it.
     
    miksu8 likes this.
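    As a toy illustration of the point above, here is what "deciding on a dither" amounts to when requantizing 24-bit samples to 16 bits (a minimal sketch using standard TPDF dither; the function name and the use of Python's stdlib `random` are my own choices, not anything prescribed in the thread):

    ```python
    import random

    def tpdf_dither_24_to_16(sample_24: int) -> int:
        """Requantize one signed 24-bit sample to 16 bits with TPDF dither.

        Dropping from 24 to 16 bits discards 8 LSBs (a divide by 256).
        TPDF dither adds the sum of two independent uniform random values,
        each spanning half a 16-bit quantization step, which decorrelates
        the rounding error from the signal (trading distortion for a
        constant, benign noise floor).
        """
        step = 1 << 8  # one 16-bit LSB expressed in 24-bit units (256)
        dither = (random.uniform(-0.5, 0.5) + random.uniform(-0.5, 0.5)) * step
        quantized = round((sample_24 + dither) / step)
        return max(-32768, min(32767, quantized))  # clamp to signed 16-bit
    ```

    Undithered conversion would simply round, leaving the quantization error correlated with the signal; the dithered version has a slightly higher but signal-independent noise floor, which is the whole "choice" being described.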
  6. Slaphead
    32 bit probably wouldn't need dither - around 194 dB peak is the maximum an undistorted sound wave can theoretically have in the Earth's atmosphere, and 32 bit already gives you 192 dB - not counting intersample peaks.
     
    mr.karmalicious likes this.
  7. bigshot
    You can also puzzle out the numbers and see that 16 bits is plenty. With dither, a CD can do 90 dB of dynamic range. Your listening conditions probably have a noise floor above 30 dB, so to hear the full range of a CD, you would have to boost the level of the quietest sound above that noise floor, bringing the peaks to at least 120 dB. Coincidentally, 120 dB is the threshold of pain, and listening to sound that loud can cause hearing damage. In truth, 12-bit sound is probably enough. For more info see the article in my sig called CD Sound Is All You Need.
     
    Last edited: Jan 14, 2019
    LaughMoreDaily likes this.
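    The arithmetic in the post above follows the usual rule of thumb of roughly 6 dB of dynamic range per bit (20·log10(2) ≈ 6.02 dB per bit; a rough sketch that ignores the +1.76 dB sinusoid term and any dither penalty):

    ```python
    import math

    def dynamic_range_db(bits: int) -> float:
        """Theoretical quantization dynamic range for `bits` of depth,
        using the ~6.02 dB/bit rule (no dither, no noise shaping)."""
        return 20 * bits * math.log10(2)

    # 16-bit CD audio: ~96 dB before dither; 12 bits still gives ~72 dB,
    # which already exceeds what most real listening rooms can reveal
    print(round(dynamic_range_db(16)))  # → 96
    print(round(dynamic_range_db(12)))  # → 72
    ```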
  8. 71 dB
    Even below 194 dB we start to see the effects of non-linear acoustics, at levels of ~160 dB.
     
    mr.karmalicious likes this.
  9. 71 dB
    1. Not plenty enough for those who think more is (always) better. 16 bits is plenty for those who understand digital audio or believe those who know this stuff.

    2. TPDF dither gives 95 dB of theoretical dynamic range, 3 dB less than just truncation error without dither (98 dB), but by sacrificing 3 dB of dynamic range we get rid of distortion in the signal (we have total linearity). Using shaped dither we can have 20 dB (!) more perceptual dynamic range. Signals decay into the noise floor the way they do in analog audio until completely masked by the dither noise (but of course you need CRAZY volume settings to hear quiet things at signal levels of -110…-120 dBFS).

    3. Noise floor at 30 dB and peaks of music at 110 dB or less (sane listening that doesn't make you lose your hearing) means 80 dB or less of dynamic range needed. That translates into 13 bits, but for "DR6 pop" of today 8 bits with shaped dither would be just fine! By comparison, vinyl audio has "10 bits" worth of dynamic range at best.
     
    Steve999 likes this.
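    The "80 dB of range → about 13 bits" step above is the same ~6.02 dB-per-bit rule run in reverse (a rough sketch; the function name is mine):

    ```python
    import math

    DB_PER_BIT = 20 * math.log10(2)  # ≈ 6.02 dB per bit

    def approx_bits(dynamic_range_db: float) -> float:
        """Bit depth roughly equivalent to a given dynamic range in dB."""
        return dynamic_range_db / DB_PER_BIT

    # 110 dB peaks over a 30 dB room noise floor: 80 dB of useful range,
    # i.e. roughly 13 bits -- comfortably inside what 16 bits delivers
    print(round(approx_bits(110 - 30)))  # → 13
    ```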
  10. old tech
    Ian Shepherd has posted a very good video, consistent with the OP.

    His demonstration, by reversing polarity, that the only difference between dithered 8 bits and 24 bits is noise is pure genius.

     
    Last edited: Feb 5, 2019
  11. gregorio
    To be fair, that reverse polarity test (called a "Null Test") is the first difference test taught to new audio engineering students and is used by almost all professional engineers on an almost daily basis. I've advocated its use on numerous occasions here on Head-Fi: it's quick, easy, completely reliable, accurate, entirely objective and doesn't cost anything (using free software). It's hardly ever even mentioned in the audiophile world though, and you're free to draw your own conclusions as to why!

    G
     
    mr.karmalicious likes this.
  12. old tech
    Yes, but using the null test along with music samples as a simple demonstration that 8 bits has identical resolution to 24 bits gets the message across very effectively.
     
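    The null test these posts describe is easy to reproduce: requantize a signal to 8 bits with TPDF dither, invert the polarity of one copy and sum (equivalently, subtract sample by sample), and the residual is nothing but low-level noise (a toy stdlib sketch; the sine-wave signal, sample rate and names are illustrative assumptions):

    ```python
    import math
    import random

    def null_test(track_a, track_b):
        # inverting the polarity of track_b and summing == subtraction:
        # everything common to both tracks cancels exactly
        return [a - b for a, b in zip(track_a, track_b)]

    # reference signal: a 440 Hz sine in [-1.0, 1.0] at 44.1 kHz
    ref = [math.sin(2 * math.pi * 440 * n / 44100) for n in range(1000)]

    # requantize to 8 bits (256 levels) with TPDF dither
    step = 2.0 / 256
    dithered_8bit = [
        round((s + (random.uniform(-0.5, 0.5)
                    + random.uniform(-0.5, 0.5)) * step) / step) * step
        for s in ref
    ]

    residual = null_test(ref, dithered_8bit)
    # the difference is bounded noise, uncorrelated with the music
    assert max(abs(r) for r in residual) < 2 * step
    ```

    With real files the same subtraction is done in a DAW or editor by flipping the polarity of one track and summing the two; if only noise remains, the two versions carry identical signal.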
  13. Peter Hyatt

    That **** Qobuz marketing got me!
     
  14. visanj
    I purchased the same song as the iTunes (MFiT) version, as 24/96, 24/44 and 16/44 FLAC from Qobuz, and as LAME MP3 from Google Play Music.

    What I noticed is that the quality differs based on the equipment used.

    Sound quality:
    With Bluetooth (Oneplus Wireless 2 - aptX HD):
    24/44 > 16/44 > MFiT > 24/96 > MP3

    With Brainwavz B200 + iBasso DC01 + Comply Audio Pro:
    24/44 > 24/96 > 16/44 > MFiT > MP3

    In all cases, I feel 24/44 sounds better than even 24/96. I don't know why. 24/96 sounds as if it has some noise at the higher frequencies (not clear).
     
    LaughMoreDaily likes this.
  15. gregorio
    1. We have to be careful with statements like this. Have you ruled out the other possibilities? For example, are you certain they're all exactly the same master? As MFiT means "Mastered for iTunes", it's very possible they're slightly different masters.
    1a. With the Oneplus, you're not really comparing 24/44, 16/44, 24/96 and MP3; you're comparing a lossy, 576kbps codec derived from those original sample rates/bit depths. The aptX HD codec should be entirely transparent but again, we need to rule out the other possibilities before we can state that quality differences are due to something else (equipment differences).

    Having mentioned these possibilities, it's most probable there is an audible difference between the equipment. It's unlikely the two different IEMs have the same frequency response and those differences are almost certain to be within the threshold of audibility. Additionally, it's also likely that your IEMs have different sensitivity and therefore the difference may not be a difference in quality but just a difference in volume.

    2. It's possible there is some ultrasonic content (>21kHz) in the 24/96 version that is causing IMD (Inter-Modulation Distortion) in your amp sections or headphones (which obviously doesn't exist in the 24/44 version). It's also possible there is no audible difference and that what "you feel" is just a trick of your perception. There are other possibilities as well though, for example: slightly different masters again, a slight volume change in the conversion process or even that a resampling filter has been chosen which starts rolling off at a relatively low frequency.

    G
     
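    Point 2 above (ultrasonic content causing audible IMD in a non-linear playback chain) can be illustrated numerically: feed two ultrasonic tones through a mildly non-linear system and a difference tone appears in the audible band (a toy model; the 0.1·x² non-linearity, the tone frequencies and the helper names are invented for illustration):

    ```python
    import math

    fs = 192000                    # sample rate high enough to hold ultrasonics
    f1, f2 = 30000.0, 32000.0      # two ultrasonic tones, inaudible on their own
    n = 19200                      # 0.1 s window (integer cycles of every tone)
    x = [math.sin(2 * math.pi * f1 * t / fs) + math.sin(2 * math.pi * f2 * t / fs)
         for t in range(n)]

    # mildly non-linear amp/driver model: y = x + 0.1 * x^2
    y = [v + 0.1 * v * v for v in x]

    def tone_level(sig, f):
        """Estimate the amplitude of a tone at frequency f by correlation."""
        i = sum(s * math.cos(2 * math.pi * f * t / fs)
                for t, s in enumerate(sig)) * 2 / len(sig)
        q = sum(s * math.sin(2 * math.pi * f * t / fs)
                for t, s in enumerate(sig)) * 2 / len(sig)
        return math.hypot(i, q)

    # the f2 - f1 = 2 kHz difference tone exists only after the non-linearity
    assert tone_level(y, f2 - f1) > 100 * tone_level(x, f2 - f1)
    ```

    In a real chain the non-linearity lives in the amp or driver, and the difference tone lands squarely in the audible band even though neither source tone does, which is one mechanism by which a 24/96 file could genuinely sound (slightly) different without carrying any audible extra information.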
