24bit vs 16bit, the myth exploded!
Jul 16, 2023 at 4:27 AM Post #6,691 of 7,175
And it will go on for another 14 years and probably longer, though not necessarily in this thread. There’s no choice! As long as hi-res exists and someone/some companies can charge more for it, there will be marketing which falsely states it’s audibly better in order to justify the higher price. This will continue ad infinitum, as long as enough audiophiles fall for it to make it profitable to pay for that marketing.
Well, even though the main premise was about audio, you also tried to equate it to graphics, and as I remember, that was the first time I chimed in on this thread. As this thread ages, that analogy becomes even more dated. That premise was dated then: when it comes to "it's easy to see a low bit depth image", you're assuming the 80s or 90s, when images had limited color palettes and couldn't reach 24-bit color (or 32-bit when you add an alpha mask channel). So for quite a while, computer graphics in consumer home environments were in the realm of 8 bits per channel. With commercial video, there were 10-bit and 12-bit per channel standards in the 2000s, and many consumers now also understand this as HDR video when they're buying 4K movies. So as someone focused on video, I do see many upgrades in consumer standards there. But also with audio: there are the new 3D audio formats of Atmos/DTS:X/Auro-3D (not tied to a bit depth, but to how the 3D metadata carries through).
 
Jul 16, 2023 at 5:37 AM Post #6,693 of 7,175
Well, even though the main premise was about audio, you also tried to equate it to graphics, and as I remember, that was the first time I chimed in on this thread.
No, I didn’t try to equate it with graphics. I just used a simple analogy that most would understand.
That premise was dated then:
No, it wasn’t. Most people in 2009 would be familiar with low bit depth images and video. Although high bit depth images and HD video were available and quite common, there was still quite a lot of SD video content available/broadcast, and most would remember the highly compressed cell phone photos from not long before 2009. Even today, most people have experience of SD video, even if just from having seen archive footage, older music videos, etc.

G
 
Last edited:
Jul 16, 2023 at 5:57 AM Post #6,694 of 7,175
No, I didn’t try to equate it with graphics. I just used a simple analogy that most would understand.

G
Again, I really think "it's easy to see a low bit depth image" speaks to a person who lived through an age when computer graphics hadn't yet reached the standard 8 bits per channel. That wouldn't be a millennial, for whom images were always 8bpc. I also see in this response that you're confusing "HD" with "HDR": HDR refers to color depths beyond 8bpc, while HD or SD refers to resolution (be it 1080p or 480p).
 
Jul 16, 2023 at 6:14 AM Post #6,695 of 7,175
I mean, his initial post had a point. The more headroom you use to record, the more dynamic range you are wasting, so if you use a higher bit depth while recording, it makes total sense.

But after mixing and mastering are done, there is zero advantage in using more than 16 bits, because you eliminate that headroom in the process. That is also the reason why 250 mW @ 16 Ω is enough to drive all headphones on the market.
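For reference, the textbook figure behind that (a rough back-of-the-envelope sketch in Python; the 6.02·N + 1.76 dB formula is the ideal-quantizer approximation, and real converters fall somewhat short of it):

```python
# Theoretical dynamic range of an ideal N-bit quantizer: ~6.02*N + 1.76 dB
for bits in (16, 24):
    print(f"{bits}-bit: ~{6.02 * bits + 1.76:.1f} dB")
# 16-bit: ~98.1 dB, 24-bit: ~146.2 dB
```

So even if you burn, say, 20 dB of headroom while tracking, 24 bits still leaves well over 120 dB of usable range, while the finished 16-bit master's ~98 dB already exceeds what any playback chain or listening room can deliver.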

And unless you use a Blu-ray, bit depth plays no role anyway.

Streaming services use, on average, 20 Mbps for Full HD, while a Full HD Blu-ray easily has 130 Mbps.

Even 4K only uses 40–60 Mbps on major streaming services.

So as long as streaming services beat the living crap out of the image quality, there is no point in even thinking about color bit depth.
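Just to illustrate the gap (rough arithmetic, assuming 1080p at 24 fps; the bitrates are the averages quoted above):

```python
# Average compressed bits per pixel at 1080p / 24 fps
w, h, fps = 1920, 1080, 24
for name, mbps in (("streaming", 20), ("Blu-ray", 130)):
    print(f"{name}: {mbps * 1e6 / (w * h * fps):.2f} bits/pixel")
# streaming: 0.40 bits/pixel, Blu-ray: 2.61 bits/pixel
```

That's over six times more data per pixel on the disc, before you even get to color bit depth.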

But that is a whole different topic altogether. So yes, 24-bit/192 kHz makes total sense from a recording standpoint, but none when you're listening to a master that is all done and designed to be listened to.
 
Last edited:
Jul 16, 2023 at 6:25 AM Post #6,696 of 7,175
I mean, his initial post had a point. The more headroom you use to record, the more dynamic range you are wasting, so if you use a higher bit depth while recording, it makes total sense.

But after mixing and mastering are done, there is zero advantage in using more than 16 bits, because you eliminate that headroom in the process. That is also the reason why 250 mW @ 16 Ω is enough to drive all headphones on the market.

And unless you use a Blu-ray, bit depth plays no role anyway.

Streaming services use, on average, 20 Mbps for Full HD, while a Full HD Blu-ray easily has 130 Mbps.

Even 4K only uses 40–60 Mbps on major streaming services.

So as long as streaming services beat the living crap out of the image quality, there is no point in even thinking about color bit depth.

But that is a whole different topic altogether. So yes, 24-bit/192 kHz makes total sense from a recording standpoint, but none when you're listening to a master that is all done and designed to be listened to.
The best streaming audio is Dolby Digital+ (with Atmos, if included). Please, let's keep this separate from video standards… which is another kettle of fish (but I'll chime in if you want to go there).
 
Jul 16, 2023 at 6:31 AM Post #6,697 of 7,175
The best streaming audio is Dolby Digital+ (with Atmos, if included). Please, let's keep this separate from video standards… which is another kettle of fish (but I'll chime in if you want to go there).
The best streaming is from local storage to the memory of the device :wink:
 
Jul 16, 2023 at 7:47 AM Post #6,698 of 7,175
I really think "it's easy to see a low bit depth image" speaks to a person who lived through an age when computer graphics hadn't yet reached the standard 8 bits per channel. That wouldn't be a millennial, for whom images were always 8bpc.
Firstly, are you claiming that millennials have never seen a lower bit depth image or SD video and therefore couldn’t appreciate the example? Secondly, I obviously did not write the OP for nine-year-olds (or younger)!!
I also see in this response that you're confusing "HD" with "HDR": HDR refers to color depths beyond 8bpc, while HD or SD refers to resolution (be it 1080p or 480p).
I’m not confusing anything with HDR; I didn’t even mention HDR, SD, HD or video of any sort in my OP! However, SD vs HD video would have been a reasonable example in the context of the OP, which was about resolution and number of bits.

G
 
Last edited:
Jul 16, 2023 at 8:45 AM Post #6,700 of 7,175
So yes, 24-bit/192 kHz makes total sense from a recording standpoint...
24 bit makes total sense in recording, but 192 kHz sampling makes sense only in special circumstances, such as recording bats using ultrasound to navigate between the trees and using it as a sound effect slowed down by 6 octaves. Not very common in sound productions, I think!
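Just to put numbers on that (a quick sketch; the 60 kHz call frequency is only an illustrative figure):

```python
# Slowing playback by 6 octaves divides every frequency by 2**6 = 64
fs = 192_000         # capture sample rate, so the Nyquist limit is 96 kHz
bat_call = 60_000    # illustrative bat echolocation frequency in Hz
print(bat_call / 2**6)   # 937.5 Hz  -> lands comfortably in the audible band
print(fs / 2 / 2**6)     # 1500.0 Hz -> even Nyquist maps down to 1.5 kHz
```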
 
Jul 16, 2023 at 9:55 AM Post #6,701 of 7,175
24 bit makes total sense in recording, but 192 kHz sampling makes sense only in special circumstances, such as recording bats using ultrasound to navigate between the trees and using it as a sound effect slowed down by 6 octaves. Not very common in sound productions, I think!
But it might happen. Audio recording is not only done by musicians. It is very rare, though.

Also, we established in an earlier post that the Nyquist–Shannon sampling theorem is only lossless if you have a perfect filter, which does not exist in the real world, as it would have an infinite runtime and so never produce any sound.

As the imperfect filters we have vary in quality, you might want to use a higher sample rate out of sheer distrust of the filter a person might use in a later process you have no influence over. Or maybe the filter you have yourself is garbage, but you're forced to use it for whatever reason.

If you just resample in Audacity, you already get a measurable degradation in quality that should, in theory, not exist. It's inaudibly low, but it does exist.
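Anyone who wants to check that claim can do so with a minimal numpy/scipy sketch (assuming a 44.1 kHz → 48 kHz → 44.1 kHz round trip, a typical resampling case):

```python
import numpy as np
from scipy.signal import resample_poly

fs = 44100
t = np.arange(fs) / fs                    # 1 second of test signal
x = 0.5 * np.sin(2 * np.pi * 1000 * t)    # 1 kHz sine at -6 dBFS

# 44.1 kHz -> 48 kHz -> 44.1 kHz (48000/44100 reduces to 160/147)
y = resample_poly(resample_poly(x, 160, 147), 147, 160)

# Residual error, trimming the filter's edge transients
err = (x - y)[1000:-1000]
print(20 * np.log10(np.std(err) / 0.5))   # error level in dB re. the signal
```

The exact figure depends on the resampler's filter quality, but the residual is measurable and far below audibility, which matches the claim.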

But I do admit that this is extremely unlikely, and 96 kHz is still more than enough for these cases.

But recording companies are commercial companies. They have to calculate the risk: how much does it cost to gather the musicians and make them perform again, and/or have someone resample the recording with a different filter, versus how much does the additional disk space for a 192 kHz recording cost?

But yes, the chance that you'll ever actually need 192 kHz is extremely low. But as this is the default frequency of most digital interfaces I have encountered so far, you don't want to take the risk. If the guy who made the hardware said "We recommend 192 kHz, otherwise we can't guarantee that there will be no issues", you are using 192 kHz in a multi-million-dollar project :D that is for sure.
 
Jul 16, 2023 at 11:32 AM Post #6,702 of 7,175
But it might happen. Audio recording is not only done by musicians. It is very rare, though.
It is not either/or. You can use 192 kHz when recording bats and 44.1 kHz when recording a cellist.

Also, we established in an earlier post that the Nyquist–Shannon sampling theorem is only lossless if you have a perfect filter, which does not exist in the real world, as it would have an infinite runtime and so never produce any sound.
The question is how perfect the filters have to be to be perfect for human ears. Those who make money selling hi-res music think those filters are not perfect enough. Sound engineers in general think they are perfect enough. Go figure...

Yes, in theory the filters should be infinitely long, which sounds like a really long time, but in practice the short filters that are actually used can get extremely close to perfection.
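To put a number on "extremely close" (a sketch using scipy's Kaiser-window design; the 120 dB stopband and ~2 kHz transition band are just example targets, not anyone's actual product spec):

```python
from scipy import signal

fs = 44100.0
# Pass everything below 20 kHz, reach 120 dB of rejection by ~22 kHz
# (120 dB is already well below the 16-bit noise floor)
width = (22050.0 - 20000.0) / (fs / 2)          # normalized transition width
numtaps, beta = signal.kaiserord(120.0, width)  # taps needed for that spec
taps = signal.firwin(numtaps, 21000.0, window=("kaiser", beta), fs=fs)
print(numtaps, "taps =", round(numtaps / fs * 1000, 2), "ms long")
```

That works out to a couple of hundred taps, i.e. a few milliseconds: the "infinitely long" filter of the theory collapses to something trivially short in practice.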

As the imperfect filters we have vary in quality, you might want to use a higher sample rate out of sheer distrust of the filter a person might use in a later process you have no influence over. Or maybe the filter you have yourself is garbage, but you're forced to use it for whatever reason.
In my opinion, these things are the least of the problems and challenges in music production. I would be much more concerned about the ability of the drummer to play well and the placement of the mics, etc., not to mention how the music is mixed and mastered. It is a manufactured problem, manufactured by hi-res sellers.

Even if you want to avoid this "problem", using 88.2 kHz or 96 kHz would solve it. No need to use 192 kHz.

If you just resample in Audacity, you already get a measurable degradation in quality that should, in theory, not exist. It's inaudibly low, but it does exist.
Resample from what to what and why?

But I do admit that this is extremely unlikely, and 96 kHz is still more than enough for these cases.
Yes, 96 kHz is more than enough unless you are recording bats (I suppose...)

But recording companies are commercial companies. They have to calculate the risk: how much does it cost to gather the musicians and make them perform again, and/or have someone resample the recording with a different filter, versus how much does the additional disk space for a 192 kHz recording cost?
I am not here to give financial advice to record companies. I am here to tell you what the science says. Recording at 192 kHz may make financial sense (I am not an entrepreneur myself and understand nothing about the commercial aspects), but it doesn't make sense otherwise.

But yes, the chance that you'll ever actually need 192 kHz is extremely low. But as this is the default frequency of most digital interfaces I have encountered so far, you don't want to take the risk. If the guy who made the hardware said "We recommend 192 kHz, otherwise we can't guarantee that there will be no issues", you are using 192 kHz in a multi-million-dollar project :D that is for sure.
Still doesn't make sense from technical point of view.
 
Last edited:
Jul 16, 2023 at 1:30 PM Post #6,703 of 7,175
Firstly, are you claiming that millennials have never seen a lower bit depth image or SD video and therefore couldn’t appreciate the example? Secondly, I obviously did not write the OP for nine-year-olds (or younger)!!

I’m not confusing anything with HDR; I didn’t even mention HDR, SD, HD or video of any sort in my OP! However, SD vs HD video would have been a reasonable example in the context of the OP, which was about resolution and number of bits.

G
You honestly think a millennial is 9 years old?? It's someone born in the mid-80s to the 90s... so the oldest millennials are around 40 years old!

Secondly, you're still not understanding what HDR is, since you keep confusing it with SD or HD resolution. For the umpteenth time, HDR has to do with an image format that goes beyond 8 bits per color channel. Your OP is dated because it assumes the reader was old enough in the 80s to remember color systems with less than 8 bits per channel: Windows 3.1 systems that didn't have a 24-bit or 32-bit color space, or early internet pages with dithered GIFs or PNGs. Today, about the only image format still lower than 8bpc is animated GIF (where you may need to reduce colors to get a reasonable file size for the animation's duration).
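For anyone who never lived with sub-8bpc graphics, the effect is easy to recreate (a minimal numpy sketch; the gradient is just a stand-in for any smooth area of a photo):

```python
import numpy as np

def quantize(img: np.ndarray, bits: int) -> np.ndarray:
    """Requantize an 8-bit-per-channel image to `bits` bits per channel."""
    step = 256 // (2 ** bits)
    return ((img // step) * step + step // 2).astype(np.uint8)

# A smooth horizontal gradient shows obvious banding at 4 bits per channel
gradient = np.tile(np.linspace(0, 255, 512).astype(np.uint8), (64, 1))
banded = quantize(gradient, 4)
print(len(np.unique(gradient)), "levels ->", len(np.unique(banded)), "levels")
```

View `banded` as an image and you get exactly the stair-step banding of those old palettes and dithered GIFs.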
 
Jul 16, 2023 at 1:35 PM Post #6,704 of 7,175
The best streaming is from local storage to the memory of the device :wink:
OK, fair enough if you're streaming ripped movies on Plex. You can also rip UHD movies, so movie enthusiasts do think about bit depth (i.e. whether they can rip HDR10 or Dolby Vision).
 
Jul 16, 2023 at 6:35 PM Post #6,705 of 7,175
For the umpteenth time, HDR has to do with an image format that goes beyond 8 bits per color channel. Your OP is dated because it assumes the reader was old enough in the 80s to remember color systems with less than 8 bits per channel.
No, it does not! My OP assumes a low res image has fewer bits than a hi-res image and that the difference in resolution can be visually discerned. There are obviously fewer such easily discernible low res images around today than there were in 2009, but I’d still expect most readers today to have experienced them and therefore to understand/appreciate the example.

For the umpteenth time, it’s got nothing to do with HDR and I’ve never even mentioned HDR!!

G
 
