24bit vs 16bit, the myth exploded!
Post #5,371 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
1. I don't necessarily assume other people are ignorant of the facts! If someone makes an assertion that is incorrect, I assume either that they're ignorant of the facts, that they know the facts but don't really understand them, or that they know and understand the facts but are erroneously dismissing them for some reason. What rational alternative am I missing?
1a. I didn't state your post didn't have some clarity and I didn't dispute this assertion, only some specific facts within your assertion, your analogy with digital imaging and your conclusion/assertion of there being "no final answer".

2. More than a few I should imagine!

3. But I didn't say that! What I effectively (tried to) say is that to begin with we do NOT even have audio, we have sound pressure waves that need to be transduced into audio, and only then can it be converted to digital. Therefore:
3a. "No", because photography does not have to be transduced in the same way. Both digital audio and (as I understand it) digital imaging involve an analogue stage and then an ADC, but a major difference is that digital imaging involves converting between different types of the same form of energy (light and electricity, which are both electromagnetic energy), while digital audio recording involves converting between two different forms of energy (mechanical and electromagnetic).
3b. No, a photodiode is very significantly different to a microphone. As far as I'm aware, photodiodes are microscopic solid-state devices with no moving parts, which operate at the quantum level, converting photons into electrons. Microphones do NOT conduct electricity based on the intensity of sound; they generate electricity based on the "intensity" of movement of a diaphragm. A microphone therefore has to first convert variations in sound pressure into the mechanical motion of a diaphragm, and then convert that mechanical motion into electricity. So we're dealing with relatively huge mechanical devices, subject to all the limitations of the laws of physical motion/transduction, all of which results in relatively massive inefficiency (compared to image sensors) in the generated analogue signal, which is then of course the input for conversion to digital data. In the practical application of digital audio, it's this inefficiency which defines system limitations, not the ADC, DAC or number of bits (beyond 16). A somewhat better analogy with an image sensor would have been a tape recorder, which also converts between different types of the same form of energy (electrical and magnetic), but it's still a rather poor analogy, as tape recorder performance is still reliant on mechanical forces (physical properties of the tape itself, plus friction, tape alignment, motor/speed, etc.).

4. Another significant difference! Over the years, digital audio has changed significantly but the resultant output resolution and dynamic range has barely changed at all. Even in the earliest days of consumer digital audio (CD) the hardware was capable of 16 bits, near perfect resolution and a dynamic range in excess of both the limitations imposed in practice by microphones or that would be experienced in the real world (at a gig).

5. And another reason to be wary of comparing digital imaging with digital audio, which I've already mentioned. With digital imaging, 16 million colours (24bit) still does not cover all the capabilities of the human eye. If, as you say, it only provides 8 stops and the human eye is capable of 20 stops, then it's still a long way from the capabilities of the human eye. For the human ear though, 16bit is already beyond its capabilities, and a long way beyond "comfortable".
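The numbers behind point 5 are easy to check with back-of-envelope arithmetic; a minimal sketch (the function names are my own, not from any library):

```python
import math

def dynamic_range_db(bits):
    # Theoretical dynamic range of linear PCM: roughly 6.02 dB per bit.
    return 20 * math.log10(2 ** bits)

def stops(bits_per_channel):
    # A photographic "stop" is a doubling of light: log2 of the level count.
    return math.log2(2 ** bits_per_channel)

print(round(dynamic_range_db(16), 1))  # 96.3 dB -- already past comfortable listening levels
print(round(dynamic_range_db(24), 1))  # 144.5 dB -- beyond any microphone or room
print(stops(8))                        # 8.0 stops per linearly encoded 8-bit channel
```

This is why the thread's two media behave differently: 8 stops per channel falls well short of the eye's claimed 20, while 96 dB already exceeds the usable range of domestic playback.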

G
The difference is I clearly was referring to audio processors, and you conflated my short statement with file bit depth. Light is not the same form of energy as electricity. The point was that although not the same, there is still a conversion to electric signal and an analog to digital conversion happening in digital photography. And again, modern images are thought of in bits per channel: not total bits of image (for example, there are images that are a total of 32bit with 8bpc RGB and alpha transparency). And I can leave it at that.
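The bits-per-channel arithmetic in the reply above can be sketched in a few lines (a minimal illustration, not any particular imaging API):

```python
# Total image bit depth = bits per channel x number of channels.
def total_bits(bits_per_channel, channels):
    return bits_per_channel * channels

assert total_bits(8, 3) == 24   # 8bpc RGB -> a "24-bit" image
assert total_bits(8, 4) == 32   # 8bpc RGB plus alpha -> a "32-bit" image
print(2 ** total_bits(8, 3))    # 16777216 distinct RGB colours
```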
 
Post #5,372 of 5,635

bigshot
Headphoneus Supremus · Joined Nov 16, 2004 · Posts 20,741 · Likes 3,087 · Hollywood USA · www.facebook.com
I don't see much relationship between resolution in photography and sound fidelity. With photographs you can get right up close and squint at a photo and see more resolution. With sampling rate, past the point your ears can hear, there is nothing you can do to perceive more zeros and ones. And with photos you can turn up the brightness to see detail in shadows, but with sound, if you turn up the volume to hear the wider bit depth, you get blasted deaf by the peaks.

But in either case, once you get beyond a certain point, for the purposes intended, it becomes pointless overkill.
 
Post #5,373 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
Replying to bigshot (post #5,372, above):
I wasn't talking about resolution. Dynamic range is a separate subject. With digital imaging, resolution is the number of pixels or dots in a given area. It's as different as resolution in sound is compared to dynamic range: instead of meaning the range from the softest to the loudest peak, DR in imaging is the value range from the blackest point to the brightest point. There is no such thing as infinite resolution: if you were able to go up close to a billboard (which can be as little as 20dpi), you would start seeing softness before seeing individual points.

Also, content creators need a larger dynamic range in their source files to be able to pull up detail in shadows or recover blown highlights (especially when converting HDR images to monitor color spaces). Try post-processing an image that doesn't have enough DR, and you'll see either noise or black splotches in shadows, as well as completely white splotches in highlights. This is just with photography: HDR is also used in 3D rendering for realistically simulating environmental conditions (where it's applicable to have DR that exceeds what the human eye is able to accommodate).

Digital displays are still in their infancy with HDR, and it will be interesting to see if images appear even more "lifelike" as contrast ranges continue to improve (and color grading might not side with high-key contrast for some content). I realize there is not a 1:1 comparison of digital sound reproduction to image reproduction. However, it is valid to understand the respective technology and see what analogies there are.
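The billboard point (resolution requirements depend on viewing distance) can be put in rough numbers. A sketch assuming the common ~1-arcminute visual acuity rule of thumb; the function name and that threshold are my assumptions:

```python
import math

# Print resolution needed for adjacent pixels to blur together,
# assuming ~1 arcminute of visual acuity (a common approximation).
def required_ppi(viewing_distance_inches):
    return 1 / (viewing_distance_inches * math.tan(math.radians(1 / 60)))

print(round(required_ppi(12)))   # 286 ppi for a print held about a foot away
print(round(required_ppi(600)))  # 6 ppi for a billboard viewed from 50 feet
```

So a 20dpi billboard looks sharp from across the street, and only goes soft when you walk right up to it.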
 
Post #5,374 of 5,635

bigshot
Headphoneus Supremus · Joined Nov 16, 2004 · Posts 20,741 · Likes 3,087 · Hollywood USA · www.facebook.com
But when dynamic range falls below a certain point, you can't see it or hear it without grossly adjusting the brightness or volume level. Dynamic range is only important if you are editing images or mixing a recording. After you finalize an image or export a mix, you don't need it at all. The eyes and ears can only detect a certain amount of dynamics at one time. The whole point of editing an image or mixing audio is to get the result within the comfortable range of the eyes and ears, and that is well within the range of a normal consumer image or sound format.
 
Post #5,375 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
Replying to bigshot (post #5,374, above):
Dynamic range still isn't useless to the consumer: there still needs to be a minimum. We haven't reached the DR limits of vision with imaging, and there are very different applications where resolution comes into play. A 46-megapixel camera is overkill for an image that gets scaled down to, say, a 1024px web image, but a person may still want the original size for making a high-quality print at a large size. With a static image, it is easier to layer different exposures to get detail in a dimly lit room as well as detail in a bright window. With single exposures or video, you may have to compromise and blow out parts of the scene to adequately expose for the subject.
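The megapixels-versus-print-size trade-off above works out roughly as follows; a sketch where the 3:2 aspect ratio and the 300dpi print target are my assumptions, not fixed rules:

```python
import math

# Rough print size from sensor megapixels, assuming a 3:2 frame
# and a 300 dpi print target.
def print_size_inches(megapixels, aspect=3 / 2, dpi=300):
    width_px = math.sqrt(megapixels * 1_000_000 * aspect)
    height_px = width_px / aspect
    return width_px / dpi, height_px / dpi

w, h = print_size_inches(46)
print(round(w, 1), "x", round(h, 1))  # roughly 27.7 x 18.5 inches
```

In other words, 46MP is wasted on a 1024px web image but is about right for a poster-sized print viewed up close.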

While imaging still has room for improvement, I would agree that we've reached the limits of fidelity with audio in relation to human perception... and I leave it to others who may want to expose themselves to sound peaks over 100dB, or want to argue which processing method (1bit vs 16bit vs 24bit) is "best".
 
Post #5,376 of 5,635

bigshot
Headphoneus Supremus · Joined Nov 16, 2004 · Posts 20,741 · Likes 3,087 · Hollywood USA · www.facebook.com
I think people are free to submit themselves to peaks over 100dB if they choose. If they expose other people to that, it might be classified as assault. For the purposes of listening to music in the home, CD quality sound is already overkill.
 
Post #5,377 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
Replying to bigshot (post #5,376, above):
At least one of the advantages of DR with imaging is that you still won't ruin your eyes staring at a TV (yes, even now with HDR displays, versus the old SD NTSC screens that moms said would ruin your eyes). As a small aside, I remember as a kid staring at the front of a microwave when my mom exclaimed I was ruining my eyes; even then, I knew enough about radiation and shielding. When it comes to music, CD can be a great standard. It's more convoluted now when it comes to streaming and compression, but at least now that high bandwidth is becoming the norm, we're not having issues with compression artifacts with music, let alone "acceptable" 4K HDR Atmos video.
 
Post #5,378 of 5,635

old tech
500+ Head-Fier · Joined Jul 8, 2015 · Posts 637 · Likes 270 · Sydney, Australia
Replying to Davesrose (post #5,377, above):
Off topic, but decades ago that whole thing about microwave ovens and lethal radiation was a widely accepted myth. I still remember the concerns people had with door seals and worrying about leaking radiation - even the possibility of radiation getting into the cooking food from normal use. Of course, microwave radiation is non-ionising and only likely to cause heat discomfort if one was very close to a badly leaking door.
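The non-ionising point can be put in numbers: the energy of a single 2.45 GHz photon is minuscule next to the energy needed to ionise a molecule. A rough sketch, where the ~10 eV ionisation threshold is a typical order-of-magnitude figure, not an exact constant:

```python
# Energy of a single microwave-oven photon, E = h * f.
PLANCK_EV_S = 4.135667696e-15   # Planck constant in eV*s
freq_hz = 2.45e9                # standard microwave-oven magnetron frequency

photon_ev = PLANCK_EV_S * freq_hz
print(photon_ev)  # ~1.0e-5 eV, roughly a millionth of a typical ~10 eV ionisation energy
```

No matter how intense the field, each photon carries far too little energy to break chemical bonds; it can only jostle molecules and heat them.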
 
Post #5,379 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
Replying to old tech (post #5,378, above):
There might have been more issues with the earliest microwaves, but I was a child of the 80s; by that point, the microwave was a fixture that was considered completely safe. My mom is now known as an actual Luddite (she's actually destroyed 2 computers by physical means). I think she believed microwaves could keep projecting radiation out of the front, where in fact the radiation is blocked by the metal mesh screen you see in the window (acting as a Faraday cage).
 
Post #5,380 of 5,635

old tech
500+ Head-Fier · Joined Jul 8, 2015 · Posts 637 · Likes 270 · Sydney, Australia
Replying to Davesrose (post #5,377, above):
Off topic # 2. I find your posts around photography and video quite interesting, it is not a subject I know much about. Tell me something, I upgraded my monitor the other week to a Dell 32" 4k screen. Why is it that I don't notice any difference to my previous 31" UHD monitor, even with video content? Do I need 4k content to appreciate any difference?
 
Post #5,381 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
Replying to old tech (post #5,380, above):
There are going to be some issues here. First, "UHD" is now essentially the consumer umbrella standard: it covers 4K and 8K as well. The minimum horizontal resolution for 4K "UHD" is 3840px, while cinema standards are different (they're not locked into a 16:9 aspect ratio, and they have a horizontal resolution of at least 4096px). I'm not sure how much resolution difference there can be between base "UHD" (3840px wide) and cinema "4K" (4096px wide); if it's really as minute as that small resolution shift, it wouldn't show much difference. I suspect that instead of resolution, you might be wondering about HDR, which can show clear differences. You mention a Dell screen: are you a Windows user? If so, you can have the HDR10 color space quite easily with the latest versions of Windows 10. If you're up to date, you should see HDR settings within display settings, which will let you adjust contrast for SDR vs HDR... and another great thing about Windows is that I find no problems playing movie files with TrueHD-based Atmos output.
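A quick sketch of the pixel grids in question (resolutions as published in the consumer UHD and DCI cinema standards), to show just how small the UHD-vs-DCI-4K gap is:

```python
# Consumer "UHD" vs cinema "DCI 4K" pixel grids.
formats = {
    "UHD (consumer 4K)": (3840, 2160),
    "DCI 4K (cinema)":   (4096, 2160),
    "UHD-2 (8K)":        (7680, 4320),
}
for name, (w, h) in formats.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} MP")

print(4096 / 3840 - 1)  # ~0.067 -- DCI 4K is only about 6.7% wider than consumer UHD
```

That ~7% width difference is why swapping between a "UHD" and a "4K" monitor of the same size shows no visible resolution change.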
 
Post #5,382 of 5,635

old tech
500+ Head-Fier · Joined Jul 8, 2015 · Posts 637 · Likes 270 · Sydney, Australia
Replying to Davesrose (post #5,381, above):
Cheers, that makes sense. The native resolution of the monitor is 3840 x 2160. I have the current version of windows 10 but use the HDR settings on the Nvidia control panel.

I thought the screen type would also have an impact on picture quality. I know it is not like for like, but to my eyes my 42" 1080p plasma TV has better picture quality than the monitor. A bit like I would expect a 4k OLED TV to have better picture quality than a 4k LCD TV.
 
Post #5,383 of 5,635

Davesrose
Headphoneus Supremus · Joined Oct 20, 2006 · Posts 4,624 · Likes 98
Replying to old tech (post #5,382, above):
Yep, that resolution is going to be the standard for "4K" UHD. Cinema cameras can now record in higher resolutions, but keep in mind that digital intermediates can be lower resolution (layering all the VFX is so intensive that many productions still process their intermediates in 2K HDR and then upres to 4K). The Japanese did set standards for 8K UHD broadcast... but it's far from settled as the final outcome. When it comes to cinema cameras, I think RED continues to set the standard: they keep offering higher resolutions and score highly for their RAW, high-DR video codecs.

When it comes to the best display, I've actually read that TVs are now better at HDR than computer monitors. I did invest in a Panasonic plasma HDTV early on, and it's still going strong; it's just that this year I finally upgraded my main TV to a larger OLED. When it comes to comparisons, I don't have any regrets: OLED easily reproduces high image quality for SD vs HD vs UHD, and it's also better with full 4K HDR content. But I've also got the disclaimer that the source makes a difference. A highly compressed SD video is going to look like crap: my LG 4K player does a good job of playing well-mastered DVDs, BDs, and 4K discs, but you can still see (and possibly hear) some differences with early masters that have artifacts. For most of my video playing, I stream material from an Apple TV 4K and play local content from an Intel NUC (which is great with lossless audio and HDR content).
 
Post #5,384 of 5,635

TheSonicTruth
500+ Head-Fier · Joined Dec 19, 2014 · Posts 953 · Likes 117
Replying to Davesrose (post #5,383, above):
After ten years of doing basic setups and full calibrations on consumer TVs, ranging from WWII-era CRT tube sets to the latest OLEDs, as well as desktop monitors, I've come to the conclusion that the picture menu settings matter far more than how many lines or dots of resolution a TV or monitor is capable of. Not to mention well-produced content in the first place.

That's right - I said it: settings/calibration makes the biggest impact on the enjoyment of your display (and is also good for your eyes and for extending the life of the display).

Most people don't even know their TV has a menu, let alone what things like Contrast, Brightness, and Color actually do. Typically, the sets are just left in Retail mode, or 'Vivid' or 'Dynamic', with as many as a dozen so-called enhancers engaged. This airport-runway-bright, over-colored cartoon clusterf-- is what the general public has come to associate with HD, UHD, 4K, 8K, etc. It's why the calibrated image 'looks wrong' to them.

Just taking the TV out of 'Dynamic' mode and turning off crap like 'Skin Enhancer', 'Motion Sensing', 'Noise Reduction' (one TV had 4 flavors of NR - I disabled them all), and 'Super-Duper Contrast' (I made that one up!) already made the picture look more natural. And that was before running alignment patterns for the basic settings (Contrast, Brightness, Sharpness, etc.). As many as a dozen enhancers sit under the Advanced Menu, where people are afraid to venture: all non-standard settings, all disabled.

And the rest of what I do I'll keep secret. :wink: :wink:
 
Post #5,385 of 5,635

old tech
500+ Head-Fier · Joined Jul 8, 2015 · Posts 637 · Likes 270 · Sydney, Australia
Replying to TheSonicTruth (post #5,384, above):
I agree that calibration of the display is important. In a previous life I calibrated TV sets on the side during my uni years. And you're right, most people were underwhelmed with the result but after watching the re-calibrated screen for a couple of weeks (and re-calibrating their brains) they would never go back to the previous settings.

Anyway, the Dell monitors are calibrated at the factory (they even come with a unique printout showing the results), so any additional tweaking would be a marginal improvement at best.
 