Quote:
You guys ever run Hertz Mille speakers, active not passive, off a 32-band calibration with appropriate time delay, done personally by the Midwest Hertz/Audison rep? He used a very expensive Audison Bit Tune auto-calibration system through an Audison Bit One 24-bit processor, also supporting a 10" dual voice coil sealed Hertz/Audison subwoofer with 900 watts pumped through it.
Probably not, huh? I have for 7 years now. You have no idea what you are missing. It's one of the best sound spaces I have been in myself, and hundreds of people who know good sound have sat in it for hours and agree. Don't speak of what you don't know, please. Talk of your own experience or call it your own theory.
If you're going to play that card then OK, let's play:
No, I've never heard the specific system/space you're talking about. On the other hand, have you ever worked in or even heard a commercial dubbing theatre? 900 watts is pathetic for a sub; the last system I worked with had over 20,000 watts of sub. That system was not calibrated by some sales rep but by Dolby techs and real experts, and your idea of expensive is another pathetic joke: was it a $20,000,000+ purpose-built film audio facility? Probably not, huh? I have for about 20 years now, you have no idea what you're missing, so please don't speak of what you don't know. Talk of your own lack of experience and call it your own theory!!!
OK, now that we've got that nonsense out of the way, let's deal with some of your so-called "facts": Dolby Digital is NOT 500kbps - 3mbps! Its maximum is 640kbps, and even that is rarely used; for HDTV, DVD and BRD, 448kbps is the DD standard, i.e. it's highly compressed! Dolby Digital is NOT 16bit and, furthermore, it's a 5.1 format whereas TrueHD is 7.1, so you are comparing apples and oranges, as stated by others. Your statements about localisation are also nonsense; yes, 7.1 is better than 5.1 for localisation, but it still has the same basic issues, which is why it was superseded by formats such as Dolby Atmos. But none of this has anything to do with 16 vs 24bit. Is 24bit better than lossy compression? Of course, but again, that's nothing to do with 16 vs 24bit!
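Since these dB figures get waved around loosely, here's a quick sanity check anyone can run themselves; a minimal Python sketch (the function name is mine) of the theoretical worst-case quantisation error for each bit depth, relative to digital full scale:

```python
import math

def peak_quantization_error_dbfs(bits: int) -> float:
    # With rounding, the worst-case error is half of one LSB,
    # i.e. 2**-bits relative to a +/-1.0 full-scale signal.
    return 20 * math.log10(2 ** -bits)

for bits in (16, 24):
    print(f"{bits}-bit: {peak_quantization_error_dbfs(bits):.1f} dBFS")
# 16-bit comes out at roughly -96 dBFS, 24-bit at roughly -144 dBFS,
# which is where the familiar "96dB vs 144dB" dynamic range figures come from.
```

The point being: both floors are a property of bit depth alone and have nothing to do with whether the delivery codec is lossy.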
Your talk of "converting" people is very troubling. Converting them to what? Converting them from ignorance to incorrect/false information is doing them a serious disservice, and as a professional in the field, I'd appreciate it if you'd STOP your phoney "conversions"! If you want to learn how it really works, then ask; we're happy to help. But don't make up factual statements you can't back up, which conflict with the science and with how it really works, or about your experience of a high-end, so-called expensive system which is actually a very cheap, low-end system!
Quote:
The proper comparison would be to take the True HD version, truncate/dither/shape to 16 bits, then pad back to 24-bits and recode as True HD. In any case, the difference between purely truncated 16-bit vs. 24 are truncation errors that *peak* at -96dB, which means if you set your max peaks to 120dBSPL you are claiming to hear stuff that is at most 24dBSPL. Noise-shaped dither makes the perceived difference even quieter. So how quiet is your listening room?
We have to be careful here; film sound and music are, in effect, two very different things, with very different workflows and distribution chains. We don't apply noise-shaped dither in film/TV products, because considerable amounts of additional processing are required downstream, after the print-master is completed. For this reason distribution is always 24bit or a proprietary lossy compressed format, to avoid any build-up of dither, truncation or noise-shaping artefacts. We therefore can't use the same comparison logic as we can with music, because there is no 16bit consumer content out there in the film world, let alone a dominant 16bit format in which the application of noise-shaped dither has been standard practice for commercial release for nigh on 20 years.
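To illustrate the build-up point, here's a toy Python sketch (pure standard library; the 997Hz test tone, 48kHz rate and stage counts are arbitrary choices of mine, not anyone's production workflow). Each "stage" simulates one 16-bit bounce with TPDF dither; the measured error relative to the original signal grows with every pass, which is exactly why you don't want repeated requantisation mid-chain:

```python
import math
import random

BITS = 16
STEP = 2.0 ** (1 - BITS)  # one LSB for a +/-1.0 full-scale signal

def truncate(x: float) -> float:
    return math.floor(x / STEP) * STEP

def tpdf_dither(x: float) -> float:
    # Classic TPDF dither: difference of two uniform RNGs, +/-1 LSB peak.
    d = (random.random() - random.random()) * STEP
    return truncate(x + d)

random.seed(0)
# A 997Hz sine at -6dBFS, one second at 48kHz.
signal = [0.5 * math.sin(2 * math.pi * 997 * n / 48000) for n in range(48000)]

def rms_error_after_stages(quantize, stages: int) -> float:
    out = list(signal)
    for _ in range(stages):
        out = [quantize(x) for x in out]  # one 16-bit bounce per stage
    err = [a - b for a, b in zip(out, signal)]
    return 20 * math.log10(math.sqrt(sum(e * e for e in err) / len(err)))

for stages in (1, 5, 10):
    # The RMS error (in dBFS) rises with the number of bounces.
    print(stages, round(rms_error_after_stages(tpdf_dither, stages), 1))
```

A single dithered bounce sits near the theoretical 16-bit floor; stack ten of them and the accumulated dither and truncation artefacts are clearly higher, which is the whole argument for staying at 24bit until the final deliverable.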
G