Dithering when reducing bit depth isn't always such a good idea, especially when dealing with 'natural' sounds. Often it only amounts to adding noise to noise.
Dithering when reducing bit depth is always a good idea, regardless of what type of sounds the recording contains! Quantisation error caused by truncation is correlated with the signal and will always be of significantly higher amplitude than dither, even noise-shaped dither. There are three caveats to this statement:
1. Noise-shaped dither should only be applied once, as the final step of the mastering process. Successive applications of noise-shaped dither will sum the dither noise, which is already concentrated into narrower frequency bands to start with. Noise-shaped dither is not a tool the consumer should be playing with, unless they are making their own recordings which they then bit reduce.
2. When dithering to high bit depths, say from a 64bit mix environment to a 24bit file, dither is commonly not applied, as even the higher amplitude of truncation error is insignificant. Dither can theoretically benefit some workflows which require multiple bit reductions to 24bit, in which case the dither applied should be TPDF rather than noise-shaped dither, following the rule of only one application of noise-shaped dither.
3. While measurements can show the effects of truncation error to be more severe than the effects of dither, I am not aware of any studies which have demonstrated that even truncation error is audible. In the famous Boston Audio Society study, SACDs were ABX'ed against 16/44.1 equivalents derived from the SACD (i.e. the same master). The 16/44.1 versions were created by converting the SACDs to 24/96 and then simply truncating (no dither applied) to 16/44.1, and still no one could identify any difference.
A caveat to these caveats: Of course we can artificially manufacture scenarios to magnify all these (and pretty much any other) inaudible digital artefacts to make them audible!
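The central claim above, that truncation error is correlated with the signal while dithered quantisation error is not, can be illustrated with a minimal sketch. This is a toy demonstration (hypothetical signal, step size and frequencies, not from any real mastering workflow): quantise a very low-level sine to a coarse grid by plain truncation and by TPDF-dithered rounding, then measure how strongly each error signal correlates with the input.

```python
# Toy demonstration: truncation error is correlated with the signal,
# TPDF-dithered quantisation error is not. All parameters are arbitrary
# illustrative choices, not taken from any real workflow.
import math
import random

random.seed(0)
N = 48000
step = 1.0 / 256  # quantisation step of the coarse target grid

# A sine whose amplitude is only half an LSB of the target grid,
# i.e. the worst case for truncation distortion.
signal = [0.5 * step * math.sin(2 * math.pi * 440 * n / 48000)
          for n in range(N)]

def truncate(x):
    # Quantise by simple truncation to the grid (no dither).
    return math.floor(x / step) * step

def tpdf_quantise(x):
    # TPDF dither: sum of two uniform random values, +/-1 LSB peak,
    # added before quantising.
    d = (random.random() - random.random()) * step
    return math.floor((x + d) / step) * step

def corr(a, b):
    # Pearson correlation coefficient of two equal-length sequences.
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a)
                    * sum((y - mb) ** 2 for y in b))
    return num / den

err_trunc = [truncate(x) - x for x in signal]
err_dith = [tpdf_quantise(x) - x for x in signal]

c_trunc = corr(signal, err_trunc)
c_dith = corr(signal, err_dith)
print(c_trunc)  # clearly nonzero: the error tracks the signal
print(c_dith)   # near zero: the error behaves as benign, signal-independent noise
```

The dithered error has somewhat higher total power, which matches the point above: dither trades a small, uncorrelated noise floor for the removal of signal-correlated distortion.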
Originally Posted by Sal1950
I was somewhat offended that we were referred to as "nuts".
I do see a certain justification to it. If running the file through a spectrograph is the only way you can tell whether or not you're getting what you paid for, then why did you buy it? Isn't it "nuts" to pay a premium for a product with claimed additional sonic quality if you can't actually hear any of that additional sonic quality? Or, are you saying that paying the premium has nothing to do with what you can hear but is worth it purely because of some additional visual quality (i.e. what the waveform looks like in a spectrograph), and if so, don't you think that too is a little "nuts"?
I've tried to explain that this "huge controversy" is itself "nuts" and indeed, what some consumers seem to want is also "nuts". Some consumers seem to want the "original" recordings and they want them at 24/96 or 24/192. This is "nuts" because it's not possible to record 24bits at any sample rate; about 14bits is the absolute maximum in practice. Of course we can write our 14bits to a 24bit file format, or to say a 1,024bit file format (if someone invents such a format), but you're still only going to get a maximum of 14bits no matter what file format you write it to! The situation isn't much better with sample rates. The initial "resolution" of a commercial recording is probably somewhere around 4bit/15MHz, but we can't even mix/master in that format, let alone distribute it; it has to be converted (decimated)!

Maybe consumers don't want the original recordings, maybe they want the original masters? Sorry, but you can't have those either; they only exist virtually, in a mix/mastering environment, and to turn them into actual audio files we have to lose at least half the bits. Plus, processing commonly up-samples and down-samples, and a lot of the time we the engineers don't even know what sample rate/s are occurring within our mixes/masters. Also notice that so far I haven't even mentioned what is or isn't audible!
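The bit-depth arithmetic behind this can be sanity-checked with the textbook rule of thumb for an ideal quantiser: SNR ≈ 6.02·N + 1.76 dB for a full-scale sine. The figures below are pure theory, not measurements of any particular converter or recording chain:

```python
# Textbook rule of thumb: theoretical SNR of an ideal N-bit quantiser
# with a full-scale sine input is roughly 6.02*N + 1.76 dB.
# These are theoretical values, not measurements of real hardware.
def snr_db(bits):
    return 6.02 * bits + 1.76

# The inverse: how many bits a given analogue noise floor is "worth".
def equivalent_bits(snr):
    return (snr - 1.76) / 6.02

print(snr_db(14))  # ~86 dB: roughly the practical ceiling mentioned above
print(snr_db(24))  # ~146 dB: far beyond any real-world analogue noise floor
```

In other words, writing a recording with ~86 dB of real dynamic range into a 24bit (~146 dB) container adds headroom in the file format, not resolution in the recording.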
The whole thing is a nonsense to start with and now it seems we've got a "huge controversy" about how to define that nonsense?!
G