stv014
Headphoneus Supremus
- Joined: Jul 17, 2011
- Posts: 3,493
- Likes: 273
Quote:
This is a myth. Software volume control is not harmless. It's theoretically correct for an ideal DAC with perfect 24 bit resolution, but for a practical device, you have to work from 0dB and consider what the device's actual resolution is, because digital attenuation throws away the high order bits first, not the low order bits.
That is nonsense. If digital attenuation threw away the highest-order bits first, the result would sound extremely distorted, almost like white noise. Additionally, properly implemented attenuation dithers the final 24-bit integer result rather than simply truncating bits.
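To make this concrete, here is a minimal sketch of what a properly implemented digital attenuator does (a hypothetical illustration, not any particular player's code): the gain is applied in floating point, TPDF dither spanning about one LSB is added, and only then is the value rounded to a 24-bit integer. The low-order information is randomized into noise; no high-order bits are "thrown away".

```python
import random

def attenuate_to_24bit(samples, atten_db, dither=True):
    """Apply digital attenuation, then quantize to signed 24-bit integers.

    samples: floats in [-1.0, 1.0]; atten_db: attenuation in dB (positive).
    """
    gain = 10.0 ** (-atten_db / 20.0)
    full_scale = 2 ** 23  # signed 24-bit range is -2^23 .. 2^23 - 1
    out = []
    for s in samples:
        v = s * gain * full_scale
        if dither:
            # TPDF dither: difference of two uniform variates, +/- 1 LSB wide
            v += random.random() - random.random()
        q = int(round(v))
        out.append(max(-full_scale, min(full_scale - 1, q)))
    return out

# A full-scale sample attenuated by 20 dB lands near 0.1 * 2^23, with only
# the last bit or two randomized by the dither.
print(attenuate_to_24bit([1.0], 20.0)[0])
```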
Quote:
Originally Posted by MoonUnit
At full scale, the ODAC has 0.0029% THD+N, which means the noise and distortion floor is at -90.7dB. It's irrelevant that the dynamic range is higher than this, unless one believes that dynamic range is more important than distortion. When you digitally attenuate by 40dB (fairly common), you're producing a device whose noise and distortion floor is now at only -50.7dB. This is in the realm of low quality tube amp territory, and will be definitely audible.
You are assuming that the absolute level of distortion remains constant, but that is false. Check the dynamic range measurement graph: with a -60 dBFS input, the distortion products drop to not much higher than -130 dBFS. Additionally, if you digitally attenuate by 40 dB (a rather high amount, by the way), you also make the sound much quieter overall, so distortion/noise at the same THD+N ratio becomes harder to hear. If the absolute noise level is below the hearing threshold at full volume, then reducing the digital volume will not make it audible, even if the dynamic range is reduced, because at low volume the dynamic range of hearing is reduced as well.
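The arithmetic behind this is worth spelling out. Using the -90.7 dB THD+N figure quoted above, and assuming (purely for illustration) that the residual is a fixed absolute noise floor:

```python
import math

def ratio_to_db(r):
    return 20.0 * math.log10(r)

thdn_rel_db = ratio_to_db(0.0029 / 100.0)  # 0.0029% THD+N -> about -90.7 dB
signal_dbfs = -40.0                        # 40 dB of digital attenuation

# A fixed absolute noise floor stays where it was at full scale:
abs_floor_dbfs = 0.0 + thdn_rel_db

# Relative to the attenuated signal, the ratio does worsen to about -50.7 dB...
rel_to_signal_db = abs_floor_dbfs - signal_dbfs

# ...but the playback level also dropped by 40 dB, so the floor sits no closer
# to the hearing threshold than it did at full volume.
print(round(thdn_rel_db, 2), round(rel_to_signal_db, 2))
```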
In fact, many DACs show increased distortion at signal levels near 0 dBFS, or clip intersample peaks that would rise above 0 dBFS after the digital filter. A few dB of software attenuation can therefore even reduce distortion.
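The intersample-over part of this is easy to demonstrate with pure math (no audio library assumed). A sine at fs/4 with a 45-degree phase offset has every sample at about 0.707 of the true analog peak; normalize those samples to 0 dBFS and the reconstructed waveform peaks roughly 3 dB above full scale, which is exactly the headroom a few dB of digital attenuation provides:

```python
import math

fs = 48000.0
f = fs / 4.0
phase = math.pi / 4.0  # 45 degrees: the samples straddle the true peaks

# Sample the sine, then normalize so the largest *sample* is exactly 1.0 (0 dBFS)
samples = [math.sin(2.0 * math.pi * f * (i / fs) + phase) for i in range(64)]
norm = 1.0 / max(abs(s) for s in samples)
samples = [s * norm for s in samples]

# Under the same normalization, the continuous waveform's amplitude is `norm`,
# i.e. the reconstructed peak exceeds the largest sample:
over_db = 20.0 * math.log10(norm)
print(round(over_db, 2))  # about +3.01 dB above 0 dBFS
```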
Quote:
Originally Posted by MoonUnit
You can even see this on the graphs on NWAVGuy's site. I realize NWAVGuy makes his recommendation because he believes distortion below 0.1% (-60dB) is inaudible, but he has never given any support for that assertion and it is not consistent with the academic evidence from listener tests, by a wide margin. See, e.g., this peer-reviewed publication: http://www.aes.org/e-lib/browse.cfm?elib=2962 (0.003% is -90.4dB)
He does not claim that distortion below 0.1% is inaudible; the recommended maximum is actually 0.01%. It is also difficult to comment on your peer-reviewed publication without access to its contents, since it refers specifically to one type of distortion, high-frequency IMD, and may not even be relevant to the topic of digital volume control; knowing the details would be important. Many old publications have also been proven wrong later, after unexpected flaws in the methodology were discovered (e.g. distorting tweeters making ultrasound "audible"). Fishing in old papers, one could find something to support just about any agenda.
But if you give a link to some music of your choice, I can create a recording that goes through -20 dB digital attenuation and a D/A-A/D loop, and you can compare it to the original in a level-matched software ABX test with whatever gear you have, and see if you can tell the difference at a realistic loudness (no more than 90 dB peak SPL, since the sound is supposed to be attenuated, after all; full volume would be a deafening 110 dB).
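For reference, the digital side of preparing such a test is trivial; this is a hypothetical sketch only (the D/A-A/D pass happens on real hardware in between the two steps):

```python
def db_to_gain(db):
    return 10.0 ** (db / 20.0)

# 1) Attenuate the test copy by 20 dB before it goes through the DAC/ADC loop.
atten = db_to_gain(-20.0)
original = [0.5, -0.25, 1.0, -1.0]          # placeholder samples
processed = [s * atten for s in original]   # what the D/A-A/D loop would see

# 2) For the ABX comparison, gain-match the captured copy back up, so the only
#    remaining differences come from the attenuation and conversion, not loudness.
matched = [s / atten for s in processed]

# 3) Loudness sanity check: if full volume corresponds to 110 dB peak SPL,
#    the attenuated version plays at 110 - 20 = 90 dB peak SPL.
print(matched)
```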