TheVinylRipper (500+ Head-Fier · Joined Jul 12, 2007 · Posts: 796 · Likes: 10)
Good morning, everyone.
This may well be my last post/thread here. I strongly suspect that my views have upset more than a few of you, and that is not my intention at all.
I've been doing a little math and have come to some conclusions that I would like to share.
The majority of high-quality amplifiers these days have a distortion level of 0.01 percent or less.
The formula for calculating decibel gain or loss for a voltage or current ratio is:
Gain in dB = 20 × log10(gain ratio)
http://www.allaboutcircuits.com/vol_3/chpt_1/5.html
0.01% = 1/10,000, or a ratio of 10,000:1
log10(10,000) = 4
20 × 4 = 80
So the distortion of a modern high-quality amplifier sits approximately 80 dB below its output.
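For anyone who wants to check that arithmetic, here is a minimal Python sketch of the percent-to-dB conversion (the function name thd_percent_to_db is just mine for illustration):

```python
import math

def thd_percent_to_db(thd_percent):
    """Convert a distortion figure given in percent to dB relative to the output."""
    ratio = thd_percent / 100.0        # 0.01% -> 0.0001
    return 20 * math.log10(ratio)      # 20*log10 for voltage-style ratios

print(thd_percent_to_db(0.01))         # -80.0 dB
```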
On this thread, the most golden-eared audiophile who took the test could hear distortion down to approximately -40 dB.
The formula for calculating the decimal ratio from dB is:
decimal ratio = 10^(dB/20)
Which gives us:
10^(40/20) = 10^2 = 100
Therefore the most golden-eared audiophile who took the test could only hear distortion at a level 100 times greater than that of the high-quality amps we have today.
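The reverse conversion can be checked the same way; again this is just a quick sketch, and db_to_ratio is my own helper name:

```python
def db_to_ratio(db):
    """Convert a level in dB back to a plain voltage-style ratio."""
    return 10 ** (db / 20.0)

print(db_to_ratio(-40))   # 0.01   -> 1/100, the audibility threshold in the test
print(db_to_ratio(-80))   # 0.0001 -> 1/10,000, a modern amp's distortion floor
# The threshold sits a factor of 100 above the amp's distortion.
```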
Even the best transducers (headphones, speakers, microphones) today have distortion far greater than 0.01%; from my own experience with phonograph cartridges, I would guess it is closer to 1.0%.
Strangely enough, 1.0% is 1/100, or -40 dB: the same level down to which the most golden-eared audiophiles could hear distortion in the test I linked above.
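A one-line check of that coincidence, again purely as an illustrative sketch:

```python
import math
print(20 * math.log10(1.0 / 100))   # 1.0% -> -40.0 dB, matching the test threshold
```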
My conclusion, therefore, is that whatever differences exist between today's high-quality amplifiers are completely hidden by the distortion of the transducers through which we listen to them.
I welcome discussion on this subject.
As I said at the beginning, my intention is not to upset anyone; it is rather to introduce some rationality into what I see as a somewhat irrational fixation on ever-better specs for amplification and/or DACs.