nick_charles
Headphoneus Supremus
- Joined: Feb 26, 2008
- Posts: 3,180
- Likes: 336
Quote:
In the other subforums, having poor measured performance is often treated as a mark of quality. I see several members wetting themselves over an $1100 DAC with an SNR of 96 dB (bang on the money for 16-bit CD) that nonetheless claims 24-bit resolution. In this day and age I'd call that mediocre engineering, when my 1998 Entech can do better. Of course, anything that manages on-spec performance for CD is probably perfectly fine, but why pay extra for something that is technologically average at best when my anonymous $90 DAC/headphone amp from eBay can do better and still leave the builders a profit? But that is not the worst of it: in Stereophile, technically awful kit, often costing several tens of thousands of dollars, is treated with bewildering respect, frequently to the puzzlement of measurer-in-chief John Atkinson. Our old friends the cognitive biases are hard at work here; put it in a shiny box and you can sell a cold turd for a fortune to some folks...
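As a rough sanity check on those numbers: an ideal N-bit quantizer gives about 6.02N + 1.76 dB of SNR for a full-scale sine, so a 96 dB figure is 16-bit territory no matter what the marketing says. A throwaway Python sketch of the arithmetic (illustrative only):

```python
# Theoretical SNR of an ideal N-bit quantizer (full-scale sine, no dither):
# SNR ~= 6.02 * N + 1.76 dB. Rough, illustrative numbers only.
def ideal_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (14, 16, 24):
    print(f"{bits}-bit: ~{ideal_snr_db(bits):.1f} dB")
# 14-bit: ~86.0 dB, 16-bit: ~98.1 dB, 24-bit: ~146.2 dB
# A DAC measuring 96 dB is delivering roughly 16-bit resolution,
# whatever the "24-bit" label on the box claims.
```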
Years ago I read a peer-reviewed paper (no idea where now) suggesting that in normal listening we really cannot tell the difference between 16 and 14 bits. I remember how stunned I was hearing my first CD player, a modest Marantz CD63 (1984), which was itself only a 14-bit (4x oversampling) machine: the clarity and utter lack of perceived noise, even though it could only manage about 90 dB SNR. But of course I was a vinyl user back then, and 90 dB was science fiction to LP spinners...
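Anyone curious can try the 16-vs-14-bit comparison themselves. A minimal sketch of requantizing a track to a lower bit depth for a blind comparison (my own, assuming numpy and float samples in the range -1 to 1; not taken from the paper):

```python
import numpy as np

def requantize(x: np.ndarray, bits: int, dither: bool = True) -> np.ndarray:
    """Reduce a float signal in [-1, 1] to the given bit depth.
    TPDF dither keeps the quantization error noise-like."""
    step = 2.0 / (2 ** bits)                      # size of one LSB
    d = 0.0
    if dither:
        d = (np.random.uniform(-0.5, 0.5, x.shape) +
             np.random.uniform(-0.5, 0.5, x.shape)) * step
    return np.clip(np.round((x + d) / step) * step, -1.0, 1.0)

# e.g. y14 = requantize(y16, bits=14), then compare blind against the original
```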
Regardless, it seems utterly certain that non-blind tests are of almost zero value. So often (elsewhere) you have two chaps arguing about which CD player has the more extended top end when both have identical frequency responses. I've even seen folks say with a straight face that the one with the 3 dB roll-off had the more extended top end, even after seeing the graphs... that is some heavy-duty self-deception.
What needs to be done is to take a nice clean digital source and gradually degrade it (blindly and randomly) until the difference can be reliably detected. For instance, in a JAES paper back in 1979, JVC researchers added low-pass filters to music with very high frequency content and concluded that filters at 20 kHz and 18 kHz could not be reliably detected, but 16 kHz and 14 kHz filters could be, by most (not all) listeners. It's a test anyone can try, along the lines of the sketch below; I'm sure I'd fail at anything above 13 kHz now.
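Here is a minimal sketch of that kind of low-pass degradation test (my own, assuming scipy/numpy and a mono float track; the filter order and trial scheme are placeholders, not what JVC used):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def lowpass(x, sr, cutoff_hz, order=8):
    """Steep-ish low-pass filter; zero-phase so only the bandwidth changes."""
    sos = butter(order, cutoff_hz, btype="low", fs=sr, output="sos")
    return sosfiltfilt(sos, x)

def trial(x, sr, cutoff_hz, rng=np.random.default_rng()):
    """Return either the filtered or the unfiltered clip at random,
    plus the truth, so guesses can be scored over many trials."""
    filtered = bool(rng.random() < 0.5)
    return (lowpass(x, sr, cutoff_hz) if filtered else x), filtered

# Run, say, 20 trials each at 20 kHz, 18 kHz, 16 kHz and 14 kHz and
# count how often the listener beats chance.
```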
I think science is somewhat to blame for audiophoolery too. Numbers and specifications get thrown about without relating them to the most important context of all: the thresholds of human perception. If greater bit depth lowers the noise floor and higher sampling rates extend the high-frequency response, it doesn't matter if humans can't hear the difference.
There should be an audio equivalent of the USDA recommended daily allowance printed on the side of the cereal box... In parentheses after each spec should be the threshold of audibility, so folks can instantly see whether they're over the line into overkill.
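Something like this, as a toy sketch (the threshold notes are my own rough placeholders, not official figures):

```python
# Toy "nutrition label" for audio specs: the measured number next to a
# rough audibility note, so overkill is obvious at a glance.
SPECS = [
    ("SNR",                "96 dB",         "16-bit CD noise floor is ~96-98 dB"),
    ("Frequency response", "5 Hz - 40 kHz", "human hearing tops out near 20 kHz"),
    ("THD+N",              "0.002 %",       "far below commonly cited audibility"),
]
for name, value, note in SPECS:
    print(f"{name}: {value}  ({note})")
```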
