Testing Audiophile Claims and Myths : The Original Compilation

Discussion in 'Sound Science' started by upstateguy, May 30, 2015.
  1. upstateguy
    ●●  This information is from Prog Rock Man's original thread.  It was difficult to compile and should be readily available to read.   ●●
    The original thread is here:  http://www.head-fi.org/t/486598/testing-audiophile-claims-and-myths
    (As I find more blind tests I will add them to the list here.)

    So, we love to have a good discussion/argument/rant here (and on all the other audio forums I have seen) about the many claims audiophiles make that others dismiss as myths. The arguments go round in circles; I hear a difference - but there cannot be a difference, it is all in your mind - have you tried different cables? - I don’t need to it is all in your mind etc etc, we all know how it goes.

    Occasionally there are attempts to test such myths. WHF’s own Big Question is an example: three What Hifi forum members are invited to their listening rooms and blind tested on everything from cables to bit rates. From the issues I have read, the results contradict the idea that the differences are myths: the differences appear real. Different bit rates have been correctly identified, and different cables have produced different sounds in the same hifi kit. But these are blind listening reviews, which are different from ABX tests where people are asked to correctly identify products.

    Here is a list of blind listening and ABX tests that I have found on the internet. What I have done is summarise their conclusions.
    It is important to note the difference between blind and ABX testing, as they produce different results.
    In a blind test the listener does not know what they are listening to and is asked to describe any differences they can hear. This is a type of blind testing commonly used in audio. That kind of test often results in low priced hifi 'surprisingly' doing as well as high priced, as factors such as image and product reputation are hidden from the listener. Some blind testing also involves a competition between products, where say two amps are pitched against each other and the winner progresses to the next round. As you can see, I have been very broad in the definition of blind testing.
    ABX testing is more of a test. You listen to product A and product B, are then played X, which is either A or B, and have to say which it is. There can be more than A and B, as some tests involve multiple cables. Any differences then have to be clearly audible, which for the likes of cables has not been the case yet. I have also been broad in the definition of ABX testing.
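    The ABX procedure described above can be sketched in a few lines of code. This is a minimal illustration, not any particular test's protocol; the listener_guess callable is a hypothetical stand-in for a human response.

```python
import random

def run_abx_trials(n_trials, listener_guess, seed=None):
    """Run n_trials ABX trials. On each trial X is secretly A or B and the
    listener must say which one it was. Returns the number of correct calls."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        x = rng.choice("AB")        # the hidden sample
        if listener_guess() == x:   # the listener's verdict
            correct += 1
    return correct

# A listener who cannot actually hear a difference is effectively guessing,
# so over many trials the score hovers around 50%.
guesser = lambda: random.choice("AB")
score = run_abx_trials(1000, guesser, seed=42)
```

    Over a large number of trials a pure guesser lands near 50% correct, which is why the tests below treat scores close to 50% as "no audible difference".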

    The aim is to see what the overall result of these tests gives us and whether they provide evidence to back up or deny the reality of alleged audiophile myths. Before you read on here is a test you can try out yourself...

    ...and here is a very interesting article on a debate between audio sceptic Arny Krueger and Stereophile editor John Atkinson on ABX testing
    Stereophile The Great Debate
    Finally, for those who say blind testing is designed to produce fails and discredit audiophiles, here are some positive ones where differences have been identified

    1 - ABX Double Blind Comparator.

    This is a web site dedicated to such testing. Back in May of 1977 there was a comparison of amplifiers which found, over three tests of two amps each, that listeners could tell a difference in two, but not the third, which was an even split. It is important to note that not all of the ABX tests here are negative. Some do find differences can be identified. That shows that with some parts of the hifi chain there are real differences, but with others there are not.

    ABX Double Blind Comparator Data

    A test of interconnects and speaker cables found that no one could pick out the differences between a series of wires, from $2.50 blister pack wire to $990 speaker cable. All the results were even, with approximately 50% going for the cheap and expensive options.

    There is an interesting comparison of ‘video cables’ which found that once a cable went over 50 feet, it was easy to tell which was the 6 foot cable and which was the much longer one.

    DACs don’t fare well against CDPs, with an original CDP being distinguishable from a more modern one, but an expensive stand-alone DAC sounding the same as a CDP.

    None of the tests involve a large number of people, and some are of just one person.

    2 - Effects of Cable, Loudspeaker and amplifier interactions, an engineering paper from 1991.


    Twelve cables are tested, from Levinson to Kimber, including car jump leads and lamp cable, ranging from $2 to $419 per metre. The results are based on the theory that loudspeaker cable should transmit all frequencies unscathed to any speaker from any amplifier, and that loss is due to resistance. There is an assumption that letting through more frequencies with less distortion will sound better, which seems reasonable to me.

    The best performance was with multi core cables. The car jump leads did not do well and cable intended for digital transmission did! The most expensive cable does not get a mention in the conclusions, but the cheapest is praised for its performance and Kimber does well. Sadly there is not a definitive list of the cost of the cables and their performance, so it is not clear as to whether cost equals performance, but the suggestion is that construction equals performance.

    3 - Do all amplifiers sound the same? Original Stereo Review blind test.
    (The original Bruce Coppola link is broken, and I cannot find any existing link at this time)

    A number of amplifiers across various price points and types are tested. The listeners are self declared believers and sceptics as to whether audiophile claims are true or not.

    There were 13 sessions with different numbers of listeners each time. The difference between sceptic and believer performance was small, with 2 sceptics getting the highest correct scores and 1 believer getting the lowest. The overall average was 50.5% correct, which is what you would expect from random guessing. The cheapest Pioneer amp was perfectly capable of outperforming the more expensive amps and was ‘strikingly similar to the Levinson’.
    As an extra to this and for an explanation of how amps can all sound the same, here is a Wikipedia entry on Bob Carver and his blind test amp challenges

    4 - Cable directionality.

    Not the best link, as it only refers to a test without giving many specifics. The cable maker Belden conducted a test with an unnamed magazine which found the result was perfectly random.

    I liked the next sentence which was “Belden is still happy to manufacture and sell directional cables to enthusiasts”

    The Truth About Audio and Other Cables - AES PNW Section Meeting Report -

    5 - Head-Fi ABX Cable Taste Test Aug 2006.

    Three cables from Canare, Radio Shack and a silver one were put into the same sleeving to disguise them, a mark was put on each one so only the originator knew which was which, and they were then sent around to various forum members. The result was that only one forum member got all three correct. The cheap Radio Shack cable and the silver one were the most mixed up.

    Unfortunately I cannot see from the thread, which is huge, how many members took part or what the exact results were.

    6 - HiFi Wigwam, The Great Cable debate. Power cable ABX test Oct 2005.

    This is a very well done, large scale ABX test. A similar set-up to Head-Fi: four mains cables, including 2 kettle leads (stock power cords that had come with hifi products), an audiophile one and a DIY one, were sent out to forum members along with a tester CD. The results were inconclusive to say the least; for example:

    The kettle lead was C. There were 23 answers :
    4 said that the kettle lead was A
    6 said that it was B
    8 said that it was C
    5 said that they didn't know.

    The overall conclusion was that the kettle lead could not be properly identified, nor could any one cable be identified as better than another.
    EDIT - one of the participants in this test has pointed out that the two kettle leads, described in the test as exactly the same, were in fact not identical; they were just basic leads which had come with hifi products.

    7 - What Hifi The Big Question on cables. Sept 2009

    From the Sept 2009 issue. Three forum members were invited to WHF and blind tested: they thought the kit (Roksan, Cyrus, Spendor) was being changed, but instead the cables were. The same three tracks were used throughout.

    The kit started out with the cheapest cables WHF could find and no one liked it saying it sounded flat and dull. Then a Lindy mains conditioner and Copperline Alpha power cords were introduced and the sound improved.

    The IC was changed to some Atlas Equators and two out of the three tracks were said to have improved, with better bass and detail.

    Last, the 60p per metre speaker cable was changed for £6 per metre Chord Carnival SilverScreen. Again, changes were noticed, but they were not big.

    Various swaps took place after that which confirmed the above, that the power cords made the biggest difference. When the test was revealed the participants were surprised to say the least!
    But this is not an ABX test, it is a blind listening review, and as you read on you will find the two produce different results. What is worrying is that when I asked Clare Newsome, the Chief Editor, about such tests she claimed that they were ABX, and elsewhere on their forum they have claimed to do ABX testing. But they do not; they are blind listening reviews, which allow people the chance to claim a difference but offer no evidence that they can really hear one.

    8 - Secrets of Home Theatre and High Fidelity. Can We Hear Differences Between A/C Power Cords? An ABX Blind Test. December, 2004

    A comprehensive article with pictures. The overall result was 73 correct out of 149 trials, i.e. 49% accuracy, the same as chance.
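    For results like this, an exact binomial calculation shows how unsurprising the score is under pure guessing. A sketch, using the 73-out-of-149 figures from the article:

```python
from math import comb

def prob_at_least(k, n, p=0.5):
    """Exact probability of k or more successes in n independent trials with
    per-trial success probability p -- i.e. the chance of scoring at least
    this well by guessing alone."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# The power-cord test: 73 correct out of 149 trials.
p_value = prob_at_least(73, 149)   # roughly 0.6: entirely consistent with chance
```

    A score that a pure guesser would match or beat well over half the time is no evidence of audibility at all.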

    9 - Boston Audio Society, an ABX test of Ivor Tiefenbrun, the founder of Linn. August 1984

    A rather complex testing of Ivor Tiefenbrun himself, who at that time was very pro vinyl and anti digital (almost the opposite of how Linn operate now!). There are various different tests and the overall conclusion was:
    "In summary, then, no evidence was provided by Tiefenbrun during this series of tests that indicates ability to identify reliably:
    (a) the presence of an undriven transducer in the room, 
    (b) the presence of the Sony PCM-F1 digital processor in the audio chain, or 
    (c) the presence of the relay contacts of the A/B/X switchbox in the circuit."
    Even the founder of Linn could not back up claims he had been making when subjected to an ABX test of those claims.
    10 - The (In)famous Audioholics forum post, cables vs coathanger! June 2004
    11 - Matrixhifi.com from Spain. ABX test of two systems. June 2006.
    Two systems, one cheap (A) with a Sony DVD player and Behringer amp (supported on a folding chair) with cheapo cables, and the other more expensive (B) with Classe, YBA, Wadia, expensive cables and proper stands, were hidden behind a sheet and wired to the same speakers.
    The results were;
    38 persons participated on this test
    14 chose the "A" system as the best sounding one
    10 chose the "B" system as the best sounding one
    14 were not able to hear differences or didn't choose any as the best.
    12 - AVReview. Blind cable test. April 2008
    Some of AVR's forum members met at a Sevenoaks hifi shop and listened to the same kit with two cheap Maplins cables at £2 and £8 and a Chord Signature at £500. They found the cheaper Maplins cable easy to differentiate, and the more expensive one harder to differentiate from the Chord. Their resident sceptic agreed he could hear differences. The final conclusion was:
    ....from our sample of 20 near-individual tests, we got 14 correct answers. That works out at 70 per cent correct....
    So that is the second ABX to join What Hifi in suggesting there is indeed a difference. But like What Hifi it shows the difference in results between blind and ABX testing, and how easy it is to confuse the two types of test.
    http://www.avreview.co.uk/news/article/mps/uan/1863#ixzz0nGpGRfCB - note link broken, unable to find another
    13 - Journal of the Audio Engineering Society, ABX test of CD/SACD/DVD-A. Sept 2007
    You need to be a member of the AES to access the article here; (EDIT, the link has changed and I cannot find the actual test referred to)
    a summary of which states "A carefully controlled double-blind test with many experienced listeners showed no ability to hear any differences between formats".  The results were that 60 listeners over 554 trials couldn’t hear any differences between CD, SACD, and 96/24.
    EDIT - this test is apparently flawed, see post 962 for full details, but basically the hi-rez example used was sourced from an original CD.
    14 - What Hifi, Blind Test of HDMI cables, July 2010
    Another What Hifi test of three forum members who are unaware that the change being made is with three HDMI cables. As far as they know equipment could be being changed. The cables are a freebie, a Chord costing £75 and a QED costing £150. Throughout the test all three struggle to find any difference, but are more confident that there is a difference in the sound rather than the picture. They preferred the freebie cable over the Chord one and found it to be as good as the most expensive QED. That result is common in blind testing and really differentiates it from ABX tests.
    In my opinion, the way the differences between the cables are reported can be explained by the fact that it would have taken three brave testers to say there was no difference. They had been invited to a test expecting to be able to identify differences.
    15 - Floyd Toole from Harman International (AKG, Infinity, JBL) Audio, Science in the service of art 1998
    A paper written by Floyd Toole which covers a number of topics about scientific measurements and audio. Go to pages 10 and 11 and there is a paragraph on blind testing. It shows how the 'differences' between speakers were greater in sighted tests than in blind tests. The obvious conclusion is that sighted tests allow factors other than sound to come into play when deciding what sounds better.
    16 - Sean Olive, Director of Acoustic Research Harman Int, blog on The Dishonesty of Sighted Listening Tests 2009
    Research using 40 Harman employees and comparing the results of blind vs sighted tests of four loudspeakers. As with the above by fellow Harman director, sighted tests show bias that blind do not.
    Below the article are various responses to the blog, including a very interesting exchange between Alan Sircom, editor of Hifi Plus magazine, and Sean Olive. Alan Sircom makes the very interesting point that volume has a role to play in blind tests:
    "Here's an interesting test to explain what I mean: run a blind test a group of products under level-matched conditions. Then run the same test (still blind), allowing the users to set the volume to their own personal taste for each loudspeaker under test. From my (admittedly dated and anecdotal) testing on this, the level-matched group will go for the one with the flattest frequency response, as will those who turn the volume 'down', but those who turn the dial the other way often choose the loudspeaker with the biggest peak at around 1kHz, saying how 'dynamic' it sounds."
    I had not thought of that before. You will end up with different conclusions between a blind test where the volume is set and where the volume can be adjusted. Adjustment allows preferences for different sounds to be expressed, without other influences being present that clearly have nothing to do with sound.
    17. Russ Andrews re-cable David Gilmour's recording studio (not a blind test) 2000-2001
    This is not a blind test, but I think it is worth including here. The studio used (and I think owned) by David Gilmour was re-cabled using Kimber cables by Russ Andrews. This was apparently after extensive AB testing. I would have loved that to be after extensive ABX testing!
    (Thanks to Pio2001 for finding the below tests and links)
    18. DIY Audio forum, confessions of a poster. 2003
    A forum member joined and confessed that " Then I started to hear about some convincing blind tests and finally conducted my own. I was stunned at the results. I couldn't tell a $300 amp from a $3000 in the store I was working at. Neither could anyone else who worked there." Then he ran his own blind test on a mate, comparing an Onkyo SR500 Dolby Digital receiver with a Bryston 4B 300 wpc power amp and Bryston 2 channel pre-amp owned by his mate. The 'red faced' mate could not tell the difference.
    19. The Boston Audio Society, discussion of two blind tests and their analysis 1990
    The BAS, in an article discussing a CD tweak blind test by Stereophile: " In the CD-tweak test Atkinson and Hammond conducted a 3222-trial single-blind listening experiment to determine whether CD tweaks (green ink, Armor-All, expensive transports) altered the sound of compact-disc playback. Subjects overall were able to identify tweaked vs untweaked CDs only 48.3% of the time, and the proportion that scored highly (five, six, or seven out of seven trials--Stereophile's definition of a keen-eared listener) was well within the range to be expected if subjects had been merely guessing."
    Then the BAS are very critical of a Hifi News analysis of a blind test of amps from 2006; " Listeners scored 63.3% correct during those trials where the amplifiers were different (95 of the 150 A-BB-A trials). However, subjects scored correctly only 65% of the time when the amplifiers were the same (26 of 40 A-A/B-B trials.) Another way of saying this is that subjects reported a difference 35% of the time (14/40 trials) when there could have been no difference."
    20. Cowan Audio, an Australian audiophile and a blind test between CD players 1997
    An $1800 unnamed player (they were reluctant to name it) versus a $300 Sony, which resulted in both listeners only guessing and getting about 50%. William Cowan stated that a sighted test beforehand made them say "This will be easy, lets get on with the blind test". Ooops!
    21. Pio2001's own ABX test between CD and vinyl in Hydrogenaudio 2003
    The results were 3/7 and 5/8 correct.
    22. Tom Nousaine, article to Tweak or not to tweak? 1988.
    A test of identical CDPs and speakers but different amps and cables, one set costing $300 and the other $5000. The result, with 7 listeners of varying interest in hifi and 10 trials, was a fail.
    23. AV Science Forum, Monster vs Opus cables. 2002
    Not particularly rigorous, in that there were not enough trials, but as the poster states "And to cut to the chase, Mike could not identify the Monster from the Opus MM with any accuracy (nor the reverse, which also would have been a positive result if he had been consistently wrong) using our testing methodology. We stopped the test a little less than halfway through, I think we got through 8 A/Bs before we gave up."
    24. Stereo.net, blind testing of two pre-amps April 2008
    It's an Australian forum, so the conclusion is typically forthright: "CONCLUSION:There is bugger all between the 2 preamps, they were so close that any difference could not be reliably picked." The test was run well despite the doubts the tester had.
    25. Stereomojo Digital amp shootout 2007
    Various amps were tested blind in pairs, where the preferred amp went through to the next round. The winner was one of the cheaper amps, the Trends Audio TA-10 at $130, which is the tiny one on the top right of the pile.
    26. Head-Fi ABX Cable Test by member Edwood Aug 2006
    Three ICs made with Canare, solid silver and Rat Shack cables, but dressed to look the same. Only one person could tell the difference, which is what you would expect when there is no audible difference and people are most likely guessing.
    27. Les Numeriques. A blind test of HDMI cables by a French site (Google Translator used)
    Nine participants using no name, Belkin and Monster HDMI cables. Only one claimed to have a preference, but his feedback was inconsistent.
    28. Home Cinema Fr .Com, a French test of interconnects (Google Translator used) May 2005
    The cables included ones from Taralabs, VDH, Audioquest and DIY ones. The result was that no one could reliably tell a difference.
    29. Sound & Vision. Article by Tom Nousaine with 3 Blind Tests of speaker cables. c1995
    All three are fails by the listeners using their own hifi systems and with their choice of track, volume and time.
    30. Insane About Sound, Blind Tests of CD vs Audio Files and expensive vs cheap speaker cable. Wall Street Journal Jan 2008
    Tests set up at an audio show in Las Vegas found WAV files (52%) doing better than MP3 (33%) when compared with CD, and in a test of $2000 Sigma speaker cables vs hardware store cable, 61% of the 39 who took the test preferred the more expensive cable. So nothing conclusive for any of the tests, but interestingly John Atkinson and Michael Fremer from Stereophile magazine were described as easily picking out the more expensive cable.
    31. AV Science forum, Observations of a controlled cable test Nov 2007
    A blind test between Monster cables and Opus MM, which as far as I can find is $33,000 worth of cable, but the owner of the very high end kit and cables was unable to tell the difference.
    32. The Audio Critic, ABX test of amps Spring 1997
    A letter by Tom Nousaine to The Audio Critic in which he describes an ABX test of the owner of a very high end system, where a Pass Labs Aleph 1.2 200w mono block amp was randomly changed with a Yamaha AX-700 100w integrated amp. In the first test the owner got 3 out of 10 identified, then 5 out of 10. His wife then got 9 out of 16 and a friend 4 out of 10 correctly identified.
    The letter is split between pages 6 and 7 of the link.
    33. Expert Reviews. Blind test of HDMI cables. Expert reviews 8 Feb 2011
    Two TVs, two Sony PS3s and a James Bond film played side by side, with the only variable being the HDMI cables. What is interesting is that there was little perceived difference in the picture, but much more perceived difference in the sound. And many preferred the sound of the cheap cables to the expensive ones.
    Note - not an ABX test and the reviewer acknowledges there could also be slight differences in the TVs and PS3s to contend with.
    34. Blind test of six DACs, Stereomojo
    Like the other blind (as opposed to ABX) tests, this one found the cheapest and most expensive DACs in the final, with only a hair's breadth between the two in terms of sound.
    35. The Wilson ipod experiment CES 2004. Stereophile Jan 2004
    Tenth paragraph down. A 'trick' blind test where a group at a consumer technology tradeshow thought they were listening to a $20,000 CDP, but were actually, happily, listening to an iPod playing uncompressed WAV files.
    Sight really does have a major role to play in sound!
    36. An evening spent comparing Nordost ICs and speaker cables. AVForums June 2006
    Further to the above ipod experiment, a report from a member of the AVForums and his experience of sighted and blind listening tests at a dealers.
    The conclusion comparing the tests
    "And here's what I heard.

    1. All the cables sounded subtly different with one exception.
    2. Differences were less apparent with some music than others
    3. My assessment and experiences "blind" were different to my experiences "sighted""
    37. A blind test of old and new violins. Westerlunds Violinverkstand AB March 2006
    This is really a bit of fun, but it again shows how we hear differently sighted to blind. In this test 6 violins, three from c1700 (including a Stradivari) and three modern, were played to a group of string teachers, who cast votes 1 to 3 for their preferred violins. The stage was kept dark so they could not see which was which. The Stradivari came last; a modern brand won.
    38. The Edge of audibility, blind test of recordings made with and without a mains filter. Pink Fish Media forum June 2011
    You can download and try the recordings yourself. Of those who have already, 2 preferred one, 6 the other and 10 had no preference.
    39. Try a blind test of bit rates. mp3ornot.com
    A really well set out and easy to use blind test of different bit rates.
    40. Blind test of CD transports Stereo.net.au Oct 2008
    Well set up and described, but to reinforce the Australian stereotype, after one set of failed tests they admitted no one could hear a difference, gave up and drank some beers instead! (New link via a NZ forum)
    41. ABX test of tracks with various levels of jitter added. HDD Audio forum March/April 2009
    One member, MM, recorded his scores and they are no better than random.
    42. Stereophile ABX test of power amps July 1997
    There were 505 listeners, producing a nicely made graph of results: a bell curve centred on random, just as you would get from guessing. Yet Stereophile claim the test was a success because some people did better than average. There could be some truth in that, as there have been blind test passes for amps. Even so, it is a very small fraction of those tested, and they really need to be tested again to confirm whether or not they were just lucky. The result is not statistically significant enough to say there is an audible difference.
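    The "some scored better than average" point is exactly what you would expect from a large group of guessers. A quick simulation sketches this; the 10 trials per listener is an assumed figure for illustration, since the per-listener trial count isn't given here:

```python
import random

def best_guesser_score(n_listeners, n_trials, seed=0):
    """Simulate pure 50/50 guessers and return the best individual score.
    With enough listeners, a few apparent 'golden ears' emerge from chance
    alone."""
    rng = random.Random(seed)
    return max(
        sum(rng.random() < 0.5 for _ in range(n_trials))
        for _ in range(n_listeners)
    )

# Hypothetical numbers: 505 listeners, 10 trials each.
top = best_guesser_score(505, 10)
```

    In a group this size the best scorers routinely hit 9 or 10 out of 10 by luck, which is why a high individual score in a one-off test needs retesting before it counts as evidence.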
    43. Head-fi. A forum member testing cables sighted and blind Nov 2011
    This provides yet more evidence that sighted and blind testing produces consistently different results whereby people can hear a difference when sighted and cannot when listening blind.
    44. Audio Society of Minnesota. Speaker cable listening test. April 2012
    The results are very mixed, with no cable making any clear difference. They accept there is no objective difference, but since a difference was found which can easily be explained by random selection, they conclude a subjective difference is there and so allegedly "cable do make a difference".
    45. The Richard Clark Amplifier Challenge - Reported by Tom Morrow June 2006
    "The Richard Clark Amp Challenge is a listening test intended to show that as long as a modern audio amplifier is operated within its linear range (below clipping), the differences between amps are inaudible to the human ear."
    It is an ABX test which, to pass, requires two sets of 12 correct identifications. Reputedly over a thousand people have taken the test and none have passed.

    "Do the results indicate I should buy the cheapest amp?

    No. You should buy the best amplifier for your purpose. Some of the factors to consider are: reliability, build quality, cooling performance, flexibility, quality of mechanical connections, reputation of manufacturer, special features, size, weight, aesthetics, and cost. Buying the cheapest amplifier will likely get you an unreliable amplifier that is difficult to use and might not have the needed features. The only factor that this test indicates you can ignore is sound quality below clipping."
    Which is a relief for those who have shelled out a lot on a nice amp. 
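    The pass criterion for the Clark challenge makes it easy to work out how often pure guessing would succeed: two sets of 12 correct calls means 24 correct in a row, assuming each trial is an independent 50/50 guess for someone who hears no difference.

```python
# Probability of passing the challenge by pure guessing: 24 consecutive
# correct 50/50 calls.
p_pass = 0.5 ** 24                  # 1 in 16,777,216

# Among roughly a thousand entrants, the expected number of lucky passes
# is essentially zero.
expected_passers = 1000 * p_pass
```

    So even with a thousand entrants, zero passes is exactly the expected outcome if amps below clipping really are indistinguishable.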
    46. Audio Video Revolution Forum, thread on blind speaker tests, Nov 2007.
    Positive results which strongly suggest speakers are clearly different even under blind testing conditions, both objectively and subjectively.
    47. PSB Speakers, blind comparison test of four speakers, Nov 2005.
    The writer is happy he did not pick the cheapo speaker, but he makes no mention of whether the speakers were easily identifiable as different.

    The clear conclusion is that ABX testing does not back up many audiophile claims, so they become audiophile myths: cables do not inherently change sound. Any change in sound quality comes from the listener's mind and the interaction between their senses. What is claimed to be audible is not reliably so. Blind testing is also sometimes passed off as ABX. But blind testing is not really testing; it is a review of a product without seeing it, and that allows claims to be made about sound which have not been verified.
    If hifi is all about sound, and more specifically sound quality, then once the other senses have been removed we should be able to hear differences which can be verified by identifying one product from another by listening alone. But time and again we cannot.
    So you can either buy good but inexpensive hifi products such as cables, amps and CDPs and be satisfied that the sound they produce is superb. You do need to spend time choosing speakers, as they really do sound identifiably different. Or you can buy expensive hifi products, luxuriate in the build and image, and identify one hifi from another by looks as well as sound. But you cannot buy expensive and identify it from cheap by sound alone.
    Here is The Institute of Engineering and Technology's conclusions on audiophile myths
    which backs up the above conclusions.
  2. Steve Eddy
    Can you edit the subject? Should be Testing Audiophile Claims and Myths.

  3. upstateguy

  4. Steve Eddy

    And you always have. :D

    EDIT: I have to admit though, I like anetode's version better. :p

  5. jcx
    it would help to go into more detail about actual psychoacoustic controlled listening requirements, rank the "tests" on whether they used all (or any?) required controls
    my understanding is that the Meyer-Moran study did on reexamination use some source labeled as "hi rez" that spectrum analysis showed wasn't - but not all of the hi rez sources used were problematic - so a portion of the subjects did compare "true high rez" sources to the 16/44 bottleneck
  6. arnyk
    On careful examination about half or more of all SACDs and DVD-As that were available on the market at the time of the Meyer and Moran tests were sourced from low resolution sources.
    This obviously had a bad effect on their test but the breadth of the problem was not discovered until later. There were already extant papers detailing the problem, but they were not well-known and didn't attempt to cover the market.
    In a way the problem of low resolution sources is eternal, or nearly so, because making a recording whose dynamic range taxes the CD format is very difficult or impossible, and of course it is rarely even approached.
    The problem of detecting any negative effect due to the bandpass limits of CDs is a separate problem based on how the ears work, but it is just as tough.
  7. prot
    Thank you
  8. upstateguy

  9. asdfg
    Damn. That's one thorough analysis.
  10. asdfg
    But quite a good one actually.
  11. old tech

    I remember that, it was supposed to be a "gotcha" moment. That issue was later picked up and, as you say, still half of the material was hi res.

    So, far from being flawed, it further reinforced that there is no difference, as no one picked out the samples that were hi res. The authors later invited subjects, including production engineers, to bring in their own hi res material. The results were still the same.

    So overall, it is hardly a flawed study or a flawed result. Indeed, the results still haven't been refuted to this day by anyone demonstrating that they can hear a difference better than chance in a controlled environment.
  12. castleofargh Contributor
    it's very much science in action, you test something, get a result for that test. later you find out you missed a parameter, instead of behaving like a spoiled kid, you factor it and see how it alters the conclusions of the test. because the aim is knowledge, not money, not winning an argument, not feeling strong. just knowing a little more.
    on one side you have that state of mind, always ready to adapt to reality. on the other hand you have a bunch of empty claims with zero evidence(not one!) from sighted evaluations who decide they are better than testing methods, or sometimes even better than measurements.
    hard to guess who's closer to the truth.
    sighted evaluation guy: "I'm so good I can spot the ball every single time"
    it's obvious that kind of test makes no sense and can never prove a thing, but the audiophiles who say they can tell highres from cd sound will go a step further in the quest for meaningless conclusions, they will move the transparent glasses themselves when they pretend to listen for differences.
    and the cherry on the cake, when we don't take them seriously, they get offended.
  13. upstateguy
