Quote:
I'm not sure there's any logic in a "burn-in doesn't exist" argument.
The point made by every reasonable critic I've read is not that nothing could theoretically change over time, but that any such changes would be too small, within the tolerances of a headphone's driver, to ever become audible.
Add to that the fact that not one single person has been able to detect these differences when they didn't know which pair they were listening to, and the argument that "burn-in" is "real" becomes very weak.
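To put some numbers on why blind trials matter, here is a minimal sketch (my own illustration, not from anyone in this thread) of how a standard ABX result would be scored. The question is simply: how likely is it that a listener gets this many trials right by pure guessing? The function and trial counts below are hypothetical examples.

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """Probability of getting at least `correct` right out of `trials`
    ABX trials by pure guessing (each trial is a 50/50 coin flip)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# 12 of 16 correct would be decent evidence of an audible difference (~3.8% by chance);
# 8 of 16 is exactly what guessing predicts.
print(round(abx_p_value(12, 16), 3))  # -> 0.038
print(round(abx_p_value(8, 16), 3))   # -> 0.598
```

If burned-in and fresh drivers really sounded dramatically different, results like 12/16 should be easy to produce blind; so far, nobody has.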
Now cue a couple more people to chime in that they didn't like headphone X at first, then thought the sound got dramatically better over time so that "proves" to them that "burn in" is real...
PS: Something else that works against the reality of "burn-in" (at least in my mind) is the fact that not one single phone has EVER gotten worse with break-in. If dramatic changes really were taking place in the driver's materials, you'd expect that at least occasionally a headphone would sound better to someone before those changes took place. Does this not strike anyone else as odd?
Quote:
Look at CPUs: my Barton 2500 straight outta the box couldn't go above 3200 on default voltage, and became unstable after upping the voltage. And now I can get to 2.7 GHz+ AFTER BURN-IN.
Agreed (and very common), but it's not the CPU that's "burning in": it's the thermal paste/pad settling in and conducting heat better.
Yeah, my grilled-cheese sandwich toasts more evenly in the pan once the butter "burns in", too...
Quote:
I mean, how can someone not "BELIEVE" in burn-in, when it's so blatantly obvious with every headphone and speaker out there? And how can that same person consider himself an audiophile, and have the nerve to come here preaching out their ass?
Whoa, who's preaching again? Think about that.
As it stands right now, there is no evidence for "burn-in". Refusing to accept something without evidence isn't preaching; it's the only reasonable position.