No, it isn't about quality of sound. Nobody who advocates for burn-in ever claims the burn-in made the phones in question sound worse. All I'm looking for is some relatively meaningful proof of ANY audible change resulting from a recommended burn-in period. Once that has been established, then maybe we can start talking about whether that change is good or not...
And I added an edit to my previous post which presents at least something more than just beliefs.
...and it's worth noting that one set of these beliefs is commonly used TO SELL STUFF...which it seems to me should require a little more in the way of proof than beliefs that are not designed to sell stuff.
And I'll see if I can find that Sennheiser test you asked for. In the meantime, perhaps you can find me the one that shows ten or more people WERE able to tell the difference! LOL...
That link you posted concludes with: "Therefore, it is possible that there are headphones on the market that would require break-in but were not included in our test. It is also possible that increasing the length of the test by a few hundred hours, or testing headphones of other types (with electrostatic or hybrid drivers), would show evidence of burn-in. Additionally, we only compared the headphones in terms of frequency, phase, and harmonic distortion response. Other metrics such as intermodulation distortion or non-coherent distortion may be able to show a pattern of change that could be considered evidence of headphone break-in."
Yes, it's about quality of sound. In my example about pad wear, an audio engineer will find the quality worse, while your typical at-home listener will find it better. Sound quality, from an audiophile's perspective (and NOT an audio engineer's), is subjective. Why are you looking for meaningful proof? If you've bought a pair of headphones, listened to them, liked them, listened to them for 500 hours, noticed no change of any kind, and are still happy with them, then all's well in your world.
The link you posted previously also shows some changes. Whether those are audibly relevant or not will depend on many different factors, but the phase and frequency response timelapse graphs on the link you posted clearly show a shift in the curves over X hours of break-in.
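Just to illustrate what "audibly relevant" could mean in practice, here's a rough Python sketch comparing two hypothetical frequency response measurements (before and after break-in) against a roughly 1 dB just-noticeable-difference threshold. The numbers are completely made up, and 1 dB is a simplification of audibility research, but it shows the kind of comparison I mean:

```python
import numpy as np

# Hypothetical before/after frequency response measurements (made-up data).
freqs = np.array([20, 100, 1000, 5000, 10000, 20000])    # Hz
fr_before = np.array([-2.0, 0.5, 0.0, 1.2, -1.0, -4.0])  # dB
fr_after = np.array([-1.7, 0.6, 0.0, 1.1, -1.2, -4.3])   # dB

# A commonly cited rough threshold for hearing a level difference
# is on the order of 1 dB (an assumption, and a simplification).
JND_DB = 1.0

# Flag each band where the before/after delta exceeds the threshold.
delta = np.abs(fr_after - fr_before)
for f, d in zip(freqs, delta):
    flag = "possibly audible" if d >= JND_DB else "probably inaudible"
    print(f"{f:>6} Hz: change {d:.2f} dB -> {flag}")
```

With these invented numbers every shift lands well under 1 dB, which is exactly the point: a visible shift on a graph and an audible shift are two different claims.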
As for selling stuff, you'd have to be a pretty terrible salesman if your pitch was "my product is only good after 500 hours of use". That's not a very good sales argument at all.
As for showing you links where people can tell a difference, so far the only ones I've seen are the articles about Stradivarius violins, where it is explicitly pointed out that people can tell a difference between the old and the new. I don't know of any actual volume-matched, double-blind ABX testing done with headphones, which is why I've asked for some, and I am not here to prove my belief one way or the other. I'm only interested in seeing actual data and valid testing, not anecdotal evidence, whether that evidence favors the idea that burn-in has an audible effect on sound signature or not.

Personally, I am very happy with the gear I have, and the only burn-in I've done on any pair of headphones I've ever owned happened while they were on my head with music playing through them. That being said, I haven't read anything in this whole thread that invalidates the argument that something like a diaphragm will loosen over time and affect the sound signature, and it seems very plausible to me.
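Since I keep asking for valid testing, here's the kind of math I mean. In an ABX trial the listener hears A, B, then X (which is randomly A or B) and must identify X; pure guessing gets 50% per trial, so you score the session against the binomial distribution. A minimal Python sketch, with a hypothetical session result:

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: the probability of getting at least
    `correct` answers right out of `trials` by pure guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2**trials

# Hypothetical session: 12 correct identifications out of 16 trials.
correct, trials = 12, 16
p = abx_p_value(correct, trials)
print(f"{correct}/{trials} correct -> p = {p:.4f}")
# Here p is about 0.038: below the usual 0.05 cutoff, so that listener
# probably heard a real difference. A result near 8/16 would be
# indistinguishable from coin-flipping.
```

That's the bar I'd like to see a burn-in claim clear: several listeners, volume-matched, double-blind, with the trial counts and p-values published, in either direction.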