Is burn in real or placebo?
Mar 19, 2019 at 3:25 PM Post #751 of 897
lol. Can you present me with a random article by a random guy nobody knows anything about that proves burn in exists? I mean that test I linked to seemed pretty good to me - and pretty well implemented. It's exactly what I had in mind prior to my having found it. Sure, it isn't perfect, but it's a fair bit better than nothing - which is what you've presented so far. I'm not worked up, btw.
 
Mar 19, 2019 at 3:36 PM Post #752 of 897
lol. Can you present me with a random article by a random guy nobody knows anything about that proves burn in exists? I mean that test I linked to seemed pretty good to me - and pretty well implemented. It's exactly what I had in mind prior to my having found it. Sure, it isn't perfect, but it's a fair bit better than nothing - which is what you've presented so far. I'm not worked up, btw.

You seem to think I am here trying to prove burn-in is real, or at least that is the role you seem to have taken on in order to argue back and forth with me. That is not my goal here, and providing you with articles is not what I'm interested in doing in this discussion. The test you linked to is hardly conclusive of anything in my opinion. It's a start, certainly, but again this is the Sound Science forum. Anything short of perfect testing is not going to fly.
 
Mar 19, 2019 at 3:41 PM Post #753 of 897
Here is my take on burn-in:
  • I reckon a good starting hypothesis is that burn-in that makes a large, positive change in the sound from headphones or speakers is a myth. If someone can supply controlled scientific tests on a pair of decent headphones that contradicts this, then I'll be happy to revise my opinion, but, as usual, the burden of proof is on the folks that believe the myth.
  • The question of the existence of burn-in is kind of moot, since (if it were a real thing) burn-in would happen naturally as a result of use. If it happens, we're good eventually, if it doesn't, we're good right out of the box.
  • As has been pointed out, the myth of burn-in allows dealers and manufacturers to stall unhappy customers until their brains have a chance to adjust to the sound of their new speakers or headphones. It also gives those customers a warm fuzzy feeling that they are participating in a ritual that will improve their new purchase.
 
Mar 19, 2019 at 3:41 PM Post #754 of 897
I'm sorry, but when the alternative is no testing at all, imperfect testing is suitable for the time being. You asked if there had been any tests done comparing headphones pre-burn-in to phones post-burn-in. The answer is yes, there have been.

I'm suggesting that burn in is probably a myth - in response to the question posed by the thread title. I've presented an article that does some fairly rigorous tests on 4 different models of headphone and that concludes that burn in is probably a myth. If you don't want to advocate in favor of burn in, then there's no need for you to respond. If you do wish to advocate in favor of burn in...show me something better than what I presented.
 
Mar 19, 2019 at 3:50 PM Post #755 of 897
It's a start, certainly, but again this is the Sound Science forum. Anything short of perfect testing is not going to fly.
Actually, this is the Sound science forum, not the Unsound science forum. Perfect testing before flying just can't happen, and that's not the way science works. All the evidence I've seen indicates that burn-in isn't a real consideration. If you can supply one well-conducted study that shows it's real even for one set of decent cans, then we'll have to change our theory. Expecting a study that tests all the headphones on the market and finds no evidence of burn-in before you'll believe it doesn't exist is equivalent to having to look in every English garden and find no faeries before accepting that faeries don't exist.
 
Mar 19, 2019 at 4:07 PM Post #756 of 897
Great, and that's likely the case for that model of headphone. Or maybe it's not; plenty of product designers have ended up finding out new facts about their products only after end-users came back with feedback. That is typically how products are improved upon. What brand/model of headphones were these? "I've only spoken with one headphone designer who said burn-in is not necessary" is also anecdotal. Any concrete evidence to back up this claim?

I explained this earlier in the thread, but I'll explain it again...

I was asked by Oppo (along with Tyll and several other people) to evaluate their prototype for the PM-1. They sent me three iterations of the headphones over a period of four months or so; the third set was a final retail copy. We had a forum where evaluators and the designer of the headphones could communicate. The designer was very open and available to us to answer questions and provide technical information. He told us that burn-in wasn't going to affect the sound at all, and that the manufacturing tolerance was +/-1dB across the response curve.

I brought over a friend of mine who is a sound engineer with an interest in achieving a balanced response from professional speaker installations. He brought over his test equipment and we set it up on my kitchen table. I had two sets of PM-1s at the time he was here; one was a couple of months old and I had been using it a lot since I received it. The other set had just arrived and was still in the shipping box. I was told that the only difference between them had to do with clamping pressure. (That was the issue several of us on the evaluation committee had commented on.) My friend went over the response his way using tones, and I did it my informal way using a reference music track and an equalizer. We both came up with pretty much the same results, and my sound mixer friend found the difference between the two sets of cans was within 1dB, differing only slightly around 1kHz to 3kHz (if I remember correctly). With my informal way of comparing, I couldn't tell any difference. When Tyll's measurements were published (I don't know if he burned in or not), his results were within a dB or so of ours too.

The engineer from Oppo explained to me that the reason a high-end set of cans costs more than a midrange set isn't design or materials; it's manufacturing tolerances. Oppo was testing and rejecting a fair percentage of the headphones as they came off the production line because they didn't meet the spec. That cost, plus the cost of testing each set of headphones individually, got rolled into the price of the headphones that did meet spec and ended up being sold.

This was not a strict scientific test by any means, but you have three people (me, my sound mixer friend, and Tyll) using three different methods of evaluating and coming out at basically the same place. And burned in and not burned in headphones came out the same. My notes on my evaluation are here in Head-Fi somewhere in the archives if you're interested.

I would recommend that you not come here with a chip on your shoulder demanding things of us unless you are prepared to provide evidence yourself. It's easy to be an armchair duffer ramming through unfounded beliefs with sloppy logic, but those kinds of people don't fare well in Sound Science.
 
Mar 19, 2019 at 4:10 PM Post #757 of 897
I'm sorry, but when the alternative is no testing at all, imperfect testing is suitable for the time being. You asked if there had been any tests done comparing headphones pre-burn-in to phones post-burn-in. The answer is yes, there have been.

I'm suggesting that burn in is probably a myth - in response to the question posed by the thread title. I've presented an article that does some fairly rigorous tests on 4 different models of headphone and that concludes that burn in is probably a myth. If you don't want to advocate in favor of burn in, then there's no need for you to respond. If you do wish to advocate in favor of burn in...show me something better than what I presented.

I disagree. Bad testing can be worse than no testing at all in many many cases.

Actually, this is the Sound science forum, not the Unsound science forum. Perfect testing before flying just can't happen, and that's not the way science works. All the evidence I've seen indicates that burn-in isn't a real consideration. If you can supply one well-conducted study that shows it's real even for one set of decent cans, then we'll have to change our theory. Expecting a study that tests all the headphones on the market and finds no evidence of burn-in before you'll believe it doesn't exist is equivalent to having to look in every English garden and find no faeries before accepting that faeries don't exist.

Again, bad testing is worse than no testing. I've not seen any such well-conducted study that shows burn-in isn't a real consideration; I'm still waiting for someone to show me some data here. I'm not expecting a study that tests all headphones, just one model, but under proper circumstances: comparing two pairs, one brand new and one with 500 or more hours of use on it, with volume matching in a double-blind ABX setting.

Ultimately, everything you call "evidence" is showing slight differences after burn in. You call those "not a real consideration" and that's fine, but that is subjective and your belief, not concrete evidence of anything.
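The double-blind ABX setup described above also needs a pass/fail criterion. Here is a minimal sketch of how such a test could be scored, using a one-sided binomial check; the trial counts and the function name are illustrative assumptions, not anything specified in the thread:

```python
# Hypothetical scoring for a double-blind ABX burn-in test: the listener
# repeatedly guesses whether X is the brand-new pair (A) or the
# 500-hour pair (B). Trial counts below are illustrative only.
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided binomial p-value: chance of scoring >= `correct`
    out of `trials` by pure guessing (p = 0.5 per trial)."""
    hits = sum(comb(trials, k) for k in range(correct, trials + 1))
    return hits / 2 ** trials

# 12 of 16 correct beats chance at the conventional 5% level...
print(round(abx_p_value(12, 16), 4))  # 0.0384
# ...while 9 of 16 is indistinguishable from guessing.
print(round(abx_p_value(9, 16), 4))   # 0.4018
```

The point of the binomial check is that "I heard a difference a few times" isn't enough; the listener has to beat chance over enough trials to rule out lucky guessing.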
 
Mar 19, 2019 at 4:18 PM Post #758 of 897
I explained this earlier in the thread, but I'll explain it again...

I was asked by Oppo (along with Tyll and several other people) to evaluate their prototype for the PM-1. They sent me three iterations of the headphones over a period of four months or so; the third set was a final retail copy. We had a forum where evaluators and the designer of the headphones could communicate. The designer was very open and available to us to answer questions and provide technical information. He told us that burn-in wasn't going to affect the sound at all, and that the manufacturing tolerance was +/-1dB across the response curve.

I brought over a friend of mine who is a sound engineer with an interest in achieving a balanced response from professional speaker installations. He brought over his test equipment and we set it up on my kitchen table. I had two sets of PM-1s at the time he was here; one was a couple of months old and I had been using it a lot since I received it. The other set had just arrived and was still in the shipping box. I was told that the only difference between them had to do with clamping pressure. (That was the issue several of us on the evaluation committee had commented on.) My friend went over the response his way using tones, and I did it my informal way using a reference music track and an equalizer. We both came up with pretty much the same results, and my sound mixer friend found the difference between the two sets of cans was within 1dB, differing only slightly around 1kHz to 3kHz (if I remember correctly). With my informal way of comparing, I couldn't tell any difference. When Tyll's measurements were published (I don't know if he burned in or not), his results were within a dB or so of ours too.

The engineer from Oppo explained to me that the reason a high-end set of cans costs more than a midrange set isn't design or materials; it's manufacturing tolerances. Oppo was testing and rejecting a fair percentage of the headphones as they came off the production line because they didn't meet the spec. That cost, plus the cost of testing each set of headphones individually, got rolled into the price of the headphones that did meet spec and ended up being sold.

This was not a strict scientific test by any means, but you have three people (me, my sound mixer friend, and Tyll) using three different methods of evaluating and coming out at basically the same place. And burned in and not burned in headphones came out the same. My notes on my evaluation are here in Head-Fi somewhere in the archives if you're interested.

I understand what you're saying, but that is hardly evidence that burn-in never happens for any headphones. In fact, since there was a recorded difference (only slight, but a recorded difference nonetheless) with the PM-1s, what's to say that this difference isn't more accentuated with other headphone brands/models? Again, you could not hear a difference yourself and that's fine, but you're asking me to trust your ear, not really giving me factual evidence, just an anecdote about you and a friend testing, and then Tyll, who was able to reliably tell a difference (albeit small) between a brand new pair and a "burned in" pair: https://www.innerfidelity.com/content/testing-audibility-break-effects-page-2

He says himself in that article: "It's clear to me, having had the experience, that there is indeed an audible difference when breaking-in a pair of Q701 headphones. I've seen measured differences, and now experienced audible differences. While the measured differences are small, I believe the human perceptual system is exquisite and able to perceive, sometimes consciously and sometimes sub-consciously, subtle differences."
 
Mar 19, 2019 at 4:19 PM Post #759 of 897
I understand what you're saying, but that is hardly evidence that burn-in never happens for any headphones.

Uh oh. You just went from "show me one example" to "one example isn't enough". Obviously no one is going to be able to prove a negative with every set of cans in the world. That just isn't possible, and it isn't reasonable to demand it. It's better to try to prove burn-in exists. Can you show me evidence that burn-in is audible to human ears with any headphones under a controlled listening test? I'd be interested in seeing that. (Remember, when you answer, that you already said you didn't know of any.)
 
Mar 19, 2019 at 4:23 PM Post #760 of 897
I disagree. Bad testing can be worse than no testing at all in many many cases.



Again, bad testing is worse than no testing. I've not seen any such well-conducted study that shows burn-in isn't a real consideration; I'm still waiting for someone to show me some data here. I'm not expecting a study that tests all headphones, just one model, but under proper circumstances: comparing two pairs, one brand new and one with 500 or more hours of use on it, with volume matching in a double-blind ABX setting.

Ultimately, everything you call "evidence" is showing slight differences after burn in. You call those "not a real consideration" and that's fine, but that is subjective and your belief, not concrete evidence of anything.

lol. OMG. The test was only "bad" according to you and because it didn't conclude what you wanted it to conclude. If you have something that is at least as good that indicates burn in is real, present it. Until then...I win. :D
 
Mar 19, 2019 at 4:35 PM Post #761 of 897
Uh oh. You just went from "show me one example" to "one example isn't enough". Obviously no one is going to be able to prove a negative with every set of cans in the world. That just isn't possible, and it isn't reasonable to demand it. It's better to try to prove burn-in exists. Can you show me evidence that burn-in is audible to human ears with any headphones under a controlled listening test? I'd be interested in seeing that. (Remember, when you answer, that you already said you didn't know of any.)

The goal posts have moved substantially over the past couple pages! :)
 
Mar 19, 2019 at 4:43 PM Post #762 of 897
A fairly simple approach to testing if someone with the right equipment wants to take a run at it.

Eliminating the variables:
  • Need to have identical pad wear
  • Need to have identical placement
  • Need to avoid reliance on human audio memory.
To deal with the above, I would suggest buying two of the same model of headphone, building a jig that allows identical placement on the measurement rig, and using two measurement rigs. Using the jig to place both headphones on the measurement rigs on day 1 would ensure identical placement (within reasonable limits) and would ensure pad break-in occurs identically (within reasonable limits) on both headphones. Take baseline measurements of both headphones.

Play music/tones/whatever on headphone "A" for 30 days while leaving headphone "B" on its rig without any input. Without moving the headphones, measure again on day 30.

While not perfect, I believe that process should show any non-pad/placement changes due to burn-in, if any exist.
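The comparison step of that protocol boils down to checking whether the driven pair's response moved more between day 1 and day 30 than the idle control's did. A minimal sketch, assuming the rig exports measurements as frequency-to-dB readings; all names and numbers below are made up for illustration:

```python
# Hypothetical day-1 vs day-30 comparison for the two-rig protocol above.
# Measurements are assumed to arrive as {frequency_Hz: level_dB} readings
# exported from the rig; the values here are invented for illustration.
def max_deviation_db(day1: dict, day30: dict) -> float:
    """Largest absolute level change across the shared measurement points."""
    shared = day1.keys() & day30.keys()
    return max(abs(day30[f] - day1[f]) for f in shared)

driven_day1  = {100: -1.0, 1000: 0.0, 3000: 0.4, 10000: -2.1}
driven_day30 = {100: -1.2, 1000: 0.1, 3000: 0.9, 10000: -2.0}
idle_day1    = {100: -1.1, 1000: 0.0, 3000: 0.5, 10000: -2.2}
idle_day30   = {100: -1.1, 1000: 0.1, 3000: 0.6, 10000: -2.2}

# Burn-in would show up as the driven pair drifting more than the control.
print(round(max_deviation_db(driven_day1, driven_day30), 2))  # 0.5
print(round(max_deviation_db(idle_day1, idle_day30), 2))      # 0.1
```

The idle pair's drift acts as the noise floor (rig drift, temperature, pad settling); only driven-pair changes clearly above it would count as evidence of burn-in.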
 
Mar 19, 2019 at 4:47 PM Post #763 of 897
A fairly simple approach to testing if someone with the right equipment wants to take a run at it.

Eliminating the variables:
  • Need to have identical pad wear
  • Need to have identical placement
  • Need to avoid reliance on human audio memory.
To deal with the above, I would suggest buying two of the same model of headphone, building a jig that allows identical placement on the measurement rig, and using two measurement rigs. Using the jig to place both headphones on the measurement rigs on day 1 would ensure identical placement (within reasonable limits) and would ensure pad break-in occurs identically (within reasonable limits) on both headphones. Take baseline measurements of both headphones.

Play music/tones/whatever on headphone "A" for 30 days while leaving headphone "B" on its rig without any input. Without moving the headphones, measure again on day 30.

While not perfect, I believe that process should show any non-pad/placement changes due to burn-in, if any exist.

I think the way it was done in the article I linked is even better. Using two pairs of the same model introduces the variable of unit-to-unit variance between the two pairs. Testing a single pair at different points during the burn-in process, without it ever being moved on the testing device, seems pretty sound to me...
 
Mar 19, 2019 at 4:50 PM Post #764 of 897
I think the way it was done in the article I linked is even better. Using two pairs of the same model introduces the variable of unit-to-unit variance between the two pairs. Testing a single pair at different points during the burn-in process, without it ever being moved on the testing device, seems pretty sound to me...


Good point. My only thought about using two pairs would be to avoid the inevitable "unicorn" argument, and the variability between the two headphones should stay consistent.
 
Mar 19, 2019 at 5:11 PM Post #765 of 897
... It also gives those customers a warm fuzzy feeling that they are participating in a ritual that will improve their new purchase.

Salient point. That ritualistic quality is pretty appealing. Setting up your little burn-in station and then waiting for the process to complete, at which time you get to audition those sweet burned-in phones...it's like brewing up a good beer or something. :)
 
