question about iem drivers
Jan 1, 2009 at 4:20 PM Thread Starter Post #1 of 15

xcluded
100+ Head-Fier, Joined Dec 27, 2008, Posts: 151, Likes: 116
Hi people, as far as I know, dynamic drivers need to burn in.

What about balanced armature drivers? I searched quite a few threads but could not confirm this.

I've been running in my IEMs for some time... the sound seems to have opened up and settled down a little. I'm wondering whether that's the burn-in effect of the driver, or of the cable instead.

 
Jan 1, 2009 at 4:33 PM Post #2 of 15
Burn-in varies case by case. Most drivers don't change, but some do. If it sounds better to you, it is better, regardless of why it is so.
 
Jan 1, 2009 at 5:37 PM Post #4 of 15
Some argue that the cable requires burn-in, but as far as I know the drivers aren't burning in.
I personally hold the view that if you hear a difference, then that is what matters.
My guess is that the cable likely burned in. I think Cardas posted an insight about cable burn-in here - Cardas Audio -
if you'd like to read it.

Dave
 
Jan 1, 2009 at 6:37 PM Post #5 of 15
On a related issue.

I went to see an optician to get myself new glasses, as my eyesight had gotten worse. I finally got them yesterday, and at first everything seemed very sharp compared to my old glasses. I even got a light headache. But now it seems much better and more accurate; they have 'softened up'.

Do you guys think the lenses of the glasses burnt in, or was it maybe the frame? Has anyone had similar experiences?

Before you say anything silly about glass not containing moving parts, let me tell you that the frames surely do move, and glass is actually not in a solid phase, so it moves (though very slowly). Also, the photons which go through the glass might also require lots of burn in.

On topic, I think it was the solder connecting the cable to the headphones that required burn-in. Solder requires lots of burn-in, in my opinion. It's true, because I believe it, and you can't prove otherwise. Ha!
 
Jan 1, 2009 at 7:25 PM Post #8 of 15
The best thing to do would be to ask a company rep whether the people who design the earphones have "burn-in" designed into them, or whether it has any effect on the earphones.

By "designed into" I mean the material used to create the sound is expected to change over time.
 
Jan 1, 2009 at 7:31 PM Post #9 of 15
My eyes have a burn in time too.

Like two years ago I went for a check up and my eyesight was 20/19.

I went this August and my eyesight is apparently a 20/15 now? How the hell did it just get better?
 
Jan 1, 2009 at 7:50 PM Post #10 of 15
Quote:

Originally Posted by mape00
On a related issue.

I went to see an optician to get myself new glasses, as my eyesight had gotten worse. I finally got them yesterday, and at first everything seemed very sharp compared to my old glasses. I even got a light headache. But now it seems much better and more accurate; they have 'softened up'.

Do you guys think the lenses of the glasses burnt in, or was it maybe the frame? Has anyone had similar experiences?

Before you say anything silly about glass not containing moving parts, let me tell you that the frames surely do move, and glass is actually not in a solid phase, so it moves (though very slowly). Also, the photons which go through the glass might also require lots of burn in.



1. Glass is NOT a liquid. Antique window panes are thicker at the bottom because of how they were made.

2. I won't even start on "the photons which go through the glass might also require lots of burn in".

3. My optician told me last week that it'd take a few days for my eyes to adjust to my new glasses... so it's the EYES that need burn-in. Based on what I've learned on head-fi, you should probably watch TV static (white noise) for 100 hours or so before using your new glasses.


4. Back on-topic, I did notice a slight amount of change with my TF10s over the first few days... although, like I hint at in #3, that could just be my ears adjusting to the new sound signature.
 
Jan 1, 2009 at 8:02 PM Post #11 of 15
Quote:

Originally Posted by FYDave
My eyes have a burn in time too.

Like two years ago I went for a check up and my eyesight was 20/19.

I went this August and my eyesight is apparently a 20/15 now? How the hell did it just get better?



Maybe you ate a lot of ice cream and accidentally (and fortunately!) cryoed your optic nerves?

Nailzs: I think Don Wilson, who designed the ER-4, wrote somewhere that they didn't take into account cable burn-in (more precisely, he couldn't even imagine how changing the cable could make any difference) or driver burn-in (he claimed their measurements indicated extremely little change over time). So there you have a partial answer.
 
Jan 1, 2009 at 8:28 PM Post #12 of 15
Quote:

Originally Posted by oogabooga
1. Glass is NOT a liquid. Antique window panes are thicker at the bottom because of how they were made.


There may still be changes in the internal structure of the glass. It doesn't have to change visibly on a macroscopic level. There could be elastic effects similar to those introduced by dislocations in crystals, which are somehow affected by usage.

Quote:

Originally Posted by oogabooga
2. I won't even start on "the photons which go through the glass might also require lots of burn in".


It's a complicated matter. Because of quantum mechanics, you can't even say the photons went through the glass. Also, since you can never have a photon at rest in any inertial frame, you won't be able to stop it and ask if it likes to burn in. Also, photons might turn out to be strings or branes, which are vibrating the same way a membrane is vibrating. Hence, it might require burn-in. You can't disprove it.

Quote:

Originally Posted by oogabooga
3. My optician told me last week that it'd take a few days for my eyes to adjust to my new glasses... so it's the EYES that need burn-in. Based on what I've learned on head-fi, you should probably watch TV static (white noise) for 100 hours or so before using your new glasses.



Yeah, but every educated person would also tell you that leaving your cable burning in for hundreds of hours is ****ing bat**** insane, yet it's very common here on head-fi, and lots of people think it makes a difference. There might be a conspiracy, don't you think?

Quote:

Originally Posted by oogabooga
4. Back on-topic, I did notice a slight amount of change with my TF10s over the first few days... although, like I hint at in #3, that could just be my ears adjusting to the new sound signature.


I still think it's the solder. Maybe it's the electrons passing through it burning off residual flux or something. Or maybe it's related to quantum gravity. Who knows.
 
Jan 1, 2009 at 8:45 PM Post #13 of 15
Quote:

Originally Posted by mape00
On a related issue.

I went to see an optician to get myself new glasses, as my eyesight had gotten worse. I finally got them yesterday, and at first everything seemed very sharp compared to my old glasses. I even got a light headache. But now it seems much better and more accurate; they have 'softened up'.

Do you guys think the lenses of the glasses burnt in, or was it maybe the frame? Has anyone had similar experiences?

Before you say anything silly about glass not containing moving parts, let me tell you that the frames surely do move, and glass is actually not in a solid phase, so it moves (though very slowly). Also, the photons which go through the glass might also require lots of burn in.



That happens to me when I get new glasses, and I believe burn-in can have two meanings. The first is the actual settling of the headphone's drivers. The second is the person getting used to the sound of the headphone. The glasses analogy is probably the second definition of burn-in.
 
Jan 1, 2009 at 9:13 PM Post #14 of 15
Quote:

Originally Posted by mape00
There may still be changes in the internal structure of the glass. It doesn't have to change visibly on a macroscopic level. There could be elastic effects similar to those introduced by dislocations in crystals, which are somehow affected by usage.


I can't see how photons (in the visible spectrum, at any rate) are going to effect a change in the internal structure of glass. As for the dislocations in crystals (which I assume to mean defects in the crystal lattice), glass is not a crystal but an amorphous solid, so I don't see how these elastic effects would apply.

Quote:

Originally Posted by mape00
It's a complicated matter. Because of quantum mechanics, you can't even say the photons went through the glass. Also, since you can never have a photon at rest in any inertial frame, you won't be able to stop it and ask if it likes to burn in. Also, photons might turn out to be strings or branes, which are vibrating the same way a membrane is vibrating. Hence, it might require burn-in. You can't disprove it.


I think my point was that it would be the glasses that would require burn-in, not the photons, in the same way that people say the cable/cans/amp needs burning-in, not the electrons. Even if you could stop the photon (or slow it down in a Bose-Einstein condensate), I don't think there is a "burn-in" Hamiltonian that you could apply to measure the burn-in energy, hence it doesn't exist. You can't disprove that!

Quote:

Originally Posted by mape00
Yeah, but every educated person would also tell you that leaving your cable burning in for hundreds of hours is ****ing bat**** insane, yet it's very common here on head-fi, and lots of people think it makes a difference. There might be a conspiracy, don't you think?


I'm of two minds on this... I'll get back to you after I've finished burning in my cable for 200 hours.


Quote:

Originally Posted by mape00
I still think it's the solder. Maybe it's the electrons passing through it burning residue flux or something. Or maybe it's related to quantum gravity. Who knows.


Could also be new tips soaking up the cerumen...
 
Jan 1, 2009 at 11:23 PM Post #15 of 15
Quote:

Originally Posted by oogabooga
I can't see how photons (in the visible spectrum, at any rate) are going to effect a change in the internal structure of glass. As for the dislocations on crystals (which I assume to mean defects in the crystal lattice), glass is not a crystal but an amorphous solid, so I don't see how these elastic effects would apply.


And I can't see how microwave radiation from mobile phones could cause cancer, but there's still lots of money going into that kind of research. Maybe there is enough short-range order somewhere in the glass; maybe there are crystals present somewhere because of inadequate processing. And what about the anti-reflective coating?

Quote:

Originally Posted by oogabooga
I think my point was that it would be the glasses that would require burn-in, not the photons, in the same way that people say the cable/cans/amp needs burning-in, not the electrons. Even if you could stop the photon (or slow it down in a Bose-Einstein condensate), I don't think there is a "burn-in" Hamiltonian that you could apply to measure the burn-in energy, hence it doesn't exist. You can't disprove that!


It would be kind of ugly if photons had a tiny but undetectable rest mass (which they would need to have to be able to burn in, unless something is wrong with special relativity). I bet the burn-in Hamiltonian would be equally ugly and undetectable (except by my videophile eyes).
 
