Originally Posted by ACDOAN
" All Bryston amps get a rugged 100 factory hours burn-in consisting of a square wave input signal driving the amplifier into capacitive load slightly under clipping.Unlike resistive load, which dissipates all the energy as heat, a capacitive load feeds back the entire signal into the amplifier which puts maximum thermal stress on the output stage. After burn-in, each amplifier is again tested; the results are shipped with the amplifier. "
As I said - if the product will change in a meaningful way due to burn-in, it will be burned in at the factory. No company wants their product to perform poorly out of the box - that first impression is important.
If it wasn't burned in at the factory, then there's no meaningful change from burn-in.
And please note that meaningful is the key word there. That's not to say performance will remain completely unchanged over time, but it won't change so significantly that it sounds like a completely different device from the one you didn't like when you took it out of the box.
If, after a few hundred hours of listening (listening, not leaving them running unattended) you start to like the headphones/amp/whatever a lot more, then it's because you have now adjusted to its sound - it's not that it has changed.
And it sounds like the "burn-in" that Bryston are doing is actually stress-testing the hardware before shipping it out, rather than anything that impacts the audio quality.
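For what it's worth, the physics of why a capacitive load makes a good stress test is straightforward. Here's a rough sketch assuming an ideal capacitor C driven with output voltage v(t) - real loads have some series resistance, but the point stands:

    % A resistive load turns the delivered power into heat itself:
    P_R = \frac{V^2}{R}

    % An ideal capacitor absorbs no average power over a cycle, since
    % i(t) = C\,\frac{dv}{dt} and v^2 is periodic:
    \langle p \rangle = \left\langle v \cdot C\,\frac{dv}{dt} \right\rangle
                      = \frac{C}{2}\,\frac{d}{dt}\langle v^2 \rangle = 0

    % So the energy the amp delivers comes straight back, and the output
    % transistors dissipate it instead of the load. A square wave makes
    % dv/dt spike at every edge, piling big charging currents on top.

That's a torture test for the output stage, not a treatment that changes how it sounds.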
I see the same thing mentioned on forums discussing Plasma TVs as well. Originally people suggested that you needed 1000 hours on a set before it was properly run in and operating at its peak performance.
People were making claims about how the dithering noise on their Pioneer plasma sets magically disappeared by the time it hit 1000 hours.
That "hash digital image" when they first switched from their CRT to a Plasma disappeared after several hundred hours.
And over time, that number has dropped to about 100 hours, because newer plasmas are better than they used to be, so they don't need to be run in as long.
A lot of these arguments can be quite convincing, and you have a lot of people repeating this information, and saying it made a big difference to them.
Well, things like the dithering on a Pioneer plasma are inherent to those displays - if they reduced the amount of dither, you would see a drastic increase in banding in the image. They have very limited gradation and need a lot of dither to make up for it.
So that can't be changing over time - yet people report that it does.
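To see why the dither can't simply "disappear", here's a minimal sketch in Python. The numbers are hypothetical - 32 gradation levels standing in for a panel's limited per-frame precision - but it shows the trade-off: without dither you get wide bands of identical values, with dither those bands break up into fine noise.

    import numpy as np

    # Hypothetical illustration: a smooth 0..1 ramp across 1920 pixels,
    # quantized to 32 levels - standing in for limited panel gradation.
    ramp = np.linspace(0.0, 1.0, 1920)
    levels = 32

    # No dither: hard quantization produces wide runs of identical
    # values, which show up on screen as visible bands.
    banded = np.round(ramp * (levels - 1)) / (levels - 1)

    # Dither: add ~half an LSB of random noise before quantizing. The
    # banding breaks up into noise the eye averages back into a gradient.
    rng = np.random.default_rng(0)
    noise = rng.uniform(-0.5, 0.5, ramp.size) / (levels - 1)
    dithered = np.round((ramp + noise) * (levels - 1)) / (levels - 1)

    def mean_run_length(x):
        # Average length of runs of identical values: long runs = bands.
        return x.size / (np.count_nonzero(np.diff(x)) + 1)

    print("mean band width, no dither: %5.1f px" % mean_run_length(banded))
    print("mean band width, dithered:  %5.1f px" % mean_run_length(dithered))

The dither trades banding for noise, and since it's baked into how the panel drives its pixels, it's not something hours on the clock can remove.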
There used to be reports that the black level on Panasonic plasmas got better over time - well it has now been shown that it actually gets worse over time.
And as for claims that the "harsh digital image" has become more analog - displays like plasmas have a fixed pixel structure, and their image processing doesn't change over time. So it's the same thing there - people are just getting used to the device, rather than it actually improving.