Sgt. Ear Ache
Amps SHOULD perform better than their published specs - those are only minimum performance figures. And most do just that: they measure better.
And most of them do need about those 300 hours to really settle down and remain constant in performance - at least when operated at the intended temperature. I hope no one will dispute that an amplifier switched on stone cold, after sitting unused for a considerable time, does not sound the same as one that has reached its optimum operating temperature.
The reaching-published-specs zealots were perhaps best answered by Bob Carver. After one of his amps failed to reach its published power output by a few watts (at a 100 or 200 W spec, IIRC) because of a few too few turns of wire in the transformer secondary in the first batch, both the competition and the spec maniacs had a field day. After that, he spec'd all of his amps as 101 + 101 watts, 201 + 201 watts, 401 + 401 watts, etc. - and saw to it that even with AC line voltage lower than normal, that "one more watt" was available.
Do you have any published data you can point me to comparing an amp's specs out of the box and after 300 hours of use, and revealing audible differences?
I'd also be interested in seeing the numbers after 100 and 200 hours as well. I always wonder how these burn-in numbers are arrived at. Why does something need 300 hours of burn-in? What could happen between 200 and 300 hours? And why does the process then stop at 300 hours? lol. What if measurements were taken at 400 hours (assuming such measurements are ever taken, and I don't think they are) and it's now appreciably WORSE than it was at 300 hours?
And while the Bob Carver story is mildly amusing, I suppose, it really has nothing to do with what we're talking about, since we're actually talking about IMPROVING on published specs (via burn-in) rather than simply reaching them.