Here's my 2 cents on burn-in, and I want to address both the audio and the failure aspects. I worked for 20+ years in semiconductors (AMD, ON Semi, TI, etc.) and electronics (starts with A, one of the largest in the world in terms of sales and money in the bank). So I have a background in these electronic components and systems, but not in audio.
First, on burn-in for sound. There is no burn-in that will change electrical parameters. The only possible burn-in effect would be physical or material based, possibly in some capacitors, drivers, etc. Then the question is whether burn-in changes anything related to audio, whether it is even perceivable, and in which direction the change goes.
If burn-in affects audio, my take is that any material change from burn-in is random and not well controlled, if it exists at all. So the audio impact would also be random: 50% might find the sound “better” after burn-in, and the other 50% might find it “worse.” Just as some like their audio with an edge, others like it more mellow and smooth. But everyone talks about burn-in as the audio sounding better. My take: it is the placebo effect. But who knows. If someone believes in it, do burn-in. Since I don't believe in it, I just start listening on day 1.
(Tubes are different and I have zero experience with this.)
Secondly, I want to talk about the early failures that Shane D and some others experienced. Electronic components follow a bathtub-shaped curve when failure rate is plotted against time. One contribution is early failures, often called infant mortality: due to manufacturing defects or process issues, there is a higher failure rate at the beginning of usage.
https://en.wikipedia.org/wiki/Bathtub_curve
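To make the bathtub shape concrete, here is a toy model (my own sketch, not from any real reliability data): the overall failure rate is the sum of a decreasing Weibull hazard (infant mortality), a constant rate (random failures), and an increasing Weibull hazard (wear-out). All the shape/scale numbers below are made up for illustration.

```python
def weibull_hazard(t: float, shape: float, scale: float) -> float:
    """Weibull hazard rate: decreasing if shape < 1, increasing if shape > 1."""
    return (shape / scale) * (t / scale) ** (shape - 1.0)

def bathtub_hazard(t_hours: float) -> float:
    """Toy bathtub curve: infant mortality + constant random + wear-out.

    All parameters are illustrative assumptions, not measured values.
    """
    infant = weibull_hazard(t_hours, shape=0.5, scale=1000.0)     # falls with time
    random_rate = 1e-4                                            # flat middle of the tub
    wearout = weibull_hazard(t_hours, shape=5.0, scale=50000.0)   # rises late in life
    return infant + random_rate + wearout
```

Plotting `bathtub_hazard` over time gives the classic shape: high at first (new units failing), a long flat bottom (the useful life), then rising again as parts wear out. Burn-in is just operating the unit long enough to get past the steep left wall.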
This is not a QC (quality control) or testing issue. Even if units are tested to weed out non-functional or sub-performing (not meeting spec) units, all passing components still follow these failure curves. Failure rates might be 1 in thousands for low-quality components, down to 1 in a hundred thousand for higher-quality ones. And an amp contains hundreds of these components, so the failure rates “add” up: any single component failure will cause the whole system to fail.
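How the rates “add” up can be sketched in a few lines (my numbers are illustrative assumptions, not real component data): if each of n components independently survives the infant-mortality window with probability 1 − p, the whole amp survives with probability (1 − p)^n, so even tiny per-component rates compound into a noticeable system-level rate.

```python
def system_early_failure_rate(n_components: int, p_component: float) -> float:
    """Probability that at least one of n independent components fails early."""
    return 1.0 - (1.0 - p_component) ** n_components

# Hypothetical amp: 300 components, each with a 1-in-10,000 early-failure chance.
rate = system_early_failure_rate(300, 1e-4)
print(f"{rate:.2%}")  # roughly 3% of units have some early failure
```

For small p this is close to n × p, which is why people loosely say the rates “add.”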
The only way to address these early failures is through burn-in. When you buy computers, you might hear that some system integrators assemble the system and then run a burn-in (maybe hours to overnight). That is to weed out early failures. In the consumer electronics industry, no one does burn-in, as it costs money and takes time; the amount of hardware and time required could not keep up with manufacturing demand. The exception is the space, flight, and military industries. Components used in satellites cost 1000x more, and burn-in is required: once the multi-million or billion dollar satellite goes up, failed components cannot be replaced or repaired. Those components do go through many days to a week of burn-in at the component level, depending on the grade.
Again, I just want to separate true QC from burn-in failures. Burn-in failures cannot be caught by quality control or testing; they only show up after running the units for hours, days, or weeks. Cheaper products using cheaper components can have higher failure rates across the board, including more early failures, so in that sense you could say it is “quality” related.
For me, when I buy electronics, I will definitely do my own burn-in to make sure any infant-mortality failures show up during the return period. So catching them early is a good thing. If issues happen multiple times to the same user (like Shane D), then it is hard to say whether there is some systemic issue or just random early failure.