If the line voltage changes after inserting a power conditioner, say by +/- 5% because the conditioner needs headroom to perform regulation, the only loudness-related parameter that could change is power amp headroom (the clipping point), and that shift would be something like 1 dB or less. Overall system gain and loudness should not, and cannot, change, because if they did it would imply that system gain is line-voltage dependent. And that would be a very bad design indeed. Matching an unconditioned line to the output voltage of a regulated power conditioner would be expensive and could change the source impedance of the line, altering the test conditions. If the test is an ABX/DBT of inserting a power conditioner, then nothing on the input side should be changed.
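To put a rough number on that headroom claim: amp supply rails track the mains more or less proportionally, and clipping headroom in dB goes as 20*log10 of the voltage ratio. A quick sketch (the function name and values are mine, just for illustration):

```python
import math

# Rough sketch: how far does a +/-5% mains shift move an amp's
# clipping point, assuming rail voltage tracks mains proportionally?
# Headroom change in dB = 20 * log10(voltage ratio).
def headroom_shift_db(line_change_fraction):
    return 20 * math.log10(1 + line_change_fraction)

print(round(headroom_shift_db(0.05), 2))   # +5% mains -> 0.42
print(round(headroom_shift_db(-0.05), 2))  # -5% mains -> -0.45
```

So even a full 5% sag or surge moves the clipping point by well under half a dB, consistent with "1 dB or less" above.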
"Make before break" is not necessary, and may not even be desirable. The short "break" in a "break-before-make" switch is only a few milliseconds long, and would be covered by the power supply filter in every audio device. In fact, MBB switching of power line AC devices may not be desirable in the case of a regulated power conditioner because it would apply a momentary input to output short.
Correct ABX switching of power conditioners requires a 3-relay system. Three, because the X selection must also go through a relay and timing system that generates an identical interruption and an identical acoustic "click"; otherwise the X choice would carry a "tell". Shunyata probably hasn't responded because a simple switch wouldn't work, and a power-line-level ABX comparator is not a simple project. Or possibly he figured out that none of their products could get through an ABX with anything better than 50% results.
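The control logic for such a box can be sketched in a few lines. Everything here is hypothetical (names and the 5 ms gap are assumptions), but it shows the key property: whichever of A, B, or X the listener selects, the controller performs the exact same break-before-make sequence, so the interruption and its click carry no tell.

```python
import random

GAP_MS = 5  # identical break-before-make gap for every selection

def switch_to(choice, x_is):
    """Resolve a listener selection (A, B, or X) to a relay,
    always using the same open/wait/close sequence."""
    target = x_is if choice == "X" else choice
    # every path: open current relay, wait GAP_MS, close target relay
    return {"relay": target, "gap_ms": GAP_MS}

x = random.choice("AB")  # hidden assignment for this trial
events = [switch_to(c, x_is=x) for c in ("A", "B", "X")]
print({e["gap_ms"] for e in events})  # -> {5}: same gap for all three
```

The point of the third relay falls out of `switch_to`: selecting X must not simply re-use the A or B path directly, or any timing or acoustic difference in that path would identify it.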