There are two things you should know....
First, virtually all solid-state headphone amps have a gain that is set by the circuitry (and that you vary with the volume control). Likewise, things like slew rate aren't likely to change much as long as the amp is operating within normal limits. This means that changing the supply voltage a little is NOT going to change the amount of volume you get at a particular setting. In general, the only thing it will alter significantly is the maximum output level you can reach before it clips. (The gain will be exactly the same but, with a lower supply voltage, the amp will "run out of steam" at a slightly lower level. If the supply voltage gets too low, you may simply reach a point where the amp can't produce any usable output before it starts to distort.)
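To make that concrete, here's a toy Python sketch; the gain, supply voltage, and headroom figures are made-up illustrative numbers, not specs from any particular amp. A fixed-gain stage produces identical output on two different supplies until the signal is big enough to hit the rails:

Code:
# Toy model of a fixed-gain amp that clips at its supply rails.
# All numbers here (gain, supply, headroom) are illustrative assumptions.

def amp_output(v_in, gain=5.0, v_supply=15.0, headroom=1.5):
    """Output follows gain * v_in until it hits the rails, then clips."""
    v_max = v_supply - headroom              # usable swing before clipping
    v_out = gain * v_in
    return max(-v_max, min(v_max, v_out))    # clamp (clip) at the rails

# Same input, two supply voltages: identical level below clipping...
print(amp_output(1.0, v_supply=15.0))   # 5.0
print(amp_output(1.0, v_supply=12.0))   # 5.0 (same volume)
# ...but the lower supply "runs out of steam" sooner on large signals:
print(amp_output(3.0, v_supply=15.0))   # 13.5
print(amp_output(3.0, v_supply=12.0))   # 10.5 (clips earlier)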
You should follow the manufacturer's recommendations because, depending on the exact circuitry they used, a voltage that is much too high can make the unit run very hot - or just plain burn it out (most components will run a little hotter as you raise the voltage, but at some point many will simply die) - while a voltage that is too low makes the amp clip sooner, which will sound "rougher and less dynamic".
Second, when you have a device that uses an AC adapter that PUTS OUT AC, that means that most of the power supply - the rectifiers and regulators - is inside the device itself. When an AC voltage is rectified, you end up with a DC voltage more or less proportional to it (the peak is about 1.4x the RMS value). This voltage is then smoothed by filter capacitors, and usually smoothed more thoroughly by a regulator. In the process, the regulator loses some voltage - typically 1 or 2 V - so you need more voltage going into the regulator than you're asking it to put out. In fact, most small AC adapters are somewhat "soft" in terms of output voltage: what they deliver varies depending on how much current you draw from them and how heavy the power supply itself is. A typical adapter rated "15VAC @ 500 mA" probably really puts out around 17 VAC @ 0 mA, and maybe as low as 10-12 VAC @ 500 mA. Another, bigger adapter, rated at "12VAC @ 2 Amps", may put out 14 VAC @ 0 mA, 13.5 VAC @ 500 mA, and 12 VAC @ 2 Amps. In that case, the bigger power supply is actually putting out MORE voltage at 500 mA than the smaller one rated for the higher voltage.
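If you want to see roughly how the numbers work out, here's a small Python sketch of the rectifier-and-regulator arithmetic; the diode drops and regulator dropout are typical assumed values, not the specs of any specific unit:

Code:
import math

# Rough arithmetic for what the amp sees from an AC-output adapter.
# Diode drops and regulator dropout are typical assumed values.

def available_dc(v_ac_rms, diode_drop=0.7, regulator_dropout=1.5):
    """Peak DC after a bridge rectifier, and what the regulator can deliver."""
    v_peak = v_ac_rms * math.sqrt(2)         # peak is ~1.4x the RMS value
    v_rectified = v_peak - 2 * diode_drop    # two diodes conduct per half-cycle
    v_regulated = v_rectified - regulator_dropout
    return v_rectified, v_regulated

print(available_dc(12.0))   # ~ (15.6, 14.1) - a "12VAC" adapter at light load
print(available_dc(11.0))   # ~ (14.2, 12.7) - the same adapter sagging under load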
Adapters that put out DC voltage are different, and fall into three major groups. Two of them (linear regulated and switch-mode regulated) put out pretty much the same voltage regardless of the current you draw, until you reach their limit - at which point the voltage drops suddenly or the output shuts off entirely. The third type (unregulated) acts like I described above: its voltage drops as you increase the current you draw (and, again, heavier units can deliver more current with less of a drop in output voltage).
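As a rough illustration of the difference, here's a little Python model; the source resistance and current limit are invented figures, chosen so the unregulated numbers line up with the "12VAC @ 2 Amps" example above:

Code:
# Toy comparison: output voltage vs. load current for the supply types.
# The source resistance and current limit are invented figures; a heavier
# transformer corresponds to a lower source resistance (less sag).

def unregulated(v_no_load, r_source, i_load):
    """Voltage sags roughly linearly as you draw more current."""
    return v_no_load - r_source * i_load

def regulated(v_set, i_load, i_limit):
    """Linear or switch-mode regulated: flat until the current limit."""
    return v_set if i_load <= i_limit else 0.0   # drops suddenly / shuts off

for i_load in (0.0, 0.5, 2.0):
    print(f"{i_load:3.1f} A   unregulated: {unregulated(14.0, 1.0, i_load):4.1f} V"
          f"   regulated: {regulated(12.0, i_load, 2.0):4.1f} V")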
If your unit came with a power supply, then you should assume that the folks you bought it from made sure the one they included meets or exceeds the voltage and current required to work properly... and that substituting a bigger one, or one at a higher voltage, is probably unnecessary and not especially beneficial. (When they told you which one to use, they were probably being conservative; that way, as long as you bought one that met those ratings, they were sure it would work properly.)
Quote:
Regarding power adapters, I am a little bothered. My unit came with a 12VAC, 500mA adapter, but I thought I had read somewhere that the minimum voltage was 13.5V (JDS?). They do recommend a 12VAC adapter nevertheless, which makes it even more confusing.
I bought an 18VAC adapter in the end but couldn't tell the difference in volume when I kept the knob fixed and swapped adapters. Is this right? Or is the 18VAC going to provide better voltage swings, slew rates, etc.?
Thanks!