My question comes from a real-world example in my own set-up:

The other day I stopped to think about the difference in specs between my A/V receiver and my desktop headphone amp. The receiver puts out about 100 W per channel into 8 ohm speakers, while the desktop amp puts out up to 6 W into a 32 ohm load. My first thought was: why did I ever buy a headphone amp that outputs roughly a fifteenth of the power? After looking into it, I learned that factors such as output impedance (apparently the biggest one) and the nominal impedance of the speakers/headphones come into play. My desktop amp has an output impedance of about 1 ohm, while my receiver's is somewhere around 40 or 50 ohms. But how does that correlate to lower output power and/or driving the headphones more efficiently?

I've read that a voltage source should ideally have a very low output impedance (1 ohm or less) and that a current source is ideally the opposite (a very high output impedance). What I don't understand is why having a lower output impedance means that much less output power (6 W) is enough to drive a pair of headphones just as well as, or even better than, my A/V receiver. I tried to work through this with real numbers, Ohm's Law, and its variations, but I came to no conclusions. Can anyone guide me in the right direction? Thanks for any help you may have!
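For what it's worth, here's the kind of back-of-the-envelope calculation I was attempting. It's only a sketch: the 2 Vrms signal level and the 45 ohm receiver output impedance are numbers I plugged in for illustration, not measured values from my gear.

```python
# Rough comparison of power delivered into a headphone load from two sources
# with different output impedances, modelled as an ideal source in series
# with its output impedance (simple voltage-divider model).

def power_into_load(v_source_rms, z_out, z_load):
    """Power dissipated in the load, given source RMS voltage,
    source output impedance, and load impedance (all resistive here)."""
    # Voltage divider: only part of the source voltage appears across the load.
    v_load = v_source_rms * z_load / (z_out + z_load)
    return v_load ** 2 / z_load  # P = V^2 / R

Z_HEADPHONES = 32.0   # ohms, nominal headphone impedance
V_SIGNAL = 2.0        # Vrms, assumed signal level (just a guess)

for name, z_out in [("desktop amp (1 ohm out)", 1.0),
                    ("A/V receiver (45 ohm out)", 45.0)]:
    p = power_into_load(V_SIGNAL, z_out, Z_HEADPHONES)
    print(f"{name}: {p * 1000:.1f} mW into {Z_HEADPHONES:.0f} ohm headphones")
```

The numbers obviously shift with the output impedance, but I still can't connect them to why the 6 W amp is supposed to drive the headphones as well as or better than the receiver.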
The other day I happened to stop and think about the difference in specs between the A/V receiver I have and desktop amp I have. One, my A/v receiver, puts out about 100 W/channel into 8 ohm speakers and the other, desktop amp, puts up to 6W into 32ohm load. My first thought was, what the heck, why did I ever buy an amp for my headphones that outputs a 15th of the power. Well upon looking into, I know that may factors such as output resistance (being the biggest factor) and nominal load of the speakers/headphones will come into play. I know that my desktop amp has an output resistance of 1 ohm and that my receiver is somewhere near 40 or 50 ohms in output impedance. But how does this correlate to lower power at the output and/or driving the headphones more efficiently? I've read that a voltage source ideally has an output impedance of 1 ohm or less and that the opposite, a current source, would be ideally the opposite, but I don't understand how having a lower output impedance can cause reason to believe that less power at the output (6 W) is needed to drive a pair of headphones just the same or even better than my A/V receiver. I tried to look this over using real world examples with Ohm's Law and it's variations but came to no solutions or conclusions on this. Can anyone here guide me in the right path on this? Thanks for any help you may have!