Lately I've been obsessing over headphone amps, after experiencing firsthand what some extra power did to the sound of my IEMs on my portable rig. I want to replicate this on my desktop rig.

However, I think I'm lacking some technical knowledge about what all the power spec numbers actually mean for my music and my equipment. I'm not technically illiterate, it's just that I could afford to know a bit more before going on a shopping spree fueled by hubris and half-truths.

For instance, some 400 USD amplifiers state this in their output specs:

300 mW RMS at 300 Ohms, Single ended

900 mW RMS at 300 Ohms, Balanced

Whereas I go and find a 500 USD class A amp that simply states:

2.5 W at 16 Ohms

Or a 900 USD amp/dac that goes:

1.8 W per channel at 8 Ohms.

Or a 2.7K USD amp/dac that states:

Output: 7.0 volts balanced

And then a 350 USD Tube amplifier:

Power handling capacity: 1 W.

A 250 USD Class A amp:

250 mW at 300 Ohms RZ (single ended)

1000 mW at 300 Ohms RZ XLR (balanced)

What's going on here? How do I bring all these numbers to a common ground so I can make an accurate comparison between them? Am I getting more power at 300 Ohms per dollar with the 250 USD amp?
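My own back-of-the-envelope attempt at normalizing, using nothing fancier than Ohm's law (P = V²/R). Big caveat, and please correct me if this is wrong: this assumes each amp behaves like an ideal voltage source into my 300-ohm load, which real amps don't quite do (they have voltage and current limits), so treat the numbers as rough:

```python
# Normalize amp specs to power into one load via Ohm's law: P = V^2 / R.
# ASSUMPTION: the amp acts as an ideal voltage source into my load --
# real amps clip at voltage/current limits, so these are upper bounds.

LOAD = 300  # ohms, the impedance of my main cans


def power_from_voltage(v_rms, load=LOAD):
    """Power in mW delivered into `load` at a given RMS voltage."""
    return v_rms ** 2 / load * 1000


def voltage_from_power(p_mw, r_rated):
    """RMS voltage implied by a power rating at its rated impedance."""
    return (p_mw / 1000 * r_rated) ** 0.5


# The 2.7k USD amp's "7.0 volts balanced" spec, converted to power
# into 300 ohms:
print(round(power_from_voltage(7.0)))  # ~163 mW

# The 500 USD amp's "2.5 W at 16 Ohms" implies about 6.3 V RMS;
# IF it could swing the same voltage into 300 ohms, that would be:
v = voltage_from_power(2500, 16)
print(round(v, 1), round(power_from_voltage(v)))  # ~6.3 V, ~133 mW
```

So by that (possibly naive) math, the "7.0 volts" amp and the "2.5 W at 16 Ohms" amp both land in the same 130–165 mW ballpark at 300 Ohms, well under the 250 USD amp's claimed 250 mW. Is that a legitimate way to compare them?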

And this is without even taking into account all the other variables: output impedance, sound signature, connectivity, headphone efficiency...

Will my music sound "better" with a much more expensive, albeit probably "less powerful" amp, or should I just go for the most watts at 300 Ohms for the lowest cost? Why am I measuring everything against power at 300 Ohms? Because that's what my main cans have.

How do I go about reading all the amp specs out there? And please, let's keep it at a numbers level. I know there's also soundstage, imaging, best pairing, and all the other poetic yet conceited words we use to describe sound, just POWER!
