Head-Fi.org › Forums › Equipment Forums › Sound Science › Amp input sensitivity vs. voltage fed (searched but only got a partial answer)

Amp input sensitivity vs. voltage fed (searched but only got a partial answer)

post #1 of 9
Thread Starter 

I'm a little bamboozled over amps with multiple gain settings, described as accommodating a wider variety of headphones.

 

I googled and read a "gain" article, and from that I understand that high gain increases the amp's input sensitivity (decreasing the voltage required to achieve full rated power), therefore creating a danger of clipping if the source delivers more voltage than the input sensitivity voltage at that gain setting.

Conversely, a low gain setting will decrease the sensitivity (increasing the voltage necessary to achieve full rated power) and you might never get the amp to deliver full power.

 

Is this the same thing as the multiple gain settings on portable headphone amps? I'm getting confused because all the discussion seems to be about the amp's output and how it relates to driving the load, rather than how incoming voltage can affect amplifier performance (and thus the output into the load).

 

Let's say we have a DAC that can output 2.0 V and two headphones with equal sensitivity: headphone A has a flat 40 ohm load and B has a 600 ohm load. Would A work best with low gain and B with high? Lower impedance wants more current, higher impedance wants more voltage... right?

 

I'm losing it trying to get my head around this

post #2 of 9
Quote:
Originally Posted by Sandman65 View Post

I'm a little bamboozled over amps with multiple gain settings, described as accommodating a wider variety of headphones.

 

I googled and read a "gain" article, and from that I understand that high gain increases the amp's input sensitivity (decreasing the voltage required to achieve full rated power), therefore creating a danger of clipping if the source delivers more voltage than the input sensitivity voltage at that gain setting.

Conversely, a low gain setting will decrease the sensitivity (increasing the voltage necessary to achieve full rated power) and you might never get the amp to deliver full power.

 

Is this the same thing as the multiple gain settings on portable headphone amps? I'm getting confused because all the discussion seems to be about the amp's output and how it relates to driving the load, rather than how incoming voltage can affect amplifier performance (and thus the output into the load).

 

Let's say we have a DAC that can output 2.0 V and two headphones with equal sensitivity: headphone A has a flat 40 ohm load and B has a 600 ohm load. Would A work best with low gain and B with high? Lower impedance wants more current, higher impedance wants more voltage... right?

 

I'm losing it trying to get my head around this

 

 

Gain is just that, gain. Think of it as this:  Vin x Gain = Vout

 

The Vin is what your source supplies. The Vout is what you get at the amp output. The gain is just a multiplier.

 

The sound created by the headphone is related to power.

Most headphones have a dB SPL per mW rating or a dB SPL per 1 Vrms rating, i.e. the sound pressure level produced at 1 mW or 1 Vrms.

 

  • Power (mW) = Antilog( (Required SPL - SPL per mW) / 10 )
  • Voltage (Vrms) = Antilog( (Required SPL - SPL per 1 Vrms) / 20 )
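Those two formulas can be sketched as a quick calculation (the sensitivity figures below are made up for illustration):

```python
def power_mw_for_spl(target_spl_db, sens_db_per_mw):
    # Power in mW needed to reach a target SPL, given a dB SPL/mW rating
    return 10 ** ((target_spl_db - sens_db_per_mw) / 10)

def voltage_for_spl(target_spl_db, sens_db_per_vrms):
    # Voltage in Vrms needed to reach a target SPL, given a dB SPL/Vrms rating
    return 10 ** ((target_spl_db - sens_db_per_vrms) / 20)

# Hypothetical headphone rated 100 dB SPL/mW, aiming for 110 dB peaks:
print(power_mw_for_spl(110, 100))   # 10.0 (mW)
# Hypothetical headphone rated 104 dB SPL/Vrms, same 110 dB target:
print(voltage_for_spl(110, 104))    # ~2.0 (Vrms)
```

The divisor is the tell: /10 in the exponent converts a power ratio, /20 converts a voltage ratio, which is why the two ratings are handled differently.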

          


So,

A headphone with high sensitivity will need less power to create the same dB SPL.

 

Since P = V^2 / R, given the power, the impedance determines the voltage the headphone requires to reach that power.

 

Assume this required voltage to be "Vreq", so Vreq = Gain x Vin.

 

Vin is fixed, let's say 2 V, so a high impedance headphone will need more gain than a low impedance one: more voltage, less current.
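As a rough sketch of that arithmetic (assuming, for illustration, that both headphones need the same 10 mW):

```python
import math

def vreq(power_mw, impedance_ohm):
    # From P = V^2 / R: the voltage (Vrms) across the load for a given power
    return math.sqrt((power_mw / 1000) * impedance_ohm)

vin = 2.0  # source output, Vrms
for r_ohm in (40, 600):
    v = vreq(10, r_ohm)  # same 10 mW into each load
    print(r_ohm, round(v, 2), round(v / vin, 2))  # impedance, Vreq, gain needed
```

The 600 ohm load needs about 2.45 V (gain above 1), while the 40 ohm load needs only about 0.63 V, which is why the high impedance headphone tends to want the higher gain setting.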

 

That said, there are voltage and current limits. If you set the gain too high, Vreq might exceed the amp's output swing during peaks and the amp will start to clip.
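A minimal clipping check, assuming a hypothetical amp whose output stage can swing at most 5 Vrms:

```python
def clips(vin_rms, gain, vout_max_rms=5.0):
    # True if the amplified signal would exceed the amp's output swing
    return vin_rms * gain > vout_max_rms

print(clips(2.0, 2.0))  # 4 V out, within the 5 V limit -> False
print(clips(2.0, 3.0))  # 6 V out, beyond the limit -> True (clipping)
```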


Edited by proton007 - 10/4/12 at 6:54pm
post #3 of 9

Then is there any benefit to applying high gain on high impedance headphones, or if low gain sounds loud enough, is it better to stick with low gain?

post #4 of 9

In practice, it often won't make much of an audible difference.

 

But unless the device has big problems, technically the best option is just to set the lowest gain that is loud enough.

post #5 of 9
Quote:
Originally Posted by autumnholy View Post

Then is there any benefit to applying high gain on high impedance headphones, or if low gain sounds loud enough, is it better to stick with low gain?

 

You should use the minimum gain that can power your headphones.

post #6 of 9
Quote:
Originally Posted by proton007 View Post

 

You should use the minimum gain that can power your headphones.


GREAT advice !!

So, if you are using a line-level source (2V) ..

How much gain is really needed ??

post #7 of 9
Thread Starter 
Quote:
Originally Posted by proton007 View Post

 

 

Gain is just that, gain. Think of it as this:  Vin x Gain = Vout

 

The Vin is what your source supplies. The Vout is what you get at the amp output. The gain is just a multiplier.

 

The sound created by the headphone is related to power.

Most headphones have a dB SPL per mW rating or a dB SPL per 1 Vrms rating, i.e. the sound pressure level produced at 1 mW or 1 Vrms.

 

  • Power (mW) = Antilog( (Required SPL - SPL per mW) / 10 )
  • Voltage (Vrms) = Antilog( (Required SPL - SPL per 1 Vrms) / 20 )

          


So,

A headphone with high sensitivity will need less power to create the same dB SPL.

 

Since P = V^2 / R, given the power, the impedance determines the voltage the headphone requires to reach that power.

 

Assume this required voltage to be "Vreq", so Vreq = Gain x Vin.

 

Vin is fixed, let's say 2 V, so a high impedance headphone will need more gain than a low impedance one: more voltage, less current.

 

That said, there are voltage and current limits. If you set the gain too high, Vreq might exceed the amp's output swing during peaks and the amp will start to clip.

Thanks, I understand what you're saying...

 

Still, what I'm trying to understand is: does the gain switch on the amp affect the input sensitivity voltage? Sorry if I was unclear on this in my opening query.

post #8 of 9

The input sensitivity specified on an amp is usually the maximum input signal voltage needed to drive the amp to maximum output.

Depending on the implementation of the amp, a higher input signal could lead to a) damage, b) clipping regardless of the position of the volume control, or c) clipping only if the volume control is turned up.

 

In consumer electronics we have line levels of -10 dBV (0.316 Vrms) and in pro audio we have +4 dBu (1.228 Vrms).

The Red Book standard for CD players specifies an output level of 2 Vrms.

 

 

Like proton007 said: on an amp with gain switch use the lowest gain that is needed to reach the volume you want with whatever headphones you use.

 

I guess the input sensitivity for such devices is specified at the highest gain setting. Theoretically you could use a higher input level with lower gain settings, but this could lead to one of the points above; in case c), if the input level is not too high, it will be fine even if you turn the volume up to max.
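That relationship can be sketched numerically. Assuming a hypothetical amp with 6 Vrms maximum output and three gain settings, the effective input sensitivity at each setting would be:

```python
def input_sensitivity(vout_max_rms, gain):
    # Input voltage (Vrms) that drives the amp to full output at this gain
    return vout_max_rms / gain

for gain in (1, 3, 6):
    print(gain, input_sensitivity(6.0, gain))  # 1x: 6.0 V, 3x: 2.0 V, 6x: 1.0 V
```

So at the highest gain a 2 Vrms line-level source already exceeds the input sensitivity figure, while at unity gain the same source leaves headroom.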


Edited by xnor - 10/6/12 at 6:33am
post #9 of 9
Quote:
Originally Posted by AKG240mkII View Post


GREAT advice !!

So, if you are using a line-level source (2V) ..

How much gain is really needed ??

 

Not needed for most amps. I use my HD650 without any gain when Vin comes from the ODAC, which outputs around 2 V.
