To pursue the attenuation idea, I rigged up a pot in series with my headphones and set out to find how much resistance I need to get the tone just below the hearing threshold / background noise (it's very quiet in the room, though). The value needed was pretty much 100 ohms, which translates to a voltage factor of 0.24, assuming the nominal headphone impedance of 32 ohms and negligible output impedance in the uDac. That's a bit over 12 dB in sound pressure level, so maybe that tells someone something about the volume of the tone.
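In case anyone wants to check my numbers, here's the voltage-divider math I used, as a quick Python sketch. The 32 ohm figure is the nominal impedance; treating the amp's output impedance as zero is an assumption on my part:

```python
import math

def series_attenuation(r_series, r_phones=32.0):
    """Fraction of the source voltage that reaches the phones when
    r_series is in series with them (amp output impedance assumed ~0)."""
    return r_phones / (r_phones + r_series)

factor = series_attenuation(100.0)   # ~0.24 with a 100 ohm series resistor
db = 20 * math.log10(factor)         # ~-12.3 dB in sound pressure level
```

So a 100 ohm series resistor gives a factor of 32/132 ≈ 0.24, which is the "bit over 12 dB" above.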

Now, the problem with just putting a resistor in series with the phones is that when the impedance curve of the headphones is not perfectly flat, the voltage drop across the resistor varies with frequency. Looking at the impedance curve measured by Headroom, the maximum deviation seems to be around 33% for my Denons, and I calculated that this translates to about 1.8 dB of deviation in the volume when using a 100 ohm series resistor.
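This is how I got the 1.8 dB figure: just comparing the divider output at the nominal impedance against the output at the impedance peak. The 33% peak is my reading of the Headroom plot, so treat it as approximate:

```python
import math

def series_factor(z_phones, r_series):
    """Voltage divider: fraction of source voltage across the phones."""
    return z_phones / (z_phones + r_series)

z_nom = 32.0
z_peak = 32.0 * 1.33   # ~33% impedance peak, eyeballed from the Headroom curve

# Level difference between the impedance peak and the nominal impedance
dev_db = 20 * math.log10(series_factor(z_peak, 100.0) /
                         series_factor(z_nom, 100.0))   # ~1.8 dB
```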

Is that something that could be perceived? 1.8 dB of deviation in the frequency response in exchange for a 12 dB reduction in noise. Anyway, the deviation can be reduced by adding another resistor in parallel with the headphones, which effectively flattens out the impedance curve. So for example, putting a 10 ohm resistor in parallel and a 24 ohm resistor in series should give the same attenuation and a 31 ohm total impedance, but only 0.4 dB of deviation in the frequency response. Has anyone tried something like this before? I guess I need to buy some resistors, try it out, and hope my calculations are correct.
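And here's the same check for the parallel-plus-series version (basically an L-pad). Same assumptions as before: 32 ohms nominal, a 33% peak, zero amp output impedance:

```python
import math

def lpad_factor(z_phones, r_series, r_parallel):
    """Voltage fraction at the phones with r_parallel across them and
    r_series between that node and the amp."""
    zp = (z_phones * r_parallel) / (z_phones + r_parallel)  # phones || r_parallel
    return zp / (zp + r_series)

z_nom, z_peak = 32.0, 32.0 * 1.33

f_nom = lpad_factor(z_nom, 24.0, 10.0)    # ~0.24, same as the 100 ohm series case
dev_db = 20 * math.log10(lpad_factor(z_peak, 24.0, 10.0) / f_nom)   # ~0.4 dB
total_z = 24.0 + (z_nom * 10.0) / (z_nom + 10.0)   # ~31.6 ohm load seen by the amp
```

The parallel resistor swamps the headphones' impedance swing, so the divider ratio barely moves with frequency; the cost is that the amp now drives a lower, ~31 ohm load.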

I hear the tone even when connecting to an externally powered USB hub that's not connected to anything itself.

Edited by IsoOctane - 10/1/10 at 4:53pm