ProtegeManiac
Headphoneus Supremus
Technically speaking, the HD600's sensitivity of 97dB SPL/mW means that for one milliwatt of input power, it produces a sound pressure level of 97dB.
It's not 97dB added for every 1mW, like you'd get 194dB if you input 2mW.
It means that with 1mW of input you can theoretically hit 97dB, but:
1. SPL is logarithmic: every doubling of input power only adds +3dB. So to theoretically hit 100dB it needs 2mW; 103dB requires 4mW; and so on.
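That doubling relationship can be sketched with the standard dB-to-power formula (a hypothetical ideal calculation — real drivers compress before they reach these numbers, as point 2 below explains):

```python
import math

def power_needed_mw(target_db, sensitivity_db=97.0):
    """Power in mW to reach target_db for a headphone rated
    sensitivity_db dB SPL at 1 mW, assuming no driver compression."""
    return 10 ** ((target_db - sensitivity_db) / 10)

for db in (97, 100, 103, 110):
    print(f"{db} dB -> {power_needed_mw(db):.2f} mW")
# 97 dB -> 1.00 mW, 100 dB -> 2.00 mW, 103 dB -> 3.98 mW, 110 dB -> 19.95 mW
```

Note that +3dB is only an *approximate* doubling (10^0.3 ≈ 1.995), which is why 103dB works out to 3.98mW rather than exactly 4.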
2. "Theoretically" because in real-world measurements you won't actually get the rated number from exactly 1mW. If it worked that directly, any loudness-war recording — i.e. high digital gain coming off a 2V line-output source feeding into any amp — would have barely any gradation between no audio and OHMYGODSTOPSTOPSTOPSTOPSTOP!!!, and a lot more people would have hearing damage. Hell, even a smartphone would have almost no usable range between silence and hearing damage. (My SGS3 actually did behave that way with a 120dB/mW, 32ohm IEM, but that was more because something was screwed up in the software — an Android update fixed it about a year later.)
This is why you can use an amp with 500mW of power into 300ohms, set +6dB gain, and still only get that volume knob to about 10:00 while rocking out for a long time before you risk hearing damage. Sure, you're not actually putting 500mW into the headphones — but if your amp can do a very low-distortion, very low-noise 500mW, it'll be even cleaner at the much lower power you actually use. Which still isn't just 1mW.
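To put rough numbers on that 500mW-into-300ohm scenario, here's a sketch using the same idealized sensitivity math (the 97dB/mW rating and the textbook P = V²/R relation — actual amp behavior into a real load will differ):

```python
import math

SENSITIVITY_DB = 97.0  # HD600 rating, dB SPL at 1 mW

def spl_at_power(power_mw):
    """Idealized SPL for a given input power, ignoring driver compression."""
    return SENSITIVITY_DB + 10 * math.log10(power_mw)

def voltage_for_power(power_mw, impedance_ohms):
    """RMS voltage needed to deliver power_mw into impedance_ohms (V = sqrt(P*R))."""
    return math.sqrt(power_mw / 1000 * impedance_ohms)

print(f"SPL at full 500 mW: {spl_at_power(500):.1f} dB")        # ~124 dB
print(f"Voltage for 500 mW into 300 ohm: {voltage_for_power(500, 300):.1f} V RMS")  # ~12.2 V
```

Roughly 124dB at full rated power — well past instant-damage territory — which is why the knob never needs to go far past 10:00.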