No, I thought that was low sensitivity. I mean, what if the 2Vrms input signal is somehow *over-driving* the first tube stage? Or is that super unlikely?
Gotcha, just wanted to clarify. So a 1Vrms input (high sensitivity) vs. a 2Vrms input (lower sensitivity). What that would represent is too much gain in the amplifier for the source. Assuming it's an integrated amplifier with a variable gain stage and a power stage, the issues I see are:

1) the amplifier is going to get loud very early on the volume pot,
2) the power stage can be driven to clipping / unsafe volumes more easily by accident, and
3) the effective S/N ratio of the amplifier goes down, i.e. you get a higher noise floor at a given output level than you would with a lower, more appropriate input sensitivity (rough numbers in the sketch below).

Maybe others can think of additional issues, but that's what comes to mind for me.
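Just to put rough numbers on the mismatch (made-up but typical figures, not from any specific amp): a 2Vrms source into a 1Vrms-sensitivity amp is about 6 dB of excess gain, which is also roughly how much higher the gain stage's noise sits relative to a given listening level.

```python
import math

def db(ratio):
    """Convert a voltage ratio to decibels."""
    return 20 * math.log10(ratio)

# Hypothetical numbers for illustration only:
source_vrms = 2.0        # typical line-level source output
sensitivity_vrms = 1.0   # input level that drives the amp to full output

excess_gain_db = db(source_vrms / sensitivity_vrms)
print(f"Excess gain: {excess_gain_db:.1f} dB")
# -> ~6 dB: the source reaches full output about 6 dB early on the pot,
#    and any fixed gain-stage noise ends up ~6 dB higher relative to a
#    given listening level than it would with a 2Vrms sensitivity.
```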
As far as the over-driving concern goes, the 2Vrms input will be attenuated by the volume pot before it reaches the first stage. The goal of the gain stage is to swing enough voltage, with some headroom and at low distortion, to drive the power stage to full output. The power stage should clip well before the gain stage, if that makes sense. If the power tube isn't being driven to full output at maximum input signal, you should probably be using a higher-gain input tube; if the gain stage itself clips before the power stage, it needs more available output swing, not just more gain. This is all assuming that getting max power out of the amplifier is the priority. Does that answer your question? Sorry if it doesn't, I'm a little delirious on 2 hours of sleep.
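Here's a quick back-of-envelope gain-staging check along those lines; all the numbers (drive requirement, stage gain, max swing) are hypothetical, just to show the clipping-order logic:

```python
import math

# Made-up figures for illustration, not from any particular amp:
drive_needed_vrms = 15.0   # swing the power tube needs at its grid for full output
stage_gain = 12.0          # voltage gain of the input/gain stage
stage_max_out_vrms = 40.0  # swing at which the gain stage itself starts clipping

# Input (after the volume pot) needed to drive the power stage to full output:
input_for_full_power = drive_needed_vrms / stage_gain
print(f"Input needed for full power: {input_for_full_power:.2f} Vrms")  # 1.25 Vrms

# Headroom: how far below its own clipping point the gain stage sits
# while delivering full drive to the power stage.
headroom_db = 20 * math.log10(stage_max_out_vrms / drive_needed_vrms)
print(f"Gain-stage headroom at full power: {headroom_db:.1f} dB")  # ~8.5 dB

# With a 2Vrms source, 1.25 Vrms < 2 Vrms, so the power stage reaches
# full output (and clips first) before the source runs out of voltage,
# which is the clipping order you want.
```

If that input figure came out *above* your source's maximum voltage, that would be the under-driven case where a higher-gain input tube helps.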