I'll try to chime in here with my limited understanding.
I think the only way to know with certainty what the B+ (plate voltage) of an amp is would be to actually measure it. It's a fixed value determined by the power supply design of the amp. There is usually some loss (called voltage sag or drop) across the rectifier tube, which gets accounted for in the overall design; the intent is to arrive at the specific voltage at which the tube in question is ideally biased. Technically you can run a tube over or under its nominal plate voltage, with some consequences for doing the former, but I think most designers shoot for the nominal value specified in the tube data sheet. Whatever value results in the most linear operation without exceeding the maximum ratings is ideal.
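To make the "measure it and check against the data sheet" idea concrete, here's a rough sketch of the usual bias arithmetic. All the numbers are example assumptions (a measured B+ of 450 V, 40 mA cathode current through a 1-ohm cathode resistor, a guessed 5 mA of screen current, and the 30 W max plate dissipation from a 6L6GC data sheet); substitute your own measurements.

```python
# Example bias math -- all measured values below are assumptions.
B_PLUS = 450.0            # measured plate voltage, volts (assumption)
CATHODE_CURRENT = 0.040   # measured per-tube cathode current, amps (assumption)
SCREEN_CURRENT = 0.005    # estimated screen current, amps (assumption)
MAX_DISSIPATION = 30.0    # 6L6GC data-sheet max plate dissipation, watts

# Cathode current is plate current plus screen current, so subtract
# the screen's share before computing plate dissipation.
plate_current = CATHODE_CURRENT - SCREEN_CURRENT
dissipation = B_PLUS * plate_current
percent_of_max = dissipation / MAX_DISSIPATION * 100

print(f"Plate dissipation: {dissipation:.2f} W "
      f"({percent_of_max:.1f}% of max)")
```

With these example numbers the tube idles at 15.75 W, a bit over half of its 30 W rating, which is in the ballpark many techs aim for in a class AB amp; the point is just that the check is simple arithmetic once you've measured B+ and the idle current.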
So we get various "drop-in" substitute tubes because their pinout is the same as the tube the amp was designed around, they want a plate voltage that is the same as or similar to the original's, and their other electrical parameters are close enough that the circuit still behaves as it was designed to.
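The drop-in idea can be sketched as a simple compatibility check: same base and heater voltage, and maximum ratings that comfortably cover the amp's actual B+. The parameter values below are illustrative placeholders, not real data-sheet figures; always confirm against the actual data sheets for the tubes involved.

```python
# Illustrative drop-in check -- parameter values are placeholders,
# not real data-sheet figures.
ORIGINAL = {"base": "octal", "heater_v": 6.3, "max_plate_v": 500}
CANDIDATE = {"base": "octal", "heater_v": 6.3, "max_plate_v": 400}
B_PLUS = 450  # the amp's measured plate supply, volts (assumption)

def drop_in_ok(orig: dict, cand: dict, b_plus: float) -> bool:
    """A candidate is a plausible drop-in if it fits the socket,
    shares the heater voltage, and its max plate voltage rating
    covers the amp's actual B+."""
    return (cand["base"] == orig["base"]
            and cand["heater_v"] == orig["heater_v"]
            and cand["max_plate_v"] >= b_plus)

print(drop_in_ok(ORIGINAL, CANDIDATE, B_PLUS))  # False: 400 V rating < 450 V B+
```

Here the candidate fits the socket and heater supply but its 400 V plate rating is below the amp's 450 V B+, so it fails the check, exactly the "over nominal plate voltage" situation mentioned above.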
Thanks for the explanation.