Interesting! Most of the op-amps I've rolled in my iBasso PB2 Pelican have datasheets that show what you're talking about. For example, the OPA1611 and OPA1612 allow a minimum supply voltage (Vs) of something like 4.5V (if I recall correctly) and a maximum supply voltage of 18V (I'm certain of this figure), but they show their lowest THD+N at a 15V supply (I'm certain of this too, but too lazy to pull up the datasheet at the moment).
So, yeah, lots of op-amps show an improvement in noise measurements up to some peak that always seems to sit close to the maximum allowable supply voltage. That's had me scratching my head for a while now. Your hypothesis makes sense to me, but I'll remain open-minded, especially since I've tried real hard to actually hear the difference between 8.4V and 15V with the OPA1612 into an HD800, for example, to no avail. So the whole subject is moot, with my ears and my gear at least. But... I still run at 15V, because the available output power grows better than linearly with supply voltage, and that I can hear as improved dynamics and bass control with inefficient headphones.
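That "better than linearly" bit is easy to sketch: the usable voltage swing rises roughly linearly with the rails, but power into a load goes with the square of the swing. Here's a rough back-of-the-envelope estimate in Python, assuming a single supply split into symmetric rails, about 1V of headroom per rail, and a 300-ohm HD800-ish load (all assumptions on my part, not datasheet figures):

```python
import math

def max_power_mw(supply_v, load_ohms=300.0, headroom_v=1.0):
    """Rough max sine-wave power into a load from a single supply split
    into +/- rails, assuming ~1 V of headroom per rail (an assumption,
    not a datasheet spec)."""
    peak = supply_v / 2 - headroom_v      # usable peak swing per rail
    vrms = peak / math.sqrt(2)            # RMS of a full-scale sine
    return 1000 * vrms ** 2 / load_ohms   # power in milliwatts

print(round(max_power_mw(8.4), 1))    # ~17.1 mW at an 8.4 V supply
print(round(max_power_mw(15.0), 1))   # ~70.4 mW at a 15 V supply
```

Going from 8.4V to 15V is only a 1.79x increase in supply, but under these assumptions it roughly quadruples the available power, because the fixed headroom eats a bigger share of the smaller supply.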
Ok you've gone and done it!
I looked up the datasheet for the LM4562.
I chose that one because I have a downloaded copy of its datasheet, but I don't have one for the OPA1611.
In practice, there's not much difference in distortion between ±17V and ±2.5V supply rails, so long as you stay below clipping.
Sure, distortion is lower with the higher-voltage supply, but the levels are so low that it's academic.
Is 0.0002% THD low enough?
Or is 0.00002% THD enough?
The numbers for IM are fairly similar; in the real world, supply voltage just doesn't make enough of a difference to matter.
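For a sense of scale, those THD percentages convert to dB below the fundamental like this (just the standard 20·log10 conversion, nothing from the datasheet):

```python
import math

def thd_percent_to_db(thd_pct):
    """Convert a THD percentage to dB relative to the fundamental."""
    return 20 * math.log10(thd_pct / 100.0)

print(round(thd_percent_to_db(0.0002), 1))    # -114.0 dB
print(round(thd_percent_to_db(0.00002), 1))   # -134.0 dB
```

Either figure is far below the noise floor of any real-world headphone rig, which is rather the point.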