If the M8 does its own voltage regulation, then I'd think using the minimum required voltage would get the greatest life out of a battery pack. Otherwise all that excess voltage gets shunted off as heat, or whatever else the regulator does with it. Well, maybe/maybe not. I'm not fully educated on what exactly goes on inside a voltage regulator. A resistor divider would waste all the excess energy as heat, but I imagine a proper regulator is significantly more sophisticated than that.
My guess would be the opposite: you may get longer play time with a higher input voltage. Modern switching regulators don't waste energy the way a resistor does; they're around 95% efficient these days. But there's something else to contend with at lower input voltage, and that's the current. When you're running 2+ amps down a long cable, you invariably lose some energy to heat generated in that cable, since no cable has zero resistance. While we can control the efficiency of a regulator, it's much harder to control the power dissipated in the brute-force "resistor" formed by a long power cable. And if the cable warms up, its resistance increases too, so all bets are off as to how much energy is lost. This is why my guess (it would need to be proven in practice) is that you'd get longer play time at the higher voltage: for the same power, higher voltage means lower current, less heating in the cable, and potentially less energy lost.
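The cable-loss argument above is just Ohm's law arithmetic. Here's a quick sketch of it in Python; the load wattage and cable resistance are made-up illustrative numbers, not actual M8 specs:

```python
# Illustrates why higher supply voltage can mean less cable loss.
# LOAD_W and CABLE_OHMS below are hypothetical figures, not M8 specs.

def cable_loss_watts(load_watts, supply_volts, cable_ohms):
    """I^2 * R power lost as heat in the cable for a given load."""
    current = load_watts / supply_volts  # amps flowing through the cable
    return current ** 2 * cable_ohms

LOAD_W = 24.0      # hypothetical steady draw of the amp
CABLE_OHMS = 0.1   # hypothetical round-trip cable resistance

# Same 24 W load: 12 V supply needs 2 A, 24 V supply needs only 1 A.
print(cable_loss_watts(LOAD_W, 12.0, CABLE_OHMS))  # 0.4 W lost at 12 V
print(cable_loss_watts(LOAD_W, 24.0, CABLE_OHMS))  # 0.1 W lost at 24 V
```

Doubling the voltage halves the current, and since loss goes as current squared, the cable loss drops to a quarter.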
We are talking about small numbers, however -- a 5% difference at most would be my gut-feel guess. So low voltage vs. high voltage will not be a "night and day" difference.
Edited by mgoodman - 12/16/12 at 9:57pm