The amount of power drawn depends on the design of the external USB device and very little on the host device or cable. If you're talking about performance, then yes, it's easier to get better performance, and especially higher output power, if you're allowed to draw more power (note: higher output power gives you higher volume, and if comparisons aren't level matched, the louder device usually wins). You don't need to make as many compromises in topology, or reach for especially low-power electronics, if you don't have to be as frugal. However, the time, skill, and effort of the design team matter more than any one constraint like power consumption: you can easily find lower-power designs outperforming higher-power ones.
"Appreciably better" depends on whether you're talking about machine-measurable, human-perceivable, or human-preferable. With good enough bench gear, the right conditions, and the right downstream device, you may get something measurably different. For one, if the wires are very thin (though this matters a lot less over 18" of wire), there could be a larger voltage drop across them, so the DAC/amp sees a lower supply voltage and perhaps does a worse job regulating its internal rails, which could affect the outputs in some small way. You could pick up some EMI as well. A ferrite bead is probably not a bad idea.
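To put a rough number on that voltage-drop point, here's a back-of-the-envelope sketch. It assumes 28 AWG copper power wires (a plausible gauge for a thin USB cable, not a spec for any particular one) and a 500 mA load as an illustrative draw:

```python
# Rough sketch: I*R drop across a thin USB cable's power wires.
# Assumed values: 28 AWG copper conductors and a 500 mA load;
# these are illustrative, not measurements of any specific cable.

AWG28_OHMS_PER_M = 0.213   # approx. resistance of 28 AWG copper at 20 C
LENGTH_M = 18 * 0.0254     # an 18-inch cable, converted to meters
CURRENT_A = 0.5            # e.g. a DAC/amp drawing 500 mA

# Current flows out on VBUS and returns on GND, so the resistive
# loop is twice the cable length.
loop_resistance = 2 * LENGTH_M * AWG28_OHMS_PER_M
drop_v = CURRENT_A * loop_resistance

print(f"loop resistance: {loop_resistance:.3f} ohm")
print(f"voltage drop: {drop_v * 1000:.0f} mV off the 5 V bus")
```

Under those assumptions you get on the order of 100 mV, roughly 2% of the 5 V bus, which is why it's a real but small effect that a competent regulator downstream should shrug off.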
I'd be much more concerned about the electronics and layout of the DAC/amps than minuscule details of power delivery. Also, if you're going to play the theory game, dongle-style DAC/amps that plug in directly sit closer to sources of interference inside the host computer.
In any case, with respect to power, look first at the audio electronics themselves and their susceptibility to supply issues. Then at the power-supply filtering on the DAC/amp. Then at the quality of power from the host device. And then, way down the list of importance, anything to do with the USB cable aside from whether or not it's plugged in.