I'll give you the answers I know....
First, any buffer adds to the overall latency since it takes time to fill the buffer. In something like a CD player, the data buffer CAN be filled very quickly, which minimizes latency (most drives can read at 10x or 20x until the buffer is filled if their software wants them to). I would suspect that most USB interface devices prefer to run their read clock at a more or less normal rate, in which case they would just delay playback a few seconds to load those few seconds of samples into the buffer. As you say, as long as the delay is fixed, it doesn't cause any problems with audio quality. Implementing a buffer in a USB interface device is slightly complicated; the Audiophilleo can do it because it's got a processor and system memory; simple devices with a USB receiver chip alone in them cannot; they're pretty much limited to requesting the data, one or two samples at a time, reclocking them, and sending them along. (I've heard rumors that the HiFace has a buffer of some sort, but that could be in the driver and not hardware... but I'm pretty sure most of the other little cheap ones do not. Remember, we're talking about a "hardware buffer" in the device itself here, and not a software one in the driver.)
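To make the "fill the buffer, then start playing" idea concrete, here's a minimal sketch in Python. This is a toy model, not any real device's firmware; the class name, prefill size, and method names are all invented for illustration. The key point it shows is that the startup latency comes entirely from waiting for the prefill, and that the output side runs on its own tick.

```python
from collections import deque

class PlaybackBuffer:
    """Toy model of a device-side sample buffer: prefill, then play.
    All names and sizes are illustrative, not from any real device."""

    def __init__(self, prefill_samples):
        self.prefill = prefill_samples
        self.queue = deque()
        self.playing = False

    def receive(self, sample):
        # Samples arrive from USB at whatever rate the source manages.
        self.queue.append(sample)
        # Playback only starts once the buffer holds the full prefill --
        # this wait is the fixed startup latency.
        if not self.playing and len(self.queue) >= self.prefill:
            self.playing = True

    def next_sample(self):
        # Called once per tick of the DEVICE's clock, not the source's.
        if self.playing and self.queue:
            return self.queue.popleft()
        return None  # still prefilling (or underrun): nothing to play yet
```

Since the delay is the same for every sample, it shifts playback in time without affecting audio quality, exactly as described above.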
I don't know specifically which USB modes can request a resend of missing data (the bulk mode used by USB data drives obviously can). I suspect that, whether the USB standard supports it or not, it will come down to the individual driver whether the feature is implemented. If the interface device doesn't have a buffer, then it isn't going to matter, since it can't "wait" for the replacement data to arrive. The interface device basically has to have a buffer big enough that, if data is lost or delayed, playback won't need the missing data before it arrives or is resent. Another thing to remember is that "buffers" in the computer aren't the same as buffers in the interface device itself. The buffer in the PC helps the player program by giving it a place to "queue up" samples as it plays them - "having them waiting on the loading bay," ready to be sent out. That helps when the player program is resource intensive and gets bogged down (or the CPU gets busy, or the network is slow, or the HD is slow - this is "at the thread level"). That buffer, however, is still inside the O/S. It's not going to help if the USB port or driver itself gets bogged down, or if something in the O/S that's "downstream" of the buffer chokes up. (To fix that you would need some sort of super-fancy USB card with a built-in hardware buffer right at the output. There used to be serial cards like that, but I've never seen a USB one - although they MIGHT exist somewhere.)
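The "big enough buffer" condition is just arithmetic: the buffer's headroom in seconds has to exceed the worst-case time to notice the loss and get the data resent. Here's a back-of-the-envelope check in Python; every number below is a made-up example, not a measurement of any real device.

```python
# Back-of-the-envelope headroom check: can a resend arrive in time?
# All figures below are invented examples, not measurements.

SAMPLE_RATE = 44_100          # samples per second (CD audio)
BUFFER_SAMPLES = 8_820        # device-side buffer: 200 ms worth
RESEND_ROUND_TRIP_S = 0.05    # assumed worst-case request + resend time

headroom_s = BUFFER_SAMPLES / SAMPLE_RATE
print(f"buffer headroom: {headroom_s * 1000:.0f} ms")  # prints: buffer headroom: 200 ms

# A resend only helps if the replacement data arrives before the
# playback position reaches the gap:
can_recover = RESEND_ROUND_TRIP_S < headroom_s
print("resend can arrive in time:", can_recover)  # prints: resend can arrive in time: True
```

With no device-side buffer at all, headroom is effectively zero, which is why resends can't help a simple receiver-chip-only interface.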
In general, USB is pretty reliable in terms of data - but not so good in terms of timing. This also brings up another entirely different idea....
Asynch USB is better than the other modes because the timing is much better (since the interface requests data based on its own clock). Now, if your interface device had its own buffer and clock, that wouldn't matter anymore at all! Since the interface device would be sending the data out of ITS buffer using ITS clock, it doesn't matter how the data got there, now does it? You could use bulk mode (which has crappy timing, but absolutely allows resend requests - it's the mode that USB hard drives usually use). You could even copy the entire file into the buffer, then play it from there (it's been done). You don't need the benefits of asynch USB anymore because the source timing is now irrelevant. A USB DAC or interface that has a significant buffer doesn't have to be asynch, and it shouldn't matter at all whether it is or not. Unfortunately, most DACs don't have buffers (usually because they prefer NOT to deal with delaying the audio, which might, for example, throw it out of synch with the video on your movie).
An audiophile DAC with a buffer could use ANY USB mode and still be bit perfect... as long as it's an audio-only DAC, not intended to be part of a home theater system, and a few seconds of latency don't matter; it would be an easy design (DIYers should take note of this!!!)
Jitter would depend ONLY on the quality of the clock used to clock data out of the buffer (it doesn't matter at all what you use to fill the buffer, as long as the right data ends up there); put the buffer and a high-quality clock inside the DAC itself, and the quality should be quite impressive.
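One way to see why the fill side can't contribute jitter: once samples leave the buffer on ticks of the DAC's own clock, the output timestamps are pure multiples of that clock's period, no matter how ragged the arrival times were. A small sketch (all figures invented; the fixed startup offset affects latency, not jitter):

```python
# Sketch: output jitter comes from the read clock alone. The input
# arrival times below are deliberately messy; the output timestamps
# are pure multiples of the DAC clock period. Figures are invented.

CLOCK_PERIOD_US = 22.676  # ~1/44100 s, in microseconds

# Messy arrival times (buffer fill): bursts, stalls, irregular gaps.
arrivals_us = [0.0, 5.1, 5.2, 40.0, 41.0, 90.0, 90.1, 90.2, 150.0, 151.0]

# Output times: sample n leaves the buffer on tick n of the DAC clock
# (after a fixed startup delay, which is latency, not jitter).
outputs_us = [200.0 + n * CLOCK_PERIOD_US for n in range(len(arrivals_us))]

intervals = [b - a for a, b in zip(outputs_us, outputs_us[1:])]
# Every output interval is exactly one clock period, regardless of
# how raggedly the samples arrived:
print(all(abs(i - CLOCK_PERIOD_US) < 1e-9 for i in intervals))  # prints: True
```

Any timing variation at the output is therefore the read clock's own, which is why putting a high-quality clock right at the buffer is the whole game.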
Keith