Romaz,
On the issue of a USB source sounding better for one reason or another, I trust we all understand it has nothing to do with the 'bits' or their integral transmission. Modern USB interfaces (source and DAC) get this right. We all know what digital errors sound like (pops, snaps, crackles), and they have nothing to do with staging, treble glare, or musicality.
Everything we regard as a USB deficiency is attributable to noise on the USB +5V pins...seeping into the DAC and then mucking with the internal clock timing or radiating around to the analog section. Variables on the source (server, software, PSU, etc.) just vary the disturbances on the +5V line of the USB cable.
Asynchronous USB has 100% fidelity from a digital data perspective. What we hear is the effect of low-level analog noise. No one source is inherently 'better' from a digital perspective - it may just have reduced effects on the USB power pins.
Yes, these are mitigated by galvanic isolation, I suppose, but for some reason the noisy ripples on the USB line still contaminate the DAC. My evidence is my ears (and brain) with my exaSound and 2Qute...both galvanically isolated, but both transformed for the better by a regulated battery inserted into the USB interface.
Maybe Rob nailed this issue on the DAVE but maybe not.
Dan
Here is what Rob had to say about the USB VBUS +5V issue that you believe is a problem:
"No don't worry about the USB VBUS +5v, as it too is isolated from Dave."
Recently, having purchased my LPS-1, I also purchased a W4S Recovery (a USB regenerator that, combined with the LPS-1, provides a very clean +5V VBUS that is devoid of leakage current). I bought this mainly for a USB hard drive to see if it would improve SQ (it didn't, and so I sent it back). While I had it, I connected it to DAVE and it made absolutely no difference. Zero.
There has been much talk about the deficiencies of USB compared to other modes of transmission (SPDIF, AES, AoIP, etc.), and it's true that USB has problems, but they can be overcome, and you don't need to add a dozen trinkets to your USB chain to overcome them if you have a DAVE. Very rarely will you see DAC manufacturers boast of their USB input over their other inputs. It's often the opposite, and so USB is often added more for convenience's sake. If you pore through the AoIP or RedNet threads here on Head-Fi and other forums, it seems the consensus is that USB is dead in the water. "Long live AoIP." That may one day happen, especially if Rob decides to one day incorporate an ethernet input into the DAVE, but as of right now, the best source I have heard is a USB source.
Like so many others, I understand you have not had good experiences with USB in your system, but what I would suggest is that you spend some time with the DAVE and decide for yourself if Rob has overcome these issues that you talk about, because I believe he has. Of course, you will also want to make sure you have a good low-impedance music server paired with a good low-impedance PSU and, lastly, a proper USB cable that meets spec. That's all I use. My setup is fairly simple.
As for a proper USB cable, as you know, there are literally dozens out there and I'm sure everyone has their opinion on which one sounds best. Here's the thing: most audiophile USB cables, regardless of price, don't even meet proper USB 2.0 spec. Most companies seem to focus on expensive conductors and shielding. Some will separate the power and data lines. These are all good things, but ask these manufacturers what the measured differential impedance of their cables is and whether they take individual measurements of every USB cable they make, since there will be variances from cable to cable as well as variances among cables of different lengths.

The answer will likely be that most manufacturers don't take individual measurements because it isn't an easy thing to do. The assumption most manufacturers make is that if a signal is being passed without interruption, the impedance must be where it needs to be. This logic is fine for laser printers but not for high-end audio. There are even some boutique USB cable manufacturers who don't believe in taking measurements because they tune their cables "by ear." There's a reason why a coax SPDIF cable should measure 75 ohms impedance and an AES/EBU cable should measure 110 ohms impedance: otherwise, you get reflections that will have a negative impact on SQ. The same thing applies to USB cables.
Gordon Rankin, an electrical engineer who was the first to implement asynchronous USB in a consumer DAC and is regarded by many as one of the world's best authorities on USB, tested about 30 audiophile USB cables, and surprisingly, most did NOT meet USB 2.0 spec, which requires that a cable have a differential impedance of 90 ohms. He says that unless this parameter is met, the USB feedback mechanism doesn't work because reflections get the signal confused on the computer end. Without it, there's no way for the computer to know that it's sending the data at the proper rate, and bits get dropped. Since USB receivers have no error correction, a dropped bit gets interpolated at the DAC, and distortion increases.
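To put a rough number on the reflection issue, here's a minimal sketch using the standard transmission-line reflection coefficient, Gamma = (Z_cable - Z_spec) / (Z_cable + Z_spec). The 90-ohm spec figure comes from the discussion above; the off-spec cable values are purely hypothetical examples, not measurements of any real cable.

```python
# Reflection coefficient at an impedance mismatch (standard
# transmission-line formula). Illustrative only: the 90-ohm spec
# value is from the USB 2.0 discussion above; the off-spec cable
# impedances are made-up examples.

def reflection_coefficient(z_cable: float, z_spec: float) -> float:
    """Fraction of signal amplitude reflected at the mismatch."""
    return (z_cable - z_spec) / (z_cable + z_spec)

Z_SPEC = 90.0  # USB 2.0 differential impedance spec, in ohms

# One on-spec cable and two hypothetical off-spec cables
for z_cable in (90.0, 75.0, 110.0):
    gamma = reflection_coefficient(z_cable, Z_SPEC)
    print(f"{z_cable:5.1f} ohm cable -> "
          f"{abs(gamma) * 100:.1f}% of signal amplitude reflected")
```

Even a cable that is off-spec by 15-20 ohms reflects roughly 9-10% of the signal amplitude back toward the source, which is the kind of disturbance Rankin is describing.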
Exactly which companies measure the impedance of each USB cable they manufacture to make sure their cables meet spec? I know of quite a few who don't (and I intentionally won't publicly name them), but I can verify that Clarity Cables, based in Wichita, Kansas, does. This is a small outfit run by a husband and wife. Chris Owens (the husband) is an electrical engineer; he makes each USB cable he sells by hand and measures each one to make sure it meets full USB 2.0 spec. Aside from applying heavy shielding and separating his data and power lines, there are no real magic ingredients (he uses only standard oxygen-free copper). Thus far, this is the finest USB cable I have heard.
Here is a recent interview with Gordon Rankin on John Darko's website. I believe you will find it to be an informative read:
http://www.digitalaudioreview.net/2016/05/gordon-rankin-on-why-usb-audio-quality-varies/