ex0du5
500+ Head-Fier · Joined Dec 26, 2005 · Posts 781 · Likes 29
Preface: I am a computer engineering graduate and work in the software business. I am also a relative newbie with respect to the audiophile scene. That said, my father is an avid audiophile and I have had the pleasure of auditioning many setups with him. Recently, I built him a music server and we've auditioned the Meitner, Invicta, and Weiss DAC202 on his B&W 800 Diamond speakers.
Being a computer engineer, I am relatively well versed in the technical workings and interactions of computer interfaces. That being said, I do not have any professional experience with the USB interface. Most of my hardware engineering experience is limited to FPGA development, microcontroller programming, and communication with relatively simple interfaces such as serial interfaces.
I am not going to discount the possibility that the quality of the USB interface on the PC or that the quality of the USB cable matters. It very well might. I'm here to find out why.
To me, used in the intended asynchronous fashion, USB should be able to deliver perfect 1:1 data, completely free of data degradation or timing issues. Take a USB drive, for example: when you transfer data to and from it, the data arrives intact. Error checking and proper back-and-forth communication between the drive and the host's USB controller ensure that the data arrives perfectly. If a packet is corrupted, the error is caught by a CRC check and the packet is promptly resent. Undetected errors are extremely rare; I have never once encountered a file corrupted in transit over USB. They're possible, but the checksums used are robust enough to make undetected errors extremely unlikely.
That being said, I realize that the USB interfaces in DACs may not use USB in that ideal fashion. They say they use asynchronous USB, but I'm not sure how it's implemented. Data is buffered into the DAC's USB interface at a rate determined by the DAC's asynchronous USB controller. How big is the buffer? How likely are data errors? I would have to assume that the PCM audio data arrives in "chunks", each chunk representing a batch of PCM samples with attached error-check bits. If a chunk is found to be erroneous, the DAC would have to request a new chunk from the PC; if it isn't retrieved correctly in time, you'd basically have a missing chunk of data, resulting in audio dropouts. After the data makes it through the buffer, it's essentially reclocked by the DAC's master clock and sent on to the DAC chip.
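My mental model of that buffering stage looks something like the toy class below. To be clear, this is my own hypothetical sketch, not any vendor's implementation: the PC fills a FIFO, the DAC drains it one sample at a time on its own master clock (which is the whole point of "asynchronous"), and the fill level drives feedback telling the host to speed up or slow down. Real buffer sizes and the feedback mechanism will differ.

```python
from collections import deque

class AsyncUsbBuffer:
    """Toy model of an async-USB DAC's input FIFO (illustrative only).

    The host pushes verified packets in; the DAC pulls samples out,
    clocked by its own oscillator rather than the PC's. The fill level
    generates rate feedback to the host.
    """
    def __init__(self, capacity: int = 1024):
        self.capacity = capacity
        self.fifo = deque(maxlen=capacity)

    def host_push(self, samples):
        """Host side: deliver a packet of PCM samples that passed its error check."""
        self.fifo.extend(samples)

    def dac_pull(self):
        """DAC side: one sample per tick of the DAC's master clock."""
        # An empty FIFO is an underrun: the dropout the post describes.
        return self.fifo.popleft() if self.fifo else 0

    def feedback(self) -> str:
        """Rate feedback to the host, derived from the fill level."""
        fill = len(self.fifo) / self.capacity
        if fill < 0.25:
            return "send more"
        if fill > 0.75:
            return "send less"
        return "hold rate"
```

The key property this models: as long as the buffer never runs dry, the samples reaching the DAC chip are timed entirely by the DAC's clock, so the PC's timing jitter never touches them.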
I've described what I suspect happens in asynchronous USB implementations. If designed properly, I see absolutely no reason why the quality of the cable or USB port would affect the sound. At least, there's no reason it should affect the bits or how they're processed by the DAC in any way.
That being said, there are a few unknowns to me: can the power (VBUS) and ground lines in the USB connection somehow "dirty" the DAC's power supply? I don't know...I'm not terribly comfortable with electrical engineering subject matter.
If someone could enlighten me, I'd be forever grateful.