Does the USB Cable Matter? (USB DACs)
Nov 19, 2014 at 3:44 PM Post #106 of 120
OK, it is impossible for a USB hub to affect the audio signal carried over Asynch USB in any way, other than the very slight possibility of mains noise, and a hub is going to be better than a PC for noise anyway, so I use a hub. Everything in Asynch introduces delays, and within reason that doesn't matter (unless you want it for AV); that is why it is called Asynch, i.e. there is no clock.

Not surprisingly, some of you haven't quite got my explanation of how an asynchronous protocol works. And I speak as someone who has coded two asynchronous device drivers, one entirely in assembler, and I still remember the sinking feeling in my stomach both times I was told my name was against that work package. Oh no. These were running at vastly slower speeds than USB 2.0, but the principles are exactly the same.

Basically they do what is called a handshake: the receiver sends an ACK or a NAK after each packet. If the sender gets a NAK, it sends the packet again and records an error. If it gets an ACK, it sends the next packet of data when it is ready to. Note the last part: whenever it feels like it, not when a clock is telling it to. This means that it is not possible for odd bits to be lost, and if errors do happen occasionally, the protocol will deal with them and keep a log somewhere. If NAKs start to happen very often, the sender may do a few things to try and fix it, like slowing down a bit or other clever stuff. There is probably some clever code looking for patterns and deciding whether an error needs to be raised to the OS. If the errors keep stacking up, it will eventually fall over and log a complete failure. So bits cannot get lost or corrupted. You could stream a file backwards and forwards across an Asynch USB 2.0 interface any number of times and the file would still be mathematically unchanged.
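
For those who like to see it in code, here is a minimal sketch of the sender-side retry loop I am describing. It is a toy simulation in Python, not real USB host-controller code: the function names, error rate, and retry threshold are all invented for illustration.

```python
import random

MAX_RETRIES = 8  # arbitrary threshold for this toy example


def unreliable_receive(packet, error_rate):
    """Stand-in for the receiver: True means ACK, False means NAK."""
    return random.random() > error_rate


def send_stream(packets, error_rate=0.01):
    """Send each packet, retrying on NAK, as in the handshake described above."""
    nak_log = []
    for seq, packet in enumerate(packets):
        retries = 0
        # Keep resending until we get an ACK; the sender moves on
        # whenever it is ready, not when a clock tells it to.
        while not unreliable_receive(packet, error_rate):
            retries += 1
            nak_log.append(seq)
            if retries >= MAX_RETRIES:
                raise IOError(f"packet {seq}: link failed after {retries} retries")
    return nak_log


if __name__ == "__main__":
    packets = [bytes([i % 256]) for i in range(10_000)]
    log = send_stream(packets)
    print(f"{len(log)} NAKs seen, every packet eventually delivered intact")
    # Injecting errors on purpose is the only way to exercise the failure path:
    try:
        send_stream([b"x"], error_rate=1.0)
    except IOError as err:
        print(f"forced failure: {err}")
```

Note that either every packet eventually gets through intact, or the whole thing falls over loudly; there is no middle ground where "slightly degraded" bits arrive.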

So when the guys at Schiit were testing all this, they will have had their shiny DACs connected to a few different PCs (and this is the problem: how many can we expect them to test?) streaming overnight. In the morning they will have looked at the error logs to see if there were any problems. The first time you do this you expect to see lots of odd little errors here and there, just as you describe, and you hope to get a nice warm feeling when you see that your code coped with it: no one died, no bits were lost, you emerge a hero. Well, not quite.

What I bet they saw, and also what I saw, was no errors at all. You end up having to find a way of injecting some in order to test the bloody error handler. In real life, when there is a problem, which is rare, you will see either an instantaneous complete failure or a few short bursts of errors followed by a complete failure.

But to make all this work you need everything to be a lot faster than when you have a clock keeping things in order. In return it eliminates any jitter other than that of the DAC's own clock against the samples, which is very small.

I do quite like the 'more processing somehow leads to more noise' argument, because you can't actually show that it is mathematically incorrect, and there is certainly more processing. But it does rather have the feel of an idea dreamed up at a sales conference rather than an audio engineering symposium. Am I wrong?
 
Nov 19, 2014 at 4:16 PM Post #107 of 120
The theory was, IIRC, described by an audio designer, but not in association with a particular brand, model, or marketing material. I doubt it would be brought up at an audio engineering convention. It would be a claim that the digital receiver is not indifferent to the incoming signal; that would stir things up and give away R&D. Nobody wins.
 
Nov 19, 2014 at 7:30 PM Post #109 of 120
Quite possibly. Some USB transports go to great lengths to isolate even the chassis ground of the computer, for example the Berkeley BADA USB. I think most USB cables have the shield connected at both ends, so chassis safety ground is effectively connected between the computer and DAC chassis. Signal ground is another thing and probably best left connected to the receiver; that can later be dealt with by galvanic isolation of the USB input inside the DAC. Could be overkill, but that seems to be the name of the game in hi-fi.
 
Nov 19, 2014 at 8:43 PM Post #110 of 120
Are you referring to the following? 
 
http://www.head-fi.org/t/701900/schiit-happened-the-story-of-the-worlds-most-improbable-start-up/3810#post_11058782
 
I haven't tried it, so I can't say. All I can say is that what the world-is-flat audio engineers were saying wasn't possible 30 years ago is now common knowledge (why aren't those papers being recirculated?). So allow the possibility of placebo, but also allow the possibility of ignorance and/or arrogance. :wink:
 
Nov 19, 2014 at 11:24 PM Post #111 of 120

I would guess that it's pretty safe to say that the people creating the technologies would, on average, know a little more than your usual audiophile. In an ignorance/arrogance contest we all know very well that the amateur audiophile will always win.
Yes, tech evolves with the times, making for new possibilities. Sure, some guy said that the car would never replace the horse, but for that one guy, how many others said the opposite? Anyway, engineers are still the ones making sure that our cars don't blow up when we turn the key, that our houses don't fall down next time there is a little snow on them, and that our internet works more than two days a week. So if those guys are trusted with our lives (I'm talking about the internet, obviously ^_^), I would tend to also listen to them when it comes to more trivial stuff like modern audio products. Because they are still the ones who came up with the tech.
 
Nov 20, 2014 at 5:59 PM Post #112 of 120
Firstly, no, I am not referring to the Wyrd. I am referring to any of their products which use Asynch USB 2.0. Jason talks about it and explains Asynch USB very well in his chapter, but some people may have misunderstood his complaints about how much hassle software is for an audio company as criticism of the technical solution and its robustness. In fact he acknowledges quite readily that it is a much better solution, just a hassle to implement and support. Welcome to the wonderful world of IT.

The error handler testing I described was probably actually done by their software kernel provider in Taiwan.

I am far from infallible as an engineer myself. I well remember arguing with a sales guy in about 2001 about a bid I was managing for a large IT project, because he wanted to put in a bunch of stuff about possible use of voice recognition software. I said that it was 'a load of bollocks' because it would never work well enough in our lifetime to be usable in real-life applications, like emergency services or hospitals, because computers would never be fast enough to run the kind of mathematical algorithms needed to do this in near real time. Less than 10 years later I had precisely that on my mobile phone.

I really can't comment on potential earthing issues or other sources of noise. I am a software guy; I don't do hardware. But I know enough about it to know that these kinds of problems should be dealt with by a decent DAC design, and if you aren't drawing power from the USB cable it should be quite straightforward. And until a hardware designer explains why I am wrong, and precisely how a USB cable can introduce distortion into the analogue side of a system through a twisted-pair digital connection that is providing bit-perfect data, I will continue to use bog-standard cables.

Drez, that was a joke. I meant 'it feels like clutching at straws', but I am not totally dismissing it, just very sceptical indeed.
 
Nov 20, 2014 at 6:10 PM Post #113 of 120
I just read the article about ground-plane noise and it seems to support what I have just written. It describes a problem which occurred with a board layout, and then at the end speculates a bit about cables without saying why. This was a hardware design issue which they found and fixed. It happens all the time; that is what engineers do.
 
Nov 20, 2014 at 6:43 PM Post #114 of 120
I guess some noise could bleed a little into the analogue side thanks to having only one common ground (and a bad one at that). That's how I imagine it being possible... maybe?
But the level of that noise in the cable would be low, and even lower for the part that would bleed into the analogue side. Compared to the DAC sending out up to 3 or 5 V depending on the chipset, I can't really see how that would matter on any OK design.
And as we're talking about cable differences here, it would mainly be a story about cable shielding. So it would take some crazy stuff near the cable to make noise that could matter in the end.
And for anything else, asynchronous USB would deal with it.
 
Nov 20, 2014 at 7:00 PM Post #115 of 120
Sorry, me again. Thinking about noise being introduced by the USB interface processor having to do more work. Let's call it 'brain ache'. That would mean that a higher sample rate would be noisier, because that is the only thing that would affect the amount of processing, so the effect would be more noticeable playing 192 kHz files than 44.1 kHz files. I have yet to read a post from anyone saying that their DAC sounds better playing the CD version than the HD version of a track. But I am sure there is one :>)
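
To put rough numbers on the 'brain ache' idea, here is a quick back-of-the-envelope calculation of the raw data rates involved (assuming stereo PCM and the usual bit depths; the figures are illustrative, not measurements from any particular DAC):

```python
def pcm_bytes_per_second(sample_rate_hz, bit_depth, channels=2):
    """Raw PCM payload the interface has to move, in bytes per second."""
    return sample_rate_hz * bit_depth * channels // 8

cd = pcm_bytes_per_second(44_100, 16)      # CD quality
hires = pcm_bytes_per_second(192_000, 24)  # typical 'HD' download
print(f"CD: {cd:,} B/s, 192/24: {hires:,} B/s, ratio: {hires / cd:.1f}x")
# CD: 176,400 B/s, 192/24: 1,152,000 B/s, ratio: 6.5x
```

So if 'more processing means more noise' were true, a 192/24 stream pushing roughly six and a half times the data ought to be audibly worse, which is the opposite of what hi-res advocates claim.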
 
However, it would not in any way be affected by the type or make of cable; that doesn't change the processing at all.
 
People are getting confused between 'cables from different manufacturers which meet the same standard' and different protocols or standards, e.g. Asynch vs Synch, or USB 2.0 vs USB 1.1. Of course standards and protocols make a difference; otherwise why have them?
 
Feb 5, 2015 at 3:57 PM Post #117 of 120
I just rearranged my kit and have dispensed with the 5 metre USB cable; I now have a hi-speed 1 metre one connecting my Audiolab MDAC to my PC. No more flushing buffers.


Good to hear. 5 metres is the maximum cable length allowed by the USB 2.0 spec, so that was pushing it.
 
Mar 14, 2015 at 6:53 AM Post #118 of 120
Oh dear, I'm actually having more problems with this short USB cable than I did with the long one. I'm getting a lot of 'unlocked' indications when I change to USB; I literally have to cycle through the inputs to one that has something playing and back to USB to get it working. Sometimes I have to switch my MDAC off and on again to get a signal. I'm not sure how far to push in the USB cable; I assume the click is as far as it should go, and I don't want to break it.
 
I have my Sky+ box connected, so there is always sound coming in through that, but it's a little perplexing having to mess about. I'll try another USB port on my PC, perhaps.
 
Mar 18, 2015 at 2:01 PM Post #119 of 120
After reading this thread, I agree with most of the others: a very cheap, very poorly made USB cable can result in issues, but one decent USB cable should make almost no difference compared with a very expensive one. As others have stated, the reason is that USB carries digital data, not analogue; as long as the link works, no quality is lost.
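
If you want to convince yourself of that, hash a file before and after it has made the trip across the interface; if the digests match, the copies are bit-identical. A minimal sketch; the file names here are hypothetical stand-ins:

```python
import hashlib

def sha256_of(path):
    """Digest a file in chunks so large tracks don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical files: the original and a copy that has crossed the USB link.
if sha256_of("track.flac") == sha256_of("track_after_usb.flac"):
    print("Bit-identical: nothing was lost in transit")
else:
    print("The copies differ, so the link (or the copy) corrupted data")
```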
 
Sep 18, 2019 at 10:32 PM Post #120 of 120
I know this thread is long dead, but I just experimented with quite a few USB cables, starting from the ones that came with my DAC, to my printer cable, to OCC Copper Black Dragon and Silver Dragon USB.
They all sound different.
The OCC Copper Black Dragon killed the treble on my system; everything sounded dark, and instruments were too rich and saturated.
Silver opened up the sound and brought back the top-end shimmer.
Cheap plated cables were, as you would expect, somewhere in the middle.
The biggest bump in sound quality came when I made my own cable with solid OCC silver conductors: resolution and imaging saw a major bump, and distortion was one of the first things I noticed, or I should say the lack thereof.
As of now there is not a shred of stranded cable in my entire system; I even went as far as opening the speakers up and changing every piece of stranded cable for solid OCC copper Neotech wire.
Interconnects are also solid OCC copper wire, home made.
Everything sounds absolutely fantastic.

So yes, USB cables make a difference, and it is not subtle; more than that, solid conductors make the biggest difference.
 