Why do USB cables make such a difference?
Status
Not open for further replies.
Oct 13, 2017 at 12:25 PM Post #316 of 1,606
Since this thread is now in Sound Science, let me try to correct a few things, even at the risk of confusing some readers a bit more.
Bit Perfect:
In digital audio terminology, "bit perfect" only means that the original audio file's data bits have not been modified by any processing (EQ, volume, up- or downsampling, etc.).
It does not mean or guarantee that the original digital audio data reaches the DAC without any bit corruption at all.

USB Digital Transmission:

When transferring non-real-time digital files from A to B, bulk transfers with CRC checks are used. If an error is detected (whether genuine or spurious), the packet is resent.
With this scheme the probability of ending up with a corrupted bit is extremely low, on the order of 1 in 10^15.

When transferring real-time digital streams, isochronous transfers are used. If an error is detected, the packet is not resent.
The probability of a corrupted bit is still extremely low, but higher than with bulk transfer.

When dealing with USB-compliant cables, at worst (minimum specifications) the probability of a corrupted bit at the maximum rate is around 1 in 10^12 (one per terabit).
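That worst-case figure is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming an illustrative CD-quality stereo stream and the worst-case bit error rate of 1 in 10^12 quoted above:

```python
# Back-of-envelope check of the worst-case bit error rate claim.
# The stream parameters are illustrative assumptions (CD-quality stereo PCM).

SAMPLE_RATE = 44_100        # samples per second, per channel
BITS_PER_SAMPLE = 16
CHANNELS = 2
BER = 1e-12                 # worst-case probability of a corrupted bit

bits_per_hour = SAMPLE_RATE * BITS_PER_SAMPLE * CHANNELS * 3600
expected_errors_per_hour = bits_per_hour * BER

print(f"audio bits per hour: {bits_per_hour:.3e}")
print(f"expected corrupted bits per hour: {expected_errors_per_hour:.2e}")
# ~5.08e9 bits/hour -> ~5e-3 expected bit errors per hour,
# i.e. one corrupted bit every ~200 hours of playback, even at worst
```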

  • A bulk transfer is used to reliably transfer data between host and device. All USB transfers carry a CRC (checksum) that indicates whether an error has occurred. On a bulk transfer, the receiver of the data has to verify the CRC. If the CRC is correct the transfer is acknowledged, and the data is assumed to have been transferred error-free. If the CRC is not correct, the transfer is not acknowledged and will be retried.
  • Isochronous transfers are used to transfer data in real-time between host and device. When an isochronous endpoint is set up by the host, the host allocates a specific amount of bandwidth to the isochronous endpoint, and it regularly performs an IN- or OUT-transfer on that endpoint. For example, the host may OUT 1 KByte of data every 125 us to the device. Since a fixed and limited amount of bandwidth has been allocated, there is no time to resend data if anything goes wrong. The data has a CRC as normal, but if the receiving side detects an error there is no resend mechanism.
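The CRC mechanism in both bullet points can be sketched in a few lines. This is a minimal illustration of the CRC-16 used for USB data packets (polynomial 0x8005, bit-reflected, initial value 0xFFFF, final XOR 0xFFFF); the packet payload here is made up:

```python
def crc16_usb(data: bytes) -> int:
    """CRC-16 as used for USB data packets: reflected polynomial 0x8005
    (0xA001 when bit-reversed), initial value 0xFFFF, final XOR 0xFFFF."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc ^ 0xFFFF

packet = bytes(range(64))                 # pretend audio payload
good_crc = crc16_usb(packet)

corrupted = bytearray(packet)
corrupted[10] ^= 0x04                     # flip a single bit "in transit"
assert crc16_usb(bytes(corrupted)) != good_crc   # receiver detects the error
```

On a bulk endpoint a mismatch triggers a resend; on an isochronous endpoint the damaged packet is simply not resent, exactly as described above.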

In conclusion, there is no need to worry about data integrity whichever transfer type is used, but in absolute terms there are differences in how the data is treated.

Oct 13, 2017 at 12:28 PM Post #317 of 1,606
Wait, are we speaking about digital or analog here? Because in digital, "radio interference" and "current leakage" are non-issues if the receiver gets the bits that were sent... I believe what you are saying is that a "dirty" digital signal will impact the analog output, like a defective ground? Nonsense: the DAC chip will never, in any case, use power from the data lines to perform the conversion; it uses a dedicated power line, which of course should be clean for a good conversion.

The PCM experts will correct me, but isn't the timing information contained in the data rather than in the actual spacing of the transmitted bits? In any case, an audiophile cable will not provide a more stable transmission frequency, as a normal cable does not change its characteristics from one second to the next.
 
Oct 13, 2017 at 12:31 PM Post #318 of 1,606
Your knowledge of EE is obviously superior to mine but my point throughout much of this thread has been the marketed implication and understanding of some audiophiles that within a DAC, RF/EM and system/source noise is effectively impossible to isolate against to levels below audibility. My knowledge of EE and the specific designs of audiophile DACs is insufficient to counter this understanding with technical arguments and probably wouldn't help any way, as those I've been communicating with would not appreciate it, so I've attempted to use simple logic instead. Namely, how come pro audio DACs, even cheap ones, routinely manage a level of isolation which is apparently impossible for audiophile DACs costing far more?

In many/most cases, I believe the differences heard by some audiophiles between USB cables is purely the result of a perception bias but there are some audiophile devices out there which are poorly enough designed to allow interference through to audible levels. In fact, I myself have heard an audiophile DAP which produced audible noise when positioned within a few inches of a laptop, noise which was rendered inaudible by an audiophile USB cable (or by moving it further away from the laptop). My response to this scenario is not that audiophile USB cables are therefore valid but that some audiophile devices are effectively faulty, designed with inappropriate isolation.

If an audiophile USB cable really does make an audible difference, it's because it's providing a level of isolation which the DAC itself should be providing.

G
I agree that audiophile devices may be poorly designed, and that they may indeed produce audible noise from various sources. Sure, if a USB cable produces a positive difference it's providing a solution, albeit a band-aid that shouldn't have been necessary in the first place. I was simply objecting to the blanket "chokes are bad" statement. Chokes/ferrites are not bad or good; they are a circuit element that needs to be properly applied if the goal is improvement.

I also firmly believe that most perceived improvements are the result of expectation bias. I have yet to see a properly performed, bias-controlled test with sufficient statistical data, though I have little desire to search deeply.
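For what it's worth, the statistics side of such a test is not exotic. A bias-controlled ABX trial is typically scored with a one-sided binomial test; a minimal sketch (the 12-of-16 score is a hypothetical example):

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided p-value: the probability of scoring at least `correct`
    out of `trials` by pure guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

p = abx_p_value(12, 16)
print(f"12/16 correct: p = {p:.4f}")   # p ~ 0.038, below the usual 0.05 threshold
```

So even a listener who genuinely hears a difference needs a fair number of trials before the result is distinguishable from guessing.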
 
Oct 13, 2017 at 1:22 PM Post #319 of 1,606
Wait, are we speaking about digital or analog here? Because in digital, "radio interference" and "current leakage" are non-issues if the receiver gets the bits that were sent... I believe what you are saying is that a "dirty" digital signal will impact the analog output, like a defective ground? Nonsense: the DAC chip will never, in any case, use power from the data lines to perform the conversion; it uses a dedicated power line, which of course should be clean for a good conversion.

The PCM experts will correct me, but isn't the timing information contained in the data rather than in the actual spacing of the transmitted bits? In any case, an audiophile cable will not provide a more stable transmission frequency, as a normal cable does not change its characteristics from one second to the next.

Well... many DACs ARE powered by USB, and a fairly high current (1 A or more) has to pass through the same cable, which in itself creates interference. Best practice is to completely remove the +5 V from the cable, but how many people do this?

USB has no timing information. In theory the timing can be reconstructed at the DAC end, knowing that the data packets are supposed to be synchronous; however, again, very few DACs do this (it requires buffering, a processor, good clocks, a bit of engineering, etc.). Typically the signal is either received "as is" from the source clock, or the clocking can be slaved to the DAC, but the packet is then sent for decoding without further correction.
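The buffering approach described here can be sketched as a toy model (all names and numbers are illustrative, not taken from any real DAC firmware): packets arrive with timing jitter, the DAC's local clock drains samples at a steady rate, and a pre-filled FIFO in between absorbs the jitter as long as it never runs dry.

```python
import random
from collections import deque

FRAMES_PER_PACKET = 6                        # e.g. frames per USB microframe
fifo = deque([0] * (4 * FRAMES_PER_PACKET))  # pre-fill to ride out jitter
underruns = 0

for tick in range(10_000):
    packet_is_late = random.random() < 0.5   # arrival-time jitter
    if not packet_is_late:
        fifo.extend([0] * FRAMES_PER_PACKET)
    for _ in range(FRAMES_PER_PACKET):       # local clock: steady drain
        if fifo:
            fifo.popleft()
        else:
            underruns += 1                   # buffer ran dry: audible glitch
    if packet_is_late:
        fifo.extend([0] * FRAMES_PER_PACKET)

print(f"underruns after 10,000 ticks: {underruns}")   # 0: jitter absorbed
```

A real asynchronous design adds rate feedback to the host so the average fill level stays centered, but the decoupling principle is the same.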

It's the "should be" that kills it all... reality is more deceiving than theory would suggest.
 
Oct 13, 2017 at 2:01 PM Post #320 of 1,606
In practice USB works perfectly unless something is wrong with the basic design. I push music across my wifi network and theoretically that creates timing error, but in practice, it's totally inaudible.
 
Oct 13, 2017 at 3:57 PM Post #322 of 1,606
I did A/B comparison between the source CD and the output over my Airport network.
 
Oct 13, 2017 at 5:16 PM Post #323 of 1,606
Wait, are we speaking about digital or analog here? Because in digital, "radio interference" and "current leakage" are non-issues if the receiver gets the bits that were sent... I believe what you are saying is that a "dirty" digital signal will impact the analog output, like a defective ground? Nonsense: the DAC chip will never, in any case, use power from the data lines to perform the conversion; it uses a dedicated power line, which of course should be clean for a good conversion.

The PCM experts will correct me, but isn't the timing information contained in the data rather than in the actual spacing of the transmitted bits? In any case, an audiophile cable will not provide a more stable transmission frequency, as a normal cable does not change its characteristics from one second to the next.
what some people fear here is noise somehow creeping into the circuit board and whatever impact that could have: delayed triggering of the 0/1 switch, noise passing into the analog circuit like noise coming from the power source... pretty much all stuff that is expected to be crazy small even in consumer-level systems. except that here we're discussing with people who strive to get the very best possible. or at least like to think they do, because once again, pretending to incrementally improve fidelity without measuring anything is one hell of a concept. one I don't get.
in any case, the mentality is different because instead of thinking "low enough not to bother" like I would, some are thinking "can it be improved, no matter how low it already is?" it's a different world view and a different quest. there is nothing wrong with people who wish to achieve excellence. too bad audiophiles rarely wish to achieve excellence in logical reasoning, fact checking, and measurement; then this topic might lead somewhere.
 
Oct 13, 2017 at 5:30 PM Post #324 of 1,606
pretty much all stuff that is expected to be crazy small even in consumer-level systems. except that here we're discussing with people who strive to get the very best possible. or at least like to think they do, because once again, pretending to incrementally improve fidelity without measuring anything is one hell of a concept. one I don't get.
in any case, the mentality is different because instead of thinking "low enough not to bother" like I would, some are thinking "can it be improved, no matter how low it already is?"

I think I will disagree with the "low enough" bit. Some equipment will never show differences between sources, bad and good recordings, equipment swaps, audio formats and so on. In others, the differences jump to the listener's ears. The listening test is the ultimate one if the differences are sufficiently evident: one may not be able to tell which is better, but can at least identify that a change is there. Measurements of currents, jitter, ground voltages... those things are incredibly difficult to measure, and how much do they tell you quantitatively? Sometimes not much.

Just to illustrate: I have the same file in different formats, all from the same studio master (and provided by the studio man himself). I play them on one DAC, and they all sound the same. I play them on the second DAC: night and day difference. Does this mean audio formats make a difference or not? What's the answer to the question "do audio formats matter?" Finally, is that a measurement? Just food for thought...
 
Oct 13, 2017 at 5:55 PM Post #325 of 1,606
...the mentality is different because instead of thinking "low enough not to bother" like I would do, some are thinking "can it be improved, no matter how low it already is?" it's a different world view and a different quest.
What science is attempting to define is the threshold of "low enough not to bother". It is, unfortunately, not just a threshold or a number but a complex multi-dimensional array that is in some cases not yet fully defined.
I think I will disagree with the "low enough" bit. Some equipment will never show differences between sources, bad and good recordings, swapping out equipment, audio formats and so on. In others, the differences jump to the listener's ears. The listening test is the ultimate one if the differences are sufficiently evident - one may not be able to tell which one is better, but at least can identify that a change is there.
All part of the above. The listening conditions affect the audibility threshold. True enough, but that doesn't mean there isn't one.
Measurements of currents, jitter, ground voltages... those things are incredibly difficult to measure, and how much do they tell you quantitatively? Sometimes not much.
Huh? Currents, jitter, ground voltages, noise, distortion, response, temporal response...all are now easily measured and quantified. The task, as I've said, is correlation to the audible. The error is to try to assign a single figure or two-axis data point. That doesn't mean audible correlation is impossible, just a much bigger task. But measurement itself is a done deal.
Just to illustrate: I have the same file in different formats, all from the same studio master (and provided by the studio man himself). I play them on one DAC, and they all sound the same. I play them on the second DAC: night and day difference. Does this mean audio formats make a difference or not? What's the answer to the question "do audio formats matter?" Finally, is that a measurement? Just food for thought...
Not much food for thought, though, other than....it's another example of an opinion with zero information and no statistics. No equipment specified, no information on playback conditions, hardware, connection, environment, and it's the fully sighted and biased opinion of one guy.

I'm not trying to invalidate the above as your opinion, it is valid as such, but it's not an indication of anything other than your opinion, one that does not invalidate or validate anything about formats, interconnects, or system performance. It's an opinion, not a data point.
 
Oct 13, 2017 at 6:05 PM Post #326 of 1,606
Just to illustrate: I have the same file in different formats, all from the same studio master (and provided by the studio man himself). I play them on one DAC, and they all sound the same. I play them on the second DAC: night and day difference. Does this mean audio formats make a difference or not? What's the answer to the question "do audio formats matter?"

Well, that is really easy to figure out. If what you say is true and assuming all the formats should be audibly transparent, one DAC is properly presenting all of the formats as the same sound, and the other isn't presenting some of them properly. There are a million ways to mess up sound, but only one way to get it right. Audibly transparent should be audibly transparent. As long as it is, it's fine for the purposes of listening to music in the home. If it was me, I'd send the one that is making them all sound different back for a refund. It's clearly defective.
 
Oct 13, 2017 at 6:35 PM Post #327 of 1,606
Huh? Currents, jitter, ground voltages, noise, distortion, response, temporal response...all are now easily measured and quantified. The task, as I've said, is correlation to the audible. The error is to try to assign a single figure or two-axis data point. That doesn't mean audible correlation is impossible, just a much bigger task. But measurement itself is a done deal.

Not much food for thought, though, other than....it's another example of an opinion with zero information and no statistics. No equipment specified, no information on playback conditions, hardware, connection, environment, and it's the fully sighted and biased opinion of one guy.

It is certainly easy to take measurements; to make meaningful ones, not so much. I'll disagree with you on that one. Anyway, how many here have access to lab grade oscilloscopes and know how to get meaningful FFTs and so on? Not many, I'd guess.
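To be fair, a meaningful harmonic distortion number doesn't require a lab instrument at all: with a coherently sampled capture, a DFT projection per harmonic quantifies it directly. A minimal sketch on synthetic data (the 1% second harmonic is injected deliberately, so the right answer is known in advance):

```python
import math

N = 4096   # samples in the capture
K = 85     # fundamental sits exactly on bin K (coherent sampling, no window needed)

# synthetic "captured" tone: unit fundamental plus 1% second harmonic
signal = [
    math.sin(2 * math.pi * K * n / N) + 0.01 * math.sin(2 * math.pi * 2 * K * n / N)
    for n in range(N)
]

def bin_amplitude(x: list, k: int) -> float:
    """Amplitude of the component at DFT bin k, via sine/cosine projection."""
    a = sum(v * math.sin(2 * math.pi * k * n / N) for n, v in enumerate(x))
    b = sum(v * math.cos(2 * math.pi * k * n / N) for n, v in enumerate(x))
    return 2 * math.sqrt(a * a + b * b) / N

fund = bin_amplitude(signal, K)
h2 = bin_amplitude(signal, 2 * K)
thd_db = 20 * math.log10(h2 / fund)
print(f"fundamental {fund:.4f}, 2nd harmonic {h2:.4f}, THD {thd_db:.1f} dB")
# recovers the injected 1% (-40 dB) second harmonic
```

The hard part, as said earlier in the thread, is not producing such numbers but correlating them with audibility.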

Anyhow, don't take my comment as anything more than a data point. In reality I mentioned this because it's something I'm working on; I still need to do comparisons, take measurements and eliminate variables, but I'll be happy to share the link when it's done. However, if done carefully, it is a data point. My conclusions may or may not apply to you, but according to the scientific method, seeing no difference in something doesn't prove that no difference exists; only if you rule out all possibilities can you claim that. On the other hand, to prove that a difference may exist, it is enough to find one and only one case!
 
Oct 13, 2017 at 6:44 PM Post #328 of 1,606
It is certainly easy to take measurements; to make meaningful ones, not so much. I'll disagree with you on that one.
You must be working with a unique and individual definition of "measurement" then. We work with standard units, volts, amps, watts, dB ratios, Hz, seconds...what about that is not meaningful?
Anyway, how many here have access to lab grade oscilloscopes and know how to get meaningful FFTs and so on? Not many, I'd guess.
I'm not going to guess, but anyone trying to make a scientific statement can at least reference the work of someone who does have access to that equipment. (I am smiling...just a bit...about "lab grade oscilloscopes"...funny.)
...according to the scientific method, seeing no difference in something doesn't prove that no difference exists; only if you rule out all possibilities can you claim that.
ok....
On the other hand, to prove that a difference may exist it is enough to find one and only one case!
Nope! That would be closer to trying to define "margin of error".
 
Oct 13, 2017 at 6:50 PM Post #329 of 1,606
On the other hand, to prove that a difference may exist, it is enough to find one and only one case!

And the validity would depend on how badly you wanted that result and what kind of slack controls you employed to allow you to reach your goal.
 
Oct 13, 2017 at 9:57 PM Post #330 of 1,606
I think I will disagree with the "low enough" bit. Some equipment will never show differences between sources, bad and good recordings, swapping out equipment, audio formats and so on. In others, the differences jump to the listener's ears. The listening test is the ultimate one if the differences are sufficiently evident - one may not be able to tell which one is better, but at least can identify that a change is there. Measurements of currents, jitter, ground voltages... those things are incredibly difficult to measure, and how much do they tell you quantitatively? Sometimes not much.

Just to illustrate: I have the same file in different formats, all from the same studio master (and provided by the studio man himself). I play them on one DAC, and they all sound the same. I play them on the second DAC: night and day difference. Does this mean audio formats make a difference or not? What's the answer to the question "do audio formats matter?" Finally, is that a measurement? Just food for thought...
I'm not against listening feedback; I wouldn't waste my time learning and discussing ABX and other listening methods, and the best way to volume-match gear, if I saw no value in listening impressions. if anything, listening impressions are a very good way to notice when something seems wrong; they give you a reason to investigate and make sure. I'm only against mistaking listening impressions for a high-fidelity measurement tool. most variables can be checked with a fidelity magnitudes better than what hearing could hope to achieve; it's plain wrong to assume that they're equivalent methods or, worse, that hearing is superior. and that's even before bringing up all the biases, preconceptions and placebo effects.

and to be clear, I was talking about people who aim at getting better fidelity. the tuning of music with a USB cable is crazy talk to me; I don't even want to participate in such a conversation. and all those who only care about their personal preferences need nobody else to make a decision. different approaches for different purposes.

about anecdotes, let's go straight to the top: you can find anecdotes about how some famous DAC designers were hearing something but couldn't find any issue in measurements. much more credible than some no-name guy on the web. the advantage with anecdotes is that you only need things to happen once, by accident, to have an impressive story for the rest of your life. even Schiit has one of those stories. but realistically, how often do you figure that happens, something clearly noticeable (under controlled listening!) but eluding measurements completely? and what does it lead to? the guy going "ok, so my hearing is superior, I don't need measurements", or the guy trying to develop a reliable procedure to confirm objectively what's potentially a problem he didn't think about? you know, like how every single measurement method was born. ^_^
those anecdotes that people love to misuse as evidence that human hearing is important in testing are, IMO, all testimonies to the lack of proper measurements and methods (for various reasons: laziness, rig too expensive, pioneering in a domain and having to build the tools as you advance...). but they do not convince me that I should trust my ears more.
because how easy is it to fool people into making up impressions of audible differences in sighted tests, simply by changing the price tag, the look, or the marketing of the same cable? we can fool pretty much anybody, and we know it because we have tried so many times. fidelity assessment through sighted tests... yeah, funny. and expecting others to just trust someone's impressions, and conclusions based on those impressions, without any sort of evidence, IMO that's not funny, that's gullibility. but I guess that's why I hang out mostly in this section of the forum (aside from doing a bad police job from time to time).
 
