Don't get why "Audiophile" USB Cable would improve sound quality
Jul 31, 2011 at 11:49 PM Post #601 of 835
Lossless, however, has the advantage that you can archive and transcode without any further loss in quality. With hard drive storage so cheap these days, lossless is a more than legitimate option for backing up your audio collection.
 
The USB protocol is noisy...though that's down to the USB controller, not the cable itself. And it's very unlikely to be audible anyway.
 
Aug 1, 2011 at 6:25 AM Post #602 of 835


Quote:
Lossless, however, has the advantage that you can archive and transcode without any further loss in quality. With hard drive storage so cheap these days, lossless is a more than legitimate option for backing up your audio collection.

 
I think he was talking about LessLoss, another shady cable company.
http://www.lessloss.com/tunnelbridge-distortionless-interconnect-system-p-204.html
They have a machine that increases the quality of your interconnects! I might be wrong, but I thought adding more stuff to the signal's path was bad.
 
Oh, and what consensus did this thread reach on power cables? I understand that the hundreds of meters of cable in your walls conducting the power will count for more than the high-quality stuff you use for those last 2 meters, but a friend of mine once told me his DVD player needed a good power cable to work, and the normal cable wasn't good enough... I know I could go back and read everything, but come on...
 
Aug 1, 2011 at 6:38 AM Post #603 of 835


Quote:
I would place Locus [pocus?] Design in the same bag as LessLoss Audio.  The way they market their products is appalling and dumb, as is the physical width of their flagship cable.
 
"Hear the difference" doesn't cut it - there are too many frailties in casual perception to rely upon it.  One can either measure electrical performance and then prove that the results can be heard, or do a direct ABX test or equivalent.
 
With USB cables it really is the measurements that count, and the only changes a USB cable can make are to RF noise and signal jitter.  There is no mysterious magic going on to warrant such a product development approach.
 
At the very least some oscilloscope measurements with real-world connectors would be appropriate, but it would seem that as price increases, accountability decreases and the snake-oil factor increases.




That's what settles it for me with cables - all cables. No maker can demonstrate a link between the way they build a cable and how it supposedly sounds different from other cables. Whatever difference people hear is not in the cable.
 
Aug 1, 2011 at 8:27 AM Post #604 of 835
LessLoss' credibility is further damaged by the fact that they sell the "LessLoss Blackbody" (a magical quantum MacGuffin), which only the most devoted audiophile would believe actually works. It received a good review from 6moons, but then again, the reviewer also seemed to be labouring under the delusion that he could feel wireless networks and that they suppressed his mental faculties through means unknown.
 
Aug 1, 2011 at 8:38 AM Post #606 of 835
Are you going to see how good it is with a blind test, or are you merely going to plug it in and decide whether it sounds better?
 
Aug 1, 2011 at 11:32 AM Post #607 of 835
It seems that, as absurdly easy as a double-blind test is to do, nobody who buys into the snake-oil cable market actually tries one. Well, in fact, I suspect many do try it, but find that they can't distinguish a difference when they don't know which cable they're listening to. Then they simply don't post anything, or say something like: "Well, just use your ears!" [Ironically enough, a double-blind test isolates exactly that - your ears.] "All that really matters anyway is whether you get enjoyment out of it. Everything else doesn't matter!" [Nervous laughter as the brain subconsciously suppresses the knowledge that $1000 was paid for something absolutely worthless.]
 
Aug 1, 2011 at 1:14 PM Post #608 of 835
Quote:
It seems that, as absurdly easy as a double-blind test is to do, nobody who buys into the snake-oil cable market actually tries one. Well, in fact, I suspect many do try it, but find that they can't distinguish a difference when they don't know which cable they're listening to. Then they simply don't post anything, or say something like: "Well, just use your ears!" [Ironically enough, a double-blind test isolates exactly that - your ears.] "All that really matters anyway is whether you get enjoyment out of it. Everything else doesn't matter!" [Nervous laughter as the brain subconsciously suppresses the knowledge that $1000 was paid for something absolutely worthless.]


What's even worse, the snake-oil cable companies don't do it either. You'd think it would be a great way to prove superiority over competitors. Even small DBTs with a couple of people, which wouldn't be very significant scientifically, would be more than enough to impress consumers and make great review fodder to boot.
 
But differences wouldn't show up. And the target market, subjective as it is, doesn't care much anyway.
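For anyone curious how "significant" a small DBT could be: under blind guessing, the number of correct answers follows a binomial distribution, so it's easy to check what a given score actually proves. A minimal sketch in Python (the trial counts are purely illustrative):

```python
from math import comb

def p_value(correct: int, trials: int) -> float:
    """One-sided chance of scoring at least `correct` out of `trials`
    by pure guessing (p = 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Even a short session says something: 9/16 proves nothing,
# while 14/16 is very unlikely to be luck.
for correct in (9, 12, 14):
    print(f"{correct}/16 correct: p = {p_value(correct, 16):.4f}")
```

Sixteen trials take maybe twenty minutes with a friend swapping cables behind your back, which is what makes the absence of such tests from cable marketing so telling.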
 
Aug 1, 2011 at 7:15 PM Post #609 of 835
 
Quote:
I'm pretty sure this has been explained over and over and over again.  It is physically impossible for a USB cable to significantly (as in, audibly) change the sound.

 
Just how old are you?  Do you remember "Pure Perfect Sound, Forever"?  Do you know/remember what the first CD players sounded like back in the '80s, before they discovered jitter?  Bits-is-bits-is-bits, they told us.  So now we have discovered jitter, we understand it, we buy into it, and we accept that bits-wasn't-bits-wasn't-bits.  But today, so you tell us, a USB cable can't POSSIBLY affect the sound quality, because we KNOW that the signal is re-clocked at the DAC, so there CAN'T be any jitter, so it CAN'T POSSIBLY affect the sound.  And, moreover, there is NO POSSIBILITY WHATSOEVER that there may be other effects that we haven't got round to reliably quantifying/measuring yet.
 
And I'm so dumb that I'm just hearing things that I want to hear?  Puh-lease!
 
So tell me: if a signal is playing at (for example) 24/192, just how small must the jitter be in order for it to be incapable of inducing significant artifacts (i.e. artifacts which would cause the digital data stream to be different if it were re-sampled) into a theoretically perfect DAC's output?  And how would you set about measuring that?...  Now, I'm no expert, but I think I can do that calculation.  Why don't you give it a go yourself and tell me what you come up with.  (Hey, I'm not trying to put you down here - I'm being serious.)
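For the record, the back-of-the-envelope version of that calculation: the worst-case amplitude error a timing error Δt can cause equals the signal's maximum slew rate times Δt. For a full-scale sine at frequency f, the maximum slew is 2πf (in full-scale units per second), so keeping the error below 1 LSB of a 24-bit sample at the top of the audible band gives roughly this bound (a sketch, not a measurement):

```python
from math import pi

BITS = 24        # sample word length
F_MAX = 20_000   # top of the audible band, Hz

# Worst-case amplitude error of a full-scale sine from a timing error dt:
#   err = 2*pi*f*dt   (as a fraction of full-scale amplitude)
# Requiring err < 1 LSB = 2**-(BITS-1) of full-scale amplitude:
lsb = 2 ** -(BITS - 1)
dt_max = lsb / (2 * pi * F_MAX)
print(f"jitter must stay below {dt_max * 1e12:.2f} ps")  # about 0.95 ps
```

Sub-picosecond, in other words - which is a demanding spec for a DAC's clock, but says nothing about the cable, since the data is re-clocked downstream of it.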
 
 
Aug 1, 2011 at 7:43 PM Post #610 of 835
Placebo effect has nothing to do with being dumb. You could be the most brilliant person in the world and still "suffer" from the effect.
 
I'll admit I am not an electrical engineer (I'm a computer engineer), so I can't say very specifically how DACs are designed internally. However, I do know that there is literally no way a digital cable can change the sound of the audio signal itself. At most you'll have data loss and skipped samples, which could cause stuttering or dropouts. To modify the sound, the waveform itself would need to change in a very specific way. Again, skipped samples would cause skipping sounds, not a "warmer sound" or "better soundstage". It's just impossible.
 
Do modern DACs actually run directly off the USB data feed without any buffering? Do they not have their own local clock? If they run off a USB-derived clock and/or don't buffer the data feed, that sounds incredibly stupid to me. That's not a digital problem; it's a flaw in the DAC circuit implementation if clock jitter is permitted to reach the analog output. Still, this won't change the sound signature. Even in this case a better cable won't give you boosted bass or a "better soundstage" or whatever; at most, a jittered clock could make the DAC sound distorted and "rough", but again, that is a flaw in your DAC circuit implementation. And even then, there's no way it can give you a "bass boost" or "louder treble" etc.
 
This was my only point to begin with. People who perceive a bass or treble boost or a totally different sound coloration must be imagining things, because it is simply not possible. Jitter and noise at the DAC's output are of course to be expected, but that's a DAC issue. The cable shouldn't affect it, if the DAC circuit is properly designed.
 
Aug 1, 2011 at 7:51 PM Post #611 of 835


Quote:
 
 
Just how old are you?  Do you remember "Pure Perfect Sound, Forever"?  Do you know/remember what the first CD players sounded like back in the '80s, before they discovered jitter?  Bits-is-bits-is-bits, they told us.  So now we have discovered jitter, we understand it, we buy into it, and we accept that bits-wasn't-bits-wasn't-bits.  But today, so you tell us, a USB cable can't POSSIBLY affect the sound quality, because we KNOW that the signal is re-clocked at the DAC, so there CAN'T be any jitter, so it CAN'T POSSIBLY affect the sound.  And, moreover, there is NO POSSIBILITY WHATSOEVER that there may be other effects that we haven't got round to reliably quantifying/measuring yet.
 
And I'm so dumb that I'm just hearing things that I want to hear?  Puh-lease!
 
So tell me: if a signal is playing at (for example) 24/192, just how small must the jitter be in order for it to be incapable of inducing significant artifacts (i.e. artifacts which would cause the digital data stream to be different if it were re-sampled) into a theoretically perfect DAC's output?  And how would you set about measuring that?...  Now, I'm no expert, but I think I can do that calculation.  Why don't you give it a go yourself and tell me what you come up with.  (Hey, I'm not trying to put you down here - I'm being serious.)
 



I'm not talking about just jitter here; I'm talking about the cables.  Any jitter there may be won't be introduced by the cables but by the USB controllers, which actually can introduce a significant amount of jitter into the bitstream.  However, a better cable simply will not help with this.  It may hypothetically minimize the damage done to the stream, but if you have jitter, it's always gonna be there whether your cable costs $2000 or not.  The amount of jitter a competently designed USB cable will actually introduce into the bitstream is utterly insignificant at the frequencies audio operates at.  I mean, USB 2.0 cables are designed to transmit data at around 480 Mbit/s; if you had bits dropping and getting skewed everywhere along the line, you'd barely be able to transfer anything, let alone transfer data at that speed without some serious slowdown.
 
Rather, the real issues here are the controller and the clock, not the cable.  And I highly doubt that the guys at, say, Locus have spent as much time and energy (read: millions of dollars and thousands of man-hours) or have as much experience designing a cable to work with the USB protocol.  They clearly just feel that their experience making analog cables out of pure silver will translate directly into making digital cables.  They probably don't even take measurements, because there really aren't any for them to take...the measurements would just show that the cable isn't helping.  Not to mention that the tools they would need to measure such things would be extremely expensive.
 
At any rate, show me some solid measurements demonstrating that an audiophile USB cable measures differently (it doesn't even have to be audible) from any run-of-the-mill USB cable that comes bundled with a camera or something, and then we'll talk.  And please don't say that there's "something we haven't discovered yet"; if this "something" were really such a problem, it would have plagued high-speed data transfer far more than the relatively low-stress job of transmitting audio (10 Gbit/s is a lot more than what's needed for audio.  And as far as timing goes, the wiring inside a computer has to be extremely, extremely precise, since at the speeds the processor operates at, pretty much any deviation of the signal will cause bad things to happen).
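The "bits are bits" part is trivially checkable at the data layer, for anyone who wants to try: copy a file across the USB link in question and hash both sides. A sketch below; the file paths are hypothetical placeholders, and matching digests mean the cable delivered every bit intact:

```python
import hashlib

def sha256(path: str) -> str:
    """Hash a file in chunks so large files don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths: the original file and a copy made across the USB link.
# print(sha256("album/track01.flac") == sha256("/mnt/usbdrive/track01.flac"))
```

(Strictly, bulk transfers like this are error-checked and retried by the protocol, while USB audio is isochronous and is not retried - but the check still shows that a stock cable moves data without corrupting it.)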
 
Aug 1, 2011 at 7:56 PM Post #612 of 835


Quote:
Placebo effect has nothing to do with being dumb. You could be the most brilliant person in the world and still "suffer" from the effect.
 
I'll admit I am not an electrical engineer (I'm a computer engineer), so I can't say very specifically how DACs are designed internally. However, I do know that there is literally no way a digital cable can change the sound of the audio signal itself. At most you'll have data loss and skipped samples, which could cause stuttering or dropouts. To modify the sound, the waveform itself would need to change in a very specific way. Again, skipped samples would cause skipping sounds, not a "warmer sound" or "better soundstage". It's just impossible.
 
Do modern DACs actually run directly off the USB data feed without any buffering? Do they not have their own local clock? If they run off a USB-derived clock and/or don't buffer the data feed, that sounds incredibly stupid to me. That's not a digital problem; that's a flaw in the DAC circuit implementation. Still, this won't change the sound signature. Even in this case a better cable won't give you boosted bass or a "better soundstage" or whatever; at most, a jittered clock could make the DAC sound distorted and "rough", but again, that is a flaw in your DAC circuit implementation. And even then, there's no way it can give you a "bass boost" or "louder treble" etc.
 
So I ask, are there many high end DACs that run off the USB without buffering, and without an internal independent clock? If so, I'm shocked.



Actually, as far as I know, most DACs do run without any internal buffering...all of the buffering is done on the host side, from what I understand.  Not really sure why that is, though...I think it's something to do with how the USB protocol works.
 
Aug 1, 2011 at 8:00 PM Post #613 of 835
That's horrible. There should be no reason a high quality DAC box can't have its own buffer and clock.
 
Even in the case where you're relying on an external clock, it's still not gonna change the sound signature like boosting bass or whatever. That was my only point here.
 
Aug 1, 2011 at 8:03 PM Post #614 of 835

That's horrible. There should be no reason a high quality DAC box can't have its own buffer and clock. 
 
Even in the case where you're relying on an external clock, it's still not gonna change the sound signature like boosting bass or treble or whatever. That was my only point here.


Based on zero real-world experiments, I presume? Making your opinion mostly a wild guess?
 
 
Aug 1, 2011 at 8:06 PM Post #615 of 835
The big wild guess is that jitter causes audible distortion at all, because all I've ever encountered is skipping due to the USB controller. I don't even know what clock rates we're talking about here, or the specifics of how DACs work, so for all I know it may not even be an issue within the spectrum of human hearing.
 
The non-guess (fact) is that to get boosted bass, you need to modify the digital signal in a very specific, low-frequency way. Jitter will at most distort the analog output at a high-frequency granularity. It cannot change the amplitude of the sound across the frequency spectrum the way believers claim.
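That slew-rate argument can be put in numbers: the worst-case error jitter injects is proportional to signal frequency, so a bass tone is the least affected of all - the exact opposite of a bass boost. A rough sketch, assuming a pessimistic 1 ns of clock jitter (the figure is illustrative, not measured from any real DAC):

```python
from math import pi, log10

JITTER = 1e-9  # assumed clock jitter, 1 ns (pessimistic for a decent DAC)

# Worst-case relative error for a full-scale sine at frequency f:
#   err <= 2*pi*f*JITTER   (as a fraction of full scale), shown here in dB.
# Prints roughly -130, -104 and -78 dBFS for bass, mid and treble.
for f in (50, 1_000, 20_000):  # bass, midrange, treble
    err = 2 * pi * f * JITTER
    print(f"{f:>6} Hz: error <= {20 * log10(err):.0f} dBFS")
```

Even the treble-band worst case sits far below the noise floor of most playback chains, and the error grows with frequency rather than selectively lifting the bass.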
 
Also, keep in mind this is the science forum. You have two options to assert a claim:
 
1) Do a double-blind test. Sorry, "use your ears and eyes" doesn't work here. Only "use your ears without your eyes" counts as proof, and the result must be statistically significant.
 
2) Show scientifically why something is or is not possible, under a given context.
 
