Head-Fi.org › Forums › Equipment Forums › Sound Science › Don't get why "Audiophile" USB Cable would improve sound quality

Don't get why "Audiophile" USB Cable would improve sound quality - Page 54  

post #796 of 835
Quote:
Originally Posted by uelover View Post

 

There are many, and I mean really many, asynchronous USB DACs out there on the market.

 

6Moons is an audio review website though, so I am not sure what you meant by 6Moons' Wavelength DAC. They don't make DACs.

 

Right you are. It's from Wavelength Audio and it's called The Brick. Here we go. I'm doing more reading on the subject and you're right, there are quite a few more. I think even with asynchronous streaming there's still no data re-sending, but now I can't be sure.

post #797 of 835

Within every industry there are products that cost more just for the sake of it. If one manufacturer didn't make them, someone else would.

 

Some people have more money to spend than others 

post #798 of 835

Yesterday I read in Hobby Hi-Fi magazine a short article titled "The truth about USB cables" or something like that. I skipped the part where the reviewer described his listening test, but in the second part he used an audiophile USB cable vs. a generic one to scan images, and the audiophile cable gave a slightly larger file, although there was no difference to the eye.

post #799 of 835
Quote:
Originally Posted by MaciekN View Post

in the second part he used an audiophile USB cable vs. a generic one to scan images, and the audiophile cable gave a slightly larger file, although there was no difference to the eye.

 

Way too many questions about process there to know what's going on. For instance, many scanners autofocus and adjust exposure before each and every scan (and unless you are very tech savvy, you may not know that, or know how to disable the feature). Any adjustment to either (or to quite a large number of other factors) could affect file size - especially if using a compressed file format.
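A side note on why the compressed-file-size "evidence" proves nothing about the cable: a deterministic compressor produces byte-identical output for identical input, so a size difference between two scans means the scanner captured slightly different pixel data (exposure, focus, noise), not that the cable changed bytes in flight. A minimal sketch using Python's zlib (the fake "pixel" data is purely illustrative):

```python
import zlib

# Stand-in for one scan's raw pixel data (illustrative, not real scanner output)
pixels = bytes((x * 7) % 256 for x in range(100_000))

# Same input -> byte-identical compressed output, every time: a cable that
# delivers the same bytes cannot change the file size.
assert zlib.compress(pixels) == zlib.compress(pixels)

# Flip a single bit in one "pixel" (e.g. a tiny exposure difference) and the
# compressed size may shift, with no difference visible to the eye.
tweaked = bytes([pixels[0] ^ 1]) + pixels[1:]
print(len(zlib.compress(pixels)), len(zlib.compress(tweaked)))
```

So the file-size difference points at scanner behavior between runs, not at the USB cable.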

post #800 of 835

I guess the idea was that the audiophile cable preserves more information. IMO USB audio is about using a communication protocol that allows handshakes between the PC and the USB chip, not uber-expensive cables, though it was interesting to see what a hi-fi magazine has to say about this.

post #801 of 835
Quote:
Originally Posted by MaciekN View Post

Yesterday I read in Hobby Hi-Fi magazine a short article titled "The truth about usb cables" or smth like that. I skipped the part where the reviewer described his listening test but in the second part he used an audiophile usb cable vs generic one to scan images, and audiophile cable gave a slightly larger file, although there was no difference to the eye.

 

That is typical of the tactic used. Did the article show any causal link to an audible difference? Did it show a causal link to a sound quality difference? Or did it just suggest a link by saying: I heard a difference, I can measure a tiny difference, therefore what I heard was caused by that difference?

post #802 of 835

I always get a good laugh from these discussions.....

 

The output of a scanner is a DIGITAL FILE - which is a textbook example of the type of data that CANNOT possibly be affected by the cable (as long as the cable isn't downright defective). When transferring a digital file between data devices (like a scanner and a computer), the file really is just numbers. The timing of the numbers is totally unimportant since they are simply being moved from one storage point to another. As long as the numbers arrive in good enough shape that the receiving device can read them correctly, there cannot be any difference. Period! End! Anybody who claims otherwise really doesn't understand what's going on.

 

A DAC has the potential to be affected by cables because the cable may affect the TIMING of the numbers (usually by adding or aggravating jitter), and the operation of a DAC IS sensitive to the timing of the incoming numbers. Now, a reclocker (like the Audio-Gd DI or the Audiophilleo) is supposed to remove all original timing by substituting the timing of its own clock (which is hopefully much better). Likewise, a DAC with a sample-rate converter in it is supposed to replace the original clock with its own, which would make the quality of the original signal's timing totally unimportant as well. Assuming that either device does its job correctly, the cable cannot matter at all (the cable can matter only to the extent that the device FAILS to do its job perfectly; and nothing is totally perfect).
 

HOWEVER, on a DAC that doesn't reclock the data it is quite reasonable to expect that, since the cable may cause jitter or other timing problems, and the DAC is still sensitive to those problems, the cable may indeed influence the sound.

 

The answer isn't to get a better cable, though; the answer is to utilize something that eliminates the timing problems, and so make the cable unimportant (like a DAC with an ASRC, or a DI, or an asynch USB converter, or whatever).

 

Keith

post #803 of 835
post #804 of 835

Yes, and no.

 

Digital data is indeed really an analog waveform, and that fact causes all sorts of problems. The whole "trick" is to make it as much like pure digital data as possible. In the case of USB, and DACs, and S/PDIF, the data itself is just numbers. HOWEVER, the process of turning those numbers back into audio requires timing information - a clock. It is this timing information that suffers because of the difference between an ideal square wave and reality.... in almost all cases, the data itself can easily be read perfectly, but the clock cannot be recovered anywhere near "perfectly".

Luckily for us, what the clock should be is pretty well known and easy to determine for a given signal; this means that we can "fix" or "regenerate" the clock. Now, with our perfect data, and our new and very high quality clock, we have everything we need to get back the original audio with high accuracy.

 

We have three choices here: we can try to repair the original clock; we can figure out what the clock should be and create an entirely new and equivalent one; or we can create an entirely new and different clock and then use really fancy math to figure out the correct new numbers (based on the old numbers) that will give us the output we want. Of course, we can also choose "none of the above" and do the conversion with the buggered up original clock. If we do any of the first three, then the USB cable we use shouldn't make any difference whatsoever - except in our minds.

The one real exception is that, if we choose to repair the original clock, a better cable might get it to us in slightly better shape, which might enable our limited ability to repair it to give us a better overall result.
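To put a rough number on what a "buggered up" clock costs: for a full-scale sine of frequency f, sampling (or converting) at an instant that is off by Δt shifts the value by at most 2πfΔt of full scale - that's just the sine's maximum slope times the timing error. A quick back-of-envelope in Python, with illustrative numbers (1 ns of jitter on a 10 kHz tone; both figures are assumptions, not measurements of any device):

```python
import math

f = 10_000   # tone frequency in Hz (assumed for illustration)
dt = 1e-9    # timing error in seconds, i.e. 1 ns of jitter (assumed)

# Worst-case sample error as a fraction of full scale: max slope * time error
err = 2 * math.pi * f * dt
err_db = 20 * math.log10(err)

# Roughly 6.3e-5 of full scale, i.e. about -84 dBFS
print(f"max error: {err:.2e} of full scale ({err_db:.0f} dBFS)")
```

That's tiny but not automatically inaudible territory for all gear, which is why the thread keeps coming back to whether the clock gets regenerated rather than whether the cable is fancy.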

post #805 of 835

Doesn't that depend on which specs you read.... and which ones really matter?

 

When we're talking about an analog signal, there are all sorts of variables, and different people place different priorities on each.

When we're talking about digital data (like what comes out of a CD transport), it all becomes a lot simpler.

 

What's on a CD is just a file with a list of numbers in it.

There are only two "relevant" specifications.

The first is whether the numbers themselves are correct (that one is a simple "yes or no").

The other is how accurately the timing between the numbers is delivered (the timing is NOT recorded on the disc; it is re-created by the player itself).

 

IF you're connecting your transport to something that re-clocks the data, like a DAC with a re-clocker, or you're ripping it to a PC server, then the ONLY thing that matters is the numbers. In that case, unless one or the other is defective, the digital output of a $30 player IS ABSOLUTELY the same as the output of a $30,000 "CD transport" - and anyone who says differently is pulling your leg. They both play the disc and give you the numbers that are there. Since the timing doesn't matter, the discussion is over.

Since the timing isn't stored on the disc, something, either the player or some clocking device, must re-create it. It seems to make very little sense NOT to re-clock the data at the DAC since, by doing it the other way, you are pretty much deliberately arranging things so that any defects in the transport or the cabling will make the sound worse. (Remember that it isn't even theoretically possible to somehow make it sound better than it would with a good clock and correct data.)

 

Keith

post #806 of 835

That's fine in theory - and most USB receivers nowadays use asynchronous transfer mode, so there is no clock recovery at the USB receiver. There is, however, a buffer and, as far as I know, some sort of FIFO process to make sure no bits are dropped. Some better transports even have a bit-perfect audit capability to make sure nothing is being lost. The only difference I can see happening with an asynchronous USB interface is that if the USB cable or computer has high latency, a larger buffer might need to be used, which in turn requires more processing power, etc., but as far as I know it should still allow bit-perfect output. At worst, if there is a huge latency spike, some bits might be dropped. If you have a wireless card, this might happen about once every minute or so on Windows. Linux and Mac tend to have lower latency, but it is still a good idea to disable wireless and other unnecessary processes to avoid other threads competing with the audio playback thread(s).
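The buffer-plus-feedback idea can be sketched as a toy FIFO: the host pushes packets on request, the DAC side pops samples at its own clock, and the receiver asks for one sample more or fewer per packet to keep the buffer near half full. This is only the gist of asynchronous feedback - the class name, depth, and packet sizes below are made up for illustration, not any particular chip's firmware:

```python
from collections import deque

class AsyncUsbFifo:
    """Toy model of an asynchronous USB receiver's sample FIFO (illustrative)."""

    def __init__(self, depth=512):
        self.buf = deque()
        self.depth = depth              # aim to hover around half full

    def samples_to_request(self):
        # Feedback to the host: ask for more when running low, less when full.
        nominal = 44                    # ~44.1 samples per 1 ms frame at 44.1 kHz
        fill = len(self.buf)
        if fill < self.depth // 2:
            return nominal + 1
        if fill > self.depth // 2:
            return nominal - 1
        return nominal

    def push(self, samples):            # host delivers one packet
        self.buf.extend(samples)

    def pop(self):                      # the DAC's own clock pulls one sample
        return self.buf.popleft() if self.buf else 0.0  # underrun -> silence

fifo = AsyncUsbFifo()
fifo.push([0.1] * fifo.samples_to_request())  # buffer low, so host sends 45
print(fifo.samples_to_request(), fifo.pop())
```

The key point the sketch shows: the DAC-side clock never depends on when the cable delivered the packet, only on whether the buffer has data in it.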

 

The point of all these buffers, PLL filters, etc. is to reduce the jitter in the signal and ensure that a bit-perfect, accurately clocked signal reaches the DAC. Ideally you could have reclocking inside the DAC as well, but this is very uncommon, at least in the DACs I have been looking at; most just use bit-perfect filtering and clock recovery, so in the real world the transport does have an influence, at least in theory, since the transport clock is recovered by the DAC as far as I know. Also in theory, at least with the latest USB transports from XMOS, Audiophilleo, etc., it should be possible to use any decent USB cable and computer without running into problems. Users of transports like the Audiophilleo tend to agree that the USB cable and computer software/hardware do not make a very noticeable difference. Experiences with the XMOS USB transports seem to be pretty thin as this is a new design, but it is becoming very popular (even HiFace uses the XMOS now), and some XMOS transports have been compared favorably to the Audiophilleo AP2. So in the ideal world this new generation of USB transports should take whatever the computer/cable throws at them and still produce a high quality, bit-perfect SPDIF or I2S signal.

 

Unfortunately, though, a lot of transports floating around use adaptive or older asynchronous USB receivers and drivers, which in my own subjective experience with a modified HiFace are not 100% robust and show differences with computer hardware, software and cabling. Similarly, some DACs have excellent digital inputs, with or without reclocking, and seem indifferent to the transport choice, but most do not seem to match this level of performance. My guess is that the more powerful the USB receiver, the better it can manage larger buffer sizes without breaking a sweat. So if you have a DAC with a very high quality and powerful USB input, such as Anedio, Antelope, Calyx, PS Audio, etc., it is not very likely that USB or the computer will have a significant influence on system performance. For less advanced DACs, choosing a modern XMOS or Audiophilleo USB transport should take most of the computer and USB out of the equation.

 

In the real (or imaginary) world nothing is 100% certain, no system is 100% jitter-immune, and there will be specific cases where software/hardware, including USB, will matter - e.g. with older-tech USB receivers and DACs; very slow, noisy, poorly set-up or old computers; or other particular circumstances. I think a lot of the seeming explosion of USB cables on the market is a product of these older technologies still being sold (especially adaptive USB), as well as of particular circumstances and, of course, paranoid tweakaholics using imperfect systems.

IMO the name-calling and stubborn-minded opposition between ardent objective and subjective thinkers, or rather believers and non-believers (yes, science and philosophy tend to overlap and in many cases stand in for one another), is really not constructive. The fact that many from the objective camp have taken a particular attitude toward the "unscientific" sectors of the audiophile industry and those who support it doesn't help either (not that this has cropped up recently in this discussion). The difference in philosophy over what constitutes audio performance is 100% a matter of opinion: one side seems to insist (at least in particular cases) that audio performance is a purely objective matter, while the subjectively minded would insist that objective measurements are a sidenote, an alternative measurement rather than an overriding and exclusive one.

Equally, the confusion that results when purely subjective and anecdotal impressions are used to qualify performance continues to muddy the discussion of jitter, electrical noise and their effect on digital-to-analog converters. In my own experience, certain forms of jitter can actually improve the subjective quality of the sound by creating a darker, warmer and more forgiving reproduction. Similarly, the designer of Antelope DACs has said in an interview that certain forms of jitter can improve the subjective audio quality of a DAC by hiding certain electrical shortcomings in DAC chips, or otherwise creating a subjectively "more natural" presentation. The problem occurs when a particular tweak is tested, considered to improve performance, and therefore posited to lower jitter, even though on paper it can be shown to increase jitter and latency. Another interesting phenomenon is that people with warmer/darker gear tend to choose correspondingly brighter or more analytical source and transport equipment, which equally does not always correlate with lower noise or jitter.

 

It is only human to consider that one's own point of view is valid and all others are not. I am not saying that anyone in particular is exhibiting this; I am speaking purely in an abstract and theoretical manner. In a sense this is essential to being productive: once one decides on a certain course of action, it is productive to follow through until the task is complete. But in the context of a dialogue, or at the stage of evaluating possible courses of action, it is not productive, as it may exclude certain courses of action or inquiry which may be valid or fruitful, such as considering component selection or setup. I think it is good that we have people with different perspectives contributing to this discussion, as long as the door is left open to perspectives outside of, if not contradictory to, any particular one.


Edited by drez - 5/26/12 at 4:04am
post #807 of 835

Can't you simply have a buffer at the receiving end and take the transport out of the equation? So it won't matter whether your USB connection is asynchronous, accurately clocked, blah blah? I mean, that's what Amarra does on my Mac... it takes the data being sent to my external DAC, stores it in its buffer, and plays back the optimal version?

post #808 of 835
To risk repeating what has been said before, USB doesn't actually transmit the audio at 44.1 kHz or whatever; this is apparently above the frequency that USB operates at. So the computer must send the music data in packets, which the transport/DAC must then reconstruct into a real-time stream of audio data. The difference between asynchronous and adaptive USB transfer modes is that in adaptive mode the computer controls the sending of the packets, and in asynchronous mode the transport/DAC controls the sending of the packets. In both of these transfer methods, buffers are needed in the transport/DAC as well as in the computer to account for small variations in the rate at which data is fed to each stage.
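To make the packetization concrete: with one isochronous packet per 1 ms USB frame, 44.1 kHz doesn't divide evenly, so most frames carry 44 samples and the occasional frame carries 45 to keep the long-run average right. A small sketch of that pacing (the one-packet-per-millisecond framing is the common scheme; exact scheduling varies by implementation):

```python
SAMPLE_RATE = 44_100        # samples per second (CD audio)
FRAMES_PER_SECOND = 1_000   # one isochronous packet per 1 ms USB frame

sent = 0
sizes = []
for frame in range(1, 11):  # ten 1 ms frames = 10 ms of audio
    # How many samples should have been delivered by the end of this frame
    due = round(SAMPLE_RATE * frame / FRAMES_PER_SECOND)
    sizes.append(due - sent)    # samples carried in this frame's packet
    sent = due

print(sizes, sum(sizes))  # nine packets of 44 and one of 45 -> 441 samples/10 ms
```

So the "stream" is really bursts of 44-45 samples that the receiver's buffer smooths back into an even 44,100 per second - which is exactly why the cable's delivery timing isn't the DAC's sample timing.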

What gets complicated, though, is when you start talking about how changes in any of these hardware/software components might affect jitter in the audio stream being fed to the DAC chip, when all these buffers are in the way to make sure no data is lost and all the samples are kept in the right order. In theory a USB cable is a passive component, so it should not contribute actively to any timing variations in the data. It can only affect the USB signal (which is not transmitted in real time), maybe by providing better shielding, better phase characteristics, impedance control, etc. (or, in many cases, worse performance that people subjectively think is better). If you look at things from a technical perspective, you would not predict that the USB cable could affect the sound quality in any meaningful way. But I would be a hypocrite if I said I think this is the complete story, as I use audiophile USB cables myself; I just wouldn't say that they make a major difference in my system, definitely not "night and day" or something I am confident I could tell in a blind test. I would also say that a lot of the audiophile USB cables I have tested have been rubbish and appear to add jitter to the sound, make the sound warmer and more vague, etc. Plenty of people swear by Monoprice, Belkin or DIY cables.
Edited by drez - 10/31/12 at 5:40am
post #809 of 835
Originally Posted by drez View Post

USB doesn't actually transmit the audio at 44.1 kHz or whatever; this is apparently above the frequency that USB operates at.

 

44.1 kHz x 16 bits per sample is 705 kilobits per second, plus overhead for error correction/detection, packetization and transfer protocols over USB. I don't know the actual amount of overhead, but for the sake of discussion, let's say it doubles the amount of data. That would still require less than 1.5 megabits per second. USB 2.0 is specified to handle up to 480 megabits per second. For CD grade audio playback, USB frequency isn't a limiting factor.

post #810 of 835
Quote:
Originally Posted by Bostonears View Post

44.1 kHz x 16 bits per sample is 705 kilobits per second, plus overhead for error correction/detection, packetization and transfer protocols over USB. I don't know the actual amount of overhead, but for the sake of discussion, let's say it doubles the amount of data. That would still require less than 1.5 megabits per second. USB 2.0 is specified to handle up to 480 megabits per second. For CD grade audio playback, USB frequency isn't a limiting factor

 

Enjoying your music in mono?  I think you need to multiply by 2 somewhere in there.

 

That said, you're right, and I think somebody must have made some kind of typo or botched explanation.  I don't think it was meant that USB can't handle the data rate, but I can't tell what was meant either.
