
USB cable and Sound Quality - Page 6  

post #76 of 134
Quote:
Originally Posted by icebird144 View Post
Thanks for the summary.

So, in a hypothetical situation: if the DAC has a large enough buffer, it can store up maybe 1-2 seconds of audio, so if jitter occurs it can use the buffer and fix the problem. Considering USB 2.0 has a bandwidth of 480 Mbps, it shouldn't be a problem in terms of bandwidth.
Does anyone know if any of the higher-end DACs have this function? If so, could we then use a less-than-perfect cable?
Re-read what FallenAngel posted. It's a timing issue. Even if the DAC chip had a buffer, it would still be storing timing information that could be good or bad. There's no way for the DAC to reference that timing against anything except its own clock at the moment the data arrived. If that's off, then storing the data in a buffer isn't going to make it any better.

There is something talked about in terms of packetized data that might keep the proper timing signal with the packets, but it's way above my head. Try reading the thread in the Computer Audio section, "USB Cable Matters!"

There are some great lessons in that thread - both about data transmission and timing relative to audio and also about some upset people with rigid thinking.
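The buffer-versus-clock point made above can be sketched in a few lines of Python. This is a toy model, not any real DAC's firmware: samples arrive with random timing error, and only re-clocking against a stable local oscillator (rather than replaying the stored arrival timing) removes the jitter.

```python
# Toy model: buffering alone preserves whatever timing the samples arrived
# with; only re-clocking from a stable local clock removes the jitter.
import random

SAMPLE_PERIOD = 1.0 / 44100  # ideal spacing between samples, in seconds

# Arrival times with random jitter added to each ideal sample instant.
random.seed(1)
arrivals = [i * SAMPLE_PERIOD + random.uniform(-2e-6, 2e-6) for i in range(1000)]

# Strategy 1: play samples using the arrival timing stored in the buffer.
buffered = list(arrivals)  # the buffer faithfully stores the bad timing

# Strategy 2: re-clock - ignore arrival times, emit one sample per tick
# of the DAC's own local clock.
reclocked = [i * SAMPLE_PERIOD for i in range(1000)]

def peak_jitter(times):
    """Worst deviation of inter-sample gaps from the ideal period."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    return max(abs(g - SAMPLE_PERIOD) for g in gaps)

print(f"buffer only : {peak_jitter(buffered):.2e} s")   # jitter survives
print(f"re-clocked  : {peak_jitter(reclocked):.2e} s")  # essentially zero
```

The point of the sketch: a buffer is just storage, and storage is timing-neutral. What fixes jitter is the decision to throw away the arrival timing and substitute the local clock's.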
post #77 of 134
I don't think either an expensive or a cheap USB cable will reduce or introduce jitter in the transmitted signal. Jitter is introduced by the source and can only be corrected at the receiving end (the DAC).
post #78 of 134
Quote:
Originally Posted by Godziltw View Post
I don't think either an expensive or a cheap USB cable will reduce or introduce jitter in the transmitted signal. Jitter is introduced by the source and can only be corrected at the receiving end (the DAC).
What do you make of our friend the professional audio engineer's posts on the previous pages?

Honestly, I get a little tired of folks just making claims with no supporting evidence, particularly when counterarguments and detailed explanations have already been marshaled against a given claim. Then the next dude hops in the thread and makes the same claim without referring to those explanations or counterarguments.
post #79 of 134
So it boils down to timing, correct? If the cable is subpar or whatever, it might lead to timing errors between the source and the DAC, known as "jitter", and that's what colors the "sound"?

The 0's and 1's that get sent can only be interpreted as 0's and 1's, and there are no differences between one "1" and another "1"... it's a timing issue between the 1's and 0's only, right?

Unless the cable is so bad that the DAC can't tell if it's a 1 or a 0.
post #80 of 134
Quote:
Originally Posted by joshd View Post
?

Until I have the money to get an outboard DAC, and someone lends me an expensive cable, I will not believe there is a difference until I actually think I can hear one myself. (In which case I will be massively confused, for the reasons gone over earlier in the thread.)
I won't even BELIEVE my ears.

my ears are connected to my brain. my brain runs logic but ALSO runs emotion programs

therefore, you need test equipment, since it doesn't have emotions and has no 'stake' in the outcome.

my view is that the usb cable means 100% nothing to the sound, and that if your source is jittery, your output will be too.

I won't waste time 'auditioning' datacomm cables since it's a fool's errand to me. I do realize a LOT of people in high end don't think and only want to 'feel', and have their wallets emptied that way. fine for them; just not me.

I haven't heard any diff in ANY spdif interconnect. maybe I should feel lucky that I don't need to overspend on copper or plastic.
post #81 of 134
Quote:
Originally Posted by jrosenth View Post
What do you make of our friend the professional audio engineer's posts on the previous pages?
there is no consensus that I've heard.

various 'audio engineers' also believe in magic.

so it means nothing. a LOT of high end is fake hocus pocus anyway. many so-called high end designers are just modern day snake oil salesmen.
post #82 of 134
Thanks to Lapwing and all the others who try to explain the basics, but it seems this is just becoming another audiophile phenomenon that we will be unable to explain or measure...

USB cables are now part of the religion of audiophilia... whohooo!


Will my el-cheapo USB cable sound better if I put a Kimber sticker on it?
post #83 of 134
also, here's a tidbit to think about.

you can make 'n' digital copies and the bits still stay in place. you can go from DAT to DAT to DAT (yes, I started my digital days back when DAT taping just came out and all us tapers were concerned about this new thing called 'jitter') and you don't lose data and nothing, not even timing, is lost. timing is entirely based on original encoding at the source and not the cables, at least in a semi-buffered semi-realtime scheme like usb.

I used to build boxes that killed scms and allowed consumer decks to do digital transfers (of works they had rights to, of course). but it removed the politics from the DAT taping medium and now you could get bit-perfect copies as long as SCMS=00 (2 zeroes). if a tape had scms=00 on it, then any copy could be copied by even a consumer deck (iirc). anyone who wanted to 'GPL' (before there even was a GPL) their work would set the scms to 00 and then the tape would be easily copyable.

what if 100 people made serial copies of that tape? would copy 100 be the same as 1?

yes. in fact, even if you had 'wet noodles' as your interconnects, you'd still get all the bits there or not, no in-betweens. (OT: if you worried about data loss, you worried about using non-60-meter tapes on audio drives. data/dds drives could handle thin 90-meter tape but not the audio DAT mechanisms (it has to do with tensioning), and so you'd get digital 'buzzsaw' as your data loss!)
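That claim is easy to demonstrate in code: a lossless digital copy chain is just the identity function applied over and over, so the bits survive any number of generations. A trivial sketch (the bytes here stand in for a tape's contents):

```python
# A digital copy chain is the identity function: copy 100 is
# bit-for-bit identical to copy 1.
original = bytes(range(256)) * 4          # stand-in for a DAT tape's bits

tape = original
for generation in range(100):             # DAT -> DAT -> DAT ... 100 times
    tape = bytes(tape)                    # each dub just carries the bits over

assert tape == original                   # no loss, no "in-betweens"
print("copy 100 identical to the source:", tape == original)
```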

the moral is that the ONLY place jitter has any say in things is at the final d/a stage. you could have a long long chain of source-DAT-DAT-DAT...DAC and only the final DAC stage 'counts' for jitter. all - ALL - the previous 100 chains of d-d interconnect DO NOT MATTER. there never is data loss along the interconnects and there is enough timing (self-timed) to get things to work. the only time you need a master clock is for smpte (many things that need sync when mixed). self-clocked spdif is Just Fine(tm) and has been since day one.

what I'm also saying is that a collection of data (packet, whatever word you want) has to be buffered (i.e., received) by the last DAC stage and then decoded. the days of 'pure realtime' are long gone, and elasticity exists in buffering in software, firmware and even pure hardware.

in the usb case, you receive data, then you SELF CLOCK IT to your own local analog stages (ultimately). it's YOU to blame (the local board and system) for jitter in this case, NOT THE CABLES!

the cables are way too far away in the buffering of things to matter, guys. at least in the usb case.
post #84 of 134
Linux,
That is an interesting viewpoint. However, you still have to remember that USB cables use voltage swings to transfer data. Any cable that carries electricity is affected by dielectric dispersion and absorption: if the dielectric can't fully drain, then instead of a 101 you get a 111, because the voltage doesn't drop instantaneously and you're left with nearly half the voltage before the next swing, which can be misread. On the order of 44,100 swings a second, that can produce some nasty foul-ups. You also need to account for the fact that most DACs run in isochronous adaptive mode, so there is no error correction or two-way communication. The cable can also introduce slight timing differentials between the bits in a packet before they reach the buffer, so you're only buffering data that already has a fair amount of jitter.
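The dielectric argument above can be illustrated with a crude simulation. All the numbers here are invented for the sketch (real USB signaling is differential and far faster than this): the line voltage decays exponentially on a '0', and if the decay time constant is too long relative to the bit period, a '0' after a '1' is still above the receiver's threshold and gets misread.

```python
# Toy illustration (made-up numbers, not USB specifics): if the line's
# stored charge drains too slowly relative to the bit period, a '0'
# following a '1' can still sit above the receiver threshold.
import math

def received_bits(sent, bit_period, tau, threshold=0.5):
    """Crudely model the line as charging instantly on a '1' and
    decaying exponentially (time constant tau) on a '0'."""
    v = 0.0
    out = []
    for bit in sent:
        if bit == 1:
            v = 1.0                           # driver pulls the line high
        else:
            v *= math.exp(-bit_period / tau)  # dielectric drains for one bit
        out.append(1 if v > threshold else 0)
    return out

sent = [1, 0, 1]
print(received_bits(sent, bit_period=1.0, tau=0.1))  # fast drain -> [1, 0, 1]
print(received_bits(sent, bit_period=1.0, tau=5.0))  # slow drain -> [1, 1, 1]
```

Whether any real cable's dielectric is bad enough to produce this at USB signaling rates is exactly what the thread is arguing about; the sketch only shows the mechanism being claimed.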

This does not mean go out and buy a ridiculous $500 USB cable, but don't get one with horrid dielectrics.

Dave
post #85 of 134
Quote:
Originally Posted by myinitialsaredac View Post
Linux,
That is an interesting viewpoint. However, you still have to remember that USB cables use voltage swings to transfer data. Any cable that carries electricity is affected by dielectric dispersion and absorption: if the dielectric can't fully drain, then instead of a 101 you get a 111, because the voltage doesn't drop instantaneously and you're left with nearly half the voltage before the next swing, which can be misread.
ALL the usb errors I have gotten have been due to windows software or bad hardware or even a faulty controller (design). almost never is the actual *implementation* flaky. it works or it does not. it delivers the bits, somehow the receiver picks them out and saves them to something (camera, flash card, disk, sound device, or even keyboard/mouse!).

usb is very very async (not isoch like firewire is). with all the slop that is designed into the protocol, you'd think 'bit timing' would not be one of the things affecting data delivery.

the ideal thing would be to have 'timestamps' labeled with each single data sample. duh! you'd think that would be obvious, right? they do that in a LOT of datacomms (my field of work) but they don't do that in audio. you have to extract clock from the data, in consumer spdif.

but even with noise and such on the cable, usb is differential (for one thing), so it's already pretty immune to a lot of 'cable things' that make one cable better than another.

once the frames or packets are pulled off the wire, they have to be assembled into data samples and assigned their own timestamps (so to speak) in timing. note WHO is doing this! this is AT the usb receiver side of things. he receives a datagram, he opens it (time goes by), he strips off headers and checksums and other protocol stuff (time still going by), and then when he has enough data he sends THAT to the 'internal dac' - and THIS is where timing really matters.
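The receive path described above (strip the protocol framing, then hand the samples to the DAC's own clock domain) can be sketched roughly. The packet layout here is invented for illustration and is not the real USB audio framing; the point is just that any timing the cable imposed on the bits is gone by the time the payload has been parsed.

```python
# Rough sketch of the receive path: whatever timing the wire had,
# it disappears once the packet is parsed - output timing comes from
# the local clock-out stage, not the cable. Field layout is made up.
import struct

def parse_packet(raw: bytes):
    """Strip a made-up 4-byte length header and 1-byte checksum,
    returning the 16-bit big-endian samples in the payload."""
    (length,) = struct.unpack(">I", raw[:4])          # header: payload length
    payload, checksum = raw[4:4 + length], raw[4 + length]
    assert sum(payload) % 256 == checksum, "bad packet"
    return list(struct.unpack(f">{length // 2}h", payload))

# Build a packet, then parse it back out.
payload = struct.pack(">4h", 100, -200, 300, -400)
raw = struct.pack(">I", len(payload)) + payload + bytes([sum(payload) % 256])
print(parse_packet(raw))  # -> [100, -200, 300, -400]
```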

and the waveform on the usb cable matters about... 0%. it's digital, guys, and a LOT of trash can be on a digital signal and the bits STILL get through.

again, I've personally seen something close to this 100-digital-deck chain (grin) and the bits really do get there. that says a lot about how GOOD spdif really is. people need to give more credit to those that designed this since it has been doing a pretty OK job of things all these years.

I would care more about long, long cable lengths, but that's true for ALL usb (even more important for my remote usb disk!). as long as the cable is within spec and the receivers are doing their jobs, the bit delivery works Just Fine(tm).

blame the dac or its stages. stop blaming cables in the digital world.
post #86 of 134
I found an interesting discussion about jitter in this long thread with Dan Lavry and some folks from Apogee.

PSW Recording Forums: Dan Lavry => Proper word clock implementation

I recommend anyone interested in the subject of jitter read it in its entirety. Many differing opinions are shared there about jitter in passive components such as cables, resistors, capacitors, etc., but unfortunately the thread was closed before some important questions about the subject could be answered.
post #87 of 134
as an aside, wouldn't it be fun if usb audio went faster than 1.1 speeds?

let's assume a tcp-like connection (grin). you have sequencing, timestamps, timeouts and retries - all the basic things you need to keep a serial connection 'going'.

the trick would be to have a transmission speed far exceeding the data output speed. this gives you enough time to truly do NAKs (I didn't get your last frame, please resend) and be really a reliable transport.

burst at 10x the speed, maybe. send a packet (with error detection, no need for error -correction- as much anymore with retries now in the protocol) and then you have some 'slop time' that can be used (or not) if needed for retries or whatever. for the next 10n timeframes, the dac is going to be 'slowly' clocking out that data at its own local rate.

this would give near 100% isolation between timing (elasticity) of source and sink.
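A toy model of that hypothetical burst-and-retry scheme (purely illustrative; as the next posts note, no shipping USB audio mode works this way): because the link runs much faster than playback, there is slack time to resend any frame that gets corrupted, and the receiver still never underruns.

```python
# Hypothetical burst-and-retry audio link: transmit far faster than the
# playback rate so bad frames can be NAKed and resent within the slack.
import random

random.seed(7)

def send_with_retries(frames, loss_rate=0.2, max_tries=50):
    """Simulate a reliable link: resend each frame until it gets through."""
    delivered, total_transmissions = [], 0
    for frame in frames:
        for _ in range(max_tries):
            total_transmissions += 1
            if random.random() > loss_rate:   # frame survived the wire
                delivered.append(frame)
                break
        else:
            raise RuntimeError("link too lossy even with retries")
    return delivered, total_transmissions

frames = list(range(100))
ok, sent = send_with_retries(frames)
assert ok == frames                  # bit-perfect despite 20% frame loss
print(f"{sent} transmissions for {len(frames)} frames")
```

With 10x headroom, even heavy frame loss costs only extra transmissions, never data - which is the "near 100% isolation between source and sink timing" being proposed.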

will the industry ever do this?

usb audio isn't like this, today. but it's still not a pure realtime 'hot off the presses' kind of datalink, either.

usb1.1 is yesterday's news. everyone can run at 2.0 speeds today and even 2.0 is fast enough to allow some retries (some) and still not lose data.

I bet eventually there will be true 2.0-speed audio devices, and then there shall be NO excuse for 'cable magic' on those.
post #88 of 134
Linux,
USB audio is not async in almost all implementations; it is isochronous adaptive. There are no retries. This differs from USB hard drives and the like, which use bulk-mode transfers where timing doesn't matter at all, just the correct bits. Most of the time the bits do arrive correctly, i.e. a stream of 101010101 arrives just as 101010101, but it could arrive as 1 wait 0101 wait 01 wait 01, i.e. have jitter. This could be caused by the dielectrics being unable to drain fast enough to go from 1 to 0. In a perfect situation the dielectrics would drain instantly, but in reality instantaneous change is impossible.

As for 2.0, the best of all worlds would be a 2.0 data speed with a huge buffer of many seconds, using bulk-mode transfers with an offboard ultra clock. Actually, a company is already doing this; I just don't remember which one, but I believe the DAC was between $3,000 and $5,000. They also neglect to tell you which chip they use.

Cheers,
Dave
post #89 of 134
I will buy a gold-plated one, for the possible difference...
post #90 of 134
Thread Starter 
I just can't spend money like that on a USB cable when there are so many tubes to buy! But ya never know, maybe when the wife isn't looking.
This thread is locked