Supra USB cable and my findings/predictions!

Discussion in 'Sound Science' started by manueljenkin, Nov 27, 2019.
  1. manueljenkin
    I'm pulling in a few technical posts - nothing relating to measurements, but rather guesses about what I'm hearing. Please let me know if this violates any rules.

    I received my Supra cable two days ago and have been comparing it with the stock cable on the Apogee Groove. I believe I'm hearing a difference, but can't say for sure. I tried around 30 swaps and gave it two days. At the moment, I'm kinda convinced that there is a difference. Unfortunately I'm not sure the difference is for the better: the Supra maybe sounds softer, and the bass has a weird character (off phase or something, maybe?). Maybe the cable is worse, or maybe it is revealing more of the issues in Windows (more in later passages).

    There does seem to be a possible technical reason, since I've had serious issues choosing music players for Windows - almost every single music player sounds different (even out of ASIO). I can easily tell blind between foobar2000 and Winyl, both in ASIO configurations, and the other music players I've tried - MusicBee and a few others - were a total disaster in comparison: very low-passed and dull. The stock Groove player is terrible, with serious buffering artefacts and a heavy amount of low-pass filtering. There are songs where I need to replay from the beginning to get proper output on the Groove player (if I scrub around, some details go missing). In a lot of ways, the stock cable vs the Supra feels similar to the difference between Winyl and foobar2000. I'll try on a Linux machine at a later time and update my post.

    I can kind of sum it up this way.

    The difference between
    1. Foobar 2000 and winyl
    2. Apogee groove and geek out
    3. Supra cable and stock cable
    are relatively similar-feeling phenomena.

    The latter in each pair sounds more in-your-face and slightly forced; the former sounds a tad too unforced. I think that unforced quality comes from Windows. I can hear changes in multiple areas, but it doesn't really translate into "better". I am running out of my Surface Book, and I know for sure that Windows schedulers are not well optimized for audio. It's hard to analyse anything of this sort on Windows. Will check on Linux and Mac soon. I'll also try the same with a DAW, which might force Windows to prioritize audio (since it'll claim a lot of memory for its processes); that may be a much better ground truth. Hopefully I'll get a better clue. The underlying cause of Groove vs Geek Out is different, though: the Geek Out lacks dynamic range (post 8826 - https://www.head-fi.org/threads/sennheiser-hd-700-impressions-thread.612502/page-589). The other two may be scheduler artefacts or channel artefacts.

    Please do try the Winyl vs foobar2000 comparison (both in ASIO). It's super audible on almost any gear; I'm just confused about which one is right. Anyone can AB-X this 10/10, since even the perceived loudness is different, apart from the channel-separation changes it brings. I still have to think more about the USB difference. If there were errors on the stock cable, it shouldn't sound the same every time - errors are random, not constant. There are ways this might be masked in later stages, in the DAC filters!

    So some technical analysis of the same.

    USB has four pins - power, ground, data+, data-. Data is sent as a differential pair, so there's little chance of noise issues; the remaining issues can be traced back to "timing" and transistor pull-up/impedance matching. Basically, every digital signal is still an analog waveform with an eye pattern; it is just discretized into usable states, and it still needs to be sampled by transistors at the PHY layer. There's also the concept of impedance matching, to keep those transistors at the optimal point on their load curve - 90 ohms is the recommended characteristic impedance, IIRC. I think the difference has not much to do with noise and isolation, since USB is a differential signal; it's about stress on the PHY layer, which is what samples the signal back into the circuit.
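The common-mode noise rejection of a differential pair can be shown with a toy model. The voltage levels and noise values below are made up for illustration; a real USB PHY deals with analog eye diagrams, not clean numbers like these.

```python
# Toy demonstration: differential signaling cancels common-mode noise.
bits = [1, 0, 1, 1, 0]

# Encode each bit as an idealized (D+, D-) voltage pair (volts).
d_plus  = [3.3 if b else 0.0 for b in bits]
d_minus = [0.0 if b else 3.3 for b in bits]

# Inject identical "common-mode" noise onto both wires, as coupled
# interference would do to a twisted pair.
noise = [0.4, -0.2, 0.3, 0.1, -0.5]
d_plus_noisy  = [v + n for v, n in zip(d_plus, noise)]
d_minus_noisy = [v + n for v, n in zip(d_minus, noise)]

# The receiver looks only at the difference, so the noise cancels exactly.
decoded = [1 if (p - m) > 0 else 0 for p, m in zip(d_plus_noisy, d_minus_noisy)]
assert decoded == bits
```

Differential-mode noise (hitting only one wire) would not cancel this way, which is part of why the pair's impedance and twisting matter.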

    The biggest changes I've been able to perceive so far are 1. channel separation, and 2. a sense of dryness in the bass (similar to a polarity/phase offset). It's not that the soundstage is wider or anything, but the sides feel more well defined while the bass feels enigmatic - works in one song, doesn't work in another. I do think the channel separation has an explanation (https://www.usb.org/sites/default/files/audio10.pdf, 3.4 Inter Channel Synchronization): "It is up to the host software to synchronize the different audio streams by scheduling the correct packets at the correct moment, taking into account the internal delays of all audio functions involved". I think this is an old USB spec, but the point should still hold.

    I pretty much suspect that USB is prone to errors - otherwise, why would the spec define such elaborate sequences for error detection and re-transmission in the transfer modes used for file transfer?

    Basically usb has different modes for transfer of data.

    The modes used for network/file transfer support error detection and re-transmission; that's not true for audio. Audio uses isochronous transfers (where a specific timing is maintained and data is sent uniformly over time): at USB high speed, one transfer opportunity per 125 µs microframe, i.e. 8,000 microframes per second. At 44,100 Hz that is 5.5125 samples per microframe, so the sender has to alternate between packets of 5 and 6 samples. Naively alternating whole seconds - 5 per microframe one second (40,000 samples), 6 the next (48,000) - would average only 88,000 samples over two seconds instead of the 88,200 needed, so the 200-sample shortfall would have to be handled (interpolated, or better, the packet sizes varied within each second). If your audio were at 48,000 Hz, the rate divides evenly and no such juggling would be necessary.
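As a rough sketch of how a sender can juggle the 5-vs-6 samples-per-packet arithmetic for 44.1 kHz (assuming 8,000 isochronous service intervals per second; this is an illustration of the arithmetic, not any real driver's code), a Bresenham-style accumulator keeps the long-run average exact:

```python
# Pick samples-per-microframe for 44.1 kHz over 8000 microframes/sec.
SAMPLE_RATE = 44_100
MICROFRAMES_PER_SEC = 8_000

def packet_sizes(n_microframes):
    """Return the per-microframe sample counts; the accumulator carries
    the fractional remainder forward so no samples are ever lost."""
    sizes, acc = [], 0
    for _ in range(n_microframes):
        acc += SAMPLE_RATE
        n = acc // MICROFRAMES_PER_SEC   # 5 or 6 samples this microframe
        acc -= n * MICROFRAMES_PER_SEC
        sizes.append(n)
    return sizes

sizes = packet_sizes(MICROFRAMES_PER_SEC)   # one second's worth
assert sum(sizes) == SAMPLE_RATE            # exactly 44,100 samples/sec
assert set(sizes) == {5, 6}                 # mostly 5s, with 6s mixed in
```

With this scheme there is no 200-sample shortfall to interpolate: the count varies within the second and the average comes out exact.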

    An important thing to note: this 125 µs polling interval is an override. The general polling interval of USB (full speed) is 1 ms for the other transfers, so we are polling at least 8× as often, which is more stressful and more jitter-sensitive.

    For asynchronous audio (USB 2.0 onwards), a 1 ms polling interval is used with plenty of buffering (negotiated between the DAC and the host during initial enumeration), and the clock is now determined by the DAC. The DAC requests data when it needs it, and the host is supposed to buffer and send it following the clock of the USB interface in the DAC. So the effect of jitter has been reduced quite a bit - but now the computer needs to make sure it responds to these requests in time. It is still not error-correcting, since there isn't enough time for re-transmission. Also, USB transfer is via serial data packets: your volume-control info and the data for both audio channels are sent together in a specific framing structure. The XMOS (or other) interface chip is supposed to decode the per-channel info and send it to the I2S interface of the actual DAC chip. It's unclear how it copes if it gets an erroneous packet or if the stereo information is messed up (a shuffle, glitch, or jitter is enough to create sampling artefacts and mess things up).
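The mechanism the DAC uses to "request data when it needs it" is the explicit feedback endpoint: the device periodically reports how many samples per frame it is actually consuming, encoded as fixed point (commonly described as 10.14 format at full speed and 16.16 at high speed). A minimal sketch of the 16.16 arithmetic, with a made-up 100 ppm clock drift for illustration:

```python
# Sketch of asynchronous USB audio explicit feedback (16.16 fixed point).
# The drift figure below is invented for illustration.

def encode_feedback_16_16(samples_per_frame: float) -> int:
    """Encode a rate (samples per frame) as 16.16 fixed point."""
    return round(samples_per_frame * 65536)

def decode_feedback_16_16(raw: int) -> float:
    return raw / 65536

# A DAC whose local clock runs 100 ppm fast consumes slightly more than
# the nominal 44.1 samples per 1 ms frame:
actual_rate = 44.1 * (1 + 100e-6)
raw = encode_feedback_16_16(actual_rate)
host_view = decode_feedback_16_16(raw)

# The host now schedules slightly larger packets on average, keeping the
# device-side buffer from slowly draining or overflowing.
assert abs(host_view - actual_rate) < 1 / 65536
```

The point is that the host tracks the DAC's clock rather than the other way around, so host-side timing errors show up as buffer-level wobble, not directly as sample-clock jitter.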

    One may wonder: what's the big deal? You'll understand once you understand that the essence of a DAC is its "timing". Pro interfaces have their own master clocks and interfaces (like the Focusrite RedNet).

    https://www.xmos.com/file/fundamentals-of-usb-audio

    A few quotes I've found on online forums that I'm re-quoting (I'm not exactly sure of their validity):

    1. 90 ohms is the USB standard impedance for signal transfer. Cables, connectors, traces should all have this characteristic from what I gather from the USB spec. It is actually +- 15%, so 78 - 104 ohms is within spec, 90 being the target. As to "why" they chose that spec, I am not sure, only that once such a spec is decided upon, maintaining it allows for better signal transfer. Here, we are talking about power and data, over separate wire pairs, all housed within the same cables, so I don't think the same rules should apply as with analog audio transmission or those of SPDIF digital audio data transmission. It seems more compatible with power and noise rejection and then allowing the connected devices to control the operations.

    2. The data is serialized on the wire and sent bit by bit. Loss of some bits doesn't necessarily result in corrupted audio but the information ends up being altered because of that. There is no error correction or re-transmission when an error is detected, the receiver is likely to just drop a sample and extrapolate the missing value from the surrounding known good samples. So it is at least theoretically possible for a USB cable to have an effect on SQ, I'm just surprised it doesn't take an obviously crappy cable to hear the difference.

    3. There is no re-transmission in Isochronous Transfer mode; the best the receiver can do is CRC-check, drop the broken sample(s), and interpolate. Seems that USB jitter is actually a problem, or at least was in the early days. Now that the problem is well understood there are solutions to deal with it. Still, less jitter to begin with is always better, I would think.

    4. (Commenting on the XMOS audio fundamentals document) - Interesting read. But the isochronous, control, and interrupt mentioned are the raw USB interfaces, or building blocks. If you scroll further down, they talk about sending extra sample data (8 extra samples per second). Doesn't this indicate that something at the receiving end is doing something with those samples to ensure data accuracy? What happens with those extra samples? The fact that you're sending any extra data indicates that the receiver is not blindly passing bits to its DAC; it must be doing something with all of the incoming data before passing any of it along, otherwise the extra sample data would disrupt the audio signal.

    5. Guys that keep talking about 1's and 0's.......there's gotta be more to this. Too many people are claiming they are hearing the same things/differences. Speaking of specs, I scanned through the USB 2.0 specification and I'd say 25% of the document is devoted to error detection, recovery, re-transmits, etc., and the document was written assuming the transmission media that is within the spec. From that, even being "within the spec", doesn't necessarily mean error free to me.
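The "drop and interpolate" concealment that quotes 2 and 3 describe can be sketched in a few lines. Whether any given DAC or USB receiver actually does this is an assumption - the behaviour is implementation-specific and mostly undocumented - but the idea itself is simple:

```python
# Toy sketch of dropped-sample concealment: replace a sample lost to a
# CRC failure with the mean of its neighbours (linear interpolation).
# Hypothetical; real receivers' error handling is implementation-specific.

def conceal(samples, bad_index):
    """Return a copy with the lost sample interpolated from its neighbours."""
    fixed = list(samples)
    fixed[bad_index] = (samples[bad_index - 1] + samples[bad_index + 1]) / 2
    return fixed

stream = [0.0, 0.5, 0.9, 1.0, 0.9, 0.5, 0.0]   # made-up slice of a waveform
repaired = conceal(stream, 3)                   # pretend sample 3 was corrupted
assert repaired[3] == 0.9                       # (0.9 + 0.9) / 2
```

For a smooth, oversampled signal, one such patch per millions of samples is far below anything resembling the broad tonal changes reported in cable comparisons.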

    Last but not least, I searched around and saw people online independently describing the same perceived change, so there must be more to this. https://www.hifisystemcomponents.com/forum/usb-cable-burnin_topic1613.html

    I think if I had access to an XMOS (or other) USB interface that mimics the DAC function - except with the DAC replaced by a logger that stores the I2S stream in memory - it would be possible to test what is happening.

    Thanks and Regards,
    Manuel Jenkin.
     
    Last edited: Nov 28, 2019
  2. bigshot
    I think you fell into the wrong forum. You should describe how you conducted your listening test, not describe your conclusions. We have the right here to ask you for proof. Get ready.

    Go.
     
    Last edited: Nov 28, 2019
  3. manueljenkin
    Well, I asked here for, umm, guidance on USB timing and error analysis when it is in asynchronous mode. Honestly I'm still not sure if it's real or placebo, but I'm leaning towards the former.

    How I conducted my listening: myself swapping cables, as simple as that. Which is why I'm still having "doubts". I don't have anyone beside me to help with AB-Xing, nor do I have two identical DACs lying around to keep everything else balanced.

    The issue with the Windows Groove player is really evident, and anyone can AB-X it: it has a poorly implemented audio stack and some sort of low-pass filter. My friend is looking into the source code of Winyl to see if it does anything artificial on purpose. You can AB-X Winyl for yourself (ASIO on Winyl vs ASIO on foobar2000).

    As for proof of the audio-cable thing: "I think if I have access to an XMOS/other USB interface that mimics the DAC function, except replacing the DAC with a logger that stores data from the I2S stream in memory, it is possible to test what is happening." If anyone can guide me on this I can most likely show proof - or even a bit error rate tester (BERT) that interfaces to USB in asynchronous audio mode.

    Also, I'm not gonna defend anything until I have concrete proof. I was mostly asking for guidance on the topic, describing what I intend to measure and the scenarios that could trigger it. Most of the post is just reiterating the protocol's specs.
     
    Last edited: Nov 28, 2019
  4. castleofargh Contributor
    to paraphrase some smart guy whose name I forgot: we know something once it has been demonstrated as true.
    in your situation, you have a feeling of something sounding different, but you have not proved (to yourself or to us) that your impressions are caused by audible sound differences. it's something that right now we don't know. you have your impressions, and obviously some ideas about those impressions and where they could come from (so do I), but you don't know yet if sound differences are causing them. until you get the opportunity to fact-check that in a controlled experiment, I find it rather dangerous to make up possible scenarios for why you would be correct. there is a good reason why the scientific method tries to disprove ideas instead of supporting them and going full confirmation bias on audiophile myths.

    but once we get to the point where we know that sound did cause your impressions of differences, we'll have a good reason to wonder why, and I for one will be interested in discussing it. but I'm not interested when the little information I have so far strongly suggests human biases and poor testing conditions.
     
  5. manueljenkin
    I'll try to see if I can source identical setups (other than the cable) to do an AB-X. But in parallel, I'd like to see if I can measure something, which is why I made the request at the end. I have over a year of VLSI experience - validating digital designs with their protocol schemes. It's a vast sea and I have only touched the tip of it (and I never handled PHY layers much), but I still think this isn't outside the realm of possibility, which is why I'm interested. I rarely care about hi-fi audio in the purest sense, since I personally enjoy my $20 Bluetooth speaker; I'm more interested in the science. Hope I can get the help I desire. Maybe I posted in the wrong place?
     
  6. bigshot
    Isn't it interesting how bias works? You can clearly state a bias and you don't even realize you're doing it!

    Measuring is fine. But have you done a controlled listening test yet to make sure you aren't just hearing bias and placebo? Easier to do that than to try to justify your bias with engineering theory.
     
    Last edited: Nov 28, 2019
  7. manueljenkin
    I have already said the purpose of this post: partly to get more enlightenment about the protocol from people who have actually worked in design/validation of its internal layers, and partly to get advice on equipment that can help me check whether anything is happening as I am assuming. I wrote out the full sequence to show that there is a *possibility* of change, looking at the polling times and synchronisation sequences.

    Even I'm still skeptical of most of the things I'm describing as perceivable, since I personally haven't done an AB-X. All I'm saying is that I'm not discounting the chance of there being actual - perhaps even measurable - differences. I care more about understanding the protocols, which is why I'm taking the effort.
     
    Last edited: Nov 28, 2019
  8. bigshot
    If you can't hear the difference, what difference does it make? The first step in testing something like this is determining whether an audible difference exists. Trying to theorize why there *might* be a difference before proving one exists is putting the cart before the horse. All you're doing is strengthening your bias.

    There is absolutely no reason to believe that there is an audible difference between properly functioning USB cables. If you want to say there might be a difference, start by proving that.
     
    Last edited: Nov 28, 2019
  9. manueljenkin
    A difference is a difference, audible or not. I stated my intentions: to understand the physical workings and limitations of the protocol. I thought this was a "science forum" and that approaches to measuring new things were encouraged, regardless of whether they help in perceivable ways or not.
     
  10. captainsubtext
    Oh this guy again.
     
  11. bigshot
    Yes, and a difference you can't hear doesn't matter.

    What we do here in this forum is to apply science to achieve optimal sound quality. This isn't a science forum. It's a forum about the science of sound. Sound is audible. You can feel free to talk about things that are purely theoretical. But they don't involve sound.
     
    Last edited: Nov 29, 2019
  12. castleofargh Contributor
    I have to disagree with this. this hobby can get us interested in a great many non-audio things simply because they involve audio gear. saying that if it's not audible it doesn't matter is merely your opinion. each person can decide what matters to him or doesn't. personal interest can go well beyond audibility.


    audio gear, and even more so amateur audio, can move away from standards by quite a lot. so when surprising things happen, it can really be a case-by-case thing. plus, USB standards have changed over time while audio protocols are mainly stuck on USB 1 (I think everybody blames Windows for waiting too long to change the plug-and-play drivers). power over USB doesn't apply to all DACs - many are self-powered despite using USB - so just for that it might be worth studying both typical situations (although 5 V DC doesn't seem like something that would create mayhem for nearby cables). I'm guessing some cables, if we were to really plug in anything we happen to have in our houses, could affect the total voltage at the DAC (when USB powered). but I have no idea when that would become audible; I'm guessing each DAC has its own way to manage/filter power. and if we think of this as important, I'd bet that the USB card in the computer has a much bigger role in supplying the right voltage, or supplying it clean.

    this just doesn't hold. either we have a significant number of errors, and then the communication would simply fail, because the bits transmitted aren't just the audio signal and errors would also occur outside the audio content; or the number of errors is trivial (which is what I've always seen from the guys who measured them), and then how audible would one altered sample out of millions be? whatever impressions people report are simply incompatible with that idea of bit errors IMO. I'm guessing the massive oversampling happening in almost all modern DACs would suffice to greatly reduce the impact of any false sample properly read as another value.

    same as above. if we have lower jitter, we measure lower jitter. we can of course consider that better, like with any variable, but the impact on the audio signal is another story. most people (outside marketing!!!!!!) agreed that jitter was a non-issue on typical consumer products decades ago. also, how much impact will a cable have on jitter? again, I would assume that the computer and DAC are more relevant, if it mattered at all.

    I'm a noob at this and only read related papers a bunch of years ago while looking for something specific. so hopefully someone else will comment.

    of course there is more than 1s and 0s. there is noise, there is timing, there is the possibility of an error for whatever reason. what's important here is to keep in mind the typical magnitude and potential impact of those in a typical setup. and to be aware that if something were massively problematic, the industry would already be implementing all the tricks they can think of to mitigate it (as any half-decent designer would do about noise from any origin). and once again, the role of a passive cable in all that can be expected to be trivial, except in cases where something was initially bad and only needed that extra push, or where something very atypical is going on with the setup or the environment it is in. neither should serve as textbook USB behavior when they're clearly not it.

    "Too many people are claiming they are hearing the same things/differences". the number of people holding a belief does not measure how true something is. and as I said in my first post, when most people don't have a clue how to run a listening test, it is crucial to take their feedback for what it most likely is: probably wrong information. we fall for the heuristic trap where, when we lack information, we take the highly unreliable information we have and extrapolate from it anyway. that's human. but in such instances it's also really bad, and an urge we need to actively and consciously fight.

    the USB standard obviously needs to be super tight on errors and on what to do when they happen, because a single error in a program's code or data can ruin it depending on how it's encoded; so, like any other transfer protocol, it must be able to deal with errors no matter how unlikely. that says nothing about USB audio and the impact of errors, or how often they occur.


    about checking whether you have changes and whether they are audible: of course I advise you to do it if you're curious about this. of course I would suggest first confirming that you're hearing what you think you're hearing. then you could try to measure stuff in a loop, to put a finger on what is changing and how loud it is. I've tried random USB cables, including some that were specifically intended for charging gear and nothing else (not even sure why the other connectors and wires were there - maybe a needed handshake for the gear to accept USB signal? IDK). and I've had some really crappy cables that were basically not to spec at all. one somehow increased crosstalk in RMAA for some reason. again, I don't understand why at all, but I could get that repeatedly, so real or not, there was a change that at least tricked RMAA into showing that weird stuff. but I would never imply that this is something we have to worry about, or evidence that USB cables have a deep audible impact on sound, because those cases were abnormal - just like those USB cables I got once that measured like 20 ohms; they cost nothing and were worth pretty much that ^_^.
     
  13. manueljenkin
    Update: Winyl vs foobar2000 was measured, and there was a noticeable harmonic spike in foobar2000. My friend measured it. I'm not saying this change is audible; what I'm saying is that something is happening which is causing multiple changes, one of which is this. I'll PM you the measurements in some time.

    On the other hand, there are reasons why I'm still convinced about the erroneous-transfer part. I did get timeout errors with the Apogee Groove on ASIO at times, out of the blue. And on my NX4 DSD, the stock cable was slightly loose, and it often gave me address decoder errors. Frequency of dropped packets: I'd love to see if anyone has measured that in "asynchronous audio mode".

    Yeah, I'm not really validating subjective opinions with another subjective opinion. I'm only saying that there could be something to this that can be probed further. Most others may not be interested, but I'm curious.
     
  14. castleofargh Contributor
    As I've been using foobar2000 to play test tones for plenty of measurements, I'm gonna go out on a limb and say that what you're talking about is either a special case or so low that I can't measure it with my stuff. Maybe something specific to one version that got rapidly fixed? Or could it be that your friend just measured clipping?
    About having errors: sure, it happens, and in fact I'm tempted to believe that repeating errors are more likely to break the stream than to change how we hear the music. The number of errors per minute needed to crash the stream is probably much smaller than the number needed to audibly alter the music. How many errors, of what type, do we need to statistically cause a DAC to stutter or stop? IDK. It's probably DAC- and setting-dependent. With ASIO the default aim is often to reduce latency to a minimum; maybe that results in using a smaller buffer? Again, IDK. I'm letting myself fall into low-confidence conjectures. Bad me!

    Now consider a cable and what power it has to cause or stop an error, or to add some amount of noise that will cause an error - and what needs to hit the cable to create that magnitude of noise? It's a passive cable; it's not even supposed to act as a high-pass or low-pass filter. All USB cables are supposedly shielded, and they should all have relatively close electrical specs by standard (unless the plug is utter garbage and fits super loosely, in which case all bets are off about the impedance, and for all I know walking around the house might affect the signal in some extreme cases). How much difference should one proper USB cable be able to create compared to another? Not a lot. Everything you know about electricity and passive conductors should tell you that. If you do get a noticeable difference (and you know it :wink:), then I'm inclined to believe that something is wrong with one of the cables tested, or at least with one of the devices it's plugged into - a device so close to becoming unstable that the tiny difference from a new cable can push it over the edge. I'm sure such circumstances exist, but they should be rare and not considered normal, IMO.

    You can post the measurements here; I don't think anybody in Sound Science will cry at seeing actual measurements ^_^. And foobar2000 is well known, so someone is going to be interested in testing their own with the type of measurement that was used. Unless you don't want this topic to be sidetracked at all (too late :sweat_smile:), in which case put it in another thread.

    edit: capitalization power!!!!
     
    Last edited: Nov 29, 2019
  15. bigshot
    Interest in purely theoretical things is fine, but for the purposes of home audio, if you can’t hear it, it doesn’t matter. The problem here is that audiophoolery uses purely theoretical stuff that doesn’t matter to plant seeds of doubt that “maybe it does matter if my ears are sensitive enough”. Then the slope gets as slick as a bucket of oil when people start letting their ego convince them that they do have extraordinary hearing ability. “Well, *I* can hear it plain as day... If you can’t it must be your ears or your lousy stereo.”

    What we are seeing when people say “I don’t know if it’s audible BUT...” is delusion and denial. They aren’t saying “I know no one can hear it BUT...” They’re using it as a wedge to open a door without doing it honestly with the key... a controlled listening test to prove audibility.

    It sounds to me like you have a defective cable, whether by error in manufacture or design. I would take whatever measurements look fishy and demand a full refund from whoever sold it to you. I wouldn't waste any time thinking about it, because the longer you wait, the harder it is to get a refund. You can get an Amazon Basics USB cable for a few bucks that performs flawlessly. It is a waste to spend a lot of money on wires.

    If you’re looking for something to think about and research to figure out the science of how it works, get an inexpensive cell phone, personal computer or blu-ray player. I never fail to be amazed and impressed at how well common consumer electronics perform. It’s a marvel, actually.
     