Why do USB cables make such a difference?
Status
Not open for further replies.
Sep 9, 2017 at 8:53 PM Post #151 of 1,606
If you have noise or degradation or timing issues, it (correct me if I'm wrong please Greg) is going to result in pops, dropouts, and jitter.

Only if the level of noise is significant enough to distort the digital waveform (see here: http://lampizator.eu/LAMPIZATOR/TRANSPORT/CD_transport_DIY.html). The noise that is of concern is at a level low enough not to affect the digital signal, but high enough to affect the analog components. The main issue in isolating that noise relates, on the one hand, to the availability of components (i.e. transformers, chips) that can isolate noise from a USB source while still allowing full USB 2.0 speeds (Schiit found that network isolation transformers work well for that) and, on the other hand, to removing noise generated by the USB components themselves.

I still don't understand why you chose to pull me up on that but not theorist,

You "pulled up" theorist very well, and for the most part I agree with your responses to him, so I don't need to duplicate that. Going by my own experience, I don't doubt that he may have experienced a difference between USB cables, but what he attributes that difference to seems wrong to me as well. The difference is that I know of perfectly reasonable, technically sound reasons why he (and other people) may have experienced what they have.

The two main problems I see here are, respectively:

1. Someone believes they hear a difference after swapping a cable. Instead of asking "Why is this happening? What don't I understand about electronics that could help me improve my system without expensive and unnecessary trial and error?" they may end up arguing from incorrect assumptions, as theorist did.
2. Someone else believes that the first person's experiences are BS, but instead of asking "Why are they experiencing this? What could be a possible technical explanation if what they are experiencing is genuine?" they end up arguing from the point of view of their limited knowledge.

The irony is, I now know two companies who, respectively, solved pretty much all analog and digital cable issues using basic electrical knowledge. One I've already mentioned.
 
Sep 10, 2017 at 10:35 AM Post #152 of 1,606
2. Someone else believes that the first person's experiences are BS, but instead of asking "Why are they experiencing this? What could be a possible technical explanation if what they are experiencing is genuine?" they end up arguing from the point of view of their limited knowledge.
[3] The irony is, I now know two companies who, respectively, solved pretty much all analog and digital cable issues using basic electrical knowledge.

Firstly, I do not believe that theorist's experiences are BS! I do not dismiss his experiences, just his explanation of them. There are two possibilities:
1. He has a faulty DAC, although it's hard to see how a so-called audiophile-grade USB cable would make any difference as far as the data is concerned.
2. There are some biases and/or other issues of perception at play. I realise that for many audiophiles even the suggestion of such a thing appears to be some sort of heinous insult. Not for me though; in fact, if anything, the exact opposite. If it weren't for the biases and quirks of human hearing perception, it would be impossible to appreciate music, films and TV, and my job, and those of the many thousands of others who work in my field, effectively depends entirely on understanding how to trick and manipulate this biased perception.
3. Even more ironic as far as I'm concerned! I'm sure many here, maybe even the majority, have more experience of audiophile equipment than me. I own, and have owned, some pieces (mainly transducers) of what would probably be described as mid-level audiophile equipment, and I have heard some of the ridiculously priced audiophile stuff. However, I have worked extensively with high-end pro audio equipment and, on many occasions, with equipment at the lower end of the pro audio scale. I've used relatively cheap USB audio interfaces over the course of the last 15 years or so from EMU, RME, Focusrite, PreSonus, Digidesign, Yamaha and various others, and here's the thing: we're talking typically about 16 x 16 ins/outs, which in normal usage ends up being about 8-10 times more USB 2 data per second than an audiophile stereo DAC ever has to cope with, and they do it typically for 50 or so hours a week, week in, week out, flawlessly. Last year I was working with a USB unit (Antelope, if I remember correctly) simultaneously running 30 inputs, a main mix out and two different stereo cue mix outs, all at 24/96, over a rather decrepit-looking old no-brand USB 2 cable: no data corruption in the double session, and impressively clean/quiet.
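As a rough sanity check on the data-rate comparison above, the raw PCM payload rates are simple arithmetic; the channel counts and sample rates here are illustrative assumptions, not the specs of any particular interface:

```python
def pcm_bandwidth_bps(channels, bit_depth, sample_rate):
    """Raw PCM payload rate in bits per second (ignores USB protocol overhead)."""
    return channels * bit_depth * sample_rate

# A stereo DAC at 24/96 vs a 16-in/16-out pro interface at the same rate:
stereo = pcm_bandwidth_bps(2, 24, 96_000)    # 4,608,000 bps (~4.6 Mbps)
pro = pcm_bandwidth_bps(32, 24, 96_000)      # 73,728,000 bps (~73.7 Mbps)
print(pro / stereo)                          # 16.0
```

With the stereo DAC running at 24/192 instead, the ratio drops to 8, which brackets the "8-10 times" figure quoted above.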
I don't know how many audiophile USB DACs have issues with power isolation/noise, but why are there any? The pro audio market has apparently been employing the appropriate "basic electrical knowledge" for years, and has far more demanding data-handling requirements and far noisier power and EM/RF environments to deal with than anything the audiophile market faces, and yet apparently some/many audiophile DAC manufacturers don't have, or don't apply, the appropriate "basic electrical knowledge" even though they're typically charging far more. The solution should be for more audiophile DAC manufacturers to drop some of the silly "audiophile" specs/formats and put a bit more effort into "basic electrical knowledge", but why bother when instead you can sell them even more rubbish at inflated prices to cure the problems which the DACs should never have been released with in the first place! If a USB DAC manufacturer also sells a USB recleaner/filter/purifier which supposedly improves the SQ, then why on earth didn't they put that circuitry in their DAC in the first place? The answer is that many audiophiles will fall for it and, better still, not feel duped and complain, but do and feel the exact opposite! I can't recall ever seeing a USB cable in a pro studio which cost more than about 20 bucks or so, and even mentioning audiophile cables or equipment there is likely to be met with open ridicule.

G
 
Sep 10, 2017 at 4:41 PM Post #153 of 1,606
Did you test the two cables side by side, ensuring the output was completely volume-matched, and completely blind, so all you are judging on is sonics? And that means multiple times, with no uniformity in which cable is being presented, so you actually have no visual or other cues. Because without that, you can't say they sound different.

I did not have the two beta samples (which became the Tellurium Q Silver) at the same time, but I compared them with my then system USB cable, a Tellurium Q Black Diamond. Obviously, I was aware of which cable was in the system, since I had to swap them. They were swapped numerous times with my amp's mute button pressed on, so at least at first listen (mute off) they would have had a completely matched volume (as far as I am aware, a USB cable can have no effect on system gain). If I had any initial expectation bias, it would be that the first tested beta would sound better than my Black Diamond cable, but it very clearly did not. I then rather grudgingly tried the second cable a few weeks later, with a clear expectation bias in my mind that I was wasting my time with all this cable swapping, as the new beta would probably not be a lot better than the first, and I really just wanted to listen to my music, but I felt obliged to do so. After sitting on the second beta for a number of days I finally swapped it into my system and was immediately impressed by the increased musicality/liveliness that I heard; I swapped it out, this went away; swapped it back -- there it was. So not a blind test, but for me it was rather conclusive.

Firstly, I do not believe that theorist's experiences are BS! I do not dismiss his experiences, just his explanation of them.

Thank you G for that. But could there not be a third explanation? I have a very simple but pretty good headphone-based digital audiophile system on a dedicated power supply, separate from the rest of the household and attached to pretty clean mains power. The system is star-earth wired and grounded (via a two-metre earth rod, two metres directly below the system) and the power and analogue cables are all high quality. In other words, the system is well sorted and very clean, i.e. the noise floor is very low. Compare this to your "far more demanding data handling requirements, power and noisy (EM/RF) environments to deal with than anything the audiophile market faces". Between yourself, Shannon (1948) and others, I now accept that the transcription of digital files, short of drop-outs, must be near perfect with any properly made USB cable -- thank you for that, again, since I said this before -- but this does not mean that the music in those files will be perfectly reproduced (just think of the unnatural-sounding pre-ringing induced by the digital filters in most DACs/ADCs -- what MQA is trying to fix), and here everything in the system may have an impact on the replication of the musical sound from the data file. Could this not also include USB cables, at least at very low levels and not directly related to the digital file transcription itself (i.e. EM/RF noise that has little or no impact on the digital data, but does have an impact on the DAC circuit, even one that is properly designed to spec)? I would happily blind test this, in a system with attention to detail similar to my system, and see if the listeners can tell the difference between USB cables or not.
 
Sep 10, 2017 at 6:44 PM Post #154 of 1,606
I did not have the two beta samples (which became the Tellurium Q Silver) at the same time, but I compared them with my then system USB cable, a Tellurium Q Black Diamond. Obviously, I was aware of which cable was in the system, since I had to swap them. They were swapped numerous times with my amp's mute button pressed on, so at least at first listen (mute off) they would have had a completely matched volume (as far as I am aware, a USB cable can have no effect on system gain). If I had any initial expectation bias, it would be that the first tested beta would sound better than my Black Diamond cable, but it very clearly did not. I then rather grudgingly tried the second cable a few weeks later, with a clear expectation bias in my mind that I was wasting my time with all this cable swapping, as the new beta would probably not be a lot better than the first, and I really just wanted to listen to my music, but I felt obliged to do so. After sitting on the second beta for a number of days I finally swapped it into my system and was immediately impressed by the increased musicality/liveliness that I heard; I swapped it out, this went away; swapped it back -- there it was. So not a blind test, but for me it was rather conclusive.

So not tested side-by-side, you knew which cable was which, and you formed an opinion based on auditory memory - which we know is extremely fleeting (more than a few seconds is simply not reliable). I'm not discounting what you think you heard - as I'm sure you believe there is a difference (and that is the filter our brain can put on for us). Talk to audio engineers or producers, and I'll guarantee that practically everyone will have an example of EQing a mix, spending hours getting it just right, and then finding that the EQ was never engaged at the time. You convince yourself that you hear a difference so you do. I've done it myself. Unfortunately, the only way to get a more valid test as to what is really happening in your case is direct comparison. I do not doubt for a minute that your experience of the perceived change is valid. What I doubt is whether it is actually real (the change), or audible.
 
Sep 10, 2017 at 6:52 PM Post #155 of 1,606
[1] But could there not be a third explanation?
[2] In other words the system is well sorted and very clean, ie the noise floor is very low. Compare this to your "far more demanding data handling requirements, power and noisy (EM/RF) environments to deal with than anything the audiophile market faces". ...
[3] .. (just think of the unnatural sounding pre-ringing induced by the digital filters in most DAC/ADCs -- what MQA is trying to fix)
[4] Could not this also include USB cables, at least at very low levels and not directly related to the digital file transcription itself (ie EM/RF noise that has little or no impact on the digital data transcription itself, but does have an impact on the DAC circuit, even one that is properly designed to spec)?

1. Not really, I'll explain why:
2. Typically, commercial studios have a "machine room". The computers, power amps, ADC/DACs and stacks of other electronics are put in this room and they're noisy rooms. However, the control rooms are sonically isolated from the machine room and additionally have been constructed and acoustically treated for the purpose. So while pro ADC/DACs often have to operate in a noisy environment, it's extremely unlikely your listening environment has a lower noise floor than a good commercial recording studio control room, unless you've spent very serious amounts on isolation/acoustics.
3. That's another whole audiophile myth/marketing opportunity! Again, the science forum might be the better place for that discussion but essentially, no: the pre- or post-ringing artefacts of digital filters are completely inaudible. Like jitter, it was a bit of an issue in the early days of digital but hasn't been for many years. Having said this, there are some devices with audible filters; the Pono is an example I have experienced. But audible filters are due either to incompetent filter design or to a deliberate design choice not to aim for a linear response/high fidelity (as is the case with the Pono). Incompetent filter design comes under the "faulty" category, as even extremely cheap DACs manage very competent digital filter designs (those found inside an iPhone, for example).
4. Yes, it could, and I've seen evidence of audiophile DACs being affected by EM/RF noise just from being placed near a laptop. Again though, this comes under the first explanation: it's an incompetent, faulty design! In addition to my high-end pro ADCs/DACs I've got a little USB 2x2 ADC/DAC. It sits about 1ft from a 12-core workstation, about 3ft from a 10-disk NAS, about 8ft from a powerful wireless router, and the small room it's in has loads of other equipment drawing over 4kW in total. It cost about $90, uses a 3ft Amazon Basics USB 2 cable and is impressively unaffected by it all, as are the other pro USB ADCs/DACs I've used, and of course my high-end ADCs/DACs are unaffected (but they're not USB).
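As an aside on the pre/post-ringing mentioned in point 3: any linear-phase (symmetric) FIR filter shows it directly in its impulse response. This is a generic windowed-sinc sketch with arbitrary length and cutoff, not the filter from any particular DAC:

```python
import numpy as np

# Linear-phase low-pass FIR: a sinc truncated by a Hamming window.
N = 63                       # odd tap count -> exact linear phase
fc = 0.25                    # normalized cutoff (fraction of sample rate)
n = np.arange(N) - (N - 1) / 2
h = 2 * fc * np.sinc(2 * fc * n) * np.hamming(N)

center = (N - 1) // 2
pre = h[:center]             # taps BEFORE the main impulse: the pre-ringing
post = h[center + 1:]        # taps after it: the post-ringing
assert np.allclose(pre, post[::-1])   # symmetry: pre- and post-ringing are mirrors
print(np.max(np.abs(pre)) > 0)        # True -- pre-ringing is inherent to the design
```

The point being that the ringing is a mathematical property of linear-phase filtering, present by design in virtually every competent DAC, at levels far below audibility.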

G
 
Sep 10, 2017 at 9:21 PM Post #156 of 1,606
So not tested side-by-side, you knew which cable was which, and you formed an opinion based on auditory memory - which we know is extremely fleeting (more than a few seconds is simply not reliable). I'm not discounting what you think you heard - as I'm sure you believe there is a difference (and that is the filter our brain can put on for us). Talk to audio engineers or producers, and I'll guarantee that practically everyone will have an example of EQing a mix, spending hours getting it just right, and then finding that the EQ was never engaged at the time. You convince yourself that you hear a difference so you do. I've done it myself. Unfortunately, the only way to get a more valid test as to what is really happening in your case is direct comparison. I do not doubt for a minute that your experience of the perceived change is valid. What I doubt is whether it is actually real (the change), or audible.

While I agree that blind, or preferably double-blind, testing will certainly remove any potential for self-delusion and provide 'scientific' certainty of perception if repeated by enough cases for statistical validity, I'm not sure that I can pragmatically buy into constantly using this high standard of testing for my personal hobby choices, even if this lack of blind testing may at times lead to incorrect perceptions.
 
Sep 10, 2017 at 9:30 PM Post #157 of 1,606
1. Not really, I'll explain why:
1. unless you've spent very serious amounts on isolation/acoustics.
2. That's another whole audiophile myth/marketing opportunity!

1) I just listen with HPs, I have no room speakers and was referring to system, not acoustical, noise floor.

2) I would actually like to hear your take on MQA then if you don't buy into its claims about taking ADC/DAC distortions out of the data signal recording/playback chain.
 
Sep 10, 2017 at 9:40 PM Post #158 of 1,606
While I agree that blind, or preferably double-blind, testing will certainly remove any potential for self-delusion and provide 'scientific' certainty of perception if repeated by enough cases for statistical validity, I'm not sure that I can pragmatically buy into constantly using this high standard of testing for my personal hobby choices, even if this lack of blind testing may at times lead to incorrect perceptions.

No problems with that approach either. Sometimes people are happy with what they don't know -- and that's a completely valid choice :) You'll get one guy who simply likes tube amps, and a second guy who likes the sound but recognises that the added 2nd-order harmonic distortion is what is colouring it, and knows that is why he likes it. Both are valid observations.

I'd love for you to be able to take that test someday though. It wasn't until I started taking the time to test myself on a lot of pre-held beliefs that I started realising how much I had to learn.

Enjoy the music :beerchug:
 
Sep 11, 2017 at 4:48 AM Post #159 of 1,606
1) I just listen with HPs, I have no room speakers and was referring to system, not acoustical, noise floor.
2) I would actually like to hear your take on MQA then if you don't buy into its claims about taking ADC/DAC distortions out of the data signal recording/playback chain.

1. HPs will typically reduce the acoustical noise floor by about 10-15dB, sealed IEMs by as much as about 30dB. A typical commercial studio will have an acoustic noise floor roughly 10-20dB lower than a typical home listening environment. Therefore there is a very rough equivalence (as far as the noise floor is concerned) between listening at home on headphones and listening on speakers in a commercial studio. However, although virtually all commercial music releases are mixed and mastered on speakers, those mixes and masters are typically checked/referenced on HPs in the studio; I have a set of Senn HD650s just for this purpose. All this is relevant because, as far as modern digital audio is concerned, the limiting factor is the acoustical noise floor, not the digital noise floor, which is many times (as much as 1,000 times) lower than the acoustic noise floor! The analogue side of a digital system is far noisier than the digital side (transducers in general and speakers in particular), yet most good-quality DACs advertise a noise floor anywhere from about -110dB to as low as about -125dB in a few cases. To be able to hear a DAC's noise floor above, or at the same level as, the acoustic noise floor of your listening environment (even wearing cans rather than listening on speakers) would mean playback levels at or well beyond the pain threshold, levels roughly 10-100 times higher than anyone would choose to listen at. If you can hear the noise floor of your DAC without turning the volume up to ear-splitting levels, then your DAC is seriously flawed.

2. MQA are making a number of claims and backing most of them up with actual scientific data, but with MQA it's often what they don't tell you which invalidates the claim. There are 4 main problems with this particular claim:
1. As already mentioned, except possibly in some early digital recordings, the ADC/DAC timing distortions MQA is trying to "fix" should be well beyond audibility to start with.
2. The timing distortions which do exist pale into utter insignificance compared to the other timing distortions in the chain -- the application of EQ, compression and other effects during mixing and mastering, and the timing errors inherent in microphone placement being obvious examples -- and there is no way to know what those timing distortions were or to correct for them.
3. Even when the original ADC is known (which may not be often), and even if MQA can compensate for its timing distortion, that's still irrelevant in many/most cases! The recording and mixing workflow varies significantly, depending on both the music genre and when the recording/mixing occurred (and therefore the technology used). Up until about 10 years ago there was not just an ADC used during recording, there were also ADCs and DACs used in the mixing and mastering chain: typically multiple round trips through at least two different ADCs and DACs, possibly as many as a dozen or more times. This compounds the timing distortions (possibly to audible levels), but again there is no way to know how many round trips were made, or even all the different ADCs and DACs employed on any particular mix/master. Any attempted compensation for just the recording ADC and the consumer's DAC is as likely to make matters worse as better. Even today, many mastering engineers will convert the mix to analogue to pass it through some vintage analogue compressor or EQ, and then convert back to digital again using a different ADC than the one used for recording.
4. If timing errors are audible, the mix engineer and/or mastering engineer had the option of correcting them; often they don't, because the result of those timing distortions may be preferable artistically. Obviously then, the last thing you want is a distribution format coming along and correcting something which has already been corrected, or which an artistic decision has already been made not to correct. Having said this, a single pass through a recording ADC and then a consumer's DAC should not have any audible effect, and therefore neither should any correction which MQA is applying.
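The noise-floor claim in point 1 reduces to decibel arithmetic; the SPL figures below are illustrative assumptions for a typical quiet room and a loud playback level:

```python
def db_to_amplitude_ratio(db):
    """Convert decibels to a linear amplitude ratio."""
    return 10 ** (db / 20)

# A -120 dBFS DAC noise floor is one millionth of full-scale amplitude:
print(db_to_amplitude_ratio(-120))   # 1e-06

# With peaks at a loud 85 dB SPL, that noise sits at 85 - 120 = -35 dB SPL,
# far below even a ~30 dB SPL quiet-room acoustic floor.
print(85 - 120)                      # -35
# Raising the DAC's noise up to that 30 dB SPL floor would need 150 dB SPL peaks:
print(30 + 120)                      # 150 -- beyond the threshold of pain
```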

G
 
Last edited:
Sep 11, 2017 at 6:06 AM Post #160 of 1,606
While I agree that blind, or preferably double-blind, testing will certainly remove any potential for self-delusion and provide 'scientific' certainty of perception if repeated by enough cases for statistical validity, I'm not sure that I can pragmatically buy into constantly using this high standard of testing for my personal hobby choices, even if this lack of blind testing may at times lead to incorrect perceptions.

1. Although the term "self-delusion" is probably technically accurate, its negative associations make me uncomfortable. Without that "self-delusion" there would be no music in the first place; it would just sound like a sequence of unrelated noises with no emotional significance/impact, no associations with qualitative judgements and no meaning in general!

2. I don't think anyone is suggesting constantly using DBT to inform every decision. My advice would be to start slow and just use it for the more expensive and most contentious decisions. Audiophile USB cables can be expensive and are extremely contentious because only some in the audiophile community believe they make any difference and none (AFAIK) in the pro audio community. A good candidate for DBT if you're considering purchasing an audiophile USB cable. As I think Brooko mentioned, doing DBTs is very informative, sometimes surprisingly so. Over time you build up knowledge of your perception and can therefore better judge when a DBT would be appropriate. If you haven't already seen it, I recommend this short (3 min) video of the McGurk Effect, which is a bit of an eye opener if you're a perception doubter and leads to some interesting conclusions.
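To make the DBT suggestion concrete: under pure guessing, each trial of an ABX-style test is a coin flip, so a binomial tail probability tells you how likely a given score is by chance alone (the trial counts here are just examples):

```python
from math import comb

def abx_p_value(correct, trials):
    """Probability of scoring at least `correct` out of `trials` by guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

print(round(abx_p_value(12, 16), 3))  # 0.038 -- unlikely to be pure chance
print(round(abx_p_value(9, 16), 3))   # 0.402 -- entirely consistent with guessing
```

A common convention is to require p below 0.05 before treating a result as evidence of an audible difference, which is why short tests with only a handful of trials prove very little either way.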

G
 
Sep 11, 2017 at 3:50 PM Post #161 of 1,606
MQA are making a number of claims and backing most of them up with actual scientific data but with MQA it's often what they don't tell you which invalidates the claim...

Thanks for this info, it's very useful to know. I had not thought about this from the mixing perspective, and I assume that this variation in ADC and DAC use must also vary to some degree between tracks, due to differences in mixing for each track on many albums. It certainly does invalidate their claims!

Compression today really drives me spare: while I think I understand the commercial justification for it (perceived louder is 'better', at least in the market), it so destroys the dynamics of the music. I was not aware of how bad this is until copying a hi-res 24/176 copy of Shelby Lynne's "Just a Little Lovin'" onto my DX; for some reason it dumped all the tracks side by side with a prior CD rip of the album -- easily enough fixed when I got around to it -- but in the interim I could both hear and see (in Roon's playback display) the profound difference in compression between the two copies of each track. The CD rip was loud but boringly flat dynamically (visually a solid band of output in the Roon display), while the hi-res version was so much more alive, dynamic and emotional, both visually in the display (without the emotional bit!) and, most importantly, sonically to my ears, especially once the volume was turned up to compensate for the lower output in the majority of quieter parts. If all hi-res were this much less compressed compared to CDs, it would be worth replacing all my 1,000s of CD rips for the better dynamics alone!
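The difference visible in Roon's display can be put into a number: crest factor, the peak-to-RMS ratio. The sketch below uses synthetic signals (random noise, and a crude tanh limiter standing in for heavy mastering compression), purely to illustrate the measurement, not the actual recordings:

```python
import numpy as np

def crest_factor_db(x):
    """Peak-to-RMS ratio in dB: a rough proxy for how dynamically 'squashed' audio is."""
    peak = np.max(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    return 20 * np.log10(peak / rms)

rng = np.random.default_rng(0)
dynamic = rng.normal(0.0, 0.1, 48_000)   # stand-in for a wide-dynamics master
squashed = np.tanh(8 * dynamic)          # crude 'loudness war' limiting
print(crest_factor_db(dynamic) > crest_factor_db(squashed))  # True
```

A heavily limited "loudness war" master typically measures several dB lower in crest factor than a dynamic one, which is exactly the flat solid band vs lively waveform contrast described above.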
 
Sep 12, 2017 at 6:24 AM Post #162 of 1,606
[1] I assume that this variation in ADC and DAC use must also vary to some degree between tracks due to differences in mixing for each track on many albums.
[2] Compression today really drives me spare: while I think I understand the commercial justification for it (perceived louder is 'better', at least in the market), it so destroys the dynamics of the music.
[3] If all hi-res were this much less compressed compared to CDs, it would be worth replacing all my 1,000s of CD rips for the better dynamics alone!

1. Generally, though not always, the various ADCs/DACs used on one track will be the same for all the tracks on an album, but not the number of round trips through those ADCs/DACs, which could vary wildly between the different tracks. All MQA attempts to "fix" is one pass through the recording ADC and the pass through the consumer's DAC. Note that this mainly affects recordings older than about 5-10 years. Today the typical workflow is ITB ("in the box"), meaning one pass through the recording ADC and then all processing carried out in the DAW (digital audio workstation), without any further D-to-A or A-to-D conversions for analogue processing. This isn't always the case though.

2. This isn't quite as simple an issue as it appears. To try and simplify, there's two parts to it:
A. High levels of compression (destroyed dynamics) are actually "better" in many situations. Many consumers listen to music while doing other tasks in extremely noisy environments: working, travelling, exercising, doing chores, etc. Wide dynamic range recordings are useless in these situations because the quieter parts will be beneath the noise floor of the listening environment and therefore inaudible. For example, I don't listen to classical music when driving, because I could be sitting there for minutes not hearing anything at all during the quiet passages, and I don't want to be turning the volume up and down all the time as the music gets quieter and louder. So when driving I just listen to popular music genres, which are heavily compressed, and at least I can hear it all. Listening critically though -- for example at home, doing nothing else but listening, on a decent system in a moderately quiet environment -- destroyed dynamics are very annoying, or at least far less preferable than wider dynamics. So there is a good justification for two masters, a highly compressed one and a far less compressed one, or, if there is only one master, for it to be highly compressed, because far more people consume music in noisy environments than there are audiophiles.
B. Economics! The problem with the CD container format (16/44) is that it's been around for a long time. That's a very serious problem economically, because once a consumer has bought a decent 16/44 DAC/DAP/system there's no real need to buy another one until it wears out, and this isn't just an equipment problem but also a problem for content owners (record labels) and distributors. That's why "high-res" consumer distribution formats were invented: you had to buy new equipment to play "hi-res", and obviously you had to buy the content/music again, or buy new content in a more expensive "high-res" format. The reason I put "high-res" in quotes is because in practice it is NOT higher resolution: "high-res" results in EXACTLY the same resolution as 16/44 within the audible range. This inconvenient fact can be overcome for some consumers with marketing; however, that marketing is far easier, and the number of potential consumers far greater, if there is a real audible difference. This brings us back to the point above and the justification for two masters: release the highly compressed one in 16/44 format and the less compressed one in high-res, and now we have a real, measurable and audible difference between high-res and 16/44, and in critical listening situations the high-res version should always sound superior! Of course, because there is no resolution difference, they could release both versions in 16/44 and the less compressed version would be audibly indistinguishable from a "high-res" version, but economically that doesn't serve any section of the industry's best interests.
3. As just explained, you buying your entire music collection again is one of the main reasons hi-res exists. Record labels have literally billions of dollars' worth of back catalogue content and they want/need to make money from it. Sometimes it's not technically possible to create a far less compressed version for "high-res" release, in which case they either have to add even more compression and/or create a new master (or use an existing one) which is different in some other respect. Therefore, you cannot guarantee that repurchasing your collection in "high-res" will always get you a less compressed version.
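The "same resolution within the audible range" point in B rests partly on bit depth; the textbook dynamic-range formula for ideal n-bit PCM, 6.02n + 1.76 dB, shows why even 16 bits already exceeds what a real listening chain can reveal:

```python
def pcm_snr_db(bits):
    """Theoretical full-scale SNR of ideal n-bit PCM quantization: 6.02*n + 1.76 dB."""
    return 6.02 * bits + 1.76

print(round(pcm_snr_db(16), 1))  # 98.1 dB -- already beyond sane playback levels
print(round(pcm_snr_db(24), 1))  # 146.2 dB -- far past any room's acoustic floor
```

Given acoustic noise floors around 30 dB SPL and playback peaks well under 110 dB SPL, the extra dynamic range of 24-bit delivery sits entirely below what any listener can hear.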

We may be in danger of getting kicked out! Although arguably obliquely related, what we're discussing now is pretty far off topic.

G
 
Sep 12, 2017 at 3:35 PM Post #163 of 1,606
We may be in danger of getting kicked out! Although arguably obliquely related, what we're discussing now is pretty far off topic.

G

Yes, perhaps it is, but it is really useful information, so I am sure everyone will bear with us. Thanks for sharing this with us.

Just to keep it on topic, while we may still disagree about the value of audiophile USB cables (perhaps I need my delusions), thanks for contributing to the topic string, I have learnt from your posts!

On another related note check out http://antipodesaudio.com/articles.html . I received this link in a marketing email this morning to spend money on upgrading my music server. Of specific interest to this topic string are two links on this page to: 'Of Faith & Science' and 'Design Approach' (I could not get direct links to each specific topic article, so have to include the full marketing pitch -- sorry). Yes, it is marketing bumph, but I think this string's readership may find these two specific articles interesting, either in agreeing with what is put forth by Mark or in knocking holes in what has been stated, as you think is appropriate.
 
Sep 12, 2017 at 3:50 PM Post #164 of 1,606
You know, I'm starting to get annoyed with some of these posts. I'll admit, I too was skeptical about USB cables, but I figured I would try some for myself and see. I was given the opportunity to try some familiar higher-end cables for free before I decided to make a purchase... This is something that some posters here need to do before claiming we are all delusional, when you haven't even tried a higher-end USB cable. If you have tried some and still think it's nonsense, I will admit that not all of the higher-end cables necessarily offer better performance, but some do; you have to be careful.

Why would I go out and spend hundreds of dollars on a USB cable that made no difference? I find this offensive.
 
Last edited:
Sep 12, 2017 at 4:05 PM Post #165 of 1,606
You know, I'm starting to get annoyed with some of these posts. I'll admit, I too was skeptical about USB cables, but I figured I would try some for myself and see. I was given the opportunity to try some familiar higher-end cables for free before I decided to make a purchase... This is something that some posters here need to do before claiming we are all delusional, when you haven't even tried a higher-end USB cable. If you have tried some and still think it's nonsense, I will admit that not all of the higher-end cables necessarily offer better performance, but some do; you have to be careful.

Why would I go out and spend hundreds of dollars on a USB cable that made no difference? I find this offensive.

Cartma: I don't disagree with you; indeed, my Nordost USB cable cost rather more than your Heimdall 2 (I only use Heimdall 2 HP cables in my system, as that is as far as Nordost goes in HP cables). However, others do disagree, as gregorio has stated in considerable detail in numerous posts; but, importantly, we have to respect each other's perspectives, and here I was simply acknowledging his in stating "perhaps I need my delusions" [which are that good USB cables do make an important difference].
 