robertbudding
That's a common misconception. The signal that moves along a USB cable isn't digital – not literal ones and zeroes – but an electrical-pulse representation of those ones and zeroes. This effectively analogue signal is prone to disturbance from EMI emanating from the host computer and from electrical noise arriving over the air, otherwise known as RFI. Greater vulnerability to noise can degrade a cable's ability to do its job.
Everyone here understands this, and it is not what we're debating. What's being debated is the extent to which that degradation in the USB cable (or lack of it) can actually be perceived.
The signal is interpreted as a 0 or a 1. All that is required is that the signal be clean enough to interpret correctly. Any problems can be easily identified by comparing the resultant bit stream with the source, and there are error-detection mechanisms (USB itself uses CRCs on data packets) to ensure accuracy. If your expensive cable really were better, the difference could be measured.
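To make the "compare the resultant bit stream" point concrete, here is a minimal sketch in Python. It simulates a transfer, injects a single-bit error of the kind EMI/RFI could cause, and shows that hashing the received bytes against the source detects it immediately. The `degraded` helper and the sample payload are illustrative stand-ins, not part of any real USB stack:

```python
import hashlib

def degraded(data: bytes, flip_bit: int) -> bytes:
    """Simulate a single-bit error, the kind noise on the wire could cause."""
    out = bytearray(data)
    out[flip_bit // 8] ^= 1 << (flip_bit % 8)
    return bytes(out)

# Stand-in for a chunk of audio data sent over the cable.
source = b"16-bit/44.1kHz PCM samples" * 1000

# A clean transfer delivers exactly the same bytes: the hashes match.
clean = bytes(source)
assert hashlib.sha256(clean).digest() == hashlib.sha256(source).digest()

# A transfer corrupted by noise does not: even one flipped bit is detected.
noisy = degraded(source, flip_bit=12345)
assert hashlib.sha256(noisy).digest() != hashlib.sha256(source).digest()

print("clean transfer is bit-perfect; corrupted transfer is detected")
```

This is the whole point: if a cable were degrading the data, the received bit stream would differ from the source and the difference would be trivially measurable, with no listening test required.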