One example of digital not quite being black and white is evident on a CPU wafer. A wafer might have room for a dozen or so chips, but typically the ones near the middle are the best and the ones at the edges are worse. What you get is two of the same processor, but one is capable of more than the other. Digital should be digital though, right? Not quite: one is literally *better*. It will switch reliably at higher speeds, endure higher temperatures, and consequently be given a different model number, clocked up, and sold for something like $1,000. The rejects go in a separate bin, get clocked down, and sell for less money. Sometimes you'll hear about a CPU that is a great overclocker; that's probably because it came off a wafer intended for a higher-end CPU but wasn't quite up to spec for the top bin.
How is that in any way related to digital not being digital? All digital circuits run on real-world electronics, obviously, and there is some very complex high-frequency electronics/physics involved. That still doesn't change the nature of what it means to be digital. Digital is by definition "black and white". The fact that production inconsistencies result in varying tolerance for higher clock speeds doesn't take away from that; it's just a simple observation that digital chips aren't "magic" and are in fact built from electronic components. A properly functioning digital system ultimately does not lose data. Yes, there is a lot of technology in storage media and error detection to make this happen, but trust me, computers do this incredibly well.
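To make "error detection" a little more concrete, here's a minimal sketch (my own illustration, not something from the post) of how a checksum lets a receiver catch corrupted bits in a digital transfer. The toy frame format and function names are assumptions; real links and storage (USB, S/PDIF, disks) use comparable but more elaborate coding schemes.

```python
# Illustrative sketch: a CRC-32 appended to a payload lets the receiver
# detect bit errors instead of silently passing corrupted data along.
import zlib

def send_with_crc(payload: bytes) -> bytes:
    # Append a 4-byte CRC-32 of the payload so the receiver can verify it.
    crc = zlib.crc32(payload).to_bytes(4, "big")
    return payload + crc

def receive_and_check(frame: bytes) -> bytes:
    # Split the frame back into payload and CRC, recompute, and compare.
    payload, crc = frame[:-4], frame[-4:]
    if zlib.crc32(payload).to_bytes(4, "big") != crc:
        raise ValueError("bit error detected - retransmit or re-read")
    return payload

# A single flipped bit is caught, so the data either arrives intact or
# the error is flagged; it doesn't degrade "a little" the way analog does.
frame = send_with_crc(b"16-bit PCM samples...")
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
receive_and_check(frame)        # returns the payload unchanged
# receive_and_check(corrupted)  # would raise ValueError
```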
Computers crash and get errors all the time, but I'm not sure people realize this has nothing to do with the computer's hardware unless a component is actually faulty. Crashes, errors, etc. are caused by programmer error, something that is pretty much impossible to get rid of completely.
In any case... let me put it this way. If you put together a group of head-fi'ers who are all professional computer or electrical engineers, I'd guarantee you not one of them would be so silly as to believe that changing digital cables has any effect on audio at all (aside from the bitrate modes I mentioned above, of course). I actually said this on the sound science thread a while back about USB cables, and the response I got was this: "I don't trust scientists and engineers. They don't know what they're talking about." People here literally don't trust the people who designed and built the very thing they're blindly quibbling about.
> USB is best avoided unless you know the receiving DAC handles it well. Some setups are happiest with Toslink as it electrically isolates the computer from the DAC. This is not an issue with better designs.
Maybe I overestimate the quality of DACs sometimes. It is certainly possible for noise to make its way into the audio signal from just about anywhere (I mean, any stupid design is possible), but I'm surprised that any "audiophile" DAC wouldn't be designed to eliminate this completely.