[1] There are expensive optical cables out there and I can imagine experienced listeners hearing a difference since in theory a less pure fibre might degrade the signal.
[2] The main reason people with high end gear prefer not to use optical is the break in the medium. [2a] Meaning you go from electrical to optical and then back to electrical and inevitably those two conversions are an additional source of jitter.
[2b] You can get around the jitter issue, as Ted Smith does with his DACs, but he's the only DAC designer I know of who does this.
1. What signal are you talking about? Sure, in theory the digital data signal could be somewhat degraded by a less pure optical cable. However, digital data is just zeroes and ones; a degraded zero or one is still just a zero or a one, so there is absolutely NO difference in the data and NO difference in the analogue signal after it's converted. In other words, digital/binary has just two states, zero or one (on or off); there is no state that can represent a degraded zero or one, so it cannot make any difference how degraded the signal is, provided it's not so badly degraded that the zeroes and ones can no longer be differentiated (in which case you'd get obvious errors, clicks/pops for example). This is of course the reason binary/digital was invented in the first place! So, I can't "imagine experienced listeners hearing a difference, since in theory" there is absolutely no difference (in the actual data). However, I can very easily imagine some audiophiles perceiving/imagining a difference!
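To make that concrete, here's a minimal toy sketch (a hand-rolled model, not any real S/PDIF decoder) of why a degraded binary signal still yields exactly the same bits: the receiver only decides which side of a threshold each symbol sits on, so attenuation and noise are invisible until they exceed that margin.

```python
# Toy model: attenuate a binary signal and add noise, then decode by
# thresholding. The recovered bits stay identical until the noise is
# large enough to push a symbol across the decision threshold.
import random

bits = [random.randint(0, 1) for _ in range(100_000)]

def transmit(bit, attenuation=0.5, noise=0.1):
    # Model a "degraded" cable: 50% signal loss plus random noise.
    level = (1.0 if bit else 0.0) * attenuation
    return level + random.uniform(-noise, noise)

received = [transmit(b) for b in bits]

# Receiver decides 0/1 against the midpoint of the attenuated levels.
threshold = 0.25
decoded = [1 if level > threshold else 0 for level in received]

errors = sum(d != b for d, b in zip(decoded, bits))
print(f"bit errors: {errors} out of {len(bits)}")  # 0 until noise exceeds the margin
```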
2. That statement is untrue. The main reason people with high-end gear prefer not to use optical is that optical cable is relatively easily damaged compared to the alternatives. That's not really an issue for consumers, but it is (or can be) for studios. Did you mean only audiophiles rather than (all) "people with high end gear"?
2a. No, it is not "inevitable" that those two conversions will add jitter. Potentially they could, but it's irrelevant because ...
2b. Are you saying this because you don't "know of" any DAC designers other than Ted Smith? Any competently designed DAC, even a cheap, sub-$60, mass-produced one, "gets around the jitter issue" (reduces jitter artefacts to far below audibility), and this has been the case for many years!
1a) I haven't auditioned different optical cables, so I can't say. 1b) In theory impurities can introduce error, yes.
2a) I don't know how much jitter is introduced; this probably depends on the sender, medium and receiver. 2b) I'd be impressed to see any digital connection without error correction that is bit perfect.
3) This thread was created here by me. If this is the wrong forum, somebody please move it to the correct one.
1a. Auditioning optical cables wouldn't enable you "to say" (with any accuracy) either! One could relatively easily compare the digital data after it has passed through the optical conversion/cable with the original data and obtain an accurate, objective answer.
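For example, here's a minimal sketch of such an objective comparison, assuming you've captured the received stream to a file. The file names are hypothetical, and the capture itself needs suitable hardware (e.g. a device with an S/PDIF input), which is outside this sketch.

```python
# Hash the source data and a capture of what arrived after the optical
# link, then compare. Identical hashes mean bit-identical data; no
# listening impressions required.
import hashlib

def sha256_of(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

original = sha256_of("source_pcm.raw")              # hypothetical file name
captured = sha256_of("captured_after_optical.raw")  # hypothetical file name
print("bit perfect" if original == captured else "data differs")
```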
1b. Yes, in theory they could. In practice though, if an optical cable (even a cheap one) had such a high level of impurities that it degraded the signal to the point of zeroes and ones not being distinguishable (and therefore "introducing error"), then it would be a defective optical cable!
2a. Firstly, as you're considering jitter to be an error: there aren't any DACs without error correction, even cheap ones. The only potential exception might be some obscure, esoteric, incompetently designed audiophile DAC. Secondly, I would NOT be impressed in the slightest to see a digital connection that is bit perfect, as bit-perfect transfer is the only thing a digital connection exists to do! I would, however, be seriously unimpressed by any digital connection which was incapable of it!
3. Maybe you missed that this is the "Sound Science" forum? Your "experience" or impressions are only relevant in terms of how they relate to the actual facts/science, but unfortunately many/most of your statements do the exact opposite and contradict those facts/science!
I hope I haven't come across as too harsh? Your position is entirely understandable, because all or nearly all digital audiophile products only exist by getting audiophiles to believe fallacies/falsehoods. For example, they may correctly state that jitter is bad and therefore that less jitter is better than more jitter. However, they then rely on a fallacy to sell their product: that their product has lower jitter than another product and therefore must sound better. This is a correlation fallacy based on the omission of a vital fact, that beyond a certain point jitter artefacts are inaudible.

For instance, all else being equal, a DAC whose jitter artefacts are say 10 times below audibility will, by definition, sound identical to a DAC whose jitter artefacts are say 1,000 times below audibility. They omit the fact that for many years even cheap DACs have achieved jitter artefacts more than 10 times below audibility, and there are cheap ($60) DACs today that reduce jitter artefacts to about 1,000 times below audibility. The idea that jitter is a problem in modern digital audio equipment only exists in the marketing-manipulated understanding/psyche of audiophiles; in reality it's a problem that was solved decades ago! (See the rough numbers sketched below.)

It's a similar situation with bit perfect transfers: assuming the software/driver isn't deliberately altering the bit stream and the system is set up correctly, how often do we actually encounter bit errors in the output of even cheap DACs/cables? It's a potential/theoretical problem that's falsely marketed as a common/real problem that only expensive audiophile DACs/cables avoid! The difficulty for audiophiles is that this false marketing is effectively their only source of information, as there's no money/incentive to be made from adhering to the actual facts/science, which is why this sub-forum exists!
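For anyone who wants the arithmetic behind "far below audibility", here's a rough worked example using the standard small-angle approximation for sinusoidal jitter sidebands. The jitter figures are illustrative round numbers, not measurements of any particular DAC.

```python
# Small-angle approximation for sinusoidal jitter: relative sideband
# level ~ 20*log10(pi * f * t_j), where f is the audio frequency and
# t_j the peak jitter amplitude. A 10 kHz tone is close to the worst
# case within the most sensitive part of the audible band.
import math

def sideband_db(signal_hz: float, jitter_s: float) -> float:
    return 20 * math.log10(math.pi * signal_hz * jitter_s)

for jitter_ns in (10, 1, 0.1):
    level = sideband_db(10_000, jitter_ns * 1e-9)
    print(f"{jitter_ns:>5} ns jitter on a 10 kHz tone -> sidebands at {level:.0f} dB")

# 10 ns -> ~ -70 dB, 1 ns -> ~ -90 dB, 0.1 ns (100 ps) -> ~ -110 dB,
# i.e. even quite sloppy clocking puts the artefacts far below the level
# of the music, and modern DAC clocks are far tighter than 10 ns.
```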
G