*> The jitter matter has been beaten up to death.*

Exactly. Which is why I'm trying to talk about the formally provable question: whether a digital signal can go in one end of a cable as waveform A and emerge from the other end as a completely different waveform B with boosted bass. Sorry, but this doesn't happen, and it's not hard to prove formally, even taking signal errors and jitter into account.

*> I haven't claimed anything, I'm mostly asking what experiments you've made in order to make those statements...*

You don't understand. A digital waveform is numeric data which either arrives intact, or arrives with errors or missing samples. In either case it's easy to prove mathematically that this will NOT produce, for example, a bass boost, under *any* circumstances (barring a probability that likely rivals the chance of living cells spontaneously forming from raw materials).

Again, it's not particularly difficult (maybe tedious) to prove mathematically that errors introduced into a digital stream won't magically boost the bass. Random errors are uncorrelated with the program material, so their energy spreads across the whole spectrum as broadband noise; a bass boost would require the corruption to act like a frequency-selective filter, which random errors cannot do. It just doesn't work that way, period! To say otherwise is like saying two times two is not four.
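You don't even need the formal proof to see this; a quick NumPy sketch makes the point. Everything here is an illustrative assumption on my part (a 1 kHz test tone at 44.1 kHz, 16-bit PCM, one random bit flipped in 0.1% of samples), but the conclusion doesn't depend on those choices: the error energy lands all over the spectrum, not in the bass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test signal: a 1 kHz sine at 44.1 kHz, 16-bit PCM.
fs = 44100
n = 1 << 14
t = np.arange(n) / fs
clean = np.round(0.5 * np.sin(2 * np.pi * 1000.0 * t) * 32767).astype(np.int16)

# Corrupt 0.1% of the samples by flipping one random bit in each.
idx = rng.choice(n, size=n // 1000, replace=False)
bits = rng.integers(0, 16, size=idx.size, dtype=np.uint16)
corrupt = clean.view(np.uint16).copy()
corrupt[idx] ^= np.uint16(1) << bits
corrupt = corrupt.view(np.int16)

# Spectrum of the error signal alone (corrupted minus clean).
err = (corrupt.astype(np.float64) - clean.astype(np.float64)) / 32768.0
spectrum = np.abs(np.fft.rfft(err)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

# How much of the error energy lands below 200 Hz ("the bass")?
low = spectrum[freqs < 200.0].sum()
total = spectrum.sum()
print(f"fraction of error energy below 200 Hz: {low / total:.4f}")
```

Bins below 200 Hz are under 1% of the band, and that's roughly the share of error energy they get: the corruption is impulsive clicks, i.e. flat, broadband noise, nothing remotely like a bass shelf.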

If you really want, I could write out a *formal proof* that random errors, skipped samples, and even jitter, no matter how severely they are introduced into a digital stream, will not produce a bass-boost effect. But then again, it will make no sense unless you've taken at least undergraduate-level discrete mathematics, probability theory, and calculus. Those of you who fully understand what I'm talking about will know how to prove it yourselves anyway, and are most likely on the non-believer side to begin with.
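Same story for jitter, and again a sketch beats a proof for forum purposes. This is a deliberately simplified model of my own (sampling a 1 kHz sine once on an ideal clock and once with 2 ns RMS random timing error); to first order the jitter error is the signal's derivative times the timing offset, i.e. broadband noise riding on the signal, not a filter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: 1 kHz sine at 44.1 kHz, clock with 2 ns RMS jitter.
fs = 44100
n = 1 << 14
t = np.arange(n) / fs
f0 = 1000.0
jitter = rng.normal(0.0, 2e-9, size=n)

ideal = np.sin(2 * np.pi * f0 * t)
jittered = np.sin(2 * np.pi * f0 * (t + jitter))

# Spectrum of the jitter-induced error alone.
err = jittered - ideal
spectrum = np.abs(np.fft.rfft(err)) ** 2
freqs = np.fft.rfftfreq(n, d=1.0 / fs)

low = spectrum[freqs < 200.0].sum()
total = spectrum.sum()
print(f"fraction of jitter-error energy below 200 Hz: {low / total:.4f}")
```

White timing noise modulated onto a carrier is still spectrally flat, so once again the error energy below 200 Hz is proportional to that sliver of bandwidth. The error is also about 100 dB down at this jitter level, which is a separate argument entirely.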

Edited by ac500 - 8/1/11 at 5:29pm