Can I add fuel to the fire by playing the noob (again)?
The hypothesis that "any change in sound between two sources delivering a digital signal can only come from jitter" really needs to be confirmed first, I think. Am I to understand that you would call jitter any kind of change that occurs to the digital signal?
Could a difference in source impedance lead to some form of jitter? Say the voltage ends up higher going into DAC C because of the impedance difference: couldn't that change the moment when the 0-to-1 or 1-to-0 transitions are triggered? (Pure guessing here, so let me know when I've reached bullshiiiit mountain.)
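Thinking out loud in code, here's a back-of-the-envelope sketch of that guess. It assumes an idealised linear edge, and every number in it (3.3 V swing, 2 ns rise time, 100 mV level shift) is made up purely for illustration, not taken from any real DAC:

```python
# Rough sketch: how far does a level shift move the instant an edge
# crosses the receiver's logic threshold? Assumes an idealised linear
# edge; every number below is an assumption for illustration only.
V_swing = 3.3    # V, logic swing (assumed)
t_rise = 2e-9    # s, rise time of the edge (assumed)
dV = 0.1         # V, effective level shift from impedance mismatch (assumed)

# For a linear edge the slope is V_swing / t_rise, so a level shift
# of dV moves the crossing instant by dV / slope.
dt = dV * t_rise / V_swing
print(f"crossing instant moves by about {dt * 1e12:.0f} ps")
# -> ~61 ps with these made-up numbers
```

So at least on paper the mechanism exists; whether it's audible is a different question.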
Jitter can only move a sample along the time axis, while quantization error can only move it in amplitude, is that right? But either one would still affect the reconstructed sinusoid on both axes, since it shifts the possible path from that sample to the next. In any case it would mostly affect high frequencies, right? Because at lower frequencies the timing error becomes less and less significant compared to the period.
What is the maximum change in amplitude it could cause in the worst case, say at 20 kHz with the 300 ns jitter figure I've seen thrown into a post here? Again, I'd guess only the highest frequencies could really be affected, since lower frequencies have many more samples per cycle, so the impact of a single error should be diluted.
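To put a number on that, here's a quick sketch using the usual slew-rate bound: a sine slews fastest at its zero crossing, so a timing error dt can shift the sampled value by at most 2*pi*f*A*dt. The 20 kHz and 300 ns figures are just the ones from this thread; treat the result as an upper bound, not a measurement:

```python
import math

f = 20_000    # Hz, worst-case audio frequency from the thread
dt = 300e-9   # s, the 300 ns jitter figure quoted above
A = 1.0       # normalised full-scale amplitude

# A sine A*sin(2*pi*f*t) has maximum slope 2*pi*f*A (at the zero
# crossing), so a timing error dt shifts the sampled value by at
# most 2*pi*f*A*dt.
err = 2 * math.pi * f * A * dt
print(f"max amplitude error: {err:.4f} of full scale")
print(f"that is about {20 * math.log10(err):.1f} dB below full scale")
# -> ~0.038 of full scale, roughly -28.5 dBFS, or ~1200 LSBs in 16 bit
```

Since the bound scales directly with f, this also backs up the intuition that, for the same timing error, the highest frequencies suffer the most.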
As for pitch changes, I think I can manage on my own. Take the 300 ns jitter: that's 0.0000003 s. Worst case again, a 20 kHz tone has a period of 0.00005 s, so with the jitter the period becomes 0.0000503 s or 0.0000497 s, which corresponds to something between about 19880 Hz and 20120 Hz.
Is that right?
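Checking that arithmetic with a couple of lines (same numbers as above, just taking 1/period):

```python
period = 1 / 20_000   # s, nominal period of a 20 kHz tone (50 us)
dt = 300e-9           # s, the 300 ns timing error

f_low = 1 / (period + dt)    # period stretched -> lower instantaneous pitch
f_high = 1 / (period - dt)   # period squeezed -> higher instantaneous pitch
print(f"{f_low:.0f} Hz to {f_high:.0f} Hz")
# -> about 19881 Hz to 20121 Hz, matching the hand calculation
```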
It's 5 a.m. where I am, so I hope the jury will take that into account before delivering its verdict.