[1] i wonder, has there been experimentation on that... like just what degree of minute differences people can pick up? [2] like by using music that has been altered to varying degrees approximating certain types of quantization errors? or [3] slightly modifying dynamic range etc... sort of in the vein of hi-res tests I guess ([4] i.e. do inaudible frequencies affect audible frequencies perceptibly?)
1. Yes, with regard to pretty much any minute type of difference you can think of, plus many you can't. How much experimentation has been done depends on the specific difference, but some of it has been repeated numerous times over a span of up to nine decades. The reliability of that experimentation can sometimes be questioned, though, especially when the funding for the scientific study/experiment has come from those with a financial interest in the outcome.
2. Yes, this has been done, but it's an example of not really needing a study/experiment, or of merely confirming the obvious. Quantisation errors are heard as random (white) noise when converted back to analogue (assuming correct dither), and the amount of quantisation error is a known statistical quantity. In other words, we have a certain amount of unwanted (quantisation) noise which, even at just 16 bits, is at least 10 times and typically over 100 times lower than the almost identical unwanted noise on the recording itself.
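To put a number on "known statistical quantity", here's a quick sketch that quantises a full-scale sine to 16 bits and measures the resulting noise level. The sample rate and tone frequency are arbitrary choices for illustration, not anything from a real test:

```python
import numpy as np

# Hypothetical sketch: quantise a full-scale 1 kHz sine to 16 bits
# and measure the level of the quantisation error it introduces.
fs = 48000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 1000 * t)        # full-scale sine, range -1..1

quantised = np.round(signal * 32767) / 32767  # 16-bit quantisation
error = quantised - signal                    # the quantisation error itself

snr_db = 10 * np.log10(np.mean(signal**2) / np.mean(error**2))
print(f"16-bit quantisation SNR: {snr_db:.1f} dB")
# Theory predicts roughly 6.02 * 16 + 1.76 ≈ 98 dB for a full-scale sine,
# well below the noise floor of a typical analogue recording.
```

A ~98 dB quantisation noise floor versus a recording noise floor in the region of 60–70 dB is where the "at least 10 times lower" figure comes from (every 20 dB is a factor of 10 in amplitude).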
3. The dynamic range of commercial audio recordings is orders of magnitude lower than the dynamic range of which 16 bit is capable. Slightly (or even significantly) increasing the dynamic range of a recording will still fall easily within the capabilities of 16 bit, and the result will therefore be identical at 16 bit and at 24 bit (hi-res). This is another case where studies have confirmed what is obvious (given all the correct facts!).
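The comparison is simple arithmetic. The ~60 dB figure for a commercial release below is an assumed typical example, not a measurement:

```python
import math

# Back-of-envelope comparison: theoretical dynamic range of 16-bit and
# 24-bit PCM versus the rough dynamic range of a commercial recording.
dr_16bit = 20 * math.log10(2**16)   # ≈ 96.3 dB
dr_24bit = 20 * math.log10(2**24)   # ≈ 144.5 dB
dr_recording = 60                   # assumed typical commercial release

print(f"16-bit: {dr_16bit:.1f} dB, 24-bit: {dr_24bit:.1f} dB")
print(f"Headroom left in 16-bit: {dr_16bit - dr_recording:.1f} dB")
```

Even a recording with an unusually wide dynamic range sits comfortably inside the 16-bit figure, which is why the extra ~48 dB of 24 bit changes nothing audible here.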
4. Again yes, and in this case quite a few studies, although the answer is not as clear-cut because of some special circumstances, intermodulation distortion (IMD) for example. Allowing for those special circumstances, the answer is "no", but this answer has been obfuscated to some degree by flawed experiments/studies published by those with an ultrasonic product to sell.
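The IMD special circumstance is easy to demonstrate: if playback equipment is even slightly non-linear, two inaudible ultrasonic tones can generate an audible difference tone. The frequencies and the weak second-order non-linearity below are assumptions chosen purely to make the effect visible, not a model of any real device:

```python
import numpy as np

# Illustrative IMD sketch: pass two ultrasonic tones through a mildly
# non-linear system and look for the audible difference product.
fs = 96000
t = np.arange(fs) / fs
ultrasonic = np.sin(2*np.pi*24000*t) + np.sin(2*np.pi*26000*t)

# Weak 2nd-order non-linearity (an assumed, exaggerated imperfection).
distorted = ultrasonic + 0.1 * ultrasonic**2

spectrum = np.abs(np.fft.rfft(distorted)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1/fs)

# Second-order IMD creates a tone at 26 kHz - 24 kHz = 2 kHz: squarely audible.
level_2k = spectrum[np.argmin(abs(freqs - 2000))]
print(f"Level of the 2 kHz IMD product: {level_2k:.3f}")  # ≈ 0.05
```

So in a test with ultrasonic content, a listener may genuinely hear a difference, but what they're hearing is distortion created by the equipment, not the ultrasonic frequencies themselves.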
With regard to my last post and my response to TheoS53's assertion of digital timing errors (jitter): experiments indicate that with music most people cannot discern less than 500 ns (nanoseconds) of jitter, and no one can discern less than 250 ns. The jitter expected of even a cheap consumer converter is about 500 times lower than even the 250 ns figure!
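Restating that margin as arithmetic (the 0.5 ns consumer-converter figure is simply the post's "about 500 times less than 250 ns" claim, worked backwards):

```python
# Quick arithmetic on the jitter figures quoted above.
audibility_threshold_ns = 250    # no listener discerned less than this
consumer_dac_jitter_ns = 0.5     # ~500 ps, implied by "about 500 times less"

margin = audibility_threshold_ns / consumer_dac_jitter_ns
print(f"Margin below the audibility threshold: {margin:.0f}x")
```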
G