To the best of my knowledge there has never been a **properly controlled listening test**, anywhere, showing any evidence that jitter of any frequency or type is audible at the levels found in commercially available kit (up to about 14 ns).
Benjamin and Gannon, who did use correlated jitter, put the threshold at about 20 ns for some frequencies and much, much higher for others. At 20 kHz jitter is a *theoretical* problem at low levels, but: 1) there is not much happening, musically speaking, at 20 kHz; 2) masking; 3) the models for jitter audibility at 20 kHz (Dunn et al.) assume a signal 120 dB above the hearing threshold, lol; 4) at 20 kHz, telling 16 bits from 15 bits is non-trivial even for those with excellent hearing; and 5) the Benjamin and Gannon listening tests did not support this as a problem.
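To put numbers on why 20 kHz is the *theoretical* worst case: sinusoidal jitter on a pure tone acts like phase modulation, so each sideband sits at β/2 relative to the carrier, where β = 2πfJ. This is my own back-of-the-envelope sketch, not a calculation from either paper:

```python
import math

def jitter_sideband_db(signal_freq_hz, jitter_peak_s):
    """Approximate level of each jitter sideband relative to a full-scale
    sine carrier, for sinusoidal jitter of the given peak amplitude
    (small-angle phase-modulation approximation)."""
    beta = 2 * math.pi * signal_freq_hz * jitter_peak_s  # peak phase deviation, radians
    return 20 * math.log10(beta / 2)  # each sideband is beta/2 of the carrier

# Worst case: 20 kHz full-scale tone with 10 ns of peak sinusoidal jitter
print(round(jitter_sideband_db(20_000, 10e-9), 1))  # prints -64.0
```

So even in the worst case (a full-scale 20 kHz tone, which music essentially never contains), 10 ns of jitter produces artefacts around 64 dB below the signal, and lower frequencies fare proportionally better.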
On another hi-fi forum a chap created a jitter simulation and injected deterministic jitter into samples. After several weeks only one person was able to reliably (p < 0.05) detect 10 ns of jitter, and only then at extreme volume and after many repetitions. Nobody else could reliably place samples with varying amounts of jitter (0 to 100 ns) in the correct order!
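I don't have that chap's code, but a simulation of this kind can be sketched in a few lines: treat the jitter as a sinusoidal displacement of the nominal sample instants and interpolate the signal at the displaced times. All the parameter values below are my own illustrative choices:

```python
import numpy as np

def inject_sinusoidal_jitter(samples, fs, jitter_peak_s, jitter_freq_hz):
    """Resample a signal as if the converter clock had sinusoidal timing
    jitter: each nominal sample instant n/fs is displaced by
    jitter_peak_s * sin(2*pi*jitter_freq_hz * t), and the signal is
    linearly interpolated at the displaced instants."""
    t_nominal = np.arange(len(samples)) / fs
    t_jittered = t_nominal + jitter_peak_s * np.sin(
        2 * np.pi * jitter_freq_hz * t_nominal)
    return np.interp(t_jittered, t_nominal, samples)

# 1 kHz full-scale sine at 48 kHz, with 100 ns of 300 Hz jitter
fs = 48_000
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * 1_000 * t)
jittered = inject_sinusoidal_jitter(clean, fs, 100e-9, 300)

# Peak error relative to full scale, in dB (roughly 2*pi*f*J, ~ -64 dB here)
err_db = 20 * np.log10(np.max(np.abs(jittered - clean)))
```

Even at 100 ns (ten times worse than any kit you can actually buy), the peak error on a 1 kHz tone is down around -64 dB, which is why the ordering task was so hard.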
I managed it, though, by looking at the spectra and calculating the relative signal loss/degradation for each sample; even at 100 ns the degradation was pretty trivial.
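Spotting the jitter in the spectrum is straightforward because sinusoidal jitter on a pure tone puts sidebands at f_signal ± f_jitter. A minimal sketch of that kind of analysis, with my own illustrative parameters rather than the actual test files:

```python
import numpy as np

# Simulate a capture: 1 kHz tone at 48 kHz with 100 ns of 300 Hz
# sinusoidal sampling jitter, applied via linear interpolation.
fs = 48_000
t = np.arange(fs) / fs
tone = np.sin(2 * np.pi * 1_000 * t)
t_jit = t + 100e-9 * np.sin(2 * np.pi * 300 * t)
captured = np.interp(t_jit, t, tone)

# FFT with a Hann window; with exactly 1 s of audio the bins are
# 1 Hz wide, so bin index == frequency in Hz.
spectrum = np.abs(np.fft.rfft(captured * np.hanning(len(captured))))
spectrum_db = 20 * np.log10(spectrum / spectrum.max())

print(spectrum_db[1_300])  # upper sideband at 1000 + 300 Hz, ~ -70 dB re. carrier
```

The sidebands stand out clearly above the noise floor in the FFT even though they are far too low to hear, which is exactly the gap between "measurable" and "audible".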
There is much magical thinking and anecdote about how bad jitter is, but it is simply not supported by anything remotely resembling good evidence. In the absence of such evidence, jitter is just not something to worry about!