Does source matter in an all-digital era?
Dec 22, 2007 at 6:20 AM Post #16 of 16
I do hear audible differences between transports with my headphone setup....and if I A/B my DVD players, they have subtle differences, presumably due to the way they decode the bitstreams over the digital connection. But I find the differences in video aren't as pronounced....if you A/B the players they might look subtly different, but they're all good. So I don't really factor that in at all. Get whatever player you want and focus on which receiver you want


"And with LCD TVs is there even A/D conversion? It's basically a scaled-up computer monitor, and a digital signal is basically saying, "Make pixel X,Y color value R,G,B.""

Actually, TVs are different from computer monitors. It's mainly because they have to remain compatible with legacy NTSC standards (which use a different aspect ratio than a PC or Mac). Besides aspect ratio, the other difference is frequency: NTSC runs at roughly 30 fps (60 interlaced fields per second), while HD movies are mastered at 1080p/24 fps. So even if your TV is 1080p, it has to de-interlace its usual 60 Hz (60-field) signal back down to 24 fps. Only some of the latest TVs accept a 1080p input as well....the older 1080p sets received a 1080i/60 signal and then converted it to 1080p internally. So in practice the differences between 1080i and 1080p are perceptually very minor with movies....it's the processors and de-interlacers used that matter most.
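To see why a 24 fps movie ends up inside a 60-field signal in the first place, here's a minimal sketch of the standard 3:2 pulldown cadence (the repeating pattern a TV's de-interlacer has to detect and reverse to get back to 24p). The function name and frame labels are my own for illustration, not from any real player or TV firmware:

```python
def pulldown_3_2(frames):
    """Spread film frames across interlaced fields using the 3:2 cadence:
    each odd-numbered frame is held for 3 fields, each even one for 2."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2  # alternate 3 fields, then 2
        fields.extend([frame] * repeats)
    return fields

# 4 film frames become 10 fields, so 24 frames/sec become 60 fields/sec
print(pulldown_3_2(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

A good de-interlacer spots this repeating 3-2-3-2 pattern in a 1080i/60 signal and reassembles the original 24 whole frames per second, which is why the quality of the processor matters more than whether the set accepts 1080i or 1080p.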
 
