Patrickhat2001
I see this statement all the time now--"Many people find standard definition signals to look better on SD televisions than HD televisions." But what, exactly, are people referring to when they say standard definition signals? Are they referring exclusively to television signals (sent over air/cable/satellite), which usually arrive via a low quality coaxial (RF) cable? Or are they also referring to other so-called standard definition signals such as game consoles (especially when they do not have progressive scan enabled--not all titles support progressive scan), which can be hooked up via superior S-Video/component cables? How about older game consoles--like those of the 32/64-bit generation (PS1, N64, Sat) or even :gasp: the 16-bit generation (SNES, Gen)? Do those look better on an HD television or a standard one?
To say nothing of DVD, of course, which I must admit I'm a little confused about as well. I have a DVD player that supports progressive scan, but would it actually improve the picture if the image is displayed on a 1080i set (such as a direct view HD CRT)? After all, why have the DVD player deinterlace the signal when it's just going to be interlaced again by the monitor?
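To make my confusion concrete, here is my rough mental model of that chain, laid out as a quick Python back-of-the-envelope. The line and field counts are the standard NTSC/HD numbers; the scaling and re-interlacing step is just my guess at what the set does internally, so correct me if I have it wrong:

# My rough mental model of the DVD-to-1080i chain, laid out as arithmetic.
# Line counts are the standard NTSC/HD ones; the scale/re-interlace step is my assumption.

DVD_LINES = 480      # DVDs store 480-line frames
FIELD_RATE = 60      # interlaced NTSC output: 60 fields per second
SET_LINES = 1080     # a 1080i direct view set draws 1080 lines, two fields per frame

lines_per_480i_field = DVD_LINES // 2     # 240 lines per field off an interlaced output
lines_per_480p_frame = DVD_LINES          # 480 lines per frame off a progressive output
scale_factor = SET_LINES / DVD_LINES      # 2.25x upscale either way
lines_per_1080i_field = SET_LINES // 2    # 540 lines per field actually drawn by the set

print(f"480i: {lines_per_480i_field}-line fields at {FIELD_RATE} fields/s")
print(f"480p: {lines_per_480p_frame}-line frames from the player's deinterlacer")
print(f"Either way the set scales by {scale_factor}x and draws {lines_per_1080i_field}-line fields")

If that picture is right, the player's deinterlacing looks redundant on a 1080i-only set, and the real question is whose scaler/deinterlacer does a better job--the player's or the television's.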
I realize there can be a lot of variance here--after all, there are only a few technologies for standard definition sets (mostly revolving around the cathode ray tube), whereas HD (and ED) sets run the gamut of possible technologies (LCD, plasma, CRT, LCoS, DLP, etc.), which can differ in their ability to display lower resolution analog signals. Still, any advice is appreciated. I'm sorely in need of a new television and I'm trying to decide if I want to make the leap to an HD set (most likely a direct view CRT) or if I might be better off buying a cheap analog set to hold me over for a few years (and provide legacy support for low resolution signals in the future, if need be).