Quote:
Originally Posted by GAD
right...
|
It's true... on a still image, 1080i and 1080p will look exactly the same regardless of frame rate.
Quote:
Originally Posted by GAD
First off, I've never seen a true 1080p source - have you? I'm not being snarky - I really don't know of any to compare. Even my HD-DVD player plays games with interlacing (See my previous link on the XA2).
If the signal at 1080p/30 is interlaced to 1080i/60, then sent, then de-interlaced back to 1080p/30, what's the difference? I'd agree that a 1080i screen would be icky, but there aren't any 1080i screens these days - 1080i gets converted back to 1080p for display since the technology for most flatscreens is inherently progressive. If the amount of data is the same, and my TV de-interlaces back to 1080p/30, what's the difference?
On a 1080i screen, I'd agree with you that 1080p would be much better.
|
I think we're more in agreement than it first seemed.
I have seen NBA07 on a PS3 on a 46-inch 1080p Sony XBR5. It's supposedly 1080p/60. It definitely had some smooooth graphics... though I'm not sure how much smoother than a full constant 30fps, since that's rare in the real world unless it's a movie.
Assuming you have a TV that de-interlaces 1080i/60 into 1080p/30, it's going to look just like it would if it had started as 1080p/30, since the set simply plops the two half-scans of each frame back into one. A 1080p/30 signal converted to 1080i/60 and displayed natively will twitter, since there's no way around putting half of a new frame on the screen without the other half.
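To make the "plop the two half-scans back into one" idea concrete, here's a toy Python sketch of weave de-interlacing (the line labels and function name are my own, just for illustration). When both fields come from the same source frame, weaving them is lossless, and the pixel-rate arithmetic shows why 1080i/60 carries the same amount of data as 1080p/30:

```python
def weave(top_field, bottom_field):
    """Interleave a 540-line top field and a 540-line bottom field
    into one 1080-line progressive frame (toy version with strings)."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)     # even scanline from the top field
        frame.append(bottom_line)  # odd scanline from the bottom field
    return frame

# Toy 4-line "frame": each field carries every other scanline.
top = ["line0", "line2"]
bottom = ["line1", "line3"]
print(weave(top, bottom))  # ['line0', 'line1', 'line2', 'line3']

# Same pixel rate either way: 30 full frames vs. 60 half-height fields.
assert 1920 * 1080 * 30 == 1920 * 540 * 60
```

The twitter problem only shows up when the two fields come from *different* moments in time (true 60-field interlaced video), because then the weave stitches together two half-pictures that don't match.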
Funny thing is... I have a KV-34HS420, which is a hi-scan native 1080i set. It's small enough that I haven't really noticed any nasty twittering, and I've been extremely happy with the image quality even during fast action stuff... but it does actually display 1080i. My in-laws have an older 65-inch 'big screen' HD set that also displays 1080i. I HAVE noticed twittering on that, and if I were buying a TV that size I wouldn't want one that displays 1080i natively.
Quote:
Originally Posted by GAD
Ahh, but I love my LCoS TV
|
Indeed... looking forward to a nice new SXRD Sony sometime in the near future.
Quote:
Originally Posted by GAD
This, IMO is the biggest thing most consumers are missing in the whole HD mess. I'd be interested to know if George Lucas shot the last Star Wars in 1080p or better. Many of the HD players are only now coming to terms with a 1080/24p mode.
|
Yeah, I would assume 1080p/60 will start becoming popular for TV shows and other things not shot on film cameras.
Maybe we need to be more specific about our arguments. 1080i/60 on a native 1080i set will have twittering or tearing issues. 1080i/60 on a set that actually shows 1080p/30 will not have those issues, since it just de-interlaces the fields into single full frames.
1080p/30 on a set that displays 1080i will still have the same tearing issues, since the set has to interlace the signal. 1080p/30 on a set that displays 1080p/30 obviously will not have those problems. 1080p/60 on a 1080i/60 set is impossible without chopping frames or mushing them together somehow. 1080p/60 is superior to 1080p/30 only because it has smoother playback.
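The combinations above can be summed up in a little lookup table. This is just my shorthand restatement of the cases, not standard terminology:

```python
# Source format x display type -> what you see (informal labels).
ARTIFACTS = {
    ("1080i/60", "1080i display"):    "twitter/tearing on motion",
    ("1080i/60", "1080p/30 display"): "clean (weave de-interlace)",
    ("1080p/30", "1080i display"):    "twitter/tearing (re-interlaced)",
    ("1080p/30", "1080p/30 display"): "clean",
    ("1080p/60", "1080i display"):    "impossible without dropping or blending frames",
    ("1080p/60", "1080p/60 display"): "clean, smoothest motion",
}

for (source, display), result in ARTIFACTS.items():
    print(f"{source} -> {display}: {result}")
```

The pattern is that artifacts appear whenever an interlace or de-interlace conversion has to happen between mismatched source and display, not from the signal format by itself.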
Now the real argument is whether 1080p/60 is useful right now. I'd say no... movies are almost all shot at 24fps, TV broadcasts carry 1080p/30 worth of data at best, the action needs to be very fast and the TV fairly large to even notice a problem with 1080i/60, and most TVs already de-interlace to 1080p/30 anyway.
ALSO, 720p is good enough for a smaller set, or a large one at a sufficient viewing distance, since you can't resolve the extra pixels anyway.
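You can put rough numbers on that last point. A common rule of thumb is that 20/20 vision resolves about 1 arcminute, so here's a back-of-the-envelope Python sketch (the function name and the 46-inch example are mine) estimating the distance beyond which individual pixels disappear on a 16:9 set:

```python
import math

def max_useful_distance(diagonal_in, rows):
    """Distance in inches at which one pixel subtends 1 arcminute
    on a 16:9 screen with the given diagonal and vertical resolution.
    Beyond this distance, finer pixels buy you nothing visible."""
    height = diagonal_in * 9 / math.hypot(16, 9)  # screen height in inches
    pixel_pitch = height / rows                   # height of one pixel
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcmin)

for rows in (720, 1080):
    d = max_useful_distance(46, rows)
    print(f"46-inch set at {rows}p: pixels vanish beyond ~{d / 12:.1f} ft")
```

On a 46-inch set this works out to roughly 9 feet for 720p and 6 feet for 1080p, so if your couch is 10 feet back, the extra 1080 lines are wasted on that screen size.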
Dan