Trying to get 1080i out of my TV, a few questions.
Sep 5, 2007 at 2:27 PM Post #16 of 24
Quote:

Originally Posted by MontyPythizzle
480P is the greatest improvement in about all of the formats, as it goes from 256 colors of 480i and doubles it to 512 of 480p :p


Do you mean grayscale, as in 256 steps of grayscale? With 256 steps per color channel you can reproduce about 16.7 million colors, while the human eye is only said to distinguish around 16 million.

Even if that's what you meant, the comment doesn't make much sense: the source is the same whether it's interlaced or progressive, so the number of available colors doesn't change.
 
Sep 5, 2007 at 9:49 PM Post #17 of 24
Quote:

Originally Posted by Gaara
Do you mean grayscale, as in 256 steps of grayscale? With 256 steps per color channel you can reproduce about 16.7 million colors, while the human eye is only said to distinguish around 16 million.

Even if that's what you meant, the comment doesn't make much sense: the source is the same whether it's interlaced or progressive, so the number of available colors doesn't change.



Yeah, what he said didn't make any sense. The only difference between 480i and 480p is that 480p no longer interlaces when drawing. Interlacing means the set draws every other line on one pass, then the remaining lines (again skipping every other) on the next pass. 480p draws every line on every pass. That essentially doubles the frame rate and also gets rid of "twittering," which makes diagonal lines look like they're tearing when they move.
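If it helps to see that concretely, here's a rough Python sketch (purely illustrative, nothing like how a real TV is actually driven) of which lines each pass touches:

Code:

LINES = 480  # visible scan lines in both 480i and 480p

def interlaced_passes():
    """480i: each pass (a 'field') draws every other line."""
    even_field = list(range(0, LINES, 2))  # pass 1: lines 0, 2, 4, ...
    odd_field = list(range(1, LINES, 2))   # pass 2: lines 1, 3, 5, ...
    return [even_field, odd_field]

def progressive_pass():
    """480p: every pass (a full frame) draws every line."""
    return list(range(LINES))

print(len(interlaced_passes()[0]))  # 240 lines per interlaced pass
print(len(progressive_pass()))      # 480 lines per progressive pass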

1080i is interlaced as well... but there are far more lines, and it takes faster movement to notice anything. It's also why sporting events are normally (or at least SHOULD be) broadcast in 720p instead: it handles fast motion better.
 
Sep 5, 2007 at 10:18 PM Post #18 of 24
Remember, 1080p has twice the resolution per frame and half the frame rate of 1080i.

Most HD DVDs and Blu-ray discs that say 1080p reportedly interlace the signal and then rebuild it:

http://hddvd.highdefdigest.com/hreview_hdxa2030207.html

Check out the section "1080p - miracle feature or hype".

Most HDTVs are 1080p internally (flatscreens are progressive by their nature).

1080p as we know it is 1080p/30, and that's not a big deal over 1080i because 1080i is 1080i/60. In the future, something like 1080p/60 would be a major improvement:

From http://en.wikipedia.org/wiki/1080p

Quote:

Due to bandwidth limitations of broadcast frequencies, the ATSC and DVB have standardized only the frame rates of 24, 25, and 30 frames per second (1080p24, 1080p25, 1080p30). Higher frame rates, such as 1080p50 and 1080p60, could only be sent over normal-bandwidth channels if a more advanced codec (such as H.264/MPEG-4 AVC) were to be used. Higher frame rates such as 1080p50 and 1080p60 are foreseen as the future broadcasting standard for production
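To put that bandwidth limitation in perspective, here's some rough math (assuming 8-bit 4:2:0 video and the roughly 19.4 Mbit/s payload of an ATSC channel; treat the numbers as ballpark):

Code:

# Why 1080p60 doesn't fit a broadcast channel without a better codec.
# Assumes 8-bit 4:2:0 video (1.5 bytes per pixel) and a ~19.4 Mbit/s
# ATSC channel payload.

raw_bits = 1920 * 1080 * 1.5 * 8 * 60  # raw 1080p60, bits per second
channel_bits = 19.4e6                  # broadcast channel capacity, bits/s

print(raw_bits / 1e9)                  # ~1.49 Gbit/s raw
print(round(raw_bits / channel_bits))  # ~77:1 compression needed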


Sorry for the choppy text - one arm and all...

GAD
 
Sep 5, 2007 at 10:43 PM Post #19 of 24
Quote:

Originally Posted by GAD
Remember, 1080p has twice the resolution per frame and half the frame rate of 1080i.

Most HD DVDs and Blu-ray discs that say 1080p reportedly interlace the signal and then rebuild it:

http://hddvd.highdefdigest.com/hreview_hdxa2030207.html

Check out the section "1080p - miracle feature or hype".

Most HDTVs are 1080p internally (flatscreens are progressive by their nature).

1080p as we know it is 1080p/30, and that's not a big deal over 1080i because 1080i is 1080i/60. In the future, something like 1080p/60 would be a major improvement:

GAD



That's sort of true, but the statement is misleading: the visible resolution of 1080i and 1080p is identical. The resolution of each 1080i field is half that of a 1080p frame, since a field fills in only 540 lines and leaves the interleaved 540 lines showing the previous field. Sent 60 times a second, that uses the same bandwidth as 1080p/30, which fills in every line 30 times a second. But what you can physically see on the screen is nearly identical.
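Quick back-of-the-envelope math on the bandwidth point, counting raw pixels only and ignoring compression and color encoding:

Code:

WIDTH, HEIGHT = 1920, 1080

i60 = WIDTH * (HEIGHT // 2) * 60  # 1080i/60: 540-line fields, 60 a second
p30 = WIDTH * HEIGHT * 30         # 1080p/30: full frames, 30 a second
p60 = WIDTH * HEIGHT * 60         # 1080p/60: full frames, 60 a second

print(i60 == p30)  # True: same raw pixel rate
print(p60 // p30)  # 2: a true 1080p/60 stream doubles it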

However, the negative effects of 1080i/60 outweigh the slower frame rate of 1080p/30: in fast-motion video at 1080i/60 you can see twittering or slicing of diagonal lines, where at 1080p/30 you won't.

I'd also argue that the pixel response times of most LCD TVs are slower than either format can keep up with under hard color changes, so you'll see the effects of the slow pixels well before the tearing of 1080i.


Another thing to consider: most films are shot at 24fps, which basically makes the difference between 1080i and 1080p useless. Essentially, 1080i/48 can fill every line before the film frame changes, so you won't see any twittering. I'm not sure if any newer films are shot at 1080p/60, although it's likely some will be soon enough...
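A toy sketch of why that works - both fields of each interlaced pair come from the same film frame, so nothing moves between them (made-up frame labels, obviously):

Code:

def film_to_fields(film_frames):
    """Split 24fps film frames into interlaced field pairs (1080i/48-style)."""
    fields = []
    for frame in film_frames:
        fields.append((frame, "even lines"))
        fields.append((frame, "odd lines"))  # same source frame as above
    return fields

print(film_to_fields(["frame0", "frame1"]))
# [('frame0', 'even lines'), ('frame0', 'odd lines'),
#  ('frame1', 'even lines'), ('frame1', 'odd lines')]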

Video games, on the other hand, running at 1080p/60 could definitely have an advantage over 1080i/60 or even 1080p/30.
 
Sep 6, 2007 at 12:42 AM Post #20 of 24
I believe 720p has about 1 million pixels and 1080p has about 2 million. So 720p is almost equal to 1080i.
 
Sep 6, 2007 at 12:47 AM Post #21 of 24
Quote:

Originally Posted by immtbiker
I believe 720p has about 1 million pixels and 1080p has about 2 million. So 720p is almost equal to 1080i.


1080i and 1080p are the same resolution, just delivered differently. Put 720p and 1080i side by side and they're quite apparently different.
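The actual arithmetic, for anyone who wants to check the "almost equal" claim (keeping in mind a 1080i frame is the same 1920x1080 grid as 1080p, just delivered as two fields):

Code:

p720 = 1280 * 720    # 921,600 pixels (~0.9 million)
p1080 = 1920 * 1080  # 2,073,600 pixels (~2.1 million)

print(p720, p1080)
print(round(p1080 / p720, 2))  # 2.25: over twice the pixels of 720p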

GAD
 
Sep 6, 2007 at 1:06 AM Post #22 of 24
Quote:

Originally Posted by dan1son
That's sort of true, but the statement is misleading: the visible resolution of 1080i and 1080p is identical. The resolution of each 1080i field is half that of a 1080p frame, since a field fills in only 540 lines and leaves the interleaved 540 lines showing the previous field. Sent 60 times a second, that uses the same bandwidth as 1080p/30, which fills in every line 30 times a second. But what you can physically see on the screen is nearly identical.


right...

Quote:

Originally Posted by dan1son
However, the negative effects of 1080i/60 outweigh the slower frame rate of 1080p/30: in fast-motion video at 1080i/60 you can see twittering or slicing of diagonal lines, where at 1080p/30 you won't.


First off, I've never seen a true 1080p source - have you? I'm not being snarky - I really don't know of any to compare. Even my HD DVD player plays games with interlacing (see my previous link on the XA2).

If the signal at 1080p/30 is interlaced to 1080i/60, then sent, then de-interlaced back to 1080p/30, what's the difference? I'd agree that a 1080i screen would be icky, but there aren't any 1080i screens these days - 1080i gets converted back to 1080p for display, since the technology in most flatscreens is inherently progressive. If the amount of data is the same and my TV de-interlaces back to 1080p/30, what's the difference?
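Here's a toy Python round trip (fake frame data, not a real video pipeline) showing that a weave de-interlace gets back exactly what was split, as long as both fields came from the same frame:

Code:

frame = ["line%d" % n for n in range(1080)]  # a fake progressive frame

# Interlace: split the frame into even and odd fields for transmission.
even = frame[0::2]
odd = frame[1::2]

# De-interlace (weave): interleave the two fields back into one frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = even
rebuilt[1::2] = odd

print(rebuilt == frame)  # True: identical, as long as nothing moved
                         # between the two fields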

On a 1080i screen, I'd agree with you that 1080p would be much better.

Quote:

Originally Posted by dan1son
I'd also argue that the pixel response times of most LCD TVs are slower than either format can keep up with under hard color changes, so you'll see the effects of the slow pixels well before the tearing of 1080i.


Ahh, but I love my LCoS TV. :)


Quote:

Originally Posted by dan1son
Another thing to consider: most films are shot at 24fps, which basically makes the difference between 1080i and 1080p useless. Essentially, 1080i/48 can fill every line before the film frame changes, so you won't see any twittering. I'm not sure if any newer films are shot at 1080p/60, although it's likely some will be soon enough...


This, IMO, is the biggest thing most consumers are missing in the whole HD mess. I'd be interested to know whether George Lucas shot the last Star Wars at 1080p or better. Many of the HD players are only now coming to terms with a 1080/24p mode.

Quote:

Originally Posted by dan1son
Video games, on the other hand, running at 1080p/60 could definitely have an advantage over 1080i/60 or even 1080p/30.


Agreed.

GAD
 
Sep 6, 2007 at 1:53 AM Post #23 of 24
1080p is a video format that is probably geared more toward playback of 24p movies. The higher definition that "film" gets worked at is usually called 2K; that's been the standard for a bit. There are higher levels of quality available as well, but those are mainly for production and post-production use.

There is a lot of data that gets thrown away in the process of delivering it to your home: chroma subsampling, non-square pixels, and all sorts of creative ways to squeeze the data stream down on its way to the end user.
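As one concrete example - assuming the common 4:2:0 scheme here, individual formats vary - chroma subsampling alone cuts the raw sample count in half:

Code:

w, h = 1920, 1080
full_color = w * h * 3                    # 3 samples per pixel (4:4:4)
yuv420 = w * h + 2 * (w // 2) * (h // 2)  # full-res Y + quarter-res Cb, Cr

print(full_color, yuv420)
print(round(yuv420 / full_color, 2))      # 0.5: half the raw samples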

A really interesting and smoking-hot camera that everyone was/is going apepoop over in the industry is called the RED: http://www.red.com/
It may just change the way movies get made; check it out.

*edit: sorry, this has zero to do with getting the DVD player out to the monitor.
 
Sep 6, 2007 at 2:10 AM Post #24 of 24
Quote:

Originally Posted by GAD
right...


It's true... 1080i and 1080p will look exactly the same on a still image, regardless of frame rates. :)


Quote:

Originally Posted by GAD
First off, I've never seen a true 1080p source - have you? I'm not being snarky - I really don't know of any to compare. Even my HD DVD player plays games with interlacing (see my previous link on the XA2).

If the signal at 1080p/30 is interlaced to 1080i/60, then sent, then de-interlaced back to 1080p/30, what's the difference? I'd agree that a 1080i screen would be icky, but there aren't any 1080i screens these days - 1080i gets converted back to 1080p for display, since the technology in most flatscreens is inherently progressive. If the amount of data is the same and my TV de-interlaces back to 1080p/30, what's the difference?

On a 1080i screen, I'd agree with you that 1080p would be much better.



I think we're more in agreement than it first seemed.

I have seen NBA 07 on a PS3 on a 1080p 46-inch Sony XBR5. It's supposedly 1080p/60. It definitely had some smooooth graphics... Not sure how much smoother than a full, constant 30fps though, since that's rare in the real world unless it's a movie.

Assuming you have a TV that de-interlaces 1080i/60 into 1080p/30, it's going to look just like it would if it had started as 1080p/30, since the set just plops the two half-scans of every frame into one. A 1080p/30 signal converted to 1080i/60 on an interlaced display will twitter, since there's no way to get half of a new frame on the screen without the other half still showing the old one.

Funny thing is... I have a KV-34HS420, which is a hi-scan, 1080i-native set. It's small enough that I haven't really noticed any nasty twittering, and I've been extremely happy with the image quality even during fast-action stuff... but it does actually display 1080i. My in-laws have an older 65-inch 'big screen' HD set that displays 1080i. I HAVE noticed twittering on that, and if I were buying a TV that size I wouldn't want one that displays 1080i natively.

Quote:

Originally Posted by GAD
Ahh, but I love my LCoS TV. :)



Indeed... looking forward to a nice new SXRD Sony sometime in the near future.

Quote:

Originally Posted by GAD
This, IMO, is the biggest thing most consumers are missing in the whole HD mess. I'd be interested to know whether George Lucas shot the last Star Wars at 1080p or better. Many of the HD players are only now coming to terms with a 1080/24p mode.


Yeah, I would assume 1080p/60 will start becoming popular for TV shows and other things not shot on film cameras.

Maybe we need to be more specific about our arguments. 1080i/60 on a 1080i set will have twittering or tearing issues. 1080i/60 on a set that actually shows 1080p/30 will not have those issues, since the set just de-interlaces the fields into single full frames.

1080p/30 on a set that displays 1080i will still have the same tearing issues, since the signal has to be interlaced for display. 1080p/30 on a set that displays 1080p/30 obviously will not have those problems. 1080p/60 on a 1080i/60 set is impossible without chopping frames or mushing them together somehow. 1080p/60 is superior to 1080p/30 only because it has smoother playback.

Now the real argument is whether 1080p/60 is useful right now. I'd say no... Movies are almost all shot at 24fps, TV is capable of 1080p/30 at best, the action needs to be very fast and the TV fairly large to even notice a problem with 1080i/60, and most TVs already de-interlace to 1080p/30 anyway.

ALSO: 720p is good enough for a smaller set, or a large one at a far enough viewing distance, since you can't see the extra pixels anyway.
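A rough sanity check on that, assuming the usual 20/20-vision figure of about 1 arcminute of resolving power (the screen size and distance are just example numbers):

Code:

import math

def pixels_visible(screen_width_in, horiz_pixels, distance_in):
    """True if one pixel subtends at least ~1 arcminute at this distance."""
    pixel_size = screen_width_in / horiz_pixels
    arcmin = math.degrees(math.atan(pixel_size / distance_in)) * 60
    return arcmin >= 1.0

# A screen roughly 40 inches wide (about a 46" 16:9 set) seen from 8 feet:
print(pixels_visible(40, 1280, 96))  # True: 720p pixels still resolvable
print(pixels_visible(40, 1920, 96))  # False: 1080p's extra detail is wasted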

Dan
 
