QUOTE(Lifter @ Nov 11 2006, 02:49 PM)
Get a clue people.....snip......what that means is that whether you play the 1080i version or the 1080p version - it makes no difference whatsoever. None. Zip. Nada. Just like the article says. Play it back at 1080i, and your digital 1080 TV set is just converting it to 1080p at 30fps. Since the movie is only 24fps, you lose absolutely nothing. Full resolution. Full framerate. All from a 1080i source signal. A 1080p source of MI3 gives no advantage.
In fact, it is you who needs to get a clue. Too many of you people are not professionals in these fields, yet you try to talk like you are. You have a superficial understanding of a complex field.
If you had studied this field instead of shooting off your mouth about things you don't understand, you would realise that it is impossible to perfectly convert interlaced fields back into full progressive frames in every case. Even the best consumer-level US$3000 video processors still leave interlace artifacts in certain situations.
Not having to do interlaced-to-progressive conversion at all, because the source is the original 1080p 24fps master, is a distinct advantage: it eliminates any interlace artifacting. The rough sketch below shows where that conversion can go wrong.
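To make the point concrete, here is a rough sketch (Python, with made-up frame labels; this is an illustration of 2:3 pulldown and weave deinterlacing, not any particular processor's algorithm) of how 24fps film frames get split into fields for 1080i carriage and why a deinterlacer only recovers the original frames perfectly when it correctly locks onto the cadence:

frames_24p = ["A", "B", "C", "D"]   # four film frames, 1/6 of a second of 24fps material

def pulldown_2_3(frames):
    # Each film frame is spread across 2 fields, then 3, then 2, ... (the 2:3 cadence),
    # turning 24 frames/sec into 60 fields/sec for 1080i carriage.
    fields = []
    parity = 0                       # 0 = top field, 1 = bottom field
    for i, frame in enumerate(frames):
        for _ in range(2 if i % 2 == 0 else 3):
            fields.append((frame, parity))
            parity ^= 1
    return fields

def naive_weave(fields):
    # Pair adjacent fields into frames with no cadence detection at all.
    out = []
    for top, bottom in zip(fields[0::2], fields[1::2]):
        if top[0] == bottom[0]:
            out.append(top[0])                                    # clean progressive frame
        else:
            out.append(top[0] + "+" + bottom[0] + " (combing)")   # fields from two different frames
    return out

fields_60i = pulldown_2_3(frames_24p)   # 10 fields for these 4 frames
print(naive_weave(fields_60i))          # ['A', 'B', 'B+C (combing)', 'C+D (combing)', 'D']

# A cadence-aware deinterlacer would instead pair fields (0,1), (2,3), (5,6), (7,8)
# and drop the repeated fields 4 and 9, recovering A, B, C, D exactly -- but only
# when that 2:3 detection actually succeeds, which is exactly where real processors slip up.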
Amateur.
QUOTE(unclepauly @ Nov 11 2006, 01:01 PM)
1080p IS the holy grail of hdtv. It's basically the most perfect image you are going to get for the next 10yrs(maybe less). I agree it is not a gigantic leap over 1080i, but over 720p? It's 2x the pixels... The more you can see the better. I think this study is just trying to do some damage control for somebody.
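For what it's worth, the pixel arithmetic behind that claim (plain numbers, nothing vendor-specific):

pixels_1080p = 1920 * 1080           # 2,073,600 pixels
pixels_720p  = 1280 * 720            #   921,600 pixels
print(pixels_1080p / pixels_720p)    # 2.25 -- so "2x the pixels" actually understates it slightly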
This is true, but contrast and black levels have improved in later generations of LCoS / LCD / plasma.
SED and Laser will take over eventually.
CRTs are practical at 60" screen sizes, which for most viewing distances is ideal for 1080 sources. Think of the THX recommended viewing angle; a rough viewing-distance calculation is sketched below.
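As a back-of-the-envelope for that last point (the 36-40 degree range is my stand-in for the commonly quoted THX viewing-angle recommendation, and the 60-inch size is just the example above, so treat the exact figures as assumptions):

import math

def viewing_distance(diagonal_in, angle_deg):
    # Distance at which a 16:9 screen of the given diagonal fills the given
    # horizontal viewing angle.
    width = diagonal_in * 16 / math.hypot(16, 9)          # 16:9 screen width in inches
    return (width / 2) / math.tan(math.radians(angle_deg) / 2)

for angle in (36, 40):
    d = viewing_distance(60, angle)
    print(f"{angle} degrees -> sit about {d:.0f} in ({d / 12:.1f} ft) from a 60-inch screen")

That works out to roughly six to seven feet, which is the sort of viewing distance the 1080-source comment above is getting at.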