Randall Leong
December 19th, 2009, 01:14 PM
I'm still struggling to understand why people think progressive recording is visibly superior for normal HD viewing on an HDTV. The 1080i recordings from the Sony cams look spectacular on a Sony 46" 60Hz TV, and I understand that the newer 120Hz TVs do all sorts of clever interpolation, so the video might look even better there. I can't really tell the difference in playback quality between a PC (nominally progressive) and the HDTV (nominally interlaced), except that the former is a much smaller screen and isn't my target output device in practice. So that rules out the only source of regret I might have about using a 1080i cam.
I've also heard that some of the advertised 1080p cams aren't actually capturing in that mode; they just interpolate internally and output the result as 1080p.
I understand the technological difference but wonder how much it really matters at this point. For example, I definitely don't buy that a 720p recording looks superior to a 1080i one on my TV. So is the difference something the average consumer can see on an average HDTV today, or is it a future-proofing consideration except for people trying to emulate the film cameras used for movies?
The problem there is that all LCD and plasma HDTVs currently on the market are natively progressive--and they deinterlace interlaced signals with varying degrees of quality depending on the set. Only CRT HDTVs (now extremely rare) and some HD projectors are natively interlaced.
On the other hand, Canon's so-called "1080p" modes are really 1080p streams embedded inside a 1080i container. Thus, they are read as 1080i video by any program that cannot properly remove the pulldown encoded in the stream.
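To make the container point concrete, here is a minimal toy sketch (not Canon's actual format, just an illustration of the principle) showing why progressive frames stored as field pairs can be recovered losslessly, while true interlaced fields sampled at different moments cannot be woven back into a single clean frame:

```python
import numpy as np

# Two toy 8x8 "frames" of a moving vertical edge at two moments in time.
h, w = 8, 8
t0 = np.zeros((h, w), dtype=np.uint8); t0[:, :4] = 255  # edge at column 4
t1 = np.zeros((h, w), dtype=np.uint8); t1[:, :5] = 255  # edge has moved to column 5

def split_fields(frame):
    """Top field = even scan lines, bottom field = odd scan lines."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into a full-height frame."""
    frame = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), dtype=top.dtype)
    frame[0::2] = top
    frame[1::2] = bottom
    return frame

# "Progressive in an interlaced container": both fields come from the same
# instant (t0), so weaving them reconstructs the original frame exactly.
psf_top, psf_bottom = split_fields(t0)
assert np.array_equal(weave(psf_top, psf_bottom), t0)

# True interlace: top field from t0, bottom field from t1. Weaving mixes two
# moments in time, producing the combing a deinterlacer has to clean up.
true_top, _ = split_fields(t0)
_, true_bottom = split_fields(t1)
woven = weave(true_top, true_bottom)
print(np.array_equal(woven, t0), np.array_equal(woven, t1))  # False False
```

The upshot: a player that recognizes the fields as a progressive pair can simply weave them (lossless), but one that treats the stream as genuine 1080i will run a deinterlacer on it and needlessly soften the image.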