December 2nd, 2008, 04:38 AM | #1
Major Player
Join Date: Oct 2004
Location: London, UK
Posts: 243
Why can some HD TVs display 1080i but not 1080p?
A friend asked me this question last night. I couldn't answer and it's really bugging me!
Some (consumer) LCD "HD-ready" TVs claim to be able to handle a 1080i input but they can't handle a 1080p input. Why is this? Why can't they handle 1080p if they can handle 1080i? Surely 1080/50i and 1080/25p both require the same bandwidth? And surely it's less computationally expensive to downsample from 1080/25p to the TV's native resolution than to downsample from 1080/50i (because in order to properly downsample an interlaced image you first have to de-interlace to produce a 1080/25p signal and then downsample the progressive-scan image). Of course, most "HD-ready" TVs don't have a native 1920x1080 liquid crystal panel; instead they have something like 1366 x 768. Is the answer that these "HD-ready" displays actually don't deinterlace and then downsample an interlaced input; instead they just throw away one of the fields to end up with a 1920x540 signal??? Many thanks, Jack |
December 2nd, 2008, 08:08 AM | #2
Inner Circle
Join Date: Aug 2006
Location: Efland NC, USA
Posts: 2,322
Because 1080p isn't part of the ATSC interface standard.
__________________
http://www.LandYachtMedia.com