Robert Bobson
November 30th, 2008, 10:23 AM
As I understand it, you get the best video quality when the Source Resolution matches the Native Resolution of the display.
If a TV says its Native Resolution is 1080p, does that mean it takes a 1080i60 signal and deinterlaces it to display 1080p60? If so, doesn't that degrade the 1080i signal by interpolating each field into a full frame - resulting in only 960 x 540 of TRUE RESOLUTION?
(why is this all so confusing? maybe I should give up huffing paint!)
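Roughly what I mean, in a quick Python/NumPy sketch of "bob" deinterlacing (the numbers and function names are just for illustration - I'm not claiming any particular TV works this way): each 540-line field gets stretched up to a full 1080-line frame, so every output frame is built from only 540 real lines of picture.

import numpy as np

def split_fields(interlaced_frame: np.ndarray):
    """Split a 1080x1920 interlaced frame into its two 540-line fields."""
    top_field = interlaced_frame[0::2, :]      # even lines (540 x 1920)
    bottom_field = interlaced_frame[1::2, :]   # odd lines  (540 x 1920)
    return top_field, bottom_field

def bob_deinterlace(field: np.ndarray) -> np.ndarray:
    """Interpolate one 540-line field up to a 1080-line progressive frame."""
    h, w = field.shape
    frame = np.empty((h * 2, w), dtype=field.dtype)
    frame[0::2, :] = field                     # copy the real lines
    # fill the missing lines by averaging the neighbours above and below
    frame[1:-1:2, :] = (field[:-1, :].astype(np.float32) +
                        field[1:, :].astype(np.float32)) / 2
    frame[-1, :] = field[-1, :]                # bottom line: just repeat
    return frame

# 60 interlaced fields per second in -> 60 progressive frames per second out,
# but each output frame only ever contained 540 lines of original detail.
interlaced = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
top, bottom = split_fields(interlaced)
print(bob_deinterlace(top).shape)   # (1080, 1920)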