April 3rd, 2008, 03:36 PM | #1 |
Major Player
Join Date: Apr 2006
Location: Boston, Massachusetts
Posts: 616
Why does 1080i exist?
I've been told that 720p/1080i TVs have a native resolution of 1366x768. So it seems pointless to view something at 1080i, because the TV has to downscale it to the panel's resolution and de-interlace it as well. If LCD TVs have to de-interlace, then why are we still making interlaced video? Is 720p better than 1080i on a monitor with a native resolution of 1366x768? (I have the option to play Blu-ray in either 720p or 1080i - this is why I ask.)
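To put rough numbers on the two player settings, here is a back-of-the-envelope sketch (my own illustration; the 1366x768 panel size is the assumption being discussed, and real players and TVs do this internally):

```python
# Hypothetical comparison of the two output settings into an assumed 1366x768 panel.
panel_w, panel_h = 1366, 768

sources = {
    "720p  (1280x720)":  (1280, 720),    # progressive: the TV only scales up
    "1080i (1920x1080)": (1920, 1080),   # interlaced: deinterlace first, then scale down
}

for name, (w, h) in sources.items():
    # Linear scale factor the TV's scaler has to apply on each axis.
    sx, sy = panel_w / w, panel_h / h
    print(f"{name}: scale x{sx:.2f} horizontally, x{sy:.2f} vertically")

# Typical output:
# 720p  (1280x720):  scale x1.07 horizontally, x1.07 vertically
# 1080i (1920x1080): scale x0.71 horizontally, x0.71 vertically
```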
April 3rd, 2008, 04:52 PM | #2 |
Major Player
Join Date: Jan 2007
Location: Portland, OR
Posts: 949
|
Quote:
Other people do because of the limitations of their camera (e.g. most cameras lack 60p), their skill (can't hold the camera steady enough for 30p), subject matter (can't always get it right the first time in sports), or delivery requirements.

Generally, yes. Whichever one has the better deinterlacing and resizing algorithms.
April 3rd, 2008, 05:17 PM | #3 |
Major Player
Join Date: Oct 2005
Location: Hillsborough, NC, USA
Posts: 968
|
30 frames per second looks jerky; 60 frames per second doesn't. But 60 full frames per second requires a lot of bandwidth, while 60 fields per second requires the same amount as 30 frames per second. At the same bandwidth, progressive vs. interlaced therefore comes down to jerky vs. smooth, which is why interlacing was originally invented in the early 20th century. Once 1080p/60 camcorders become affordable to the average consumer, interlacing will eventually disappear.
The same arguments apply for 25 vs. 50. Also, the claim that 720p/1080i TVs are all 1366 x 768 isn't true: native 1080p TVs (1920 x 1080) can display 720p and 1080i as well.
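To make the bandwidth point concrete, a quick back-of-the-envelope calculation (uncompressed luma samples only, ignoring blanking, chroma and bit depth; purely illustrative):

```python
def raw_samples_per_second(width, lines_per_picture, pictures_per_second):
    """Rough uncompressed sample rate: pixels per picture times pictures per second."""
    return width * lines_per_picture * pictures_per_second

p30 = raw_samples_per_second(1920, 1080, 30)       # 1080p30: ~62 million samples/s
i60 = raw_samples_per_second(1920, 1080 // 2, 60)  # 1080i60: 60 half-height fields per second
p60 = raw_samples_per_second(1920, 1080, 60)       # 1080p60: twice the data

print(p30, i60, p60)   # 62208000 62208000 124416000
print(i60 == p30)      # True: 60 fields/s costs the same raw bandwidth as 30 frames/s
```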
April 3rd, 2008, 05:26 PM | #4 |
Major Player
Join Date: Jan 2007
Location: Portland, OR
Posts: 949
|
April 3rd, 2008, 05:36 PM | #5 |
Inner Circle
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
|
Quote:
- HDTV standards were established several years back. At that time, the CRT was king and flat panel monitors weren't really good enough for critical evaluation, so systems were evaluated on CRT monitors (which obviously are fine with interlace). It was in 1990 that competing HD formats were announced: http://en.wikipedia.org/wiki/Grand_Alliance_(HDTV) And before that, people were even thinking about doing analog HD.

- At that time, the choice was mostly between 720p (60 fps) and 1080i60 (29.97 frames per second). That is about the bandwidth that will easily fit over a single coaxial cable; anything with higher bandwidth, like 1080p60, won't. Presumably the cameras and recording formats of the time would also have had problems with the higher bandwidth.

- Between 1080i and 720p, 1080i is capable of slightly higher spatial resolution. Because of interlacing, the camera has to filter/blur the image vertically a little to avoid interlacing nasties, and the deinterlacer in the TV, depending on its quality, can also reduce resolution. (Right now most TVs will discard a field and interpolate, which loses half the vertical resolution - a crude approach sketched below.) The engineers in favour of 1080i argue that deinterlacers will get better in the future. Horizontally, most broadcasters transmit 1080i on a 1440x1080 raster.

The argument for 720p is that:

- It avoids interlacing artifacts. This is important since most HDTVs will be flat panels and *need* to deinterlace the image. Yves Faroudja, a designer of high-end deinterlacers, has said that "deinterlacing does not work!"

- Interlacing is not a very efficient way of compressing imagery. Instead of (filtering and then) discarding half the data, it is better to use compression techniques like DCT and motion compensation; 720p uses bandwidth much more efficiently. One EBU test also shows that 1080p50 looks better than 1080i50. (I believe their tests also show that 720p only looks better at low bitrates.)
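For illustration, here is a minimal sketch of that "discard a field and interpolate" style of deinterlacer in Python/NumPy (my own illustration, not code from any actual TV):

```python
import numpy as np

def bob_deinterlace(interlaced_frame, keep="top"):
    """Keep one field of a woven 1080-line frame and interpolate the missing lines.

    This is the crude 'discard a field and interpolate' approach described above:
    the vertical detail that lived only in the discarded field is lost.
    """
    start = 0 if keep == "top" else 1
    field = interlaced_frame[start::2, :].astype(np.float32)   # 540 of the 1080 lines
    out = np.repeat(field, 2, axis=0)                          # stretch back to 1080 lines
    out[1:-1:2, :] = 0.5 * (field[:-1, :] + field[1:, :])      # fill gaps by averaging neighbours
    return out

# Toy usage: a fake 1080x1920 luma frame.
frame = np.random.randint(0, 256, size=(1080, 1920)).astype(np.float32)
print(bob_deinterlace(frame).shape)   # (1080, 1920)
```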
April 3rd, 2008, 05:51 PM | #6 |
Inner Circle
Nope.................
720p/1080i/1080p sets can have almost any native screen resolution they like, as long as they physically have a minimum of 720 horizontal lines (see "weasel worded" below).

The most common "weasel worded" "HD READY" screens (here at least) are indeed 1366 x 768 or thereabouts. The only thing those screens can display natively is 720p; there is no 720i as far as I'm aware. 1080p (if available), 1080i, 576p, 576i, 480p and 480i all have to be scaled to play on these screens, with varying degrees of success.

To play 1080p/1080i natively, your screen must physically have at least 1080 lines of 1920 pixels each. More is available, with 1920 x 1200 quite popular for computer monitors. These are the only true (Full) HD screens.

CS
April 3rd, 2008, 11:41 PM | #7 |
Inner Circle
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
|
If I recall correctly, interlaced video came into being because back in the 1930s it was cost-prohibitive to produce a cathode ray tube that could "paint" all 480 lines of video fast enough to avoid a significant lag between the first and last lines being painted (similar to today's difficulties with rolling shutters on CMOS sensors). Since it wasn't practical to paint all 480 lines at once every 1/30th of a second, the solution that was settled on was to paint every other line and then fill in the gaps a 60th of a second later. I believe the increased temporal resolution was actually an unintended benefit.
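The scheme in miniature, as a toy sketch (my own illustration, using a 480-line frame as in the post):

```python
import numpy as np

def split_into_fields(frame):
    """Paint every other line first, then the remaining lines one field period later."""
    first_pass = frame[0::2, :]    # even-numbered lines, drawn in the first 1/60 s
    second_pass = frame[1::2, :]   # odd-numbered lines, drawn in the next 1/60 s
    return first_pass, second_pass

frame = np.arange(480 * 640).reshape(480, 640)   # toy 480-line picture
top, bottom = split_into_fields(frame)
print(top.shape, bottom.shape)                   # (240, 640) (240, 640)
```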
April 4th, 2008, 08:56 AM | #8 |
Major Player
Join Date: Apr 2006
Location: Boston, Massachusetts
Posts: 616
|
Thanks, I guess I'll switch the Blu-ray player to 720p instead of 1080i.
The more I think about HDV, the more I like it. I have a Sony FX1 and people constantly bash it because it is compressed and 1080i (or at least stretched to 1080i). The only thing I don't like is that I hear it compresses again when I capture it into Final Cut. I'm going to try to learn how to avoid that extra compression.

Last edited by Aric Mannion; April 4th, 2008 at 08:58 AM. Reason: Oh, and the reason I do like it is I can shoot HD by cutting all these corners.
April 4th, 2008, 09:23 AM | #9 |
Wrangler
Quote:
If you ever want to see the persistence of phosphor, shut off all the room lights for a while so that your irises open up. Turn on a TV for a few moments, then turn it off. The screen will seem to glow for a while.

As to the 1366x768 statement, it is true that low-cost, budget LCD TVs have this limitation. Take a look at other LCDs of the same size with a much higher price tag and you'll see that they are full 1920x1080 panels. I had to point this out to my elderly parents when they wanted to buy an HDTV this past Christmas.

-gb-
April 4th, 2008, 01:52 PM | #10 |
Inner Circle
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
|
Quote:
One clear advantage of 60 fields/second is that flicker is pretty low, whereas 60 full frames/second would take up too much bandwidth. I think interlacing was a very good compression/bandwidth-reducing trick for analog, but now that we have digital compression techniques, it's a very inefficient way of reducing bandwidth.
April 4th, 2008, 02:28 PM | #11 |
Inner Circle
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
|
Suffice it to say, 60i was an elegant solution to the technical challenges of the 1930s. Unfortunately, that legacy sort of haunts us nowadays.
April 4th, 2008, 02:57 PM | #12 |
Major Player
Join Date: Oct 2005
Location: Hillsborough, NC, USA
Posts: 968
|
You have to take something else into account, namely the original imaging of the video. Even though compression schemes exist to manage the bandwidth of broadcast material, and the processing power of the latest generation of multiprocessor computers can (given enough money) handle encoding true 1080p60 in real time, affordable consumer camcorders cannot encode the analog information coming from the sensor(s). To do so, the electronics would have to sample and process the analog information at twice the rate. That significantly increases the cost - doubling performance in a single quantum leap ain't cheap.
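Rough numbers behind the "twice the rate" point (the 2200 x 1125 total-raster figures come from the HD broadcast standards; the simplification to luma only, ignoring chroma and bit depth, is my own):

```python
# Luma sample rates for the full 1080-line broadcast raster (2200 samples x 1125 lines).
def luma_sample_rate_hz(total_samples_per_line, total_lines, frames_per_second):
    return total_samples_per_line * total_lines * frames_per_second

rate_1080i30 = luma_sample_rate_hz(2200, 1125, 30)   # ~74.25 MHz (1080i60, i.e. 30 frames/s)
rate_1080p60 = luma_sample_rate_hz(2200, 1125, 60)   # ~148.5 MHz

print(rate_1080p60 / rate_1080i30)   # 2.0 -- the camera front end must run twice as fast
```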
April 4th, 2008, 04:27 PM | #13 |
Regular Crew
Join Date: May 2007
Location: Penang, Malaysia
Posts: 123
|
A little history
Interlacing was invented by an RCA engineer named Randall C. Ballard during the 1930s to improve picture quality without increasing the bandwidth of the signal. It's interesting to note that at the time all video was 16 fps, so his interlacing scheme transmitted half the information at 32 fps, nowhere near the 60 fps used today... not sure when that got picked up. As mentioned previously, all this was to work around limitations, and not necessarily circuitry, as a large portion of the interlacing was actually done mechanically.
For a little light reading, here's the original patent, issued in 1939: http://patft.uspto.gov/netacgi/nph-P...mber=2,152,234

Mark
April 4th, 2008, 04:40 PM | #14 |
Inner Circle
Join Date: Jan 2006
Location: Minnesota (USA)
Posts: 2,171
|
Quote:
April 4th, 2008, 09:36 PM | #15 |
Major Player
Join Date: Apr 2002
Location: Lewisburg PA
Posts: 752
|