Why does 1080i exist?
DV Info Net > The View: Video Display Hardware and Software
#1 | April 3rd, 2008, 03:36 PM | Aric Mannion (Boston, Massachusetts)
Why does 1080i exist?

I've been told that 720p/1080i TVs have a native resolution of 1366x768. So it seems pointless to view something at 1080i, because the TV has to downscale it to the panel resolution and de-interlace it as well. If LCD TVs have to de-interlace, then why are we still making interlaced video? Is 720p better than 1080i on a monitor with a native resolution of 1366x768? (I have the option to play Blu-ray in either 720p or 1080i, which is why I ask.)
#2 | April 3rd, 2008, 04:52 PM | Daniel Browning (Portland, OR)
Quote:
Originally Posted by Aric Mannion
I've been told that 720p/1080i TVs have a native resolution of 1366x768. So it seems pointless to view something at 1080i, because the TV has to downscale it to the panel resolution and de-interlace it as well.
Correct.

Quote:
Originally Posted by Aric Mannion
If LCD TVs have to de-interlace, then why are we still making interlaced video?
I'm not making interlaced video.

Other people do because of the limitations of their cameras (e.g., most lack 60p), their skill (they can't hold the camera steady enough for 30p), their subject matter (you can't always get it right the first time in sports), or their delivery requirements.

Quote:
Originally Posted by Aric Mannion
Is 720p better than 1080i on a monitor with a native resolution of 1366x768? (I have the option to play Blu-ray in either 720p or 1080i, which is why I ask.)
Generally, yes, although the real answer is whichever path has the better deinterlacing and resizing algorithms.
#3 | April 3rd, 2008, 05:17 PM | John Miller (Hillsborough, NC, USA)
30 frames per second looks jerky; 60 frames per second doesn't. But 60 full frames per second requires a lot of bandwidth, while 60 fields per second requires only as much as 30 frames per second. At a given bandwidth, progressive vs. interlaced means jerky vs. smooth, and that trade-off is why interlacing was originally invented in the early 20th century. Once 1080p60 camcorders become affordable to the average consumer, interlacing will eventually disappear.

The same arguments apply to 25 vs. 50.
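
To see why 60 fields cost the same as 30 frames, here's a back-of-the-envelope sketch (my own illustration in Python, assuming uncompressed 8-bit 4:2:2 video, i.e. 2 bytes per pixel on average):

    # Uncompressed video bandwidth, assuming 8-bit 4:2:2 (2 bytes/pixel).
    def bandwidth_mbps(width, height, rate, interlaced=False):
        lines = height // 2 if interlaced else height  # a field carries half the lines
        return width * lines * 2 * rate * 8 / 1e6      # megabits per second

    print(bandwidth_mbps(1920, 1080, 30))                   # 1080p30: ~995 Mbps
    print(bandwidth_mbps(1920, 1080, 60, interlaced=True))  # 1080i60: ~995 Mbps, the same
    print(bandwidth_mbps(1920, 1080, 60))                   # 1080p60: ~1990 Mbps, double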

"720p/1080i TVs = 1366 x 768" isn't true across the board. Native 1080p TVs (1920 x 1080) can also display 720p and 1080i.
#4 | April 3rd, 2008, 05:26 PM | Daniel Browning (Portland, OR)
Quote:
Originally Posted by John Miller
30 frames per second looks jerky; 60 frames per second doesn't.
For some subjects, you're right: amateur home video, unscripted sports events, pans that don't follow a subject, and so on.
#5 | April 3rd, 2008, 05:36 PM | Glenn Chan (Toronto, Canada)
Quote:
I've been told that 720p/1080i TVs have a native resolution of 1366x768.
Some of them do... but other TVs have 1920x1080 panels. In the future I'd expect more and more TVs to be capable of full HD resolution (though possibly cropping some pixels at the edges for overscan).

Quote:
If LCD TVs have to de-interlace, then why are we still making interlaced video?
Possible answers:
- HDTV standards were established several years back. At that time the CRT was king, and flat-panel monitors weren't really good enough for critical evaluation; systems were evaluated on CRT monitors (which obviously are fine with interlace). The competing HD formats were announced in 1990:
http://en.wikipedia.org/wiki/Grand_Alliance_(HDTV)
And before that, people were even thinking about doing analog HD.

- At that time, the choice was mostly between 720p (60 fps) and 1080i60 (29.97 frames per second). Either fits comfortably over a single coaxial cable; something higher-bandwidth like 1080p60 would not. Presumably the cameras and recording formats of the day would also have had problems with the higher bandwidth.

- Between 1080i and 720p, 1080i is capable of slightly higher spatial resolution. Because of interlacing, the camera has to filter/blur the image vertically a little to avoid interlace artifacts, and the deinterlacer in the TV, depending on its quality, can also reduce resolution. (Right now most TVs simply discard a field and interpolate, which loses half the vertical resolution; see the sketch just after this list.) The engineers in favour of 1080i argue that deinterlacers will get better in the future. Horizontally, most broadcasters transmit 1080i on a 1440x1080 raster.
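
To make "discard a field and interpolate" concrete, here is a minimal bob-deinterlacing sketch (my own illustration in Python with NumPy, not something from this thread). It keeps the top field and rebuilds the discarded lines by averaging their neighbours, which is exactly where half the vertical resolution goes:

    import numpy as np

    def bob_deinterlace(frame):
        """Keep the top field of an interlaced frame (even-height H x W,
        8-bit) and interpolate the discarded bottom-field lines."""
        field = frame[0::2]                  # only half the lines survive
        out = np.empty_like(frame)
        out[0::2] = field                    # kept lines pass through unchanged
        out[1:-1:2] = (field[:-1].astype(np.uint16) + field[1:]) // 2  # average neighbours
        out[-1] = field[-1]                  # bottom edge: repeat the last kept line
        return out

The output for a 1920x1080 frame still has 1080 lines, but only 540 of them carry real picture information; better deinterlacers use motion-adaptive tricks to claw some of that back.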

The arguments for 720p are that:
- It avoids interlacing artifacts. This is important since most HDTVs are flat panels and *need* to deinterlace the image. Yves Faroudja, a designer of high-end deinterlacers, has said that "deinterlacing does not work!"
- Interlacing is not a very efficient way of compressing imagery. Instead of (filtering and then) discarding half the data, it is better to rely on techniques like DCT and motion compensation; 720p uses its bandwidth much more efficiently. One EBU test also shows that 1080p50 looks better than 1080i50. (I believe their tests also show that 720p only looks better at low bitrates.)
#6 | April 3rd, 2008, 05:51 PM | Chris Soucy (Fairfield, Dunedin, New Zealand)
Nope.

720p/1080i/1080p sets can have almost any native screen resolution they like, as long as they physically have a minimum of 720 lines (see "weasel worded" below).

The most common "weasel worded" "HD READY" screens (here, at least) are indeed 1366 x 768 or thereabouts.

The only format those screens can display more or less natively is 720p (there is no 720i as far as I'm aware, and even 1280 x 720 gets scaled slightly to fill 1366 x 768). 1080p (if available), 1080i, 576p, 576i, 480p and 480i all have to be scaled to play on these screens, with varying degrees of success.

To play 1080p/1080i natively, your screen must physically have at least 1080 lines of 1920 pixels each. More is available, with 1920 x 1200 quite popular for computer monitors. (These are the only true "Full HD" screens.)
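
To put numbers on how much rescaling a 1366 x 768 panel actually does, here's a small sketch (my own illustration; the input list is just an example):

    # Scale factors a 1366x768 panel must apply to common inputs (illustrative).
    panel_w, panel_h = 1366, 768
    formats = {"480i/480p": (720, 480), "720p": (1280, 720), "1080i/1080p": (1920, 1080)}
    for name, (w, h) in formats.items():
        print(f"{name}: {panel_w / w:.2f}x horizontal, {panel_h / h:.2f}x vertical")

720p comes closest, but even it needs roughly a 1.07x upscale, and 1080 material has to be scaled down to about 0.71x (plus deinterlacing, for 1080i).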


CS
#7 | April 3rd, 2008, 11:41 PM | Robert M Wright (Minnesota, USA)
If I recall correctly, interlaced video came into being because, back in the 1930s, it was cost-prohibitive to produce a cathode ray tube that could "paint" all 480 lines of video fast enough to avoid a significant lag between the first and last lines being painted (similar to today's difficulties with rolling shutters on CMOS sensors). Since it wasn't practical to paint all 480 lines every 1/30th of a second, the solution was to paint every other line and then fill in the gaps a 60th of a second later. I believe the increased temporal resolution was actually an unintended benefit.
#8 | April 4th, 2008, 08:56 AM | Aric Mannion (Boston, Massachusetts)
Thanks, I guess I'll switch the Blu-ray player to 720p instead of 1080i.
The more I think about HDV, the more I like it. I have a Sony FX1, and people constantly bash it because it is compressed and 1080i (or at least stretched to 1080i).
The only thing I don't like is that I hear it gets compressed again when I capture it into Final Cut. I'm going to try to learn how to avoid that extra compression.

Last edited by Aric Mannion; April 4th, 2008 at 08:58 AM. Reason: Oh, and the reason I do like it is I can shoot HD by cutting all these corners.
#9 | April 4th, 2008, 09:23 AM | Greg Boston (DFW area, TX)
Quote:
Originally Posted by Robert M Wright
If I recall correctly, interlaced video came into being because, back in the 1930s, it was cost-prohibitive to produce a cathode ray tube that could "paint" all 480 lines of video fast enough...
Well, you are close. It wasn't cost-prohibitive in the normal sense; they simply didn't have circuit designs that could achieve much faster than 30 fps. The problem was that in the brighter environments where TV is viewed, 30p does have strobing issues. 24p is meant to be watched in a darkened environment so that our persistence of vision fills in the space between frames. So they decided on 60 interlaced fields per second to keep the motion sampling rate high enough for smooth motion. In the CRT world this isn't such a bad thing, because the persistence of the phosphor keeps a residual of one field on screen while the next one is being painted on the adjacent lines. That helps your eyes perceive a full frame being displayed at once.

If you ever want to see the persistence of phosphor, shut off all the room lights for a while so that your irises open up, turn on a TV for a few moments, then turn it off. The screen will seem to glow for a while.

As to the 1366x768 statement, it is true that low-cost, budget LCD TVs have this limitation. Look at other LCDs of the same size with a much higher price tag and you'll see they are full 1920x1080 panels. I had to point this out to my elderly parents when they wanted to buy an HDTV this past Christmas.

-gb-
#10 | April 4th, 2008, 01:52 PM | Glenn Chan (Toronto, Canada)
Quote:
So they decided on 60 interlaced fields per second to keep the motion sampling rate high enough for smooth motion.
I don't buy that explanation. To my eyes, 60 interlaced fields per second looks weird; the motion doesn't look right.

One clear advantage of 60 fields per second is that flicker is pretty low, whereas 60 full frames per second would take up too much bandwidth. I think interlacing was a very good compression/bandwidth-reducing trick for analog, but now that we have digital compression techniques, it's a very inefficient way of reducing bandwidth.
#11 | April 4th, 2008, 02:28 PM | Robert M Wright (Minnesota, USA)
Suffice it to say, 60i was an elegant solution to the technical challenges of the 1930s. Unfortunately, that legacy sort of haunts us nowadays.
#12 | April 4th, 2008, 02:57 PM | John Miller (Hillsborough, NC, USA)
You also have to take into account the original imaging of the video. Even though compression schemes exist to manage the bandwidth of broadcast material, and the processing power of the latest generation of multiprocessor computers can (given enough money) handle encoding true 1080p60 in real time, affordable consumer camcorders cannot encode the analog information coming from the sensor(s). To do so, the electronics would have to sample and process the analog signal at twice the rate. That significantly increases the cost; doubling performance in a single quantum leap ain't cheap.
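
To get a rough feel for the "twice the rate" point, here's a quick sketch (my own numbers; active pixels only, blanking intervals ignored):

    # Approximate active-pixel sample rates (blanking ignored; illustrative).
    def sample_rate_mhz(width, lines, images_per_sec):
        return width * lines * images_per_sec / 1e6

    print(sample_rate_mhz(1920, 540, 60))   # 1080i60: ~62 M samples/s (fields)
    print(sample_rate_mhz(1920, 1080, 60))  # 1080p60: ~124 M samples/s, double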
#13 | April 4th, 2008, 04:27 PM | Mark Keck (Penang, Malaysia)
A little history

Interlacing was invented by an RCA engineer named Randall C. Ballard in the 1930s to improve picture quality without increasing the bandwidth of the signal. It's interesting to note that at the time all video was 16 fps, so his interlacing scheme transmitted half the information at 32 fps, nowhere near the 60 fields per second used today (I'm not sure when that got picked up). As mentioned previously, all this was to work around limitations, though not necessarily circuitry limitations, since a large portion of the scanning at the time was actually done mechanically.

For a little light reading, here's the original patent, issued in 1939.

http://patft.uspto.gov/netacgi/nph-P...mber=2,152,234

Mark
#14 | April 4th, 2008, 04:40 PM | Robert M Wright (Minnesota, USA)
Quote:
Originally Posted by John Miller
You also have to take into account the original imaging of the video... the electronics would have to sample and process the analog signal at twice the rate.
Actually, I believe the biggest bottleneck is getting in-camera compression efficient enough to record 1080p60 inexpensively. Unless I'm mistaken, many cameras (like the HVX200) do the raw A/D conversion at 1080p60 before converting to whatever format gets recorded. MPEG-2 isn't really well suited to compressing 1080p60: there's no problem making MPEG-2 fast enough, but the bitrate would be unwieldy for currently inexpensive recording media. AVC is coming along nicely, though, and should make it practical to deliver slightly better than HDV-like quality at HDV-like bitrates (25 Mbps VBR for 1080p60 on cheap flash memory, like Class 6 SDHC) in the not-so-distant future. If a major camera manufacturer made a firm commitment to wavelet (rather than DCT-like) lossy compression for acquisition, they could get there in pretty short order (assuming proper management of the software development; a piece of cake if it's done right).
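
For a sense of what 25 Mbps means on cheap flash media, a quick sketch (my own arithmetic; the 16 GB card size is an example, not from the post):

    # Recording time at a given bitrate on a given card (illustrative).
    def record_minutes(card_gb, bitrate_mbps):
        return card_gb * 8 * 1000 / bitrate_mbps / 60  # GB -> gigabits -> minutes

    print(record_minutes(16, 25))  # ~85 minutes of 25 Mbps video on a 16 GB card

A Class 6 SDHC card guarantees at least 6 MB/s (48 Mbps) sustained writes, comfortably above a 25 Mbps stream.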
#15 | April 4th, 2008, 09:36 PM | Peter Wiley (Lewisburg, PA)
http://hd1080i.blogspot.com/