progressive vs interlaced


Laurence Bannister
May 23rd, 2008, 10:16 PM
Hi all, just a really quick question (hopefully not too dumb of a question). Does one get more usable information out of 50i than 25p? I feel I can convert 50i to 25p quite well using Donald Graft's Smart Deinterlace filter, so my question is not so much about the intended final media. My question is: if I can convert 50i to 25p (ignoring the quality of the conversion), why would I not just shoot in 50i to get the maximum amount of information and then just "discard" as required?
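For anyone curious what a basic deinterlace actually does, here is a rough Python/numpy sketch of the simplest approach - keep one field and interpolate the lines it is missing. This is only an illustration, not what Smart Deinterlace does (that filter is motion adaptive and far more sophisticated); the function name and array sizes are made up for the example.

```python
import numpy as np

def keep_top_field(frame: np.ndarray) -> np.ndarray:
    """Turn one interlaced frame (two fields woven together) into one
    progressive frame: keep the top-field lines (even rows) and rebuild
    the bottom-field lines by averaging the lines above and below."""
    out = frame.astype(np.float32)
    h = frame.shape[0]
    out[1:h - 1:2] = (out[0:h - 2:2] + out[2:h:2]) / 2  # interpolate the odd rows
    out[h - 1] = out[h - 2]                             # bottom row has no line below it
    return out.astype(frame.dtype)

# 50i and 25p share the same frame count (25 per second); the conversion
# only decides what happens to the second field inside each frame.
interlaced = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # stand-in for one 50i frame
progressive = keep_top_field(interlaced)
```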

John Bosco Jr.
May 23rd, 2008, 10:33 PM
Hi all, just a really quick question (hopefully not too dumb of a question). Does one get more usable information out of 50i than 25p? I feel I can convert 50i to 25p quite well using Donald Graft's Smart Deinterlace filter, so my question is not so much about the intended final media. My question is: if I can convert 50i to 25p (ignoring the quality of the conversion), why would I not just shoot in 50i to get the maximum amount of information and then just "discard" as required?

No questions are dumb. You don't get any more or any less information out of 50i than 25p. So there's no need to convert 50i to 25p if you can shoot 25p.

Laurence Bannister
May 23rd, 2008, 10:37 PM
So I can then shoot 50i and, assuming I can deinterlace nicely, once I've deinterlaced it will be the same as 25p?

(btw: I'm shooting with a Canon XH-A1)

Glenn Chan
May 23rd, 2008, 11:54 PM
I doubt it. 50i does not have the same information as a 25p signal (and vice versa).

The easy way to answer your question would be to print out a test pattern (I suggest a zone plate) and shoot it handheld in both modes with some motion. You'd also need to take into account other camera quirks, like the quality of the progressive shooting mode (on some cameras, it's crap).
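If you want to roll your own zone plate to print, a rough Python sketch like this will generate one (the size and the k constant are arbitrary choices for the example, and Pillow is only used to save the image):

```python
import numpy as np
from PIL import Image

def zone_plate(size: int = 2048, k: float = 2.0) -> np.ndarray:
    """Classic zone plate: intensity follows cos(k * r^2 / size), so the
    spatial frequency of the rings rises steadily from the centre out
    towards the corners."""
    y, x = np.mgrid[-size // 2:size // 2, -size // 2:size // 2]
    r2 = (x.astype(np.float64) ** 2 + y.astype(np.float64) ** 2) / size
    pattern = 0.5 + 0.5 * np.cos(k * r2)
    return (pattern * 255).astype(np.uint8)

Image.fromarray(zone_plate()).save("zone_plate.png")  # print this and shoot it in both modes
```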

In interlaced mode, you might notice that the vertical resolution is reduced (this is intentional; on some cameras you can turn this off which lowers sensitivity and increases interlace-related artifacts).

David Heath
May 26th, 2008, 05:02 PM
Does one get more usable information out of 50i than 25p?
Better to say that each gives different information - i gives better motion rendition, p gives better resolution. One throws one set of things away, the other a different set.

But starting off with 1080i/25 and going to 1080p/25 likely means you will throw both away! If you ultimately want 1080p/25, much better to shoot it that way from the start.

Aaron Courtney
May 27th, 2008, 10:26 AM
David, speaking strictly of the interlace format vs. the progressive format, interlace does not give better motion rendition than shooting progressively when the result is displayed on today's progressive displays. Even though you are capturing 50/60 images per sec vs. 25/30, a progressive display must de-interlace those fields by recombining them into 25/30 frames. So you're left with exactly the same number of real frames whether you shoot 50i/60i or 25p/30p.

Now this does not mean that video processor manufacturers cannot implement their own proprietary de-interlacing algorithms that attempt to create 50/60 frames/sec from 25/30 frames/sec derived from 50/60 fields/sec. But this is a function of the de-interlacing technology, not an attribute of the interlace format.
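To make the 'recombining' step concrete, here is a rough Python/numpy sketch of a plain weave - purely illustrative, with made-up names and array sizes:

```python
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """'Weave' de-interlacing: interleave two 1920x540 fields back into a
    single 1920x1080 frame. Two fields in, one frame out, so 50/60 fields
    per second becomes 25/30 frames per second."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

# 60 fields/sec -> 30 frames/sec: the frame count halves, nothing new is invented.
fields = [np.random.randint(0, 256, (540, 1920), dtype=np.uint8) for _ in range(60)]
frames = [weave(fields[i], fields[i + 1]) for i in range(0, 60, 2)]
assert len(frames) == 30
```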

Check out the exhaustive frame rate thread in the AVCHD forum for more info...

Graham Hickling
May 28th, 2008, 06:31 AM
I would assume "today's" flat panel displays are simply an interim step towards displays capable of displaying genuine 50P/60P. Or is there some reason that's not so?

Tim Polster
May 28th, 2008, 08:16 AM
I might be wrong, but I am pretty sure today's TVs will display 50/60p without a hiccup.

I think the previous post was talking about 50/60 fields from interlaced as opposed to 25/30 frames.

Ron Evans
May 28th, 2008, 08:44 AM
Quite a few of the latest LCD displays can display at 120P or higher, interpolating the extra frames to improve the smoothness of the video image and allow true 24P display without pulldown.

Ron Evans

Thomas Smet
May 28th, 2008, 08:52 AM
David, speaking strictly of the interlace format vs. the progressive format, interlace does not give better motion rendition than shooting progressively when the result is displayed on today's progressive displays. Even though you are capturing 50/60 images per sec vs. 25/30, a progressive display must de-interlace those fields by recombining them into 25/30 frames. So you're left with exactly the same number of real frames whether you shoot 50i/60i or 25p/30p.

Now this does not mean that video processor manufacturers cannot implement their own proprietary de-interlacing algorithms that attempt to create 50/60 frames/sec from 25/30 frames/sec derived from 50/60 fields/sec. But this is a function of the de-interlacing technology, not an attribute of the interlace format.

Check out the exhaustive frame rate thread in the AVCHD forum for more info...

This is not true at all. An HDTV or other progressive-display TV will bob, or use an even better method, turning 60i into 60p by displaying one field at a time. So shooting 60i will have much better motion than 30p, no doubt. Of course, this means each 1/60th of a second only shows one field's worth of pixels, but it moves so fast that it creates the illusion of full-pixel images. Interlaced can work well as long as the pixels are small enough. SD interlaced can fall apart pretty quickly, since it is easier to see each line. Some really nice HDTVs do an even better job than just bobbing the fields: they start with a bob but then fill in the missing pixels between each line by doing some very funky math.
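Roughly, a basic bob looks like this - just an illustrative Python/numpy sketch with made-up sizes; as noted above, better processors interpolate or motion-adapt instead of simply repeating lines:

```python
import numpy as np

def bob(field: np.ndarray) -> np.ndarray:
    """Basic 'bob': line-double a single 540-line field into a full
    1080-line picture, so every field becomes its own displayed frame."""
    return np.repeat(field, 2, axis=0)

def split_fields(interlaced_frame: np.ndarray):
    """One woven 1080i frame -> (top field, bottom field), each 540 lines."""
    return interlaced_frame[0::2], interlaced_frame[1::2]

# One 60i frame carries two different moments in time; bobbing shows both,
# which is why 60i motion looks smoother than 30p on a progressive set.
frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)
top, bottom = split_fields(frame)
display_sequence = [bob(top), bob(bottom)]  # two full-height pictures per 1/30 sec
```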

Aaron Courtney
May 28th, 2008, 12:43 PM
This is not true at all. An HDTV or other progressive-display TV will bob, or use an even better method, turning 60i into 60p by displaying one field at a time.

Thomas, I suggest you do some more research on this topic. Here's a great place to start:

http://www.hqv.com/technology/index1/deinterlacing.cfm?CFID=15270681&CFTOKEN=17127493

Successful de-interlacing does not mean displaying a 1920x540 frame on a 1920x1080 progressive display. Every piece of commentary that I have read on the subject of de-interlacing indicates that this is the worst-case end result, where the author implies that the de-interlacing video processor has "failed" to properly de-interlace the video stream.

Everything that I said in my post can be corroborated by technical discussions provided by HQV, Anchor Bay Technologies, or any of the other video processing chipset manufacturers, including the following:

Now this does not mean that video processor manufacturers cannot implement their own proprietary de-interlacing algorithms that attempt to create 50/60 frames/sec from 25/30 frames/sec derived from 50/60 fields/sec. But this is a function of the de-interlacing technology, not an attribute of the interlace format.

Aaron Courtney
May 28th, 2008, 12:54 PM
I would assume "today's" flat panel displays are simply an interim step towards displays capable of displaying genuine 50P/60P. Or is there some reason that's not so?

Actually, all progressive displays must operate at a specific refresh rate stated in cycles per second (Hz). 60 Hz is most common in the U.S., while 50 Hz (I assume) is the most common in PAL countries. There's your 60P & 50P right now - no waiting required. Several manufacturers have also implemented multiples of 24 fps (e.g., 48 Hz, 72 Hz, etc.) to frame-double, triple, etc. 24 fps material in order to eliminate 3:2 pulldown-derived frames.
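A quick back-of-the-envelope illustration of the cadence point (plain Python, nothing more than arithmetic):

```python
# How 24 fps film maps onto different refresh rates: at an exact multiple
# of 24 every frame is repeated the same number of times, while at 60 Hz
# the 3:2 cadence shows frames unevenly, which is where pulldown judder
# comes from.
for refresh_hz in (48, 72, 120, 60):
    repeats = refresh_hz / 24
    if repeats.is_integer():
        print(f"{refresh_hz} Hz: every film frame shown {int(repeats)}x (even cadence)")
    else:
        print(f"{refresh_hz} Hz: uneven cadence, e.g. 3:2 pulldown (3x, 2x, 3x, 2x, ...)")
```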

This discussion has more to do with how we are feeding those 50/60 progressive frames per sec to the display than anything else. As an aside, Graham, I'm going to go ahead and give your procedure of down-rez'ing 1920x540 fields to 720/60P a try and see what the results are. I imagine the results will compare very favorably to 1080/60I, particularly with these inexpensive consumer cams.

Graham Hickling
May 28th, 2008, 01:28 PM
Well, my one contribution to this discussion is that I presently have a 42" rear projection TV being fed i30, p30 and/or p60 signals via the component output of a Tixv M4000u. (The p30 and p60 footage is produced by software deinterlacing of 1080i footage, using avisynth and tdeint.)

With that setup, sitting close to the screen, the "sweet spot" is 720p60 for my own video footage from a Sony FX7. Motion is better defined than with p30, and it lacks the slightly fatiguing "jitteriness" present when the original 1080i30 footage is replayed. As an aside, it's critical that the footage be shot with a fixed 1/60 shutter speed - anything at higher shutter speeds looks like cr@p after the software deinterlace.
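Graham did this with avisynth and tdeint; purely to illustrate the idea, here is a rough Python sketch of the same field-to-720p step (numpy/scipy, made-up names, with bilinear scaling standing in for whatever the real resizer uses):

```python
import numpy as np
from scipy.ndimage import zoom

def field_to_720p(field: np.ndarray) -> np.ndarray:
    """Scale one 1920x540 field straight to a 1280x720 progressive frame
    (bilinear). Every field becomes a frame, so 1080i60 yields 720p60."""
    return zoom(field, (720 / field.shape[0], 1280 / field.shape[1]), order=1)

def i1080_to_p720(interlaced_frame: np.ndarray):
    top, bottom = interlaced_frame[0::2], interlaced_frame[1::2]  # two 1920x540 fields
    return field_to_720p(top), field_to_720p(bottom)              # two 1280x720 frames

frame = np.random.randint(0, 256, (1080, 1920)).astype(np.float32)  # stand-in for one woven 1080i frame
p720_a, p720_b = i1080_to_p720(frame)
assert p720_a.shape == (720, 1280)
```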

I certainly accept that the "sweet spot" may vary, depending on the display technology.

Ron Evans
May 28th, 2008, 01:56 PM
The signal chain to the TV can have a big effect on how the image looks. I shot some skiing, handheld with my Sony HC96 (standard DV), and made a DVD. Played back from the camera on my iART CRT, the image had smooth motion; the DVD played back from a Panasonic DVD player attached to the same iART was also OK but not quite as smooth. Played back from the camera to my Panasonic 1080p plasma, the image was not as smooth as on the CRT; played back from the DVD player attached to the plasma it was about the same; and played back from my PS3 attached via HDMI it was unwatchable. The PS3 was presumably upscaling to 1080p30 and not doing a very good job of it either. Now to AVCHD from my SR11, shot at 1920x1080i: playback from the camera over HDMI is smooth, and an AVCHD DVD made with Sony Motion Browser software and played back from the PS3 is also smooth. The latest Sony product book makes comments about receivers etc. passing 1080P60, so I am wondering which products are actually outputting 1080P60. It is not part of the Blu-ray spec but is potentially possible with MPEG4.
Aaron, my only comment on your info is that I think the deinterlacers that go to 60 likely do not first deinterlace to 30 and then to 60, but in fact use the interlaced fields to create the higher frame rates directly, using multiple fields for motion vectoring and interpolation information.

Ron Evans

Aaron Courtney
May 28th, 2008, 03:18 PM
Ron, it's clear that for progressive displays operating at 60 Hz, the last-in-chain video processor MUST output 60 progressive frames/sec. How the manufacturer implements its technology falls under the proprietary verbiage that I initially mentioned.

My point was simply that the interlace format has been bastardized (necessarily, in order for it to be compatible with today's progressive displays) by these manufacturers and manipulated in a way it was never originally intended to be. Therefore, it's not really accurate to say that motion rendition is inherently better with interlaced shooting vis-a-vis progressive shooting, even if that is the end result due to technical limitations of progressive acquisition (e.g., no 1080/60P option) vs. advancements in de-interlacing technology (e.g., motion adaptive, pixel adaptive, interpolation, etc.).
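For what it's worth, here is a toy Python/numpy sketch of the motion-adaptive idea - weave where a pixel has not moved, interpolate where it has. The threshold and the names are invented for the example; real chipsets from HQV, Anchor Bay and the like are far more elaborate than this:

```python
import numpy as np

def motion_adaptive_deinterlace(prev_top, top, bottom, threshold=12.0):
    """Toy per-pixel motion-adaptive de-interlace of one frame: where the
    top field barely changed since the previous frame, weave the two
    fields (full vertical detail); where it moved, rebuild the bottom-field
    lines by interpolating from the top field (no combing)."""
    h, w = top.shape
    woven = np.empty((h * 2, w), dtype=np.float32)
    woven[0::2] = top
    woven[1::2] = bottom

    interpolated = woven.copy()
    interpolated[1:-1:2] = (woven[0:-2:2] + woven[2::2]) / 2  # odd lines from neighbouring even lines

    moving = np.abs(top.astype(np.float32) - prev_top.astype(np.float32)) > threshold
    moving_lines = np.zeros((h * 2, w), dtype=bool)
    moving_lines[1::2] = moving  # only the bottom-field lines risk combing here
    return np.where(moving_lines, interpolated, woven)

h, w = 540, 1920
prev_top, top, bottom = (np.random.randint(0, 256, (h, w)).astype(np.float32) for _ in range(3))
frame_1080p = motion_adaptive_deinterlace(prev_top, top, bottom)
```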

I realize this is largely an academic discussion with very little real world practicality at this point due to issues raised in the other thread, so I'll refrain from further ranting, LOL.

Aaron Courtney
May 28th, 2008, 03:24 PM
As an aside, it's critical that the footage be shot with a fixed 1/60 shutter speed - anything at higher shutter speeds looks like cr@p after the software deinterlace.

Thanks again for that Graham. I promise, I'm going to give this a try. I have a feeling that for the foreseeable future, this process is going to yield the finest BD distributable product for playback on progressive displays (while avoiding the whole "how good is your de-interlacer" issue with 1080/60I video).