1080i or 720p, which looks better?


Will Lau
October 27th, 2004, 05:10 PM
So what is the consensus on this board? Is 1080i (interlaced, 540 lines visible at any one time) or 720p (720 lines visible at one time) the better-looking format? Besides the 3-CCD block on the new Sony, is having 1080i going to improve the image quality compared to the JVC?

Peter Moore
October 28th, 2004, 04:10 PM
This is a perpetual debate among HDTV fanatics. 1080i has more resolution, but only half the picture changes every 1/60th of a second; the entire picture changes every 1/30th of a second. Compare this to HDV 720/30p, where the entire picture also changes every 1/30th of a second, but the resolution is lower. Also keep in mind that with 1080i you effectively see all 1080 lines at once, due to the residual image the phosphors leave on your eye.

My opinion is that 720/60p versus 1080/60i is a wash, but 1080i is clearly superior to 720/30p so the Sony should be much better than the JVC.

Will Lau
October 28th, 2004, 06:42 PM
So for greater detail, 1080i should look sharper. But what about motion? Would the 1080i look more video-like than the 720 progressive image, and the 720p more film-like, if you can call it that?

Peter Moore
October 29th, 2004, 09:46 AM
Again you have to qualify whether you mean HDV 720/30p or HDTV 720/60p.

720/60p would have much smoother motion and be better for sports. 1080/60i would be much sharper when the image is still, but not quite as sharp for motion, though motion would still be pretty smooth. So take your pick there.

720/30p is basically the worst of those three standards. I tend to think 30p does not look film-like but merely looks cheap. Personal opinion. So you have neither the smooth motion of 60p nor the extra sharpness of 1080i.

Thus, Sony's 1080i is, in my opinion, objectively superior to JVC's 720/30p.

Bill Ravens
October 29th, 2004, 09:57 AM
It appears that while 1080i may be visually superior to 720p, playback of 1080i is problematic. Interlaced formats don't play back well on a progressive display like a computer monitor. In my own opinion, the interlace combing on a 1080i image is unacceptable and distracting.

Peter Moore
October 29th, 2004, 12:32 PM
Oh quite right. You always want your display to be native to the format you're watching.

Personally I've never been sure how 1080i is converted to 720/60p, though this is my guess:

You take 1080i field A, combine it with field B, apply a slight Gaussian blur, and construct frame 1. Then you take field B, combine it with field C, blur, and get frame 2. And so on:

Frame 1: A + B
Frame 2: B + C
Frame 3: C + D
Frame 4: D + E
Frame 5: E + F

etc.

That would give pretty smooth motion while maximizing resolution, though you still have to blur a little. Another way to do it would be to construct each 720p frame from each single 1080i field, scaling 540 lines up to 720. This would have VERY smooth motion but would not be very sharp.
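If it helps, here's that first guess written out as a rough sketch (purely illustrative; the function names, the Gaussian blur and the use of scipy are my own assumptions, not how any real converter actually works):

```python
# Sketch of the field-pair blend described above: each 720p output frame is
# built from two adjacent 1080i fields (A+B, B+C, ...), lightly blurred to
# hide interlace detail, then scaled from 1080 down to 720 lines.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def weave_and_blur(field_a, field_b, sigma=0.7):
    """Interleave two 540-line fields into a 1080-line frame, then blur slightly."""
    frame = np.empty((field_a.shape[0] * 2, field_a.shape[1]), dtype=np.float32)
    frame[0::2] = field_a   # which field is "upper" is a simplification here
    frame[1::2] = field_b
    return gaussian_filter(frame, sigma=sigma)

def fields_to_720p60(fields):
    """fields: successive 540-line luma arrays (A, B, C, ...) at 60 fields/s."""
    frames = []
    for a, b in zip(fields, fields[1:]):            # A+B, B+C, C+D, ...
        full = weave_and_blur(a, b)                 # 1080-line frame
        frames.append(zoom(full, (720 / 1080, 1)))  # scale 1080 lines down to 720
    return frames                                   # ~60 progressive frames per second
```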

Ken Hodson
October 29th, 2004, 02:15 PM
If we are comparing the HD10 and the FX1, that makes it easier. The Sony has smoother video. It is also interlaced, so it looks like video. The HD10 looks very film-like; a little faster than 24p film, but not by enough that you can whip those pans.
Resolution-wise they aren't much different. The FX1 has a 960x1080 CCD and uses green pixel shift (a la XL1) to map out 1440x1080, which is 960 (1440 with pixel shift) x 540 per field.
The HD10 is about 960x720 per frame.

Scott Anderson
October 29th, 2004, 05:02 PM
Ken, I have to disagree with the assertion that the resolutions of the HD10 and FX1 "aren't much different". Comparing one field of the FX1 to one frame of the HD10 is just not a fair comparison. As Peter said, quite rightly, 720p/30 is the worst of the three standards. By your logic, you could de-interlace every frame of FX1 footage and end up with 720p/60; I don't think that's correct either. The FX1 is designed to shoot interlaced only (as far as we know so far), and displayed that way it should look considerably better in all regards than the HD10.

Will, your question reminds me of a display I saw at NAB earlier this year. Windows Media HD compressed content was being shown in a very controlled room, with a high-end projector, using footage both from NBC (1080i/60) and ABC (720p/60). Both formats looked awesome, much better and crisper than how film is projected in most theaters. What surprised me was that even the ABC 720p/60 from Monday Night Football looked like video. Even a progressive format at 60 full frames per second seemed "live". Our brains are just hard-wired, from years of watching TV and movies, to perceive slower frame rates (24 or 30p) and longer shutter times as film-like, and higher frame rates as live TV, regardless of whether the frames are progressive or interlaced.

I personally believe the Sony is going to end up looking much better in all important ways than the JVC. Not only will it win on perceived quality, with 60 "samples" per second (even if they are interlaced) and higher overall resolution, but the option will be there to use film-look filters and frame-blending software to simulate progressive "looks", even if Sony doesn't offer true 24 or 30p. The JVC is seriously limited by always being stuck in a semi-auto mode, and by shooting in the least flexible and weakest of all the HD formats.

Ken Hodson
October 29th, 2004, 07:06 PM
We are talking resolution, not preferred frame rate, and in that regard they are both very similar, as you can see from the numbers I wrote. The Sony comes nowhere near the full 1920x1080.
As far as frame rate goes, I personally prefer progressive in every instance. Your example of 720p60 football looking like video is down to the high shutter speed they use. They are shooting a high-motion sport, so it makes sense; they want that ultra-"live" look, but that doesn't mean it is a deficiency of progressive. You could use a much lower shutter and apply motion blur in post and have it look very film-like.

Peter Moore
October 29th, 2004, 09:13 PM
No camera that I know of captures at 1920x1080. I think the CineAlta captures at 1440x1080, right?

Graeme Nattress
October 30th, 2004, 06:14 AM
HDCAM SR does the full 1920x1080, as does the Viper. Either way, HDCAM even at 1440x1080 has more resolution than can be broadcast anyway, and more resolution than most people's HDTVs, so why worry? Even 1440x1080 looks blinking good on a full-res 1920x1080 display.

The 960 CCD res doesn't pixel-shift to 1440, but to 1920 instead, which is then downsampled to 1440, I think. Pixel shift will give you real extra luma resolution at the expense of colour resolution. As HDV is a 4:2:0 format it's going to lose a lot of colour resolution along the way anyway, so it doesn't matter that it's losing some as part of the pixel shift.

The FX1 1080i footage I have converts to 24p in post quite nicely, which is something that the 720p30 footage does not do.

Graeme

Tom Roper
October 30th, 2004, 09:59 AM
The JVC does NOT record 720p30 images in 960x720.

The imaging sensor CCD is 1280x960, and the full width is used to record 16x9 images, 840,000 pixels, 1280x656.

There are 750 scan lines, and 700 horizontal TV lines of resolution.

TV lines of resolution are specified within a circle whose diameter equals the picture height, so the actual resolution for the full 16:9 image works out to roughly 1156x656.

Ken Hodson
October 30th, 2004, 01:45 PM
I'm talking about the actual captured pixel resolution. Of course this is then put onto the "recorded tape format".

Tom - It is widely accepted that the HD10 captures about 960x700 (actually it is probably closer to 960x625-ish), which is then recorded to the 720p standard (1280x720).
The FX1 captures 960x540 per field, which green pixel shift increases to a luma capture of about 1440x540 per field. With interlaced cameras the effective vertical resolution is reduced by about 25%, so it is really about 1440 (with pixel shift) x 400 per field, or roughly 1440x800 per frame (the arithmetic is written out below).
Sony F950 1920x1080
Panasonic Varicam 960x720
These are the actual captured pixel resolutions, not theoretical figures based on the CCD or the final tape-format resolution.
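Just to make that explicit, here it is as a few lines of Python (the 1.5x pixel-shift gain and the ~25% interlace loss are the rough assumptions stated above, not measured figures):

```python
# Back-of-the-envelope figures from this post, written out explicitly.
ccd_h, ccd_v = 960, 1080                  # FX1 CCD photosites (per chip)
field_v = ccd_v // 2                      # 540 lines captured per field
shifted_h = int(ccd_h * 1.5)              # green pixel shift -> roughly 1440 luma samples
interlace_loss = 0.25                     # rough vertical loss from interlacing
fx1_field = (shifted_h, round(field_v * (1 - interlace_loss)))   # ~ (1440, 405)
fx1_frame = (shifted_h, fx1_field[1] * 2)                        # ~ (1440, 810)

hd10_frame = (960, 700)                   # HD10 figure ("probably closer to 960x625-ish")

print("FX1 effective frame:", fx1_frame)
print("HD10 frame:", hd10_frame)
```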

Tom Roper
October 30th, 2004, 07:23 PM
That's incorrect Ken, sorry. Please post your sources.

Ken Hodson
October 30th, 2004, 07:50 PM
What's not correct?

Ken Hodson
October 30th, 2004, 08:51 PM
For the FX1's specs read here
http://videosystems.com/mag/video_sonys_hdv_debut/

"Sony developed a new 1/3in., Super HAD CCD for its HDV camcorders. Each CCD has 1,012 (horizontal) by 1,111 (vertical) elements (1,120,000 pixels) that provide an effective pixel count of 1,070,000 pixels (972 horizontal by 1,100 vertical. Vertical smear level is rated at a very low -107dB. Each element has a 2:1 aspect ratio."

JVC HD10: I thought this was all common knowledge by now; there have been many posts discussing it in these forums.
The most technical info is probably this article.

"The JVC chip provides 632,640 luma pixels (960?659 pixels)."

http://www.findarticles.com/p/articles/mi_m0HFE/is_5_29/ai_102106333

Tom Roper
October 31st, 2004, 01:31 AM
Ken, I accept the Steve Mullen article. I apologize for misinterpreting your statement also.

The FX1 scales up from 960x1080 (pixel shift); the HD10 resolves down to 960x659 (luma/alias filters).

Tom Roper
October 31st, 2004, 02:50 AM
Interesting to note that, in the horizontal, the amount of down-resolving in the HD10 due to the two-column sliding luma filter, the 2:1 luma/chroma ratio and the anti-alias filters is roughly equal to the green pixel-shift up-scaling in the FX1.

There's no contest on the luma/chroma ratio, but from Steve Mullen,

"...it is true that while a single CCD delivers two luminance samples for each cluster of four filtered CCD elements, the (conventional 1 CCD) chip is able to provide only a single red, green, and blue sample from the cluster. The JVC CCD is a new design that uses white, green, cyan, and yellow filters to deliver maximum effective vertical resolution.

The 2:1 ratio of luma samples to chroma samples is not that significant because both DV and MPEG-2 compression use color subsampling that reduces chroma resolution even further. NTSC DV compression uses 4:1:1 sampling, while HD MPEG-2 compression uses 4:2:0 sampling."

There, he seems to be saying that MPEG-2 compression (which applies to both the FX1 and the HD10) plays a larger role in reducing chroma resolution than the 25% caused by the two-column sliding filter on the JVC's single CCD.
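For anyone not used to the sampling notation, here's a quick sketch of what the two schemes do to chroma resolution (the frame sizes below are just examples):

```python
# Both schemes keep one chroma sample per four luma samples, but they trade
# resolution away along different axes.
def chroma_resolution(width, height, scheme):
    """Chroma plane size for a given luma frame size and subsampling scheme."""
    if scheme == "4:1:1":         # NTSC DV: quarter horizontal chroma, full vertical
        return width // 4, height
    if scheme == "4:2:0":         # HDV MPEG-2: half horizontal, half vertical
        return width // 2, height // 2
    raise ValueError(scheme)

print(chroma_resolution(720, 480, "4:1:1"))     # (180, 480) for NTSC DV
print(chroma_resolution(1440, 1080, "4:2:0"))   # (720, 540) for 1080i HDV
```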

If that's so, there's a lot of non-native interpolation within both images caused by MPEG-2 compression. Isn't that the whole point of GOPs? I-frames are the only truly native frames, outnumbered by the B and P frames, which are predictive in that they mainly encode the changes relative to the frames before and after.

So how do you put a number on THAT?

And if MPEG-2 compression, which applies to single-CCD and 3-CCD sensors alike, is more of a factor in reducing chroma resolution than single-CCD filtering, should you be putting a number (like 25%) on single CCDs at all and not on 3-CCD blocks?

I think one could make the case (speaking of resolution) that you can fairly square them off on native CCD sensor elements alone, and that for GOP sequences the resolution is 1280x659 (JVC HD10) versus 1012x1111 (Sony FX1).

I'm talking about a GOP sequence (where MPEG-2 governs), not a single I-frame capture, where the two-column sliding filter on the JVC's single CCD takes a 25% horizontal resolution hit.

Either way you figure, the difference is only a percentage, but food for thought.
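To illustrate the GOP point, here's a rough sketch of a typical MPEG-2 long-GOP pattern (the 15-frame length and the two-B spacing are common figures, not something taken from either camera's spec):

```python
# Only the I frame is coded entirely from its own pixels; P frames predict
# from the previous I/P frame, and B frames predict from the nearest I/P
# frames on both sides.
def gop_pattern(length=15, b_frames=2):
    """Return the display-order frame types for one GOP, e.g. IBBPBBP..."""
    pattern = []
    while len(pattern) < length:
        pattern.append("I" if not pattern else "P")
        pattern.extend("B" * min(b_frames, length - len(pattern)))
    return "".join(pattern[:length])

print(gop_pattern())   # IBBPBBPBBPBBPBB
```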

Tom

Graeme Nattress
October 31st, 2004, 07:50 AM
I think what Ken is trying to get at is that the resolution of the CCD is not necessarily what gets recorded to tape, and even if the pixel resolution of the format is played back in full, that might not be accurate to the true resolution of the system.

The only way to really know is to measure test images, and even then, what do you do about moving shots which stress the MPEG-2 compression? In the end, it's going to be a subjective call.

Graeme

Tom Roper
October 31st, 2004, 10:27 AM
I agree, Graeme, especially the second part about moving shots which stress MPEG-2 compression.

The camcorder is foremost a motion picture making device, the point being that fine detail which goes unresolved in one frame, due to the two-column sliding filter of a single-CCD device, does get resolved in the next frame when motion shifts that detail into adjacent columns.

And for either design (1-CCD or 3-CCD), what gets recorded to tape also might not reflect the true resolution of the system, due to MPEG-2 compression, as you stated.

So, when talking about resolution, why not stick to the objective versus the theoretical? The one thing that we can define unequivocally is the actual element array of CCD sensors behind the lens that gathers the light. That is what we did before pixel-counting articles by Steve Mullen raised our awareness of what, as you say, will be a subjective call in the end anyway. (On that count, there was never a question from me that the FX1 is a giant leap ahead of the HD10.)

Ken Hodson
October 31st, 2004, 07:40 PM
The MPEG-2 compression will affect chroma sampling just as the DV codec did before it. The FX1 will have an edge in colour resolution due to its 3-chip design, but not a big advantage in luma resolution.

"(On that count, there was never a question from me that the FX1 is a giant leap ahead of the HD10.)"

If an interlaced image is a giant leap, you're welcome to it. I do envy the fully manual controls, though.

Graeme Nattress
October 31st, 2004, 07:58 PM
I would think that yes, 1080i is a big leap over 720p30, which is about the worst frame rate there is. And perhaps the biggest leap (other than manual controls) is that the FX1 footage is not overly sharp like the permanent sharpness lines around everything on the HD10, and especially the HD1, which make it look like high-resolution VHS. And those sharpness lines are something that you can't do anything about in post, whereas 1080i can be very successfully de-interlaced or converted to 24p, or downconverted to progressive or interlaced PAL or NTSC quite well, which a 30p format will not without extensive and expensive processing.

However, 4:2:0 is ghastly on interlaced images, not that it's not ghastly anyway. Give me 4:1:1 over 4:2:0 any day. I'm going to have to code up a new 4:2:0 chroma reconstruction filter before I can do any kind of processing on HDV footage.
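Not the filter I'd actually write, but as a minimal sketch of the reconstruction problem (bilinear upsampling only, with the interlaced case handled per field; the numpy/scipy names are just for illustration):

```python
# The 4:2:0 chroma planes arrive at half resolution in both directions and
# have to be interpolated back up before any serious processing. On
# interlaced material the two fields should be treated separately, otherwise
# colour from one moment in time smears into the other field.
import numpy as np
from scipy.ndimage import zoom

def upsample_chroma_420(cb, cr, progressive=True):
    """Bilinear upsample of half-resolution Cb/Cr planes back to luma resolution."""
    if progressive:
        return zoom(cb, 2, order=1), zoom(cr, 2, order=1)
    out = []
    for plane in (cb, cr):
        top = zoom(plane[0::2], 2, order=1)      # upsample each field's chroma
        bottom = zoom(plane[1::2], 2, order=1)   # on its own...
        full = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), top.dtype)
        full[0::2] = top                         # ...then re-interleave the lines
        full[1::2] = bottom
        out.append(full)
    return tuple(out)
```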

Graeme

Ken Hodson
November 1st, 2004, 01:50 PM
"FX1 footage is not overly sharp like the permanent sharpness lines around everything on the HD10 "

I do not find the HD10 footage to be like this at all. If anything, for an HD image it is very soft. I don't know what you are thinking of.

"or downconverted to progressive or interlaced PAL or NTSC quite well which a 30p format will not without extensive and expensive processing."

30p is just perfect for putting out anamorphic NTSC DVDs, which is where 99.9% of everything I downconvert goes. As well, JVC has been demoing HD10 footage converted to 25p in Europe using Edios software; apparently it does a great conversion to PAL. But I live in NTSC land, and regardless of what you say, it makes one heck of an NTSC DVD!

"However, 4:2:0 is ghastly on interlaced images"

But it works just fine in 720p, thanks very much.
I hope you enjoy massive render times before you can colour-correct that 4:2:0 as a deinterlaced image with any level of quality.

Graeme Nattress
November 1st, 2004, 02:00 PM
4:2:0 does not relate to colour-correction issues so much as to image-manipulation issues more generally, and in that respect it affects 720p nearly as much as it affects 1080i.

What I'm thinking about the sharpness is that it's permanently on, and it's visible as nasty edge enhancement, although much worse on the HD1 than the HD10. I don't see anything like that at all on the posted FX1 footage.

As regards interlaced and progressive, it's a darn sight easier and faster to make interlaced footage progressive than progressive footage interlaced, so shooting 1080i60 is a lot more flexible in post in that respect. But as you say, be prepared for render times. It's obviously preferable to shoot in whatever format you want to end up in, rather than have to convert between them.

I think the main issues with the HD10 are the lack of manual controls, and it sounds like JVC will be addressing this before too long with their next camera. At that point it will be "fair" to make some kind of comparison, but before then you're comparing one obviously flawed (because it's the first of its kind) camera with another that hardly anyone has been able to use yet, and it's only through kind people posting footage that we have any idea at all what's going on.

Graeme

Tom Roper
November 1st, 2004, 02:52 PM
"or downconverted to progressive or interlaced PAL or NTSC quite well which a 30p format will not without extensive and expensive processing."

30p is just perfect for putting out anamorphic NTSC DVD's. Where 99.9% of everything I downconvert goes. As well JVC has been demo'ing HD10 footage converted to 25p in Europe using Edios software. Apparently it does a great conversion to PAL. But I live in NTSC land and regardless of what you say, it makes one heck of a NTSC DVD!

I don't think the JVC footage (from my HD1) downconverts very well to NTSC DVD, because the 30fps progressive 2:2 cadence results in excessive moire on many (mainly U.S.) DVD players.

The problem is not the 30fps progressive 2:2 cadence format per se, but the fact that proper support for it in U.S.-spec DVD players is so spotty. Players using the Faroudja deinterlacing chips in AUTO-2 mode, or Silicon Image chips, will do fine, but among the others it's hit and miss. Yes, the DVDs play, but when the player doesn't adapt to the 2:2 cadence or flags, the video displays excessive moire.

It's not a problem for PAL countries where progressive 2:2 cadence is common, and supported.

DVDs I make look fantastic on my DVD player (Faroudja), but borderline unsatisfactory on other players. It really does depend on the player itself. And since you don't know in advance which player your distribution DVDs will be played on, you can't be certain they will look their intended best on every showing.

I can't speak for the FX1 because I haven't tried making an NTSC DVD from its video. But almost all U.S.-spec NTSC DVD players play any DVD movie with aplomb, because the vast majority of DVDs are 24fps 480 interlaced movies. Almost every U.S. DVD player can be expected to properly play movie titles and support 24fps 480i, with conversion to progressive-scan output and 3:2 pulldown.
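To make the cadence difference concrete, here's a simplified sketch of how the two sources map onto 60 fields per second (it ignores field dominance and repeat-field flags entirely):

```python
# 24p material reaches 60i via 3:2 pulldown (frames repeated for 3 fields,
# then 2); 30p material uses a plain 2:2 cadence (every frame becomes exactly
# two fields). Players that only recognise 3:2 can mis-handle the 2:2 case.
def pulldown_fields(frame_ids, pattern):
    """Expand a list of frame ids into the field sequence a given cadence produces."""
    fields = []
    for i, frame in enumerate(frame_ids):
        fields.extend([frame] * pattern[i % len(pattern)])
    return fields

film_24p = pulldown_fields(list("ABCD"), pattern=[3, 2])   # 4 frames -> 10 fields
video_30p = pulldown_fields(list("ABCDE"), pattern=[2])    # 5 frames -> 10 fields

print("".join(film_24p))   # AAABBCCCDD  (3:2 cadence)
print("".join(video_30p))  # AABBCCDDEE  (2:2 cadence)
```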

That leads me to believe that the FX1's 1080i interlaced video will convert to 480i NTSC with fewer artifacts and less moire, because the cadence is conventional and the native output is interlaced.

Yiannis Kall
November 8th, 2004, 02:14 PM
Interlaced video is far worse than progressive. I have seen images from my camera, interlaced vs. progressive, and there is a big difference in image quality. So I will wait for the next HDV camera from Panasonic or from Canon. The video quality of the new Sony HDV is not so good on my 70" projection screen; the progressive video from my Sony DV has more sharpness than the interlaced video from the Sony HDV.
Don't buy it yet; wait for the other companies' HDV cams and then decide!