View Full Version : 50i vs 25p
Nik Skjoth August 5th, 2007, 03:04 PM After recording footage on my XL2 (PAL) in both 50i and 25p, and then capturing it into Premiere Pro via a FireWire cable, I don't understand why there is no difference in image quality whatsoever. Shouldn't 50i be more smeared due to the interlaced processing?
Another observation I made is that the raw footage on the capture screen, prior to capture, looks somewhat cleaner and sharper than the same footage after it's captured. Can someone elaborate?
Could this have anything to do with the fact that the resolution of the XL2 in 16:9 mode is actually 960x576, while Premiere interpolates it as 720x576?
Is there a difference in FireWire cables, or FireWire interface quality, that would make a capture better or worse? I'm guessing no, since it's all digital, ones and zeroes, but I want to make sure.
Ervin Farkas August 6th, 2007, 09:06 AM To your last question, your guess is correct: it's all digital data, it's either there or it's not - no quality difference whatsoever.
You don't see a difference on your computer monitor because all computer monitors are progressive, never interlaced, so your interlaced footage is probably deinterlaced by your player software before being sent to the monitor. You need to use an external video monitor (a simple television set or a higher quality broadcast monitor) in order to see the difference, provided that your video card has a video output.
Another alternative would be to use software that lets you watch the footage in its original state - try the free VirtualDub, for example.
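If you go the VirtualDub route, a tiny AviSynth script (which VirtualDub opens directly) will show the clip exactly as it was captured, with no player deinterlacing in the way. This is only a sketch - "capture.avi" is a placeholder name, and AviSource needs a VFW DV codec installed to decode the file:

AviSource("capture.avi")  # the captured DV file (placeholder name)
AssumeBFF()               # PAL DV is bottom-field-first
Info()                    # overlays size, frame rate and field order on the picture

Step through it frame by frame; on genuinely interlaced footage, anything that moves will show the familiar comb pattern.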
P.S. Your video (http://www.movielol.org/ip/video/1.php) is hard to watch, you need to slow down your camera movement; that's also the cause of the blockiness - the encoder can't cope with ALL of the data changing from frame to frame.
Nik Skjoth August 6th, 2007, 01:08 PM Thanks for your reply... I suspected that an LCD screen would not show much difference, but some minor unsharpness should still be visible. Theoretically, if you take half of the information away from a picture, you only get half of the quality back, no matter where it's shown...
PS: Ehm... the video you are linking to is not one of mine - why did you think it was?
Ervin Farkas August 6th, 2007, 01:12 PM When I posted my reply, that site was listed under your info as your homepage - looks like you deleted that.
Well, good player software will not take away anything; it will weave the two fields back together to form a whole frame - in other words, it puts back together what interlacing cut into two fields.
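In AviSynth terms (just a sketch, with "capture.avi" standing in for your file), the round trip a weave performs looks like this, and nothing is thrown away:

AviSource("capture.avi")  # placeholder for your captured clip
AssumeBFF()               # PAL DV is bottom-field-first
SeparateFields()          # 25 frames/s become 50 half-height fields/s
Weave()                   # pairs the fields back up into 25 full frames/s

The output is identical to the input; the fields are simply re-interleaved line by line.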
Derek Prestegard August 25th, 2007, 06:08 PM The important thing to remember when comparing 50i/60i and 25p/30p is that the interlaced modes will always have more temporal resolution: they capture 50/60 half-height images (fields) per second.
Now, in a static scene with little motion, they get full spatial resolution, because both fields spatially line up (mostly) and you get a nice sharp full height picture.
In motion, interlacing loses spatial resolution, but it maintains full temporal resolution for that lovely/dreadful (depending on your perspective) video look.
On a progressive display (all PC displays, and flat-panel HDTVs), interlaced video must be deinterlaced. Usually this happens with a bob deinterlacer, which turns 60i/50i into 60p/50p (to retain full temporal resolution) using interpolation and other, more advanced methods, depending on the quality of the deinterlacer.
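The crudest form of bob is built right into AviSynth, which makes it easy to see what the process does. A sketch only, with "capture.avi" as a placeholder:

AviSource("capture.avi")  # placeholder for an interlaced clip
AssumeBFF()               # PAL DV field order; use AssumeTFF() for top-field-first sources
Bob()                     # stretches each field to full height, so 50i becomes 50p

Better bob deinterlacers replace that simple stretch with interpolation from neighbouring lines, fields and frames, which is where the quality differences come from.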
Regular PAL / NTSC CRT TVs can display interlaced material natively, because they scan the picture as alternating fields in the first place; phosphor persistence and persistence of vision blend those fields into a continuous image.
The way I see it, you have to be really aware of your target environment, and your post production techniques. If you're targeting joe average, it's still safe to assume he will be watching your project on an interlaced display. In this case it's usually best to shoot interlaced, because it's what his display natively handles. Unless you're shooting 24p for the filmic look..
Nowadays we've got HDTV, which has some very nice progressive formats. I consider 720p60 to be the best of both worlds, because it has full temporal resolution, 720 progressive lines, and manageable bitrate compared to 1080p60.
If you're shooting DV but target people with HDTVs or their computer monitors (web distro), then it's probably best to either shoot progressive or deinterlace before authoring, as you can often achieve vastly superior results doing this offline in software. AviSynth is a great (free) tool for deinterlacing projects prior to final encoding (among many other useful tasks). Working it into an NLE workflow is tricky unless it's at the very end.
Whatever you do, if you plan on deinterlacing a video, PLEASE don't use the deinterlace functions built into most NLEs, or QuickTime, etc... They are piss-poor. Compressor has some decent deinterlace routines, but the vast majority of commercial apps (except really high-end payware stuff) have terrible deinterlacing. Good AviSynth deinterlacers like MCBob do full motion-compensated deinterlacing, with masking and artifact protection. VERY impressive stuff, albeit very slow (expect 1-2 fps on a very fast system, with lossless codecs in and out, for 4:2:2 YUY2 60i/50i -> 60p/50p).
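For what it's worth, the offline step can be as small as a short AviSynth script fed to VirtualDub or your encoder. This is only a sketch: "capture.avi" is a placeholder, MCBob is assumed to be installed as an .avs script (along with the plugins it depends on), and exactly how it gets loaded varies by version:

AviSource("capture.avi")          # lossless or DV capture (placeholder name)
AssumeBFF()                       # give the deinterlacer the correct field order
# ConvertToYUY2(interlaced=true)  # only if the source doesn't already decode to 4:2:2 YUY2
Import("MCBob.avs")               # assumed file name/location for the MCBob script
MCBob()                           # motion-compensated bob: 50i/60i -> 50p/60p

Render the result to a lossless intermediate and do the final encode from that, which is also why this step belongs at the very end of the workflow.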
Good luck. PM me if you need help with AviSynth
~Derek Prestegard
Nik Skjoth September 16th, 2007, 12:45 AM Well... I've been experimenting further, and although your answers sound plausible, they don't match what I see in practice.
I've tried watching some 50i footage shot with a different camera on the same monitor, and it clearly shows the interlacing on the LCD screen. The 50i footage from my XL2 does not. So either it's due to a difference in resolution making the fields so small that I can't even see them when pausing, or someone on the Canon assembly line put in a 50p CCD by mistake, or whatever it is that makes 50p possible ;P
Also, I noticed that when watching DVDs on my PC, they are interlaced too... hm, interesting. So even Hollywood films shot with high-end native progressive cameras end up being interlaced on the final medium. Then what's the point of shooting progressive, when there is no medium for it except the cinema and webstreams?
Another thing that puzzles me is when you say that progressive shots will look bad, or worse, on a standard TV. How come? If the TV set interlaces everything that comes in, it would make no difference what the source is, according to my logic.
Steve Brady September 23rd, 2007, 11:05 AM Concerning your XL2 footage, it strikes me that it's all being de-interlaced somewhere along the line. I don't use Premiere, so I don't know exactly how it works, but in After Effects, there's an "Interpret Footage" page, with a "Fields and Pulldown" box. The "Separate Fields" rollout should be set to "Off". If Premiere doesn't work the same way, then somebody who uses Premiere can tell you how to set it up.
Regarding the DVDs, it's certainly possible that some PAL DVDs are incorrectly encoded as interlaced, but it's hard to imagine a workflow whereby interlacing artifacts would be easily identifiable on the actual feature. Bonus materials are a different story; they may have gone through several format conversions before they end up on the DVD.