November 6th, 2008, 02:08 PM | #1
Major Player
Join Date: Apr 2006
Location: Boston, Massachusetts
Posts: 616
Why is my FX-1 video not interlaced?
Sometimes I shoot with the shutter speed at 1/30 and other times at 1/60. I always de-interlace my footage in After Effects, simply by "interpreting" the footage.
Lately some footage that I bring into After Effects doesn't look interlaced at all, and when I interpret it, it looks really crunchy and bad, while other footage I shot moments later looks severely interlaced and interpreting it makes it look correct/progressive. I'm completely confused; I have a very good eye for interlacing, and I know where to look for horizontal lines, but it seems like some of my footage is randomly progressive. Could it be that the 1/30 shutter speed footage just doesn't look interlaced? All this footage is fast action, and I can't understand why some of it is apparently not interlaced. I am importing this footage from Final Cut, captured in the Apple Intermediate Codec.
November 6th, 2008, 02:22 PM | #2
Wrangler
Join Date: Dec 2002
Location: Mays Landing, NJ
Posts: 11,802
When you shoot at a 1/30 shutter speed (NTSC), the camera does something called "field doubling". In other words, the shutter stays open for 1/30 second, but each field of the video only lasts for 1/60 second. This means that the same data must be written to both interlaced fields, making them identical.
In doing so, however, you are losing half of the vertical resolution: the same 540 lines are written to both fields of the 1080-line interlaced image. If you're downconverting to standard definition it won't matter much, since you only have a total of 480 lines there anyway. See the following article by Adam Wilt, which discusses some related issues on the FX1 and Z1: AJW's HDV Info: Cineframe modes explained
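A minimal numpy sketch of that idea, using made-up frame data rather than anything from the camera: with a 1/60 shutter each field samples a different instant, so weaving them produces combing on motion; with a 1/30 shutter the same exposure ends up in both fields, so the weave looks progressive but only 540 unique lines of detail remain.

import numpy as np

def weave(top_field, bottom_field):
    """Weave two 540-line fields into one 1080-line interlaced frame."""
    frame = np.empty((1080, 1920), dtype=np.float32)
    frame[0::2] = top_field     # even lines come from the top field
    frame[1::2] = bottom_field  # odd lines come from the bottom field
    return frame

# Hypothetical exposures of a moving subject at two instants, 1/60 s apart.
exposure_t0 = np.random.rand(540, 1920).astype(np.float32)
exposure_t1 = np.random.rand(540, 1920).astype(np.float32)

# 1/60 shutter: each field captures its own instant -> combing on motion.
interlaced = weave(exposure_t0, exposure_t1)

# 1/30 shutter "field doubling": one exposure is written to both fields,
# so there is no combing, but only 540 unique lines of vertical detail.
field_doubled = weave(exposure_t0, exposure_t0)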
March 18th, 2009, 10:12 AM | #3
Major Player
Join Date: Apr 2006
Location: Boston, Massachusetts
Posts: 616
That article is about Cineframe 30, and it says it does the same thing as a 1/30 shutter speed. If it's doing the same thing, then why does it exist? Does it create more of a 30p look, rather than the motion blur introduced by a slow shutter?
March 18th, 2009, 08:27 PM | #4
Major Player
Join Date: May 2008
Location: Bangkok, Thailand
Posts: 400
First I have to admit I'm not an expert on any of the pseudo-progressive recording modes on interlaced cameras, but I remember reading a good article somewhere that explained the in-camera processing used to imitate the progressive look. In the case of the Z1/FX1 cameras, Sony uses a technique similar to field blending of the interlaced 60i recording to yield 30 progressive frames (or 24 frames with some sort of pull-down). The footage may or may not look like real 30p/24p, depending on who's looking at it. One thing is certain, though: the footage loses about half the theoretical resolution compared to footage recorded in 1080/60i, as a result of the field blending. Shutter speed is another matter altogether.

Most people I know, including myself, shoot 60i/50i and do whatever needs to (and can) be done in post.

Wacharapong
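Not knowing Sony's exact implementation, here is a rough illustration of what plain field blending does, with numpy and made-up pixel values: the two fields of each interlaced frame are averaged into one set of 540 lines and repeated to full height, which is where roughly half the vertical detail goes.

import numpy as np

def blend_fields(interlaced_frame):
    """Average the two fields of a 1080-line interlaced frame and repeat
    the result to full height, giving one pseudo-progressive frame.
    Vertical detail that differed between the fields is averaged away."""
    top = interlaced_frame[0::2].astype(np.float32)     # 540 lines
    bottom = interlaced_frame[1::2].astype(np.float32)  # 540 lines
    blended = (top + bottom) / 2.0
    return np.repeat(blended, 2, axis=0)                # back to 1080 lines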
March 19th, 2009, 02:19 PM | #5
Major Player
Join Date: Apr 2006
Location: Boston, Massachusetts
Posts: 616
Now that I think about it: when an HDTV plays back 60i, does it have to de-interlace it? If so, isn't it only displaying half of the resolution, through whatever means it uses to de-interlace?

In any case I can't use 60i, because I do visual effects and always need to de-interlace by interpreting the footage first in After Effects. But I don't see what's so great about double the resolution when it means an interlaced image, which just splices two different fields together with horizontal combing. I have to use a shutter speed of 1/30 a lot of the time for many different reasons, and I use 1/60 for straightforward shooting like talking heads, but my final video is always progressive anyway. So why not use Cineframe 30? Even if it throws away half the image, won't that happen eventually with your interlaced footage? 60i is never an option as a final format for my work anyway, so I obviously don't understand it. But could someone tell me how 60i plays back properly on monitors without having to be de-interlaced?
March 19th, 2009, 08:06 PM | #6
Major Player
Join Date: May 2008
Location: Bangkok, Thailand
Posts: 400
You don't see combing artifacts on your computer's LCD screen because one of the following two things is done behind the scenes by either your player or your NLE:
- The software blends two adjacent interlaced fields to create one complete progressive frame for display and playback on your screen. Your 1080/60i thus becomes 1080/30p. Don't confuse this with real 1080/30p: the software just uses the doubled temporal (timing) resolution of the 60i recording to compensate for the lack of complete data required by a real progressive frame.
- The software extrapolates a complete progressive frame from EACH of the interlaced fields you have recorded. This in effect "upscales" or "upconverts" your 1080/60i to 1080/60p for display, using one of a variety of techniques to compensate for the obvious lack of complete frame data in each field (a rough sketch of this follows below).

If you use a shutter speed of 1/30 or slower for your 60i shooting, the camera obviously has to find a way to capture the image during the 1/30 sec. the shutter is open onto two consecutive interlaced fields. I'm not sure how this is normally done with interlaced CCD-sensor cameras like the Z1/FX1, but this capturing of an image across two interlaced fields may have something to do with how your NLE identifies the footage captured from these cameras.

Wacharapong
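As a sketch of the second approach above, here is a simple "bob" deinterlace in numpy (real players and NLEs use more sophisticated, often motion-adaptive methods, and I'm ignoring the half-line spatial offset between fields): each field is stretched back to full height, so 60 fields per second become 60 progressive frames per second.

import numpy as np

def bob_deinterlace(interlaced_frame):
    """Turn one 1080-line interlaced frame into two full-height progressive
    frames, one per field, by interpolating the missing lines (60i -> 60p)."""
    progressive_frames = []
    for field in (interlaced_frame[0::2], interlaced_frame[1::2]):
        field = field.astype(np.float32)               # 540 lines
        lines, width = field.shape
        full = np.empty((lines * 2, width), dtype=np.float32)
        full[0::2] = field
        # Fill each missing line with the average of its two neighbours.
        full[1:-1:2] = (field[:-1] + field[1:]) / 2.0
        full[-1] = field[-1]                           # bottom line has no neighbour below
        progressive_frames.append(full)
    return progressive_frames                          # two frames per interlaced frame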
March 25th, 2009, 10:37 AM | #7
Inner Circle
Join Date: Aug 2005
Location: Atlanta/USA
Posts: 2,515
Most TVs will display the actual video type - 480i, 720p, 1080i, etc. Mine, for example, displays this info for a few seconds when I switch channels or inputs.
March 25th, 2009, 05:04 PM | #8
Inner Circle
Join Date: Mar 2003
Location: Ottawa, Ontario, Canada
Posts: 4,222
|
All progressive displays have a fixed pixel count that they display on each refresh, normally synced in some way to the electricity supply (60 Hz for North America). Newer displays run at multiples of this: 60, 120, 240 Hz, etc. If a display is 1080p, it means it has the pixel count and electronics to display 1920x1080 on each refresh cycle. Earlier displays sold as 1080i were often actually 720p panels of 1366x768, in common with the computer market.

All inputs thus have to be scaled to fit whatever pixel array the display has, as well as deinterlaced for display at the refresh rate. The newer 120 Hz+ displays solve some of the deinterlacing problems by interpolating extra full frames from the interlaced input: they interpolate a full frame from each field and then, from these full frames, interpolate another frame between them. This means there is a new full frame every 120th of a second for fully progressive display. However, the inputs will still have to be scaled if they do not match the pixel count precisely.

Ron Evans
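To make the 120 Hz step concrete, here is a toy version of the interpolation Ron describes, again in numpy. A real TV estimates motion vectors; this sketch just blends neighbouring frames 50/50, which is the simplest possible stand-in.

import numpy as np

def interpolate_to_120(frames_60p):
    """Insert one synthesized frame between each pair of 60p frames so the
    display gets a new full frame every 1/120 s. Real sets use
    motion-compensated interpolation; this toy version just blends."""
    out = []
    for current, following in zip(frames_60p, frames_60p[1:]):
        out.append(current)
        out.append((current.astype(np.float32) + following.astype(np.float32)) / 2.0)
    out.append(frames_60p[-1])
    return out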
April 1st, 2009, 08:08 AM | #9
Major Player
Join Date: Apr 2006
Location: Boston, Massachusetts
Posts: 616
|
Thanks Ron, my TV is a 720p Sharp, 1366x768. And if we can assume that most people don't have top-of-the-line 1080p TVs capable of de-interlacing properly, it seems like de-interlacing on my end, or even shooting in Cineframe 30, is sufficient for most viewers. I know this is a subjective opinion, but I'm stating it in hopes that someone will correct me if I'm totally missing something. At the very least I think Cineframe is fine for my TV.