July 5th, 2005, 02:56 PM | #1
New Boot
Join Date: Sep 2004
Location: TO, On
Posts: 18
Image flickering and crazy clients
So I'm using high-quality 3D renderings and bringing them into Final Cut. Just about all of them flicker in some way. I've applied flicker filters, deinterlace filters, everything, and they still flicker. I added some Gaussian blur, which diminished most of the flicker, but the client doesn't want any blurry images whatsoever. Is there any way to maintain the sharpness without the flicker? Or is there an article I can show the client explaining that it's not possible to get that amount of detail, with 1-pixel lines, on video without flickering?
__________________
Cathode Rays Daze
July 7th, 2005, 04:18 AM | #2
RED Code Chef
Join Date: Oct 2001
Location: Holland
Posts: 12,514
|
I am not sure what the problem is here. Computer-generated imagery should not have that problem, especially if you render with things like anti-aliasing and motion blur turned on. What 3D program are you using? I know both of those options exist in NewTek's LightWave, for example.
__________________
Rob Lohman, visuar@iname.com | DV Info Wrangler & RED Code Chef
July 7th, 2005, 07:12 AM | #3
New Boot
Join Date: Sep 2004
Location: TO, On
Posts: 18
|
I received the 3D renderings from a separate company, so I don't know what they used to create them. The problem comes when I bring them into Final Cut: there's just too much detail. I ended up putting the flicker filter at maximum on all of them, as well as Gaussian blur ranging from 0.5 to 1.
I will be delivering it to the client today, so hopefully they don't have major issues with the slight blur. It seems to me it can only be one or the other: sharpness with flicker, or slight blur with no flicker...
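For what it's worth, the sharpness-versus-blur tradeoff isn't all-or-nothing: a blur applied only in the vertical direction spreads 1-pixel horizontal details across adjacent scan lines (and therefore across both interlace fields) while leaving horizontal sharpness untouched. A minimal sketch of the idea, with illustrative kernel weights and a toy image (none of this is from the thread or from FCP internals):

```python
def vertical_blur(rows, kernel=(0.25, 0.5, 0.25)):
    """Apply a 1-D vertical convolution to a 2-D list of pixel values.

    Only neighbouring scan lines are mixed; pixels within a row are
    never blended, so horizontal detail is preserved.
    """
    h = len(rows)
    w = len(rows[0])
    r = len(kernel) // 2
    out = []
    for y in range(h):
        new_row = []
        for x in range(w):
            acc = 0.0
            for k, weight in enumerate(kernel):
                yy = min(max(y + k - r, 0), h - 1)  # clamp at the edges
                acc += weight * rows[yy][x]
            new_row.append(acc)
        out.append(new_row)
    return out

# A 1-pixel-high bright line on scan line 1 of a 4-line frame...
image = [[0, 0, 0], [255, 255, 255], [0, 0, 0], [0, 0, 0]]
blurred = vertical_blur(image)
# ...now has energy on the scan lines above and below it, so both
# interlace fields carry some of it and it can't blink on and off.
print([row[0] for row in blurred])  # -> [63.75, 127.5, 63.75, 0.0]
```

A real pipeline would do this with a dedicated vertical-blur or anti-flicker filter rather than hand-rolled loops; the point is just that the blur only needs to act vertically.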
__________________
Cathode Rays Daze
July 7th, 2005, 09:36 AM | #4
Wrangler
John,
That's the problem with viewing on an interlaced monitor. The general consensus is that no horizontal element of the graphic should be anything other than an even number of scan lines thick. This prevents the element from disappearing and reappearing (flicker), since it will always fall on a displayed field. The other option, which you have already tried, is a slight blur: it 'spills' the horizontal elements onto adjacent fields of the interlace scan so that they don't flicker. Hope this helps. You might want to give this feedback to the folks who produced the graphics. -gb-
July 7th, 2005, 03:39 PM | #5
Major Player
Depending on your setup, there is one way to hopefully improve the quality.

Assuming you are using FCP 5; if not, skip to Part 2.

Part 1: First, set the sequence rendering to best quality. While in the timeline, hit Apple-0, select the Video Processing tab, and set Motion Filtering Quality to Best.

Part 2: Control-click the problem graphics clip in the timeline and set its composite mode to Screen or maybe Add. This forces FCP to render the clip, which should improve the quality. Of course, this doesn't help if there is a clip under the graphic item in the timeline, but if there were, it would probably need rendering anyway.

Another way to do this is to turn off effects handling, but that may cause you more confusion if you forget to turn it back on for your next project.

I should say I am finding FCP 5's rendering quality very good, but simply setting a timeline to high quality doesn't guarantee that the new scaling algorithms kick in. It needs a kick up the arse to get them to render. Apple should have a clip-by-clip render option: Control-click on the item and get an option to force it to render.