January 27th, 2007, 10:48 AM | #1
Inner Circle
Join Date: Aug 2006
Location: Poland
Posts: 4,086
component input: is it deinterlaced?
I have a Fujitsu-Siemens ScenicView P24W, a 24" 1920x1200 LCD monitor. Using its component input, I can play 1080i from my Sony or Canon HDV camera flawlessly - there's the wow factor, etc. The picture has no interlace artefacts whatsoever; actually, I was pretty sure the monitor had a very smart deinterlacer onboard...
Until I used the HDTV component output of my ATI graphics card, connected to the same component input of the monitor in question. The same 1080i material - captured to disk and played back with a variety of software - does show typical interlace artefacts, like jagged edges and the like. Why is that? What is the difference between the component 1080i signal from the camera and that from a captured raw m2t file? The first is uncompressed, but how does that affect the way interlacing is handled by the monitor? I'd appreciate your suggestions/explanations.
January 27th, 2007, 11:05 AM | #2
Major Player
Join Date: Oct 2005
Location: Hillsborough, NC, USA
Posts: 968
It's because your camcorder is sending a true interlaced video signal to the monitor, and the monitor is displaying it as truly interlaced.
Your graphics card, unless you have a specific setting you can adjust, is most likely sending a progressive signal. Also, whatever software you are using to play the file will present the interlaced image as a single progressive frame instead of two distinct fields. To convince yourself, send the captured file back to the camcorder and view it on the monitor connected to the camcorder. It should look "wow"!
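To make the "single progressive frame" point concrete, here is a minimal Python sketch (synthetic data, not camera code) of what happens when a player simply weaves two fields, exposed 1/50 s apart, into one frame: wherever something moved between the fields, adjacent lines disagree and you get the jagged "combing" edges discussed in this thread.

import numpy as np

HEIGHT, WIDTH = 8, 16

def field_with_edge(edge_x):
    # One half-height field: white left of a vertical edge, black right of it.
    f = np.zeros((HEIGHT // 2, WIDTH), dtype=np.uint8)
    f[:, :edge_x] = 255
    return f

# The edge moves 3 pixels between the two field exposures (1/50 s apart).
top_field = field_with_edge(6)       # older field -> even lines
bottom_field = field_with_edge(9)    # newer field -> odd lines

# "Weave": interleave the two fields into one progressive frame.
frame = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
frame[0::2] = top_field
frame[1::2] = bottom_field

# Adjacent lines now disagree wherever the edge moved: the classic jaggies.
comb_pixels = np.count_nonzero(frame[0::2] != frame[1::2])
print(comb_pixels, "pixel positions differ between even and odd lines")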
January 27th, 2007, 11:27 AM | #3
Inner Circle
Join Date: Aug 2006
Location: Poland
Posts: 4,086
I've always thought an LCD monitor can only display progressive (just like plasmas do), hence I believed it deinterlaces the component input from my camera.
Also, I believe the software I'm using sends an interlaced signal, as it has explicit options to deinterlace (blend, bob, compensate, etc.), which I usually do NOT use. So, I'm still unconvinced :( But perhaps I am missing something - so John, please try to explain it to me in more detail. Thanks!
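For readers unfamiliar with those option names, here is a hedged sketch of what "blend" and "bob" typically mean; actual player implementations differ, and the numpy field arrays here are only an assumption for illustration.

import numpy as np

def blend(top_field, bottom_field):
    # One output frame: the two fields averaged together.
    # No combing, but motion leaves a soft double image.
    mixed = (top_field.astype(np.uint16) + bottom_field.astype(np.uint16)) // 2
    return mixed.astype(np.uint8)

def bob(field):
    # One output frame per field: each line simply doubled.
    # Full 50 frames/s motion, at the cost of half the vertical detail.
    return np.repeat(field, 2, axis=0)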
January 27th, 2007, 06:00 PM | #4
Trustee
I know little about the ATIs (I use Nvidia). However, is there a setting in your control panel to tell the graphics card that the output monitor is an HDTV rather than a PC-type monitor? Worth a shot.
January 27th, 2007, 06:05 PM | #5
Inner Circle
Join Date: Aug 2006
Location: Poland
Posts: 4,086
Of course, Peter - the system is configured so that the LCD is working as an extended desktop in "Theater mode", and its format is HDTV 1080i.
January 28th, 2007, 04:59 AM | #7
Inner Circle
Join Date: Aug 2006
Location: Poland
Posts: 4,086
No, it's not a field sequence issue. As to whether the monitor is deinterlacing, well - my question in the original post was about exactly that! I mean, when I play back from my camera, I'm almost sure it is, because the material is of course interlaced, and there are no artefacts. But then, the same clip from disk, played back through the same component interface, looks jerky and has jagged edges.
There is one more factor that only complicates the whole matter: I'm in PAL land, and my clips are 50i; unfortunately, the Catalyst software for the ATI card does not offer 1080i/25Hz (PAL), only 1080i/30Hz (NTSC). Could this be the reason? I'd say the motion could get jerky, but edges?
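The jerkiness part, at least, is easy to quantify. Assuming the card really does resample 50 fields/s to 60 fields/s (an assumption - the exact conversion ATI performs isn't documented here), mapping each output field to the nearest source field shows how often a field must be repeated:

# Hypothetical 50i -> 60i cadence check; the rates are assumptions, not ATI specs.
SOURCE_RATE, OUTPUT_RATE = 50.0, 60.0

source_indices = [round(n * SOURCE_RATE / OUTPUT_RATE) for n in range(12)]
print(source_indices)  # [0, 1, 2, 2, 3, 4, 5, 6, 7, 8, 8, 9]

Two of every twelve output fields repeat a source field - ten stutters per second. And if a repeated field lands with the wrong parity (top-field data shown on bottom lines), it can produce jagged edges as well, not just judder.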
January 28th, 2007, 09:41 AM | #8
Major Player
Join Date: Oct 2005
Location: Hillsborough, NC, USA
Posts: 968
Quote:
If so, you really should try my original suggestion - send the captured video back to the camcorder and connect the camcorder to the LCD display. If the jaggies disappear, it strongly suggests that your computer's video output isn't right.
January 28th, 2007, 01:30 PM | #9
Inner Circle
Join Date: Aug 2006
Location: Poland
Posts: 4,086
Yes John, I mean the very same component input on the same monitor. I'll be doing the tests as you prescribed, but more important to me would be advice on some workaround...
January 28th, 2007, 03:23 PM | #10
Regular Crew
Join Date: Dec 2006
Location: Discovery Bay, CA
Posts: 138
Sounds like your graphics card is converting the interlaced fields to progressive frames, which would lead to a combing effect and interlace artifacts. See if there is a frame blending setting in the control panel of the graphics card.
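If you want to check whether a captured frame really has been woven with motion between its fields, a rough measure (a sketch only, not what any card or player actually uses) is to compare each line against its immediate neighbour versus the line two away; combed frames score well above 1.

import numpy as np

def comb_score(frame):
    # Ratio of adjacent-line difference to same-field (two lines apart)
    # difference; a value markedly above 1 suggests weave combing with motion.
    frame = frame.astype(np.int32)
    adjacent = np.abs(frame[1:-1] - frame[:-2]).mean()    # line n vs n-1
    same_field = np.abs(frame[2:] - frame[:-2]).mean()    # line n vs n-2
    return adjacent / (same_field + 1e-6)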
January 29th, 2007, 11:19 AM | #11
Inner Circle
Join Date: Aug 2006
Location: Poland
Posts: 4,086
The graphics card software (ATI Catalyst) has all options set correctly (no deinterlacing). However, I'd like to stress again that - contrary to what the card's manual says - it doesn't offer the PAL HDTV format (only 1080i/30Hz is available on component output). Perhaps this is the reason?
I've heard that nVidia cards offer much better and more flexible HDTV output; which model would you recommend?