July 20th, 2007, 05:46 AM | #1 |
Regular Crew
Join Date: Mar 2007
Location: Kaapstad South Africa
Posts: 64
video quality?
Hi, I have some digital video on my PC that plays perfectly in Windows Media Player.
But when I use it in certain software I get this result: look at the waving hand in this movie: http://www.discovideo.be/hand.wmv It occurs in the preview window of Sony Vegas and Resolume (VJ software). No matter what I change in my preferences or system, I can't get it to look right. How do I solve this, or what causes it?
July 20th, 2007, 10:14 AM | #2 |
Major Player
Join Date: Oct 2005
Location: Hillsborough, NC, USA
Posts: 968
You are seeing the effects of interlacing - it's normal.
If it doesn't show in Media Player, then Media Player (or the video decoder it is using) is deinterlacing the image. Vegas is doing the correct thing and showing you the entire video image as it should be. Whether you do anything about it or not depends on the final destination of the video. If the video is going to be watched on a normal TV, just leave it. If you are only going to watch it on a computer, then deinterlace the video; Vegas should have that ability.
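If you'd rather deinterlace outside the editor, one common option is ffmpeg's yadif filter. Here's a rough sketch (the file names and encoder settings are just placeholders, and it assumes ffmpeg is installed):

```python
import subprocess

# Deinterlace a 60i clip for computer-only playback using ffmpeg's yadif
# filter. File names below are placeholders, not from the original post.
subprocess.run([
    "ffmpeg", "-i", "input_interlaced.avi",
    "-vf", "yadif",              # deinterlace to progressive frames
    "-c:v", "libx264", "-crf", "18",
    "deinterlaced_for_pc.mp4",
], check=True)
```

Vegas's own deinterlace settings should do the same job inside the editor.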
July 21st, 2007, 04:49 PM | #3 |
Major Player
Join Date: Aug 2004
Location: Durango, Colorado, USA
Posts: 711
John is right. Interlaced televisions blend two still images (fields) into one frame to create a smoother illusion of natural movement. In the US (NTSC video standard), each frame lasts 1/30 of a second. The first image (the lower scan lines for DV) represents 1/60 of a second, and the second image (the upper scan lines for DV) represents the next 1/60.
However, if you freeze a video frame for use as a still image, be sure to de-interlace it before you export back to video for television viewing.
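To make the mechanics concrete, here's a rough sketch (Python/NumPy, with tiny made-up fields; the exact line assignment is an assumption, though DV is lower-field-first) of how weaving two fields captured 1/60 sec apart produces the combing you see on anything that moves:

```python
import numpy as np

height, width = 8, 8  # toy frame; real DV frames are 720x480

# Two snapshots of a moving object, captured 1/60 sec apart.
field_first = np.zeros((height // 2, width), dtype=np.uint8)
field_second = np.zeros((height // 2, width), dtype=np.uint8)
field_first[:, 2] = 255   # object at column 2 in the first field
field_second[:, 5] = 255  # object has moved to column 5 in the second field

# Weave the fields into one frame (first/lower field on odd lines here,
# second/upper field on even lines -- an assumption for this sketch).
frame = np.zeros((height, width), dtype=np.uint8)
frame[1::2] = field_first
frame[0::2] = field_second

# Alternating lines now show the object in two positions: the "comb"
# or serrated edge visible on the waving hand in the clip above.
print(frame)
```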
__________________
Waldemar
July 21st, 2007, 11:14 PM | #4 |
Major Player
Join Date: Feb 2007
Location: Orlando, FL
Posts: 404
What about deinterlacing anyway? I did that on a DVD project last year and got nice results; it seemed a little more like 30p. Playback seems fine on all the TVs I've viewed it on, but I wondered if this was incorrect to do in some way?
Eric
July 23rd, 2007, 10:36 AM | #5 |
Regular Crew
Join Date: Aug 2006
Location: London United Kingdom
Posts: 77
Actually, I am quite sure that Media Player (the newer versions at least) does display interlaced footage as such, rather than deinterlacing it and then displaying it. I think it updates the display 60 times a second with an interpolated version of each field.
It's quite cool: that way you can spot field-order problems that these days you can otherwise only see on a CRT (and who still has one of those?). This would also explain why you don't see the wiggly lines in Media Player but do see them in other players that don't deinterlace or show 60 fields a second, but just display both fields at the same time, which doesn't work.
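For anyone curious, here's a rough sketch (Python/NumPy, with a made-up frame; real players interpolate between lines rather than just doubling them) of that "bob" idea - splitting each frame into its two fields and stretching each back to full height so you get 60 progressive images a second:

```python
import numpy as np

def bob_deinterlace(frame):
    """Split one interlaced frame into two line-doubled progressive images."""
    upper = frame[0::2]   # even lines form one field
    lower = frame[1::2]   # odd lines form the other field
    # Crude line doubling; a real bob deinterlacer interpolates instead.
    first = np.repeat(upper, 2, axis=0)
    second = np.repeat(lower, 2, axis=0)
    return first, second  # which one to show first depends on field order

# A fake 480x720 frame; a real 60i clip would yield 60 such images per second.
frame = np.random.randint(0, 256, (480, 720), dtype=np.uint8)
a, b = bob_deinterlace(frame)
print(a.shape, b.shape)   # both (480, 720)
```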
July 23rd, 2007, 11:02 AM | #6 |
Major Player
Join Date: Oct 2005
Location: Hillsborough, NC, USA
Posts: 968
The Microsoft DV decoder has always generated interlaced frames. The video renderer determines whether they are displayed as such or not. However, interlaced images are always displayed as a single, composite frame.
Re: who has CRTs? I do, and plenty of them! They work perfectly and, for video, are vastly superior to LCD displays as far as dynamic range and color are concerned; e.g., you can display true black on a CRT. CRTs are still the de facto choice for critical monitoring. For computers, CRTs have the advantage of being able to accommodate different resolutions and frame rates without compromising the image. The only LCD displays I have are the ones on laptops. (Just my 2p!)
July 24th, 2007, 12:30 PM | #7 |
Inner Circle
Join Date: Jun 2003
Location: Toronto, Canada
Posts: 4,750
IMO you should absolutely monitor on a CRT TV or broadcast monitor for SD work. The image you see on a computer monitor is generally inaccurate/misleading; you can miss things like interlace flicker, field-order problems, overscan/cropping, etc.
2 - Some LCDs can exceed the contrast ratio of CRTs, and if the ambient lighting is bright an LCD can be much better since it has less glare on the screen. Among broadcast monitors, CRTs generally have a better contrast ratio, but the high-end LCDs are starting to pull ahead of CRTs. The Sony BVM-A line (CRT) is very popular for critical monitoring, although those monitors don't show full resolution perfectly.
October 14th, 2007, 08:45 PM | #9 |
Major Player
Join Date: Feb 2007
Location: Orlando, FL
Posts: 404
I have one 20" CRT tv. That's all we have in our apartment. I'm sure there are plenty of other people who have had a tv for a few years and don't want to get rid of it just because there are other things on the shelves since then. I'll replace it when A) It dies or B) I can afford the luxury of buying a new tv just for the hell of it. :) |