View Full Version : Can an HDV image shot on the Z1 work on all HDTVs?
A. Stone April 15th, 2005, 09:31 AM I wanted to see what HDV looked like outside my little Premiere editing window (since I don’t own an HDTV), so I pleaded for help once again from the DVinfo community. The suggestion was to increase my monitor resolution and play back the AVI file in Media Player. Here is what I noticed: if my monitor was set below the resolution of the frame (1440x1080), the HDV image showed “tears” (interlacing lines) whenever the subject moved quickly. At or above 1440x1080, the image looked fine. QUESTION (finally):
Does this mean that HDTVs - with resolution less than 1080 - will not display an HDV image shot on the Z1 properly?
When looking at HDTVs online, it seemed that the resolution varied quite a bit. Some “HDTVs” had resolutions as low as 800x600.
Thoughts?
Cheers!
Andrew Stone
Ignacio Rodriguez April 15th, 2005, 09:53 AM The simple answer... is that there is no simple answer. What you are experiencing is that interlaced video does not downsample nicely to resolutions higher than half the original. For better results, make sure you deinterlace when compressing the video for Windows Media Player. Deinterlaced video will look much better on computer monitors and on lower-resolution HDTVs, but by deinterlacing your camera's output you will lose both temporal and spatial resolution. A computer monitor will most likely not show you all the potential quality. The best way to see your camera's output is to use a monitor with native 1080i resolution.
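If it helps to see that trade-off concretely, here is a minimal numpy sketch of one very simple deinterlacing approach (a straight "blend" of the two fields of each frame). The frame size and the blend method are illustrative assumptions on my part; real NLEs and encoders use smarter, motion-adaptive deinterlacers.

# Minimal "blend" deinterlace sketch: average the two fields of each
# interlaced frame into a single progressive frame. You lose temporal
# resolution (60 fields/s become 30 frames/s) and some vertical detail,
# but the result downscales cleanly without combing.
import numpy as np

def blend_deinterlace(frame):
    """Average the top and bottom fields of an interlaced frame."""
    top = frame[0::2].astype(np.float32)      # lines 0, 2, 4, ...
    bottom = frame[1::2].astype(np.float32)   # lines 1, 3, 5, ...
    blended = (top + bottom) / 2.0            # one 540-line picture
    # Duplicate lines back up to full height so the frame keeps its size.
    return np.repeat(blended, 2, axis=0).astype(np.uint8)

# Demo with a fake 1080i luma frame (1080 x 1440, as stored by the Z1).
frame = np.random.randint(0, 256, (1080, 1440), dtype=np.uint8)
progressive = blend_deinterlace(frame)
print(progressive.shape)   # (1080, 1440), now free of field combing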
A. Stone April 15th, 2005, 10:18 AM Thanks!
Any suggestions on what native 1080 monitor would be best?
Cheers!
Andrew Stone
George Griswold April 15th, 2005, 11:00 AM I am fortunate to have an HDTV monitor, but if you want a glimpse, you can go to your friendly electronics shop and have them plug it in and play back a tape.
Trust me, the folks there would get a kick out of seeing a handheld HDV camcorder in the flesh--- everyone wins!
George
Peter Moore April 15th, 2005, 12:57 PM All HDTVs are either 1080i or 720p. If they are less than 720p they are not HDTVs by definition.
Now, on 1080i TVs, the footage will look perfect.
On 720p TVs, there are two ways the TV might handle the conversion.
1080/60i is essentially the same as 540/60p. One way to show the image is simply to take each 540-line field, upconvert it to 720 lines, and display 720/60p using the 60 fields per second of the 1080i footage.
A better way is to take every sequential pair of 1080i fields and create 60 progressive frames per second. In 1080i you have:
A1 A2 B1 B2 C1 C2 ... etc.
The conversion would be:
A1+A2 A2+B1 B1+B2 B2+C1
For each combination, the TV would apply a slight 1-pixel blur to remove combing and then downconvert to 720p.
So in either case, your TV should display 1080i source material just perfectly if it is truly an HDTV.
The reason Media Player doesn't do it right is that it doesn't know how to handle interlacing properly. It should create progressive frames first, then blur, then resize; instead it resizes without blurring, which is why you get really weird effects.
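For anyone who wants the field-pairing idea spelled out, here is a minimal numpy sketch of that second approach: weave each adjacent pair of fields into a full frame, soften it vertically to hide combing, then downscale to 720p. The frame sizes, the simple box blur, the nearest-neighbour resize, and the fact that field parity is ignored are all illustrative assumptions, not how any particular TV actually implements it.

# Sketch of 1080i -> 720p60 conversion by pairing adjacent fields:
# A1+A2, A2+B1, B1+B2, B2+C1, ...
import numpy as np

SRC_H, SRC_W = 1080, 1920   # full 1080i frame
DST_H, DST_W = 720, 1280    # 720p output

def split_fields(frame):
    """Return the (top, bottom) 540-line fields of an interlaced frame."""
    return frame[0::2], frame[1::2]

def weave(field_a, field_b):
    """Interleave two 540-line fields into a 1080-line frame.
    Field parity is ignored here for simplicity."""
    frame = np.empty((SRC_H, SRC_W), dtype=field_a.dtype)
    frame[0::2] = field_a
    frame[1::2] = field_b
    return frame

def soften_vertically(frame):
    """Slight vertical blur to reduce combing between woven fields."""
    padded = np.pad(frame.astype(np.float32), ((1, 1), (0, 0)), mode='edge')
    return (padded[:-2] + padded[1:-1] + padded[2:]) / 3.0

def downscale(frame):
    """Crude nearest-neighbour resize from 1080x1920 to 720x1280."""
    rows = np.arange(DST_H) * SRC_H // DST_H
    cols = np.arange(DST_W) * SRC_W // DST_W
    return frame[rows][:, cols]

def fields_to_720p60(fields):
    """Pair each field with its neighbour and output progressive 720p frames."""
    return [downscale(soften_vertically(weave(a, b)))
            for a, b in zip(fields, fields[1:])]

# Tiny demo with random "video": three interlaced frames -> six fields.
frames = [np.random.randint(0, 256, (SRC_H, SRC_W), dtype=np.uint8) for _ in range(3)]
fields = [f for frame in frames for f in split_fields(frame)]
converted = fields_to_720p60(fields)
print(len(converted), converted[0].shape)   # 5 frames of (720, 1280)

A real set would use a motion-adaptive deinterlacer rather than a flat blur, but the frame-pairing sequence is the same one listed above.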
Steve Crisdale April 15th, 2005, 05:48 PM <<<-- Originally posted by Andrew Stone : I wanted to see what HDV looked like outside my little Premiere editing window (since I don’t own an HDTV), so I pleaded for help once again from the DVinfo community. The suggestion was to increase my monitor resolution and play back the AVI file in Media Player. Here is what I noticed: if my monitor was set below the resolution of the frame (1440x1080), the HDV image showed “tears” (interlacing lines) whenever the subject moved quickly. At or above 1440x1080, the image looked fine. QUESTION (finally):
Does this mean that HDTVs - with resolution less than 1080 - will not display an HDV image shot on the Z1 properly?
When looking at HDTVs online, it seemed that the resolution varied quite a bit. Some “HDTVs” had resolutions as low as 800x600.
Thoughts?
Cheers!
Andrew Stone -->>>
Why would anyone suggest setting a monitor resolution of 1440 x 1080? That's the most bizarre thing I've ever heard suggested to any computer user wishing to view HD material from any source.
Not only is 1440x1080 a resolution that's most unlikely to be supported by the hardware, but the true display resolution would be 1920x1080. 1440x1080 is the non-square-pixel format used by the FX-1/Z1 and some broadcasters to help reduce bit-rates without seriously eroding image quality for the given bit-rate, but the 1.3333:1 pixel aspect ratio has to be recognised by hardware/software so it can be adjusted for accordingly.
A computer-driven display always assumes square pixels (unless you tell it otherwise, in which case all your icons, other programs etc. will look distorted), so it won't know how to handle such a resolution.
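To make the pixel-aspect-ratio point concrete, here is a minimal numpy sketch of the horizontal stretch a player has to apply to the Z1's anamorphic frames. The nearest-neighbour resize is purely illustrative; real players use proper filtering.

# The Z1 stores 1440x1080 with a 1.3333:1 pixel aspect ratio; playback
# software must stretch it horizontally to 1920x1080 square pixels.
import numpy as np

PAR = 4.0 / 3.0                      # 1.3333:1 pixel aspect ratio
stored_h, stored_w = 1080, 1440
display_w = round(stored_w * PAR)    # 1440 * 1.3333 -> 1920

def stretch_to_square_pixels(frame):
    """Horizontally resample a non-square-pixel frame to square pixels."""
    cols = np.arange(display_w) * stored_w // display_w
    return frame[:, cols]

frame = np.random.randint(0, 256, (stored_h, stored_w), dtype=np.uint8)
print(stretch_to_square_pixels(frame).shape)   # (1080, 1920)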
I'd be more certain the advice was to set your monitor to a resolution at least equal to 1280x720 (if you have a WS XGA monitor) or better. My Viewsonic G790 CRT 4:3 monitor is set to 1280x1024 at 75Hz refresh for viewing HDTV via my VisionPlus HDTV DVB-T PCI card. I also view FX-1e clips I've shot on this same monitor (though mainly on a 17" WS LCD at true 1280x720), but I'd never use Media Player to do so! Media Player is crap with any HD playback. I'd suggest you use one of the latest versions of either WinDVD or PowerDVD, as their mux/demuxers are now HD compliant and can handle the 25Mbit stream and the interlacing without any problem.
Quality HD/HDV playback on a computer can benefit from the codecs you have installed as well... and Intervideo and Cyberlink have the codecs that are most likely to provide smooth, error-free HD viewing.
The suggestion to go to an electronics store with your HDV camcorder and plug it in via component cable to an HDTV is a good one - and not some crappy plasma, as most plasmas don't have the resolution. You'll be able to see the difference between HD 'ready' (which is marketing speak for rip-off time for the gullible) and true HDTVs. On a true HDTV, the FX-1/Z1 clips make even the best broadcast HD look inferior in quality... and that's saying something, because when you see a good HD show - like CSI Las Vegas - on a good, true HD set, you know why HD is superior to any previous broadcast technology.
Mike Tiffee April 15th, 2005, 09:12 PM <<<-- Originally posted by Andrew Stone :
Does this mean that HDTVs - with resolution less than 1080 - will not display an HDV image shot on the Z1 properly?
Andrew Stone -->>>
The "tears" you're seeing are because you are viewing interlaced footage on a non-interlaced display. On a HDTV, your footage will look great. Some media players (VLC) will deinterlace in real time as you playback and it will correct this for you.
When played back on a 720p set, the set will cross convert the 1080i signal to display properly.
A. Stone April 16th, 2005, 08:47 AM Wow...great feedback! When I showed an HDV clip in media player to a friend, the response was a bit lackluster. It's nice to know that MP is not truly the best way to view this stuff.
David Kennett April 20th, 2005, 08:35 AM Andrew
Conversion between 1080i and 720p can be done well or done poorly, so it's always difficult to evaluate an image that has been converted. Even an analog connection CAN have detrimental effects. I have a JVC HD10 and have seen its output without conversion (all progressive) on my Viewsonic PF790 Professional Series monitor (VGA connection), as well as via component on my Samsung HLN5065W 50", 1280 x 720 DLP TV. At that time, I thought I had seen the camera at its best. I had always edited in native format, so there had never been any conversion.
Only when I built a home theater PC with a DVI connection to the TV was I able to see the "real" picture. At any rate, I saw details I had never seen before. ANYTIME conversion is done there is always the possibility of losses. It is CERTAIN that the picture will not improve.
I recently found an InFocus 1024 x 768 LCD projector as an "open box" bargain. The 1280 x 720p signal required resampling (and a small loss of resolution) to display on the InFocus. Even with that relatively small loss, the picture had definitely lost its "crispness". I returned the projector.
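As a rough back-of-the-envelope way to quantify that loss (assuming, on my part, that the 16:9 picture was letterboxed on the 4:3 panel rather than cropped or stretched):

# Rough arithmetic for resampling a 1280x720 picture onto a 1024x768 panel.
src_w, src_h = 1280, 720
panel_w, panel_h = 1024, 768
act_w = panel_w                         # width-limited on a 4:3 panel
act_h = round(panel_w * src_h / src_w)  # 1024 * 720 / 1280 = 576

print(f"active image area: {act_w}x{act_h}")
print(f"pixels kept: {act_w * act_h / (src_w * src_h):.0%}")   # ~64%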
It will be tough to find a quality monitor that displays a 1080i signal without converting it. Commonly available "pixel-oriented" monitors (LCD, DLP) are 1280 x 720. Consumer CRT monitors just aren't good enough to really evaluate pictures. So many times I see clips or pictures posted for evaluation that have been "fooled around with" just a little too much. Even then it's difficult to get a monitor that's good enough.
I used to think that 1080i was superior to 720p. Today, I find myself drifting to the ABC, FOX 720p camp.
Monitoring for quality evaluation is difficult. Audio has been that way for years!
A. Stone April 20th, 2005, 09:00 AM David...Thank you!
Do you shoot in 1080i and then convert to 720p?
David Kennett April 20th, 2005, 09:18 AM Andrew,
The JVC is 720p. I know, a JVC owner just shouldn't be hanging around here! I just thought my experiences might be helpful to anyone trying to evaluate picture quality.
Steve Crisdale April 20th, 2005, 03:46 PM <<<-- Originally posted by David Kennett : Andrew,
The JVC is 720p. I know, a JVC owner just shouldn't be hanging around here! I just thought my experiences might be helpful to anyone trying to evaluate picture quality. -->>>
Strange that I'm a JVC HD-10u owner. I also own a Sony FX-1e.
You don't hear me telling people of the superior quality of 720p.
Maybe this time the explanation the 720p devotees will offer for my inability to see the superiority of the inferior format will be that my DVI connection isn't as capable as other HD-10 users' DVI connections are.... Get a Grip!!!
At least I own both cameras, and can make an assessment of 720p and 1080i that's based on dispassionate judgement, because both manufacturers screwed with me when I bought their HDV offerings.
David Kennett April 22nd, 2005, 08:15 AM Steve,
I'm not talking at all about these camcorders; neither represents the best of 1080i or 720p. Interlaced scanning makes slow motion, fast motion, and freeze frames harder to do; 720p60 makes all of this easier. The resolution advantage obviously goes to 1080i. The Sony is limited to 1440 horizontal samples, I believe. This is not particularly troublesome because (to my knowledge) satellite and OTA providers limit it to that anyway - apparently a compromise, since the lower raw data rate can ultimately deliver better compressed pictures. My understanding is also that the apparent vertical resolution of interlaced scanning is only about 80% of the actual line count. Add this all up, and the strong apparent advantage of 1080i is certainly diminished.
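To put rough numbers on that, here is a quick sketch, assuming the ~80% interlace figure and the 1440-sample horizontal limit above are taken at face value:

# Back-of-the-envelope comparison of apparent resolution.
interlace_factor = 0.8
apparent_1080i_lines = 1080 * interlace_factor        # ~864 apparent lines
print(f"1080i apparent vertical resolution: ~{apparent_1080i_lines:.0f} lines")
print("720p vertical resolution:            720 lines")
print("1080i horizontal samples (HDV/broadcast): 1440")
print("720p horizontal samples:                  1280")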
After considerable reading, agonizing, and walking the aisles at CC and BB, I bought a 50" DLP rear projector (1280 by 720 chip). I never saw a CRT RP that looked as sharp as either the DLPs or LCDs. The smaller-screen direct-view CRTs look good, but the smaller size makes comparison difficult without a resolution chart. TI is supposed to be coming out with a 1920 x 1080 chip, but I haven't heard about LCDs. There are 9" RP CRTs out there somewhere, but I have not seen one. My point is that a consumer actually seeing a full 1920 x 1080 image is pretty unlikely.
There are certainly other factors that are VERY important to image quality, but they are entirely separate from pixel count and resolution.