Canon HV10: does it record in 1920x1080 or in 1440x1080?
Johann Schlossberg May 20th, 2007, 03:07 AM If I capture video over FireWire with any software, the resulting file is 1440x1080. Yet I know the HV10 records in the 1080i HD format, which is 1920x1080.
Where am I doing something wrong?
Mikko Lopponen May 20th, 2007, 03:31 AM They all record in 1440x1080.
John M. Graham May 20th, 2007, 07:43 AM Johann,
It's captured at 1440x1080 because that is how **all** HDV cameras record high-def; it's simply how the format stores the frame. The HV20's sensor is 1920x1080, but when the video is recorded to HDV tape it's stored at 1440x1080. If you were to hook your HV20 up to a 1080p monitor via HDMI on a live feed, you would see a true 1920x1080 display. I honestly don't think you could really tell a difference.
Now, it may seem confusing how you get a 16:9 widescreen aspect ratio when 1440x1080 is 4:3. That is because the pixels are not square but rectangular; 1.33:1 to be exact (they are 1.33 times wider than they are tall), so 1440 of them span the same width as 1920 square pixels (1440 x 1.33 ≈ 1920).
Now, as I stated earlier, you really won't be able to see a difference between 1920 and 1440. As a test, find a nice pretty picture that is 1920x1080 and load it in Photoshop. Then change the width of the image to 1440 (make sure that "constrain proportions" is unchecked so that the height stays at 1080). You'll see your picture squeezed to a 4:3 ratio. Save the file as a 24-bit BMP under a different file name. Now change the width of the image back to 1920 (leaving the height at 1080, of course). We reduced the horizontal resolution to 1440 and stretched it back out to 1920. Finally, compare the original image to the new one. They look virtually identical. If you look very (and I mean very) closely you'll see a minuscule difference.
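If you'd rather script that squeeze-and-stretch comparison than click through Photoshop, here's a rough Python sketch using the Pillow library; the file names are just placeholders for any 1920x1080 still you have on hand.

```python
# Rough sketch of the squeeze/stretch test using Pillow (pip install pillow).
# "original.png" is a placeholder for any 1920x1080 still.
from PIL import Image, ImageChops

original = Image.open("original.png").convert("RGB")   # assumed to be 1920x1080

# Squeeze to 1440x1080, the way HDV stores the frame on tape...
squeezed = original.resize((1440, 1080), Image.LANCZOS)

# ...then stretch it back out to 1920x1080 for display.
round_trip = squeezed.resize((1920, 1080), Image.LANCZOS)
round_trip.save("round_trip.png")

# A quick objective check: the bounding box of the difference image.
# None would mean the two are pixel-identical; in practice the differences are tiny.
print(ImageChops.difference(original, round_trip).getbbox())
```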
Mikko Lopponen May 20th, 2007, 01:43 PM Now, as I stated earlier, you really won't be able to see a difference between 1920 and 1440.
If you can't see a difference, then the original picture doesn't have enough information to fill even 1440. The 1920-wide version has about a third more horizontal samples to work with.
Ryan P. Green May 20th, 2007, 02:30 PM Just a heads-up that while it DOES record to tape at 1440x1080, it does output pure, uncompressed 1920x1080 over HDMI while in live mode.
Alberto Blades May 20th, 2007, 05:39 PM You cannot see the difference because the real resolution is less than 1440 dots. A "1920x1080" sensor means 1920x1080 dots counting all the red, green and blue photosites together; a true 1920x1080 camera would need three 1920x1080 sensors. With some processing methods you can get more resolution than 1920/3 = 640 dots, but probably no more than 1000.
If you take an image from a 7 MP camera, resize it to 1920x1080, then resize another copy to 1440x1080 and scale it back up to 1920x1080, you'll see the difference!
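A quick Pillow sketch of that suggested comparison, for anyone who wants to try it without an image editor (the source file name is a placeholder; any still of 7 MP or more will do):

```python
# Sketch of the 7 MP comparison: one copy scaled straight to 1920x1080, another
# taken via 1440x1080 and stretched back out, so the two can be A/B'd in a viewer.
from PIL import Image

src = Image.open("7mp_still.jpg").convert("RGB")      # placeholder file name

direct = src.resize((1920, 1080), Image.LANCZOS)
via_1440 = src.resize((1440, 1080), Image.LANCZOS).resize((1920, 1080), Image.LANCZOS)

direct.save("direct_1920.png")
via_1440.save("via_1440.png")
```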
Pete Bauer May 20th, 2007, 06:16 PM That's not quite right. The HV10 uses a 3MPx CMOS sensor which delivers 2.07MPx (1920x1080 full raster) HD video, presumably 8-bit color at 4:2:2, from the chip. HDMI out will give the full raster 60i, and HDV to tape will be processed to the HDV2 standard of 1440x1080i60 at 4:2:0. The chroma resolution is less, but luma should still be either 1920x1080 or 1440x1080.
How apparent the differences from a re-sample are will depend on the particular image, the format, the amount of compression, the display technology, and the viewer's eyesight.
Bruno Donnet May 20th, 2007, 07:06 PM I agree with the main idea of Alberto's post; only the calculation of the red, green and blue pixels is not fully correct.
The Canon HV10 and the HV20 have a colored Bayer filter on their CMOS sensor, which means there are 2 green photosites for every 1 blue and 1 red. That split is fine for a 4:2:2 color scheme, but it does not give a true, full 1920x1080 luma resolution.
Luma is derived mainly from the green channel, and with a Bayer filter half of the pixels on the matrix are not green... However powerful the interpolation based on 2 green pixels for every 1 blue and 1 red, you will never get a true 1920x1080 HD resolution.
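To make the Bayer argument concrete, here is a small illustrative Python sketch (NumPy only; the 1920x1080 figure is just an example area) that lays out an RGGB mosaic and counts how many photosites actually sample each colour:

```python
# Toy RGGB Bayer mosaic: each photosite samples only one colour, so full-resolution
# RGB (and part of the luma detail) has to be interpolated from neighbours.
import numpy as np

h, w = 1080, 1920                       # example sensor area used for video
pattern = np.empty((h, w), dtype="<U1")
pattern[0::2, 0::2] = "R"               # red on even rows, even columns
pattern[0::2, 1::2] = "G"               # green on every row...
pattern[1::2, 0::2] = "G"
pattern[1::2, 1::2] = "B"               # blue on odd rows, odd columns

total = h * w
for colour in "RGB":
    count = np.count_nonzero(pattern == colour)
    print(f"{colour}: {count} photosites ({count / total:.0%})")
# Prints R: 25%, G: 50%, B: 25% -- no single colour has a full 1920x1080 grid of samples.
```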
In terms of PQ, the big difference is not between the "1920x1080" and "1440x1080" modes but between the uncompressed and compressed paths: through the HDMI port (delivered at 1920x1080, though 1440x1080 would give the same PQ with the HV10/HV20 sensor) the signal is uncompressed with a 4:2:2 color scheme, versus the MPEG-2/HDV compressed mode at 1440x1080 with a 4:2:0 color scheme.
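To put rough numbers on that compressed-versus-uncompressed gap, here is a back-of-the-envelope Python sketch; it assumes 8-bit samples at 60i and ignores blanking and audio, so treat the figures as ballpark only:

```python
# Ballpark data rates: raw 1920x1080 60i 8-bit 4:2:2 (roughly what the live HDMI feed
# carries) versus the 25 Mbit/s MPEG-2 video stream that HDV records to tape.
pixels_per_frame = 1920 * 1080
bits_per_pixel_422 = 8 + 4 + 4          # full-res Y plus half-horizontal-res Cb and Cr
frames_per_second = 30000 / 1001        # 29.97 interlaced frames (60 fields)

uncompressed_mbps = pixels_per_frame * bits_per_pixel_422 * frames_per_second / 1e6
hdv_mbps = 25                           # HDV 1080i video bitrate

print(f"raw 4:2:2 over HDMI: ~{uncompressed_mbps:.0f} Mbit/s")          # ~994 Mbit/s
print(f"HDV to tape: {hdv_mbps} Mbit/s (roughly {uncompressed_mbps / hdv_mbps:.0f}:1)")
```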
Terence Krueger May 20th, 2007, 08:01 PM the canon is a 1920x1440 bayer patern sensor. basically identical to a digital still camera where each pixel is only one colour. it crops to 1920x1080 for video capture and will output a coded single channel 10 or 12 bit signal to the on board processor. the processor then assembles it into one of a number of different formats depending on where its going to send it.
for tape, it scales the footage to either 1440 x 1080 (hdv standard) or 720x480 (ntsc dv standard). as mentioned dv colour space is 4:1:1, and HDV is 4:2:0. then both are compressed til they look like crap :)
for hdmi, the camera seems from what i can tell to leave the footage at 1920x1080, and convert the colour space to 4:2:2, either 8 or 10 bit, not sure yet. there is no compression here.
component out is the same as the hdmi, just analogue, with the option of an ntsc(ill assume pal in europe) (720x486) or HD 1080i singal.
Now, as far as the "true" HD nature of the sensor goes: well, it's close enough. A 3-chip 1920x1080 setup would be best, but there isn't one that I know of under the price of a luxury car. All the 3-chip HDV cams use pixel shift to create what amounts to a 3-piece Bayer sensor. Most of them don't do anything better than the HV20, and many are substantially lower resolution, since it's not as simple as saying "1920/3". At 1080p resolution there's not much to worry about regarding single-chip cameras. After all, the RED One, the Dalsa Origin, and every professional still camera is single-chip, and no one's complaining there (except maybe Foveon).
Terence