March 29th, 2005, 07:59 PM | #1 |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
720 vs. 1080 debate
A lot of people are busy comparing cameras that either are or are not a reality at this point. How about the two different HD formats?
I haven't seen very many debates over which one people like better for certain projects. Seeing as everybody is picking the heck out of the cameras themselves it would be nice to get some debate over which is better. I personally love progressive video but hate the fact that really 720p isn't that much larger than 480p. Considering that some hard core film people think 1080p HD still isn't good enough for film than 720p would be even worse. 1080i gives great resolution for TV but when deinterlaced has detail close to 720p. For film would having more horizontal resolution of an interpolated 1080i be better than a clean lower resolution 720p source? At first I was very excited about the new Panasonic HD camera but then I cooled down a bit. 720p is much better than 480p but I always thought of 720p 24p for film work as the poor mans HD format. I am in no way knocking 720p but the fact is that 720p's main advantage over 1080i HD video is the increased frame rate of 60p. This gives ultra realistic motion. If you were to shoot both 1080i/p and 720p both at 24 fps 720p would be much lower in terms of quality. When it comes to cameras they are just tools to me and it doesn't matter which one I have. I know at the end of the day no matter which one I use I could tell the same story. The thing that has me stuck between HD cameras however is the HD format itself. |
March 29th, 2005, 09:45 PM | #2 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
Interesting, and there's no "right" answer. Personally, I see interlaced video as a rather outdated form of compression that, since all modern display devices are inherently progressive, should be left in the past.
However, if you treat interlace as compression and good de-interlacing as decompression, 1080i60 will give you a reasonably nice 1080p30 that retains somewhere between 50% and 70% of the resolution a natively progressive 1080p30 camera would deliver. So you would get up to 50% greater horizontal resolution than 720p30, but that would be your only real advantage. On the other hand, 720p should compress better at a given data rate than 1080i60, so there could be an advantage there... And then it comes down to the optics and DSP of the camera, its professional controls and such. Sounds like a fun decision to make! Graeme
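As a back-of-envelope sketch of those numbers (the 50-70% retention range is Graeme's rough estimate for de-interlacing quality, not a measurement), the effective pixel counts work out roughly like this:

```python
# Rough effective-resolution comparison: deinterlaced 1080i60 vs. native 720p30.
# The 0.5-0.7 retention factors are assumed estimates, not measured values.

def effective_pixels(width, height, retention=1.0):
    """Effective pixel count after any de-interlacing resolution loss."""
    return round(width * height * retention)

native_720p = effective_pixels(1280, 720)        # native progressive frame
deint_low   = effective_pixels(1920, 1080, 0.5)  # pessimistic de-interlace
deint_high  = effective_pixels(1920, 1080, 0.7)  # optimistic de-interlace

print(native_720p)  # 921600
print(deint_low)    # 1036800
print(deint_high)   # 1451520
```

Even at the pessimistic end, the deinterlaced 1080 frame still carries more total pixels than native 720p, which is why the horizontal-resolution edge is the one real advantage left.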
__________________
www.nattress.com - filters for FCP |
March 29th, 2005, 10:28 PM | #3 |
Major Player
Join Date: Nov 2003
Location: Los Angeles, California
Posts: 853
|
HORIZONTAL RESOLUTION.....
ya know, nobody ever talks about horizontal resolution... 1440 pixels from left to right is a lot of freakin' pixels! Graeme, how important is that to the eyeball? Do we see vertical information more than horizontal information? Is that why we only talk about the ups and downs? I just bought a Dell 19" LCD HD monitor today to use in the field. It boasts 1280 x 768 (WXGA) native resolution. For instance... I am contemplating whether I am going to get one of those P2 cameras from Panasonic. However, time after time I read people saying "WELL, TODAY'S HD TVs ARE ONLY 720, SO A PANASONIC 720p CAMERA IS PERFECT!" Yeah, that's true, BUT they always FAIL TO MENTION that that same HD TV is also 1280 pixels WIDE!!! And that Panasonic camera only does 960 pixels wide!!! *dumb look* Now... JVC is perfect for a 1280x768 HD TV because JVC's cameras are 1280x720! And, progressive or interlaced, Sony uses 1440 fat-belly horizontal pixels. But with Panasonic, you never hear anything about the skinny horizontal pixels it uses, maxing out at 960. And I wonder why? Is it because horizontal resolution is not as important? - Shannon W. Rawls |
March 30th, 2005, 01:23 AM | #4 | |
Barry Wan Kenobi
Join Date: Jul 2003
Location: North Carolina
Posts: 3,863
|
Quote:
Back on topic: there has been endless debate over 1080i vs. 720p. Someone just pointed out an exhaustive article on Walter Graff's site that covers it. What it all comes down to is that both systems pump about 60 million pixels per second through the TV, and both look absolutely amazing on a high-def television when displayed at their native resolution.

What you will not hear debated is 1080/24p vs. 720/24p; that would be a silly argument. 1080/24p is obviously much superior to 720/24p as a format (although there's plenty of argument about which produces a nicer image to look at, the CineAlta or the VariCam, but that's beyond the scope here). And 1080/24p and 1080/30p are legitimate broadcast standards in the US, ensconced in the ATSC specifications (as are 720/24p and 720/30p). So the main argument comes down to 1080i vs. 720p. And yes, that has been discussed ad infinitum; a Google search should bring up volumes of debate. |
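The "about 60 million pixels per second" figure is easy to verify with simple arithmetic, assuming full-raster frames (1920x1080 at 30 full frames/sec for 1080i60, and 1280x720 at 60 frames/sec for 720p60):

```python
# Pixel throughput of the two ATSC HD formats, per second of video.
# 1080i60 delivers 60 fields/sec, i.e. 30 complete 1920x1080 frames/sec;
# 720p60 delivers 60 complete 1280x720 frames/sec.

pixels_1080i = 1920 * 1080 * 30  # pixels per second for 1080i60
pixels_720p  = 1280 * 720 * 60   # pixels per second for 720p60

print(pixels_1080i)  # 62208000
print(pixels_720p)   # 55296000
```

Both land within about 10% of the 60-million-pixels-per-second figure, which is the sense in which the two formats are roughly equivalent in raw throughput.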
|
March 30th, 2005, 05:12 AM | #5 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
Well pointed out, Barry. I was just comparing 1920x1080 (which hardly any camera actually records) with 1280x720, which some cameras do record but, on the other hand, hardly do justice to.
What I was alluding to with my comments on compression is that you have an issue of pixel quality: how accurate those pixels are. Obviously, you could take a VHS image, interpolate it up to 1080i, and call it a high-definition image. And indeed it is, but none of those pixels will be in any way accurate. Even when compression isn't producing the typical visible artifacts (mosquito noise, quilting, noise pulsing, false grain, differential movement, etc.), it's still there, adding noise, both temporal and spatial, to the image, and the effect of that is, to me, a general reduction in the overall smoothness of the moving image. This type of compression artifacting is very hard to see without an uncompressed comparison image and the benefit of movement, but it is there, and I do tend to see its effects. Practically all video formats suffer from it; it's everywhere to one extent or another.

I think the factor that outweighs 720p vs. 1080i is the ability of the camera operator and the controls of the camera they're working with. Indeed, these factors can even bring the quality of DV above any HD format, if you put a good DV camera with great controls in the hands of an expert while the HD camera is operated by a lesser shooter. I'd rather watch a genius shoot VHS than an also-ran shoot HDCAM. Graeme
__________________
www.nattress.com - filters for FCP |
March 30th, 2005, 11:18 AM | #6 |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
Yes, we are comparing 720p to 1080i, but at the same time, all the HD cameras coming out soon only give us 24p or 30p for 720p.
Even if there were other debates elsewhere, I thought that since so many people were wasting energy debating the new cameras, maybe we should shift the focus from the hardware to the end format. I also thought about what Shannon said about current HD monitors only really giving us 720p resolution right now. How long will it be before we have true 1080i/p TVs? Maybe the main reason both 1080i and 720p from current HD cameras look equally good is that we aren't seeing 1080i in its full glory. If we did have a true 1080 TV, how would 720p look blown up next to the 1080? I think for broadcast, 720p is great right now and will be for a while.

About the actual chip size for the Sony camera: 1080i on its own as a format may have certain advantages over 720p, but what about 1080i coming from the Z1? The chips are 960 x 1080. After de-interlacing, you end up in theory with 960 x 1080 carrying the same detail as 960 x 810 (1080 times 0.75). If the Panasonic uses 960 x 720 pixels, that equals almost the same amount of raw detail. The Sony does use pixel shift to get a little more detail, but only for horizontal resolution. Now, if the JVC camera is going to have true 1280 x 720 pixels on its chips, then the raw image would have more detail than the de-interlaced, pixel-shifted Sony at 1440 x 1080. Having 1920 x 1080 raw pixels would be very nice, even interlaced, but with the Sony we are nowhere near that.

There is also the debate over 24 fps. Even though many hate 24p, the fact is that if you are transferring to film, we are sort of stuck with it right now. Since the JVC and Panasonic cameras will both have true 24p, whatever pixel data we get out of the camera is what we will have; the image will not be interpolated further to get to 24p. If you were to shoot 25p/i with the Sony and shift the audio 4%, you would still have to spend time processing the video and audio for that change.
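A minimal sketch of that chip arithmetic (taking the 0.75 vertical-detail retention after de-interlacing as the assumption stated above, and counting raw chip pixels before any pixel-shift interpolation):

```python
# Effective raw detail from the three chips under discussion.
# The 0.75 de-interlace retention factor is an assumption, not a measurement.

def effective_detail(width, height, vertical_retention=1.0):
    """Approximate effective pixel count of a chip's delivered image."""
    return round(width * height * vertical_retention)

sony_z1   = effective_detail(960, 1080, 0.75)  # 960x1080 chip, deinterlaced
panasonic = effective_detail(960, 720)         # assumed 960x720 progressive chip
jvc       = effective_detail(1280, 720)        # true 1280x720 progressive chip

print(sony_z1)    # 777600
print(panasonic)  # 691200
print(jvc)        # 921600
```

On these assumptions the three cameras land within about 30% of one another in raw detail, which is why they come out looking roughly on par before pixel shift and processing are considered.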
This actually makes any decision even harder, because now all three HD cameras seem to be on par in terms of detail. Of course, we will not know for sure until the other two cameras come out. One other thing I would like to point out is that 720p is a heck of a lot easier to edit right now. Even though in theory both have around the same data rates (roughly the same as DV), any processing of the footage is done at the pixel level, and 1080i/p needs a lot of horsepower. The system I use now can easily handle a couple of streams of 720p in real time. |
March 30th, 2005, 04:46 PM | #7 |
Major Player
Join Date: Jan 2004
Location: Katoomba NSW Australia
Posts: 635
|
<<<-- Originally posted by Thomas Smet : Now if the JVC camera is going to have true 1280 x 720 pixels on the chips then the raw image would have more detail than the de-interlaced pixel shifted SONY at 1440 x 1080. -->>>
Putting mathematical theory aside for a moment... Without getting caught up in pixel numbers or any other pseudo-logical rationale for one format being better than another, the 1080i FX-1e clips look significantly better on my Aquos HDTV LCD than the 720p clips from the JVC HD10u. I'm sure I'm going to get the "but that LCD isn't true 1080 resolution, so it's re-sampling the image" or "that HDTV can't support 1080i PAL and 720p NTSC" and "the HD10u isn't a 3-chip camcorder" and all the usual bollocks I've heard many times before. Theoretically, certain insects shouldn't be able to fly... but they do, and in this case the result is the same: the proof is in the pudding. |
March 30th, 2005, 04:54 PM | #8 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
But the JVC was probably the worst (and, at the time, cheapest) HD camera ever made, and it certainly doesn't have the resolution in its one chip to do justice to its 720p designation; and although the FX1 is 1080i, it certainly doesn't have the resolution that a high-end 1080i camera would have. The argument of 720p vs. 1080i is not advanced by comparing the worst 720p and 1080i cameras around. (Not that they're bad, but every other high-def camera costs about 10 times the price, and they all look significantly better...)
Graeme
__________________
www.nattress.com - filters for FCP |
March 30th, 2005, 05:31 PM | #9 |
Contributor
Join Date: Jan 2003
Location: Kansas City, MO
Posts: 4,449
|
Steve, you're comparing a single-chip camera to a 3-chip one. If they both recorded to the same format, the Sony would still look better. Regardless of the format, the recording can never make the picture look any better than what the chips produce.
|
March 30th, 2005, 06:56 PM | #10 |
Trustee
Join Date: May 2004
Location: Knoxville, Tennessee
Posts: 1,669
|
>>Theoretically certain insects shouldn't be in the air....
The myth persists that science says a bumblebee can't fly. Indeed, this myth has taken on a new life of its own as a piece of "urban folklore" on the Internet.......No one "proved" that a bumblebee can't fly. What was shown was that a certain simple mathematical model wasn't appropriate for describing the flight of a bumblebee. http://www.sciencenews.org/articles/20040911/mathtrek.asp |
March 30th, 2005, 11:25 PM | #11 |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
lol
|
March 31st, 2005, 02:18 PM | #12 |
Major Player
Join Date: Feb 2005
Location: UK
Posts: 383
|
Very simple answer to this one as far as I am concerned: BOTH have their place in broadcasting.
|
March 31st, 2005, 02:31 PM | #13 |
Major Player
Join Date: Jul 2003
Location: SF, Ca
Posts: 421
|
Well, with the new Panasonic, maybe now the discussion is 720/60p vs. 1080/24p.
|
March 31st, 2005, 03:22 PM | #14 |
Barry Wan Kenobi
Join Date: Jul 2003
Location: North Carolina
Posts: 3,863
|
Or is it a discussion at all? With the new Panasonic, you can have it all in one camera: 1080/24p, 1080/30p, 1080/60i, 720/24p, 720/30p, 720/60p...
|
March 31st, 2005, 10:57 PM | #15 |
Trustee
Join Date: Aug 2003
Location: Vancouver BC Canada
Posts: 1,315
|
All of the new HD DVDs that are out (T3, Matrix Reloaded, Attack of the Clones) are 720p. As a distribution format, it takes some decent hardware to watch 720p, never mind 1080i; 1080p is pushing all tech at the moment. I think it is safe to say that high-bandwidth 720p (HD DVD, etc.) looks much better than low-bandwidth (19 Mbps) 1080i TV. As a low-cost acquisition format, 720p (HDV) gives far more data per pixel than 1080i. Pushing the boundaries of compression just to promote 1080i, when you are capturing an image that wouldn't even fill 720p, is only about selling cams to the sheep who need big numbers; it isn't an advancement. If the industry settled on 720p we would be much better off. The future isn't all resolution: resolution plus lack of compression plus frame rate = video advancement. Watching over-compressed 1080i in an interlaced format at yesterday's frame rate was such a sideways move. But as always, bigger numbers sell.
My 2 compressed cents.
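The "data per pixel" point can be made concrete with rough bits-per-pixel arithmetic. This is a sketch under stated assumptions: 19 Mbps is the ATSC-style broadcast payload mentioned above, and the 720p case here is taken at 30 frames/sec (as HDV 720p30 records); actual encoder efficiency varies.

```python
# Rough bits-per-pixel at a 19 Mbps stream for each format.
# Assumptions: 1080i60 = 30 full 1920x1080 frames/sec; 720p taken at 30 fps.

def bits_per_pixel(bitrate_mbps, width, height, frames_per_sec):
    """Average bits available per delivered pixel at a given bitrate."""
    pixels_per_sec = width * height * frames_per_sec
    return bitrate_mbps * 1_000_000 / pixels_per_sec

bpp_1080i  = bits_per_pixel(19, 1920, 1080, 30)  # broadcast-style 1080i
bpp_720p30 = bits_per_pixel(19, 1280, 720, 30)   # 720p30 at the same bitrate

print(round(bpp_1080i, 2))   # 0.31
print(round(bpp_720p30, 2))  # 0.69
```

At the same bitrate, 720p30 gets more than twice the bits per pixel of 1080i, which is the arithmetic behind the claim that lower-resolution, less-compressed formats can look cleaner than heavily compressed 1080i.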
__________________
Damnit Jim, I'm a film maker, not a systems tech. |