February 9th, 2011, 04:02 AM | #1 |
Major Player
Join Date: Mar 2007
Location: Madrid, Spain
Posts: 238
|
HD recording for the web: 720p or 1080p
Hi:
I was wondering what's best when recording for web distribution: 720p or 1080p? Most people watching video on the web don't have a Full HD capable display (yes, you can use your TV, but I think most use a laptop), so the video will have to be downscaled anyway. Now, recording at a constant bit rate, 720p has less than half the pixels of 1080p, which means less compression, so I figure that what's lost in resolution is won back in image quality, since each frame is less heavily compressed? Has anyone compared the result of downscaling in the camera against downscaling in post? Thanks, Erik
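To put rough numbers on that "fewer pixels, less compression" idea, here is a quick back-of-the-envelope sketch (the 50 Mbps and 25 fps figures are just examples, not tied to any particular camera):

```python
# Back-of-the-envelope bits-per-pixel at a constant bit rate.
# 50 Mbps / 25 fps are example figures, not any camera's spec.

def bits_per_pixel(width, height, fps, bitrate_bps):
    """Average encoded bits available per pixel, per frame."""
    return bitrate_bps / (width * height * fps)

rate = 50e6  # 50 Mbps
print("720p25 :", round(bits_per_pixel(1280, 720, 25, rate), 2), "bits/pixel")   # ~2.17
print("1080p25:", round(bits_per_pixel(1920, 1080, 25, rate), 2), "bits/pixel")  # ~0.96
# 1280x720 is about 44% of the pixels of 1920x1080, so at the same bit rate
# each 720p pixel gets roughly 2.25x the bits of a 1080p pixel.
```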
February 9th, 2011, 05:10 AM | #2 |
Inner Circle
Join Date: Jan 2004
Location: Boca Raton, FL
Posts: 3,014
|
So to restate the question, I think you are asking whether it is "better" to:

1) shoot 720p, then compress to 720p for the web, or
2) shoot 1080p, downrez to 720p in post, then compress to 720p for the web.

In both cases there is a massive compression of the video down to web data rates. That is where the most significant and most noticeable degradation of image quality happens.

I would think the factors that go into the decision have more to do with which shooting resolution gives you the image you want. For example, in most cameras the light sensitivity in 720p is better than in 1080p, so you can get a "better" image if you are shooting in low light. If not, it's moot. Similarly, if you shoot 1080p you have more resolution to work with for Ken Burns-style moves and cropping in post. Lastly, converting from 1080 to 720 can, if you aren't careful, introduce aliasing.

So there are other trade-offs involved that I think have a more significant impact on image quality in the final product than the differences in raw data capture. The impact of the compression to web data rates is orders of magnitude larger than the capture differences. IMHO
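On the aliasing point, how the 1080-to-720 downscale is filtered matters at least as much as which resolution you started in. A minimal sketch of the difference (assuming Pillow is installed; the file names are hypothetical):

```python
from PIL import Image

frame = Image.open("framegrab_1080.png")  # a 1920x1080 still exported from the footage

# Naive nearest-neighbour subsampling: fine detail aliases (stair-stepping, moire).
naive = frame.resize((1280, 720), Image.NEAREST)

# Filtered resample: low-passes the image as it scales, which is roughly what a
# well-implemented camera or NLE downconversion does.
filtered = frame.resize((1280, 720), Image.LANCZOS)

naive.save("720_naive.png")
filtered.save("720_filtered.png")
```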
February 9th, 2011, 05:25 PM | #3 |
Major Player
Join Date: Jan 2004
Location: Kelowna, BC [Canada, Eh!]
Posts: 257
|
It also depends on the camera and its codec.
|
February 9th, 2011, 09:49 PM | #4 |
Trustee
Join Date: Jan 2008
Location: Mumbai, India
Posts: 1,385
|
The most general answer is 720p. The difference after compression on the web is hardly noticeable (if both compressions are done properly).
Why not check YouTube for videos available in both 720 and 1080 and see for yourself? You can download those videos and analyse them in your NLE too.
__________________
Get the Free Comprehensive Guide to Rigging ANY Camera - one guide to rig them all - DSLRs to the Arri Alexa. |
February 10th, 2011, 02:31 AM | #5 |
Major Player
Join Date: Mar 2007
Location: Madrid, Spain
Posts: 238
|
Maybe I can state my question more clearly. Assume you have a camera with a full HD sensor, 1920x1080 effective pixels. It can shoot 1080p or 720p at, say, 24fps / 50 Mbps MPEG-2 in 4:2:2, although the codec etc. is irrelevant as long as it's the same for both. Then the full process is:

720p: capture 1080p, downres to 720p -> compress 4:2:2 50 Mbps -> post production -> compress for web
1080p: capture 1080p -> compress 4:2:2 50 Mbps -> post production -> compress for web -> downres to 720p

For comparing quality it suffices to compare:

720p: capture 1080p, downres to 720p -> compress 4:2:2 50 Mbps
1080p: capture 1080p -> compress 4:2:2 50 Mbps -> downres to 720p

In the 720p case I'm assuming the camera scales down first and then compresses, whereas in the 1080p case the compression is done in camera while the downscaling is done when the video is viewed. The question is whether the order of these operations matters, i.e. whether the end results are similar. In other words: knowing that the video will be viewed at 720p, will shooting straight in 720p give a better result?
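If you want to test the order-of-operations question before you even have the camera, one rough approach is to build both chains from the same high-quality 1080p source and measure each against a reference downscale. A sketch, assuming ffmpeg with libx264 is available (x264 stands in for the camera's MPEG-2 here, since it is the ordering of the scale and the lossy encode that is being compared; the file names are hypothetical):

```python
import subprocess

def run(args):
    subprocess.run(args, check=True)

SRC = "master_1080p.mov"  # hypothetical high-quality 1080p source clip

# Reference: a near-lossless 720p downscale of the master to measure against.
run(["ffmpeg", "-y", "-i", SRC, "-vf", "scale=1280:720:flags=lanczos",
     "-c:v", "libx264", "-qp", "0", "ref_720.mp4"])

# Chain A ("shoot 720p"): downscale first, then compress at 50 Mbps.
run(["ffmpeg", "-y", "-i", SRC, "-vf", "scale=1280:720:flags=lanczos",
     "-c:v", "libx264", "-b:v", "50M", "chain_a_720.mp4"])

# Chain B ("shoot 1080p"): compress 1080p at 50 Mbps, then downscale the result.
run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264", "-b:v", "50M", "tmp_1080.mp4"])
run(["ffmpeg", "-y", "-i", "tmp_1080.mp4", "-vf", "scale=1280:720:flags=lanczos",
     "-c:v", "libx264", "-qp", "0", "chain_b_720.mp4"])

# SSIM of each chain against the reference; higher means closer to the master.
run(["ffmpeg", "-i", "chain_a_720.mp4", "-i", "ref_720.mp4", "-lavfi", "ssim", "-f", "null", "-"])
run(["ffmpeg", "-i", "chain_b_720.mp4", "-i", "ref_720.mp4", "-lavfi", "ssim", "-f", "null", "-"])
```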
February 10th, 2011, 02:33 AM | #6 |
Major Player
Join Date: Mar 2007
Location: Madrid, Spain
Posts: 238
|
No, actually to test this you need to shoot two videos, one in 720 and one in 1080. I'm not aware of anybody having done that test, but if someone has, please point me to it with a link.
|
February 10th, 2011, 06:24 AM | #7 |
Trustee
Join Date: Jan 2008
Location: Mumbai, India
Posts: 1,385
|
There are many videos on YouTube shot in 1080 and offered in both 1080 and 720 (and lower, of course).
Anyway, the answer is to shoot the highest possible resolution (in your case 1080), complete the post production in the same resolution, and only output 720p for internet video during the final render, in the format it will be shown in. In most cases 720p will be more than enough, and the jump in quality that 1080 affords is somewhat negated by the compression on the internet (which is optimized for speed rather than quality).

Another thing I forgot to mention is that 1080 video doesn't play back smoothly on many consumer machines due to hardware limitations (assuming viewers have the bandwidth to download it in the first place).

YouTube and Vimeo have different specs, so your settings will change depending on your final delivery platform. The only way to know for sure is to sample your videos and test them yourself. You may find your preference contrary to what I've suggested, but that's okay; there's no right or wrong here. Hope this helps.
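For that final render, something along these lines is a common starting point (an illustrative ffmpeg/x264 recipe, not an official YouTube or Vimeo spec, so check your platform's current upload guidelines; the file names are hypothetical):

```python
import subprocess

# Single downres at the very end of the chain, straight to an upload-friendly file.
subprocess.run([
    "ffmpeg", "-y", "-i", "timeline_export_1080p.mov",
    "-vf", "scale=1280:720:flags=lanczos",
    "-c:v", "libx264", "-preset", "slow", "-crf", "18",
    "-pix_fmt", "yuv420p",              # broad player compatibility
    "-c:a", "aac", "-b:a", "192k",
    "-movflags", "+faststart",          # lets the file start streaming sooner
    "upload_720p.mp4",
], check=True)
```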
__________________
Get the Free Comprehensive Guide to Rigging ANY Camera - one guide to rig them all - DSLRs to the Arri Alexa. |
February 10th, 2011, 10:16 AM | #8 | |
Major Player
Join Date: Mar 2007
Location: Madrid, Spain
Posts: 238
|
As mentioned in my previous post, there is a difference between shooting in 720p and shooting in 1080p then downscaling:

720p: capture 1080p, downres to 720p -> compress 50 Mbps
1080p: capture 1080p -> compress 50 Mbps -> downres to 720p

The order of the operations changes. In the first case, shooting directly in 720p, you simply end up with a 50 Mbps video stream. In the second case a 50 Mbps stream is downscaled and data is thrown away, which may produce a visually inferior result. So, to restate my question again: is the difference visually significant? BR, Erik
|
February 10th, 2011, 03:41 PM | #9 |
Major Player
Join Date: Dec 2006
Location: Auckland, New Zealand
Posts: 444
|
Erik - the only way to know this with a given camera is to test that camera.
For example, at work we have an HPX172. I tested its 1080 output against its 720 output and would go with 720 in all scenarios, because the camera is actually uprezzing to get its 1080 output; the artifacts become obvious and will get worse when put under further compression. Also, a lot of cameras still only do 1080i, not 1080p, so if you aren't getting true progressive you are throwing away that extra resolution when going to progressive web output later.

If you shoot in 720p and your camera is natively 1080p, then the camera is handling the downres to 720 in real time. This is good for workflow as it saves you time, but it is less flexible because you have no control over the downres: the camera may do a good job in most situations and a less good job in others. If you downres at the end you can fine-tune the downres, but that can take a lot of time for little gain unless you are trying to avoid something in particular that the in-camera downconversion doesn't handle well.

My gut feeling is that if you are looking to deliver in 720, let the camera handle the downres for convenience, unless in tests it exhibits artifacting you don't like. Overall quality shouldn't take a hit (most downres techniques, properly implemented, aren't too dissimilar once you get down to it), but artifacting can be introduced on certain subject matter, and that's something web compression will generally make worse rather than better.

Regarding the bit rate: in most cameras the tendency is for 720p to offer twice the frame rate at the same bit rate as the larger format, and in the case of 30/25p that is usually just 720p60/50 with doubled frames, so you don't get less compression by going this route. 1080 normally doesn't offer a 60/50p frame rate at the same bit rate; if it does, then you MAY be better off with a 720p60/50 version because it will be less compressed, but not necessarily, because it still depends on how well the camera handles the downres.

So it all comes back to testing. There is no inherently better solution, only the quality of the methodology with which those solutions are implemented.
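To put rough numbers on the frame-rate point (illustrative only: real encoders don't spread bits this uniformly, and actual camera modes vary):

```python
def bits_per_pixel(width, height, fps, bitrate_bps):
    return bitrate_bps / (width * height * fps)

rate = 50e6  # same 50 Mbps pool in both modes (example figure)
print("1080p25:", round(bits_per_pixel(1920, 1080, 25, rate), 2), "bits/pixel")  # ~0.96
print("720p50 :", round(bits_per_pixel(1280, 720, 50, rate), 2), "bits/pixel")   # ~1.09
# Doubling the frame rate roughly cancels the pixel-count saving, so 720p50 at
# the same bit rate is not automatically "less compressed" than 1080p25.
```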
__________________
www.afterglow.co.nz |
February 10th, 2011, 10:02 PM | #10 |
Trustee
Join Date: Jan 2008
Location: Mumbai, India
Posts: 1,385
|
As I stated before, in my opinion it isn't - not for the internet.
__________________
Get the Free Comprehensive Guide to Rigging ANY Camera - one guide to rig them all - DSLRs to the Arri Alexa. |
February 11th, 2011, 02:55 AM | #11 |
Major Player
Join Date: Mar 2007
Location: Madrid, Spain
Posts: 238
|
Thanks, Greg:
The camera I'm looking at, the Canon XF100, is not yet available, which is why I'm asking this rather hypothetical question. It offers a full HD CMOS sensor with native 1080p at 25fps, or downscaled 720p at 50fps or 25fps. I know I can choose to shoot 720p at 50fps and get, e.g., smoother video, in particular for slow motion, or shoot at 25fps to gain a stop. Obviously, to make the comparison, both 1080p and 720p must be shot at the same frame rate. When I get my hands on this camera I'll run a test. BR, Erik
February 11th, 2011, 11:13 AM | #12 |
Major Player
Join Date: Jul 2006
Location: New York NY
Posts: 322
|
That's a subjective question. What's significant to some may be insignificant to others. I think you'll need to run your own tests and make your own decision.
|
February 13th, 2011, 07:25 AM | #13 |
Trustee
Join Date: Apr 2008
Location: Cornsay Durham UK
Posts: 1,992
|
I have the HPX-301 and shoot everything in AVC-Intra 100 at 1920x1080, 25PN (native progressive). I then ingest into FCP as ProRes HQ and export a master in the same format.
I then make an Apple TV file for web use, which is 1280x720p at 5 Mbps or 1.5 Mbps. I think 720p is better for web delivery as it gives a good balance of quality against stream speed; I don't think doing it at 1080p would be hugely different at 5 or 1.5 Mbps.
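To illustrate why the resolutions converge at these delivery rates (rough arithmetic using the 5 and 1.5 Mbps figures above and assuming 25 fps):

```python
def bits_per_pixel(width, height, fps, bitrate_bps):
    return bitrate_bps / (width * height * fps)

for mbps in (5, 1.5):
    rate = mbps * 1e6
    print(mbps, "Mbps  720p25 :", round(bits_per_pixel(1280, 720, 25, rate), 3))
    print(mbps, "Mbps  1080p25:", round(bits_per_pixel(1920, 1080, 25, rate), 3))
# Both resolutions end up with a small fraction of the bits per pixel of the
# 100 Mbps acquisition format, so extra capture resolution buys relatively
# little in the delivered file.
```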
__________________
Over 15 minutes in Broadcast Film and TV production: http://www.imdb.com/name/nm1044352/ |
February 18th, 2011, 05:53 PM | #14 |
Regular Crew
Join Date: Aug 2004
Location: Stevens Point, Wi
Posts: 156
|
I have studied this entire thread. I am not new to video, but video is a retirement venture and my understanding is not as deep as it needs to be. I get by, but I don't always understand why things work.
I shoot with a Canon 7D and a Canon 5D Mk II. The 7D is primary, and it will shoot 1280x720 at 60 fps; the 5D only shoots 1920x1080 at 30/24 fps. I have been shooting both cameras at 1920x1080/30 with a shutter speed of 1/60. I shoot wildlife, and I am one year into a project filming Kodiak brown bears at close quarters. The immediate output is for the internet; I hope to combine a few years of filming into a high-quality Blu-ray for sale.

I have seen, on the internet, some spectacular slow-motion results shot at 1280x720/60 and conformed to 30 fps and slower. Some scenes I have shot would look spectacular reduced to 50% or much less. Of course, when I slow my current 1920x1080/30 footage to 50% or more I get considerable motion blur. I have been using the optical flow retiming in Final Cut's Motion. When shooting, I never know in advance what I will want to use as slow motion.

My questions: In my current scenario, could I reduce motion blur by using a faster shutter speed of, say, 1/125? Would that result in jerky motion in rushes played at 100%? Could I combine footage shot at 1280x720/60 and slowed down considerably with rushes shot at 1920x1080/30 without a noticeable difference in quality? I want to have my cake and eat it too. Thanks for your patience. Kent
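On the shutter-speed part of this, one way to frame it is in shutter-angle terms. A rough sketch (the numbers are simple arithmetic; whether the resulting motion reads as "jerky" is still subjective and depends on the subject and how much panning there is):

```python
def shutter_angle(fps, shutter_denominator):
    """Equivalent shutter angle in degrees for a frame rate and a 1/N s shutter."""
    return 360.0 * fps / shutter_denominator

for fps, denom in [(30, 60), (30, 125), (60, 125)]:
    print("%d fps at 1/%d s -> %.0f degrees" % (fps, denom, shutter_angle(fps, denom)))
# 30 fps at 1/60 s  ~ 180 degrees: the conventional "film-like" amount of motion blur.
# 30 fps at 1/125 s ~  86 degrees: crisper frames (better raw material for slowing
#   down in post) but more staccato-looking motion at normal playback speed.
# 60 fps at 1/125 s ~ 173 degrees: close to a 180-degree look once the 60 fps clip
#   is conformed to 30 fps for 50% slow motion.
```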
February 19th, 2011, 04:09 AM | #15 |
Major Player
Join Date: Dec 2006
Location: Auckland, New Zealand
Posts: 444
|
In my experience 1280x720 should upres reasonably well to 1920x1080 - after all, this is what most consumers watching 720 broadcasts in Europe on their 1080 screens are experiencing, and most don't complain.
Also, in the case of the 7D and 5D, both resolve far less detail than the resolution they actually record in video mode, so upconverting the 7D footage to 5D size shouldn't make a huge difference in the final analysis.

Increasing your shutter speed will create issues with the look of motion on certain subjects if you don't end up slowing the footage down in post, and may also cause flickering under fluorescent light sources etc. You need to decide on a shot-by-shot basis whether you are shooting specifically for post-processed slow motion beyond what playing 60 fps footage back at 30 fps (a 50% slow-mo) gives you.
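If you want to see how 720p60 material holds up next to native 1080p30 before committing a shoot to it, one rough offline test (a sketch assuming ffmpeg with libx264; the file names are hypothetical) is to conform and upscale a sample clip and then cut it against a 1080p30 clip of similar content:

```python
import subprocess

# Conform a 720p60 clip to 50% slow motion at 30 fps and upscale it to 1080p.
# setpts=2.0*PTS doubles every timestamp, so all of the original 60 fps frames
# play back over twice the duration - i.e. 30 fps at half speed, no frame blending.
subprocess.run([
    "ffmpeg", "-y", "-i", "bear_720p60.mov",
    "-vf", "setpts=2.0*PTS,scale=1920:1080:flags=lanczos",
    "-r", "30",
    "-c:v", "libx264", "-crf", "16",
    "-an",  # drop the audio; it would be out of sync once the video is retimed
    "bear_slowmo_1080p30.mp4",
], check=True)
```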
__________________
www.afterglow.co.nz |