Bernard Rosenzweig
November 8th, 2009, 01:44 PM
Hi, I understand that AVCHD with Video Quality set to SH, H, or L will be recorded at 1280x720, and the only difference will be the bit rate:
- SH = 17 Mbps
- H = 13 Mbps
- L = 9 Mbps
Do you know what the "main" difference among those bit rates is? If I set it to L (9 Mbps), will there be a huge difference compared with H or SH? Any previous experience? I would like to better understand what I am losing by moving down from SH --> H --> L.
Thanks
Barry Green
November 9th, 2009, 09:35 AM
Quality versus recording time. Lower bitrates = longer recording time and smaller file sizes, at the expense of video quality.
My stock advice is to always use the highest-bandwidth mode you possibly can, given your recording-time limitations. If you've got enough cards, glue the button on SH and never take it off. There is no advantage to using the lesser modes except for smaller file sizes.
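To put the trade-off in concrete terms, here is a rough back-of-envelope sketch of record time per card at the three bit rates quoted above. The 16GB card size is the one mentioned later in this thread; the figures ignore audio and container overhead, so the camera's own estimates will come out a bit shorter.

```python
# Rough record-time estimate for the SH/H/L video bit rates quoted above.
# Ignores audio and container overhead and assumes the card's full nominal
# (decimal) capacity is usable, so treat the results as ballpark figures.

CARD_GB = 16                          # nominal SD card capacity in GB
MODES = {"SH": 17, "H": 13, "L": 9}   # video bit rates in Mbps

card_megabits = CARD_GB * 1000 * 8    # 16 GB -> 128,000 megabits

for mode, mbps in MODES.items():
    minutes = card_megabits / mbps / 60
    print(f"{mode} ({mbps} Mbps): roughly {minutes:.0f} min on a {CARD_GB} GB card")
```

That works out to roughly 125 minutes at SH, 164 at H, and 237 at L per 16GB card, so even the highest-quality mode gives you around two hours a card.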
Steev Dinkins
November 10th, 2009, 12:16 PM
Yes, I would agree with Barry about using the highest-quality bit rate you can, SH or FHD. From my research into others' experiences, and testing of my own, the encoding can fall apart even at the best bit rate settings. FHD is the most fragile, and SH is the sturdiest. People were reporting SH as foolproof, and I can definitely say that it's not: I've seen the codec blow apart even in SH, though only in about 1% of all the footage I've shot. So, that said, I won't ever use any of the lower bit rates for fear of image quality degradation.
Lastly, I have no issues with shooting at the highest quality; since the data rate is so low to begin with, the record times are outstanding on 16GB cards.
My first experience with recording to solid-state memory was with the HVX200 on 8GB cards at 720p24N, and I was thrilled to get 48 minutes on two of those "big" cards (16GB total). With the GH1, I'm blown away to get 2+ hours on a single tiny 16GB SD card.
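For anyone curious why the same 16GB goes so much further on the GH1, a quick sketch of the arithmetic is below. The DVCPRO HD figures are my assumptions, not something stated in this thread: the codec runs at roughly 100 Mbps, and 720p24N writes only the 24 native frames out of the 60p stream, so the effective rate is around 40 Mbps.

```python
# Sanity check on the HVX200 vs. GH1 record-time comparison above.
# Assumptions (not from this thread): DVCPRO HD is a ~100 Mbps codec, and
# 720p24N stores only 24 of the 60 frames, giving roughly 100 * 24/60 Mbps.

def record_minutes(card_gb, mbps):
    """Approximate record time in minutes; ignores audio and overhead."""
    return card_gb * 1000 * 8 / mbps / 60

hvx_mbps = 100 * 24 / 60              # ~40 Mbps effective in 720p24N
print(f"HVX200, 2 x 8 GB P2:  ~{record_minutes(16, hvx_mbps):.0f} min")
print(f"GH1 SH,  1 x 16 GB SD: ~{record_minutes(16, 17):.0f} min")
```

That gives roughly 53 minutes for the HVX200 setup (close to the 48 minutes quoted once formatted P2 capacity and overhead are accounted for) versus roughly 125 minutes for the GH1 at SH, which lines up with the 2+ hours reported above.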