View Full Version : 10-bit file conversion?


Paul Firth
August 15th, 2008, 04:22 PM
I have Prospect HD and have been converting some HDV and AVCHD files to cineform intermediate. I can't seem to find a setting to convert these to 10-bit files. How do I accomplish this? I realize that they are only 8-bit source material but I have quite a bit of color correcting to do and would like to take advantage of the 10-bit files. I am using CS3 for editing.

Any help?

David Newman
August 15th, 2008, 05:29 PM
Internal precision is bumped to 10-bit, and in Prospect HD you can manipulate the resulting images using 32-bit float.

Paul Firth
August 16th, 2008, 03:52 PM
Thanks. So I guess I use the 8-bit file, put it in a 16- or 32-bit precision timeline, then I can render it to 10-bit if I want.

David Newman
August 17th, 2008, 10:53 AM
It bumped to 10-bit upon conversion, so you don't really have an 8-bit file any more.
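The promotion David describes can be sketched roughly as follows. This is a hypothetical illustration, not CineForm's actual code: it shows why mapping 8-bit samples onto the 10-bit range and then grading in float precision avoids the rounding (banding) you would get by correcting color directly in 8-bit.

```python
# Hypothetical sketch (not CineForm's actual pipeline): promote 8-bit
# sample values to 10-bit, then color-correct in float precision.

def promote_to_10bit(value_8bit: int) -> int:
    """Map an 8-bit code value (0-255) onto the 10-bit range (0-1023)."""
    return value_8bit << 2  # same as multiplying by 4

def color_correct(value_10bit: int, gamma: float = 0.9) -> int:
    """Apply a mild gamma lift in float, quantizing only at the end."""
    normalized = value_10bit / 1023.0   # work in float, like 32-bit float mode
    corrected = normalized ** gamma     # fractional precision is preserved
    return round(corrected * 1023.0)    # quantize once, at the very end

samples = [16, 128, 235]                              # typical 8-bit levels
promoted = [promote_to_10bit(v) for v in samples]     # [64, 512, 940]
graded = [color_correct(v) for v in promoted]
```

The point is that each intermediate adjustment keeps fractional values; only the final render rounds back to integer code values, so repeated corrections don't stack up 8-bit rounding error.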

Paul Firth
August 17th, 2008, 03:22 PM
Thanks. It's nice to get answers straight from the source.

Paul Kepen
August 20th, 2008, 05:19 PM
Quoting David Newman: "It bumped to 10-bit upon conversion, so you don't really have an 8-bit file any more."

I use Prospect with PPro 1.5.1 to capture HDV 1080x60i footage. When I render this to an .m2v file in Premiere and import it into DVD Movie Factory, I get outstanding output in HD DVD format on a cheap DVD-R.
My encoding settings: 1080x60i, upper field first, pixel aspect 1.33, and variable bitrate encoding (25M/20M/4M).
When I take this same Prospect-captured file and render an .m2v file with the same encoding settings in Vegas 8 Pro and import it into DVD Movie Factory, the resulting output in HD DVD format on a DVD-R disc is very dark, with lost shadow detail and blown-out highlights. There is also a lot more mosquito noise and pixelation.
Why the difference? Is the encoder in Vegas 8 worse than the old PPro 1.5? (I would not think so.) Is it because PPro will work at 10-bit versus the 8-bit in Vegas? I know I can up it to something like 32-bit encoding (haven't found where that is yet), but since the original HDV is only 8-bit, what would be the advantage of that, other than much lengthier encoding times?
As always, thanks for any helpful insight - PK

David Newman
August 20th, 2008, 05:37 PM
Not related to this thread. Look up Vegas Issues with Studio RGB.
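The Studio RGB issue David points to can be sketched like this. This is a hypothetical illustration of a levels mismatch, not Vegas's actual code: Vegas treats 8-bit video as Studio RGB (black at code 16, white at 235), and if a studio-to-full-range expansion is applied to footage that is already full range, shadows below 16 crush to black and highlights above 235 clip to white, matching the dark output with lost shadow detail described above.

```python
# Hypothetical sketch of a Studio RGB levels mismatch (not Vegas's code):
# expanding studio-range levels (16-235) to full range (0-255) on footage
# that is already full range crushes shadows and clips highlights.

def studio_to_full(value: int) -> int:
    """Expand studio-range code values to full range, clipping at the ends."""
    expanded = (value - 16) * 255 / (235 - 16)
    return max(0, min(255, round(expanded)))

# Applied where it shouldn't be:
shadows = [studio_to_full(v) for v in (0, 8, 16)]     # all crush to 0
highlights = [studio_to_full(v) for v in (235, 245, 255)]  # all clip to 255
midtone = studio_to_full(128)                          # midtones shift too
```

In other words, the symptom is almost always a color-levels interpretation mismatch between the two applications' pipelines, not a difference in encoder quality.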