October 6th, 2006, 02:30 AM | #16 |
Regular Crew
Join Date: Nov 2005
Location: Auckland, New Zealand
Posts: 117
|
I guess I could just use the 800W power inverter from my car that I normally use to power 650W Arri lights. A long run from car to location wouldn't be too bad: PC on a crate trolley, following the camera around with 15m of freedom via the HDMI cable.
Sounds good, there just needs to be a laptop solution - are there any CardBus-to-PCIe or PCI-X adaptors/enclosures? Would FireWire still send a feed while capturing over HDMI? I'm just thinking about DV Rack monitoring, but maybe I should go all HDMI with a splitter. It does sound quite good - I'm nearly forgetting about the entire 1/4" chip DOF thing (why did I bring it up again!).
__________________
Ainslie Davies - www.dualityproductions.com |
October 6th, 2006, 07:15 AM | #17 | |
Join Date: Jan 2004
Location: Stockton, UT
Posts: 5,648
|
Quote:
HDMI has the *ability* in the spec to do 10-bit 1920x1080, but I'm unaware of any manufacturer actually meeting that ability. Kinda like HD-SDI: the pipe is there, but not enough water to fill it.
__________________
Douglas Spotted Eagle/Spot Author, producer, composer Certified Sony Vegas Trainer http://www.vasst.com |
|
October 6th, 2006, 07:26 AM | #18 |
Major Player
Join Date: Dec 2005
Location: Melbourne, AUSTRALIA
Posts: 735
|
I don't understand pulldown.
I don't understand 4:2:2 or whatever. I don't understand 10-bit. And I don't know what HDMI is. Am I right in assuming that the reason to capture via HDMI is to preserve higher quality than capturing via FireWire? I.e. HDMI = HD and FireWire = HDV, in a very simplistic sort of way? So I need an HDMI port on my PC, just like I have a FireWire port? And does it need to be an HD one? I am so confused... I see them advertised from a few hundred dollars right up to a few thousand dollars, but I don't know what the difference is. Can someone shed some light on this for me? If HDMI output is such a bonus for the V1, I'd like to know why.
Especially since I also don't understand how I can edit the footage if it is 1920x1080 HD instead of 1440x1080 HDV. Premiere won't understand that, will it? On the net I saw that the Blackmagic Intensity PCI Express HDMI card will be able to work with Premiere Pro and supports 1080p24, but I don't see how. And you're saying that you need a super computer to be able to capture it. How super a computer do you need? My AMD 4400 dual core with 2GB of RAM and an nVidia 7800GT can do just about anything. Is it in the same league as what you're talking about? |
October 6th, 2006, 08:12 AM | #19 | |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
Quote:
A lot of people have complained about banding with HDMI 1.2. HDMI 1.3 now includes 30-bit, 36-bit and 48-bit color, up from the 24-bit color in earlier specs. That is info taken from the HDMI website, in my own words. I think the PS3 is going to be one of the first devices to use HDMI 1.3, according to the website.
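To put numbers on those totals: the 24/30/36/48-bit figures are per pixel across three color channels. Here's a quick Python sketch of the per-channel depth each mode works out to, assuming three components per pixel (RGB or 4:4:4 YCbCr - subsampled modes pack differently):
[code]
# Relate HDMI "deep color" totals (bits per pixel) to bits per channel
# and shades per channel. Assumes three components per pixel.
for bits_per_pixel in (24, 30, 36, 48):
    bits_per_channel = bits_per_pixel // 3
    shades = 2 ** bits_per_channel
    print(f"{bits_per_pixel}-bit color = {bits_per_channel} bits/channel = {shades:,} shades/channel")
[/code]
So 24-bit is the familiar 8 bits per channel, and 30-bit is the 10 bits per channel being discussed for the V1's HDMI output.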
|
October 6th, 2006, 04:06 PM | #20 | |
Regular Crew
Join Date: Nov 2005
Location: Auckland, New Zealand
Posts: 117
|
Quote:
Premiere can edit 1920x1080 just fine. I'm using 1.5.1 and have been playing around with exporting my 1440x1080 HDV to 1920x1080 uncompressed out of Premiere, de-interlacing with HiCon32 and bringing them back into a timeline at "full HD".
I don't know if you're aware - excuse me if you are - that with the camera connected via HDMI, you can only get the benefits talked about through 'live' streaming capture, not by bringing the footage off the tape and into your NLE via FireWire after the fact. I don't think editing will be too big a pain, as Blackmagic - referring to the Intensity - has their own codec that should lower the overhead, much like CineForm's does.
__________________
Ainslie Davies - www.dualityproductions.com |
|
October 6th, 2006, 04:28 PM | #21 |
Trustee
Join Date: Nov 2005
Location: Honolulu, HI
Posts: 1,961
|
"So I need an HDMI port on my PC?"
No. Put down the mug and switch to decaf! :) HDMI output on the V1 is an extra capability, not a replacement for HDV. Your computer will be fine. In fact, in some ways HDMI-captured video may even be easier to edit than HDV.
Think of it as a plumbing analogy: HDV and HDMI are like molasses and water. HDV is viscous and sticky, so it needs a strong pump (the CPU) to move it around, but it is compressed down into a small quantity, so the pipes needed are small. HDMI is less compressed, so it flows more easily, but there is therefore more of it, so big pipes (the hard drive subsystem) are needed. Current CPUs are powerful enough to pump HDV, and hard drives are large and fast enough to handle the greater flow of HDMI.
When people talk about 4:2:2 and the like, they are talking about colorspace - the amount of color data included in the stream relative to the black-and-white (luminance) information. The bigger the second and third numbers, the greater the level of color detail. The reason more color data is better is that compositing can be more accurate. More color information may also allow a bit more latitude when doing color correction, and it could give a perceptible increase in apparent resolution, although the luminance information is more responsible for that quality of the image.
8-bit and 10-bit color is the level of precision of the color information in each pixel. More is better for image quality, but HDV is still good even though it carries less than the signal from HDMI. HDMI output can only be a good thing for future V1 owners. |
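To make the plumbing analogy concrete, here's a rough back-of-the-envelope Python sketch comparing the two "flows". The figures are assumptions for illustration only: HDV's tape stream at 25 Mbit/s, and uncompressed 8-bit 4:2:2 at 1920x1080 and roughly 30 frames per second (about 2 bytes per pixel):
[code]
# Rough "pipe size" comparison - assumed figures, not measurements.
width, height, fps = 1920, 1080, 30                    # ~1080-line video at ~30 frames/sec
hdv_mb_per_sec = 25 / 8                                # HDV tape stream: 25 Mbit/s ~= 3.1 MB/s
uncompressed_bytes_per_sec = width * height * 2 * fps  # 8-bit 4:2:2 ~= 2 bytes/pixel
uncompressed_mb_per_sec = uncompressed_bytes_per_sec / 1e6
print(f"HDV (compressed):         ~{hdv_mb_per_sec:.1f} MB/s - small pipe, strong pump (CPU)")
print(f"Uncompressed 8-bit 4:2:2: ~{uncompressed_mb_per_sec:.0f} MB/s - big pipe (fast drives)")
[/code]
Roughly 3 MB/s versus roughly 125 MB/s is why HDV taxes the CPU while HDMI capture taxes the drive subsystem.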
October 6th, 2006, 07:40 PM | #22 | |
Major Player
Join Date: Dec 2005
Location: Melbourne, AUSTRALIA
Posts: 735
|
Quote:
In many ways, that's my interest in all of this. The V1 + HDMI out + Blackmagic Intensity = a great chance at a high-quality film-out, yes? Certainly better than my Z1 + HiCon32, right? Or should I just look at another camera altogether if that's my goal? I understand that I'll need to be streaming to capture footage via HDMI; that's not a problem. I guess I was just worried because everyone was saying you need a super computer to do it.
Thanks for that explanation, Marcus. I feel a little better now. I (try to) do a lot of keying, and from what I understand, HDMI will solve lots of problems. How come? Is it because it's progressive? Or because of the 1920 resolution? Or am I misunderstanding? |
|
October 6th, 2006, 08:03 PM | #23 |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
8-bit color means each color channel has 256 shades. 10-bit means each color channel can have 1024 shades.
256 shades stretched out over 1920x1080 pixels can cause banding. With that in mind, however, HDV, DVD, DV, DVCPROHD, HDCAM, DVCPRO50, Digital-S, HD DVD and Blu-ray are all 8-bit formats, so 8 bits is not all that bad. The only HD shooting format that actually records 10 bits right now is HDCAM SR. Almost everything you will ever watch in HD is going to be 8-bit for a very long time yet, so do not look at 8-bit color as a bad thing. 8-bit uncompressed video will still knock the socks off HDV or DVCPROHD.
HDMI 1.3 with 10-bit color makes sense for the PS3, since games could be rendered using 10-bit graphics, and for high-end HD production, where studios would want to view their 10-bit source material on a 10-bit monitor. |
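A quick way to see why 256 shades can band on a 1920-wide frame: if a full-range gradient runs across the whole width, each shade has to cover a visible strip of pixels. A small Python sketch of that arithmetic:
[code]
# Width of each "band" when a full-range horizontal gradient spans
# a 1920-pixel-wide frame.
width = 1920
for bits in (8, 10):
    shades = 2 ** bits              # 256 at 8-bit, 1024 at 10-bit
    band_px = width / shades        # pixels per shade step
    print(f"{bits}-bit: {shades} shades, each band about {band_px:.1f} px wide")
[/code]
At 8 bits each step is about 7.5 pixels wide, which the eye can pick up in smooth gradients and skies; at 10 bits it drops to under 2 pixels.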
October 6th, 2006, 08:51 PM | #24 | |
HDV Cinema
Join Date: Feb 2003
Location: Las Vegas
Posts: 4,007
|
Quote:
And yes, it works during shooting and during tape playback. But I've not been able to get audio.
__________________
Switcher's Quick Guide to the Avid Media Composer >>> http://home.mindspring.com/~d-v-c |
|
October 7th, 2006, 01:21 AM | #25 |
Major Player
Join Date: Jul 2003
Location: Warren, NJ
Posts: 398
|
In the HDMI spec (v1.0, 2003) I've seen - http://www.hdmi.org/pdf/HDMISpecInformationalVersion.pdf - section 6.6 (pp. 23-24) seems to indicate support for 8-, 10- and 12-bit quantization. I thought Sony uses 14-bit A/D conversion, so there should be more than enough resolution for 10-bit.
The Sony diagram does show 1440x1080 going to HDV and, somewhat ambiguously, to component. But why would they down-res to HDMI? It seems to defeat the resolution advantage of HDMI. And since HDMI doesn't support 1440x1080i (sections 6.2 and 6.3, pp. 18-20), they would have to up-res after down-resing. The Blackmagic Design info on their HDMI card talks about getting 1920x1080 from an HC3. Does the HC3 output 1920x1080 and the V1 only 1440x1080? I'm a bit confused. What document says the HDMI is down-res'd? |
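For what it's worth, the arithmetic behind the 1440-vs-1920 question: HDV records 1440x1080 with non-square pixels that display as a 16:9 picture, while HDMI carries square-pixel rasters, so any HDMI output implies a horizontal scale somewhere. A quick sanity check in Python, assuming HDV's standard 4:3 pixel aspect ratio:
[code]
# HDV's 1440x1080 anamorphic raster vs. the square-pixel raster HDMI carries.
hdv_width, hdv_height = 1440, 1080
pixel_aspect_ratio = 4 / 3                                     # HDV's anamorphic squeeze (assumed)
display_width = round(hdv_width * pixel_aspect_ratio)
print(f"Displayed raster: {display_width}x{hdv_height}")       # 1920x1080
print(f"Display aspect:   {display_width / hdv_height:.3f}")   # 1.778 = 16:9
[/code]
Whether the V1's HDMI output starts from a full 1920-sample raster or from an upscaled 1440 one is exactly the open question being asked here.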
October 7th, 2006, 01:47 AM | #26 | |
Major Player
Join Date: Jul 2003
Location: Warren, NJ
Posts: 398
|
Quote:
|
|
October 7th, 2006, 02:02 AM | #27 | |
Major Player
Join Date: Jul 2003
Location: Warren, NJ
Posts: 398
|
Quote:
|
|
October 7th, 2006, 06:16 AM | #28 | |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
Quote:
|
|
October 7th, 2006, 06:20 AM | #29 | |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
Quote:
Quantization is different from color sampling, so maybe that is where some of the confusion is. DVD encoding uses a quantization of 9 or 10 bits, but the color is still 8-bit. As for that chart, are you sure that isn't the chart for the 1.3 spec? |
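Since the two terms keep getting tangled in this thread, here's a small Python sketch separating them: chroma subsampling (4:4:4 / 4:2:2 / 4:2:0) sets how many chroma samples a frame has, while bit depth sets how precise each sample is - they are independent knobs. The figures are just the raw per-frame arithmetic for an uncompressed 1920x1080 frame:
[code]
# Chroma subsampling = how MANY chroma samples; bit depth = how PRECISE each sample is.
width, height = 1920, 1080
luma_samples = width * height
chroma_ratio = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}   # chroma samples per luma sample, per plane

for name, ratio in chroma_ratio.items():
    chroma_samples = 2 * luma_samples * ratio                # two chroma planes (Cb + Cr)
    for bits in (8, 10):
        mbytes = (luma_samples + chroma_samples) * bits / 8 / 1e6
        print(f"{name} at {bits}-bit: {mbytes:.1f} MB per frame (uncompressed)")
[/code]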
|
October 7th, 2006, 08:26 AM | #30 | |
Regular Crew
Join Date: Sep 2006
Location: Portland OR
Posts: 32
|
Quote:
What about later bringing in a few short pieces of DV from a DVCAM deck - what is likely to be the best approach for getting the DV footage into the same HD format as what I've described? |
|