July 27th, 2006, 08:15 AM | #151 |
Major Player
Join Date: Nov 2001
Location: Illinois
Posts: 888
|
One of the big differences is the HD-SDI output. If you're only planning on using one camera, how important is having that? Also, for shooting events, docs, or maybe an indie, can you live without SMPTE timecode?
|
July 27th, 2006, 09:30 AM | #152 | |
Inner Circle
Join Date: May 2003
Location: Australia
Posts: 2,762
|
Quote:
I agree about the making-money point; the same could be argued for the Red and the Viper. But I would value 35 Mbps in a prosumer camera, and cheaper prices for any base consumer camera that isn't more than 25 Mbps. Cheap MPEG-4 compression is not as far out as people think; it was used on an 8 MP motion camera years ago. H.264 editing is coming shortly; I have talked to a company insider. General computer power is catching up with older, more advanced design concepts that have had this kind of power in actual chips for a number of years. What we see today is nothing compared to what we will see tomorrow. Some of this is just company politics (such as not being the preferred in-house solution), and some of it is a technology speed problem. Companies were caught up in processing power doubling every 18 months; when that rate of increase slowed it caught them off guard (which is why we are not looking at 20 GHz processors any time soon), and they are now trying to implement alternative parallel designs to catch up. It has recently been revealed that Intel is planning a 32-core chip for 2010. If such a chip ran at 1 GHz per core, the power consumption might be around 3.2 W for 32 billion instructions per second (very rough estimates). For 32 W, or 100 W, we might expect something like 96 BIPS or more. That is 2010, but you should ask: when will we get 8 cores? A lot of these cores are moderate in performance compared to the dedicated programmable parallel designs planned for GPUs, Cell and the like. Even the Ambarella H.264 camera chip contains enough power to tackle editing. We can conjecture that everything is impossible, but the figures and designs do stack up, and I think the camera companies planned for this level of processing being available to NLEs for their cameras before they committed. This year it might cost $2000-$4000 in computer hardware; next year maybe $1000-$2000. |
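To make those rough numbers concrete, here is a minimal back-of-envelope sketch, assuming one instruction per cycle per core and a flat 0.1 W per core; both figures are placeholder assumptions for illustration, not Intel specifications.
Code:
# Back-of-envelope throughput and power estimates for a hypothetical many-core chip.
# Assumptions (not vendor figures): 1 instruction per cycle per core, ~0.1 W per core.

def estimate_bips(cores, ghz_per_core, ipc=1.0):
    """Billions of instructions per second, assuming perfectly linear scaling."""
    return cores * ghz_per_core * ipc

def estimate_watts(cores, watts_per_core=0.1):
    """Total power, assuming a flat per-core budget."""
    return cores * watts_per_core

for cores in (8, 32):
    print(f"{cores} cores @ 1 GHz: ~{estimate_bips(cores, 1.0):.0f} BIPS "
          f"at ~{estimate_watts(cores):.1f} W")
# 32 cores @ 1 GHz comes out to ~32 BIPS at ~3.2 W, matching the rough figures above.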
|
July 27th, 2006, 09:51 AM | #153 |
Obstreperous Rex
|
Discussion moved from Industry News to Canon XH forum.
Congrats Greg -- your initial post drew 150 replies and more than 4,000 views in the space of about 24 hours. That's nothing on some other sites, but around here that's actually quite a bit. |
July 27th, 2006, 10:05 AM | #154 | |
Inner Circle
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
|
Quote:
And no matter what happens with hardware, it will be easier to edit MPEG-2 than MPEG-4, so we'll get better performance for the former on any given computer setup. HDV may not be perfect, but it's a useful compromise given currently available technology; if it just had a slightly higher bit rate we might not be talking much about other alternatives. So maybe AVC cameras will prove to be more useful in the long run, but we're going to need some kick-a$$ computers to process the footage. |
|
July 27th, 2006, 10:44 AM | #155 | ||
Barry Wan Kenobi
Join Date: Jul 2003
Location: North Carolina
Posts: 3,863
|
Quote:
Quote:
Remains to be seen, but with the rapid adoption of AVC in so many other areas (digital broadcasting, satellite broadcasting, European HDTV broadcasting, IPTV, Blu-ray, HD DVD, etc.) I think the guesses about how difficult AVCHD will be to work with are probably not accurate. |
||
July 27th, 2006, 11:40 AM | #156 | |
Wrangler
|
Quote:
I'm still fond of reminding people that in the early 1980s they claimed we could never break 1200 baud on a standard copper phone line. Hmmm... and here I sit typing on a 5 Mbps DSL line using copper wire. My only complaint about technology is that there is always someone in the human race who finds a way to abuse it or do great harm to others. I know we will get realtime H.264 in due time, just as we started encoding MPEG-2 in software and then graphics boards and sound cards started popping up with dedicated MPEG-2 hardware encoding/decoding. -gb- |
|
July 27th, 2006, 12:04 PM | #157 |
MPS Digital Studios
Join Date: Apr 2003
Location: Palm Beach County, Florida
Posts: 8,531
|
I do know that to edit native m2t you need a lot of computing power, so who knows whether it'll be easier, harder, or the same with MPEG-4. For now, using digital intermediates (DIs) on a PC really works, as does the Apple Intermediate Codec on Apple NLEs (or native MPEG-2 transport streams in Final Cut Pro 5).
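As an aside, here is a minimal sketch of that kind of intermediate transcode, scripted from Python; the use of ffmpeg, the ProRes intermediate, and the file names are assumptions for illustration only, not the Cineform or Apple Intermediate Codec workflows mentioned above.
Code:
# Illustrative sketch: transcode an HDV .m2t capture to an intra-frame
# intermediate codec before editing. ffmpeg, ProRes HQ and the file names are
# assumptions for this example, not the exact workflow described in the post.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "capture.m2t",      # HDV MPEG-2 transport stream from the camera
    "-c:v", "prores_ks",      # intra-frame intermediate codec (ProRes)
    "-profile:v", "3",        # HQ profile
    "-c:a", "pcm_s16le",      # uncompressed PCM audio
    "intermediate.mov",
], check=True)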
heath
__________________
My Final Cut Pro X blog |
July 27th, 2006, 12:31 PM | #158 | |
Inner Circle
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
|
Quote:
The important question is whether there will be a sub-$5K AVC camera which would make something like the new Canons seem overpriced. So far Panasonic is the only company proposing something plausible in that regard, and that likely won't ship until late next year at some unspecified price. So the Canon prices are perfectly reasonable today given the other available alternatives, and will presumably drop by next year when other cameras become available. |
|
July 27th, 2006, 11:46 PM | #159 |
Tourist
Join Date: Jul 2006
Location: Bellevue, MI
Posts: 2
|
Canon's English Press Release
Wow, things are really abuzz here in Canon land...
Here is a link to the Canon USA website press release for these two exciting new entries to the HD 3-chip world... Go Canon! http://opd.usa.canon.com/templatedat...xha1_xhg1.html |
July 28th, 2006, 01:04 AM | #160 | ||
Inner Circle
Join Date: May 2003
Location: Australia
Posts: 2,762
|
I didn't really want to start a controversy; I was trying to post a final word before we got onto the subject, but then a few myths cropped up that I thought I'd address, and maybe I contributed a few more.
Quote:
Of course, it depends on what you are doing with it, how advanced that work is, and what level of computer configuration you are using (assume I am talking about minimums). And using an intermediate like Cineform is a way of editing on a 2 GHz machine, I understand. So my pricing takes those things into consideration (and H.264 is far more intensive than MPEG-2). I was talking about 32 cores per chip, or when we will get 8 cores per chip. More cores will help a lot, sometimes linearly, mostly nearly linearly, with software written for them that way. I understand multi-core support is already in use in some NLEs, as is GPU acceleration; you just have to use those products. The other thing is that so much PC software is written so inefficiently that it is hard to judge what is possible from an application unless it is written well. Quote:
The interesting thing, from my reading, is that various wavelet schemes can produce results not quite at the level of H.264 with, by the looks of it, a lot less processing power. We make the mistake of judging their performance against hardware-assisted decoding, which in many cases wavelets do not enjoy. But H.264 benefits from the long line of development running from JPEG through MPEG-4, whereas wavelets have probably yet to hit their stride. I am interested in how techniques like 3D wavelets might help; I recently read that they perform a transform over 3D space, with the third dimension being time/frames, to take motion into account. I spotted an H.264 encoder on a diagram of one of the graphics chips months ago. So yes, it's possible; editing H.264 is probably not going to be as far-fetched as people think. Another thing that has come up in discussion is the H.264 pro codec Panasonic is to release next year (and I think the camera release might be at NAB rather than later). It is frame-based intra coding; without the intermediate frames it will be a lot less processor-intensive than inter coding, or than people expect. |
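For anyone curious why wavelets can be so light on processing, here is a minimal sketch of one level of a 1-D Haar wavelet transform: just pairwise averages and differences, with no motion search or reference frames. A "3-D" wavelet applies the same step along width, height and time. This is an illustrative example only, not the scheme used by any particular camera or codec.
Code:
# One level of the 1-D Haar wavelet transform: pairwise averages ("low band")
# and differences ("high band"). The cost is a handful of adds per sample, which
# is why wavelet codecs can be light on processing compared with
# motion-compensated inter-frame coding. Purely illustrative.

def haar_step(samples):
    assert len(samples) % 2 == 0, "expects an even number of samples"
    low = [(samples[i] + samples[i + 1]) / 2 for i in range(0, len(samples), 2)]
    high = [(samples[i] - samples[i + 1]) / 2 for i in range(0, len(samples), 2)]
    return low, high

pixels = [10, 12, 80, 84, 50, 50, 20, 24]
low, high = haar_step(pixels)
print(low)   # [11.0, 82.0, 50.0, 22.0]  -> coarse signal
print(high)  # [-1.0, -2.0, 0.0, -2.0]   -> small details, easy to compress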
||
July 28th, 2006, 01:06 AM | #161 | |
Inner Circle
Join Date: May 2003
Location: Australia
Posts: 2,762
|
Quote:
|
|
July 28th, 2006, 11:13 AM | #162 |
Major Player
Join Date: Jan 2005
Location: Toronto
Posts: 532
|
cross-posted -- see http://www.dvinfo.net/conf/showthread.php?t=72491
|
July 28th, 2006, 01:26 PM | #163 | |||
Inner Circle
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
|
Quote:
Quote:
Quote:
|
|||
July 29th, 2006, 12:40 AM | #164 |
Inner Circle
Join Date: May 2003
Location: Australia
Posts: 2,762
|
That other thread shows little understanding of costing, pricing, and single-chip cameras (which even movie camera companies use in preference to 3-chip). The JVC HD10 and the Sony A1 are both credible cameras, as is the HC1, since it differs little (really serious sound recording uses a separate sound recorder anyway). So I don't particularly want to contribute to it.
Kevin: Yes, you get heaps of processing power for $4K, leaving out all the unnecessary bits and pieces that don't relate to the NLE job (and I think I forgot to factor in software costs, so there is another $1K or so). So that was my illustration there. Software might currently be written for single and dual core, but it is a progression towards a software structure that scales to more cores. The companies that have already written for dual core are in a good position, experience-wise, to rewrite for more cores as they become available; the process should not be too difficult once you have mastered it. GPUs are an entirely different kettle of fish, but once the work is done for DirectX 10, it should become a much more scalable experience for them. I don't say "once written for DirectX 9", because that is substantially different from DirectX 10/11, and substantially lower in performance, so 10/11 is still a new learning experience, so to speak.
The rumours over recent months surrounding the now-announced AMD takeover of ATI are interesting. I have not been posting them here because they were rumours, but it goes like this: AMD (or was that Intel?) was planning on tapping into GPU-like processing as a future processing avenue. Intel has been rumoured to be readying a counter to the takeover bid for ATI (who, as noted in another post, just had their Intel bus licence not renewed). The interesting thing is that ATI and Nvidia are going separate ways on GPU processing at the moment. ATI has the more advanced integrated design, while Nvidia says that is too far ahead of the market and is taking a more conservative DX10 route for now (sort of the opposite of what happened a few years back). ATI's more advanced structure requires more power and space than Nvidia's. This could result in AMD having fewer but bigger processing units on chip, and Intel perhaps countering with more but smaller units; I wonder whether big-but-few will be able to match small-but-many. Intel has sold off most (if not all, I am unsure) of its XScale capacity, to fill the gap with an embedded form of the PC processor, which I imagine will be smaller and might even be related to the technology to be used in the 32-core chip. Intel has tried this so many times in the past, and got its butt kicked ten or so years ago by ARM chips when it tried it with the 386/486. ARM could fit many times more processing power in the same space by using an array of ARMs (though processing-array elements, like those used in Cell and GPUs, might matter more in today's applications), and even more again using alternative technologies, like the ones I was involved with.
I am not completely negative on the H.264 editing thing (especially as I read Sony plans an NLE for the PS3, which is also built to play AVCHD natively, and they plan derivative products of the PS3 platform ;) ). Most likely Jobs was wooed by the 32-core chip platform over the Cell; with a bit of ingenuity they should be able to add data processing units to match what Cell will do by then. I still would have preferred an Apple based on Cell.
If IBM gets a 100GB SD card out, P2 pricing might become a bit irrelevant, and who knows, maybe Panasonic was planning on taking advantage of this all along. So 100 Mb/s might come to seem like very little, especially when you consider the 1TB drives, 200GB discs and 3.2TB holographic discs coming for storage in future years (but starting soon).
Last edited by Wayne Morellini; July 29th, 2006 at 11:49 PM. |
July 29th, 2006, 06:40 AM | #165 | |
Major Player
Join Date: Jan 2005
Location: Toronto
Posts: 532
|
Quote:
Like I said, the A1U is a good camera, but you're on some sort of super drug if you think Sony or any of the other companies believe it to be in the same product category as the other cameras listed. Tell Sony that both the A1 and the Z1 were created to serve the same purpose and they'd laugh. No one is bashing the A1, but come on, man, it makes no sense to compare it to the others, and even once that comparison is made the XH A1 STILL stomps all over it. The A1U was created to ride in the sidecar of the Z1U, not to be the hero camera. And enough about the HD10 and the HC1; those cameras are no longer in production, they no longer exist. Mentioning something that is no longer for sale except as overstock has no logical foundation to it. If you're going to do that, bring up the original Sony PD100 and every other camera to ever go to camcorder heaven. There is nothing to debate or consider: at American pricing the XH A1 offers incredible value compared to the competition for what you get. That said, if you want something else, go ahead; that's your right. If you already own something else, that's cool too; your old camera doesn't stop working because there is a new one. But let's compare carrots to carrots instead of carrots to baby carrots. |
|