View Full Version : Canon unveils the XHG1 and XHA1

Bob Zimmerman
July 27th, 2006, 08:15 AM

One of the big differences is the HD-SDI output. If you're only planning on using one camera, how important is having that? Also, for shooting events, docs or maybe an indie, can you live without SMPTE?

Wayne Morellini
July 27th, 2006, 09:30 AM

To this date the XH-A1 is the cheapest prosumer HDV camera, with the Z1 and the JVC HD-100 next in line around $5,000.00. The Sony A1 and the HC1 were not too far behind. I agree about the making-money thing; it could also be argued for Red and Viper. But I would value 35 Mbps for a prosumer camera, and cheaper prices for any base consumer camera that isn't more than 25 Mbps.

Cheap MPEG4 compression is not as far out as people think, and was used on an 8 MP motion camera years before. H264 editing is here shortly; I have talked to a company insider. General computer power is catching up with older advanced design concepts which have had the power for a number of years in actual chips. What we see today is nothing compared to what we will see tomorrow. Some of this is just company politics (like not being the preferred internal solution at a company), some a technology speed problem. The problem was that companies were counting on processing power doubling every 18 months (which is why we are not looking at 20 GHz processors shortly). When this rate of increase slowed, it caught them off guard, and they are now trying to implement alternative parallel designs to catch up.

It has been recently revealed that Intel is planning a 32-core chip for 2010. If such a chip ran at 1 GHz per core, the combined power consumption might be 3.2 W for 32 billion instructions per second (very rough estimates). For 32 W, or 100 W, we might expect something like 96 BIPS+. That is 2010, but you should ask: when will we get 8 cores? A lot of these cores are moderate in performance compared to dedicated programmable parallel systems planned for the GPU/Cell/etc.
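As a quick sanity check on those admittedly very rough numbers, the figures roughly hang together if you assume one instruction per clock per core and dynamic power growing with about the cube of clock speed (power ~ f x V^2 with voltage scaled alongside frequency). All values below are the post's own guesses, not specs for any real chip:

```python
# Illustrative sketch of the post's rough 32-core estimates.
# Assumptions (not chip specs): 1 instruction/clock/core, and
# power per core scaling roughly with frequency cubed.

def chip_estimate(cores, ghz, watts_per_core_at_1ghz=0.1):
    bips = cores * ghz                                # billion instructions/sec
    watts = cores * watts_per_core_at_1ghz * ghz ** 3  # cube-law power scaling
    return bips, watts

print(chip_estimate(32, 1.0))  # ~32 BIPS at ~3.2 W
print(chip_estimate(32, 3.0))  # ~96 BIPS at ~86 W, between the 32 W and 100 W guesses
```

Under those assumptions, tripling per-core throughput to reach 96 BIPS costs roughly 27x the power, which is why the post's jump from 3.2 W to "32 W, or 100 W" for only 3x the performance is plausible.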
Even the Ambarella H264 camera chip contains enough power to tackle editing. We can conjecture that everything is impossible, but the figures and designs do stack up, and I think that companies planned for this level of processing being available for NLEs to use for their cameras before they committed. This year it might cost $2000-$4000 in computer hardware, but next year maybe $1000-$2000.

Chris Hurd
July 27th, 2006, 09:51 AM

Discussion moved from Industry News to the Canon XH forum. Congrats Greg -- your initial post drew 150 replies and more than 4,000 views in the space of about 24 hours. That's nothing on some other sites, but around here that's actually quite a bit.

Kevin Shaw
July 27th, 2006, 10:05 AM

We can conjecture that everything is impossible, but the figures and designs do stack up, and I think that companies planned for this level of processing being available for NLEs to use for their cameras before they committed. This year it might cost $2000-$4000 in computer hardware, but next year maybe $1000-$2000.

Today's $4K computers can barely handle 2-3 layers of 1080i MPEG2 without making performance compromises, so it's going to be a while before we can handle several layers of MPEG4 effectively. If/when professional MPEG4-based cameras start shipping, the footage will either have to be converted to an intermediary format for editing or someone will have to provide dedicated hardware acceleration to make native editing work. We should have eight-core personal computers by next year, but that may not help much, because performance doesn't scale linearly with more cores and it takes time to rewrite software to take advantage of such systems. And no matter what happens with hardware, it will be easier to edit MPEG2 than MPEG4, so we'll get better performance for the former on any given computer setup.
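The claim that performance doesn't scale linearly with more cores is what Amdahl's law formalizes: if any fraction of the work is inherently serial, adding cores hits diminishing returns quickly. A minimal sketch, using a made-up 10% serial fraction purely for illustration (not a measured figure for any NLE):

```python
# Amdahl's law: with a serial fraction s of the work,
# n cores give a speedup of 1 / (s + (1 - s) / n).
# The 10% serial fraction is a hypothetical illustration.

def amdahl_speedup(serial_fraction, cores):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for n in (2, 4, 8, 32):
    print(n, round(amdahl_speedup(0.10, n), 2))
# With 10% serial work, 8 cores deliver only ~4.7x,
# and even 32 cores top out below 8x -- far from linear.
```

So whether eight cores help much depends entirely on how small the serial fraction of the editing pipeline can be made.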
HDV may not be perfect, but it's a useful compromise given currently available technology; if it just had a slightly higher bit rate we might not be talking much about other alternatives. So maybe AVC cameras will prove to be more useful in the long run, but we're going to need some kick-a$$ computers to process the footage.

Barry Green
July 27th, 2006, 10:44 AM

so it's going to be a while before we can handle several layers of MPEG4 effectively.

Maybe, maybe not. ATI and nVidia are supposed to be incorporating dedicated AVC hardware decoders in their graphics cards, which might make AVC editing practical much sooner than later.

And no matter what happens with hardware it will be easier to edit MPEG2 than MPEG4

Why would you say that? MPEG-2 is a processor pig, and we don't know whether MPEG-4 will necessarily require more horsepower to edit. The compression phase may be more complex, but that doesn't mean the decompression phase will be slower (and in editing, it's the decompression phase that's most interesting). And a dedicated MPEG-4 hardware chip on the graphics card could make AVC editing as responsive as DV editing. It remains to be seen, but with the rapid adoption of AVC in so many other areas (digital broadcasting, satellite broadcasting, European HDTV broadcasting, IPTV, Blu-ray, HD-DVD, etc.) I think guesses about how difficult AVC-HD will be to work with are probably not accurate.

Greg Boston
July 27th, 2006, 11:40 AM

We can conjecture that everything is impossible, but the figures and designs do stack up...

You'll never hear me say a given technology is impossible. Maybe not currently available, but not impossible. I spent 25 years in the semiconductor industry and I have seen pipe dreams become reality far too many times in that quarter century. I'm still fond of reminding people that in the early 1980s they claimed we could never break 1200 baud on a standard copper wire phone line. Hmmm... and here I sit typing on a 5 Mbps DSL line using copper wire.
My only complaint about technology is that there is always someone in the human race who finds a way to abuse it or do great harm to others. I know we will get realtime H.264 in due time, just like we started encoding MPEG2 with software and then graphics boards and sound cards started popping up with dedicated MPEG2 hardware encoding/decoding.

-gb-

Heath McKnight
July 27th, 2006, 12:04 PM

I do know that to edit native m2t you need a lot of computing power, so who knows if it'll be easier, harder or the same with MPEG4. For now, using DIs on a PC really works, or the Apple Intermediate Codec on Apple NLEs (or native MPEG2-TS in Final Cut Pro 5).

heath

Kevin Shaw
July 27th, 2006, 12:31 PM

MPEG-2 is a processor pig, and we don't know whether MPEG-4 will necessarily require more horsepower to edit. The compression phase may be more complex, but that doesn't mean the decompression phase will be slower (and in editing, it's the decompression phase that's most interesting). And a dedicated MPEG-4 hardware chip on the graphics card could make AVC editing as responsive as DV editing.

I'd be very surprised if MPEG4 is less processor intensive for editing, so as you speculated it's more likely we'll see hardware-assisted solutions. Maybe that will happen sooner than expected, but at the very least it's going to take a carefully configured system to edit AVC footage effectively. It will probably be easier to just convert to a digital intermediary, which we can already see signs will be supported. The important question is whether there will be a sub-$5K AVC camera, which would make something like the new Canons seem overpriced. So far Panasonic is the only company proposing something plausible in that regard, and that likely won't ship until late next year at some unspecified price. So the Canon prices are perfectly reasonable today given the other available alternatives, and will presumably drop by next year when other cameras become available.
Mike Boyce
July 27th, 2006, 11:46 PM

Wow, things are really abuzz here in Canon land... Here is a link to the Canon USA website press release for these two exciting new entries in the HD 3-chip world... Go Canon!

http://opd.usa.canon.com/templatedata/pressrelease/20060726_xha1_xhg1.html

Wayne Morellini
July 28th, 2006, 01:04 AM

I didn't really want to start a controversy. I was trying to post a final post before we got onto the subject, but then a few myths cropped up that I thought I'd address, and maybe I contributed a few more.

Today's $4K computers can barely handle 2-3 layers of 1080i MPEG2 without making performance compromises, so it's going to be a while before we can handle several layers of MPEG4 effectively. If/when professional MPEG4-based cameras start....

I'll answer a few more posts here too. Of course, it depends on what you are doing with it and how advanced it is, and what level of computer configuration you are using (assume I am talking of minimums). And using an intermediary like Cineform is a way of editing with 2 GHz, I understand. So my pricing takes those things into consideration (and H264 is far more intensive than MPEG2). I was talking about 32 cores per chip, or when we will get 8 cores per chip. And it will help a lot, sometimes linearly, mostly nearly linearly, with software written for it that way. I understand multi-core software is already in use in some NLEs, as well as GPU acceleration; you just have to use those products. The other thing is that so much PC software is written inefficiently that it is hard to judge what is possible from an application unless it is written well.

And no matter what happens with hardware it will be easier to edit MPEG2 than MPEG4, so we'll get better performance for the former on any given computer setup. HDV may not be perfect but it's a useful compromise given currently available technology; if it just had a slightly higher bit rate we might not be talking much about other alternatives.
Exactly. If they are going to sell prosumer cameras, and sell them as low-end professional cameras, they should have a data rate to match. We don't need 100 Mb/s MPEG2, but 35-50 Mb/s goes a long way, and we would not be as interested in the alternatives. The interesting thing, from reading, is that various wavelet schemes can produce results not quite at the level of H264 with, by the looks of it, a lot less processing power. We make the mistake of judging performance off hardware-assisted decoding, which, in many cases, wavelets do not enjoy. But H264 benefits from long development through JPEG to MPEG4, whereas wavelets are probably yet to hit their stride. I am interested in how techniques like 3D wavelets might help; I recently read that they perform a transform over 3D space, with the third dimension being time/frames, to take in motion.

I spotted an H264 encoder on a diagram of one of the graphics chips months ago. So yeah, possible. Using H264 is probably not going to be as far-fetched as people think. Another thing that has come up in discussion is the H264 pro codec Panasonic is to release next year (and I think the camera release might be at NAB, rather than later). It is frame-based Intra; without the intermediate frames it is going to be a lot less processor intensive than Inter, or than people think.

Wayne Morellini
July 28th, 2006, 01:06 AM

I'm still fond of reminding people that in the early 1980s they claimed we could never break 1200 baud on a standard copper wire phone line. Hmmm... and here I sit typing on a 5 Mbps DSL line using copper wire.

He, he. I just signed up for 20 Mb/s ADSL 2+ by the end of the year, and that is one of the slower ones, over copper. Even with the reduced bandwidth I will get in my location, it is still 160 times faster or more than my current modem, and my service provider is letting the service slip as they try to move us to a slower, more expensive ADSL plan (sure..).
I got sick of waiting for people's sample clips to download over 8-18 hours at a time, and getting charged heaps when I went over my hours.

Evan C. King
July 28th, 2006, 11:13 AM

cross-posted -- see http://www.dvinfo.net/conf/showthread.php?t=72491

Kevin Shaw
July 28th, 2006, 01:26 PM

And using an intermediary like Cineform is a way of editing with 2 GHz, I understand. So my pricing takes those things into consideration (and H264 is far more intensive than MPEG2).

I think we're saying the same thing here, which is that editing MPEG4 directly will be difficult on today's computers, so converting to an intermediary format would be the most practical approach. But today's computers are starting to be powerful enough to edit MPEG2 directly, which may be more desirable in some situations. For example, editing HDV or XDCAM HD footage in its native form will require less hard drive space than AVC footage converted to an intermediary format.

I understand multi-core software is already in use in some NLEs, as well as GPU acceleration; you just have to use those products.

My understanding is that most software is still written to run well on just one processing core, a few applications have been written to run well on two processing cores, and hardly any software has been written to run well on four or more processing cores. It's not a simple matter for software to take advantage of the recent trend toward having so many cores available.

Another thing that has come up in discussion is the H264 pro codec Panasonic is to release next year (and I think the camera release might be at NAB, rather than later). It is frame-based Intra; without the intermediate frames it is going to be a lot less processor intensive than Inter, or than people think.

Agreed that the Panasonic proposal is probably the most interesting aspect of the current AVCHD announcements.
Intra-frame AVC at a bit rate of up to 50 Mbps could be more useful to many of us than either HDV at 19-25 Mbps or DVCProHD at 100 Mbps. If Panasonic can ship a camera like the HVX200 but using standard flash memory and AVC encoding, that would be a nice compromise compared to other current alternatives. Imagine being able to record 20+ minutes of AVCHD on standard 8GB CompactFlash cards costing maybe $100 each by this time next year, compared to around $1400 for the same amount of P2 memory today. That's ~$5/minute for AVCHD compared to $150+/minute for DVCProHD.

Wayne Morellini
July 29th, 2006, 12:40 AM

That other thread shows little understanding of costing, pricing, and one-chip cameras (which even movie camera companies use in preference to 3-chip). So the JVC HD10 and Sony A1 are both credible cameras, as well as the HC1, as it differs little (really serious sound recording uses a separate sound recorder). So I don't particularly want to contribute to it.

Kevin: Yes, you get heaps of processing power for $4K, leaving out all the unnecessary bits and pieces that do not relate to the NLE job (and I forgot to factor in software costs, I think, so there is another $1K or so). So that was my illustration there. Software might be around for single core and dual core, but it is a progression of writing a software structure that scales to more cores. The companies that have written for dual core before are in a good position, experience-wise, to rewrite for more cores as they become available. The process should not be too difficult once you have mastered it. GPUs etc. are an entirely different kettle of fish, though, but once it is mastered for DirectX 10 or 11, it should become a much more scalable experience for them. I don't say "once written for DirectX 9", because that is substantially different from DirectX 10/11, with substantially less performance, so 10/11 is still a new learning experience, so to speak.
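Kevin's flash-media cost comparison above checks out as back-of-envelope arithmetic, assuming intra-frame AVC runs at the quoted 50 Mbit/s ceiling and DVCProHD at 100 Mbit/s (the card prices are his 2006 guesses, not quoted specs):

```python
# Back-of-envelope check of the media-cost comparison.
# Assumptions: 50 Mbit/s intra AVC, 100 Mbit/s DVCProHD,
# decimal gigabytes; prices are the post's own guesses.

def minutes_of_footage(card_gb, mbit_per_s):
    # GB -> gigabits -> seconds -> minutes
    return card_gb * 8 * 1000 / mbit_per_s / 60

def dollars_per_minute(card_price, card_gb, mbit_per_s):
    return card_price / minutes_of_footage(card_gb, mbit_per_s)

print(minutes_of_footage(8, 50))         # ~21 min per 8 GB card ("20+ minutes")
print(dollars_per_minute(100, 8, 50))    # ~$4.7/min, near the quoted ~$5
print(dollars_per_minute(1400, 8, 100))  # ~$131/min, the same ballpark as $150+
```

The AVC figure lands almost exactly on the quoted ~$5/minute; the DVCProHD figure comes out a bit under $150, which the post likely rounds up for formatting overhead.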
The rumors over the months surrounding the now-announced AMD takeover of ATI are interesting. I have not been posting them here because they were rumours, but it goes like this: AMD (or was that Intel) was planning on tapping into GPU-like processing as a future processing avenue. Intel has been rumoured to be readying a counter to the takeover bid for ATI (who just had their Intel bus licence not renewed, per another post). The interesting thing is that ATI and Nvidia are going separate ways on GPU processing at the moment. ATI has the most advanced integrated system, and Nvidia is saying they are too far ahead of the market and is going a more conservative DX10 route for the moment (sort of the opposite of what happened a few years back). The more advanced structure of ATI requires more power and space than Nvidia's. This could result in AMD having fewer but bigger processing units on chip, and Intel countering with smaller but more numerous processing units. I wonder whether big-but-fewer will be able to match small-but-more.

Intel has sold off most (if not all, I am unsure) of its XScale capacity, to fill the gap with an embedded form of the PC processor, which I imagine will be smaller and might even be related to the technology to be used in the 32-processor chip. Intel has tried this so many times in the past, and got their butts kicked ten years or so ago by ARM chips when they tried this with the 386/486. ARM could fit many times more processing power in the same space by using an array of ARMs (but then again, processing array elements, like those used in Cell and GPUs, might matter more in today's applications), and even more again using alternative technologies, like the ones I was involved with.

I am not completely negative on the H264 editing thing (especially as I read that Sony plans an NLE for the PS3, which is also built to play AVCHD natively, and they plan derivative products of the PS3 platform ;) ).
Most likely Jobs was wooed by the 32-processor chip platform versus the Cell; with a bit of ingenuity they should be able to add data processing units to match what Cell will do by then. I still would have preferred an Apple based on Cell. If IBM gets a 100GB SD card out, P2 pricing might become a bit irrelevant, and who knows, maybe Panasonic was planning on taking advantage of this all along. So 100Mb/s might become very little, especially when you consider 1TB drives, 200GB disks and 3.2TB holodisks etc. for storage coming in future years (but starting soon).

Evan C. King
July 29th, 2006, 06:40 AM

That other thread shows little understanding of costing, pricing, and one-chip cameras (which even movie camera companies use in preference to 3-chip). So the JVC HD10 and Sony A1 are both credible cameras, as well as the HC1, as it differs little (really serious sound recording uses a separate sound recorder). So I don't particularly want to contribute to it.

Buddy, "that other thread" lists the prices these things sell for! There is no debating that; there is NOTHING to understand about cost, pricing or whatever, because those are THE prices as sold by one of this site's sponsors, a store that moves many, many cameras. Like I said, the A1U is a good camera, but you're on some sort of super drug if you think Sony or any of the other companies believe it to be in the same product category as the other cameras listed. You tell Sony that both the A1 and the Z1 were created to serve the same purpose and they'd laugh. No one is bashing the A1, but come on man, it makes no sense to compare it to the others, and even once that comparison is made the XH A1 STILL stomps all over it. The A1U was created to ride in the sidecar of the Z1U, not to be the hero camera. And enough about the HD10 and the HC1; those cameras are no longer in production, they no longer exist. Mentioning something that is no longer for sale except for overstock has no logical foundation to it.
If you're gonna do that, bring up the original Sony PD100 and every other camera to ever go to camcorder heaven. There is nothing to debate or consider in American pricing: the XH A1 offers incredible value compared to the competition for what you get. That said, if you want something else, go ahead; that's your right. If you already own something else, that's cool too; your old camera doesn't stop working because there is a new one. But let's compare carrots to carrots instead of carrots to baby carrots.

Wayne Morellini
July 30th, 2006, 12:30 AM

Buddy, "that other thread" lists the prices these things sell for! There is no debating that; there is NOTHING to understand about cost, pricing or whatever, because those are THE prices as sold by one of this site's sponsors, a store that moves many, many cameras.

The reason I don't want to contribute to it: whether something is expensive has everything to do with costing, pricing, and including all the credible options, and whether it is just plain low cost or not. It is not just what the manufacturers and retailers decide. There is no super drug. The A1 gets better latitude (one of the fundamentals of image quality), less oversaturated colour (another thing that detracts from pro image quality) or better resolution than some of the cameras listed, and the only thing that it really lacks, apart from one manual control and the level of usability of manual control, is its sensitivity (I left out progressive image, because it depends on the job style). A professional can use it quite well. It is true that normal one-chip cameras are really inferior, because they use an inferior complementary sensor filter instead. But the A1 and HC1 use an RGB/Bayer filter (I forget exactly which one), which is good enough for cinema cameras and professional still cameras, delivering colour accuracy close to 3-chip and not suffering too much in resolution, normally. So how can we say it is suddenly that much less than 3-chip?
Still all prosumer HD, still all recent, still all in the realm of comparison. We are talking about companies upping the price of HD here; how can we, unless we compare what has been released before? The HC1 is so close behind, and the JVC is just another prosumer camera. If Canon could offer true full-resolution 24/25 progressive, then that would be the biggest thing for the Canon A1 (and one of the reasons I did not buy a Sony A1). But the pricing, please: for the baby cameras, no more than $3K guys; actually, preferably, starting at under $2K street. And on upping the price of top prosumer cameras, please no more than $5-6K, under $4K street preferably, unless you are going to give us a better codec with full manual lens/shutter/exposure controls. Whether prosumer or baby prosumer, it is still prosumer.

Pete Bauer
July 30th, 2006, 05:55 AM

Wayne, sorry I can't agree with you. Just a couple years ago, miniDV cameras of that form factor had MSRPs approaching $3000 US. You can make all the speculative technical apples-to-oranges analyses you want -- and you seem to be very persistent in doing so -- but it won't change the fact that in today's marketplace, these cameras (including the XH's) are priced competitively. I think it is amazing that by year's end we will be able to buy a camera with 1.67MP chips, 20x L-glass, 24fps recorded to tape as 24fps, deep image control features, and a lot more for under $4000 US. Sure, I'd like to get one cheaper -- better yet, free! -- but that ain't reality. If you're one of the few who feel that none of these new relatively low-cost HD cameras are worth their price, maybe you should shop for a Viper or something. But there is no point in trying every which way to argue that the manufacturers should halve their prices just because you wish it. Let's return to discussing the XH cameras themselves.
Chris Hurd
July 30th, 2006, 08:57 AM

If Canon could offer true full-resolution 24/25 progressive, then that would be the biggest thing for the Canon A1

No, it would not. You're trying to imply that somehow there is some big difference between progressive scan and Canon Frame mode. If you've actually seen it, then you'd know there's practically no discernible difference. Most folks who perpetuate the myth of "true" vs. "fake" progressive scan have never looked at Frame mode. It pretty much *is* progressive scan. So no, this would not be the biggest thing for the Canon A1, because it's already in there, in the form of Canon Frame mode.

But the pricing, please: for the baby cameras, no more than $3K guys; actually, preferably, starting at under $2K street.

The XH camcorders are not baby cameras. They're not that small. An Optura is a baby camera, and we haven't seen one of those in HD yet, nor do we know at the moment what one would cost.

Whether prosumer or baby prosumer, it is still prosumer.

Argh, that ugly non-word again. I'm going to create a word filter in this forum's software, which will seek out and destroy all instances of that word. Professional is fine, consumer is fine, but "prosumer" is a term which is about to get banned around here.

Dave Perry
July 30th, 2006, 09:09 AM

Argh, that ugly non-word again. I'm going to create a word filter in this forum's software, which will seek out and destroy all instances of that word. Professional is fine, consumer is fine, but "prosumer" is a term which is about to get banned around here.

You go Chris! Fact is, the sub-$10K cameras are not even close to being consumer cameras anyway.

Michael Liebergot
July 30th, 2006, 09:20 AM

Has anyone seen any mention of the camcorders' build? Is it plastic, or a magnesium body like Sony's cameras? Canon makes great optics but seems to skimp on the housing, making it more plastic, as opposed to the strong magnesium bodies on Sony's cameras.
Boyd Ostroff
July 30th, 2006, 09:43 AM

Argh, that ugly non-word again. I'm going to create a word filter in this forum's software, which will seek out and destroy all instances of that word.

OK, I won't say it, but that word was actually coined in 1980 by Alvin Toffler to describe someone who is both a producer and consumer, so it wasn't even originally intended to be used in the context we usually see it around here...

Heath McKnight
July 30th, 2006, 10:15 AM

Six years ago, some guy around my age worked with me at a TV station that did BetaSP (we were both editors, though on weekends I'd shoot some stringer stuff exclusively for them with the XL1), and he was always determined to undermine me everywhere I went with my XL1. I'd talk about my movie, shooting stuff, etc., all with the XL1, and he'd walk by, interrupt and say, "Prosumer." It made him look foolish and irritated me. One day, I shot some stuff for the TV station with the XL1, dubbed it to BetaSP and handed it to him to edit. He didn't say anything about the footage, so I asked if I shot it okay and he said yeah, it looked good. I told him it was the XL1 and he flipped. Never said "prosumer" again.

hwm

Jeff Sayre
July 30th, 2006, 11:05 AM

Argh, that ugly non-word again. I'm going to create a word filter in this forum's software, which will seek out and destroy all instances of that word. Professional is fine, consumer is fine, but "prosumer" is a term which is about to get banned around here.

Thank you, Chris! I started a thread last summer, one week after I joined your great forum. The thread was entitled "The End of the Prosumer". We had some good discussion but the issue fizzled out after a while. You can read it here: http://www.dvinfo.net/conf/showthread.php?t=45885

Is it time to resurrect this year-old thread? I suggest those of you who are fed up with the word "prosumer" read that thread and post to it if you're interested.
Wayne Morellini
July 30th, 2006, 11:21 AM

Wayne, sorry I can't agree with you. Just a couple years ago, miniDV cameras of that form factor had MSRPs approaching $3000 US. You can make all the speculative technical apples-to-oranges analyses you want

Yes, I agree; discussing the cameras rather than me, I was out of here a while back. On the inaccuracies: I am comparing the rising price of fruit, not apples and oranges. You can pay the grocer whatever you like, but the only reality is that he charges a profit between what it is made and supplied for and what people are willing to pay. I really need to make a doco on that. And that is all it was about; all the other stuff was a side track from the issue. Sorry for my statements; there is always somebody trying to peddle unreal stuff to me. Effective logical argument is the only thing to do, but our species seems far too fallible to do it, but not to argue ;). Now I'd better quit and get out of here, as originally planned (I want my freedom....).

Simon Wyndham
July 30th, 2006, 11:38 AM

Most folks who perpetuate the myth of "true" vs. "fake" progressive scan have never looked at Frame mode. It pretty much *is* progressive scan

Agreed fully. I find it to be the same even with the old Canons. I used Frame mode all the time, and in fact whenever I asked clients which look they wanted to go with, they always chose Frame mode! Simply put, the human eye does not discern detail in that way, so any loss of resolution that Frame mode may introduce really is not noticeable.

Wayne Morellini
July 30th, 2006, 11:48 AM

No, it would not. You're trying to imply that somehow there is some big difference between progressive scan and Canon Frame mode. If you've

I was not trying to imply that, as I didn't know (and others didn't seem to either) what they had, and progressive was a desirable feature for me to buy. I was just inferring that they could/can if it wasn't already there (after having rewritten it a number of times trying to find the right word).
Sorry about the baby stuff, Chris; it was not meant that way, but as the bottom line of semi-professional camera ranges, in reference to the baby carrots still being carrots. The other word is the de facto standard terminology for crossover cameras often targeted at both high-end consumers and low-end professionals, often with features from both worlds. I fully support cameras with increased pro quality instead. And I don't care if anybody tells me pro......; it is what you can do with it on a good day. In that way Canon has always been on top. Good luck ;). Disconnecting/unsubscribing; it is nearly 4 am.

Michael Maier
July 31st, 2006, 03:12 AM

I love the way Canon works. They never make a fuss about non-working prototypes. These cameras look very interesting if you want a tricked-out fixed-lens camera on a budget, or if you need (for some odd reason) a compact camera with HD-SDI and genlock. It seems it could be the end of Z1 sales too. Sony had better come up with something new fast.

Michael Maier
July 31st, 2006, 03:28 AM

I'm not a fan of carrying extra stuff, but by using the HD-SDI and a device like the Bonsai drive you'd have a very portable way of recording 4:2:2 high def in difficult conditions.

I thought the Bonsai drive only recorded PAL/NTSC 4:2:2, not HD. I don't even see an HD-SDI option for it.

Michael Maier
July 31st, 2006, 03:50 AM

This is another big blow to the 720p format. We now have 8 1080i HDV cameras compared to the 3 720p HDV cameras. Those 3 720p cameras will be pretty much the same with just a few features added, such as the HD200 and HD250. I have always been a huge fan of 720p, but it is getting hard to stick with it since everything else seems to be going 1080i. Unless JVC comes out with some killer options for handheld 720p cameras, I think I will have to kiss 720p goodbye.

The calling card for 720p is that it's the only progressive HDV format as of yet. If you want real progressive, 720p is the only way for now.
With all the bells and whistles of the H1 and these two great new little cams, the most important factor, the video, is compromised when using 24f, which is the mode all indie filmmakers will use. Now, for those who want only interlaced for TV work, then definitely go for a 1080i camera. But for film-style production, 24f is just not there yet. Besides, only one of the 1080i cameras is an exchangeable-lens camera, which is a totally different ball game from fixed-lens palm-cams, and as far as I know it (the XL-H1) doesn't record HD 60p or 50p to tape as the 720p HD200/HD250 does. So I think we can all hold back on that goodbye kiss for now. I hope I haven't opened a can of worms, but you brought it up :)

Michael Maier
July 31st, 2006, 03:55 AM

Is "jet black" the official color of every HDV camera out there? I'm not saying it doesn't look cool, I just wonder why Sony, JVC and Canon simultaneously dropped their grays and silvers when a new format came along. What if I'm on a night shoot and I stumble and break my $7000 camera just because it's sooo stylish I can't even see it?

That's what worries me. My HD100 is not jet black, more like a charcoal. Is yours jet black?

Josh Dahlberg
July 31st, 2006, 04:04 AM

I love the way Canon works. They never make a fuss about non-working prototypes.

I agree with you; Canon keeps things up their sleeve until they're fully formed and practically ready to ship. The H1 was even more startling, because at the time it was announced a lot of people thought Canon were way off their game, and then bang, they had this beautiful camera on the shelves. It would be interesting to know what pays off better in terms of sales: the Canon way, or the marketing strategy of that other manufacturer -- you know, racking up the hyperbole about a work in progress endless months out. On the other hand, the Canon way can also be a little unsettling.
I'm half considering buying an H1 and picking up one of the new puppies later on as a b-camera, but you never know when they're going to spring an H1s on us, and it could be sooner rather than later.

Michael Maier
July 31st, 2006, 05:04 AM
No, it would not. You're trying to imply that somehow there is some big difference between progressive scan and Canon Frame mode. If you've actually seen it, then you'd know there's practically no discernible difference. Most folks that perpetuate the myth of "true" vs. "fake" progressive scan have never looked at Frame mode. It pretty much *is* progressive scan, so no, this would not be the biggest thing for the Canon A1, because it's already in there, in the form of Canon Frame mode.
I'm sorry to disagree, Chris, but there is a big difference. For starters, 24f doesn't look nearly as filmic as progressive. All progressive scan footage I have seen looked more filmic than anything I have seen from an XL-H1. Now you may say this is a matter of opinion and subjective, even though I truly think footage from the XL-H1 looks like great video and not filmic at all, and the difference is easily discernible to my eyes. But on top of that, and something that is not subjective, there's the resolution drop when using 24f, whereas with true progressive, resolution is actually increased in comparison to interlaced. So considering that, plus the fact that it could never physically or technically be considered progressive if it doesn't originate from progressive CCDs, saying it's pretty much progressive is like saying that because the Genesis has a 35mm-sized sensor and DOF, it pretty much *is* film. Folks perpetuate the myth of "true" vs. "fake" progressive scan because that's exactly what 24f is: an imitation of progressive scan.

Michael Maier
July 31st, 2006, 05:08 AM
On the other hand, the Canon way can also be a little unsettling.
I'm half considering buying an H1 and picking up one of the new puppies later on as a b-camera, but you never know when they're going to spring an H1s on us, and it could be sooner rather than later.
Well, that can really happen with any manufacturer, and it's just the nature of the biz, I guess. See what JVC did, releasing the HD110 within just over 8 months of releasing the HD100?

Josh Dahlberg
July 31st, 2006, 05:39 AM
I truly think footage from the XL-H1 looks like great video and not filmic at all and the difference is easily discernible to my eyes
Are you serious? So you go into a theatre, sit down to watch a low-budget feature, digitally projected, with great lighting, great acting, great directing, great script... but wait, it's immediately apparent that it's been shot on an XL-H1 because it's not filmic at all, not like, say, a Panny. Sorry, I don't mean to sound facetious, but I'm just trying to work out what you're saying, because it seems incredible to me: not filmic at all? Perhaps I'm just being defensive because I can't see what you see... I've looked at a lot of H1 footage and it looks wonderful to me, and handled correctly, as filmic as any other 1/3" cam on the market. Admittedly, I've only been in the game five years, but I'm a keen observer. If I can't see it, I wonder if any of my clients, or the viewing public, can.
...but on top of that, and something that is not subjective, there's the resolution drop when using 24f, while when using progressive, resolution is actually increased in comparison to interlaced.
Michael, while this is true, in this price bracket it's beside the point. The Canon drops from having WAY more resolution than competing models to having a very respectable resolution, on a par with the JVC and better than the Sony and the Panny. So is progressive resolution of 540x540 (HVX) more resolution than frame-obtained progressive of 800x540 (Canon)? I'm just quoting Adam's test numbers and trying to figure out your argument.
Because surely you have to factor in the base numbers if you're saying there's a drop in resolution. No matter how they achieve it, the Canon's 24f mode is sharper than Panny's 24p. The DVX and XL2 are "true" progressive, but their res is way, way lower (obviously). I don't think you can use a resolution argument to discredit frame mode. It would have to be solely on the basis of cadence.

Michael Maier
July 31st, 2006, 05:48 AM
Michael, while this is true, in this price bracket it's beside the point. The Canon drops from having WAY more resolution than competing models to having a very respectable resolution, on a par with the JVC and better than the Sony and the Panny. So is progressive resolution of 540x540 (HVX) more resolution than frame-obtained progressive of 800x540 (Canon)?
No, I was talking about 1280x720, which the XL-H1, despite being marketed as a 1080-line camera, can't match when in frame mode, and 1280x720 is the least resolution to be considered HD.
I'm just quoting Adam's test numbers and trying to figure out your argument. Because surely you have to factor in the base numbers if you're saying there's a drop in resolution. No matter how they achieve it, the Canon's 24f mode is sharper than Panny's 24p.
Who elected the HVX200 as benchmark? The benchmark is the HD standard.
Are you serious? So you go into a theatre, sit down to watch a low-budget feature, digitally projected, with great lighting, great acting, great directing, great script... but wait, it's immediately apparent that it's been shot on an XL-H1 because it's not filmic at all, not like, say, a Panny. Sorry, I don't mean to sound facetious, but I'm just trying to work out what you're saying because it seems incredible to me, not filmic at all?
Now I left this one for last because you totally lost me here. What does this have to do with discussing the technical aspects of progressive and frame mode, and with making a technical point that frame mode does or does not equal progressive?
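The pixel arithmetic behind this exchange is easy to lay out side by side. A quick sketch using only the figures quoted in the thread (Adam's measured chart numbers for the HVX and Canon, plus the nominal HDV raster sizes) -- treat these as the thread's numbers, not official specifications:

```python
# Frame sizes as quoted in this thread: measured resolution-chart figures
# for the HVX200 24p and Canon 24f, nominal raster sizes for 720p/1080i.
modes = {
    "HVX200 24p (measured)": (540, 540),
    "Canon 24f (measured)": (800, 540),
    "720p frame (nominal)": (1280, 720),
    "1080i HDV frame (nominal)": (1440, 1080),
}

# Print the total pixel count per frame, smallest to largest.
for name, (w, h) in sorted(modes.items(), key=lambda kv: kv[1][0] * kv[1][1]):
    print(f"{name:26s} {w}x{h} = {w * h:>9,} pixels")

# By these figures the Canon's 24f frame carries roughly 1.5x the HVX's
# measured pixels, yet under half of a nominal 1280x720 frame -- exactly
# the gap the two posters are arguing over.
```

In other words, both sides are right on the arithmetic: 24f is a drop from the Canon's own interlaced ceiling, and still a step up from the measured competition.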
Chris Hurd
July 31st, 2006, 05:50 AM
I'm sorry to disagree, Chris, but there is a big difference. For starters, 24f doesn't look nearly as filmic as progressive. All progressive scan footage I have seen looked more filmic than anything I have seen from an XL-H1.
No need to say you're sorry to disagree with me, Michael. But are you referring to a side-by-side shoot, where all parameters involved, including the lighting, place and time, and most importantly the skill sets of the operators, were perfectly identical? Where did you see that? If you're not referring to a side-by-side comparison shoot, then I suggest that the "filmic" differences you saw were related to factors other than the frame rate, the human equation being the one making the most impact in any visual result.
on top of that, and something that is not subjective, there's the resolution drop when using 24f, while when using progressive, resolution is actually increased in comparison to interlaced.
Reso, shmeso. Forget numbers; they're an impediment to the creative process. What *visual* difference does it make? What is the audience going to notice? Do you think your average moviegoer could pick out the difference between 24P and 24F in a 35mm film-out?
So considering that plus the fact that it could never physically or technically be considered progressive if it doesn't originate from progressive CCDs,
Technically? That's why they call it Frame mode and not progressive scan. Physically? Are we talking about what goes to tape? The answer might surprise you. The distinction is no longer as clear as it used to be.
Folks perpetuate the myth of "true" vs. "fake" progressive scan because that's exactly what 24f is, an imitation of progressive scan.
Wholeheartedly disagreed. I think the reason why some people perpetuate that myth is because they buy into the marketing hype that tries to make more out of the difference than it's worth.
For example, nobody seems to be complaining about the "fake" 24P that comes out of the Sony XDCAM HD camcorders, which record a 24fps image from interlaced CCDs. Why is that? Because there's no progressive scan HD camera in that price range, and therefore no marketing battle to fight, that's why.

Chris Hurd
July 31st, 2006, 05:56 AM
1280x720 is the least resolution to be considered HD.
Woah! Absolutely false. By definition, HD is *any* resolution higher than standard definition. One pixel taller and wider than PAL is high definition. 721x577 is the least resolution to be considered HD.
The benchmark is the HD standard.
And that's a very loosely defined benchmark!

Josh Dahlberg
July 31st, 2006, 06:03 AM
What does this have to do with discussing technical aspects of progressive and frame mode and with making a technical point that frame mode does or does not equal progressive?
Well, technically they are not the same; I don't think there was ever any argument about that. Chris has certainly never said that, and I didn't mean to imply it. The issue is whether they have the same effect, whether visually they achieve the same outcomes. You stated that it's very clear to your eye that the H1 is not filmic. That in itself is not a technical appraisal but a subjective one, albeit with trained eyes. So - and perhaps it was a folly - I put forward a real-world situation, a completed digital film showing in a theatre, because when all is said and done, nobody cares if frame mode IS the same as progressive (by definition it's not) but whether it achieves the same results.

Simon Wyndham
July 31st, 2006, 06:13 AM
If you find the size of the XD discs less friendly than Beta tapes, I can't help you. I think the idea is pretty ridiculous, but hey... Just like the Canons, the difference is not noticeable to the eye. Does anyone know how the F mode on the Canon H1 is done? Because if it is done the same way as the old GL1, then it is a loss of 30%.
Roughly the difference between NTSC and PAL. In other words, most people simply would not notice. Wayne, progressive scan is progressive scan no matter how it is achieved. If it is running at 24fps with full frames, it will have EXACTLY THE SAME cadence as film. There is no disputing this at all. Any notion that there is somehow a difference between an HVX200 running at 24fps, a DVX100 at 24fps and a Canon at 24fps is quite unrealistic IMHO. In fact, I bet that if I made a sequence using my usual film-look techniques with any of the new HD cameras, you would not be able to tell me which was F mode and which was a true P mode. I have slow-motion footage from a PDW-F330. In slow motion the resolution halves to 540 lines. Yet I still find it hard to tell. People have also told me how they thought it was shot using overcranked film in some cases. There is simply no substitute for actually using a camera, or seeing a well-shot programme made with them, rather than debating figures.

Chris Hurd
July 31st, 2006, 06:37 AM
if it is done the same way as the old GL1 then it is a loss of 30%.
Nope, this isn't your grandmother's Frame mode. This is the "new and improved" version. Canon has had seven years to completely update Frame mode. It's much better than the old 1997 implementation from the XL1 (and GL1 in 1998).

Kevin Shaw
July 31st, 2006, 07:33 AM
By definition, HD is *any* resolution higher than standard definition. One pixel taller and wider than PAL is high definition. 721x577 is the least resolution to be considered HD.
I've heard this before, but is it an officially accepted definition anywhere? HDTV in particular has specific approved formats, of which 720p is the lowest resolution.

Peter Moore
July 31st, 2006, 07:36 AM
Woah! Absolutely false. By definition, HD is *any* resolution higher than standard definition. One pixel taller and wider than PAL is high definition. 721x577 is the least resolution to be considered HD. And that's a very loosely defined benchmark!
Chris, I'm disappointed! That's not true at all. I suppose we could play semantic games, but, at least today, what is "considered" by the industry and by the consumer to be HD is 720p or higher. That is established definitively in the ATSC HD standards, and undoubtedly, if you surveyed home theater owners, they would tell you 720p. I would be loath to consider a camera HD if it couldn't create pictures with true 720p resolution.

Simon Wyndham
July 31st, 2006, 07:43 AM
Good. Well, if it is even better than the old frame mode (which I thought was rather good at the time), then there is no excuse not to use it.

Chris Hurd
July 31st, 2006, 08:29 AM
I suppose we could play semantic games, but, at least today, what is "considered" by the industry and by the consumer to be HD is 720p or higher.
All I'm doing with that statement is proving a point... if some folks want to get "technical" about what is or isn't progressive scan, that is, make an argument one way or the other based solely on facts and figures and not by what is actually acceptable by industry standards or consumer experience, then the same folks who want to prove something using numbers or definitions alone need to apply that line of reasoning to everything else, not just those concepts that prove their favorite point of view. And I'm right. High definition is defined by the ATSC as any television format with a higher resolution than SDTV. That's a fact. Now, I agree with you that the common practice these days, what the market actually accepts today, starts at 720. Like you, I too would be loath to consider a camera HD if it couldn't create pictures with true 720p resolution. But if somebody is going to suggest to me that Frame mode isn't progressive from a strictly technical standpoint, I would have to counter that from this very same strictly technical standpoint, HD starts at 721x577. From there, things can go downhill fast. If that sounds ridiculous, it's meant to be.
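As an aside, the two readings of "HD" being batted back and forth here reduce to a few lines each. A sketch of both positions expressed as code -- the SD baseline and the format list are simplifications for illustration, not quotations from any standards document:

```python
# Two competing readings of "what counts as HD", as argued in this thread.

PAL_SD = (720, 576)  # the larger of the two common SD rasters

def is_hd_technical(width: int, height: int) -> bool:
    """The 'strictly technical' reading: any raster exceeding SD."""
    return width > PAL_SD[0] and height > PAL_SD[1]

# The industry/consumer reading: only the named ATSC HD picture formats.
ATSC_HD_RASTERS = {(1280, 720), (1920, 1080)}

def is_hd_market(width: int, height: int) -> bool:
    """The market reading: 720p or 1080-line formats only."""
    return (width, height) in ATSC_HD_RASTERS

# 721x577 qualifies as "HD" only under the technical reading -- which is
# the whole point of the reductio: the definitions give different answers.
print(is_hd_technical(721, 577), is_hd_market(721, 577))    # True False
print(is_hd_technical(1280, 720), is_hd_market(1280, 720))  # True True
```

The disagreement in the thread is simply over which of these two predicates is the right one to argue from.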
Strictly technical definitions sometimes are not the best ways to get points across, nor are they the final arbiters of conflicting points of view, nor are they substitutes for real-world experience. The same audience, myself included, that considers 720 the place where HD starts is the same audience that can't tell the difference between Frame mode and progressive, and wouldn't care if you showed them.

Zack Birlew
July 31st, 2006, 10:41 AM
I'm sorry, I can't see any reason for people to complain. Quite frankly, where else can you get a 1440x1080-resolution 1080i HD camera with a 24p-esque option for less than $8,999 MSRP, or now, $3,999 MSRP? True, 720p has its advantages, but as was stated a long time ago, regardless of whether it is 720p or 1080i, if handled the same, the image will look identical; the only real difference comes with 1080p, but we don't have that yet in this price range. Unfortunately, there is a lack of progressive cameras compared to the plethora of interlaced cameras, and those we do have are 720p, not the ideal 1080p. But who knows what's around the corner these days? Granted, manufacturers are using various compression schemes at this point in time for the majority of <$10,000 video cameras, but the image quality is still much better than we could hope for with MiniDV. There are also options for uncompressed HD output if you really want it; look at the HD-SDI option. From what I understand, it's an amazing achievement for someone to put that on a camera below $25,000. It also opens many doors for the filmmaker in post. Looking at the XLH1, I dislike the low-resolution LCD viewfinder and default image. I'm a simple guy; I'd prefer it if the camera had a good default image like the DVX or HVX, but that doesn't mean I couldn't get a similar or better image by tweaking the XLH1 to my liking.
With the A1 and G1, I've got a better selection of 24fps cameras, regardless of how they get that frame rate, and I have the same options to tweak the image as I would with the XLH1. What is more impressive is that these new cameras are good enough for a film-out right off the bat (even though they might look like something you'd see in the "Regal 20" commercials at some movie theatres :) ). Also, they're cheap; you can get really professional-looking 24fps HD for far, far less than you'd pay for a Panasonic Varicam or Sony F900. If none of those options are good enough, then either wait for something better or spend extra for a higher-end camera that is good enough.

Chris Barcellos
July 31st, 2006, 03:43 PM
Since the discussion has strayed from the Canon cameras in particular, does anyone have any guess where a 3-CMOS-chip HD camera might fit in this whole scheme of things? Would CMOS chips actually speed up processing in some way to make 1080p more possible?

Chris Hurd
July 31st, 2006, 03:49 PM
If the discussion has strayed from the Canon cameras in particular, that means I haven't been doing my job. Let's bring it back on topic to the CCD-equipped G1 and A1 specifically, and take the suppositions about CMOS camcorders elsewhere. Thanks in advance.

Chris Barcellos
July 31st, 2006, 03:53 PM
Where? Area 51?

Chris Hurd
July 31st, 2006, 03:57 PM
Yes, but tread lightly there. Things have been known to disappear without a trace from Area 51.

Chris Barcellos
July 31st, 2006, 04:01 PM
Oh, the mysteries of Area 51...