January 21st, 2008, 05:42 AM | #121 |
Major Player
Join Date: Mar 2005
Location: canterbury
Posts: 411
|
Take,
Thanks. Have you considered one of the many GigE versions out there? What was the reason for going with the Pike over the other camera heads? Or, for that matter, why not the Pike with the 2/3" Sony ICX285 sensor (which claims better dynamic range but lower resolution)? It seems you've had quite a few calibration problems; is that down to the Pike itself, or to all the work involved in getting your recorder working? I've seen images from the Pike via StreamPix/CineForm RAW that *seem* to look pretty good. That particular sensor is slightly wider than the usual 1", hence the question. Which lens have you been using? Many thanks, Paul |
January 21st, 2008, 05:55 AM | #122 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
|
Paul,
Yes, I first considered GigE, but I could not find the specifications of that protocol, which is why I now use an IIDC camera: it has an open protocol and drivers for OS X. I like the HD resolution, and I would like to move to the 2K sensor at some point, which is why I use the 1" version. I am not sure why I have these non-uniformity problems; maybe it is a bad sensor, or it is caused by the microlenses (which, according to the literature, cause all sorts of problems). The non-uniformity only shows in dark scenes with gamma correction applied; the manufacturer says that I should use gamma correction and that this is normal operation. At one time I was able to fix all the non-uniformity problems, but I am redesigning some things to work better, and it seems my new algorithm has a bug somewhere, which is why I have so many problems right now. I use the Fujinon CF16HA-1: http://www.fujinon.com/Security/Prod...ry.aspx?cat=47 |
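Not from Take's code, but as a minimal sketch of the standard two-point (dark frame plus flat field) non-uniformity correction being described here; the function names, the 12-bit white level and the gamma value are assumptions for illustration:

```python
import numpy as np

def build_calibration(dark_frames, flat_frames):
    """Average stacks of dark frames and flat-field frames into per-pixel maps."""
    dark = np.mean(dark_frames, axis=0)                # fixed-pattern offset per pixel
    flat = np.mean(flat_frames, axis=0) - dark         # response to uniform illumination
    gain = np.mean(flat) / np.clip(flat, 1e-6, None)   # normalise so the average gain is ~1
    return dark, gain

def correct(raw, dark, gain, white_level=4095.0):
    """Two-point correction of a raw (e.g. 12-bit) Bayer frame."""
    linear = (raw.astype(np.float64) - dark) * gain
    return np.clip(linear, 0.0, white_level)

def apply_gamma(linear, white_level=4095.0, g=1.0 / 2.2):
    """Gamma correction; this is what stretches small black-level errors
    into visible non-uniformity in dark scenes."""
    return (np.clip(linear, 0.0, white_level) / white_level) ** g
```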
January 21st, 2008, 06:32 AM | #123 |
Major Player
Join Date: Mar 2005
Location: canterbury
Posts: 411
|
There's an open standard, GigE Vision; I wonder if it's worth your while taking a look at that, because FireWire, I would think, is going to seriously constrict your options. Most of the cameras I've looked at also have their own drivers and software to a greater or lesser extent. I have to say, though, that I've not tried developing with any of them yet, so most of what I'm writing is guesswork based on specs and whatever information I can gather. Please take it with a nice pinch of salt, as the real world is often quite different!
http://www.prosilica.com/gigecameras/index.html is one of the many manufacturers that use the KAI-2093 (others include Basler and JAI, for example). At the moment I've not found any CCD-based sensor at 2K with decent frame rates; one of the limitations of CCDs is their relatively slow readout (compared with CMOS). But then I thought CCDs were much better with regard to FPN? I understand that because each cell on a CMOS sensor is addressed individually, it's much more likely to show variations in amplification. Hence the question about the problems you've been experiencing. I thought Cesar Rubio on his site had just plugged a Pike into StreamPix and CineForm and output some pretty decent-looking images. Cheers! |
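A toy simulation of that amplification-spread point (made-up numbers, not measurements from any real sensor): a fraction of a percent of column-to-column gain and offset variation is nearly invisible in linear data, but turns into clear banding once the blacks are lifted by a gamma curve, which matches the "only shows in dark scenes with gamma" behaviour above.

```python
import numpy as np

rng = np.random.default_rng(0)
height, width = 1080, 1920

# A flat, dark scene at ~2% of full scale plus random read noise.
scene = np.full((height, width), 0.02) + rng.normal(0.0, 0.002, (height, width))

# CMOS-style column-to-column variation: 0.5% gain spread, tiny offset spread.
column_gain = 1.0 + rng.normal(0.0, 0.005, width)
column_offset = rng.normal(0.0, 0.001, width)
raw = scene * column_gain + column_offset

# Compare the column-to-column spread before and after a 1/2.2 gamma curve:
# near black the gamma curve has a slope well above 1, so the same fixed
# pattern becomes several times more visible.
linear_spread = raw.mean(axis=0).std()
gamma_spread = (np.clip(raw, 0.0, 1.0) ** (1.0 / 2.2)).mean(axis=0).std()
print(f"column spread  linear: {linear_spread:.5f}   after gamma: {gamma_spread:.5f}")
```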
January 21st, 2008, 06:40 AM | #124 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
|
Hello Paul,
All the drivers for those cameras are non-OS X, and there is nowhere you can actually download this "open standard". I may switch to GigE at some point, but for now I have IIDC working. I am afraid I may need to design my own camera eventually. That is also why I think my particular camera is just bad: Cesar gets nice pictures out of them. Then again, his nice pictures are not very dark, and I also get nice pictures outdoors without any processing. Cheers, Take |
January 21st, 2008, 08:12 AM | #125 |
Major Player
Join Date: Nov 2007
Location: Athens Greece
Posts: 336
|
The quality of cameras will vary within a series, even for serial numbers that are close together, so it's always a serious bet that can cost 5k or 9k euro for the large sensors. You cannot trust the manufacturers to replace the cameras you don't like. In a production environment you need to test a number of cameras in a scientific way, one by one, and only use the ones that meet your image quality requirements. Digital cinema is an application that is well above anything in machine vision in terms of required quality. You need to be sure the sensor will perform, because it will possibly be used in natural light, with lots of gain, and post-processed in extreme ways.
You can't expect a finished image straight out of a camera; it takes work. Rubio's samples use quite a lot of commercial software (a recording app plus CineForm), cannot maintain precise frame rates, and have audio sync problems. The lenses are very soft, which hides the image quality problems of the simple debayer, but you can still see plenty of chroma aliasing if you zoom in. Generally, the package is expensive for an unfinished solution: there is no user interface, no focus aids, no real usable control. If you do not want a solution in the camcorder form factor, Take's solution will be cheaper and better than anything that can be put together with off-the-shelf software components, because Take actually writes software :) It took Rubio quite a while to realise that the isochronous speeds of the bus (ISO400, ISO800, etc.) have nothing to do with sensitivity, even though the frame rate halved when the ISO speed was changed. |
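The frame-rate halving mentioned above falls straight out of bus bandwidth arithmetic. A rough sketch, assuming the nominal 8,000 isochronous cycles per second and the maximum packet payloads for S400/S800; real IIDC modes quantise the result to fixed frame-rate steps and lose a bit to overhead:

```python
# Rough ceiling on frame rate imposed by the FireWire isochronous channel.
# Packet sizes below are the nominal maxima per 125 us cycle and are an
# assumption for illustration; IIDC packet sizes and overhead reduce them.
CYCLES_PER_SECOND = 8000
MAX_ISO_PACKET_BYTES = {"S400": 4096, "S800": 8192}

def max_fps(width, height, bits_per_pixel, speed):
    bytes_per_frame = width * height * bits_per_pixel / 8
    bytes_per_second = CYCLES_PER_SECOND * MAX_ISO_PACKET_BYTES[speed]
    return bytes_per_second / bytes_per_frame

for speed in ("S400", "S800"):
    print(speed, f"~{max_fps(1920, 1080, 8, speed):.1f} fps for 8-bit raw Bayer")
# S400 tops out around 15.8 fps and S800 around 31.6 fps, so halving the bus
# speed halves the available frame rates; sensor sensitivity never enters into it.
```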
January 21st, 2008, 08:32 AM | #126 |
Major Player
Join Date: Nov 2007
Location: Athens Greece
Posts: 336
|
A JAI Kodak HD GigE is about 9,000 euro with tax and shipping, by the way, and it's practically a good small webcam that outputs uncompressed video. There are many technicalities in designing a camera, and lots of software and hardware engineering issues to solve: thousands of man-hours in user interface design, testing, processing algorithms, troubleshooting and so on. It's not as simple as buying a head and hooking it to a computer. It would cost as much as a complete, properly engineered solution, and it would still be completely useless in a video production situation due to user interface, image quality, and basic implementation problems such as frame rate and audio sync.
|
January 21st, 2008, 08:40 AM | #127 |
Major Player
Join Date: Mar 2005
Location: canterbury
Posts: 411
|
John,
Whilst I agree about the artifacts in most of the examples, the Elmo Raw 12 looked much nicer; I wonder if StreamPix had fixed some aspects of the CineForm integration by that stage. Also, the DivX of him with his kids is nice in terms of frame-to-frame consistency. The examples, lighting and environments are not ideal for testing, though! I'm not sure what lens he was using; some photos show an f/1.2 50mm Nikon, which would have an angle of view similar to 125mm (I think!) on this sensor, and some of these examples look wider than that. Some of the companies I've spoken to about various cameras imply that some machine vision applications are beyond digital cinematography; it depends on the camera and the supporting hardware behind the sensor. Can't beat creating your own, though (which you're doing). I've found CineForm to work very well for us (Prospect, though), but I have no experience of the RAW version. SI footage looks very good, though. Take, I understand the Mac OS X issue now; I hadn't taken that into account. It's been wonderfully enlightening reading your reports. I have no problem with development, but I'll always try to avoid reinventing the wheel. If I can take someone else's and smooth it off a bit, that'd make me happier! Cheers, Paul |
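A quick back-of-the-envelope check on that angle-of-view guess, assuming the KAI-2093's active area is about 14.2 x 8.0 mm (1920 x 1080 at 7.4 µm pixels; treat those figures as an assumption):

```python
import math

def equivalent_focal_length(f_mm, sensor_w_mm, sensor_h_mm):
    """35mm-equivalent focal length from the diagonal crop factor."""
    full_frame_diag = math.hypot(36.0, 24.0)            # ~43.3 mm
    sensor_diag = math.hypot(sensor_w_mm, sensor_h_mm)  # ~16.3 mm for 14.2 x 8.0
    return f_mm * full_frame_diag / sensor_diag

print(f"{equivalent_focal_length(50.0, 14.2, 8.0):.0f} mm equivalent")
# Roughly 133 mm, so the 125 mm estimate for a 50 mm lens is in the right ballpark.
```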
January 21st, 2008, 08:46 AM | #128 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
|
Hi John,
Variation between individual sensors is what I thought the problem would be as well. Anyway, I am designing my system so that even bad sensors would be good enough for digital cinema work. 6000 euro was a pretty expensive bet, so I have no choice but to continue what I have started. In a weird sort of way I am lucky I got such a bad sensor; the work I am doing now will be a benefit later on. Cheers, Take |
January 21st, 2008, 08:54 AM | #129 |
Major Player
Join Date: Nov 2007
Location: Athens Greece
Posts: 336
|
I think SI are using their own debayer algorithms.
The recording app is just moving data; I believe it has nothing to do with image quality in this case. Machine vision applications are designed for processing images in scientific or industrial settings, and StreamPix can do practically nothing in that area. There are better packages which work great, but not for video applications; it's not the intended market. The software is not designed for streaming video, so you have to build everything yourself. Anything will look OK if highly compressed to an MPEG-4 variant: the format cannot preserve texture detail, shadow detail is eliminated, noise is reduced because the format cannot encode it, and so on. The Elmo sample with a sharper lens would look like this (200% zoom): http://img182.imageshack.us/img182/3...12uncomct2.png Probably a lot worse, actually, because some of the aliasing is already filtered out by the ultra-soft lens. |
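For readers wondering where the chroma aliasing comes from, here is a minimal sketch of a naive bilinear demosaic (assuming an RGGB layout; this is not the algorithm any particular package uses). Because red, green and blue are sampled at different positions and simply averaged back together, fine detail that the lens has not already blurred turns into coloured fringing.

```python
import numpy as np
from scipy.ndimage import convolve

def bilinear_demosaic(raw):
    """Naive bilinear demosaic of an RGGB Bayer mosaic (float array, H x W)."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Averaging kernels: green from its 4 cross neighbours, red/blue from
    # their 2 or 4 nearest samples of the same colour.
    k_green = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb    = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    rgb = np.empty((h, w, 3))
    rgb[..., 0] = convolve(raw * r_mask, k_rb,    mode="mirror")
    rgb[..., 1] = convolve(raw * g_mask, k_green, mode="mirror")
    rgb[..., 2] = convolve(raw * b_mask, k_rb,    mode="mirror")
    return rgb
```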
January 21st, 2008, 09:04 AM | #130 |
Major Player
Join Date: Nov 2007
Location: Athens Greece
Posts: 336
|
|
January 21st, 2008, 09:09 AM | #131 |
Major Player
Join Date: Nov 2007
Location: Athens Greece
Posts: 336
|
Take a look at these cameras. They cost about 3,600 euro apiece and are 2/3", using the manufacturer-provided video recording apps.
http://img168.imageshack.us/img168/5...arison2gx8.jpg Notice the debayer quality problems of the app and the uniformity problems (vertical lines) of the sensors, and also the difference in sensitivity: both cameras use the same expensive sensor. I don't believe anyone used to even a $200 consumer camcorder would find this quality acceptable, but it is acceptable in most machine vision applications, and these cameras are quite popular. |
January 21st, 2008, 09:16 AM | #132 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
|
Hi John,
I don't have any fancy lights or anything, so I can only use daylight. That particular ColorChart shot was made indoors with natural light from outside, on an overcast day with lots of rain. Exposure was 0.02 seconds, and I think the lens was set on its third stop (f/2.4?). Cheers, Take |
January 21st, 2008, 09:20 AM | #133 |
Major Player
Join Date: Nov 2007
Location: Athens Greece
Posts: 336
|
The usual incandescent indoor lighting and candles are good for testing. You might have noticed we use them a lot :)
|
January 21st, 2008, 09:20 AM | #134 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
|
Wow John, those images are pretty bad. I do have that striping as well, but horizontally. That makes me feel a little more comfortable.
I wonder what they do in consumer cameras. Do they just make sure the sensors produce an acceptable picture, or do they solve it in software? |
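One software-side approach to that kind of horizontal striping (a guess at what a camera pipeline might do, not a description of any actual consumer camera): estimate a per-row offset from pixels that see no light, such as a shielded optical-black margin, and subtract it.

```python
import numpy as np

def remove_row_stripes(raw, dark_columns=16):
    """Subtract a per-row offset estimated from the left-hand columns.

    Assumes those columns are (nearly) unexposed, e.g. optically black pixels
    or a covered border; on a normal frame you would need real dark-pixel
    statistics instead.
    """
    raw = raw.astype(np.float64)
    row_offset = np.median(raw[:, :dark_columns], axis=1, keepdims=True)
    return raw - (row_offset - row_offset.mean())   # preserve the overall black level
```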
January 21st, 2008, 09:23 AM | #135 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
|
I noticed your candle pictures. I am planning to support multiple calibration data sets, so that you can use either daylight- or tungsten-balanced calibration. You will have to select which calibration data you want to use in the recording application beforehand, though (or modify the movie file with a hex editor :-).
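A sketch of how such per-illuminant calibration sets could be organised and selected before recording; the names, sizes and white-balance gains here are invented for illustration and are not Take's actual design:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class CalibrationSet:
    dark: np.ndarray        # per-pixel offset map
    gain: np.ndarray        # per-pixel gain map
    wb_gains: tuple         # (R, G, B) white-balance multipliers

# Placeholder sets; in practice each would be measured under the matching
# light source and loaded from files saved by the calibration procedure.
H, W = 1080, 1920
CALIBRATION_SETS = {
    "daylight": CalibrationSet(np.zeros((H, W)), np.ones((H, W)), (2.0, 1.0, 1.5)),
    "tungsten": CalibrationSet(np.zeros((H, W)), np.ones((H, W)), (1.4, 1.0, 2.3)),
}

# Chosen once in the recording application before capture starts; the choice
# would then be written into the clip's metadata so playback knows which set applied.
active = CALIBRATION_SETS["tungsten"]
```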
|