View Full Version : 4:4:4 10bit single CMOS HD project
Steve Nordhauser June 15th, 2004, 02:53 PM OK.
Rob L: average vs peak clock rates. The average is the visible image size x frame rate (1280x720x24fps = 22.1Mpix/sec). Due to blanking time, for the SI-1300 to do that rate, the pixel clock is 27MHz. This is the rate that the bus must meet during image readout. With a hard drive and a big buffer, the average number works fine.
Wayne: Rolling shutter. I think you have it. The rolling shutter 'artifact' is tied to the single frame readout time. The time from the first line read out to the last is always one frame time. You can run the SI-1300H up to 60fps. This will drop the time for one frame from 1/30 to 1/60 sec, reducing the artifact. As you say, this will reduce the integration time (light sensitivity). UNLESS (this was the hmmmm part), you read out the frame at 1/60 sec and then have a long vertical blanking time - all the pixels will continue to integrate during the trash frame time. Now you are integrating at 1/30th sec, reading out at 1/30th sec average, but the frame you read is done top to bottom at 1/60th.
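A rough timing sketch of that fast-readout / long-blanking idea (the 30/60 numbers are just assumptions to make the arithmetic concrete, not the SI-1300H's actual register settings):

// Sketch of the "fast readout, long blanking" idea Steve describes.
// Assumed numbers for illustration only; the real values depend on the
// sensor clock and register settings.
#include <cstdio>

int main() {
    const double target_fps   = 30.0;          // delivered frame rate
    const double readout_fps  = 60.0;          // rate the rows are actually scanned at
    const double frame_period = 1.0 / target_fps;   // ~33.3 ms between delivered frames
    const double readout_time = 1.0 / readout_fps;  // ~16.7 ms top-to-bottom scan (rolling skew)
    const double blanking     = frame_period - readout_time; // extra vertical blanking

    printf("frame period : %.1f ms\n", frame_period * 1000.0);
    printf("readout time : %.1f ms (rolling-shutter skew)\n", readout_time * 1000.0);
    printf("blanking     : %.1f ms (pixels keep integrating)\n", blanking * 1000.0);
    // Integration still spans the full frame period (~1/30 s), but the
    // top-to-bottom skew is only the readout time (~1/60 s).
    return 0;
}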
Wayne: On making money. Yes, we will be interested in building systems and creating bundles. I have to wait for my R&D group (you guys) to come up with a product. Seriously though, I will always be clear what is public domain to future customers. I don't have the resources to solve all these problems, but I may help out. If that BitJazz codec does what it says - 2:1 lossless faster than the speed of light, I will consider negotiating a volume license and give a license away with the cameras. I do have the corporate blessing to be helping here. I think great things will happen.
Wayne: USB 2.0. To get high data rates, you eat a fair amount of CPU time. Also, transfer rates are highly dependent on the host controller. Intel ICH4 and 5 south bridges are 2-3x faster than external controllers. The typical USB interface chip has small FIFOs making real-time tough at high rates when you don't want to drop frames. We don't compress in the camera because either you want lossless, which only gets 2:1 or lossy and then you need to be able to select a lot of variables. CPU compression and raw recording are more cost effective.
Jason Rodriguez June 15th, 2004, 02:58 PM Obin,
Right now Bitjazz is nice, but it's only 8-bit. I also believe that's real-time encoding for 8-bit SD video, not HD video.
On the rolling shutter topic again:
How slow or fast does the shutter have to be in order to get away from the artifacts? From the footage that I saw in the wmv file, there are no apparent stretching artifacts at all. How bad would it be at 24fps, with a 1/24 shutter speed? Also, couldn't we just blank for 1/48 of a second and then take the next frame on the next 1/24th of a second, so it's imitating a mechanical shutter at 180 degrees (like a film camera)? I think this was mentioned before, but I'm not sure what happened in the response. I heard something about a firmware upgrade being needed.
The problem is that if we clock the chip at a high MHz to satisfy the problem, and drop frames, then we're losing the ability to shoot slow motion. Not good.
Rob Scott June 15th, 2004, 03:08 PM Also couldn't we just blank for 1/48 of a second and then take the next frame on the next 1/24th of a second ... I heard something about a firmware upgrade needed.
I believe the firmware upgrade was just to allow a long blanking time. By setting it to 48 fps and then grabbing every other frame, we get the same effect. (Unless I'm missing something.)
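A minimal sketch of that keep-every-other-frame idea (illustrative only, not the real grabber code):

// Sketch: run the sensor at 48 fps but record only every other frame,
// delivering 24 fps with a ~1/48 s effective shutter.
#include <cstdio>

int main() {
    const int sensor_fps = 48;
    const int target_fps = 24;
    const int keep_every = sensor_fps / target_fps;   // keep 1 frame in 2

    for (int frame = 0; frame < 10; ++frame) {
        bool kept = (frame % keep_every) == 0;
        printf("sensor frame %d -> %s\n", frame, kept ? "recorded" : "dropped");
    }
    return 0;
}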
The problem is that if we clock the chip at a high MHz to satisfy the problem, and drop frames, then we're losing the ability to shoot slow-motion. Not good.
The way I envision it, the software would adapt to what you're trying to do -- if you want 48-60 fps, it would not try to skip frames in between. The skipping would only be valid in the 24-30 fps range.
Jason Rodriguez June 15th, 2004, 03:20 PM I see what you're saying, but then you have the problem of too much motion blur when shooting from 48-60fps. Typical film shutters work at half the frame rate. So for 48-60fps, we need a shutter that's working effectively at 1/96th to 1/120th of a second.
Steve Nordhauser June 15th, 2004, 03:23 PM Obin:
I think there are two sets of test images the gang would like to see. First is a horizontal pan with lots of vertical edges. This should show up the rolling shutter problems. Or, a stationary camera view and someone driving by.
Second, Les has asked to see two consecutive frames - no movement between them. He has a lot of experience in noise analysis and will help understand the requirements for image enhancement.
Group:
To sweeten the pot, anyone who *substantially* moves this project along - programming, research, testing - can have a 20% discount on a camera or bundle purchase from SI. Again, Rob coordinates the software, not me. Obin, you are clearly carrying your load on test image generation and ponying up for a camera purchase. You don't have to be a programmer, just take the lead in some valuable aspect of getting to usable tools.
Steve
Rob Scott June 15th, 2004, 03:33 PM Originally posted by Jason Rodriguez : I see what you're saying, but then you have the problem of too much motion blur when shooting from 48-60fps. Typical film shutters work at half the frame rate. So for 48-60fps, we need a shutter that's working effectively at 1/96th to 1/120th of a second.
With this particular chip, I don't think we have any alternative, unless we use a mechanical shutter.
Jason Rodriguez June 15th, 2004, 07:03 PM The Kinetta's not using a mechanical shutter, only an electronic one. I saw it myself. And it's going to have a lot of nice frame-rate features, so that must be feasible with this chip in an electronic shutter mode.
On second thought, the Kinetta's going to use an Altasens chip, so maybe there's something different with that chip than with the Micron.
Another thing we need to realize is that this Micron chip was made for digital still cameras (at least that's the way it's listed on their website). You can get very long exposures with those cameras, so how come there are no problems with rolling shutter artifacts? Also be aware of the fact that consumer digicams don't have any mechanical shutters, just electronic.
Obin Olson June 15th, 2004, 07:29 PM I am doing my best to lead this along... I should have been working today instead of this stuff but I was not ;) anyway here is an AMAZING codec
http://www.bitjazz.com/sheervideo/
that thing compressed the 1300 tiff files from 580megs down to 109megs with NO visible loss in quality whatsoever...amazing ...I was playing full-on HD off of my firewire drive today with NO problem at all...15-16megs a sec as it ran the clip!!!! then I took the image into After Effects and whacked it out to see if it would fall apart from the compression...not at all, it looked great! Much better compression than the Panasonic Varicam has.
And they are in BETA with a 10bit version of the codec!!! yeeeee
<<<-- Originally posted by Jason Rodriguez : Obin,
Right now Bitjazz is nice, but it's only 8-bit. I also believe that's real-time encoding for 8-bit SD video, not HD video.
On the rolling shutter topic again:
How slow or fast does the shutter have to be in order to get away from the artifacts? From the footage that I saw in the wmv file, there are no apparent stretching artifacts at all. How bad would it be at 24fps, with a 1/24 shutter speed? Also, couldn't we just blank for 1/48 of a second and then take the next frame on the next 1/24th of a second, so it's imitating a mechanical shutter at 180 degrees (like a film camera)? I think this was mentioned before, but I'm not sure what happened in the response. I heard something about a firmware upgrade being needed. -->>>
we can capture at 30fps and run the camera at 60fps, and then when we want to capture 60fps for slo-mo we just keep all frames..no biggie. Also I can still shoot at 24fps with a higher MHz like 54MHz on the camera; this will allow for a nice image like in the wmv file...if you put the MHz down to say 27 the rolling shutter is VERY VERY bad - as you pan, the top of the image lags way behind the bottom...I think you guys need to calm down about this..it's not that big of a deal and I know of at least 2 ways to deal with it..no problem
Steve Nordhauser June 15th, 2004, 09:32 PM Obin,
The truth is in the pudding, whatever that means. Since we know what kinds of shots will be the worst, shoot a few and post them. I'll try to get to it but I'm not impartial (Wanna buy a camera?) and I'm not a cinematographer, I'm an electrical engineer (anybody get the DeForest Kelley flashback from Star Trek? boy am I tired).
I think rolling shutter is only an issue for medium frame rate. If I were to expose for 1/8 sec and read out in a 1/60th of a sec, the top to bottom blur would be minor compared to any motion blur.
For testing with the current software and hardware, Obin, if you can't get the frame rate you want, use a smaller window. The frame rate is the issue - must be 24fps to be a valid test.
Bitjazz: No miracle, it is not lossless. 5:1 is not lossless. They might have a patent worth searching to see what they do (don't infringe - that is intellectual property). We do need to understand the losses. They could create their own artifacts under conditions we don't know about yet. Remember what I said about fully understanding any step that has a potential effect on the final image quality. Compressing before recording is the holy grail. This could be good. Even if we do the R, G, B separately into a non-standard file and provide a tool for crunching them back together. And we want 12 bits. But 8 or 10 is a fine start.
Joel Corkin June 15th, 2004, 11:28 PM Sorry Steve, I'm not really following this thread too closely, but the Bitjazz Sheer video codec certainly is lossless according to them and according to my own tests.
Rob Lohman June 16th, 2004, 03:10 AM Obin: we will calm down when we see some footage. Besides 24 fps I would also like to see the same footage at 48 fps to see how much it goes away. The easiest test is a car going from left to right and right to left with the camera at a 90 degree angle. The car should be driving at a real speed (i.e. not 10 mph or something).
It is not a problem you can fix. At 48 or 60 fps the problem will still be there, only less pronounced. So you might not see it, but it will be there, because it is inherent to a rolling shutter.
Can you please respond to my question list that I put up above?
Steve: it looks like (according to their claims) the codec is lossless. We could easily verify this. It looks like it is only out for the Mac platform at the moment, though. They also claim 2:1 to 2.3:1 compression themselves.
I think this adds up. Remember that Obin's TIFF files are 16 bit!! with only 8 bits of information in them. So if you divide 580 by two you get 290. Divide that by 109 and you get about a 2.6:1 compression ratio, which is basically what they claim.
I'm wondering if we can do something more efficient when storing the data to disk, since we know we will be working with Bayer data and they don't. We still need to do the Bayer conversion after this, so it might then be a very interesting codec to use for editing (which solves a lot of our problems). The Dalsa L3 compression is pretty similar in rate, but probably far, far more expensive.
There is a paper on compression for such chips which can be found at this location (http://ieeexplore.ieee.org/xpl/abs_free.jsp?arNumber=78401). I'm not an IEEE member though...
Everyone: they are looking for beta testers, so this might be interesting. I'm wondering if they support all sorts of resolutions as well.
Steve: I have been looking a bit into pre-recording compression before laying the data out to disk. I'd much rather have a single hard disk than two or more. Of course it should be lossless and fast.
Steve Nordhauser June 16th, 2004, 06:31 AM Rob on BitJazz:
On their web site, they claim 2.4:1 or so for YUV. They also say they can do RGB but don't give numbers. This is realistic for lossless. I was commenting on Obin's <<from 580megs down to 109megs with NO visible loss in quality>>. That is lossy. I'm sure that they are doing more than packing bytes. Using the basic tools (Huffman, RLE and some common variants) you can do 2:1 lossless. Very image dependent though. And *very* noise dependent. Remember that they are talking about images after the Bayer de-mosaic step, which puts that step into the real-time path. I'm thinking the best method is to create separate red, blue and green buffers and RLE each one. That should be doable at any resolution in real-time and saves the Bayer data for some real crunching later. We need to hear from someone deep into Bayer. I'll try to get to the library and get a copy of that article.
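A minimal sketch of that split-and-RLE idea, assuming an RGGB mosaic (illustrative only, not the project's actual code; a real encoder would pack counts and values more carefully):

// Sketch: split an RGGB Bayer frame into four per-color planes, then
// run-length encode each plane.
#include <cstdint>
#include <vector>

// Split a width x height Bayer mosaic (RGGB) into 4 quarter-size planes.
std::vector<std::vector<uint16_t>> split_planes(const std::vector<uint16_t>& bayer,
                                                int width, int height) {
    std::vector<std::vector<uint16_t>> planes(4);  // R, G (on R row), G (on B row), B
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            int plane = (y & 1) * 2 + (x & 1);     // 0=R 1=G 2=G 3=B for RGGB
            planes[plane].push_back(bayer[y * width + x]);
        }
    return planes;
}

// Very simple RLE: emit (count, value) pairs.
std::vector<uint16_t> rle_encode(const std::vector<uint16_t>& plane) {
    std::vector<uint16_t> out;
    size_t i = 0;
    while (i < plane.size()) {
        uint16_t value = plane[i];
        uint16_t count = 1;
        while (i + count < plane.size() && plane[i + count] == value && count < 0xFFFF)
            ++count;
        out.push_back(count);
        out.push_back(value);
        i += count;
    }
    return out;
}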
Obin Olson June 16th, 2004, 06:31 AM Rob, I don't think it's the framerate but what MHz you shoot at...I have shot at 24fps with huge rolling shutter and then put the camera MHz up and it goes away at 24fps
Steve, what I got was VERY acceptable for production and I could edit from a firewire disk drive! I would be happy with that codec..for really important stuff you can use 16bit tiff files
it's such a pain in the ass to capture with Xcap I will wait and see if streampix fixes the dll file and then try and shoot more...I have seen enough that I don't have a need for more. I do understand that you guys want more ;) and will do my best to help
Obin Olson June 16th, 2004, 06:38 AM 1) do you have a RAW 10 bit pre-bayer file for us? This CAN be a lower resolution STILL file. No movie, no high res.
I will look on the hard drive today..the problem is I never have them; I have 10bit packed into 16bit tiff files
2) which lens(es) are you using with your camera now?
el cheapo CCTV lenses, 25mm and a 75mm - yes I am sure the image would be better with a higher res lens
3) could it be the stuttering in that movie is due to your harddisk not keeping up?
I have issues with keeping the camera shooting at 24fps with Xcap, it could be my hard disk sure
4) which camera settings did you use to record that movie? (gain / exposure)
that is 3 shots...they are all 8 bit. the truck was exposed for the hot background and then I pulled truck up from the shadows
the shot of me is exposed for my face and blows the crap out of the background
the flowers? dunno
5) did you do any color correction or other conversion to the material?
flowers are raw colors, everything else is "fixed"
6) Which Bayer algorithm did you use?
dunno whatever Xcap uses
These are important questions to the people who are looking at also getting a chip or not...
Steve Nordhauser June 16th, 2004, 06:50 AM Obin, you are doing great. On the XCAP, try recording to memory since your test clips are small, then you should not stutter. Norpix should have your new .dll files today.
I'm a little unclear - you are able to record at a fixed frame rate independent of the pixel clock? That is news to me - I will have to check it out. I would suspect you occasionally get duplicated frames - I will talk to the Epix technical people about this.
XCAP uses a very basic Bayer as default. I'm pretty sure that you can change that under the 'colors' tab in the left hand camera control window. I think they support a number of algorithms (just saw this yesterday) in the latest version. One of the nice things about Epix and Norpix is that they never charge for upgrades - just download the latest version.
Obin Olson June 16th, 2004, 07:01 AM not sure Steve, I can just type in 54mhz and then shoot at 24fps..don't ask I have no idea! :) but it seem to work..Maybe it's not working at all, it's hard to tell when you can't just type in 24fps on this crappy software..What I do is slide the frame period up and down till the "video" and "Display" read about 24fps seems like they vary from 23.97 -24.2 ...this is one big reason we need software that LOCKS in 24fps 30fps 48fps etc...it's got to be a constant framerate, no fluctuation
Wayne Morellini June 16th, 2004, 09:29 AM The Sumix info is up at the Viper thread, and it is good, Altasens sensor too, but no 3-chip yet.
This is something I forgot to post last night which I just mentioned in the Viper thread, and that may be of some interest to Rob, Rob, and Steve:
I have some cheap tech that could do the compression without an FPGA, and be programmed in weeks. I spotted it some time ago - it has an array of parallel RISC processors with memory and FPUs, at extremely high performance and low power - but only just remembered it. That would speed up development of this part of the circuit 10 times. Steve, maybe you would like to pass it along? This is not the other tech I mentioned before that was mainly for PC processing. I am such a dope not to remember this earlier; this would let even a low speed PC do all the capture, compression, and editing you like (as long as the software can take advantage of it).
www.clearspeed.com/news.php?page=pr&pr=2
www.clearspeed.com/technology.php?PHPSESSID=4c2189afe776e9261cc7694795e722fb
www.clearspeed.com/products.php?page=si
www.clearspeed.com/products.php?page=faq
I have included all these links as it is hard to get an idea of the extent of its capabilities from one page.
Summary:
- from 10's to thousands of RISC 32-bit processors with floating point units
- programmable in C (Rob could move code to it).
- can act as main processor, or coprocessor
- very low power at very high performance, or even more performance at higher power levels.
- 64 bit version coming.
With something like this, a Camera Link compression board could be within reach of the Robs.
Jason Rodriguez June 16th, 2004, 09:59 AM We better watch out with recording at 24fps. For the guys in the audio department, that could end up being a REALLY bad idea. Right now, everything shot for television at 24p is really shot at 23.976 to keep compatibility with 29.97 NTSC. If you shoot at 24p and then try to go back to NTSC, and your audio is at 24p, that could spell problems. These are just a couple of things I've learned from shooting on the F900, where you have the choice between both 23.976 and 24p. The 24p option and NTSC always meant trouble.
BTW, normally with film, the audio is slowed down and synced with the video in the telecine, which is actually also slowing the film down from 24fps to 23.976.
Rob Scott June 16th, 2004, 10:08 AM Jason Rodriguez wrote:
Right now, everything shot for television at 24p is really shot at 23.976 to keep compatibility with 29.97 NTSC.
I may not have mentioned it, but from the beginning I was planning to have a standard vs. NTSC timebase option.
So in the UI you would have the option of any frame rate -- 24, 25, 30, 48 -- even 27.5 I suppose -- and then separately you would select NTSC timebase. I think that would keep the UI a bit cleaner -- what do you think?
Obin Olson June 16th, 2004, 12:27 PM hmm...streampix is working now, just captured 48fps 8bit! using an IDE RAID 0 two-drive setup now...gets 51megs/sec, a single SATA drive gets 40 or so....looks like we may have trouble with rolling shutter at higher framerates..not sure why but I can see it playing 48fps at 24fps..I will do a better test outside soon
Wayne Morellini June 16th, 2004, 01:33 PM <<<-- Originally posted by Steve Nordhauser :Wayne: USB 2.0. To get high data rates, you eat a fair amount of CPU time. Also, transfer rates are highly dependent on the host controller. Intel ICH4 and 5 south bridges are 2-3x faster than external controllers. The typical USB interface chip has small FIFOs making real-time tough at high rates when you don't want to drop frames. We don't compress in the camera because either you want lossless, which only gets 2:1 or lossy and then you need to be able to select a lot of variables. CPU compression and raw recording are more cost effective. -->>>
I thought that might be the case. I thought I heard about them upgrading their specs so that it would offload the CPU usage to a dedicated control circuit (much like firewire), but I'm probably mistaken.
I have to say all this stuff with the readout peak speeds maxing out the interfaces is disappointing (with 1 GB of buffer you could average them out to disk). Why don't they use buffer memory on the chip/camera? Then you could run the next Micron camera at 216fps (24fps*9) and blank 8 out of 9 frames - that would solve much of the problem.
<<<-- Originally posted by Steve Nordhauser : Rob on BitJazz:2:1 lossless. Very image dependent though. And *Very* noise dependent. Remember that they are talking images after the Bayer de-mosaic step which puts that into the real-time path. I'm thinking the best method is to create a red, blue and green separate buffer and RLE each one. That should be doable at any resolution in real-time and save the Bayer for some real crunching. We need to hear from someone deep into Bayer. I'll try to get to the library and get a copy of that article. -->>>
This is what I was thinking too. The Bayer data has a pattern, but if you take all the pixels of the same colour you get three separate gradient monochrome images that the fax compression routine you mention could probably do a good job on (that was a lossless routine, wasn't it?).
About that compression performance of 5:1 (was it really 5:1?): I think that any routine that reliably gets rid of noise before compression to get higher compression is acceptable, and it is preferable to get rid of it at some stage anyway. RAW yes, but we are going to nuke this noise anyway, and a couple of other things. Am I right Steve, that if we preprocess the noise, Bayer (split to three mono files), and artifacts, we can achieve higher than 2.5:1 average lossless compression?
Thanks
Wayne.
Rob Scott June 16th, 2004, 01:45 PM Steve Nordhauser wrote
I'm thinking the best method is to create a red, blue and green separate buffer and RLE each one. That should be doable at any resolution in real-time.
I had that very idea! Once I get a camera in-hand I'm going to try separating the channels out into separate chunks and do zlib compression on them. I also found a very fast real-time compression library that I'm going to try out. Depending on the compression, we might be able to get far higher frame rates with a single drive.
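A small sketch of what compressing one separated plane with zlib could look like (Z_BEST_SPEED to favor real-time; buffer handling and error checking kept minimal, not the capture software's actual code):

// Sketch: deflate one separated color plane with zlib at the fastest setting.
// Link with -lz. Real capture code would reuse buffers and stream to disk.
#include <cstdint>
#include <vector>
#include <zlib.h>

std::vector<uint8_t> compress_plane(const std::vector<uint8_t>& plane) {
    uLongf dest_len = compressBound(plane.size());     // worst-case output size
    std::vector<uint8_t> out(dest_len);
    int rc = compress2(out.data(), &dest_len,
                       plane.data(), plane.size(),
                       Z_BEST_SPEED);                   // favor speed over ratio
    if (rc != Z_OK)
        out.clear();                                    // minimal error handling
    else
        out.resize(dest_len);                           // shrink to actual size
    return out;
}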
then you can run the next Micron camera at 216fps (24fps*9) and blank 8 out of 9 frames
I think that would make the shutter speed too fast and would tend to give unblurred frames. Good for sports, perhaps, or a "Saving Private Ryan" effect, but not for everything.
Les Dit June 16th, 2004, 01:50 PM Rob,
I've found that noise really limits the compression of digitized images. Using LZ compression on 10 bit images only gave me maybe 30%.
A lot of the codecs and compression schemes are a lot trickier when you have 10 or 12 bits of which the last few bits have a significant noise component. It's very important to keep those bits for color correcting later.
-Les
Steve Nordhauser June 16th, 2004, 01:51 PM Wayne on compression:
I think it was Obin that was talking about 500MB of files in about 100MB of space. True lossless will max out around 2.5:1, no matter what you do. Certainly the smaller the black level offsets and gain non-linearities, the better the compression, because then a smooth color will give smooth values.
Wayne on speeds: Yes, buffering in the camera or grabber would reduce peak speeds to average. Not sure if it is worth the cost.
Wayne on high speed: very fast is expensive - the Micron parts are rated at something like a 60MHz clock, Altasens at 75MHz. With internal A/Ds, you can't run much faster. Really fast parts like the Micron 1.3Mpix with global shutter have more taps out of the array and run them in parallel - that one is a 10 tap.
Jason on rates: We have a programmable clock generator that works off of a base frequency. If we can't get to the exact number, we can change the base frequency. Using 23.976 as the target frequency, we can hit 23.977 with a 45.5ppm error. For Obin and Rob, that is a value of 0x35BD8F to the clock generator.
Rob Scott June 16th, 2004, 02:00 PM Les Dit wrote:
Using LZ compression on 10 bit images only gave me maybe 30%
Well, that might still give me enough breathing room to get 30-40 fps on a single hard drive. Or prevent hiccups when Windows has to "do something" during capture.
Steve Nordhauser wrote:
Using 23.976 as the target frequency, we can hit 23.977 with a 45.5ppm error
Interesting. Does anyone know how much error we can tolerate? Is there some tolerance specified in the NTSC and/or ATSC standards?
If we can't get close enough, I guess we'd just have to adjust the speed of the audio to match the video, like the "old" days of shooting silent film and wild sound. That's OK with me, doing indie filmmaking, but I'm not sure it's going to be OK with everyone...
Les Dit June 16th, 2004, 02:02 PM A good test for the rolling shutter artifact is panning across some vertical objects, like some buildings or a picket fence.
The image will look 'trapezoidal', as if for each scan line the image is offset by a pixel or so.
Most consumer still digi cams *do* have a shutter.
I have an el-cheapo CMOS one that has a 2MP sensor, and it has horrible rolling shutter problems because it has no mechanical shutter. I took it apart, and I think it has either a Zoran or Micron sensor in there. Too bad it JPEGs it all, so I can't judge the image quality.
You may not see problems on a passing car, as cars these days don't have many straight edges, they look more like a used bar of soap :).
I think it will be essential to allow a longer integration time, and a short readout. Otherwise the image will look like it was made of Jello as you pan around, for lack of better words!
Any luck posting two Bayer images that we can look at for noise content?
-Les
Steve Nordhauser June 16th, 2004, 02:06 PM Les,
You are sure you were seeing rolling shutter problems and not interlacing? Interlaced images have their own problems with panning - the odd and even fields are one frame off so you get sawtooth edges - much worse than a RS effect.
Les Dit June 16th, 2004, 02:08 PM Indie film makers get away with shooting 25 fps in PAL and transferring it to film at 24 with NO CONVERSION at all !
Close enough for them !
Steve,
Most definitely rolling shutter.
It's a SiPix still camera, a sub $100 still camera.
-Les
<<<-- Originally posted by Steve Nordhauser : Les,
You are sure you were seeing rolling shutter problems and not interlacing? Interlaced images have their own problems with panning - the odd and even fields are one frame off so you get sawtooth edges - much worse than a RS effect. -->>>
Rob Scott June 16th, 2004, 02:22 PM Les Dit :
Indie film makers get away with shooting 25 fps in PAL and transferring it to film at 24 with NO CONVERSION at all !
Close enough for them !
True enough! There is plenty of software available now (I think Audacity will handle it) for adjusting audio without changing the pitch. (And a minor change in the pitch is usually no big deal -- for dialogue anyway.)
Steve Nordhauser June 16th, 2004, 02:40 PM Les: the two are not related. Rolling shutter is the sequential reading of lines in a frame while the other lines integrate. Interlacing is reading every other line each frame, so you traverse the image top to bottom twice as fast. The opposite of interlaced is progressive - no skipped lines for each frame read. You can have a rolling shutter that is interlaced or progressive. All of ours are progressive.
Rob Lohman June 16th, 2004, 03:39 PM I'm pretty sure that 23.976 or 23.977 will be good enough. But then again, I ain't in NTSC land and we have rock solid frame rates here at exactly 25 fps, for example.
Steve & Rob: we are all on the same line in regards to first separating the planes and compressing them separately. I've just written a TIFF to RAW Bayer conversion program, so we can now test with basically any 16 bit TIFF. This gives us some freedom. The program is in Rob S.'s possession as well.
Wayne: I did my first programming in low-level assembler before moving to C(++) and then on to the Windows platform. I'm pretty sure I know exactly how a PC works internally and how Windows works as well. I've written assembler boot-loaders and some low-level Windows stuff. The only thing I ain't really good at is anything with Unix/Linux on the PC. Oh well...
Steve: do you happen to know if the 16-bit values are coming in BIG or LITTLE endian order? In other words, do we get the high byte first or the low?
A question regarding future cameras: will they be controllable and connectable in a similar fashion? In other words, minor changes to the software?
Obin Olson June 16th, 2004, 04:30 PM ok StreamPix works well..I shot 3500 frames in one set today at 12bit 24fps..this is very good...the workflow from StreamPix is hell and goes like this:
1 Open XCAP and set the camera exposure, frame period, gain, ROI and MHz for each and every shot we shoot; save the fmt camera file from XCAP
2 Open streampix and load fmt file so that the camera is ready for the shot
3 Make new file for sequence
4 Shoot
5 Open shot black and white sequence with streampix
6 Load Bayer converter filter and make sure files will output in the right place with the right Bayer RGGB
7 Save sequence to disk for a long long time as the black and white sequence is being converted to color sequence
8 Open HUGE color sequence in streampix and output tiff files or avi or jpeg and wait for a very long time
9 Open what you have just saved and compress with some codec like SheerVideo so you can edit and finish project
almost all steps above should be this:
1 Open stream pix and in streampix set camera for shot incl. shutter fps etc
2 Shoot to sequence file
3 After shooting open sequence file and de-Bayer the image to color and save with a codec like SHeerVideo
4 Edit final footage
what's up with this board??
oh ya, almost forgot: it takes over 45min to deal with the above files to get them into a SheerVideo codec QuickTime compressed file! this is silly and will never work for professional work unless we can make fewer steps to get a SheerVideo file
and don't forget the 22min it takes to transfer all the tiff files to the videoserver over gigabit lan
Steve Nordhauser June 16th, 2004, 05:01 PM Obin,
In terms of LAN speed, we have found that streaming over a gigE LAN is limited to about 250Mbps from one machine to another using the Windows drivers. Even the raw data for 3500 frames is 1280x720x16x3500 = about 52 Gbits.
At 250Mbps, that should take about 206 seconds or 3.5 minutes. That is at 16 bits per pixel.
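Worked out as a quick sketch (250Mbps is the effective rate quoted above):

// Quick check of the transfer-time math: 3500 frames of 1280x720 at 16 bits
// per pixel over a ~250 Mbit/s effective gigE link.
#include <cstdio>

int main() {
    const double bits = 1280.0 * 720.0 * 16.0 * 3500.0;  // ~5.16e10 bits (~52 Gbit)
    const double rate = 250e6;                            // effective link rate, bits/s
    printf("data : %.1f Gbit\n", bits / 1e9);
    printf("time : %.0f s (~%.1f min)\n", bits / rate, bits / rate / 60.0);
    return 0;
}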
For bigger shoots, if you don't need a raid or maybe only a two drive, use removables (slide into a chassis slot right on the IDE bus, not firewire) to move the data around. Just watch the heat - I don't think the removables cool as well as the fixed.
Jim Lafferty June 16th, 2004, 05:30 PM Is anyone looking into building a simple control system that would send the footage as is to some stacked drives?
I'm following these threads the best I can and keep reading about Obin's plans to build a small miniATX machine, but why not just have SATA drives in an enclosure that's within reasonable size/weight/power consumption specs ala the Vance Cam?
What sort of intercept system would have to be built to catch the stream and compress it on the fly?
What I'm getting at is -- I'd rather capture footage to disk without lugging a computer around with the cam. I want to just get the footage on site and then bring it back to a machine for post.
Is this possible? If so, can it be done with our resources?
Also, I mentioned elsewhere that it would be neat if custom "looks" could be generated as small files -- like BIOS flash files -- that we could all trade online as filmmakers. Based on what I know of the project as is, this isn't really feasible given that the camera is technically already built -- we're just figuring out innovative ways to harness its footage.
To my mind it would be amazing to be able to assemble an interface that would apply different effects to the footage, with a way to preserve and select among settings. These prefs would be output to a small file, and could be downloaded and installed to the cam. In this way you could change the looks of your footage as easily as you do ringtones on your cell phone. A community of open-source mod'ers could build visual "filter" toolkits this way.
Steve -- what kind of control is possible for the ambitious end user with regards to the image's latitude and other characteristics? Is it possible that they could be changed on the fly and plugins could be built for them?
Thanks for all contributing,
- jim
Obin Olson June 16th, 2004, 05:42 PM ouch..it says it will take 58min to convert the 16bit tiff files into the beta Sheer codec....guess this could be because it's on the network drive and/or it's dealing with a high bit depth.
Jim, you're thinking big, and that's good, but we have very basic problems at the moment, like what to do with all the data this camera spits out...how to deal with high bit depth images, and the list goes on and on and on...arggghh I am getting tired of it! I wish we had a magic button for this!
what the heck am I doing with 16bit tiff files anyway? the 1300 camera outputs 10bit that gets captured at 12bit and converted to 16bit tiff...what the HECK is that about ?? anyone? why does After Effects edit in 8 or 16bit ONLY what about 10 and 12bit?
maybe we should stay with simple 8bit files...everything works with them....argghh
Steve Nordhauser June 16th, 2004, 05:57 PM Obin on transfer time:
Don't do any processing on video on a network drive - move it locally and then back.
Jim on a control system:
A couple of things to watch. First is cost - PC motherboards are pretty cheap. If you are grabbing camera link, you could embed your own interface - not too complicated. How will you do the viewfinder? Pretty easy with a PC chassis. What about a smaller footprint like PC/104+? I know there are frame grabbers out there for it. Not extendable to 64 bit, as far as I know. There are some other standards for SBCs (single board computers) like this EBX format:
http://www.orbitmicro.com/products/embedded%20boards/socket%20370/EM370B.htm
It has PC/104+ so you could attach a frame grabber. There are many - don't get hung up on my first google search. You might find one with a RAID controller. Motherboard and PCI grabber will always be cheaper.
Jim on software filters:
It should be easy to do some kind of plugins in version 1.1 or so of the software. Color balance is just a 3x3 transformation table. I think Photoshop lets you build your own filters by editing the table. Hey, why not use the standard Photoshop plugins, or whatever video editing tool uses plugins?
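A tiny sketch of applying such a 3x3 color transform (the coefficients themselves would come from calibration or a user preset; the 0..1 working range is just an assumption here):

// Sketch: apply a 3x3 color matrix to an RGB pixel. White balance / color
// balance presets could just be different coefficient sets loaded at runtime.
#include <algorithm>

struct RGB { float r, g, b; };

RGB apply_matrix(const RGB& in, const float m[3][3]) {
    RGB out;
    out.r = m[0][0] * in.r + m[0][1] * in.g + m[0][2] * in.b;
    out.g = m[1][0] * in.r + m[1][1] * in.g + m[1][2] * in.b;
    out.b = m[2][0] * in.r + m[2][1] * in.g + m[2][2] * in.b;
    // Clamp to the working range (0..1 here, purely for illustration).
    out.r = std::clamp(out.r, 0.0f, 1.0f);
    out.g = std::clamp(out.g, 0.0f, 1.0f);
    out.b = std::clamp(out.b, 0.0f, 1.0f);
    return out;
}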
David Newman June 16th, 2004, 06:02 PM Obin,
Well you already know the problems of 8bit, that is what you are trying to avoid. :)
Regarding the 10 or 12bit issue: there is little point for applications to support native 10 or 12 bit workflows when computers work better on 16bit. Our codec is 10bit yet our workflow is 16bit; this is completely normal. 10 or 12 bit data is simply left shifted to be processed as 16bit. This is then reversed when exporting back to the codec. The extra interim precision is a good thing.
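A tiny sketch of the left-shift David describes (10-bit example; 12-bit data would shift by 4 instead of 6):

// Sketch: promote 10-bit samples to 16-bit for processing and demote on export.
// Left-shifting by 6 spreads 0..1023 over 0..65472 so 16-bit tools see sensible
// values; shifting right by 6 restores the original 10-bit code.
#include <cstdint>

inline uint16_t to16(uint16_t v10) { return static_cast<uint16_t>(v10 << 6); }
inline uint16_t to10(uint16_t v16) { return static_cast<uint16_t>(v16 >> 6); }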
Obin Olson June 16th, 2004, 06:07 PM sorta like filling your gas tank 3/4 the way full? so the image file is not bigger then a 10bit EVEN though it's 16bit?
David I sent you 2 cd-roms with some raw data on them, HTH
So we need to have a 16bit compression codec to edit with and a way to edit it in a normal edit system like Premiere Pro. We could do our color work in 16bit, save the files for editing in 8bit, and then master in 8bit onto DVD, HD-DVD, MPEG-2 HD, DVCPRO HD, etc.?
here is the card to get for this system
http://www.highpoint-tech.com/USA/rr1820a.htm
up to 1200MB/sec with 8 SATA disk drives and the card is under $200!!
Jim Lafferty June 16th, 2004, 07:48 PM Obin and Steve -- thanks for the responses.
I know I might be getting ahead of the game a bit, Obin, but I'm trying to think ahead conceptually so we can best accommodate certain design ideas from the outset. I find this helps when trying to avoid potential future pitfalls.
That said, I've a friend who's one of the PHP codifiers (not just a coder -- he helps standardize new PHP commands). He's got a lot of coding knowledge and is doing render/compression stuff for 3d environments at the moment. I'm hoping to persuade him that this project might be worth his time.
I've also taken the liberty of setting up a quick-n-dirty site of still images from the footage you've provided, Obin. Here it is (http://ideaspora.net/HD/).
I hope you don't mind -- I'm just trying to get the word out and I find the images are worth many words :D
Also, it helps for people who don't have WM9HD installed.
The site will be updated with condensed info and links to footage, images, etc. as I have time and as the results here merit.
Thanks again for the work you're all doing!
- jim
Les Dit June 16th, 2004, 08:49 PM I've been dealing with the bits problem for about 10 years.
For now, just to save disk space, I recommend using the Cineon ( fido ) format, it puts three 10 bit rgb code values into 32 bits ( 4 bytes ), so you only waste 2 bits per pixel.
That's for RGB demosaicked frames.
You could pack the bits in a similar fashion for the raw bayer image, and decode it later. It's all in the name of saving disk space.
Like I said before, LZ or RLE won't help much with > 8 bits.
But if you have to, LZ TIFFs would crush out the wasted empty bits to give similar file sizes, but with more CPU to get there.
-Les
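A sketch of the Cineon-style packing Les describes - three 10-bit codes in one 32-bit word, two bits unused (header and byte-order details of the real Cineon/DPX format are left aside):

// Sketch: pack three 10-bit RGB code values into one 32-bit word, Cineon-style,
// leaving the low 2 bits unused.
#include <cstdint>

uint32_t pack10(uint16_t r, uint16_t g, uint16_t b) {
    return (uint32_t(r & 0x3FF) << 22) |
           (uint32_t(g & 0x3FF) << 12) |
           (uint32_t(b & 0x3FF) <<  2);
}

void unpack10(uint32_t word, uint16_t& r, uint16_t& g, uint16_t& b) {
    r = (word >> 22) & 0x3FF;
    g = (word >> 12) & 0x3FF;
    b = (word >>  2) & 0x3FF;
}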
Rob Lohman June 17th, 2004, 02:32 AM Okay, let's all slow down a bit. Obin: I understand your frustration, but this is what a lot of people have predicted and we knew would happen.
That's why we are trying to design a system that works like a camera and not something with 20 steps to go through. As I explained earlier, this is NOT going to happen overnight. Although Rob S. and myself are working on details and testing things, we both have jobs and Rob S. is still waiting on his camera.
Development of software, testing algorithms, optimizing, testing systems and generating a workable platform takes time. Lots of time. Were you expecting to be up and running in a couple of weeks? That's just unrealistic.
David: is 10 or 12 bits always left justified? Is there a reason for it being left instead of right?
Jim: the main problems are bandwidth / datarate and the system to put in between the camera and the harddisk (or a set of harddisks). Two programmers (Rob Scott and myself) are already on the project so, yes, a middle system is being worked out, but don't expect the "magic button" (as Obin calls it) next week or so.
Les: Rob and myself pretty much agreed on doing no Bayer conversion in the "camera" section. That takes way too much time and resources to do well, and we don't have those (we must process a very large data stream).
So we are going to write the RAW stream coming from the camera to disk in at least a packed form. This means that 10 bits per pixel turn into 40 bits / 5 bytes per 4 pixels instead of 8 bytes.
Storing it in Bayer form (10 bit) gives us a data reduction (not a true reduction, since this is what is coming from the camera, but a reduction in comparison to full demosaicked RGB data) of 3:1. This reduces bandwidth 3 times, or 67%. Then we at least store it packed, which saves us another 37.5%.
So for every 24 bytes (not bits) of RGB data we only store 5, which is a 4.8:1 compression or a reduction of 79.2%. This is without losing any quality (except for the Bayer filter, but you can't get away from that on a single chip system anyway, except with a Foveon chip).
Since the camera is sending us raw Bayer at 16 bits, this gives us a 37.5% reduction or a 1.6:1 compression.
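A sketch of that 4-pixels-into-5-bytes packing (the bit layout here - MSB-first concatenation - is just one possible choice, not a defined stream format):

// Sketch: pack four 10-bit Bayer samples into 5 bytes (40 bits) instead of 8.
#include <cstdint>

void pack4x10(const uint16_t in[4], uint8_t out[5]) {
    uint64_t bits = (uint64_t(in[0] & 0x3FF) << 30) |
                    (uint64_t(in[1] & 0x3FF) << 20) |
                    (uint64_t(in[2] & 0x3FF) << 10) |
                    (uint64_t(in[3] & 0x3FF));
    for (int i = 0; i < 5; ++i)
        out[i] = uint8_t(bits >> (8 * (4 - i)));       // big-endian byte order

}

void unpack4x10(const uint8_t in[5], uint16_t out[4]) {
    uint64_t bits = 0;
    for (int i = 0; i < 5; ++i)
        bits = (bits << 8) | in[i];
    for (int i = 0; i < 4; ++i)
        out[i] = uint16_t((bits >> (10 * (3 - i))) & 0x3FF);
}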
Rob S. and myself (and everyone is invited, of course) are looking into ways to further LOSSLESSLY compress this. Rob has already seen some codecs. I'm a bit worried about speed; we should keep this as fast as possible (speed over compression) to deal with the bandwidth (see below).
At the moment we are not looking at any compliant file format because I doubt there would be one for the Bayer format (except RAW camera files; Canon has CRW, I think).
After it is recorded the signal must be dealt with on a computer:
1) convert from Bayer to full RGB with the best (selectable) algorithm
2) do things like white balancing / color correction (can only be done with the full RGB data)
3) write it out to a standard format / codec (perhaps with smaller resolution proxies to ease the editing)
Now, I hear a lot of people thought the camera would do white balance and color presets and whatnot. Don't count on it. We probably don't have the processing power. Plugins for the second phase of the capture process should be no problem. The problem with that is that the data must be converted into full RGB first before we can do anything with it. This also increases our storage three-fold. What we might do is some lower resolution stuff for the viewfinder / monitor.
Instead of full Bayer you simply make 1 pixel out of every macro block that contains 4 pixels (2 horizontal, 2 vertical). This gives you a 50% reduction in each dimension and a very fast de-Bayer algorithm. So a 1280x720 image becomes 640x360, which is more easily displayed on a standard monitor as well. We might have time to do a bit of color-shifting (white balancing) on that, and it might be possible to save such presets with the stream so the second stage on a full blown computer can take them as the default setting.
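A sketch of that quick 2x2 viewfinder de-Bayer (RGGB mosaic assumed; not the project's actual code):

// Sketch: collapse each 2x2 RGGB block into one RGB pixel for a half-resolution
// viewfinder preview (1280x720 -> 640x360). No interpolation, just averaging
// the two greens -- fast, and good enough for framing and focus.
#include <cstdint>
#include <vector>

struct RGB16 { uint16_t r, g, b; };

std::vector<RGB16> preview_debayer(const std::vector<uint16_t>& bayer,
                                   int width, int height) {
    std::vector<RGB16> out((width / 2) * (height / 2));
    for (int y = 0; y < height; y += 2)
        for (int x = 0; x < width; x += 2) {
            uint16_t r  = bayer[y * width + x];            // R at (even row, even col)
            uint16_t g1 = bayer[y * width + x + 1];        // G at (even row, odd col)
            uint16_t g2 = bayer[(y + 1) * width + x];      // G at (odd row, even col)
            uint16_t b  = bayer[(y + 1) * width + x + 1];  // B at (odd row, odd col)
            out[(y / 2) * (width / 2) + (x / 2)] = { r, uint16_t((g1 + g2) / 2), b };
        }
    return out;
}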
So for now it will definitely be a two stage process: capture and store, and then a second phase to transform and transcode to a different format.
The most problematic things in this project are (in my opinion):
1. working out a system to go inbetween the camera and storage (full PC, FPGA, embedded chips etc.)
2. working out a storage system that can keep up (or compression)
3. getting a viewfinder / monitor out
Now to give everyone a low down on some data rates (again):
1280 x 720 x 24 fps x 8 bit, full RGB = 63.28 MB/s (126,56 MB/s)
1280 x 720 x 24 fps x 10 bit, 16-bit full RGB = 126.56 MB/s (253.12 MB/s)
1280 x 720 x 24 fps x 8 bit, Bayer = 21.09 MB/s (42.19 MB/s)
1280 x 720 x 24 fps x 10 bit, 16-bit Bayer = 42.19 MB/s (84.38 MB/s)
1280 x 720 x 24 fps x 10 bit, packed Bayer = 26.37 MB/s (52.73 MB/s)
1280 x 720 x 24 fps x 12 bit, packed Bayer = 31.64 MB /s (63.28 MB/s)
The numbers between the () are for 48 fps. I've added 12 bit on the last row to see where it might be headed. So the most optimal size that is also very easy to implement (10 bit, packed Bayer) is the one that will definitely be there. Whether or not we can support any further compression depends on a lot of things (licenses, speed, resources etc.).
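The rates above can be reproduced with a few lines (MB here meaning 2^20 bytes, which is how the figures work out):

// Reproduces the data-rate table: pixels/sec x bytes/pixel, expressed in MB/s.
#include <cstdio>

int main() {
    const double pixels_per_sec = 1280.0 * 720.0 * 24.0;   // 24 fps
    const double MiB = 1024.0 * 1024.0;
    const struct { const char* name; double bytes_per_pixel; } modes[] = {
        { "8-bit full RGB          ", 3.0  },
        { "10-bit (16-bit) full RGB", 6.0  },
        { "8-bit Bayer             ", 1.0  },
        { "10-bit (16-bit) Bayer   ", 2.0  },
        { "10-bit packed Bayer     ", 1.25 },
        { "12-bit packed Bayer     ", 1.5  },
    };
    for (const auto& m : modes)
        printf("%s : %6.2f MB/s (%6.2f MB/s at 48 fps)\n",
               m.name, pixels_per_sec * m.bytes_per_pixel / MiB,
               2.0 * pixels_per_sec * m.bytes_per_pixel / MiB);
    return 0;
}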
Steve: can you explain to me in a little sentence what PC/104 is? Is it like a form of mini-PCI or PCMCIA?
Steve Nordhauser June 17th, 2004, 06:07 AM Rob L: PC/104 plus. One sentence- easy:
http://www.pc104.org/technology/pc104_tech.html
I'll elaborate. PC/104+ is a small stackable card format designed for industrial uses that mimics PCI-32 in signals. Instead of card edge connections (we've all had to reseat boards in computers) it uses connectors like hard drives use, and the boards screw together. Since it mimics standard PCI, designing a new board is usually just a relayout. That means there are a lot of boards out there. Price is definitely higher than PCI (sales volumes are lower), but possible for the integrated camera idea (CPU + LCD controller + disk controller), though the CPUs tend to lean towards lower speed, lower power. EBX is worth watching (same site). It is a somewhat larger embedded format that uses PC/104+ as an expansion board. If I were designing an embedded PC product, this is where I would be looking. It could also be a migration path for what is being developed without any software changes.
By the way, Rob L, nice laying out the project for everyone. I think you spelled out all the major first pass design decisions. A 640x360 viewfinder should be able to be done with a cheap, small LCD screen.
A non-programmer who would like to assist can look for a mini-PC box, like the Shuttle, with a low noise power supply (or the PSU from another source). Someone posted this:
http://www.logisysus.com/
I was told that it is better if you have a separate OS drive from the video. So, ideally, the box should be able to handle a small drive (laptop?) for the OS and two 3.5" drives for a RAID. Maybe removable. Definitely gigE. For people doing larger work (as Obin discovered), they may need to move it faster. Or we can be patient - I think 10gigE will be dropping quickly in price.
Valeriu Campan June 17th, 2004, 06:26 AM Steve,
I am coming back to the CMOS chip saga. I found something that looks almost incredible. I would like to know your opinion, or anybody else's of course.
The Lupa 1300 from FillFactory, probably the same manufacturer that supplies the Sumix camera:
http://www.fillfactory.com/htm/products/htm/Lupa1300/lupa1300.htm.
Its size is only 30% smaller than a full 35mm frame, it has a 62dB S/N ratio, and it is capable of 30fps. I think that for a start (1.3Mpixel Bayer), it is not a bad choice.
Any thoughts?
Obin Olson June 17th, 2004, 06:37 AM looks like a good chip..Steve?
I have a good idea: let's make the microATX computer run in RAM so when you turn it on you don't have to wait for Windows to load and load and load...yes??
Steve Nordhauser June 17th, 2004, 07:04 AM Fill Factory, the company that brought us the much maligned IBIS-5. This is the direct competition to the Micron MT9M413 http://www.micron.com/products/imaging/products/MT9M413.html
Noise spec is somewhat better, but nothing like the Micron 1.3Mpix rolling shutter or the Altasens. Simultaneous expose and readout - that is very good. 16 analog taps - very bad. This would be a very expensive camera (not a cheap sensor either).
I guess I haven't heard a consensus that you are willing to put up with noise and high cost (especially for 720p) to gain a global shutter and large pixels? Would you pay $4-5K for this camera? The Altasens will be in the same ballpark with 1920x1080 60fps true 12 bit.
Valeriu Campan June 17th, 2004, 07:15 AM 4-5k is tooooo much for a prototype. Altasens with those specs sounds more promising. What is the size of the sensor? Still 2/3 or larger? Don't forget that the big boys are looking at full frame 35mm (Dalsa, Panavision, Arri). Also many from this list are aiming for a similar outcome.
The look given by the combination of FOV and DOF is still an important issue for narrative long form projects.
Rob Scott June 17th, 2004, 07:25 AM Rob L - Thanks for the excellent summary! Do you mind if I adapt it for my wiki on the project?
Obin wrote:
I have a good idea: let's make the microATX computer run in RAM
It is possible to put a CompactFlash drive in a PC that looks like a standard IDE drive ... but that would only help if Windows can be installed in 1 GB :-)
Valeriu wrote:
What is the size of the sensor?
The AltaSens is 2/3" I believe.
Obin Olson June 17th, 2004, 07:31 AM I was talking about a RAM drive..can't you install Windows into RAM?
http://www.cenatek.com/product_rocketdrive.cfm
EDIT:
here we go:
http://www.bitmicro.com/products_edisk_25_ide.php
Eliot Mack June 17th, 2004, 07:42 AM <<<-- Originally posted by Rob Scott :
It is possible to put a CompactFlash drive in a PC that looks like a standard IDE drive ... but that would only help if Windows can be installed in 1 GB :-)
The AltaSens is 2/3" I believe. -->>>
Is the actual target platform for the camera Linux? If this is the case it may be much easier to get it onto a CompactFlash drive. Not sure what would have to be stripped from Linux, though. I may check this out a bit as I'd like to avoid a camera boot cycle.
Eliot
Rob Scott June 17th, 2004, 07:42 AM I was talking about a RAM drive..can't you install Windows into RAM?
I don't think so. The biggest problem is that RAM disappears when the power is off, so when you power on, the PC has to create the RAM drive from scratch. I suppose it's possible that there is a boot loader that will pull Windows off the hard drive and copy it onto a RAM drive first, but that doesn't seem much quicker than just booting.
Edit ...
I looked at the Cenatek device, and (unless I'm mistaken) it requires power all the time. When the power goes away, so do the contents of the RAM.
Is the actual target platform for the camera Linux?
That's a long-range goal. I'm not sure if/when that will happen, because I don't have much experience with development under Linux.
There are a number of "embedded" distributions of Linux, some small enough to fit on a single floppy. Linux on a CompactFlash card is definitely doable.