[image attachment: streak (duplication of part of image)]
Obin, it looks like a memory pointer was somehow reset to a point higher up in the picture during readout. That's why it is duplicating the image.
-Les |
Ok who knows of editing software that can edit 10bit?
I am back to square one....we don't have a way to edit this stuff! LOL..funny that could be... Rob, are you fully in love with the SDK from Epix HA HA HA ;) we are having troubles with it left and right..crappy stuff...did you find a way to capture the RAW black and white image and write that to disk AND display a live color preview? |
Speedrazor is supposed to support 10 bit, it supports Cineon images too...
|
I tried Speed Razor once on Cineons. It didn't like it; it threw some kind of "VGA memory" error. I think SR is very video-oriented and they tried to use the graphics card to do a bunch of stuff. I guess that was the way to do things back in those days.
I think SR is a dead product these days. Let me know if anyone knows whether it works on XP with the latest DirectX installed, and with an Nvidia card. Maybe it only worked on some oddball video card, I don't know. -Les <<<-- Originally posted by Juan M. M. Fiebelkorn : Speedrazor is supposed to support 10 bit, it supports Cineon images too... -->>> |
I've got a much older copy of Speed Razor (from about 6 years back), and I gotta tell ya . . . man, nothing but problems. Of course, things have come a long way in that time, but I got one word of advice . . . MAC
|
Obin,
Here's a great 10-bit 4:4:4 editing system that you can use: a dual 2.0GHz G5 (or go for the higher-end 2.5GHz, but dual 2s will do you fine if you're trying to save money). Also pack it with at least 2GB of RAM. Add an Atto UL4D (SCSI U320 dual-channel card) and a Huge MediaVault U320-RX, their dual-channel hard-drive array, which can easily keep up with the data rate of 10-bit 4:4:4 RGB. Then download the Decklink HD Pro driver and just install the 10-bit RGB codec. You'll have to convert all your file sequences over to the 10-bit codec, but once you do that, you'll have real-time 10-bit RGB 4:4:4 editing within FCP, no problem. BTW, you're also probably going to need the Decklink HD Pro to output your stuff back to tape, so you might want to include that in the price too, but it's a fairly cheap card at $2495. For monitoring your signal, get the HD-Link from Blackmagic. You can hook it up to a second 23" Cinema Display for 4:4:4 HD-SDI output on that LCD as a second monitor. Saw them at NAB, and they look absolutely fabulous. And they cost a whole lot less than a Sony 24P HD monitor! For color correction I suggest nothing less than Synthetic Aperture's Color Finesse. Go check it out at www.synthetic-ap.com. This app is amazing, and it works in a 32-bit-per-channel color space for impeccable quality. It's also very fast for what it does. Not RT, but you want this app, especially version 2.0. So:
Mac G5 - $3000 + RAM = $3300
FCP - $1000
Decklink HD Pro - $2495
Huge MediaVault U320-RX 1.2TB - $6789
Atto UL4D - $450
2x 23" Cinema Displays - $4000
HDLink - $1295
Color Finesse - $575
Total Cost: $19,904
Expensive, but not bad considering you're going to be editing in real time (not RT effects though) 10-bit 4:4:4 RGB, which runs around 215MB/s. If you take off the monitoring stuff, you'll save $3295, but you'll probably want a monitor, and an NTSC display is not going to cut it! The only other system I know that does this reliably is Discreet Smoke and Fire, but I don't think you can afford a quad-proc SGI Tezro and Smoke 6.0. |
That is all fine and dandy Jason, BUT I want the stuff compressed so that we can cut out the MediaVault..I can compress with a 10bit codec down to around 15-20 megs a sec...BUT what codec supports that, and what editor can do it? Can FCP work with SheerVideo 10bit yet? or?
I don't care at all about true 4:4:4 because it's not needed when you have a great codec that is 2:1 or 3:1 compression .. with NO artifacts @ 10bit |
Why not create our own codec? I haven't been able to get to it, but I'm looking at making a simple lossless QuickTime Component that would deal with bayer images... Should be able to get down at least to 18 or 20 MB/sec at 10bit 4:4:4, 100% lossless. Possibly as low as 15MB/sec...
720p 24fps bayer images are only 26MB/sec anyway... not sure why you'd need such serious gear for under 30MB/sec.... Even if we don't develop our own codec, you'd still be able to edit in black&white. Not the worst thing in the world... We could even give a lossy option for less fussy people (after all, visual effects firms routinely pass around frames as JPEG files, and they aren't always 100% quality). Like a JPEG-style thing but in 10-bit... I should add that converting bayer-originated images to 4:2:2 is pretty close to lossless, since red and blue are sampled every other pixel. However, keep in mind that green also affects chroma, so 4:2:2 will make your colored edges slightly blockier... |
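As a sanity check on the numbers being thrown around, here is a rough back-of-the-envelope calculation of the data rates involved (a sketch only: it assumes tightly packed samples and no container overhead, so real capture rates will run a little higher):

```python
# Rough data-rate estimates for raw Bayer vs. de-Bayered 4:4:4 RGB.
# Sketch only -- assumes tightly packed samples, no file/container overhead.

def bayer_rate_mb(width, height, fps, bits_per_sample):
    """One sample per photosite (Bayer mosaic)."""
    return width * height * bits_per_sample / 8 * fps / 1e6

def rgb444_rate_mb(width, height, fps, bits_per_channel):
    """Fully de-Bayered 4:4:4 RGB -- three samples per pixel."""
    return width * height * 3 * bits_per_channel / 8 * fps / 1e6

if __name__ == "__main__":
    print("720p 10-bit Bayer:  %5.1f MB/s" % bayer_rate_mb(1280, 720, 24, 10))
    print("720p 10-bit 4:4:4:  %5.1f MB/s" % rgb444_rate_mb(1280, 720, 24, 10))
    print("1080p 12-bit Bayer: %5.1f MB/s" % bayer_rate_mb(1920, 1080, 24, 12))
```

Packed 10-bit 720p Bayer works out to roughly 28 MB/s, in the same ballpark as the figure quoted above, and de-Bayering it to 4:4:4 RGB roughly triples the rate.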
For what you're trying to do the Media Vault isn't that expensive. I don't see why you'd want to cut it out.
Monitoring HD is going to cost you a whole bundle too. And if you edit with Sheer Video or your own compressed codecs, you're going to have no way to edit back to tape. For that you're going to need a card, and those cards use specific uncompressed codecs that your hard-drives aren't going to handle without dropping frames. How are you planning to get your HD-shot material out to the mass-market/broadcasters, or deliverable to clients who want to edit it on their own/have backups on the shelf? Why edit in HD if you're not going to an HD tape of some format? And again, if you're going back to tape, you'll either have to use the DVCProHD codecs and an AJ-1200A tape deck ($25,000) if you don't want the MediaVault, or you'll have to get a Decklink HD Pro card and rent yourself HDCAM, D-5, DVCProHD, or HDCAM-SR decks, and with that you will need the MediaVault. |
Ben, your idea is awesome: a JPEG-type codec with 10bit! I don't care about JPEG as long as we keep it at 10bit for color work! The bit depth is the most important thing IMHO...and the most lacking from the NLE people
Well, for us Jason all we need is the high-quality codec...everything we master is SD, so if we can work in HD and do effects in HD and color work in HD then the last step is SD..I don't want or need all that gear..not yet anyway |
If you're doing everything to SD, then you might want to try out the PhotoJPEG codec from Apple or the DVCProHD codecs from Apple/Panasonic.
FCP should be able to edit with Sheer Video, unless of course Sheer doesn't really support HD frame-sizes in FCP (which I think was our problem). I don't know of any 10-bit JPEG codecs out there except for something based on JPEG2000. At 720P sizes you could also probably make your own HD array with 4 SATA drives - that might be enough.
|
Well, that is what I was saying the last time I talked about JPEG2000 (open source projects JASPER, Libj2K, etc) and most of the people here condemned me.
JPEG2000 supports all the way up to 16-bit depth. Ben, weren't you the guy who always talks against compression?? It looks to me like things are changing very fast here, hehehe :) BTW, if anybody is interested, the nearest colorspace for a Bayer pattern is YUV 4:2:0 |
Juan, I just think there should be a 100% lossless pathway for people who want to maintain everything. However, realistically, this isn't necessary for most people. So something like JPEG at a high quality in 10bit is enough.
I agree with Obin -- I don't want a bunch of high-end gear that depreciates hundreds of dollars every minute. I have absolutely no idea how I'll be producing an HD master in the future, but I'm not concerned. For now, no one I know (including me) has an HD display (besides a computer monitor), so what's the point of producing an HD final product? I'm interested in finishing on SD DVDs. When HD DVDs are available, that will be my output option of choice. The point is flexibility. With uncompressed 10bit 720p, you have tons of options. But I won't be outputting to HD tape until there's a good, inexpensive option (better than HDV anyway), or someone with lots of money wants to see my movie on the big screen. :) - ben |
I think I've mentioned before I've used Jpeg 12 bit for years for feature work. It's close enough to lossless. Not gunna be real time. See the jpeg public domain code, it's free for you to compile however you want.
-Les |
How hard would it be to adapt DCRAW?
Juan and I were talking, and this seems like the best conversion out there that already has the source code available. You can find the source code here: http://www.cybercom.net/~dcoffin/dcraw/dcraw.c I was testing this out on some images, and they look really good. I've also emailed David to see if he'll implement converting TIFF files that are greyscale bayer to color files. He might, and he might not, so we'll see. |
Ok, here I go again putting forward info I know little about.
Just found a multimedia library, unfortunately only for Windows, which seems to include bayer conversion. It's here: http://www.gromada.com/mcl.html I'm not sure if this will work in this case though. (Ok, I just downloaded their app and it doesn't convert automatically. That's not to say that the library couldn't be made to do it.) I was also mucking around at home and had a look at the raw_cap.bin file in a hex editor, and the same file converted to an uncompressed TIFF. It's just a matter of cutting the header and end section from the TIFF and applying them to the bin file. I was going to put together a very basic Windows assembly tool but I'm a bit rusty. (Don't get excited, my 'programming' skills come from cracking in my younger years. Anything more than a hack job is beyond me.) Obviously, this is just the bayer greyscale image. I'm still trying to get my head around bayer but I found a site that explains it a bit. Here, tis http://www.siliconimaging.com/RGB%20Bayer.htm with some formulas. This one is better, with matlab scripts: http://www-ise.stanford.edu/~tingchen/main.htm
In Windows you can put the following in a .bat file:
COPY /B header.bin /B + raw_cap.bin /B + tail.bin /B file.tif
header.bin - a binary file containing the start of the tiff
raw_cap.bin - the still capture
tail.bin - the tail bit from the tiff
You will of course need to generate your header and tail bins. Do this by getting your image file, opening it in Photoshop, and saving it as an uncompressed TIFF. Open the raw image and your TIFF in a hex editor. Search for the first 5 or so bytes from the raw image in the TIFF file. Delete from there on, then save the result as header.bin. Open the TIFF again, search for the last 5 or so bytes from the raw file, and delete everything ahead of it. Save as tail.bin. In theory, this should make your raw bayer file into a bayer TIFF. You can do a similar thing in a Unix script so OSX users can do it too. I can post the binary files somewhere if you like, but until then you can give it a bash yourself. Raavin :) |
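For anyone not on Windows, the same header/raw/tail sandwich takes only a few lines of Python (a hedged sketch: it assumes you have already carved header.bin and tail.bin out of a reference uncompressed TIFF exactly as described above):

```python
# Wrap a raw Bayer capture in a TIFF container by concatenating the header
# and tail carved out of a reference uncompressed TIFF around the raw data.
# Equivalent to the COPY /B trick above; works on OSX/Unix too.

def wrap_raw_as_tiff(header_path, raw_path, tail_path, out_path):
    with open(out_path, "wb") as out:
        for part in (header_path, raw_path, tail_path):
            with open(part, "rb") as f:
                out.write(f.read())

if __name__ == "__main__":
    wrap_raw_as_tiff("header.bin", "raw_cap.bin", "tail.bin", "file.tif")
```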
Hey Guys,
I'm not sure if this is a problem or not, but as of right now, if you're trying to convert (especially from the Altasens) the RAW images to a color 16-bit TIFF, etc., even if it takes 1 second per frame (which is pretty quick), it's going to take 24 hours to process 1 hour of shot footage! Another thing to consider is that you can't edit with TIFF sequences, so you'll have to render some offline format, and again, that's going to take at least 1 second per frame to shrink the HD down to SD resolutions for editing (or to encode to Sheer, etc.), meaning again, at least another full 24 hours for 1 hour of shot footage. 2 seconds per frame would take 48 hours, etc. I'm trying to think if that will be a problem, namely a minimum of 48 hours on some of today's fastest gear (I have a dual 2GHz G5, so it's no slouch) to get your frames output, unless you invest in a render farm. Also, that render farm is going to need some very fat networking pipes, because if you simply hook up ethernet (if it's not Jumbo Frames on gigabit, and those switches cost $5,000) through a cheap hub, the amount of information you'll be passing around will eat up processor cycles and ruin the effectiveness of the render farm. That's why I mentioned a good switch with jumbo frames, which again can easily cost $5K or more. Anyways, I'm just wondering how this would apply to shooting a film. I guess you would develop as-you-go? Also beware that the uncompressed footage converted to RGB will take up around 1.2TB at 12-bit-per-channel for the 1920 Altasens and 1 hour of shot footage. Uncompressed RAW greyscale, by comparison, is far less - around 72MB/s from the Altasens at 24fps, which works out to roughly 260GB for an hour of RAW bayer footage. Not a show-stopper, but I guess bayer image processing and offline processing time-frames are going to be some things that need to be considered for a film production workflow. |
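The arithmetic behind those time and storage estimates is easy to check; here is a quick sketch (the per-frame times are the guesses from the post above, not measured benchmarks, and the 72MB/s figure is the packed 12-bit Altasens rate quoted there):

```python
# Back-of-the-envelope processing time and storage for 24fps footage.
# Sketch only -- seconds-per-frame figures are guesses, not benchmarks.

FRAMES_PER_HOUR = 24 * 60 * 60        # 86,400 frames in an hour at 24fps

def processing_hours(seconds_per_frame):
    return FRAMES_PER_HOUR * seconds_per_frame / 3600.0

def storage_gb_per_hour(mb_per_sec):
    return mb_per_sec * 3600 / 1000.0

if __name__ == "__main__":
    for spf in (0.5, 1.0, 2.0):
        print("%.1f s/frame -> %2.0f hours per hour of footage" % (spf, processing_hours(spf)))
    print("Raw 1080p Bayer at ~72 MB/s -> ~%.0f GB per hour of footage" % storage_gb_per_hour(72))
```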
You could process footage way, way, way faster than 1fps on reasonably decent hardware. Even if you're doing pretty sophisticated de-Bayering. Especially if the code is optimized (possibly with vectorization/SIMD).
Totally unoptimized, it would probably run at about 2-3fps. Vectorized, your main bottleneck becomes the drive speed. I'm sure you could get between 10 and 15fps. |
With LinBayer I'm only getting around 1.2fps on my dual 2Ghz G5
|
Ben, what are you basing this frame rate estimate on? How long does the software package the still photo guys use to demosaic take? I forget the name of the app, but it's for doing work on still camera RAW images.
-Les <<<-- Originally posted by Ben Syverson : You could process footage way, way, way faster than 1fps on reasonably decent hardware. Even if you're doing pretty sophisticated de-Bayering. Especially if the code is optimized (possibly with vectorization/SIMD). Totally unoptimized, it would probably run at about 2-3fps. Vectorized, your main bottleneck becomes the drive speed. I'm sure you could get between 10 and 15fps. -->>> |
Well, a long time ago I made a resizing filter for Avisynth that had a speed of around 18 fps on an Athlon 2000 for a resulting image of 1600x1200. Obviously it worked on 8-bit depth images.
It was really un-optimized and very alpha, with lots of comparisons, multiplications and divisions. It uses some techniques that are well suited for demosaicking (in fact I was upsampling color vectors based on Luma info, or Red and Blue based on Green for RGB). Maybe if I can get the time to code again I could release something in a month or two...... |
linBayer is totally unoptimized, and 100% single-thread -- ie, it's only using one of your processors, and using it extremely inefficiently. I won't optimize it until the code is a little more finalized.
Multithreading alone will give you an almost 100% speed increase, because it'll start using that second processor -- that would put you anywhere from 2fps to 2.4fps. Simple code optimization could probably give at least a 25% boost, to around 2.75-3fps. But linBayer is a prime candidate for vectorization, which could boost the speed even more, and for each processor. On a 2x2GHz G5, I wouldn't expect anything less than 8-10fps once it's fully optimized, and that's conservative. The real bottleneck is disk speed, not processing speed. AE is not designed for fast disk reads -- rather than reading a bunch of frames into memory at a time, it determines file dependencies on a frame-by-frame basis. So. Every. Frame. Is. Read. In. Separately. That's necessary to keep AE flexible, but if you have a specialized need like we do, it's unnecessary. A well-coded standalone app could read in a bunch of frames at a time, churn through them with multithreaded and possibly vectorized code, and spit them to disk in sequence. That way, the drive gets to do a nice big sequential read (which is where you get your burst data transfer rates), your processor gets to work directly from RAM, and then it goes back to disk in a big fast sequential write. Additionally, processing will probably be faster in general than in AE, because you won't have quite as much overhead associated with being a plug-in within a large app. These are the reasons why 10-15fps is totally possible, given some nice disk throughput. Add to this the fact that MPEG-4 compresses in better than realtime on a G5, and you could probably write out a low-res editing proxy at the same time as your Bayer->RGB conversion with a small speed hit. But the bigger question is, why convert Bayer->RGB this way? It seems to me that the data should be stored as Bayer information, and de-Bayered by a codec as it's played back. That way you get the storage benefits of compressed Bayer (lossless or lossy, either way), but you can see the image in RGB in all your apps. (Including AE, FCP, etc.) It makes absolutely no sense to store uncompressed 4:4:4 RGB versions of Bayer images on your hard drive, unless you just love eating up 3X the disk space and throughput for absolutely NO gains... |
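To make the batching idea concrete, here is a rough sketch of such a standalone converter in Python (purely illustrative: debayer_frame is a stand-in for whatever Bayer-to-RGB routine is actually used, and the frame-per-file layout is an assumption):

```python
# Batched offline conversion: one big sequential read, parallel processing
# from RAM, then one big sequential write. Sketch only.
import os
from concurrent.futures import ProcessPoolExecutor

BATCH = 32  # frames pulled into RAM per pass

def debayer_frame(raw_bytes):
    # Placeholder for the real de-Bayer / colour processing.
    return raw_bytes

def convert_sequence(in_dir, out_dir, frame_names):
    os.makedirs(out_dir, exist_ok=True)
    with ProcessPoolExecutor() as pool:           # use every core
        for start in range(0, len(frame_names), BATCH):
            batch = frame_names[start:start + BATCH]
            # sequential read of a whole batch...
            raws = [open(os.path.join(in_dir, n), "rb").read() for n in batch]
            # ...number-crunching in parallel, straight from RAM...
            results = list(pool.map(debayer_frame, raws))
            # ...then a sequential write back to disk.
            for name, data in zip(batch, results):
                with open(os.path.join(out_dir, name), "wb") as f:
                    f.write(data)
```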
Today I will order the Sumix SMX-150C, because we think the IBIS5 sensor is much better than the others (in this price range = 2/3" and global shutter). Ben, how much did you pay for it, exactly? It seems Sumix wants to co-operate with us (software and hardware changes).
I know about the bad software and the slow USB2.0, but this will not be a problem; we have other solutions, so we will use the camera at 24fps with 10bit and global shutter. Global shutter was the reason for the Sumix decision, because the SI rolling shutter is absolutely unfit. We made tests with a German camera head (with the same sensor characteristics). At 24fps, but also at 48fps, the rolling shutter artefacts destroy every moving picture. The Altasens will also have a rolling shutter, but a much faster one (it doesn't span the whole frame time), which produces much fewer artefacts. The Altasens also supports an external shutter, so that way the artefacts will disappear completely. |
Rai,
I agree that 2/3" is totally key. The rolling shutter is not such a show-stopper for me, but the IBIS-5 can indeed do global shutter. What remains to be seen is whether the SMX-150C can deliver 24fps of global shutter over USB2. I have my doubts, but I'll be interested to hear how your experiments go. If you email Sumix and mention my name, they should give you the same deal they gave me. They're very eager to work with filmmakers and figure out our needs. I should note that they're taking a bit of a summer vacation right now, so you may not get an immediate response. I think they get back in full force in a week or two. - ben |
I thought we were working together towards an eventual cheap solution, step by step? I think we are still going well, with more of the steps getting done. If anyone wants to jump the gun and experiment, or wants broadcast compliance before then, it is going to cost you more.
My 2 cents: we are fine, and if you read Rob's development log, he is making progress on speeding up the software (I did warn everybody that it wasn't easy to program for the best speed). Juan and Ben, it is obvious you want to do bayer and compression codecs, so why don't you volunteer to work on them with Rob? He is already doing the capture software; you could each volunteer to do bayer and compression. It may be crazy economics, but I think it is worth it. Juan, you wrote about people shooting you down about jpeg2000, but I say go ahead, great (but remember the BBC version). We are looking at universal codec support, but for now we could use a few codecs: the fastest codec (huffy??) and the best (wavelet based lossless, near lossless, and 20:1). Everybody has different needs; all we need to do is support the best needs. Ben, you wrote a lot of wisdom there: unless we are passing to broadcast, display, some non-bayer NLE, or using 3 chips, there is no reason to debayer. Rai, I've been sick in bed for days because of a stomach bug, I will try to email you by the end of the day. Somebody posted here (or in one of the other threads, even the photo-camera-to-video mod threads) that there are a number of SD cameras you can grab "frames" from live through firewire. Which models I don't know, but it's worth asking around about. Thanks, Wayne. http://www.cameraaction.com.au/ should be able to do a better price on an XL1S. |
Unless your app captures directly to QuickTime, Ben, you're going to have to convert from RAW to QuickTime, but maybe Rob's app could capture directly to QT, although dropped frames will be a new concern (a la FCP captures).
Keep in mind that QT on Windows (where the framegrabbers are) is horribly slow, much slower than on the Mac. Also, for HD 1920x1080 I'm definitely NOT getting better than real-time for MPEG4 transcodes. On SD footage yes, but not HD - it could be disk-write speeds, but again, I'm back at the good 'ole 2-3fps for pure transcodes/resizing. |
<<<-- Originally posted by Ben Syverson: What remains to be seen is whether the SMX-150c can deliver 24fps of global shutter of USB2. I have my doubts, but I'll be interested to hear how your experiments go-->>>
First, we can change the hardware inside the camera. For example, in the past we changed Canon video cameras (upside-down) to work with our 35mm solution, but without a prism. So we can go around the USB2.0. But... ...second, read this part of an email from Sumix: "...In the present SMX-150C pixels are sampled at 10 bit. Then a look up table (which can be arbitrarily programed) converts the 10bit to 8bit. In effect at present SMX-150 user has access to all 10 bit by choosing the look up table accordingly. However we have a new version (a software/firmware upgrade) that transfers directly 10 bit pixel data to PC. This upgrade will be ready in 2-3 weeks. In addition this version has the option of using the multi-gain (muti-slope) sampling features in the IBIS5 sensor used in the cameras. This muti-sampling at different gains provides defective 12 bit pixels (compressed in 10 bits for transfer to PC.) This upgrade will be free of charge for people who have purchased the present SMX-150C... ...Both global and rolling shutters can be used for video streaming..." |
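For anyone unsure what that programmable look-up table buys you, here is a rough illustration of a 10-bit-to-8-bit LUT (a sketch only: the gamma value is arbitrary, and how the table actually gets loaded into the SMX-150C is entirely up to Sumix's software):

```python
# Build a 1024-entry LUT mapping 10-bit sensor codes to 8-bit output.
# A gamma-style curve spends more of the 256 output codes on the shadows,
# which is usually what you want if you have to throw bits away.
# Sketch only -- not Sumix's actual table format or API.

def make_lut(gamma=2.2):
    lut = []
    for code in range(1024):                      # every possible 10-bit value
        normalized = code / 1023.0
        lut.append(int(round(255 * normalized ** (1.0 / gamma))))
    return lut

if __name__ == "__main__":
    lut = make_lut()
    print(lut[0], lut[64], lut[512], lut[1023])   # roughly 0, 72, 186, 255
```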
I would like to see what the dual slope operation will look like..... :)
Defective or Effective?? I'm still thinking they should give us an SDK. It would be good both for us and for them.... |
Mostly, though, I expect the conversion stage to be used to Bayer-process the footage, apply gamma correction (etc.) and then write to QuickTime using a near-lossless codec for a reasonable compression ratio. |
Writing in a very simple RAW format is the key.
I don't look for other ways, because all other ways will "change" the image information (different decoder = different bayer-pixel-resolution). If you store the original RAW format, you can select the best bayer decoder for your images later. I think a real-time decoder is needed only for the viewfinder or camera display. But maybe you can live with the B&W RAW format. |
What's the problem with seeing a 1280x720 camera at 640x360 in color on the viewfinder??
Anyway, you would need to view a MODIFIED version of the RAW image on the viewfinder, unless you have some kind of gamma correction suitable for this kind of data inside your brain... I'm getting lost again..... And what's the problem with LOSSLESS transformations on the RAW Bayer? (LOSSLESS means you always have the EXACT original data) |
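A 640x360 colour preview is indeed cheap to make from a 1280x720 mosaic, because every 2x2 Bayer cell already contains one red, one blue and two green samples. Here is a rough numpy sketch of that idea (it assumes an RGGB layout, which may not match the actual sensor, and applies no gamma or white balance):

```python
# Half-resolution colour preview straight from the Bayer mosaic:
# each 2x2 cell (R G / G B on an RGGB sensor) becomes one RGB pixel.
import numpy as np

def half_res_preview(bayer):                  # bayer: (H, W), H and W even
    r  = bayer[0::2, 0::2].astype(np.float32)
    g1 = bayer[0::2, 1::2].astype(np.float32)
    g2 = bayer[1::2, 0::2].astype(np.float32)
    b  = bayer[1::2, 1::2].astype(np.float32)
    return np.dstack([r, (g1 + g2) / 2.0, b])  # (H/2, W/2, 3)

if __name__ == "__main__":
    mosaic = np.random.randint(0, 1024, (720, 1280)).astype(np.uint16)
    print(half_res_preview(mosaic).shape)      # (360, 640, 3)
```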
@Jason: Unless your app captures directly to Quicktime Ben
The Sumix app can capture directly to AVI, a frame sequence, or a "SMX" video stream, which is basically a stream of raw data. The easiest solution is to generate a grayscale Bayer AVI and convert that to QuickTime. I guess I don't understand what the problem with that is.
@Jason: Keep in mind that QT on Windows (where the framegrabbers are) is horribly slow, much slower than the Mac.
Doesn't matter much to me -- my PC laptop is just a capture slave. Everything else in my pipeline happens on a Mac.
@Jason: Also for HD 1920x1080 I'm definitely NOT getting better than real-time for MPEG4 transcodes.
I was talking about SD (low res) proxy footage for editing purposes. Just a throwaway thought. It's just one possibility...
@Wayne: Juan and Ben, it is obvious you want to do bayer and compression codecs, why don't you volunteer to work on them with Rob.
I already have -- I have an open invitation to anyone working on a free or open source integrated solution to use my de-Bayering code. But it's just as well -- I'd like to get it in better shape before I let anyone see it. But I'd rather not work directly on Rob's app, as I think he's focusing on a PC-based solution, whereas everything I do is on the Mac. Also, I think Rob's software is basically capture software specifically for CameraLink (Rob, correct me if I'm wrong), which doesn't apply to my camera. It may make sense to share certain techniques and code snippets, but if I decide to go forward with a QuickTime codec, a lot of that code will be fairly specific. I don't believe Rob is working on a codec, but maybe he has plans for one...
@Wayne: the fastest codec (huffy??)
Wayne, remember that HuffYUV is... YUV. To get YUV from Bayer, you have to do Bayer->RGB->YUV. And remember that YUV @ 4:2:2 takes up TWICE as much space as Bayer. There's no reason to use HuffYUV. I'm not sure how well Huffman encoding would work on raw Bayer channels (ie, 1/2 size G, 1/4 size R+B). HuffYUV probably works so well because U and V are mostly gray.... A 10 or 12 bit JPEG/Wavelet would be fine for 99.9999999999% of the people on this board, but it would be slower than lossless, so what's the point? The single best option in my eyes is RLE/Huff on Bayer data, inside a codec that displays RGB for you. That way, you get the 15-20MB/sec file size, and you never have to worry about de-Bayering, because the codec does it for you. That would be the way to go, and it's what I'm currently investigating via a QuickTime Component. If anyone wants to help, they're more than welcome to. - ben |
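One reason entropy coding the raw Bayer data can work well is that splitting the mosaic into its colour planes first gives the compressor smoother data to chew on. Here is a rough sketch of that idea (zlib stands in for whatever RLE/Huffman scheme a real codec would use; it is not a proposal for the actual QuickTime component):

```python
# Split an RGGB Bayer mosaic into its four colour planes before entropy
# coding. Each plane is smoother than the interleaved mosaic, so a generic
# compressor (zlib here, standing in for RLE/Huffman) tends to do better.
import zlib
import numpy as np

def split_planes(bayer):                 # assumes an RGGB layout
    return (bayer[0::2, 0::2],           # R
            bayer[0::2, 1::2],           # G, even rows
            bayer[1::2, 0::2],           # G, odd rows
            bayer[1::2, 1::2])           # B

def compare_sizes(bayer):
    interleaved = len(zlib.compress(bayer.tobytes()))
    planar = sum(len(zlib.compress(p.tobytes())) for p in split_planes(bayer))
    return interleaved, planar
```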
Dual slope on IBIS-5
At the risk of supporting my competitors' sales, here is an application note on dual slope with the IBIS-5:
http://www.siliconimaging.com/Specifications/Dual-Slope%20Ap%20Note%20001.pdf Here is another test: I just did a quick experiment with the dual slope capabilities of the SI-1280F using XCAP. I placed a 20W halogen in a fairly dark scene, pointing directly at the camera. (all files are 1.3MB Tiff files) http://www.siliconimaging.com/Sample...osec%20exp.tif http://www.siliconimaging.com/Sample...msec%20exp.tif Then I turned on the dual slope: http://www.siliconimaging.com/Sample...ight%20exp.tif I didn't spend a lot of time tweaking this but you can see that the details of the lamp are there along with the background image in the dual slope capture. I'm sure it would be possible to improve the contrast of the dark image, but this will give you an idea of what can be done. Having provided this public service, I'll drop in a commercial. If you find that when you move to 10 bit and global shutter, USB 2.0 can't hack it anymore, think about our camera link version. It is true 12 bit and clocks to 60MHz (important in global shutter mode). It is still an IBIS-5 so it comes with all the other baggage (noise and washed out colors), but that is sensor level stuff. |
Woah -- Steve, what was your gain set to? I've never seen footage that noisy unless my gain was set through the roof...
|
Hopefully there's some way to control the slopes (Sumix mentioned a multislope setup, with control over each color), because as it is, the dual-slope looks extremely unnatural...
|
Ben:
This is one of my beefs with the IBIS-5. In global shutter mode, the integration time is sequential with the readout time. Let's say that they can move data over USB 2.0 at 40MB/sec and that the 10 bit data is packed. So, you are streaming at 320Mbps, or a 32Mpix/sec max rate. The real transfer rate you need is 1280x720 x 24 = 22Mpix/sec. The leftover time is all you get for integration. The IBIS-5 isn't that sensitive to start with, so crank up the gain or the lights in global shutter mode. With CCD interline transfer or CMOS Truesnap you can overlap the integration time and readout time. The IBIS-5 doesn't exactly control slope. You can set a knee point and any pixel that is saturated at the knee is reset. In my lamp example, the light is reset very shortly before closing the shutter. This lets the bright areas integrate for a short period while allowing the full integration for the rest of the image. Unless it is finely tuned, it does look very artificial. Very nice for machine vision though. |
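Plugging those numbers in shows just how little integration time is left per frame when readout cannot overlap exposure (a sketch using the 40MB/s USB figure from the post above; the real camera will differ):

```python
# Exposure time left per frame on the IBIS-5 in global shutter mode, where
# integration and readout are sequential. Numbers from the post above.

USB_MB_PER_SEC   = 40.0          # assumed USB 2.0 throughput
BITS_PER_PIXEL   = 10            # packed 10-bit data
PIXELS_PER_FRAME = 1280 * 720
FPS              = 24

pixel_rate  = USB_MB_PER_SEC * 1e6 * 8 / BITS_PER_PIXEL   # ~32 Mpix/s
readout_ms  = PIXELS_PER_FRAME / pixel_rate * 1000.0      # ~28.8 ms
frame_ms    = 1000.0 / FPS                                 # ~41.7 ms

print("readout takes %.1f ms, leaving %.1f ms per frame to integrate"
      % (readout_ms, frame_ms - readout_ms))
```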
Steve -- interesting. Thanks for the illumination. I did do a couple of global shutter tests recently (with still frames) and I found it to be much noisier (and darker) than rolling shutter with the same settings (even without gain). I'm not 100% sure what's going on there, but it was enough to dissuade me from global shutter for now.
I'm implementing a 16-pixel Spline interpolation for linBayer. The results are extremely promising.
[image: the normal (bilinear) interpolation]
[image: the Spline16 interpolation]
I should note that the process is still 100% lossless, even with Spline16 -- the original pixel values are not changed, so you could extract the original sensor data from the RGB data and do a different de-Bayer routine in the future if you so desired... It's an interesting problem, because any interpolator has to fill in 75% of the image data in the R + B channels. Spline seems to do a really good job. I don't think Spline36 would be worth it, for all the extra processing. - ben |
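For readers who haven't looked at demosaicing before, here is a bare-bones bilinear de-Bayer expressed as convolutions (an illustration of the baseline being compared against, not linBayer's actual code; an RGGB layout is assumed):

```python
# Minimal bilinear de-Bayer via 3x3 convolutions (RGGB layout assumed).
# Green is averaged from its 4 orthogonal neighbours; red and blue from
# their 2 or 4 nearest same-colour neighbours. Baseline quality only.
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(bayer):                 # bayer: (H, W) float array
    h, w = bayer.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask

    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]], float) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float) / 4.0

    g = convolve2d(bayer * g_mask, k_g,  mode="same", boundary="symm")
    r = convolve2d(bayer * r_mask, k_rb, mode="same", boundary="symm")
    b = convolve2d(bayer * b_mask, k_rb, mode="same", boundary="symm")
    return np.dstack([r, g, b])               # (H, W, 3) RGB
```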
Hi Ben,
That Spline16 looks very nice - will this also be implemented on the green channel as well as the red and blue? |
Jason, I'll give it a shot. Maybe Spline is all we need to get rid of our gridding and zippering problems, since it sees a neighborhood of 16 pixels for every pixel interpolated. However, it means that processing 720p really will be slow, 1080 even more so. Bilinear is an integer operation, whereas spline really needs to take place in floating point. That in itself is not the speed hit, it's the type conversion to/from integer from/to float -- 512 times for every pixel.
At this point, it might make sense to read the image into a floating point array and do everything in float. No type conversion means no slowdown... Jason et al, Spline doesn't help at all with zippering. In fact, it looks almost identical to bilinear, since it doesn't have to do very much interpolating (50% of the data is already there). Our de-zippering stuff still beats anything I've seen, but I'm open to suggestions as always! |