July 10th, 2007, 12:53 AM | #106 |
Regular Crew
Join Date: Aug 2004
Posts: 91
Quote:
July 10th, 2007, 01:53 AM | #107 |
Inner Circle
Join Date: May 2003
Location: Australia
Posts: 2,762
Quote:
Jose, it most likely will not be as simple as that: you would have to program the Ambarella interface to suit, and maybe add external circuits to connect the two. Overclocking USB would be interesting, but in reality, why not use GigE and a GigE caddy? The problem with USB is the peaks and the processor drain, but with an embedded design you can have a section handling the USB and buffering the data peaks before and after it. That works out to nearly 50MB/s max throughput (8-bit 1080p24/25, or 720p50, without compression).

Take, interesting solution. I did find an HDMI-to-wavelet chip solution that included a USB-like interface and tried to talk them into making a USB capture solution, but nothing came of it. Somebody else could still do it, though.
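As a rough sanity check on that 50MB/s figure, here is a minimal throughput calculation (a sketch only; it assumes 8-bit data, 1 byte per pixel, and the frame sizes quoted above):

Code:
# Uncompressed data rate for the formats mentioned above (8-bit, 1 byte/pixel).
def throughput_mb_s(width, height, fps, bytes_per_pixel=1):
    """Raw data rate in MB/s (1 MB = 1e6 bytes)."""
    return width * height * bytes_per_pixel * fps / 1e6

print(throughput_mb_s(1920, 1080, 24))  # ~49.8 MB/s for 8-bit 1080p24
print(throughput_mb_s(1280, 720, 50))   # ~46.1 MB/s for 8-bit 720p50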
July 10th, 2007, 03:21 AM | #108 |
Regular Crew
Join Date: Jan 2006
Location: West Country, UK
Posts: 141
Steve -- thanks for reminding me about avisynth. I remember looking at that a while ago but didn't go into the upsizing quality, because I thought Photoshop would be superior. There are also Photoshop plug-ins like Genuine Fractals and S-Spline (called something else now, can't remember) which can also be used as an action in Photoshop -- I just walk away and do something more interesting while that's going on. I'll check out the latest version of avisynth...
Thanks.
July 10th, 2007, 03:47 AM | #109 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
Quote:
I think it would be quite easy to change the Adaptive Homogeneity ... algorithm to do this.

Cheers, Take
July 10th, 2007, 06:39 AM | #110 |
Major Player
Join Date: May 2007
Location: Sevilla (Spain)
Posts: 439
Take, I don't want to double the speed of the USB interface, I just want to speed it up a bit. I already have about 18-20 fps at 1920x800; it would just be a matter of 40% or 50% more to get to 25-26fps and then lock it to 24fps.

The question is how we can do all that. I mean, we're talking about different solutions, but almost all of them require hardware/software engineering. For example, I've got my demo board: how can we modify that board so it has a GigE interface instead of USB? And if we do that, we also need new software to capture from the new interface. That's why I was talking about overclocking the USB, because using parts that we already have is much easier than building new parts, putting them all together and developing new software.

If we want to stay with the camera-to-laptop solution, which IMHO is the best and easiest way, I see two different options:

I can capture 24fps bayer at 1600x666 over USB without any problem, and it's direct-to-HDD capture, so we have no timing issues. I still prefer capturing to RAM though, because with these things faster is always better. We'd have to develop a piece of software that can read the bayer output and convert it to RGB 2048x858 in just one step while debayering. That would be Take's job if he wants to do it. Upscaling is very common even in top pro cameras, so if we know this upscaling process would give us much better quality than any pro camera, it's OK with me. We could use any of the great interpolation algorithms out there.

The other option would be to overclock the USB interface to get 24fps at 2048x858 or at 1920x800. The first case needs roughly 75% more throughput; the second would be easier and more stable, as it's only about 50% more. This way we would have 2K or near-2K bayer, but full HD either way, and we would just need to debayer in post. The good part of this one is that we have more resolution to start with.

There's in fact another option that's a little more difficult, but a good one nonetheless: if we connect the demo board to the Ambarella chip and then stream to the computer over USB, we could have lossless 2K directly on the laptop. We would need a way to connect the board to the chip, a USB interface from the chip to the computer, and software to control the camera and capture. If we can do this one, we could easily turn it into a portable solution using a mini PC and a small LCD.

With any option we have to work together. The best part is that we can start with the simplest solution and then upgrade it till we have a standalone camera. So which do you think is the simplest option?
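On the upscaling part, as a concrete example of the kind of interpolation I mean, here is a minimal sketch of the resize step alone. It assumes the debayer to RGB is already done, uses Pillow's Lanczos filter as one of those "great interpolation algorithms", and the filenames are just placeholders:

Code:
from PIL import Image

def upscale_frame(frame):
    """Resize a debayered 1600x666 RGB frame to 2048x858 with Lanczos resampling."""
    return frame.resize((2048, 858), Image.LANCZOS)

# Example (placeholder filenames):
# upscale_frame(Image.open("frame_0001.png")).save("frame_0001_2k.png")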
July 10th, 2007, 09:14 AM | #111 |
Major Player
Join Date: May 2007
Location: Sevilla (Spain)
Posts: 439
I did it.
I changed the USB clock from 125Hz to 500Hz. The camera works, and now it captures 2048x858 at about 25fps, already debayered, to uncompressed AVI.

The question is... is it bad for the camera? I mean, I already spent 900 euros on the board. I used a patch that pro gamers use to increase mouse performance, and the guy who programmed the tool says it's completely safe. In fact the program has an option to set the clock to 1000Hz, and that's the only setting the programmer says is not safe.

I'll post a full-res clip soon.
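For what it's worth, the raw bayer rates before and after the change are both well inside USB 2.0's theoretical limit (a rough calculation based only on the frame rates quoted in this thread, with 1 byte per bayer pixel assumed; nothing here is measured):

Code:
# Raw 8-bit bayer data rates implied by the frame rates quoted in the thread,
# compared with the theoretical USB 2.0 high-speed limit (before protocol overhead).
USB2_LIMIT_MB_S = 480e6 / 8 / 1e6   # 480 Mbit/s -> 60 MB/s

def rate_mb_s(w, h, fps):
    return w * h * fps / 1e6        # 1 byte per bayer pixel assumed

print(f"before: 1920x800 @ ~19fps -> {rate_mb_s(1920, 800, 19):.1f} MB/s")
print(f"after:  2048x858 @ ~25fps -> {rate_mb_s(2048, 858, 25):.1f} MB/s")
print(f"USB 2.0 ceiling: {USB2_LIMIT_MB_S:.0f} MB/s")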
July 10th, 2007, 11:47 AM | #112 |
Inner Circle
Join Date: May 2003
Location: Australia
Posts: 2,762
Is that related to the actual transfer speed, or to a switching speed for polling, etc.?
July 10th, 2007, 12:12 PM | #113 |
Major Player
Join Date: May 2007
Location: Sevilla (Spain)
Posts: 439
What do you mean?
July 10th, 2007, 02:38 PM | #114 |
Regular Crew
Join Date: Jun 2006
Location: Columbus Ohio
Posts: 36
Quote:
After talking to one of my friends I have given up on the gumstix idea. We are now talking about going purely with a CMOS-FPGA-HDD solution. Neither of us has ever attempted anything like this before, but that is kinda what makes it fun. Once I have some money to spare I think I am going to buy a Spartan-3E Xilinx dev board, but once again that could be a while.

Oh, and Jose, that website you gave me didn't have the full datasheet for the Micron MT9P031, only the product overview. It seems you have to sign an NDA with Micron to get the full datasheet :-(
July 11th, 2007, 01:46 AM | #115 |
Regular Crew
Join Date: Aug 2004
Posts: 91
Quote:
It's faster than AHD because you don't have to convert the data to a CIE color space (a very slow operation involving a cube root) to calculate the homogeneity map, but the result should be quite similar. (And I've written it in a way that lets me easily add MMX code where it matters.) Creating an avisynth filter using the internal resizer shouldn't be too hard, but I wanted to see if I could write something from a scientific paper.
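To make the direction-decision idea concrete, here is a minimal numpy sketch that picks horizontal vs. vertical green interpolation per pixel from plain channel differences instead of a CIELab homogeneity map. It is not my actual filter, just an illustration of the general approach; it assumes an RGGB pattern and ignores borders and the red/blue channels.

Code:
import numpy as np

def green_directional(bayer):
    """Fill in the green plane of an RGGB mosaic, choosing the horizontal or
    vertical estimate per pixel by which direction is locally smoother."""
    g = bayer.astype(np.float32).copy()
    h, w = g.shape

    # Horizontal and vertical green estimates (averages of the two neighbours).
    gh = np.zeros_like(g)
    gv = np.zeros_like(g)
    gh[:, 1:-1] = (g[:, :-2] + g[:, 2:]) / 2
    gv[1:-1, :] = (g[:-2, :] + g[2:, :]) / 2

    # Cheap homogeneity proxy: absolute difference of the same two neighbours.
    dh = np.full_like(g, np.inf)
    dv = np.full_like(g, np.inf)
    dh[:, 1:-1] = np.abs(g[:, :-2] - g[:, 2:])
    dv[1:-1, :] = np.abs(g[:-2, :] - g[2:, :])

    rows, cols = np.indices((h, w))
    missing = (rows + cols) % 2 == 0          # R and B sites in an RGGB mosaic

    est = np.where(dh <= dv, gh, gv)          # take the smoother direction
    g[missing] = est[missing]
    return g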
July 11th, 2007, 02:36 AM | #116 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
Hello Steven,
According to the AHD paper they also tried YUV with good results, so I am using YUV. I am doing camera_RGB -> rec709_RGB -> rec709_YUV using 4x4 matrices. Also, the conversion of RGB to YUV or RGB to CIELab is not the limiting factor; it is the number of convolutions that have to be performed, including running the image through a median filter three times.

BTW, I am running the complete AHD on the graphics card. It runs at about 25% real time (slower), on a MacBook Pro.

Your directional filtering algorithm also looks very much like AHD, except that the decision seems to be made before interpolating red and blue. It took me quite a long time to figure out what the AHD paper was trying to explain, while the actual algorithm is quite easy to explain in detail. I wish these math people would write papers with engineers in mind.

Cheers, Take
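For reference, a minimal sketch of the second step (rec709 RGB -> YUV) written as a 4x4 matrix in the homogeneous form described above. The camera_RGB -> rec709_RGB matrix is camera-specific, so it is left as an identity placeholder here; the BT.709 coefficients themselves are standard.

Code:
import numpy as np

# BT.709 luma weights; U/V rows scaled so each spans roughly [-0.5, 0.5].
RGB_TO_YUV_709 = np.array([
    [ 0.2126,  0.7152,  0.0722, 0.0],   # Y
    [-0.1146, -0.3854,  0.5000, 0.0],   # U (Cb)
    [ 0.5000, -0.4542, -0.0458, 0.0],   # V (Cr)
    [ 0.0,     0.0,     0.0,    1.0],   # homogeneous row
], dtype=np.float32)

# Placeholder: the real camera_RGB -> rec709_RGB matrix depends on the sensor.
CAMERA_TO_709 = np.eye(4, dtype=np.float32)

def rgb_to_yuv(rgb):
    """rgb: (..., 3) float array of rec709 RGB in [0, 1]; returns (..., 3) YUV."""
    ones = np.ones(rgb.shape[:-1] + (1,), dtype=rgb.dtype)
    rgbw = np.concatenate([rgb, ones], axis=-1)         # homogeneous coordinates
    yuvw = rgbw @ (RGB_TO_YUV_709 @ CAMERA_TO_709).T    # both 4x4 matrices in one go
    return yuvw[..., :3]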
July 11th, 2007, 03:08 AM | #117 |
Regular Crew
Join Date: Aug 2004
Posts: 91
Quote:
Quote:
Quote:
I lost a lot of hair before figuring it out, and then I was like "whaat? Is that all???"

Edit: BTW, if anybody has some raw bayer footage, I would gladly download it so I can test the algorithm on different video material.
July 11th, 2007, 07:16 AM | #118 |
Major Player
Join Date: Mar 2007
Location: Amsterdam The Netherlands
Posts: 200
Steve,
When I get the camera running correctly, I will be able to put some footage up. However, the reseller of the Pike doubts that the artifacts are abnormal; he basically says "you shouldn't do gamma correction". So to prove that it is wrong I will have to make a light sensitivity graph (x: exposure time, y: A/D output value) for a couple of pixels. It looks to me like 50% of the pixels are not linear at mid light levels.

Cheers, Take
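A minimal sketch of that linearity test, assuming one raw frame is saved per exposure time (the tolerance and the way frames are loaded are illustrative, not part of my actual setup):

Code:
import numpy as np

def nonlinear_pixel_map(frames, exposures, rel_tol=0.05):
    """frames: (n, h, w) array of raw A/D values, one frame per exposure time.
    Fits output = a*exposure + b for every pixel and flags pixels whose worst
    residual exceeds rel_tol of the maximum observed value."""
    n, h, w = frames.shape
    x = np.asarray(exposures, dtype=np.float64)
    y = frames.reshape(n, -1).astype(np.float64)

    A = np.stack([x, np.ones_like(x)], axis=1)       # (n, 2) design matrix
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # (2, h*w) slopes and offsets
    resid = np.abs(A @ coef - y)                     # (n, h*w) fit residuals

    return (resid.max(axis=0) > rel_tol * y.max()).reshape(h, w)

# Example: fraction of suspect pixels for five exposure times (hypothetical data)
# print(nonlinear_pixel_map(frames, [1, 2, 4, 8, 16]).mean())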
July 11th, 2007, 09:20 AM | #119 |
Trustee
Join Date: Nov 2005
Location: Sauk Rapids, MN, USA
Posts: 1,675
Any overclocking is safe provided you can cool the parts enough... it goes up quickly though; you'll be liquid-cooling the thing before too long ;)

Luddite question: is there a reason FireWire (IEEE 1394) isn't being considered (or considered, then dismissed)?
July 11th, 2007, 09:38 AM | #120 |
Major Player
Join Date: May 2007
Location: Sevilla (Spain)
Posts: 439
Steve, I can provide you with raw bayer material from my cam. Do you want anything in particular? Resolution? Colors? Lighting?
Just tell me where I can upload it.