August 29th, 2010, 09:24 AM | #1 |
New Boot
Join Date: Jun 2010
Location: Odessa Ukraine
Posts: 20
|
Everybody, please don't let this project die!
Everybody, please don't let this project die, because it seems to be the last hope for independent cinema makers! I just want to share a few big-picture thoughts about all this...
Today is a great time: technology has reached a level where even a DIY, real digital cinema camera is possible. Unfortunately, most of that technology is closed and limited by marketing. All professional digital cinema cameras are overpriced relative to the technology inside them, even if not by the standards of the professional cinema world. On the other side, DSLRs are still not good enough to make a real film, with their poor compression, aliasing and low dynamic range - and thanks to marketing policy they will stay that way forever...
This project seems to be the only free, open and truly DIY solution for a real digital cinema camera, possible in any configuration. Maybe my thoughts don't coincide with the Apertus idea of building a ready-to-use camera based on Elphel, but I think it can be more of a DIY camera project, where anyone can develop their own configuration from a few main components purchased separately and use the software and workflow developed by enthusiasts. Only a few little steps are left:
- First of all, we need to wait for the new Elphel 373 model, with new, more powerful hardware that allows much higher data rates, has an SSD drive inside and (maybe) a new sensor with decent dynamic range, or the ability to connect your own custom sensor to it.
- Second, a custom Linux or an application for monitoring and remote control from a small, portable computer - ElphelVision. (An alpha version for the current 353 model already exists at http://cinema.elphel.com/software.)
- Third, a cross-platform app with a GUI to convert raw JP4 to DNG. (Today, as far as I know, only a command-line tool for Linux exists.) Of course, some kind of all-in-one JP4 raw processor/grader/converter could make the workflow faster and save a lot of hard disk space, but looking at the buggy alpha versions of REDCINE I think it is not so simple to do...
All the other components - rods, lenses, follow focus, battery and a computer for monitoring - are separate things, so everyone can choose their own DIY camera configuration according to their goals. In the future it will also be possible to make a compatibility list of components and configurations for the camera.
P.S. I am only afraid that JP4 is a format based on JPEG compression, which can handle only 8 bits of color, and that this is a very big limit for picture quality compared to the 12- or 14-bit raw data possible from sensors.
Last edited by Dmytry Shijan; August 29th, 2010 at 05:01 PM. |
August 29th, 2010, 03:47 PM | #2 |
Major Player
Join Date: Apr 2006
Location: Magna, Utah
Posts: 215
|
JP4 format
Dmytry,
The JP4 format is designed to handle 12-bit data from the sensor, not just 8 bits. It was made specifically to preserve camera sensor data for post-processing (regular JPEGs are better suited for presentation of images). Many (billions of) high-quality images you can see on the Internet are made from JP4 originals.
You can find more info about the sensor bits here: How many bits are really needed in the image pixels? | Elphel, Inc. - or in other places, like in this nice article: KammaGamma Articles Solving the Leica M8 DNG riddle
Andrey |
August 29th, 2010, 04:24 PM | #3 |
New Boot
Join Date: Jun 2010
Location: Odessa Ukraine
Posts: 20
|
Andrey, thanks for making this clear. It seems I just missed something about JP4. Regarding the KammaGamma article, again I see that 8 bits is bad when you decide to color-correct an image...
Last edited by Dmytry Shijan; August 29th, 2010 at 05:12 PM. |
August 29th, 2010, 07:58 PM | #4 |
Major Player
Join Date: Apr 2006
Location: Magna, Utah
Posts: 215
|
Dmytry,
That KammaGamma article does not explain the shot noise of the sensor - you need to read ours. Unfortunately the LinuxDevices web site screwed up some of the content, losing the images and the Javascript code. Here is my draft - it may have some minor differences from the final version: How many bits are really needed in the image pixels?
The sensors we use have about 8500 e- of pixel full well capacity (FWC), so by applying gamma 0.5 you do not just fit the 12-bit data into the 8-bit range, you match the output code to the sensor noise (mostly defined by the FWC of the pixels).
As for color correction - when decoded properly, you get the same "raw" sensor data as is available at the sensor output. The encoding errors (provided the JPEG quality is close to 100%) are smaller than the natural noise of the sensor. So JP4 does not limit you in color correction, and you may also use more advanced debayer algorithms than are practical inside the camera.
Andrey
Last edited by Andrey Filippov; August 29th, 2010 at 11:22 PM. |
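To put rough numbers on the gamma-0.5 argument above, here is a small editorial sketch (not Elphel's code; the 8500 e- full well and the 12-bit range are taken from Andrey's post, everything else is an assumption) that compares the size of one 8-bit step against the photon shot noise at the same signal level:
Code:
public class GammaVsShotNoise {
    static final double FULL_WELL = 8500.0; // electrons at saturation (from the post)
    static final double MAX_DN = 4095.0;    // 12-bit linear sensor range

    // gamma 0.5 encode: 12-bit linear value -> 8-bit code
    static int encode(double linear) {
        return (int) Math.round(255.0 * Math.sqrt(linear / MAX_DN));
    }

    // decode an 8-bit code back to the linear 12-bit scale
    static double decode(int code) {
        double f = code / 255.0;
        return MAX_DN * f * f;
    }

    public static void main(String[] args) {
        System.out.println("code  signal(e-)  step(e-)  shot noise(e-)");
        for (int code = 32; code <= 255; code += 32) {
            double electrons = decode(code) * FULL_WELL / MAX_DN;
            // how many electrons one 8-bit step covers at this signal level
            double step = (decode(code) - decode(code - 1)) * FULL_WELL / MAX_DN;
            double shotNoise = Math.sqrt(electrons); // photon shot noise ~ sqrt(N)
            System.out.printf("%4d  %10.0f  %8.1f  %14.1f%n", code, electrons, step, shotNoise);
        }
    }
}
For every code the quantization step comes out below the shot noise, which is exactly the point made above: with the square-root curve the 8-bit codes are spaced roughly like the sensor's own noise, so almost no real information is thrown away.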
August 30th, 2010, 05:31 AM | #5 | |
Regular Crew
Join Date: Jan 2007
Location: Vienna, Austria
Posts: 112
|
The project is far from dead, but it seems people have settled for less chatter here on the forum and more work in the background, I guess.
As a funny coincidence, the introduction and hype of the iPad will help Apertus a lot. Soon the market will be flooded with cheap, Linux-friendly touchscreen tablets (with latest-technology screens like the Pixel Qi), and all of them could potentially be perfect viewfinder hardware to run ElphelVision.
Regarding the command-line JP4-to-DNG converter: there is still room for improvement, but it works. We do not need an equivalent of RedCine, as DNG is a universal format that can be imported into most image/video post-processing software, including the complete Adobe bundle and, of course, free software like UFRaw. |
|
August 30th, 2010, 09:11 AM | #6 |
New Boot
Join Date: Jun 2010
Location: Odessa Ukraine
Posts: 20
|
Of course, DNG is simple and can be flexibly edited in any raw processor or compositing application; maybe with a fast RAID, DNG sequences could even be edited in real time in NLE applications. But an equivalent of RedCine would still be good in terms of disk space, since an unprocessed JP4 frame is about 10 times smaller than a DNG frame (see the rough numbers after this post). For example, JP4 could be graded and converted directly to ProRes or any other editable format, bypassing the DNG step.
Another way might be to connect the JP4 format to some existing processing application as a plugin - for example, ask some open-source raw processor that can export video to support a JP4 workflow, or ask VirtualDub, Adobe, Apple or Sony Vegas to support JP4 in their editing software. Of course, maybe it's all my fantasy, because I am not technical enough to understand the real situation with raw processing :) |
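To put rough numbers on the disk-space point above, here is a small back-of-the-envelope sketch (editorial, not from the Apertus/Elphel tools; the frame size and bytes per pixel are assumptions, and the ~10x ratio is the figure from the post):
Code:
public class StorageEstimate {
    public static void main(String[] args) {
        int width = 2048, height = 1088;                 // assumed ~2K frame
        double fps = 24.0;
        double dngFrameMB = width * height * 1.5 / 1e6;  // ~1.5 bytes/pixel for 12-bit raw
        double jp4FrameMB = dngFrameMB / 10.0;           // the ~10x figure from the post
        double hourSeconds = 3600.0;
        System.out.printf("DNG frame: %.2f MB, JP4 frame: %.2f MB%n", dngFrameMB, jp4FrameMB);
        System.out.printf("One hour as DNG: ~%.0f GB%n", dngFrameMB * fps * hourSeconds / 1000.0);
        System.out.printf("One hour as JP4: ~%.0f GB%n", jp4FrameMB * fps * hourSeconds / 1000.0);
    }
}
Under those assumptions, an hour of footage comes to roughly 290 GB as DNG versus about 29 GB as JP4, which is why grading JP4 directly and skipping the DNG step looks attractive.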
August 30th, 2010, 09:13 AM | #7 |
Regular Crew
Join Date: Jul 2005
Posts: 96
|
On my own, I'm taking my first steps in Java. I've written a program that calculates everything around SMPTE timecodes: conversions from frames to SMPTE given an FPS value, and some checks that SMPTE codes are valid for a given FPS. At the moment I'm still looking for a way to handle fractional (floating-point) FPS, and the program currently runs in a command-line interface... that's as far as I've got :) As far as I can see in the screencaps of the Java program, there is no SMPTE system implemented yet (am I right?). Are you guys interested in that little program?
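For reference, here is a minimal sketch of the frames-to-SMPTE conversion described above (non-drop-frame only, integer FPS; the class and method names are made up here and are not the poster's actual code):
Code:
public class Smpte {
    // Convert an absolute frame count to HH:MM:SS:FF at an integer frame rate.
    static String framesToSmpte(long frameCount, int fps) {
        long frames  = frameCount % fps;
        long seconds = (frameCount / fps) % 60;
        long minutes = (frameCount / (fps * 60L)) % 60;
        long hours   = frameCount / (fps * 3600L);
        return String.format("%02d:%02d:%02d:%02d", hours, minutes, seconds, frames);
    }

    public static void main(String[] args) {
        System.out.println(framesToSmpte(90000, 25)); // 01:00:00:00 at 25 fps
        System.out.println(framesToSmpte(86487, 24)); // 01:00:03:15 at 24 fps
    }
}
Fractional rates like 29.97 need drop-frame timecode (skipping frame numbers 00 and 01 at the start of every minute except each tenth minute), which is probably the part still missing for floating-point FPS support.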
|