April 23rd, 2004, 08:16 AM | #376 |
Regular Crew
Join Date: Sep 2002
Location: Loveland, CO
Posts: 101
|
Pixel Shift or Pixel Remapping Scenario
<<<-- Originally posted by Obin Olson : i would say yes BUT the pixels are moving around in each frame...a dead pixel stays dead in every shot.... -->>>
A pixel doesn't necessarily have to be "dead" to be bad. Most likely out-of-spec pixels are the root cause of these defects, and Juan will need to do a bit of research on pixel mapping. Anybody who has played with a 3x3 sliding puzzle will understand how you would need to account for a bad pixel in the 773x494 grid: any pixel out of spec gets mapped out, and the rest of its "X" or "Y" grid line shifts one pixel over and is remapped to make up for it. There is obviously a point of no return where this cannot be done without excessive visual distortion, so during CCD QC testing any chip with more bad pixels on a given "X" or "Y" axis line than could be remapped would not be used. With the actual pixel count available on the CCDs, you have 53 spare "X" axis and 14 spare "Y" axis pixels available for remapping. Whether they would shuffle pixels over the full 53 I wouldn't know, but the 14 for the "Y" axis should not be any problem, I suppose. Food for thought. -Rodger |
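To make the remapping idea concrete, here is a minimal sketch of a row-shift scheme along the lines Rodger describes. It is only an illustration of the concept, not the camera's actual firmware; the 773/720 widths come from the thread, and everything else (function name, threshold-free QC list) is assumed.

[code]
import numpy as np

# Sketch only -- not the camera's firmware. Assumes each raw sensor row is
# 773 samples wide and the active window is 720 samples, leaving 53 spare
# columns per row that can absorb mapped-out (defective) pixels.
SENSOR_COLS = 773
ACTIVE_COLS = 720

def remap_row(row, bad_cols):
    """Drop the flagged columns and shift the rest of the row over by one
    per defect, taking the shortfall from the spare margin."""
    good = np.delete(row, bad_cols)   # skip the out-of-spec samples
    return good[:ACTIVE_COLS]         # keep the 720-pixel active window

# Example: a synthetic 12-bit row with two defective columns flagged by QC
rng = np.random.default_rng(0)
row = rng.integers(0, 4096, SENSOR_COLS)
print(remap_row(row, bad_cols=[100, 400]).shape)   # -> (720,)
[/code]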
April 23rd, 2004, 08:27 AM | #377 |
Regular Crew
Join Date: Mar 2004
Location: Germany
Posts: 64
|
Noise problem
Hello. I haven't written for a long time because I was away. As I can see, the problem still occurs and there has been no progress...
Although I don't know the exact solution, I can offer some thoughts on methodology. The first obvious thing is that the problem only touches the R or G component. The CCD that produces the B component is fine (just check the RGB values of those bad pixels and their surroundings - it's always the R or G component that causes the problem). So I would check why the B CCD is read out correctly and the others are not. It also means the CCDs __COULD__ be read correctly, so keep working toward a solution rather than substituting post-production for it. Juan, I suggest preparing 20-30 test screens with very simple patterns, for example two fields of basic R and G (and maybe one with B to show there is no problem with it) with a straight border between them. Then place them in various positions in front of the camera and shoot them to see how the noise reacts. Then shoot at different exposures to see how the noise reacts to exposure level. And so on - the more the better. Then put them up as JPEGs to download and maybe we can spot a pattern that would help. |
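Purely as a sketch of the kind of test screens being suggested (frame size, file names and border positions are assumptions, not a spec), something like this would generate the two-colour R/G charts with a straight vertical border, plus a blue reference frame, using the Pillow imaging library:

[code]
from PIL import Image

# Sketch of the suggested test screens. Pure red next to pure green with a
# straight vertical border, generated at a few border positions, plus one
# pure-blue reference frame. Frame size and names are assumptions.
W, H = 720, 480

for i, split in enumerate((0.25, 0.5, 0.75)):
    img = Image.new("RGB", (W, H), (255, 0, 0))        # red field
    img.paste((0, 255, 0), (int(W * split), 0, W, H))  # green field to the right
    img.save(f"test_rg_border_{i}.png")

Image.new("RGB", (W, H), (0, 0, 255)).save("test_blue_reference.png")
[/code]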
April 23rd, 2004, 10:00 AM | #378 |
Regular Crew
Join Date: Sep 2003
Location: Worcester, MA
Posts: 64
|
OK, after playing around with the Photoshop file for a few minutes, I've come to a few conclusions:
1. After converting to 8-bit, the pixels off the raw image are closer to 20 (00100000, 32) than what I said before, which discounts the "high bit error" theory since, well, it's not the high bits that aren't getting set. Of course, this may still mean the middle bits are off. In some sampled pixels the value of the dead channel was 0 and sometimes it was 1. In the color-corrected version, they could get as high as D (00001101, 13).

2. All the CCDs have different points of "deadness," but the blue one seems to have FAR fewer than the others, like Milosz said. *Unlike* what he said, though, there are still dead pixels in the blue channel - they're just much less common. Is there something electronic in your capture sequence that would make a difference here?

3. The more I think about it, the more it makes sense that a simple filter could run on these images: check the median of the surrounding 8 pixels and, if the pixel is far enough off, replace it with that median. Basically a normal median filter, but targeted specifically at individual pixel-hunting so the whole frame doesn't get bogged down (and so it could be integrated into the capture sequence). A rough sketch of what I mean is below. Judging from these shots, it wouldn't do any damage to the final frame. Unfortunately I don't have the time to write a real one, although I'll see what I can do this weekend (don't cross your fingers ;) ) |
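For illustration only, a minimal numpy sketch of that targeted median repair. It works on one colour plane at a time; the threshold is an assumption, not a measured value, and would need tuning against real frames.

[code]
import numpy as np

# Rough sketch of the targeted median filter described in point 3 above.
def repair_dead_pixels(plane, thresh=2000):
    p = plane.astype(np.int32)
    padded = np.pad(p, 1, mode="edge")
    h, w = p.shape
    # Gather the 8 neighbours of every pixel and take their per-pixel median
    neighbours = [padded[dy:dy + h, dx:dx + w]
                  for dy in (0, 1, 2) for dx in (0, 1, 2)
                  if (dy, dx) != (1, 1)]
    med = np.median(np.stack(neighbours), axis=0)
    out = plane.copy()
    bad = np.abs(p - med) > thresh           # only pixels far off the median
    out[bad] = med[bad].astype(plane.dtype)  # are replaced; the rest untouched
    return out
[/code]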
April 23rd, 2004, 11:06 AM | #379 |
Built the VanceCam
Join Date: Apr 2004
Location: Prescott Valley, AZ
Posts: 109
|
I still think you guys are missing something here. I didn't see anything about altering the CCD timing - maybe I missed it. That's the only way to get around the NTSC limitation, but even if you do that, you immediately run into the sampling rate limit of the CCD itself.
Juan, yes, I was saying you are extracting 4:4:4 COMPARED to the DV signal of 4:1:1, not that I thought you were trying to get 4:4:4 FROM 4:1:1. And I also said you WOULD get a LITTLE better resolution. It's not about NTSC "compliance," it's about the CCD timing that is designed to produce the NTSC signal.

Regarding "You can say that the DV compression only gets rid of unneeded data, but I think very few people here will believe you": my point is that 5:1 compression cannot be translated to mean a loss of 80% of the info. Yes, compression is bad and the artifacts look horrible. I was only clarifying that misstatement, which I realize was not yours but was in someone else's comment.

I really think this is a very worthwhile endeavor, and I'm interested in seeing the results. What would be even better, once you're done, is to get "someone" to go in and remove the DVX100 lens and put a nice Fujinon (or Canon) lens on one, just to see how much difference it makes. |
April 23rd, 2004, 12:12 PM | #380 |
Regular Crew
Join Date: Sep 2003
Location: Worcester, MA
Posts: 64
|
The thing is, Dan, that you're posting stuff we basically already knew. We knew the limitations of the CCD, we knew how much resolution we were getting. The whole reason this is exciting is because we would finally have real depth of color and all the information our CCDs can record instead of the (admittedly not terrible, but definitely not perfect) information coming off the DV tape. And the more information you have, the better your end result will look after you're done tinkering with it.
I don't think anyone's had unreasonable expectations about what this project means. |
April 23rd, 2004, 02:03 PM | #381 |
Major Player
Join Date: Jul 2003
Posts: 479
|
Dan,
>I didn't see anything about altering the CCD timing, maybe I missed it. That's the only way to get around the NTSC limitation

That's incorrect. I assume you are talking about how much bandwidth NTSC allows for luminance and the two chroma signals. This is irrelevant at the stage where I am extracting the signal. The only difference between the PAL and NTSC CCDs in the DVXs is the pixel aspect ratio and the frame size.

There are 3 CCDs, all MONOCHROME, with a color filter in front of each one. The CCDs output the entire frame, that's it. The output is S/H'ed (sample-and-held) and then sampled by an A/D converter for each CCD. THUS, you get full RGB for each pixel at 12-bit/channel precision. There is NO YUV signal at this point. It doesn't matter if you time the CCDs for 25fps, 30fps, or 24fps, you STILL get FULL COLOR RGB FRAMES out of the A/Ds. These frames would correspond to 4:4:4 if they were converted to YUV, but they are not.

About your compression statements, this is what I was referring to:

>Compression is not truncation, it's an analytical processing method of getting rid of as much UNNEEDED info as possible.

This is also incorrect. There are different types of compression, but the type used in DV is destructive. It's just a matter of what is "good enough" for prosumers - a tradeoff between degradation of the original signal and bandwidth.

>my point is that the 5:1 compression cannot be translated to mean a loss of 80% of the info.

Actually, it's even more than that. However, I don't think anyone here was thinking of the effects of ~just~ the compression, but rather the entire DV process, which is what I mean. I can mathematically show that, given the original image sampled by the camera, you lose at least 80% of the info just in the decimation, frame resizing and color precision, without even taking into consideration compression, Panasonic's chosen dynamic range, etc. If you want to see it I can do it, but I think we might be going off topic here, and Stephen might use his Samurai sword on us. <g>

Juan |
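Purely to illustrate the distinction being drawn here (a sketch, not Juan's actual code): the tap grabs the R, G and B planes straight off the A/Ds, and only the camera's own DV path would then build Y'CbCr and decimate the chroma. In rough numpy terms, with standard BT.601 weights assumed:

[code]
import numpy as np

# Illustration only: what the DV path does *after* the point where the signal
# is being extracted -- convert full-RGB to Y'CbCr and keep one chroma sample
# per four luma samples along each row (4:1:1). BT.601 weights assumed.
def rgb_to_411(r, g, b):
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = (b - y) * 0.564
    cr = (r - y) * 0.713
    return y, cb[:, ::4], cr[:, ::4]
[/code]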
April 23rd, 2004, 04:56 PM | #382 |
Major Player
Join Date: Jul 2003
Posts: 479
|
I agree completely with Luis.
Also, it wasn't my intention to come off as rude at any point; if I did, I apologize. I'm just trying to explain things as best as I can. :)

Update: I've started ordering the materials for the prototype. Like I said, right now I am stuck with work and finals, but once I'm done I will also dedicate some time to either eliminating the noise problem or writing a simple program that gets rid of it by doing a selective median (pretty easy). I am aware this is not a solution, however I do not want to get stuck because of a small problem that I am almost 100% sure is in my test setup. In the end the prototype is set up so differently that it won't make a difference whether it works perfectly in the test or not. Before any of this, 10-bit 4:4:4 uncompressed clips will be posted.... :)

Cheers, Juan |
April 23rd, 2004, 05:29 PM | #383 |
Built the VanceCam
Join Date: Apr 2004
Location: Prescott Valley, AZ
Posts: 109
|
Last Post
I was merely addressing some issues that I read in the threads. Luis expressed expectations of significantly better RESOLUTION, which won't be there. So how is that "already covered"?
Juan, you keep saying that "it's not NTSC," and in virtually the same sentence you say you're "taking the signal right off the chips AT THE A2D SAMPLING RATE." The A2D sampling rate of 13.5 MHz *IS* THE NTSC-DEFINED SAMPLE RATE.

It would have been fun to be in on this project--I could even have helped, believe it or not. But it's not worth my time defending stuff that, because you guys don't understand it, you think I'M wrong! Pretty funny, really. I guess my 12 years as an electronics engineer and 35 years working with NTSC are just a fluke.

I'm sure you'll all be happy to know this is my last post. Good luck, Juan. It's a commendable project, and I wish you all the best. |
April 23rd, 2004, 05:35 PM | #384 |
Major Player
Join Date: Jul 2003
Posts: 479
|
Dan,
I'm sorry to see you go, and I really don't have control over what other people say on this board, so all I can do is reply for myself.

The sampling rate of the A2Ds is not 13.5 MHz. What else can I say? It's mathematical. I'm not doubting that you have tons of experience, but I think you are making huge assumptions about a camera whose schematic you haven't looked at. I could sit here and flash degrees at you all day, but that wouldn't matter if the math doesn't add up. Sorry, I'm just being honest.

All the best, Juan |
April 23rd, 2004, 08:36 PM | #385 |
Trustee
Join Date: Jan 2003
Location: Wilmington NC
Posts: 1,414
|
Juan, will the software you write take the R, G and B images, layer them into one 16-bit file and write that to disk? Or will we have to open all the RGB files and layer them on our timeline for editing?
|
April 23rd, 2004, 08:40 PM | #386 |
Major Player
Join Date: Jul 2003
Posts: 479
|
Obin,
My software right now takes a raw capture file and outputs raw 16-bit RGB frame files in a new directory, labeled with the frame time code. Before I upload the clip I will write the remaining code for the TIFF header in each frame file so they can be opened with any program. However, I am still unsure about my R, G, B frame shift values. That last frame I uploaded has my best guess so far as to how the RGB frames align.

Juan |
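For anyone who wants to experiment before the hand-written TIFF header is done, here is a rough sketch of that per-frame output step. It leans on the third-party tifffile library rather than Juan's own header code, and the plane names, directory and timecode formatting are placeholders.

[code]
import numpy as np
import tifffile  # third-party library; Juan is writing the header himself

# Sketch of writing one aligned 16-bit RGB frame, named by its time code.
def write_frame(r, g, b, timecode, out_dir="frames"):
    rgb = np.stack([r, g, b], axis=-1).astype(np.uint16)  # H x W x 3, 16-bit
    tifffile.imwrite(f"{out_dir}/frame_{timecode}.tif", rgb)
[/code]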
April 23rd, 2004, 11:22 PM | #387 |
Major Player
Join Date: Apr 2004
Location: Austin, Texas
Posts: 704
|
RGB SHIFT VALUES
Not that this matters at this stage of the testing - but do you anticipate that the RGB shift values you are using would be the same for every DVX100?
If not, that would mean you'd have to calculate the RGB shift values for each camera that was modified and adjust your program on a one-to-one basis.
April 23rd, 2004, 11:28 PM | #388 |
Major Player
Join Date: Jul 2003
Posts: 479
|
Very good question.
I expect the alignment might not work for all DVXs. This is why I will be including user-adjustable settings for it. If I do install this on a case-by-case basis, then I will take care of entering the correct values in the documentation or the software I include.

The reason I think it will be different for each camera is that there is an alignment procedure in the DVX service manual, which basically calibrates the on-board software. Since my mod bypasses all on-board software, this calibration has to be done at the time the layers are put together. My capture program will have all calibration options user-accessible, including the option to decimate footage in order to save space.

Juan |
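As a sketch of what "user-adjustable" alignment could look like when the three planes are merged (the (dy, dx) values here are hypothetical; in practice they would come from the per-camera calibration Juan describes):

[code]
import numpy as np

# Each channel gets its own (rows, cols) offset, adjustable per camera.
def merge_planes(r, g, b, shifts=((0, 0), (0, 1), (1, 0))):
    aligned = [np.roll(plane, shift, axis=(0, 1))
               for plane, shift in zip((r, g, b), shifts)]
    return np.stack(aligned, axis=-1)   # H x W x 3 full-colour frame
[/code]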
April 24th, 2004, 04:34 AM | #389 |
RED Code Chef
Join Date: Oct 2001
Location: Holland
Posts: 12,514
|
Okay, as some of you may have noticed, I've removed two posts and
edited some others. Please stay on-topic and civil to each other. The fact that we all come from different backgrounds and may think differently about certain things does not give us the right to call people names, insult them, etc. Such things will not be tolerated here. Just discuss the matter. Perhaps Dan knows some things we all don't, and vice versa. Civil discussion can only lead to better things! Thank you.
__________________
Rob Lohman, visuar@iname.com DV Info Wrangler & RED Code Chef Join the DV Challenge | Lady X Search DVinfo.net for quick answers | Buy from the best: DVinfo.net sponsors |
April 24th, 2004, 09:57 AM | #390 |
Obstreperous Rex
|
Thanks, Rob. By the way, I'd like all participants in this thread to know that I met Dan Vance personally at NAB. He is an exceptionally talented engineer and filmmaker. Thinking that we would all benefit from his input, I invited him into this particular forum, in which he expressed significant interest. Because of the regrettable actions of a few people, I am now in the position of asking Dan to overlook the way he was rudely treated by some of us and please give us another chance. We need to remember that no matter how much we think we know about something, someone else may come along who knows even more. It is my hope for the DV Info community project that we attract open-minded individuals who are willing to consider alternative viewpoints that don't always agree with our own. And now, I'm off to apologize to Mr. Vance on behalf of a couple of members here who unfortunately could not act beyond their own zeal.
Be sure to review the VanceCam 25p camcorder. While many of us discuss alternative imaging methods, Dan Vance has actually built a working homemade camcorder. He has a lot to offer. |