July 28th, 2006, 02:49 PM | #1 |
New Boot
Join Date: Jul 2006
Location: United States
Posts: 22
Inexpensive CMOS + 35mm adapter = 3CCD Cam + 35mm Adapter?
A question that kind of spans a couple of different areas:
Does using an inexpensive CMOS camera (e.g. the Sony A1U) with a 35mm adapter produce a nearly identical image to a 3CCD camera (e.g. the Sony Z1U) with the same adapter and lens? With a 35mm adapter you seem to take the onus of image acquisition off the camera itself and put it onto the 35mm lens and adapter, with the camcorder just recording what's on the ground glass. As such I would *presume* the need for a better camera drops way down, and a $2k Sony A1U (even with its smaller lens and weaker red saturation) would produce nearly the same image as the $4,500 Sony Z1U. If anyone has objective or subjective info on this assumption, I'd love to hear it. It just seems that if you're destined to go the adapter route, pairing it with a cheaper camera would nicely offset the price.
July 28th, 2006, 03:28 PM | #2 |
Trustee
Geez, I hope not... I would think that downgrading from a 3CCD camera to a cheaper single-chip CMOS one would inherit all the issues that come with it: poorer color reproduction, bleeding, soft edges, yuck. I don't really see how a 35mm adapter could "hide" those issues. Flat out, a cheaper camera produces worse images, with or without an adapter. Your thinking, as Spock would say, seems highly illogical.
July 28th, 2006, 05:13 PM | #3 |
Major Player
Join Date: Mar 2004
Location: Roanoke, VA
Posts: 796
I agree with Ben. Besides, the A1 shoots superb images anyway. Adding ANY imaging device in front of the stock lens potentially opens the door to degraded image quality.
__________________
Dave Perry Cinematographer LLC Director of Photography • Editor • Digital Film Production • 540.915.2752 • daveperry.net
July 28th, 2006, 06:28 PM | #4 |
Major Player
Join Date: Nov 2003
Location: Ashford, AL
Posts: 937
In the videos and stills I have seen from the GS400 with a Brevis adapter, there is very little I could argue with quality-wise. So that seems to debunk your worry about an inexpensive camcorder. If you have color reproduction issues, it is not the inexpensive 3CCD camcorder causing them.
July 29th, 2006, 06:36 PM | #5 |
Major Player
Join Date: Mar 2005
Location: Atwater, CA
Posts: 246
To the last two guys who posted above me:
You don't seem to understand what he meant. Basically, Dave, you repeated a statement Don Steele wrote in his post, yet you say it as if you're disagreeing, which is confusing. Perhaps you should look at his logic; it makes perfect sense. Color reproduction and other issues aside, you are ultimately limited by the quality of your ground glass. All he is saying is: suppose your ground glass is only type C quality, the A1U is type B, and the Z1U is type A. By adding a type C imaging device, your camera now becomes a type C resolution camera, so the Z1U and A1U should end up with around the same resolution, assuming each camera's resolving power is greater than that of the ground glass itself.

Ben, I don't know where you picked up your knowledge, but CMOS picks up color, picture, and video so much more beautifully than CCD, especially when it is a progressive chip, which sadly the HC3, HC1, and A1U are not. It also requires less power and doesn't have the light smear that CCDs produce. The only downfall is low-light performance. But what is so "yuck" about better-than-CCD performance? Look at Silicon Imaging's pictures. Oh yeah, totally yuck. Even the RED camera is CMOS, not CCD. Not a downgrade at all; an upgrade, if anything.

Last edited by Forrest Schultz; July 29th, 2006 at 08:32 PM.
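That weakest-link idea can be sketched as a toy model. The grades and line counts below are made-up illustrations, not measured figures for the Z1U, A1U, or any real ground glass; `min()` stands in for the real, messier cascade of optical losses:

```python
# Toy model of the "weakest link" argument: a camera + 35mm adapter
# chain can resolve no more detail than its worst component.
# All numbers here are hypothetical grades, not measurements.

def chain_resolution(*components_tv_lines):
    """Effective resolution of an optical chain, capped by its weakest link."""
    return min(components_tv_lines)

GROUND_GLASS = 550   # hypothetical "type C" ground glass
A1U_BLOCK    = 700   # hypothetical "type B" camera head
Z1U_BLOCK    = 800   # hypothetical "type A" camera head

with_a1u = chain_resolution(GROUND_GLASS, A1U_BLOCK)
with_z1u = chain_resolution(GROUND_GLASS, Z1U_BLOCK)

# Both chains are capped by the ground glass, so the cheaper camera
# records essentially the same detail -- the original poster's point.
print(with_a1u, with_z1u)  # 550 550
```

The sketch only holds while each camera out-resolves the ground glass; once the camera becomes the weakest link, the cheaper body shows.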
July 29th, 2006, 08:27 PM | #6 |
Major Player
Join Date: Mar 2004
Location: Roanoke, VA
Posts: 796
Quote:
I stand corrected. I overlooked the point that assuming a move from a 3-chipper to a single-chip CMOS camera is automatically a step down is gratuitous. Just because a camera has one sensor does not, in and of itself, mean it gives lower-quality images than any 3-chipper. At the time I bought my single-chip Optura Xi, it was the only camera, except maybe the GS400, that had a native 16:9 image. That's why I bought it. At that time I was used to shooting with a Sony TRV950 3-chipper, and I think the Optura has as good an image as the 950, or better. It certainly was, in my opinion, a better choice than the Canon GL2, which was one of the other cameras I was looking at. And yes, any glass you put in front of the camera will be the quality bottleneck. By the way, who is Guy Bruner?
__________________
Dave Perry Cinematographer LLC Director of Photography • Editor • Digital Film Production • 540.915.2752 • daveperry.net
July 29th, 2006, 08:34 PM | #7 |
Major Player
Join Date: Mar 2005
Location: Atwater, CA
Posts: 246
|
Sorry Dave, I meant Don Steele. I just didn't know if you had read the portion where he said the imaging device is the bottleneck, as you put it. Sorry to come off strong like that; I was just amazed that no one was seeing what Don was saying in his original post. No correction needed, I was just ranting. Thank you for being kind in your reply.
July 29th, 2006, 08:43 PM | #8 |
Regular Crew
Join Date: Mar 2006
Location: Seattle, WA
Posts: 158
|
Quote:
July 29th, 2006, 08:51 PM | #9 |
Major Player
Join Date: Mar 2004
Location: Roanoke, VA
Posts: 796
Quote:
I don't know Don Steele either, but it's nice to know we think alike :)
__________________
Dave Perry Cinematographer LLC Director of Photography • Editor • Digital Film Production • 540.915.2752 • daveperry.net
July 29th, 2006, 11:59 PM | #10 |
Trustee
Quote:
And not to be boisterous, but rather to emphasize the validity of my point: I have tried many focusing screen and camera combinations and know from experience that this is true. And see Mark's post about the whole CCD/CMOS thing. I had an HD10U for a while before selling it for an FX1. Guess which one the adapter footage looked better on?
__________________
BenWinter.com
July 30th, 2006, 01:13 AM | #11 |
Major Player
Join Date: Mar 2005
Location: Atwater, CA
Posts: 246
I agree with everything you said, Mack. I know the Z1 and FX1 are better than the HC1, HC3, and A1U, because I've seen comparisons; there is no doubt the Z1/FX1 look better. But they are still remarkably close despite the price difference, which I think is great. And if Sony weren't run by Stevie Wonder they could design a CMOS camera that would kick ass. But they aren't doing it, so Silicon Imaging is.

What I hate about Sony is that all they had to do was keep the CMOS sensor progressive (which natively, I think, most CMOS sensors are) and make 24fps, 30fps, and other combinations possible, and they would have had a rockin' camera. Yet they ruined the whole thing by making it an interlaced camera; I don't know what freakin' idiot designed that. Perhaps they should have thought to make it switchable. I mean, I know there are a lot of reasons to shoot 60i, but jeez, why go HD if you aren't even the least bit interested in 24p? Oh well.
July 30th, 2006, 01:19 AM | #12 |
Major Player
Join Date: Mar 2005
Location: Atwater, CA
Posts: 246
Quote:
Or the prosumer one with 3 CCDs (the Sony). Perhaps you were trying to make a point and mistakenly thought the HD10U had a CMOS sensor, which sadly it does not. It is a CCD, which goes further to prove that CCD is just not that great. I find CMOS extremely appealing to the eye and very filmic; perhaps not the A1U/HC1/HC3, but even Science Vision cameras and such are extremely filmic when they shoot progressive.
July 30th, 2006, 01:21 AM | #13 |
Trustee
Sony refuses to go progressive because interlaced is basically their baby. If they go progressive, they risk admitting to a certain degree that they were wrong about interlaced being the way to go and down goes their precious ego. And you know how sensitive Japanese pride is.
And yes, the HD10U is CCD, but I'm not trying to prove that. I'm referring to your notion of "A" versus "C" grade cameras affecting the quality of the image coming from an adapter. Which, what a surprise, it does.
__________________
BenWinter.com
July 30th, 2006, 01:22 AM | #14 |
Major Player
Join Date: Mar 2005
Location: Atwater, CA
Posts: 246
"Sony refuses to go progressive because interlaced is basically their baby. If they go progressive, they risk admitting to a certain degree that they were wrong about interlaced being the way to go and down goes their precious ego. And you know how sensitive Japanese pride is."
Can't agree with you more there, Ben.
July 30th, 2006, 01:53 AM | #15 |
Inner Circle
Join Date: Dec 2003
Location: PERTH. W.A. AUSTRALIA.
Posts: 4,477
I wouldn't get into too much of a lather about 720 progressive versus 1080i. HDV capture and transcoding software continues to evolve, and if the progressive look is needed, acceptable options are there now.
Every action has an equal and opposite reaction. As I understand things, interlaced acquisition places less demand on a digital processing system and equates to less power being required, and therefore the convenience of fewer or smaller batteries for a given working life. Others here, please contradict my comment if it is wrong.

The JVC HD100 apparently has to split the imaging task across two processing chains due to a destructive overheating state if a single chain is used. Heat (= power used) remains an issue, to the point where there is a recommendation that the camera not be operated on its side because of excessive heat buildup possibly damaging the tape. There is no doubt, however, about the pleasing look of the JVC HD100 image.

The end result is all about compromise. Different design teams have come up with their own solutions to get the job done to our satisfaction. The marketplace will be the final determinant, and there might well be profitable room for all solutions to co-exist.

As for 35mm ground-glass based optical relay systems, 850 TV lines of resolution seems to be about the practical limit, so the argument of 1080i versus 720P I guess may be largely academic.
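The arithmetic behind that last point can be sketched. This is a crude model: the 850-line ceiling is the estimate quoted above, and `min()` stands in for the real MTF cascade of lens, ground glass, and relay optics:

```python
# If a ground-glass relay tops out around 850 TV lines, the 1080-line
# format is adapter-limited rather than format-limited. min() is only
# a rough stand-in for how resolution losses actually combine.

GG_LIMIT = 850  # approximate ceiling of a 35mm ground-glass relay (per the post above)

def delivered_lines(format_lines, gg_limit=GG_LIMIT):
    """Vertical resolution actually delivered through the adapter chain."""
    return min(format_lines, gg_limit)

print(delivered_lines(1080))  # 850: 1080 capped by the ground glass
print(delivered_lines(720))   # 720: already below the ground-glass ceiling
```

Under these assumptions the gap between the formats shrinks from 360 lines to 130, which is roughly what "largely academic" means here, though 720p still comes through uncapped.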