December 29th, 2009, 10:04 PM | #1 |
Regular Crew
Join Date: Feb 2007
Location: Ontario Canada
Posts: 113
Is CRI really dead?
I found the following excerpt on a lighting manufacturer's website. I consider the statements about CRI not being very important to be complete bunk, but it does raise a question: are the new cameras so good that bad lighting (colour-wise) doesn't make as much of a difference as it used to? There is much more to a high CRI rating than just colour temperature. I feel that the better the light quality, the better the image, and that being able to match light sources against a given standard is important.
Interested to see what others think about the importance of CRI.

Start quote:

"The CRI rating of light sources is an old, out-of-date standard developed by the International Commission on Illumination (CIE). It is a relative index which compares the relative color reproduction ability of one light source to another for a standardized FILM medium. The maximum value of 100 is given to tungsten light sources, therefore all other light sources including sunlight will have values less than 100. This was an important reference for film photography because tungsten balanced film rendered perfectly in tungsten light and color shifted for almost all other light sources that have lower CRI values.

This outdated light valuing method is mostly irrelevant to digital photography because all digital cameras include compensation for color temperature and are more accurate at 5500-6000K than the 3200K color temperature of tungsten light. In reality tungsten light does not render very well as it has an abundance of red and infrared (heat) and is relatively weak in the blue part of the spectrum. In reality, light sources that are closer to 5500K (like the XXXXX Lites) will render color better in a digital camera than tungsten light.

Conclusion: We include CRI ratings in our product descriptions because many customers request it as it is the only reference commonly used. But in terms of using CRI to determine the rendering ability of a light source, CRI is useless. In reality we are anxiously waiting for the CIE to develop a better index that addresses the many light generation technologies present today that are far better than tungsten sources."

End quote
December 29th, 2009, 11:41 PM | #2 |
Inner Circle
Hi Chris.............
Gonna try not to dig myself a big hole on a subject I know little about, but this has me a bit stumped:
"There is so much more to a high CRI rating than just color temperature." Er, what, exactly? If a CRI rating is measured entirely against tungsten, with its preponderance of red and absence of blues, how can it have any relevance whatsoever to what renders best in digital imaging systems that don't use silver as the imager?

From my reading of your quoted article, the whole basis of CRI related to FILM, not digital, and they go on to say that the standard is basically out of date with current technology. I won't argue with that, 'cos I can't. It is (out of date for digital imaging). Good as the new digital cameras (still or video) are, I've yet to find one that will render colour correctly (no matter how anally its WB has been set) with pure tungsten lighting. Whether the "XXXXX Lites" from manufacturer "X" are any better is another question entirely.

The CRI system was thrashed out between film manufacturers and lighting manufacturers over many years and served the industry well for a very long time indeed. It's now "all change" on the camera front and there needs to be the same in lighting as well. That's my take, for what little it's worth.

CS
December 29th, 2009, 11:55 PM | #3 |
Major Player
Join Date: Jun 2006
Location: Shenzhen, China
Posts: 781
CRI wasn't thought up for film or media production use at all. It's a rating intended for all lighting manufacturers to use. And it's not really dead until another system comes along to replace it and everyone accepts it.
Real film people think it's important because they have no way to filter out the extra green found in lower-CRI lighting other than minus green. A lot of people brought up under film people therefore think it's important, even though it may not be a concern today with so many using digital mediums.

Video/digital media use has made it super convenient: many times the custom white balance on a camera can take care of any infractions in the green/magenta balance, getting it back into order. Whether you add more magenta inside a bulb in the form of extra phosphors, add magenta outside in the form of a minus-green filter, or compensate in the white balance, they are all pretty much accomplishing the same thing for digital use. Some kind of filtering is coming into play one way or another. For film people, the first two are their only choices, since film and film cameras have no native filtering ability built in.
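If it helps to see the "gel vs. white balance" equivalence concretely, here's a quick Python toy. The numbers are invented for illustration (a hypothetical source with 25% excess green), not real photometry; the point is just that a minus-green gel on the light and a lower green gain in the camera are the same multiplication:

```python
# Toy sketch (invented numbers, not real photometry): a light with
# excess green can be corrected at the source (minus-green gel) or in
# camera (lower green channel gain). Multiplicatively they're the same.

def apply_gel(light_rgb, gel_rgb):
    """Filter the light itself: per-channel transmission."""
    return [l * g for l, g in zip(light_rgb, gel_rgb)]

def apply_white_balance(sensor_rgb, gains):
    """Correct in camera: per-channel gain on the captured signal."""
    return [s * g for s, g in zip(sensor_rgb, gains)]

# Hypothetical source with a 25% green excess relative to neutral:
light = [1.0, 1.25, 1.0]

# Option 1: minus-green gel with 1/1.25 green transmission.
gelled = apply_gel(light, [1.0, 1 / 1.25, 1.0])

# Option 2: leave the light alone, pull green down in white balance.
balanced = apply_white_balance(light, [1.0, 1 / 1.25, 1.0])

print(gelled)    # both come out neutral
print(balanced)
```

For digital capture the two routes land in the same place, which is Richard's point; for film, only the first route exists.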
December 30th, 2009, 10:57 AM | #4 |
Regular Crew
Join Date: Feb 2007
Location: Ontario Canada
Posts: 113
I too am not an expert, just interested in finding the best colour quality and knowing how to recognize it when I find it.
CRI is more than just a colour temperature rating; it's a rating of how well a given light source renders colours to the eye, or to the camera in this case. It considers the emission of full-spectrum light, free of nasty spikes, and the smoothness of the spectral curve across all of the wavelengths present in a light source. The fuller and smoother the curve, the "nicer" the light looks to the eye and the camera.

I think that with digital cameras, as soon as you put up a white card and carry out a white balance procedure on your camcorder, you are immediately altering the red, green and blue values of the camera's three CCD chips (not sure how this works with CMOS), and this can result in inferior skin tones and other colour inaccuracies. Digital cameras' internal white balances are biased towards corrections on the green/magenta axis, but the chips see in RGB; to my little mind it seems the digital white balance is leaving out some of the spectrum from its correction, which makes for less-than-perfect skin tones.

True, the tungsten standard has colour peaks, but they are smooth and full spectrum. It may not be the colour you're looking for, but it is a standard. With modern cameras, a click here and a quick gel there and great-looking video can be recorded even with the limitations of digital white balance; film guys aren't so lucky. With higher-CRI light sources, less gelling will be needed, and thus light powers can be lower, or fewer fixtures used to get the same light level. Oh well, just more material for the water cooler debates.
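For what it's worth, here's a minimal Python sketch of what "altering the red, green and blue values" amounts to in the simplest model. This is an assumption on my part: a bare per-channel "white patch" gain model; real cameras also apply colour matrixing and other processing on top of this.

```python
# Minimal sketch of a custom white balance (assumed simple model:
# per-channel gains only; real cameras do more than this).

def white_balance_gains(white_card_rgb):
    """Gains that map the measured white card to neutral."""
    r, g, b = white_card_rgb
    return (g / r, 1.0, g / b)  # normalize red and blue to green

def correct(pixel, gains):
    return tuple(c * k for c, k in zip(pixel, gains))

# White card as seen under a hypothetical greenish source:
card = (0.8, 1.0, 0.7)
gains = white_balance_gains(card)

# The card itself comes out neutral...
print(correct(card, gains))

# ...but every other pixel gets the same global gains, which is why
# individual colours such as skin tones can still land slightly off.
skin = (0.9, 0.7, 0.6)
print(correct(skin, gains))
```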
December 30th, 2009, 04:40 PM | #5 |
Major Player
Join Date: Jun 2006
Location: Shenzhen, China
Posts: 781
By the way, CRI and color temperature are unrelated. You have two measuring axes in light: red/blue and green/magenta. Color temperature is a rating of the particular shade of white light along the red/blue axis, with red on one end (3200K for instance, though of course it goes much lower) and blue on the other (5600K for instance, and beyond). CRI can indirectly be related to the green/magenta axis: the more out of balance a source is on that axis (toward either a green or a magenta bias), the lower the CRI; the closer to the middle of the scale, the closer to 100 you come. You can easily have a 5600K light that's a bit green (lower CRI), and you can also have a 5600K light that is in balance (high CRI). They're both still 5600K, though.
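The two-axis point can be sketched in a few lines of Python using McCamy's published approximation, which estimates correlated color temperature from CIE 1931 (x, y) chromaticity. The chromaticity values below are just illustrative picks (a D65-ish white and a greener neighbour), not measurements of any real fixture:

```python
# McCamy's approximation: correlated color temperature (CCT) in kelvin
# from CIE 1931 xy chromaticity. Illustrative only; real tint (Duv)
# analysis is more involved.

def mccamy_cct(x, y):
    """Approximate CCT in kelvin (McCamy's cubic formula)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# A D65-ish white point, roughly 6500 K:
print(round(mccamy_cct(0.3127, 0.3290)))

# Push y upward (greener) at the same x: the CCT number barely moves,
# but the green/magenta tint -- and the rendering -- has changed.
print(round(mccamy_cct(0.3127, 0.3390)))
```

So two sources can report essentially the same kelvin number while sitting on opposite sides of the green/magenta axis, which is exactly the "both still 5600K" situation.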
On the subject of full spectrum: there are really only three full-spectrum sources I know of: tungsten, daylight and carbon arc. Anything else that claims to be full spectrum is more of a simulation of it. If you actually look at the readout from an integrating sphere measuring a particular light, you'll find it to be pretty spiky in red, green and blue, with the rest of the spectrum much shallower.

Last edited by Richard Andrewski; December 30th, 2009 at 05:13 PM.
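The reason tungsten qualifies as genuinely full spectrum is that a hot filament is close to a Planck (blackbody) radiator, whose output is a smooth curve with no spikes at all. A quick Python sketch of Planck's law shows the shape for a 3200 K filament: a smooth ramp, strong in red and weak in blue, as discussed earlier in the thread.

```python
import math

# Planck's law: spectral radiance of a blackbody at wavelength lam (m)
# and temperature T (K). A tungsten filament approximates this curve.

H = 6.626e-34   # Planck constant (J*s)
C = 2.998e8     # speed of light (m/s)
K = 1.381e-23   # Boltzmann constant (J/K)

def planck(lam, T):
    """Spectral radiance in W / (sr * m^3)."""
    return (2 * H * C**2 / lam**5) / (math.exp(H * C / (lam * K * T)) - 1)

# Sample 3200 K across the visible band: a smooth, spike-free rise
# toward the red end (the peak is actually out in the infrared).
for nm in (400, 500, 600, 700):
    print(nm, planck(nm * 1e-9, 3200))
```

A spiky discharge or LED source has no such smooth curve, which is why the integrating-sphere readouts Richard mentions look so different.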
December 30th, 2009, 06:05 PM | #6 |
Regular Crew
Join Date: Feb 2007
Location: Ontario Canada
Posts: 113
Ahh, the ol' integrating sphere, not your standard meter in everyone's bag of tools. Thanks for the input. Still anticipating the arrival of the Kenko 3100 colour meter you recommended, Richard; gonna see what trouble I can bring to the next shoot for the grips.
December 30th, 2009, 09:15 PM | #7 |
Trustee
Join Date: Nov 2005
Location: Sydney Australia
Posts: 1,570
I think there's a lot of truth in the statements, but the conclusions drawn are the opposite of what they should be.
With the narrow emission spectra of some modern light sources, we should be looking at the spectral emission curves themselves rather than relying on averaged numbers.
December 31st, 2009, 09:15 AM | #8 |
Major Player
Join Date: Jun 2006
Location: Shenzhen, China
Posts: 781
That's true, but many wouldn't really understand what it's telling them. In this day and age people want a quick way to sum things up, something easy to understand, and unfortunately CRI is the quickest way to communicate this until someone comes up with a better way to summarize spectral output.
December 31st, 2009, 10:25 AM | #9 |
Regular Crew
Join Date: Aug 2006
Location: Ithaca, NY
Posts: 122
Can you (or someone else) go into a little more detail about the "many times" part? Is it that some cameras custom white balance better than others? And/or are there circumstances in which a custom white balance works better than in others? Thanks! -JP
December 31st, 2009, 01:20 PM | #10 |
Trustee
Join Date: Jan 2004
Location: Scottsdale, AZ 85260
Posts: 1,538
Jonathan,
One excellent example is shooting in mixed lighting situations. Imagine a set where you have a reception area with daylight admitted through windows; then, beyond the reception counter, an inner workspace lit by fluorescent fixtures. Your camera takes a "white balance" on the entire scene which, due to the mixed light, is less than satisfactory. No matter HOW sophisticated the camera's internal white balance circuitry, you have a problem, because no camera that I'm aware of can set some of the CCD pixels to one WB and others to another.

So you're faced with telling your camera what YOU want. Do you want to skew the balance to take the green spike out of the inner office, at the risk of making the reception area and its people look too "ruddy"? Or would you prefer to balance on the reception area people and let the inner office people look sickly green? Your choice.

If I can add one other note to the discussion: one reason that tungsten, for all its inefficiency in converting electricity into light, stays with us is that the technology is so well understood. Pop online and you can find tungsten-to-daylight conversions anywhere. Tungsten is a standard.

Daylight, on the other hand, is pretty arbitrary. What kind of daylight? Daylight at 12 noon? Daylight at 5pm? 5pm where? Northern or southern hemisphere? Is it daylight through clouds? What kind of clouds? Silky stratus or threatening cumulus? Is it daylight reflecting off the white sands of New Mexico? Or bouncing off the blue walls of the restaurant next door? With sunlight, you're constrained to using it as it falls on your scene. Adapting or filtering it is typically a hassle, often involving giant silks, sandbags and large crews. Tungsten is a snap by comparison.

BTW, Richard, maybe you should look into a line of carbon arc fixtures! I kinda miss the smoke and heat of my days as a Strong Super Trouper follow spot operator back in high school!
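The single-white-balance problem is easy to see in a toy Python example. The "daylight" and "fluorescent" whites below are invented numbers purely for illustration: one set of global gains neutralizes the window side of the scene, and the office side stays green no matter what.

```python
# Toy illustration of the mixed-light problem: global white-balance
# gains can neutralize one source or the other, never both at once.
# All values are invented for illustration.

def balance(pixel, gains):
    return tuple(round(c * k, 3) for c, k in zip(pixel, gains))

daylight_white = (0.95, 1.00, 1.05)   # window side, slightly blue
fluoro_white   = (0.90, 1.10, 0.95)   # office side, slightly green

# Gains chosen to neutralize the daylight side:
gains = tuple(1.0 / c for c in daylight_white)

print(balance(daylight_white, gains))  # neutral
print(balance(fluoro_white, gains))    # green channel still dominant
```

Pick gains for the fluorescent side instead and the situation simply reverses, which is Bill's "your choice" in miniature.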
December 31st, 2009, 01:35 PM | #11 |
Major Player
Join Date: Jun 2006
Location: Shenzhen, China
Posts: 781
Hehe, I ran one of those in college. Terrible is the best way to sum it up from an operator's viewpoint. HMI was someone's attempt to redo it better, though not completely full spectrum like a real carbon arc. So now almost all of those spots are retired, and in their place is either a spot with an HMI bulb or a xenon, which is incredibly powerful but not necessarily all that efficient.
One thing I was curious about recently, sort of O.T. from the original question: what was the color temperature of the carbon arcs used in filmmaking? In other words, what type of color temp film did they use when filming with predominantly carbon arcs? My guess is daylight, but I was thinking that carbon arc was actually not quite in the 5600K range.

Jonathan, Bill said it. Mixed light situations confuse the camera's white balance logic. Mixing color temps, or simply mixing different kinds of light even within the same color temperature, can be an issue, or it may be no issue at all. Testing will tell you.
December 31st, 2009, 02:54 PM | #12 |
Regular Crew
Join Date: Feb 2007
Location: Ontario Canada
Posts: 113
Another reason that using manual white balance is better is that many cameras will only auto white balance within a certain range. For example, the Canon auto white balance circuit has a lower limit of 3000K, and many residential interiors lit by practicals are below that limit, so AWB goes wacky. It's the Star Trek Ferengi yellow-skin syndrome :)
December 31st, 2009, 06:02 PM | #13 |
Inner Circle
Join Date: Feb 2004
Location: switzerland
Posts: 2,133
But as said above, CRI and color balance are different things.

Good CRI is a guarantee that if you shoot a color chart, all colors will render more or less as they should relative to each other. With bad CRI, some colors will be OK (usually blue, green, red) and some will be totally messed up (usually yellow, brown, purple). So you can adjust white balance to any value you want (and god knows that NO commercial movie is released with its original colors); if some colors are OK and some are wrong, it is heavy work to get them back in order.
December 31st, 2009, 07:03 PM | #14 |
Major Player
Join Date: Jun 2006
Location: Shenzhen, China
Posts: 781
As far as I can tell preset white balances like 3200K or 5600K don't necessarily have any green/magenta correction ability in them. You have to go custom to get that logic to kick in.
January 1st, 2010, 10:21 AM | #15 |
Regular Crew
Join Date: Aug 2006
Location: Ithaca, NY
Posts: 122
Ok, I get it about mixed light situations. So where the light is homogeneous, it makes more sense to do a custom white balance. If you've got an LED source with a green spike, it seems a lot easier to just use the WB than to gel the source with a minus green, besides which you lose some light with the gel. I guess there are situations where you just can't do a custom white balance... anyway, I just didn't realize that the custom white balance could do just as good a job. Really it should do a better job, since you're often guessing when you use a gel...

That said, it seems safer to gel the source if you're in a mixed light situation... Does this make sense?