April 19th, 2009, 11:36 AM | #1 |
Regular Crew
Join Date: Jul 2007
Location: Cph Denmark
Posts: 136
Proper LCD calibration for video colorgrading
I need advice on the right way to set the color temperature on my edit suite monitors.
I bought myself an X-Rite i1Display2 calibration tool to finally set my monitors straight. After running the calibration, gamma and contrast look fine, but the grayscale tones don't look neutral gray anymore. White is white and black is black, but the grays have a red tint to them no matter how I set the white point. I read somewhere that the standard for video editing calibration is 6500 K, but that looks very red to my eyes. I've done some testing to confirm that the grays are indeed consistent now, holding at 6500 K throughout the scale, even with the red tint. Without calibration, the color temperature ramped up linearly with the grayscale's brightness: the bright gray tones started at 6500 K and the dark gray tones ended at 9600 K (maybe it's just me, but that looked kind of more "neutral" that way).

Furthermore, I'm concerned that nearly all home displays, laptops in particular, seem to follow the same pattern (bright grays have a lower Kelvin value than dark grays). Since I mostly make videos for web viewing, my correctly calibrated material will end up bluish on the average screen. Any thoughts? I'd like replies both from those who edit for TV and those who edit for the web. Are there any differences? I can see that professionally graded movies look great on any screen, calibrated or not. So I guess the question is: what's the trick? How do they keep the color nearly the same on all consumer screens?

Here's a link to a music video I've done: http://www.youtube.com/watch?v=T16umXG5lOU The piece was graded before I got the calibration tool. In my edit bay it looked OK, but now, after calibrating my monitors, it looks too orange, too dark, and a bit oversaturated. At the same time, whenever I see it on some of my friends' laptops, it looks very pale, dull, and too bright.

Now look at this little piece of genius at work: http://www.youtube.com/watch?v=auF9ONmSUlQ I'd eat my leg to figure out how they graded it to be so consistent. It looks fantastic on every monitor I've seen, without much noticeable difference.
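For reference, here is a rough sketch of how a probe turns a measured gray patch into a Kelvin reading like the 6500 K / 9600 K figures above. The RGB triplets are hypothetical readings, not real measurements; the conversion uses the standard sRGB-to-XYZ matrix and McCamy's CCT approximation, which is only one of several ways a real calibration package might do it.

```python
# Sketch: estimate the correlated color temperature (CCT) of a gray
# patch from a hypothetical sRGB reading. Uses the sRGB (D65) matrix
# and McCamy's cubic approximation; illustrative, not probe-accurate.

def srgb_to_linear(c):
    c /= 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def cct_from_rgb(r, g, b):
    rl, gl, bl = map(srgb_to_linear, (r, g, b))
    # sRGB (D65) to CIE XYZ
    X = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    Y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    Z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    x = X / (X + Y + Z)
    y = Y / (X + Y + Z)
    # McCamy's approximation from chromaticity (x, y) to CCT
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# A perfectly neutral sRGB gray should land near 6500 K:
print(round(cct_from_rgb(128, 128, 128)))
# A slightly red-tinted gray reads warmer, i.e. a lower Kelvin value:
print(round(cct_from_rgb(134, 126, 124)))
```

This is also why a gray ramp whose dark end drifts toward 9600 K looks "cool" in the shadows: higher CCT means more blue relative to red.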
April 19th, 2009, 12:54 PM | #2 |
This seems to be a topic that keeps coming up. The belief that color timing a monitor can be done with one of these hardware probes is urban legend, plain and simple. The fact is that devices like the Spyder, Gretag Macbeth, etc., are designed to balance screen displays for PRINTING photographs. The color maps these devices generate are designed to represent, on screen, inkjet printer colors.
The best procedure for color timing a monitor is to go through the process here: HD CINEMA: HD MONITOR CALIBRATION (aka how to calibrate with ARIB bars). If you don't do inkjet printing of photographs, throw that Spyder away, or at least put it in a drawer somewhere where you're not tempted to use it.
April 19th, 2009, 08:36 PM | #3 |
Regular Crew
Join Date: Jul 2007
Location: Cph Denmark
Posts: 136
|
So you're saying that I'm better off simply using an sRGB or AdobeRGB profile, which varies the color temperature of the grayscale on my screens?
That would actually make me happy if confirmed. However, I'm still no closer to an answer on how professional graders keep the colors in place. I know the main reason must be EXPERIENCE, but there has to be something in their system that helps them along the way. If it's not a well-calibrated monitor, then what?
April 25th, 2009, 04:45 AM | #4 |
Inner Circle
Join Date: Oct 2001
Location: Honolulu, HI
Posts: 2,054
|
Take a look at the FSI monitors. Pricey, but supposed to be very accurate.
They come in two grades, but I suspect that Grade 2 would still be miles ahead of what most of us currently use. Calibration equipment is available for these monitors. Flanders Scientific, Inc. - Top Quality Broadcast & Post Production Equipment.
__________________
Dean Sensui Exec Producer, Hawaii Goes Fishing |
April 27th, 2009, 11:59 AM | #5 |
Regular Crew
Join Date: May 2007
Location: Penang, Malaysia
Posts: 123
Nik,
Bill's pretty much right. Many people, including myself, have gone down this same path of trying to "cal" our computer monitors for video, only to be left wondering why it doesn't look like it should. There are many problems, but the most basic one is that computer LCDs don't use the same color primaries (sRGB, etc.) that video LCDs do (Rec.709), and for the most part the primaries cannot be adjusted enough to line up correctly. The new HP DreamColor might be an exception here.

Bill is almost correct that Spyders et al. are designed for print. Actually, it is the software that drives them that is designed for print. A Spyder et al. is just a colorimeter, and with the right software it can be made to provide fairly decent results for calibrating a TV. Google "HCFR".

The "right way" is to get a broadcast monitor. The poor man's way is to use an HDTV and calibrate it with a Spyder.

Good luck,
Mark
April 27th, 2009, 02:05 PM | #6 |
Mark...
Wow, this HCFR software is awesome. Problem is, I can't figure out how to use it properly. The help file is translated from French, so it's a bit obtuse. Can you recommend a more comprehensive help file?
April 27th, 2009, 02:40 PM | #7 | |
Inner Circle
Join Date: Dec 2004
Location: Arlington, TX
Posts: 2,231
|
Quote:
This myth the television makers have pushed, that HDTVs are interchangeable with computer monitors, is just wrong. One thing I have noticed is that I am getting a difference in gamma with HD material. When I produce something for DVD or Blu-ray, I have to bump up the gamma for internet viewing, as the image will look a bit underexposed when I put it on the web. My editing screen is calibrated (with bars) and it looks great on my plasma. I have not found the trick yet for one file that works on all sources.
April 27th, 2009, 03:03 PM | #8 |
Regular Crew
Join Date: Jul 2007
Location: Cph Denmark
Posts: 136
|
I see. So should I rather use my HDMI cable instead of the DVI? I have one of those hybrid Samsung screens that has both TV/video (HDMI/composite) connectors and computer (DVI/D-Sub) connectors.

In fact, I was puzzled about the Rec.709 thing during the last project I did, when I realized that Premiere Pro shows the image in Rec.601 no matter whether it's SD or HD. But the single most annoying thing is the final render, which looks very different on every other screen I watch it on. On some it's way oversaturated and crushed; on others it's almost colorless and milky. I don't see that huge a difference when watching other people's work (such as Hollywood movies).

One additional question: is it possible to make the levels stay fixed at 16-235? With the MP4 H.264 decoder from K-Lite, you can set it to play back with either 0-255 or 16-235 (the default is 0-255), but other H.264 decoders have 16-235 as the default (YouTube uses one of them). I would assume that if I set the output levels to 16-235 values in my NLE, the decoder should render the same black and white levels in both settings. However, that's not the case: 16-235 is always more contrasty. Can that be fixed?
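For what it's worth, the contrast difference between the two decoder settings falls out of the level math itself. A decoder set to 16-235 stretches studio-range code values out to full range before display, while the 0-255 setting passes them through untouched. A minimal sketch, with function names of my own:

```python
# Sketch: why the "16-235" decoder setting looks more contrasty.
# It expands studio-range Y (16-235) onto full range (0-255):
# 16 becomes true black (0) and 235 becomes peak white (255).

def expand_studio_to_full(y):
    y = max(16, min(235, y))            # clip to the legal studio range
    return round((y - 16) * 255 / 219)

print(expand_studio_to_full(16))    # studio black -> 0
print(expand_studio_to_full(235))   # studio white -> 255
print(expand_studio_to_full(126))   # mid-gray lands near 128
```

A decoder left at "0-255" skips this step entirely, so code value 16 is displayed as a dark gray rather than black, which matches what you are seeing.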
April 27th, 2009, 06:22 PM | #9 |
Regular Crew
Join Date: May 2007
Location: Penang, Malaysia
Posts: 123
|
Nik, using a computer video card to drive an LCD (TV or display) is bound to give you grief when it comes to color.

What you really need is a Blackmagic, AJA, or Matrox type of interface to drive a broadcast monitor or HDTV. There's the whole RGB -> YUV and 0-255 vs 16-235 thing that these boxes should be able to handle. There are plenty of good threads here to search through on this subject. I'd start with the calibration thread: http://www.dvinfo.net/conf/sdtv-hdtv...-monitors.html

Bill, I've recently moved and can't seem to find the how-to I use, and I don't have the time to write up my own at the moment. But I know that avsforum is where I got it. It's a pain to search through, but there's lots of good info there.

Mark
April 27th, 2009, 07:07 PM | #10 |
Inner Circle
Join Date: Dec 2004
Location: Arlington, TX
Posts: 2,231
|
Nik, Mark touched on another important factor.
You need a pro video output card, with proper output signals for a broadcast monitor. Without this type of card, your video output will not be true enough for precise color correction. It is not cheap to get it right!
April 27th, 2009, 07:53 PM | #11 | |
Regular Crew
Join Date: Jul 2007
Location: Cph Denmark
Posts: 136
|
Quote:
But my question regarding the 0-255 vs 16-235 issue was not about how to display one or the other during my edit, but rather how to make sure that all decoders (client players) show the same values. To make it clearer: would it be possible to set the levels in the edit bay in such a way that a decoder (client player) which defaults to 0-255 shows the same black and white levels as if it were set to 16-235? My understanding in this area is limited, but my logic dictates that if there is no information below 16 or above 235, then there should be no visible difference. If 16 is black, why does it become gray when the 0-255 range is set in the decoder? I just want to streamline my project to look the same no matter who is viewing it and how.

On the other hand, I can see that every single video file (even feature films) decoded with ffdshow shows a visible difference in the levels when switching between 0-255 and 16-235. So I guess it's due to some mechanism in that decoder which is "lifting" the black value above 0. That's annoying.
April 27th, 2009, 11:10 PM | #12 |
Regular Crew
Join Date: May 2007
Location: Penang, Malaysia
Posts: 123
|
Hmmm... I'm not smart enough to answer your question. Let's hope someone who is chimes in.
I know what you mean, though: you've got four possible outcomes if you mix and match 0-255 and 16-235 encode and decode. I guess you should pick whichever matches the majority of your viewers. In other words: 0-255 for web and 16-235 for DVD and/or broadcast, possibly Blu-ray too, but I'm not there yet. Mark
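The four mix-and-match outcomes can be sketched numerically. Assuming simple linear scaling with clipping (function names are mine, not from any particular decoder), the two matched combinations are a no-op, and the two mismatched ones produce exactly the "crushed" and "milky" looks described earlier in the thread:

```python
# Sketch of the encode/decode level combinations, assuming simple
# linear scaling with clipping (hypothetical helpers, not any real
# decoder's code).

def to_studio(v):
    # full-range 0-255 -> studio-range 16-235
    return round(16 + v * 219 / 255)

def expand(v):
    # studio-range 16-235 -> full-range 0-255, clipped like a decoder would
    return min(255, max(0, round((v - 16) * 255 / 219)))

samples = [0, 8, 128, 247, 255]  # black, shadow, mid-gray, highlight, white

# Matched cases (full encode + pass-through, studio encode + expand)
# reproduce the samples unchanged. The mismatches do not:
# Mismatch 1: full-range file, decoder expands anyway -> shadow and
# highlight detail is crushed/clipped (the contrasty look):
print([expand(v) for v in samples])     # [0, 0, 130, 255, 255]
# Mismatch 2: studio-range file, decoder passes through -> black lifts
# to 16 and white drops to 235 (the pale, milky look):
print([to_studio(v) for v in samples])  # [16, 23, 126, 228, 235]
```

Note how the shadow value 8 and highlight value 247 are destroyed entirely in the first mismatch, while the second compresses everything toward gray.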
April 28th, 2009, 08:28 AM | #13 | |
Regular Crew
Join Date: Jul 2007
Location: Cph Denmark
Posts: 136
|
Quote:
April 28th, 2009, 04:07 PM | #14 |
Regular Crew
Join Date: May 2007
Location: Penang, Malaysia
Posts: 123
|
The difference between these cards and an Nvidia or ATI card is that they are designed to output video to match video standards, i.e. 1080i/p, 720p, 480i/p, NTSC/PAL, using video signaling formats (usually YUV) and video color spaces (Rec.601, Rec.709, etc.). Nvidia and ATI cards, by contrast, are designed to output video to match computer standards, i.e. 1920x1200, 1680x1050 frame sizes, etc., using computer signaling (usually RGB) and computer color spaces (sRGB, AdobeRGB, etc.).

They all have the means to produce the proper signals but vary widely in features and price. I have no idea which, if any, of them work better (or worse) with PP. I would suggest that you research all the cards yourself; it will give you a reasonably good education.

For what it's worth, I'm in the "poor man's" stratum, so I have a BM Intensity (the Pro wasn't out yet when I got this card). I use it to drive a Samsung HDTV that I calibrate using an i1. The results are fairly good, but I know it's not perfect, and so far I haven't had the color issues you describe with this setup. When I have the coinage I'd like to get a broadcast monitor, and maybe an HDLink.
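To illustrate why the Rec.601/Rec.709 distinction matters (e.g. Premiere showing HD material through the SD matrix, as Nik mentioned above): encoding Y'CbCr with one set of luma coefficients and decoding with the other visibly shifts colors. A sketch with illustrative values on normalized [0, 1] RGB, ignoring quantization and range scaling:

```python
# Sketch: color shift from decoding Y'CbCr with the wrong matrix.
# Only the published luma coefficients (kr, kb) differ between the
# two standards; everything else here is generic matrix math.

def rgb_to_ycbcr(r, g, b, coeffs):
    kr, kb = coeffs
    kg = 1 - kr - kb
    y = kr * r + kg * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, coeffs):
    kr, kb = coeffs
    kg = 1 - kr - kb
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

REC601 = (0.299, 0.114)    # SD luma coefficients
REC709 = (0.2126, 0.0722)  # HD luma coefficients

# Encode a saturated red with the HD matrix, decode with the SD one:
enc = rgb_to_ycbcr(0.8, 0.1, 0.1, REC709)
print([round(c, 3) for c in ycbcr_to_rgb(*enc, REC601)])
```

The round trip with matching matrices returns the input exactly; with mismatched matrices the red and green channels drift noticeably, which is the kind of saturation/hue shift you see when a player guesses the wrong matrix.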
April 28th, 2009, 05:37 PM | #15 |
Regular Crew
Join Date: Jul 2007
Location: Cph Denmark
Posts: 136
|
Thanks a bunch. I've been reading about BM all day and got somewhat smarter along the ride.
As I understand it, the Intensity card only functions as a video feed from your NLE, so it puzzles me how you could calibrate it using an i1, since that's a computer monitor calibration tool and requires the result to be stored as an ICC profile at the end.