October 26th, 2007, 03:49 PM | #1
Regular Crew
Join Date: Apr 2007
Location: Hertfordshire
Posts: 118
Computer colour, Studio RGB and .m2t
There is definitely something very strange going on with the colours in the project I am working with.
The project was transferred from Z1E -> Vegas 6 -> .m2t. I would naturally expect the resulting video rendered as an uncompressed AVI to look poor on a computer screen without conversion to Studio RGB equivalents. However, I am looking at some stuff i have rendered directly from the timelien in the last few days, and it has struck me, this stuff looks much better in Computer RGB. For example, one of the problems I am finding when viewing the footage when converted to Studio RGB values is that the faces are either very bright or very yellow in content. There is too much contrast in the image. View in Computer RGB gives a much more balanced image. Could this be down to some odd workflow glitch that may have unintentionally converted the footage to Computer RGB, or is it just down to shooting in a high contrast environment? As it stand, colour correcting the footage has been such a nightmare, I am very tempted to convert the whole project to Computer RGB just to try and taper off the top and bottom end a bit. Or is this madness? Cheers, Rob, |
October 26th, 2007, 05:09 PM | #2
I think you need to bone up on video information. Computer RGB looks better on a computer because that's what it's designed for, hence the "computer" in the name. Computer displays are designed to show RGB levels from 0 to 255, and that's what you get with Computer RGB. Studio RGB is really a misnomer; it should be called TV RGB or NTSC RGB, because its color range is 16-235, designed to be viewed on a TV monitor, which uses that range by design. Studio RGB, when viewed on a computer display, will look dark. Computer RGB, when viewed on a TV, will look washed out.

You need to be sure you're outputting the proper color range when you generate your final video stream. Vegas allows converting one to the other by use of the Levels FX; choose the proper preset for your display medium. I ALWAYS put a color bar pattern in my video stream through the rendering process and monitor the PLUGE bars. If the PLUGE bars still look right after I get through the rendering process, I remove the color bar test pattern. That's the only way I've found to get consistent results, since every codec handles it differently.
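The mapping between the two ranges Bill describes is a simple linear scale. Here is a minimal sketch in Python of the arithmetic involved (the function names are my own, and this is the standard 16-235/0-255 conversion, not necessarily what the Vegas Levels presets do internally):

```python
def studio_to_computer(y):
    """Expand a studio-range (16-235) value to full computer range (0-255)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

def computer_to_studio(y):
    """Compress a full-range (0-255) value into studio range (16-235)."""
    return round(y * 219 / 255 + 16)

# Studio black (16) and white (235) map to computer 0 and 255:
print(studio_to_computer(16), studio_to_computer(235))   # 0 255
# Computer black and white land on 16 and 235:
print(computer_to_studio(0), computer_to_studio(255))    # 16 235
```

The 219 in both directions is just 235 - 16, the width of the studio range; the clamp in the expansion is what throws away any "illegal" values outside 16-235.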
October 26th, 2007, 06:06 PM | #3
Regular Crew
Join Date: Sep 2007
Location: Toronto Canada
Posts: 31
Quote:
Whenever I import HDV video from my camcorder, without any level adjustment it looks washed out on my (reasonably well calibrated) computer monitor. I understood this to be normal, because my imported video is in the 16-235 Studio RGB range, whereas my computer monitor displays the 0-255 Computer RGB range. This would explain (in my understanding) the fact that the histogram monitor in Vegas shows unfiltered HDV as not falling below about 16 at the low end, and the fact that blacks appear greyish.

My normal operation is to apply a levels filter to the whole imported track when I import, using the Studio RGB --> Computer RGB preset, which appears to have the effect of "spreading out" the dark-to-bright range to 0 through 255 (or wherever it happens to fall, depending on the content of my shot). I do this because I don't have or use a regular NTSC monitor, so this is going to give me, in my understanding, the best approximation of levels for when I ultimately export. If I want to adjust the brightness further, I add a second levels filter on the event.

If I am exporting to a computer-destined format, such as WMV or QT, I leave the track-level Computer RGB filter on. Without it, the export looks washed out, as the footage did when I first imported it. However, if I export to DVD, for viewing on my TV, I temporarily disable the Computer RGB levels filter before I render. If I don't, my experience is that on TV it looks way too dark, and details are lost in the shadows. If I do disable it, it looks great, pretty much as I intended.

This workflow and set of observations comes from plodding along blind as an amateur. Have I got my reasoning right or wrong? Thanks, Ian.
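Ian's histogram observation can be illustrated numerically. Assuming the Studio RGB to Computer RGB preset is essentially the linear stretch described above (a simplification for illustration, not Vegas's actual filter code), the "spreading out" he sees looks like this:

```python
def expand(y):
    """Studio RGB -> Computer RGB: spread 16-235 out to 0-255, clipping the rest."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

# Simulated luma samples from "unfiltered" studio-range HDV footage:
studio_pixels = [16, 40, 128, 200, 235]
expanded = [expand(y) for y in studio_pixels]

print(min(studio_pixels), max(studio_pixels))  # 16 235 (greyish blacks, dull whites)
print(min(expanded), max(expanded))            # 0 255  (full contrast on a PC display)
```

Before the filter, nothing falls below 16, which is exactly why the histogram bottoms out there and blacks look grey on a 0-255 monitor.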
October 26th, 2007, 06:22 PM | #4
Regular Crew
Join Date: Apr 2007
Location: Hertfordshire
Posts: 118
Hi Bill, thanks for your reply, but I think you have slightly missed the gist of my post. What I was saying is that I expected the .m2t material to appear washed out and inaccurate on the computer monitor, as it was showing video designed to play on a screen that uses the 16-235 range for visible levels. However, in actual fact the video looks more appealing when it is shown within the 0-255 range. With the blacks set at 16 and the whites at 235, the lower end looks too dark and the upper end looks too bright. Which is why I suspect one of two things:

1) Something has gone wrong at some stage in the codec chain and the native Studio RGB footage has been converted to Computer RGB. So what is being read as headroom above 235 and footroom below 16 is actually detail in the normal range of vision.

2) The image is just really contrasty, and it is a coincidence that the relative compression of the image's range gives the impression of an improvement, because it takes the edge off the peaks and brings them into a more easily appreciated range, reducing the overall contrast. I.e., it is vaguely like the "S" curves I am applying to a lot of the footage. This is probably the more likely reason.

But if the conversion to computer colour is beneficial for whatever reason, I'm happy to do it if it makes an improvement.
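Rob's first hypothesis, an accidental double conversion, is easy to illustrate. If footage that is already in the full 0-255 computer range is expanded again as though it were studio range, shadow and highlight detail clips away, which would produce exactly the "lower end too dark, upper end too bright" look he reports. A hypothetical sketch (the same linear stretch as before, not a claim about where in Rob's chain the glitch sits):

```python
def expand(y):
    """Studio RGB -> Computer RGB (16-235 spread to 0-255, then clipped)."""
    return max(0, min(255, round((y - 16) * 255 / 219)))

# Footage already in computer range, mistakenly expanded a second time:
once = [0, 16, 128, 235, 255]
twice = [expand(y) for y in once]
print(twice)  # [0, 0, 130, 255, 255]
# Everything at or below 16 crushes to 0 and everything at or above 235
# clips to 255, destroying shadow and highlight detail.
```

This is why viewing such footage with a further Studio RGB interpretation applied looks harsh, while viewing it "as" Computer RGB looks balanced: the detail was already sitting in 0-15 and 236-255.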
October 26th, 2007, 06:34 PM | #5
Regular Crew
Join Date: Apr 2007
Location: Hertfordshire
Posts: 118
Quote:
Computer graphics cards almost uniformly use the 0-255 range to generate colour, whereas a DVD player uses the 16-235 range. MPEG, being a format designed for playback in domestic machines, operates in the 16-235 Studio RGB range, so any encode you do for a DVD should contain footage calibrated to place its entire range of colours between 16 and 235. If you are previewing on your computer screen and want to check your colours, a temporary Studio RGB to Computer RGB conversion will give you a more accurate indication of what is going on.

There is still a big difference between the colours you get from a CRT PAL/NTSC monitor and a computer screen. However, there is so much variation in what people actually watch TV on these days that I don't think it is realistically achievable to make something that looks good on everything. Most video graded on CRTs for reference looks terrible on plasmas and TFTs; it is not until you see video specially mastered for newer displays that they start to look any good.
October 26th, 2007, 07:21 PM | #6
Robert...
I think you're pretty much right on. Actually, it's been hard for me to tell exactly where things go wrong, because each codec reacts differently depending on whether it's being fed Computer RGB or Studio RGB. Some codecs expect to see Studio RGB and treat all input as if it were Studio RGB. Glenn Chan has made a good start at defining the most common codecs and what they expect. It's on his website, www.glennchan.info; look for a white paper called "Color Spaces in Vegas8".