View Full Version : RGB to Studio RGB - confusion reigns....
Federico Perale July 29th, 2011, 04:58 PM can someone please give me an explanation of this?
I've read countless topics but am still confused.
I burnt a Blu-ray of my project and the colours on my LCD TV turned out too dark and saturated, and the blacks too "strong" - the night scenes are particularly bad as the blacks seem to "jump" (for lack of a better expression)
do I need to apply the computer RGB to Studio RGB to make colors right, before rendering?
is this the solution? Or do I apply Studio to Computer RGB, make the necessary color changes, and then remove the filter before rendering?
thanks for your help
Federico
Adam Stanislav July 29th, 2011, 06:02 PM do I need to apply the computer RGB to Studio RGB to make colors right, before rendering?
is this the solution?
Possibly, though as suggested in yesterday’s webinar, you should just apply that filter to the Video Output FX (i.e., to the preview screen), so it applies as the last filter after all other filters, and so you can easily remove it if you need to render for some other purpose than a TV.
Or do I apply Studio to Computer RGB, make the necessary color changes, and then remove the filter before rendering?
No. If your media is in studio format, you do need to apply the studio to computer RGB because all color grading math (and math is what it is) works in linear 0.0-1.0 space. But do not remove that filter before rendering. Instead, apply the computer RGB to studio filter before rendering as mentioned above (if needed).
Additionally, if your media was shot with a gamma, you should apply a filter that removes the gamma at the beginning (so you work in the linear space). And for video (such as BD) you probably need to add a correct gamma filter at the end of the chain, right before the computer RGB to studio conversion.
The gamma conversion may be more important than the studio conversion with a BD.
Federico Perale July 30th, 2011, 03:48 AM
when you say "if your media is in studio format"... what does this mean? I shoot with a canon 5D, edit in Vegas to output in BD.
does this mean my media is in Studio format? and if so I instead need to use the other Studio to computer filter? this will make my footage look even more dark and saturated
I am not sure I understand the second part of your reply...
the reason for the question in the second part of my original post is that I see some people apply these filters to match the preview to the output, but then remove them
see this post below from a recent thread called "shooting from mark II 5D"
"....Note for you CIneform/Vegas users. In order to get a near approximation of what you rendered output will look like, it is important to select the color correction filter in the Preview window while you edit, and set the color correct fitler to Vegas RGtocomputer preset. Then at time you render, you need to turn it off, or your render will be much contrastier and darker than your preview..."
still a bit confused....
thanks again
Rob Wood July 30th, 2011, 08:24 AM when you say "if your media is in studio format"... what does this mean?
---
this is Sony-speak. Whenever a Vegas user says "studio" they mean television colorspace: Y'UV, YUV, YCbCr, YPbPr, etc, etc, etc. As in "not RGB colorspace".
Filter-wise in Vegas, it means pixel values range from 16-235 rather than 0-255.
regarding Canon 5D, it's working in YCbCr (in Vegas-speak, Studio RGB). If you're monitoring with a television and all the footage you're using is from the Canon, the only thing to watch for is content made on a computer (title cards, lower thirds, etc). These will be in RGB colorspace (Vegas-speak, "Computer RGB")... for this footage, I'd suggest putting it on a separate track with a "Computer RGB to Studio RGB" filter applied.
this'll maintain consistent colorspace throughout your project.
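(A quick numeric illustration of the two ranges - just a sketch assuming plain linear scaling of the values; the actual Vegas Levels presets may treat chroma slightly differently.)
def studio_to_computer(v):
    # map a studio-range value (16-235) onto the full 0-255 range
    return (v - 16.0) * 255.0 / 219.0
def computer_to_studio(v):
    # map a full-range value (0-255) into the studio 16-235 range
    return v * 219.0 / 255.0 + 16.0
print(studio_to_computer(16), studio_to_computer(235))   # 0.0  255.0
print(computer_to_studio(0), computer_to_studio(255))    # 16.0 235.0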
Adam Stanislav July 30th, 2011, 08:55 AM I was going to answer I did not know whether Canon 5D shoots in studio or computer RGB, then I remembered I took some sample video clips with my new Canon 5D Mk II. I have just opened those clips in Vegas and judging by their histograms, waveforms and RGB parade, they seem to use the full 0-255 scale of the computer RGB.
Based on that, I would not convert them from studio to RGB at the beginning of the filtering process.
But for the rendering for BD, I would certainly at least try adding the computer-to-studio filter at the end of the chain and would then check if the final output on a TV looks better that way.
The thing is, it was an unbreakable rule that to show something on TV, you had to convert computer RGB to studio because that was what every TV set (at least in America) expected. But with the recent transition to HDTV, people may be watching your BD using an old TV, or a new TV, or a computer monitor, and may have their systems configured correctly or incorrectly.
So, personally, I do not know what to say other than test it and see. But when in doubt, I would apply the computer-to-studio filter at the end of the filter chain because studio RGB looks better on a computer monitor than computer RGB on a TV monitor.
Adam Stanislav July 30th, 2011, 09:02 AM regarding Canon 5D, it's working in YCbCr (in Vegas-speak, Studio RGB).
As mentioned above, I looked at some footage in Vegas and it used the full 0-255 range judging by the histogram and such. Does that mean the codec does a silent conversion from studio to computer RGB (in Vegas speak)?
Rob Wood July 30th, 2011, 09:19 AM ^^ yes. thx Adam for catching that.
I did a double-check after posting; Canon's 5D uses "Canon full range 8 bit YCbCr values" (0-255); I was assuming it'd return 16-235 values; not correct.
So in this case, if you're monitoring with a television, place a "Computer RGB to Studio RGB" filter on preview/output... this'll scale the footage on your timeline from 0-255 luma values to 16-235, which means it'll display correctly on your TV.
And when you render to Blu-ray, leave this filter on, as BD expects values of 16-235 also.
Federico Perale July 30th, 2011, 09:24 AM thanks
just did exactly that and it worked great.
I also used the MainConcept MPEG2 as it seems to give better results than Sony AVC
Rob Wood July 30th, 2011, 09:34 AM "I also used the MainConcept MPEG2 as it seems to give better results than Sony AVC "
yeh, i had similar results making BD's at 1280x720@24p... so far mpeg-2 works better for me.
this might change when i eventually shift to 1080p but for now all is good.
Phil Lee July 30th, 2011, 11:04 AM Hi
I've got footage from a Panasonic HD camcorder (SD900).
I don't use any Studio/Computer filters with it; should I be? It is very confusing.
When the footage is previewed on a second computer monitor it's a bit washed out (blacks are lighter, whites are not as bright, and contrast is slightly reduced). To correct this I set the preview to use 16-235 and it then looks identical to watching the original file in Windows Media Player. This setting, as I understand it, makes no difference to the final output; it's just a monitor adjustment.
So if I render direct to MPEG2 or AVC with no Studio/Computer filters, the output file when played is identical to the original footage colour/brightness wise.
So, reading here, I checked my scopes and the video has RGB values ranging from 0 up to 255 - so Computer RGB? Following the advice here I should then be adding a Computer to Studio filter, but when I do that the output goes back to being washed out, like it does without the 16-235 setting checked on the preview.
If I add a Studio to Computer filter (which appears to be the opposite of what I should be doing), the preview is now too dark. If I uncheck the 16-235 on the preview, it becomes the same as when no filter is applied and 16-235 is checked.
Testing renders, the same applies, it is either darker or lighter than the original.
So to get MPEG2/AVC output that matches the original file (which does look good on the black and white levels) I must not use any filters. Why are filters not required in my case?
My other workflow is to output to Lagarith and then use x264 to encode. When I do this without using any filters, the RGB Lagarith output is too light (blacks washed out, whites less white), so as it goes through AVISynth I tell it to do a colour conversion from RGB to YV12 using the PC.709 setting, which is "Use Rec.709 coefficients, keep full range"; this returns it to the same as the original files and it looks right. I can see now that I can add a Studio RGB to Computer RGB filter as I output to Lagarith, which gives me the correct output from Lagarith when playing it on the computer, matching the original files, so when I run it through AVISynth to x264 I can use the Rec.709 coefficient setting. Probably just giving me the exact same thing.
So I suppose my question is: why does my footage seem to want the opposite filters, yet when rendering to the internal MPEG2/AVC encoders in Sony Vegas it is correct without any filters? And is the resulting output right for Blu-ray? It looks okay on my player/TV, but from what I'm reading here it might not play back correctly for someone else.
When I've played a raw .mts file and one that has gone through Sony Vegas via Lagarith->AVISynth PC.709 to correct the levels to x264 on Blu-ray, they have identical levels.
It's all very confusing.
Regards
Phil
Adam Stanislav July 30th, 2011, 12:28 PM It's all very confusing.
Well, sadly, yes. But it can be untangled if you think of editing as three separate steps.
For whatever reasons, when digital video was first created, the creators of the Y-Cr-Cb standard decided to limit it to the 16-235 range, as opposed to the full 0-255 range possible when dealing with 8-bit RGB channels (i.e., red, green, blue). As far as I know, this was done to allow storing non-video information in the extra values for compatibility with analog television. My background in computer programming finds it strange, but it makes (or made) sense for TV engineers. Whatever the reasoning, we are still stuck with it.
This brings us to the first of the three steps I mentioned, your source media. If it comes from a video camera, chances are it is in the 16-235 range, even after it is converted from Y-Cr-Cb to RGB. I say usually because some codecs will transparently convert it to the 0-255 RGB range; such codecs are best avoided when editing, because you want to control everything yourself for best results. The 16-235 range is what Vegas calls studio RGB. Additionally, the data will usually have a gamma applied, so it is non-linear (just as our human vision is non-linear).
On the other hand, if the image, or image sequence, was generated by a computer, you can be almost certain it is in the 0-255 RGB range (that includes the text and the various color backgrounds created by Sony Vegas). This is what Vegas calls computer RGB. Also, if it was scanned from film, it is probably computer RGB. If it comes from a photo camera (such as the 5D MkII), chances are it is in computer RGB.
Now, step two is editing, applying color grading filters, transitions, special effects, etc. This is mostly pure math (internally within the software, that is). This can be done in the 8-bit mode (0-255) or the 32-bit mode (0.0-1.0). Even when done in the 8-bit mode, many filters will convert everything internally to the 32-bit mode because the 0.0-1.0 scale is ideal for many math functions. For example, squaring a value between 0.0 and 1.0 will give a result in the same range of 0.0 - 1.0. Either way, it expects the data to be in the full range and it expects it to be linear.
The third step is your output media, which is typically a video file and in most cases requires a gamma to make it non-linear and is normally output in the 16-235 range. That is what DVD expects, that is what BD expects, that is even what YouTube expects (well, not so much YouTube as the video player inside a web browser does).
So, when you split it in your mind into these three steps, the first thing you need to know is what format your source media is. And if it comes from different sources, it may be in different formats. You need to consult your camera manual or ask in a forum specific to your camera (since someone there is likely to know the answer).
Then, still in the first step, if your media is in computer RGB with no gamma applied, just use it as is. If it is in studio RGB, add a filter to convert it to computer RGB. And if it has a gamma applied, add a filter to reverse it to linear RGB (after you convert it from studio to computer). One of the nice things about Vegas is that you can apply filters at different places. For this step one, you should apply these filters by right clicking on the media in the Project Media window and selecting Media FX... You may have to do it separately for different media files, depending on what format they are. But at the end of step one, all of your media should be normalized to linear computer RGB.
In step two, you just add the media to your tracks (possibly via the trimmer, depending on personal preferences). Everything is linear there (and preferably set for 32 bits so individual filters do not have to convert between 8 and 32 bits, and to avoid round-off errors from filter to filter). You can apply whatever FX you want whether on the track level or event level. But do not convert between studio and computer RGB at this step.
In step three, you typically need to apply a gamma appropriate for your output file and probably convert from computer to studio (and if you are not sure, it is safer to do the conversion here since most video players expect it that way). This applies to the project as a whole, so you should click the Video Output FX... button near the left end of the buttons located above the preview window, and add those filters there. And you might only want to do that after you have finished editing your project so it does not affect what you see on your computer monitor (but if you are using a professional grade broadcast monitor, then you may want to add it there early in the process since your monitor will expect that).
Doing it at these three separate places will apply all effects correctly to the correct type of images and will give you the correct video output. It also greatly simplifies the whole process since you only have to worry about the confusing stuff in small and separate compartments, and you can spend most of your effort in step two, editing, without worrying about the technical issues of video formats and human vision.
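(A rough numeric sketch of those three steps, assuming studio-range source footage and a simple 2.2 power-law gamma rather than the exact Rec.709 transfer curve; the function names are mine, for illustration only, not anything from the Vegas API.)
import numpy as np
GAMMA = 2.2  # assumed encoding gamma; the right value depends on your footage
def step1_normalize(frame, studio_range=True):
    # source media -> linear, full-range 0.0-1.0 (what Vegas calls computer RGB)
    x = frame.astype(np.float32)
    x = (x - 16.0) / 219.0 if studio_range else x / 255.0
    return np.clip(x, 0.0, 1.0) ** GAMMA       # undo the gamma -> linear light
def step2_grade(x):
    # all grading/effects math happens here, in linear 0.0-1.0 space
    return np.sqrt(x)                           # e.g. a simple brightening curve
def step3_output(x):
    # re-apply the gamma and convert back to the studio range for DVD/BD delivery
    x = np.clip(x, 0.0, 1.0) ** (1.0 / GAMMA)
    return np.round(x * 219.0 + 16.0).astype(np.uint8)
frame = np.array([[16, 128, 235]], dtype=np.uint8)        # a tiny studio-range "frame"
out = step3_output(step2_grade(step1_normalize(frame)))   # back in 16-235, e.g. for BD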
By the way, thanks for asking the question. Writing this has helped me as well, since now I have put into words what I have been doing intuitively up till now. :)
Phil Lee July 30th, 2011, 02:44 PM Hi
That kind of makes sense but in practice it seems to give no advantages.
So from what you have said:
1) Clips on the timeline from typical HD cameras capturing 16-235 need a Studio to Computer RGB level change to bring them to 0-255.
2) Everything on the timeline is treated as 0-255.
3) On output for Blu-ray add a filter to video output for Computer RGB to Studio RGB to convert the levels back to 16-235.
So I tried the above. With the filter on the clip changing it to Computer RGB I don't need the 16-235 option checked anymore for the preview monitor; that makes sense, as the levels are now computer levels.
I added a title with white at 255.
I then added a Computer to Studio RGB on the output and rendered as MPEG2 HD for Blu-ray.
So what I got was as I expected, levels appeared correct. When I sampled the 255 white title, it was around 253 to 254.
I did the same again with no level filters and the output looks as expected. This time when I colour picked the title it was 255 spot on.
I would have thought that on the first one, with the level changes, the white title at 255 should have become 235, yet it only shifted slightly, to 253-254.
So I'm still confused as to what is going on. Adding the filters seems to have made the levels less accurate while not bringing them into the 16-235 range.
Regards
Phil
Adam Stanislav July 30th, 2011, 09:49 PM When you play the BD on a computer, the player will stretch the 16-235 into the 0-255 expected by the computer.
When you play it on a TV, the player will not stretch it, as the TV expects it in the 16-235 range. The TV monitor is built to display 16 as black and 235 as white. So, effectively, the TV hardware stretches it.
If you did not filter it from computer RGB to the studio range, the TV would show everything between 0 and 16 as black, everything between 235 and 255 as white and stretch everything in-between. So you would lose contrast in the middle and completely flatten the highlights and the shadows.
If you do filter computer to studio but your player does not stretch it, you lose a little contrast but do not lose any details. It does not look perfect but it looks a lot better than if you do not convert and the player stretches it as discussed in the previous paragraph. At any rate, the player is supposed to stretch it.
Computer video cards are often configured (I know my nVidia is) to automatically stretch any video from studio to computer RGB before sending it to the monitor, and not stretch anything else (i.e., the output of standard Windows software). Some computer monitors also can be configured to expect video levels, so they can be hooked up to the HDMI output of a DVD or BD player.
Last but not least, the Sony Vegas video scopes can be configured for the studio range, so they show 235 as fully white and 16 as fully black.
Any of that may give the impression that the computer to studio filter is not working properly, when it actually is.
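(Phil's 253-254 reading is consistent with that round trip. Here is the exact math as a sketch; MPEG-2 compression and 8-bit rounding along the way easily shave off the last level or two.)
computer_white = 255.0
# computer RGB -> studio RGB filter applied before the BD render
studio_white = computer_white * 219.0 / 255.0 + 16.0    # 235.0 in the encoded file
# a software player on the computer stretches video levels back to full range
displayed = (studio_white - 16.0) * 255.0 / 219.0        # 255.0 in exact math
# without the filter, white is stored at 255 ("whiter than white"); a TV treats
# everything from 235 up as the same white, so the highlights flatten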
Phil Lee July 31st, 2011, 01:35 AM Hi
Thanks, that does make sense, I did wonder if conversion was happening on the computer.
But, still to add more questions: I found an article that explains that HD video decodes to Studio RGB and nothing needs to be done to it, and that if you add an image, which decodes to Computer RGB, you need to add a Computer to Studio level change.
When encoding, some codecs want to see Studio RGB and some want to see Computer RGB. MPEG2 and AVC expect Studio RGB, which is why my outputs direct from Vegas to these have correct levels without ever using any Studio/Computer filters. I've never used a filter on any of the images I've placed on the timeline, so those must have the wrong levels, but I've not actually noticed anything looking odd.
The Lagarith encoder seems to require Computer RGB levels, but without a filter in Vegas it gets Studio RGB levels; this explains the wrong levels when I export this way, requiring me to tell AVISynth to do a level conversion as it moves from RGB to YV12. I'm thinking the final output is exactly the same regardless of where that conversion takes place, so it makes little difference whether I add a Studio RGB to Computer RGB filter in Sony Vegas or change it later using AVISynth when exporting via Lagarith.
The above doesn't apply if you change to a 32bit full level project according to the article, then Sony Vegas and some codecs start working in Computer RGB mode. This explains to me why levels are correct if I export to Lagarith with the project set to 32bit full video range, as it now gets Computer RGB.
So, if I have this correct from reading the article and what is discussed here: if I predominantly use clips that are Studio RGB, I don't need to apply a level change to the clips, but if I add something to the timeline that decodes to Computer RGB, such as an image, I should ideally apply a Computer to Studio RGB filter; this keeps everything on the timeline at the same level. If, on the other hand, I predominantly have images on the timeline that decode to Computer RGB, I can do the opposite and keep the timeline as Computer RGB, and should I add a video clip that is Studio RGB, convert that to Computer RGB.
On outputting, if my timeline is Studio RGB and using the MPEG2 or AVC codecs in Vegas, I don't need to add a filter as they expect Studio RGB. If my timeline is Computer RGB, I would need to add a Computer RGB to Studio RGB filter if going to those same codecs.
When outputting to Lagarith with a Studio RGB timeline, I need to add a Studio RGB to Computer RGB filter for correct levels, or deal with that later.
So from what I understand the important bits are to keep all footage on the timeline at the same levels, and the best level is the one that most footage is in by default?
When exporting it depends on where it's going and what it is on the timeline if a shift of levels is required.
I think it is making more sense now.
The article is here Color spaces and levels in Sony Vegas 9 and 10 (http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm)
Regards
Phil
Jerry Amende July 31st, 2011, 04:44 AM This might help: HD Video for the Web - Guide for Vegas Users (http://www.jazzythedog.com/testing/DNxHD/hd-guide.aspx?booladmin=true#Summary)
...Jerry
Phil Lee July 31st, 2011, 05:41 AM Hi
Thanks. I found this link gives very good info Color spaces and levels in Sony Vegas 9 and 10 (http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm)
Re-visiting some projects it all makes sense now. In most cases we don't need to apply any level filters, which explains how most people are getting good outputs level wise.
The majority of video most people will use (DV/MPEG2/HDV/AVC) will sit on the timeline at Studio levels, and exporting out to the same formats in Vegas requires Studio levels, so all is good. The only time you will need a levels filter is if you add images to the timeline; these are added with Computer levels, so you need to add a Computer to Studio level filter to bring them in line, and they do look a lot better.
I think Adam was advocating a different approach with a timeline using the Computer RGB level, so all video clips get a Studio to Computer RGB level filter, and any images are fine as is. This has the benefit of making the preview in Sony Vegas correct as it wants Computer RGB. (Not an issue if you have a second monitor as you can tell Vegas to use Studio or RGB levels on a different monitor).
On outputting Computer RGB, you then need to revert back to Studio levels with another filter applied globally to the timeline. This works fine and the output will look identical. However, from reading some more about it, most camcorders record levels, and therefore detail (considered illegal), above 235 and below 16. When you apply a Studio to Computer level filter it chops off everything below 16 and above 235 and then stretches what is left to fill 0-255. Adding this to a clip on the timeline means you have lost any detail that resided below 16 or above 235. Now, that detail will get lost somewhere when you convert for Blu-ray/DVD etc. anyway, as it isn't legal, but having the extra information on the timeline can give you more latitude for exposure changes, or you can add a filter to move the detail into the legal range so you don't lose it completely. So my thoughts are that it is best to stick with working at Studio RGB levels on the timeline, which I had been doing back when I didn't really have a clue what it all meant; it just looked okay when output, so I didn't question the workflow.
Regards
Phil
Adam Stanislav July 31st, 2011, 12:12 PM So from what I understand the important bits are to keep all footage on the timeline at the same levels, and the best level is the one that most footage is in by default?
Keep the footage on the timeline, whether 8-bit or 32-bit (but preferably 32-bit), in the computer RGB level. Otherwise some effects will not work to their full potential.
Example: An effect may calculate the square root of the value of a pixel, a basic way of brightening the image while preserving both the shadows and the highlights. Working at the computer level in 32-bit mode, white will remain white (square root of 1.0 = 1.0), black will remain black (square root of 0.0 = 0.0), and everything else will be brighter (e.g., square root of 0.5 = 0.71).
If you apply the same filter to studio levels, black (16 / 255 = 0.0627...) will become a lot brighter (0.25..., which converts to 0.25... * 255 = 63.87 = either 63 or 64, depending on how the filter converts floats to integers), which is totally wrong. Your shadows are no longer preserved.
Even worse, in that situation, white (235 / 255 = 0.92...) will become 0.96, which when multiplied back by 255 becomes 244 or 245, well outside the 16-235 range. Your highlights get pushed out of the legal range.
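(The same arithmetic, written out as a quick sketch:)
import math
def brighten(v):              # the square-root brightening curve, on a 0.0-1.0 scale
    return math.sqrt(v)
# computer RGB, already normalized to 0.0-1.0:
brighten(0.0)                 # 0.0   -> black stays black
brighten(1.0)                 # 1.0   -> white stays white
brighten(0.5)                 # 0.707 -> midtones brighten
# studio-range data fed to the same curve without conversion:
brighten(16 / 255.0)          # 0.250 -> "black" jumps up to about 64/255
brighten(235 / 255.0)         # 0.960 -> "white" lands at about 245/255, outside 16-235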
Not all filters are affected this way, or at least not as drastically. So it may seem that just keeping your timeline on the same level is OK even if it is all at the studio level. But sooner or later, a plug-in will appear not to be doing what its author says it is doing, simply because its math is thrown out of whack by an incorrect range on the timeline.
So, it is not enough to keep the timeline at the same level; it is essential to keep it at the computer RGB level. Every single book on computer graphics that I have ever read (too many to count) assumes the data being manipulated is in what Vegas calls computer RGB. And every computer graphics programmer familiar with the studio range thinks those TV engineers were out of their minds when they established that standard. I am sure they had their reasons at the time, but I wish we would all just switch back to the full computer scale, both because the studio range messes up too many computer graphics algorithms and because it reduces the number of available colors considerably: you need 10 bits in YCrCb to preserve all those beautiful 8-bit RGB colors, yet all modern computers use an 8-bit byte.
So, in case my ravings are overwhelming, just a summary here: Always use computer RGB for your timeline.
Phil Lee July 31st, 2011, 02:00 PM Hi
I think I understand but...
If you have Studio RGB on the timeline and convert to Computer RGB, the assumption is that there are no values less than 16 or greater than 235 to preserve, so 0-15 and 236-255 are discarded and all the remaining values are remapped: 16 becomes zero and 235 becomes 255. Surely this causes an inaccuracy, though, because where do you map value 17, for example, in this new scale? It doesn't precisely fit anywhere.
If you have Studio RGB on the timeline it still sits in the range 0-255, with the advantage that you now have some latitude. For example, you might have a filter that moves shadow details lower than 16, but another adjustment moves some of those shadow details back above 16. If you've already converted to Computer RGB of 0-255, so black is now 0 and white at 235 has become 255, the first filter drops some detail because it can't move that shadow detail any lower than 0 - it just clips it off - so the second filter has no shadow detail to move back into range, i.e. you've lost detail. That is how I understand it, which makes me want to keep Studio RGB on the timeline. While it's not ideal not using the full 0-255 range, we might as well get some advantage from it by having more latitude to shift things around without bits falling off the ends?
I see what you mean with your maths examples, but this brings me to another issue, in that most HD camcorders will record detail below 16 and above 235. As I understand it, 16-235 is not designed to be a hard boundary anyway; the upper and lower ends are purposely spaced away from the maximum and minimum to allow a bit of over- or undershoot in recorded values, which can be made "legal" later. Adding an immediate Studio to Computer RGB conversion drops all that information, which might have been of use? For example, reducing exposure with 16-235 on the timeline might bring in highlight detail that was at "superwhite" level above 235; I certainly have one example of this where details appear in the sky. If you have an immediate conversion to Computer RGB, though, haven't you removed the superwhites and anything blacker than black, so reducing exposure wouldn't bring in any extra highlight detail at all where you might have had some? The same applies to shadow detail.
Yes, some filters may not work absolutely correctly at the extremes of black and white, but perhaps that is only the case where there is absolutely no picture information below 16 or above 235? Or, if a filter isn't working correctly, then just on that clip apply a Studio to Computer RGB conversion immediately before the filter, and afterwards convert it back to Studio RGB; wouldn't that be technically better?
I'm not saying your method isn't correct - there is not necessarily a wrong or right way with this stuff - I'm just trying to understand the rationale for having two level conversions, which is your suggestion, with the Studio to Computer RGB one being potentially lossy on modern HD footage. That's really where I'm coming from, now that I've played about and understand what is going on some more.
I found one clip where reducing exposure magically makes some telegraph lines and more cloud detail appear in the sky. When I convert that clip to Computer RGB and then try to reduce exposure, these details do not appear, as clearly they are recorded above 235 by my camera and get cut off by the Studio to Computer RGB conversion. That is why I'm not 100% convinced that converting everything to Computer RGB on the timeline is absolutely the right thing to do, certainly where modern HD footage is concerned, as that extra detail might just be useful and can be brought back into the legal range by some filters or exposure adjustments.
The other pointer is that Vegas by default uses Studio RGB levels for the majority of its codecs, certainly the modern HD ones; if it was better to always convert Studio RGB to Computer RGB and work that way, wouldn't that be the default for anything added to the timeline?
Regards
Phil
Adam Stanislav July 31st, 2011, 02:57 PM Surely this causes an inaccuracy, though, because where do you map value 17, for example, in this new scale? It doesn't precisely fit anywhere.
If you use 32 bits, yes it does (fit, that is). Those are floating point numbers, i.e., real numbers, not integers.
As for some camcorders not sticking to the exact 16 - 235 range, they are supposed to. But if they do not, convert whatever range they have to the proper linear range.
Note (in the enclosed image) how Sony Vegas Color Corrector converts Studio to Computer: It increases the saturation by the factor of 1.164, multiplies everything by 1.164 (gain of 1.164), then subtracts 18.6 (offset of -18.6), since 16 * 1.164 = 18.62. So, just change the gain and the offset (and apparently saturation by the same factor as your gain, I’m not quite sure about that, but that seems to be the algorithm Vegas uses) to whatever your footage requires to stretch your values. And keep it in 32 bits so you do not lose anything to round off errors. And make sure to save it as a preset so you can use it on other footage from the same camcorder.
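(Those numbers check out: 255 / 219 is about 1.164 and 16 * 1.164 is about 18.6, so the preset is just the linear studio-to-computer stretch written as a gain plus an offset. A quick sketch of my own arithmetic, not anything read out of Vegas:)
GAIN = 255.0 / 219.0          # ~1.1644, the Color Corrector's gain of 1.164
OFFSET = -16.0 * GAIN         # ~-18.63, its offset of -18.6
def studio_to_computer(v):
    return v * GAIN + OFFSET
studio_to_computer(16)        # ~0.0
studio_to_computer(235)       # ~255.0
studio_to_computer(17)        # ~1.16 -- fits exactly as a float; only 8-bit rounding loses it
studio_to_computer(245)       # ~266.6 -- a "superwhite": clipped at 255 in 8 bits,
                              #           still representable if the chain stays in 32-bit float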
Bill Koehler August 3rd, 2011, 09:12 PM Thank you so much for your explanations Mr. Stanislav, especially Post #11 & #17.
I have been going nuts trying to get a handle on what's going on.
Adam Stanislav August 3rd, 2011, 09:27 PM I’m glad you found it useful. :)
Gerald Webb August 4th, 2011, 01:45 AM I'm reading this over and over, and must admit I'm still a bit lost.
So, if I have multiple tracks/cams and some shoot in 0-255 and others 16-235 (I know this because when dropping the raw files onto a 32-bit timeline with video gamma set to 2.2, some cams have blacks and whites already clipped at both ends of the RGB parade, and others have no blacks below the 16 line, some whites higher than 235, but generally the same scene is much more compressed on the scope)...
Up until now, I've always brought all the ranges down to sit between 16-235 with a levels and/or color correction filter, with the view that doing this as I convert to Cineform will make grading after cutting more uniform, and blacks and whites will be less crushed and blown out. Is this right or wrong?
From what I'm reading, I should be making my studio RGB cams into computer RGB (so using input levels to crush blacks and brighten whites) on the timeline or when converting to Cineform, and then applying a global Computer RGB to Studio RGB filter on export. Is this right?
I am confused because, if editing at computer RGB levels of 0-255, when you use Magic Bullet or the Cineform FirstLight Looks it just destroys your footage: anything grey becomes black, and virtually any lighter colors are blown.
Sorry for being the slow kid in the class :) but if someone (looking at you Adam) could explain a real world workflow from cam to DVD or Bluray it may help my simple brain understand.
Adam Stanislav August 4th, 2011, 08:17 AM Interesting. I don’t use Magic Bullet, I do all color grading with tools built into Vegas, so I cannot really comment on MB. It is possible that the programmers of Magic Bullet have figured that too many people do not convert to linear computer RGB, and are doing the conversion internally within the plug-in. In that case you need to do whatever any particular plug-in expects.
Ultimately, do whatever looks right.
Phil Lee August 4th, 2011, 12:12 PM Hi
I think: do what works for you; if it looks right, then that is all that matters. If something isn't right with levels you will see it; if it all looks good then it most likely is. Just compare the raw file playing in your chosen media player, then put it through your entire workflow (obviously with no colour grading) and compare the result with the original. If there is no shift in levels or colour, it's all good.
By luck I've had Studio levels on the timeline, and when encoding out using the built-in codecs they want Studio levels, so there isn't really anything to do; I should think this is by design. I've followed Adam's advice and tried converting studio to computer, then computer back to studio on output, and no matter how much I pixel-peep (with a little alt-tab trick you can quickly switch between screen grabs) there is absolutely no difference in the final output's colours and levels. This makes me go for the less-is-more approach: rather than two shifts of levels I'd rather do none. It's safer, as the tendency might be to forget the shift back to Studio, and for performance I'd rather not have the level shifts in there while editing anyway.
The arguments Adam puts forward may well be sound, but nothing I've seen from having Computer RGB on the timeline and then shifting back is any different from keeping it at Studio levels on the timeline.
If you are mixing studio and computer levels, you need to decide what you want on the timeline, and I would go for studio if the majority of clips are at studio levels, or computer if they are computer; this is as per the advice in the article I linked to. As for Cineform, according to the http://www.glennchan.info/articles/vegas/v8color/vegas-9-levels.htm article and its codec list, it wants to see Studio levels.
Regards
Phil
Phil Hamilton March 13th, 2018, 10:19 AM I am working on an animation project with individually rendered frames at 1280 x 720 from a 3D Modeling software package.
I'm using Vegas Pro 2012 and the timeline I am using is HD 1080-24p (1920x1080, 23.976 fps) with the intended delivery being the internet and also to Bluray using DVDA 6.0.
When I render to WMV the colors, contrast, and brightness look fine, and the way I want them. When I render to MPEG2 - Blu-ray 1920x1080-24p, 25 Mbps video stream - for DVDA (so it will NOT require recompression), the images come out too dark and contrasty. So, based upon the recommendations here, I am using the Sony Levels "Computer RGB to Studio RGB" preset; on the preview it washes out, but when I burn to Blu-ray and play it from my Sony PS4, the levels seem to be almost exactly like what I am seeing in the WMV.
My question is: is there another setting or adjustment I should make in order to make the video on the Blu-ray look as close to the WMV as possible? What I am noticing is that the Blu-ray looks like the video has some DNR applied - a softness, and the colors seem a bit less saturated.
Thoughts and suggestions? The image on the RIGHT is an M2V capture and you can see it's a little less sharp to me and seems to be missing a bit of RED in the face.
Rainer Listing March 14th, 2018, 04:50 PM Wow. Old thread. Very basically, if it looks right on your and your client's final display, it is right. But it has to be viewed on the (properly calibrated) final display. Normally, original 3D is in full-range RGB (computer RGB) and you can leave it that way. View it in Vegas, upload .mp4 (or WMV) to YouTube, and most browsers will display it that way. If at the end you're seeing crushed blacks and blown-out highlights not present in the original footage, then something - your display, broadcast chain, codec, browser, whatever - is clipping computer RGB to studio RGB levels. Add the Computer RGB to Studio RGB effect to the output, and it should then look right. You can check with your scopes (waveform and RGB parade) or a simple utility like Instant Eyedropper. To make things look right in the Vegas preview, a really useful utility is the free SeMW Extensions Preview Level, which simulates the signal levels of the output device, available from SeMW Extensions (http://www.semw-software.com/en/extensions/)
(edit: Have a look at your scopes, the m2v is slightly more saturated, could be compression effect, don't know about the DNR, seems unlikely)
Phil Hamilton March 17th, 2018, 11:20 AM I don't have scopes. Doing all of this by "eye".
I'll check out the extensions link you gave me; I appreciate the response. So I did: I installed it, but I am not seeing anything in the Application Extensions folder, which is where it is supposed to show up. I can't find out the name of the extension, so I can't search for it. It says it installs, but.....
Thank you.
Rainer Listing March 17th, 2018, 04:26 PM Hi Phil
I may be a bit confused about which version of Vegas Pro you are using; I assumed version 12, since there is no version 2012. The scopes I was referring to are the built-in Vegas video scopes, which you should have if you are using version 12: Menu: View > Window > Video Scopes, or Ctrl+Alt+2 - Google for how to use them if you don't know. Don't know what's going on with the SeMW extensions; they worked for me in version 12 and are still working in version 14.
Phil Hamilton March 18th, 2018, 01:00 PM yes thanks I have version 12.0 build 563. I see the SCOPES. I just need to see how they work.
Regarding the SeMW extension - if I knew what the extension name was I could search for it to see WHERE it is installing it. But it is NOT going into the Application Extensions folder as I understand it should be. I could drop it there if I can find out where it's going.