View Full Version : SR11/SR12 in x.v. color mode -- should this mode always be on?
Robert Smith June 20th, 2008, 06:42 PM Hey guys,
I've noticed the SR11 and SR12 come with xv color off by default. Sony lauds it, but doesn't force you to use it.
I was reading some threads, and noticed that on older cameras, it was said you lose nothing with xv color on, and that everything looks better, even if your video editing software doesn't explicitly support xv.
So what's the story with these new cams? Is it best to always have xv colour on?
Also, how can you possibly not lose anything when you add an additional 2 bits for more colour? You only have 16 Mbps to work with; there's a set amount of space for all the data combined. I assume you must lose a little of some other image data when you add extra color info.
Maybe the loss is just so small, and what you gain with better color info is so great, that it's tough to notice anything?
Dave Blackhurst June 20th, 2008, 07:04 PM I turn it on, leave it on. There were early warnings from the manufacturers that this mode "could" cause some displays to look odd or strange (high-tech terms here). I've not seen a problem, and I do think you get a bit better image with more data to work with in edit, but that could simply be a marketing-induced hallucination <wink>.
From my use there's no harm in setting it to on and forgetting about it... wish they'd leave the little indicator off though!
John Bosco Jr. June 20th, 2008, 09:57 PM Because of the larger color gamut of x.v., if your display is not x.v. capable, it's possible that some colors will not be displayed correctly. So I have to disagree with those of you who vote for leaving it on; in my opinion it serves no useful purpose unless your display can handle x.v. colors. There is a reason why it defaults to off. Why introduce it now? Well, displays are better now, and you can see more detail.
Dave Blackhurst June 21st, 2008, 02:44 AM "it's possible that some colors will not be displayed correctly"
I believe that is how the manual states it... has ANYONE seen reports substantiating this "possible" result?? It's "possible" you'll get hit by a bus crossing the street...
I don't know exactly how Vegas handles the extra colorspace or if it survives in the final render, but I always operate under the theory that the more info "IN", the more accurate and detailed final result you will achieve. Somehow the x.v.color looks just a tiny bit better to me, and leaving it on doesn't seem to do any harm.
Lorenzo Asso June 25th, 2008, 01:49 PM I have some doubts about xv colour. Let's see why:
- Sony says its dynamic range becomes larger (if you have a TV capable of reproducing xv)
- but the video bitrate stays at 16 Mbit/s.
Now, if the dynamic range is really larger, there is more information, and more information should need more bits...
It's not exactly the same thing, but think about the difference between 4:2:0 and 4:2:2 cameras: the 4:2:2 cameras (Panasonic, for example, with DVCPRO HD) use intra-frame (spatial-only) compression and a very high bitrate of 100 Mbit/s...
So in my opinion it is better to leave it off (it strikes me as telling that Sony leaves it off by default) because, imho, you would only get a real boost if the bitrate went higher... see the rough calculation below.
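As a rough back-of-envelope (my numbers for the frame sizes and rates, nothing official: I'm assuming the SR11/SR12's 1920x1080 mode at ~29.97 fps and DVCPRO HD's 1280x1080 raster for 1080i), compare the per-pixel bit budgets:

# Average compressed bits available per pixel, per frame.
def bits_per_pixel(bitrate_bps, width, height, fps):
    return bitrate_bps / (width * height * fps)

print(bits_per_pixel(16e6, 1920, 1080, 29.97))   # AVCHD at 16 Mbit/s -> ~0.26
print(bits_per_pixel(100e6, 1280, 1080, 29.97))  # DVCPRO HD at 100 Mbit/s -> ~2.41

That's roughly a 10x difference in budget, which is the scale of gap I mean.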
ciao
Steve Mullen June 25th, 2008, 07:16 PM "it's possible that some colors will not be displayed correctly"
I doubt Sony would put in a warning for no reason.
If I understand the whole process, these extra bits only travel to an HDTV via HDMI and are used only if the HDTV supports x.v.Color. I don't know anyone with such an HDTV. So my feeling is: why take a risk?
And, really, of all the things wrong with my videos -- a few missing colors are not one of them! :)
In a few years, when the world is living with BD players that support x.v.Color and everyone has replaced their first-gen HDTVs, it will make sense.
Although I'm sure the extra bits are in the AVCHD file, somehow I doubt they survive transcoding to any intermediate codec. As far as seeing better color: unless your monitor supports x.v.Color, I doubt you can see them. In fact, I would think that recording those specific colors that use the extra color space and then truncating them away could produce exactly the effect Sony is warning about.
Dave Blackhurst June 26th, 2008, 04:54 AM Ever read those "warning labels" like "don't operate in shower" on a hair dryer? OK, maybe that's just a bit much (I've a hair dryer with a warning not much shy of that on the cable though!), but companies have to go so overboard protecting the consumer from themselves that I have to ask: has ANYONE reported this effect (not theorized, ACTUALLY SEEN the problem) "in the wild"??
I don't transcode; files are dragged and dropped into Vegas, which may or may not deal with the additional colorspace - haven't really thought to investigate, as I turn the option on, and the colors look better to me in the cam and when I tested it with the old HC7. That was good enough for me to just turn it on and leave it on. Haven't suffered or noticed a single adverse display issue... that doesn't mean "something" couldn't happen in some specific hardware configuration, I suppose.
In the end, it's probably not really that much of a concern either way, IMO. I'd rather record the extra bits for futureproofing.
Will Cowling June 27th, 2008, 11:24 AM Also, how can you possibly not lose anything when you add an additional 2 bits for more colour? You only have 16 Mbps to work with; there's a set amount of space for all the data combined. I assume you must lose a little of some other image data when you add extra color info.
Wide gamut (xvColor) and deep color (10-, 12- or 16-bit color components) are two different things. You could combine the wider color gamut with deep color, but no consumer camcorders do this to my knowledge; they're all 8-bit. Equally, you could have deep color without the xvColor wide gamut. The gamut defines the range of colors available, while the color depth defines the number of colors (i.e. how fine the gradation is) across that color range.
The increased color information in xvColor comes from the use of previously out-of-range code values, not (necessarily) from increasing the bit depth. The old BT.709-5 standard (which shares its primaries with sRGB) limits luma values to 16-235 and chroma values to 16-240. The xvYCC (xvColor) standard allows chroma values from 1 to 254, and so allows a broader range of colors to be represented at the same color bit depth.
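To make those ranges concrete, here's a tiny sketch (my own illustration using the 8-bit ranges just described, not anything from the standards documents):

# Classify an 8-bit YCbCr triple against the ranges described above.
# BT.709 video range: Y 16-235, Cb/Cr 16-240; xvYCC extends chroma to 1-254.
def classify(y, cb, cr):
    if not (16 <= y <= 235):
        return "luma outside video range"
    if 16 <= cb <= 240 and 16 <= cr <= 240:
        return "legal BT.709"
    if 1 <= cb <= 254 and 1 <= cr <= 254:
        return "xvYCC-extended chroma (needs an x.v.Color display)"
    return "reserved (0 and 255 are sync values)"

print(classify(128, 128, 128))  # neutral gray -> legal BT.709
print(classify(235, 254, 1))    # extreme saturation -> xvYCC-extended chroma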
It's certainly possible that this increase in information (more colors recorded) may adversely affect recording quality on the SR11/SR12, but it may be that the AVCHD encoder handles it well, so it makes little noticeable difference. It's certainly not as bad as the effect of increasing color depth would be.
The potential 'problems' come from the unknown behavior of how your editing or display software will handle these out-of-range values. It seems likely to me that they may just be truncated, which is where problems may arise. To give an example: a valid color value for a pixel in xvColor format might be (Y=235, Cb=254, Cr=1). If this color is dumbly truncated to the BT.709-valid ranges it becomes (Y=235, Cb=240, Cr=16), which may be a noticeably different color.
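To put rough numbers on that example (a minimal sketch, assuming a standard BT.709 conversion matrix; this is not any particular NLE's actual pipeline):

# Show how "dumb" clamping of xvYCC chroma to BT.709 limits shifts the color.
def ycbcr_to_rgb(y, cb, cr):
    """8-bit video-range YCbCr -> nominal 0-255 R'G'B' (BT.709 matrix, unclipped)."""
    yf = (y - 16) * 255.0 / 219.0     # expand luma past foot/headroom
    cbf = (cb - 128) * 255.0 / 224.0  # center and scale chroma
    crf = (cr - 128) * 255.0 / 224.0
    r = yf + 1.5748 * crf
    g = yf - 0.1873 * cbf - 0.4681 * crf
    b = yf + 1.8556 * cbf
    return (round(r, 1), round(g, 1), round(b, 1))

xv = (235, 254, 1)                          # legal in xvYCC
clamped = (235, min(254, 240), max(1, 16))  # "dumb" clamp to BT.709 limits

print(ycbcr_to_rgb(*xv))       # ~(27.3, 295.8, 521.2)
print(ycbcr_to_rgb(*clamped))  # ~(54.2, 290.8, 491.6)

Both triples land outside the sRGB cube (blue overshoots 255), but the red channel alone differs by roughly 27 code values, so a clamping pipeline really could show a different color than the one that was shot.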
None of the editors (Vegas included) seem to explicitly support xvColor, so who knows what they're doing to out of gamut pixels.
Steve Mullen June 27th, 2008, 06:18 PM To give an example: a valid color value for a pixel in xvColor format might be (Y=235, Cb=254, Cr=1). If this color is dumbly truncated to the BT.709-valid ranges it becomes (Y=235, Cb=240, Cr=16), which may be a noticeably different color.
None of the editors (Vegas included) seem to explicitly support xvColor, so who knows what they're doing to out of gamut pixels.
That was my worry. If you record a color with Cb and Cr values outside the normal range, what happens when that information meets ANYTHING that cannot deal with those values?
And, yes, I can believe color on the camera's LCD may indeed look better -- but unless you connect via HDMI to an x.v.Color HDTV, you are not seeing better color where it counts. And since it will be years before most HDTVs support x.v.Color -- why take a chance?
Moreover, it's entirely possible the extra x.v.Color information never gets into or past your NLE. It certainly is not going to pass through any intermediate codec! And it isn't likely to survive export to Toast or any BD burning software. So it's got no place to go!
It's a marketing deal like JVC's 60p output -- something to convince you these products offer something you need, but that works only when you play back directly into your HDTV.