View Full Version : Video Signal Measurement and Standardization
Bill Ravens December 21st, 2008, 11:52 AM More and more, the nature of the visual entertainment industry is changing and evolving. New venues are opening for the digital content producer, including cell phones, iPods, portable and set-top video players, and digital content projection at theatres and other entertainment venues. Like it or not, the digital content landscape is exploding outwards, with standards and permutations of formats multiplying like rabbits in springtime.
Thru this massive explosion in presentation and delivery formats, digital content producers (videographers, editors, directors, producers) still have to get their footage from the camera to the presentation media. With the explosion in delivery formats and standards, the path to a viewable and professional presentation is becoming more and more tortuous. There has always been an implicit understanding (or was it a wink and a nod?) that all the transcoding would, somehow, maintain broadcast signal standards. From what I've seen, this is anything but true; for example, QuickTime's serious flaw with luma values in H.264.
Speaking for myself, the only way I can confidently navigate thru the morass of formats, transcoders and display media is to carefully monitor the quality of my production from acquisition to delivery. In the past, I've relied on digital, onboard metrics like SeriousMagic's HDRack, which has now evolved into Adobe's OnLocation. The drawback to this kind of signal monitoring is fairly obvious: it is limited to the 1394/FireWire bus that OnLocation supports. What I have been lacking, all along, is an instrument capable of monitoring the image stream at all phases of post production: acquisition to capture, capture to edit, edit to color correction and FX, render to transcode, transcode to distribution media (DVD, digital file, etc.) and, finally, media to display device. (Although, from what I've experienced, there seems to be more of a concerted effort by industry to assassinate any software that gives the DCP better control of the image stream.)
The USA is less than 60 days away from a radical departure from the NTSC format and standard that has been with the industry since television was invented. I find it disconcerting that no one, especially on this forum, has raised the issue of image stream adequacy and monitoring, other than the very subjective approach of actually looking at the delivered image with their eyes. I don't mean to belittle this approach, because, ultimately, the viewer doesn't sit in the theatre with a spectrophotometer measuring the technical "correctness" of the image stream. Yet, as digital content producers, I believe it's incumbent on all of us to do a technically credible job on what we deliver, with chroma and luma correctness, always with an artistic eye for color correction and color grading. There are a million ways to corrupt the content as delivered thru the presentation devices. Unless the content is correct when delivered, a real nightmare scenario is developing.
However, the tools available to DCPs are extremely expensive for all but high-dollar venues like TV stations. So, where am I going with this monologue? It would seem, to my thinking (OK, arguable, but let's not go there), that DCPs need a portable and relatively inexpensive device that can display video stream HISTOGRAM, waveform, vectors, luma and chroma values and voltages.
Yes, Martha, I'm talking about a versatile software histogram/WFM/vectorscope. Such a device should be able to intercept and pass thru the image stream, measuring the delivered content values, whether the signal is digital or analog. Ideally, such a device would be incorporated in all display devices, analog or digital, LCD, plasma, or CRT. But this isn't realistic or practical. So, how does an indie DP and director monitor the visual stream at all the intermediate delivery points, whether they're digital or analog, RGB or YPrPb, or any other format TBD? The standard interfaces are all meaningful: Y/C, composite, component, SD-SDI, HD-SDI, HDMI or DVI. Who can build such a device? Is there a market outside of my own needs for such a device? We're all familiar with those teeny-weeny histograms that come with modern DV cams. Wouldn't it be nice to have a three-channel RGB 720x480 histogram monitor for our cams? There are a ton of hardware devices on the market, but many or most are in the $10k-$50k range. Hardly suitable for event videographers or small indie producers, nor are they portable.
Most modern NLE software comes bundled with scopes of some form, and it's a wonderful thing. Unfortunately, these scopes show only what's on the timeline. When image streams are imported, a great deal of color remapping goes on. The editor has absolutely no way of measuring the adequacy of his input video stream. Likewise, when the timeline is rendered or exported, the amount of color data remapping that happens is jaw-dropping. The editor has absolutely no way of knowing the adequacy of the rendered/exported output without a visual examination after rendering is completed. The inexperienced editor waiting on a 12-hour render is in for a costly wait. "That looks washed out" or "that has all the shadow detail crushed" or "the colors are all wrong" are very subjective judgments, but exactly what's wrong with the image, and why? What stage of the process corrupted the data? Are you importing broadcast RGB or computer RGB? Are you exporting broadcast or computer RGB? Are you getting what you want and expect to get? We, as editors, assume the software must be right. Unfortunately, I've seen too many random examples of when it is not.
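As a trivial example of the kind of check I'm talking about, here's a rough sketch in Python (my own, not a feature of any NLE; it assumes the frame has already been decoded to an 8-bit luma plane in a NumPy array, and the 0.1% threshold is an arbitrary choice) that reports what a WFM's numeric readout would show and takes a guess at whether the material looks like studio range or full range:

import numpy as np

def luma_report(y_plane: np.ndarray) -> dict:
    """Summarize an 8-bit luma plane the way a WFM's numeric readout might."""
    y = y_plane.astype(np.uint8)
    stats = {
        "min": int(y.min()),
        "max": int(y.max()),
        "mean": round(float(y.mean()), 1),
        "below_16_pct": float((y < 16).mean() * 100.0),   # sub-black pixels
        "above_235_pct": float((y > 235).mean() * 100.0), # super-white pixels
    }
    # Crude heuristic: if essentially nothing sits outside 16-235,
    # the material was probably mapped to studio (broadcast) range.
    outside = stats["below_16_pct"] + stats["above_235_pct"]
    stats["looks_like"] = "studio range (16-235)" if outside < 0.1 else "full range (0-255)"
    return stats

if __name__ == "__main__":
    # Synthetic stand-in for a decoded frame: a left-to-right luma ramp.
    ramp = np.tile(np.linspace(16, 235, 720), (480, 1)).astype(np.uint8)
    print(luma_report(ramp))

Run the same report on the file you imported and on the file you exported, and you can at least see whether the range moved, without trusting your eyes or the NLE's remapping.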
There are a few digital technologists who will loudly proclaim that "digital is digital and there is no such thing as BLACK or white in a digital domain". Well, while this is technically true, our eyes are analog devices. The final presentation will ALWAYS be analog (until we get HD-SDI BNC plugs installed in our brains at birth). This is an infuriatingly incorrect perspective, and one that digital audio technologists have worked thru, by the way: 0 dBFS digital limits have a defined relationship to RMS and PPM levels in the analog domain. Like our eyes, our ears are analog sensors.
So, is it unreasonable to hope for a display device the size of a small book, with a 6 or 7 inch low-res LCD, that displays histogram, WFM and vector info? You wouldn't need more than 720x720 pixels, probably less. Such a device should have component and HDMI interfaces as a minimum, and 12V DC power. In fact, a rasterizer that overlays the scopes on the video device whose input is being measured could replace the onboard display. Display resolution can be fairly small for showing the waveforms and vectorscopes. Price range? How about $4000-6000 including HD-SDI inputs, $2000-4000 without HD-SDI.
I'd be more than happy to invest $500-$1000 as an early adopter in the first CREDIBLE company that would embark on building this device. Mike Schell at Convergent-Design, are you listening?
Bill Ravens December 22nd, 2008, 10:34 AM Already 75 views and not a single comment. Likewise with my parallel post here...
http://www.dvinfo.net/conf/sdtv-hdtv-video-monitors/139913-waveform-monitoring-calibration-standard-display.html
Perhaps this idea is just more BS amongst all the other BS floating around. I would sure like to hear back how many people would be interested, and at what price point your interest would wane. If it's BS, please feel free to say so. I think I know whose feedback I can put confidence in, and whose I can't.
Ervin Farkas December 22nd, 2008, 11:35 AM Bill, the need is legit, but honestly I don't think it will ever happen because of the price tag. Calibration is the reason these things cost so much; electronic devices drift over time due to chemical reactions inside components. Using the best parts costs a fortune, plus you have to have real-time corrections of these drifts in place - it becomes very, very complicated. Even so, you would have to calibrate the baby every so often. And there is not a huge market either.
Have you thought about one of these relatively new flash-based small laptops running software instead?
Bill Ravens December 22nd, 2008, 12:26 PM Ervin...
what software would I run? And what would be the I/O? Experience tells me that 1394 is insufficient and passe. As for calibration, if you ever used HDRack, you'd see that it had a pretty slick onboard calibration against self-generated SMPTE color bars and PLUGE. The nice thing about digital processing (there is no such thing as "black"; never thought I'd say that ;o) ) is that it doesn't need a lot of calibration and monitoring for accuracy; all the calibration needs are strictly for the analog components, just like SeriousMagic did it. HDRack was even smart enough to tell the user whether the signal was Rec.601 or 709, and adjust the reference appropriately.
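For what it's worth, the Rec.601/709 detection isn't magic either. I have no idea how HDRack actually implemented it, but here's a rough sketch of one way it could be done (Python; the reference values are computed from the standard luma coefficients, and the bar names and noise level are just for illustration): compute the YCbCr the 75% bars should produce under each matrix, then see which set the measured patches sit closer to.

import numpy as np

BARS_75 = {  # 75% amplitude R'G'B' values, normalised 0..1
    "white":   (0.75, 0.75, 0.75),
    "yellow":  (0.75, 0.75, 0.00),
    "cyan":    (0.00, 0.75, 0.75),
    "green":   (0.00, 0.75, 0.00),
    "magenta": (0.75, 0.00, 0.75),
    "red":     (0.75, 0.00, 0.00),
    "blue":    (0.00, 0.00, 0.75),
}

def rgb_to_ycbcr(rgb, kr, kb):
    """8-bit studio-range YCbCr from normalised R'G'B' using the given matrix."""
    r, g, b = rgb
    y = kr * r + (1.0 - kr - kb) * g + kb * b
    cb = 0.5 * (b - y) / (1.0 - kb)
    cr = 0.5 * (r - y) / (1.0 - kr)
    return (16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr)

def guess_matrix(measured):
    """measured: bar name -> (Y, Cb, Cr) sampled from the incoming bars."""
    refs = {
        "Rec.601": {n: rgb_to_ycbcr(c, 0.299, 0.114) for n, c in BARS_75.items()},
        "Rec.709": {n: rgb_to_ycbcr(c, 0.2126, 0.0722) for n, c in BARS_75.items()},
    }
    errors = {label: sum(np.sum((np.array(measured[n]) - np.array(ref[n])) ** 2)
                         for n in measured)
              for label, ref in refs.items()}
    return min(errors, key=errors.get), errors

if __name__ == "__main__":
    # Pretend the incoming bars were encoded with Rec.709, plus a little noise.
    rng = np.random.default_rng(0)
    meas = {n: np.array(rgb_to_ycbcr(c, 0.2126, 0.0722)) + rng.normal(0, 0.5, 3)
            for n, c in BARS_75.items()}
    print(guess_matrix(meas))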
Ervin Farkas December 22nd, 2008, 12:42 PM How'bout we ask our own John Miller (http://www.dvinfo.net//conf/member.php?u=20251) to further develop the capturing program Enosoft - High Performance Tools For Music And Video (http://www.enosoft.net/) he already has?
John, you copy? Over...
Bill Ravens December 22nd, 2008, 01:09 PM Great reference, Ervin! I'll look into this further. Maybe you're right. It could be as simple as something like this ported to a PDA.
Several production monitors (Panasonic) come with these features built in. Very handy when setting up the monitor, then viewing the color corrected footage. Problem is, these are built into the monitor and can't be used elsewhere, but it demonstrates what I think needs to be done on a more versatile and economic platform. Given that a Panny production monitor is about $2700, perhaps the price point is more like $1500?
Shaun Roemich December 23rd, 2008, 09:41 AM Bill: I'd happily invest in your "proposed" product if it were a hardware solution that accepted HDMI, Analog Component and HD-SDI, ran on 12v, either on 4 pin XLR or AB Mount AND I found myself with the disposable income to route in that direction. Coming from a broadcast background, I know and appreciate what a WF/VS brings to the table and adding a histogram to that would be awesome. It's just that I'm having such a hard time parting with $5k for something that isn't DIRECTLY related to image acquisition or editing (or lighting or audio or...) as I DO have a workflow that involves keeping the Scopes window open in FCP while editing and MOST of the time, that's "good enough" for me. Your proposal would be much better, but I'd need to weigh the cost against the "value". I own an aging WF/VS for SD work and I test my DVDs every so often to make sure everything is tickety-boo but when it gives up the ghost, maybe I'll be more inclined to "upgrade".
Thanks for sharing!
Bill Ravens December 23rd, 2008, 02:11 PM Thank YOU Shaun for responding.
The cost figure is really something I pulled out of the air without any research. Chances are it might be a lot cheaper, but I am not in a position to get realistic costs at this point in time. I do know that there is a licensing fee for HD-SDI that doesn't come cheaply, so that feature alone will add about $1000 to each unit.
Bob Grant December 24th, 2008, 05:27 AM The issue of component drift is largely irrelevant in the digital domain. It's just ones and noughts.
However there's still a very fundamental problem, standards.
Even with the old 1V composite video there are multiple standards. Unless you know what's meant to be on the cable and how it's being used and interpreted downstream, any instrument could be worse than useless. We were fairly safe; mostly, anything non-standard was hopefully flagged as such.
From what I read, it gets more complicated in the digital domain. FireWire can carry several different kinds of video. The digital numbers can have a range of values and ways they're meant to be interpreted. It gets even more complex with HD-SDI. About the only thing one can confidently say about HD-SDI is that it uses a BNC connector, unless it's fibre, or dual-link.
If you only wanted a meter that's going to work with a camera such as, say, the EX1/3, then things should be fairly straightforward; however, that'd kind of limit its market. Other cameras seem to use a different range of values for 100% black and white. That's before we even consider 10-bit log.
Bill Ravens December 24th, 2008, 06:34 AM Bob...
Always appreciative of your upbeat, can-do attitude. I think the issue you're referring to is called "calibration". There is a way to do anything, just gotta figure out how and how much.
Billy Steinberg December 24th, 2008, 07:38 AM It doesn't have a graphic histogram function, but it has everything else you ask for, and then some.
It's small, light, 12V battery powered, accepts any SDI video (HD or SD) in any format, and shows you everything you could possibly want to know about the signal. It not only has a wonderful picture display and every test scope function, but it also will show you gamut errors going between RGB and YCbCr and composite. It has FULL numeric data displays with history and logging for the data. You can even send the info out of the built-in Ethernet port rather than to a USB stick which plugs into the front of the unit. (Maybe this is close enough to your histogram request, though it's numeric data rather than a graphic display of the data.)
It will even display an (NTSC or PAL) composite analog signal, though only as a (full screen) picture with no scope or data functions available. This is to aid the DP or cameraperson who wants to mount the 5330 right on the camera rig, to use as a viewfinder, but doesn't have access to an SDI signal.
The Leader 5330 retails for $6995, but it's typically available at a discount.
I have one that I picked up two weeks ago, and I love it. I've already used it in Europe for HBO (at 1080i/50) and here in the states for the Metropolitan Opera (1080i/59.94), both with great success.
Here's a URL into the USA Leader site for the 5330; I hope it's within forum guidelines to post it. Check out the CineZone and CineLight functions while you're there, if you're a DP (rather than just a video engineer), and note that the 5330 can display two or four different modes at the same time, not just one (e.g., Waveform AND Vector AND Picture AND Audio).
5330 Waveform Monitor - Leader Instruments (http://www.leaderusa.com/web/products/video_monitor/lv5330.htm)
Billy
ps I have NO affiliation with Leader, other than being a very happy customer.
Bill Ravens December 24th, 2008, 07:46 AM Billy...
Many thanx. The LV5330 looks like a very nice piece of equipment. Its price is a bit steep, but that may just be reality. Another visitor to this forum has referred me to the HAMLET series of products, which also look very nice. HAMLET's website is kind of funky, with no pricing or availability data. Looks like my proposed instrument may already be in production. Will report back.
Bob Grant December 24th, 2008, 08:23 AM
We've had an SD Hamlet for years; not a bad piece of kit, and it can overlay the scopes over the vision.
However, having read the original post several times, I'll say again that such instruments are not going to address the very valid issues raised. Most of the things that go wrong are not for lack of such instruments or their calibration to set standards. The problem is the variety of standards. This is a huge issue industry-wide with HD, and the further you go up the food chain the bigger it becomes. There are committees working to try to solve these issues. It in part comes about as we move from traditional "video" into what is more digital film. If everyone and everything from the camera through the post chain stuck to, say, SMPTE 292M, these issues would be trivial and most of the problems wouldn't happen.
It becomes even more of an issue as we go tapeless, there's less opportunity to monitor a signal on a wire that's going into a VTR. Even in the world of HD tape there's a number of standards used on exactly the same tape. Bits on a disk can be anything.
Bill Ravens December 24th, 2008, 08:58 AM It becomes even more of an issue as we go tapeless, there's less opportunity to monitor a signal on a wire that's going into a VTR. Even in the world of HD tape there's a number of standards used on exactly the same tape. Bits on a disk can be anything.
AFAIK, legacy products are just that, leftovers. It is what it is. My primary interest is in HD and HDV, hereafter referred to as HD(v). If an HD(v) disk or tape has multiple standards on the same tape, whose fault is that? Seems to me the DP has some responsibility to manage these things, or am I being naive? I always put a colorbar leader on a record. If I change the basis of the colorbar leader mid-record, it's up to me to lay down a new colorbar leader. I recognize that in the "heat of battle" this sometimes doesn't happen. Then it's up to the poor editor to salvage what he/she can.
My principal function is as an editor. I take what the DP gives me, and I make the best of what I've got. I work in a structured way, being an engineer by training. So many times I get a tape with no idea what camera it came from, or even what format. If it is a film transfer, the format becomes an even bigger mystery, sometimes. What happened at the transfer house? Ya just never know. Just laying the input on a scope is a huge step forward, for me. Then there are other transfers, the colorist, the host display system, etc., etc. The nebulous standards NECESSITATE some means to define whether the source is blown out or crushed or the display is mis-adjusted. It seems rather senseless to constantly be readjusting the display to suit the source. And I've been in enough preview showings to know that this is what's done, trying to get the playback system to show a good image.
Shaun Roemich December 24th, 2008, 09:04 AM Ok, time for me to try my hand at Devil's Advocate...
Bob, you raise some interesting and valid points. Wondering, though, if there really is a NEED to KNOW EXACTLY what is going to tape or disc. An analog component or HD-SDI feed out of the camera from the imager will allow us to "rough in" the image (and I use the term LOOSELY - using vector/waveform/histogram to "tweak" an image is FAR from "rough"), so isn't monitoring the EXACT bitstream going to storage a BIT (pardon the pun...) academic?
Don't get me wrong; I'm not suggesting 8-bit HDV at 25 Mbit/s is going to look the same as 270 Mbit/s 10-bit log (forgive me if that EXACT flavour doesn't exist... mere speculation). But by field-prepping the image PRIOR to it becoming data, aren't we MAXIMIZING what we can do with a given file/video stream in post?
Disclaimer: the above is ONLY designed to stimulate further discussion that I am actually excited to read. I LOVE this stuff!
Bill Ravens December 24th, 2008, 09:24 AM I agree, Shaun. My own turn as devil's advocate, though, would follow up on your comment re: 8-bit, 10-bit, whatever. Clearly the convention for what is digi-black shifts around when the encoding bit depth shifts. How to deal with this? Is it still up to the DP, with human input needed? Yet it's gotta be detectable in the header, or something. Scopes that I've seen don't know the difference between Rec.601 and Rec.709... that bothers me.
In the end of things, WHATEVER digi-black is defined as, it has to be played back at IRE 7.5 on an NTSC analog display screen. That's a gain setting. And digi-white falls out from the gamma curve. So, in the conversion to analog, a gamma curve is needed (linear slope or non-linear, take your pick). That's just another gain setting. If it's played back on an HDTV display, the gain settings change... I THINK it's that simple, but I could be wrong... in fact probably am.
Shaun Roemich December 24th, 2008, 12:47 PM In the end of things, WHATEVER digi-black is defined as, it has to be played back at IRE 7.5 on an NTSC analog display screen.
So here's hoping that the LOGICAL response to the "banishment" of analog over the air broadcasting is a complete phasing out of analog gear, along the same lines as colour supplanting black and white...
And a follow up: correct me if I'm wrong but 7.5 IRE is only a requirement for NTSC TRANSMISSION, correct? You can safely pass 0 IRE over local analog connections, if I'm not mistaken. Does our "gift" from the regulators in February make 7.5 IRE essentially obsolete?
Bill Ravens December 24th, 2008, 01:45 PM So here's hoping that the LOGICAL response to the "banishment" of analog over the air broadcasting is a complete phasing out of analog gear, along the same lines as colour supplanting black and white...
And a follow up: correct me if I'm wrong but 7.5 IRE is only a requirement for NTSC TRANSMISSION, correct? You can safely pass 0 IRE over local analog connections, if I'm not mistaken. Does our "gift" from the regulators in February make 7.5 IRE essentially obsolete?
I will bow to the East when NTSC is dead and buried. (Maybe I should be bowing to the west, toward Japan.) Unfortunately, I believe NTSC is more than a broadcast standard. I think American NTSC TV sets expect analog black to be at 7.5 IRE. If it's not, the TV will crush the shadows. Also, DVD players destined for sale in the USA ADD setup to playback. There are codecs that incorrectly add setup, resulting in a blown-out image on a viewer that does not comply with NTSC standards, like most computer monitor screens. Just part of the reason I think a WFM is so important.
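To show what I mean by "adding setup" digitally, here's a back-of-the-envelope sketch (purely illustrative, my own arithmetic; it is not how any particular codec or player is actually written) of what happens to the 8-bit codes when a 7.5 IRE lift gets baked into the data itself:

def add_setup_digitally(y: int) -> int:
    """Lift studio-range luma as if 7.5 IRE setup were applied to the code values."""
    # 16 -> 0 IRE and 235 -> 100 IRE without setup; with setup, 0 IRE becomes 7.5 IRE.
    ire = (y - 16) * 100.0 / 219.0        # read the code on the no-setup IRE scale
    ire_with_setup = 7.5 + ire * 0.925    # squeeze 0-100 IRE into 7.5-100 IRE
    return round(16 + ire_with_setup * 219.0 / 100.0)

for label, code in [("black", 16), ("mid grey", 126), ("white", 235)]:
    print(f"{label:>8}: {code:3d} -> {add_setup_digitally(code):3d}")

Black at code 16 lifts to roughly 32 while white stays at 235, so on a display that does NOT expect setup the shadows look washed out; have it happen twice (player plus codec) and the damage compounds. Do the reverse on material that never had setup and the shadows crush. That's exactly the kind of shift you never see without a scope.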
Robert Wiejak December 24th, 2008, 04:26 PM ...Does our "gift" from the regulators in February make 7.5 IRE essentially obsolete? ...
No. The date changes nothing and your output really depends on your target audience.
If you produce for TV then you still need to conform to broadcast standards. If your target audience is computer displays, projectors, etc., you can produce anything you like – anything your target can display. The ‘February 2009’ date you guys speak of has nothing to do with NTSC or the levels within it. It only has to do with the over-the-air method of transmitting the TV signal. The old analog (AM) transmitters go OFF, the new digital (ATSC) transmitters go ON, that’s all. The output of the new DTV converter boxes will still be the same old analog NTSC signal and/or some modern variation of it. The same old method of transmitting analog TV (AM) will still be in use for years in cable. So nothing is changing.
If I may ask: Just out of curiosity, what is it that troubles you so much about ‘7.5 IRE’?
Rob
Bill Ravens December 24th, 2008, 05:00 PM For myself, 7.5 IRE is a small annoyance by itself. Yet in my mind, it represents all the "problems" associated with rasterized image display, all the problems with the "funkiness" that is NTSC, including but not restricted to interlaced images, 60 fps, color standard confusion, and all those kinds of legacy issues. NTSC TV has defined the nature of motion images, whether it's over the air or not, since it was invented. It's archaic, arcane and totally needless... except that it sticks like a bad head cold. It means that production has to account for NTSC (USA), NTSC (ROW), PAL, 601/709, and now HDTV. Can't help but think how much easier things would be if there was only one or two standards, instead of half a dozen.
Bob Grant December 24th, 2008, 05:30 PM I will bow to the East when NTSC is dead and buried. (Maybe I should be bowing to the west, toward Japan.) Unfortunately, I believe NTSC is more than a broadcast standard. I think American NTSC TV sets expect analog black to be at 7.5 IRE. If it's not, the TV will crush the shadows. Also, DVD players destined for sale in the USA ADD setup to playback. There are codecs that incorrectly add setup, resulting in a blown-out image on a viewer that does not comply with NTSC standards, like most computer monitor screens. Just part of the reason I think a WFM is so important.
Not that I live in NTSC land, however I think I can safely say, yes, USA TVs do expect 7.5 setup on their composite inputs. Digital transmission would most likely not have setup, the same as DV doesn't. It's the responsibility of any D<>A converters to add/remove the setup. My ADVC-300 has a switch for this purpose: On if I'm dealing with analog tapes from the USA, Off if they came from Japan. That said, it would appear that some of the early A<>D converters did not do this. Levels are not the only area of confusion either. Some of the early systems used a different number of pixels and a different field order.
The issue here really is that at acquisition anything that will fit into the available number of bits can be and is used. Broadcast standards exist for broadcasting, so everything conforms to the one standard, and for obvious reasons. If you want to hand a tape to a broadcaster to put to air, it needs to conform to their standards; if it doesn't and they're serious about their sound and images, they'll likely reject it. Internally, broadcasters tend to shoot vision and record sound to their standards, especially for ENG. This caused me some degree of confusion at first, as their tapes seemed to have low audio levels etc. However their thinking is valid: they might need to take a tape straight from a camera and play it to air. We're not working under those constraints, and the full range of digital values is at the disposal of our cameras.
Certainly a cheap portable HD/SD SDI waveform monitor should find a market. If it was reasonably priced I'd buy one for use with my EX1, as the built-in histogram is less than useless; although I can generally get by with the zebras, they tend to get in the way of the other functions the camera's monitor gets used for. So yes, I'm as interested as Bill in such an instrument, but I know it will not be a solution to all the issues he raised initially.
When I'm handed a DV or HDV tape I always check it with my NLE's scopes so I know what I'm dealing with. I check anything of dubious parentage on a CRT as well. More than once I've had media with the wrong field order or files with mixed field order. Based on what my scopes read, I deal with it as need be to preserve all of the dynamic range that was recorded. Depending on what the deliverable format is, I may then need to adjust vision levels to suit. I notice there's a lot of interest in using the latest digital still cameras to shoot moving images. I'll bet they're not conforming to any SMPTE standard; at a guess they'll likely record the same way as they record still images, using all the available bits.
Bill Ravens December 24th, 2008, 05:48 PM Hey...
MERRY CHRISTMAS, everyone.
Robert Wiejak December 25th, 2008, 02:07 PM ...For myself, 7.5 IRE is a small annoyance by itself...
...It means that production has to account for NTSC (USA), NTSC (ROW), PAL, 601/709, and now HDTV. Can't help but think how much easier things would be if there was only one or two standards, instead of half a dozen...
I don’t agree with you there. I think you are a bit overwhelmed with all these different standards for no reason at all.
If for luma (value Y) you just stick with these two numbers, 16 and 235 (for 8-bit coding), you will be 99% broadcast compliant in NTSC, PAL and SECAM lands and in HDTV. There is no need to differentiate for what standard you are producing. The big confusion, and I mean the BIG CONFUSION, is in the way different standards interpret these two values. Let me explain:
ITU-R BT.601 - ‘Studio encoding parameters of digital television for standard 4:3 and wide screen 16:9 aspect ratio’.
ITU-R BT.709 - ‘Parameter values for HDTV standards for production and international programme exchange’.
These two documents, as you know, define all parameters for encoding of all SD and HDTV television standards. Aside from the numbers of lines and the method for encoding color, these two standards also define minimum and maximum values for parameters like Y, U and V, the bread and butter of the video signal. Guess what!? Across all these different standards, the limits are all the same for 8-bit coding.
Quote from ‘601 (just the relevant parts):
http://www.zabcia.ca/pics/601-1.gif
These 625/50 and 525/60 systems effectively describe NTSC, PAL and SECAM.
Quote from ‘709 (just the relevant parts):
http://www.zabcia.ca/pics/709-1.gif
Same as above, the table describes parameters for all forms of HDTV.
So if according to ‘601 and ‘709 the limits for Y, U, and V are identical across NTSC, PAL, SECAM and HDTV, then why the confusion?
For different reasons, each system (NTSC, PAL, SECAM and HDTV) looks at the same numbers and reads them differently.
From the tables above we know that black is defined as Y at value 16 and white as Y at value 235 no matter what system.
In the USA and Canada we have NTSC and we use the IRE scale here. Due to the way some test signals were designed (SMPTE bars), the IRE scale starts at Y = 0 (0 IRE); this is to accommodate PLUGE. So if Y at 235 is 100 IRE, then Y at 16 is 7.5 IRE.
In PAL and SECAM land they don’t use IRE units and they don’t have black setup; they use percent of full scale. The scale there starts at Y = 16. Therefore black (Y=16) is 0%FS and white (Y=235) is 100%FS.
As you can see, there is no way of getting away from these two numbers no matter what system you work with. But I can also see how one would confuse the two different methods of reading the same values:
      |  Y  | IRE | %FS
------+-----+-----+-----
Black |  16 | 7.5 |   0
White | 235 | 100 | 100

IRE = NTSC
%FS = PAL, SECAM, NTSC(JP)
The same values are true for HDTV signals. The only exception is when using 10-bit coding: then black is at 64 (16*4=64) and white is at 940 (235*4=940).
So no matter what system you produce for, as long as you stick to 16 and 235 limits, you have nothing to worry about.
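If it helps, here is the same mapping written out as a small Python script (my own illustration and rounding, nothing official), so you can see one pair of code values read on all the scales at once:

def code_to_scales(y8: int) -> dict:
    excursion = 235 - 16  # 219 codes between black and white
    pct_fs = (y8 - 16) * 100.0 / excursion        # black = 0 %FS, white = 100 %FS
    ire    = 7.5 + (y8 - 16) * 92.5 / excursion   # black = 7.5 IRE, white = 100 IRE
    return {"8-bit": y8, "10-bit": y8 * 4, "IRE": round(ire, 1), "%FS": round(pct_fs, 1)}

for y in (16, 126, 235):
    print(code_to_scales(y))
# {'8-bit': 16, '10-bit': 64, 'IRE': 7.5, '%FS': 0.0}
# {'8-bit': 126, '10-bit': 504, 'IRE': 54.0, '%FS': 50.2}
# {'8-bit': 235, '10-bit': 940, 'IRE': 100.0, '%FS': 100.0}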
For colors (chroma) there are different considerations, so I won’t even go there.
But, there is one more thing worth mentioning about HDTV. If you look at section 5.7 ('709), you will notice that the video data in HDTV can span from 1 to 254 and there are two ‘reserved’ values 0 and 255 – they are reserved for timing reference. I don’t know how or where they are being used, but I would avoid these values at all cost.
Merry Christmas
Rob
Bill Ravens December 25th, 2008, 02:49 PM Thanx, Rob. You and I understand, it would seem. Unfortunately, there are a LOT of people out in the production world who don't. What I get are files that don't comply because of confusion about these issues. To add to the complications, some codec software writers don't get it, either. And therein lies the crux of my dissatisfaction. Some codecs will remap to 16-235, while others map to 0-255. ALL codecs should allow the user to pick, if I were king. Problem is, most codecs have no operating guides to describe how and when they remap or pass thru. Some cameras generate superwhites; I suppose it's because the maker wants to squeeze as much data as he can into 256 levels, I don't know. And then there's the issue of video for the web, which isn't NTSC at all, with RGB 0-255. And, as you point out, 10-bit will add a whole new dimension to the confusion for them.
With all these permutations, NLE scopes show the timeline color mapping. NLEs aren't smart enough to show waveforms AFTER rendering; although they could be made that smart, they aren't. What has the editor gotten after a render? It depends on what codec was rendered to, and how the codec is designed to remap values.
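For anyone who hasn't seen the two mappings spelled out, here's a rough sketch (my own code, not lifted from any codec; chroma scaling is left out for brevity) of the two luma remaps in question, "studio/broadcast" 16-235 versus "computer" 0-255:

import numpy as np

def studio_to_full(y: np.ndarray) -> np.ndarray:
    """Expand 16-235 luma to 0-255; sub-black and super-white codes clip."""
    y = y.astype(np.float32)
    return np.clip(np.round((y - 16.0) * 255.0 / 219.0), 0, 255).astype(np.uint8)

def full_to_studio(y: np.ndarray) -> np.ndarray:
    """Compress 0-255 luma into 16-235."""
    y = y.astype(np.float32)
    return np.round(y * 219.0 / 255.0 + 16.0).astype(np.uint8)

codes = np.array([0, 16, 128, 235, 255], dtype=np.uint8)
print("studio->full:", studio_to_full(codes))   # 16 -> 0, 235 -> 255
print("full->studio:", full_to_studio(codes))   # 0 -> 16, 255 -> 235

Apply the wrong one, or apply one of them twice, and blacks either crush or wash out. That's the shift a post-export scope would catch instantly, and that the timeline scopes never show.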
Shaun Roemich December 25th, 2008, 03:37 PM With all these permutations, NLE scopes show the timeline color mapping. NLEs aren't smart enough to show waveforms AFTER rendering; although they could be made that smart, they aren't. What has the editor gotten after a render? It depends on what codec was rendered to, and how the codec is designed to remap values.
I suppose you are talking about an EXPORT render here? The WF/VS in my NLE SHOULD be showing me the values that will be created as part of my render file created to display my sequence in realtime (i.e. ProRes material captured, edited on a ProRes timeline, colour corrected and rendered to ProRes for output via an I/O card should maintain the integrity of the values the scopes indicate, AT LEAST until the video hits the output stage of the card (which may have gamma lookup tables or 8-to-10-bit converters or other "magic" going on)).
BTW, this is a GREAT discussion!
PS. I am one of those editors who works with my NLE's WF/VS open during colour correction and fine tunes before final output. Glad to see I'm not the only one...
Bill Ravens December 25th, 2008, 03:39 PM Sorry, I use "render" and "export" somewhat interchangeably; I am talking about exporting. As a practical example of what I'm saying, bring a SMPTE colorbar pattern into your NLE. Be sure to include PLUGE. Export the colorbars and PLUGE to both Windows Media and something like an MPEG2 file. Import both the WMV and the MPEG2 back into your NLE. Look at the waveform and vectorscope for all three. Are they identical? Have the luma values been shifted? What about the color values? If you have an eyedropper tool in your NLE, measure the color bars for each. Are they all the same?
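If you'd rather script the comparison than eyeball it, something like this works too (a rough sketch; it assumes you've already grabbed the same frame from each version as an 8-bit luma array, and the patch positions are made up for a 720x480 bar chart):

import numpy as np

PATCHES = {  # name: (x, y, width, height) - illustrative positions only
    "white bar": (20, 40, 60, 120),
    "red bar":   (520, 40, 60, 120),
    "pluge":     (340, 400, 40, 40),
}

def patch_means(y_plane: np.ndarray) -> dict:
    return {name: float(y_plane[y:y + h, x:x + w].mean())
            for name, (x, y, w, h) in PATCHES.items()}

def compare(original: np.ndarray, roundtrip: np.ndarray) -> None:
    a, b = patch_means(original), patch_means(roundtrip)
    for name in PATCHES:
        print(f"{name:10s} original {a[name]:6.1f}  round-trip {b[name]:6.1f}  "
              f"shift {b[name] - a[name]:+6.1f}")

if __name__ == "__main__":
    # Synthetic stand-ins: the "round-trip" copy has had its luma expanded to full range.
    orig = np.full((480, 720), 180, dtype=np.uint8)
    rt = np.clip(np.round((orig.astype(np.float32) - 16) * 255 / 219), 0, 255).astype(np.uint8)
    compare(orig, rt)

If every patch comes back with a shift of zero, the codec left your levels alone; if not, you know exactly which direction it remapped and by how much.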
Ron Cooper December 27th, 2008, 07:37 PM Just for my cent's worth. I don't go into the detail most of you guys do, but I think you hit it on the head, Bob, with standards.
I have an expensive and very useful Sony DVD recorder with FireWire input, which is just a few years old now, but if I feed it directly from, say, the timeline in Premiere 6.5 (I wish I could do this in Vegas), it often produced overloaded sound, even when I did an audio calibration line-up in Premiere. In analogue parlance I would call it input overload. In digital it is disgusting!
Obviously there are some different standards here, as I had not encountered this with other devices such as cameras etc.
As usual, Sony were quite useless, as they never admitted to anything wrong with their product other than to blame Adobe!
As I am now aware of the problem, I make sure my levels are lower than normal to work around it.
Standards, what are THEY !
RonC.
Bill Ravens December 29th, 2008, 08:27 AM I suppose you are talking about an EXPORT render here? The WF/VS in my NLE SHOULD be showing me the values that will be created as part of my render file created to display my sequence in realtime (i.e. ProRes material captured, edited on a ProRes timeline, colour corrected and rendered to ProRes for output via an I/O card should maintain the integrity of the values the scopes indicate, AT LEAST until the video hits the output stage of the card (which may have gamma lookup tables or 8-to-10-bit converters or other "magic" going on)).
BTW, this is a GREAT discussion!
PS. I am one of those editors who works with my NLE's WF/VS open during colour correction and fine tunes before final output. Glad to see I'm not the only one...
Unfortunately, I have yet to encounter an NLE with a built-in set of scopes that "scope" the post-export image. If they did, half of my consternation would be solved.
And, a resounding YES! How can anyone do color correction without scopes? Without them, it's Russian roulette. I am quite fond of adjusting the gamma curve slope. Doing this, it is very, very easy to exceed broadcast standard limitations, even if the gamma curve endpoints are fixed at legal values. Without a scope, you just can't see that.
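It doesn't replace a scope, but even a crude programmatic check will flag it. Here's a sketch (my own; the contrast boost in the demo is just an example of how easily a correction pushes codes out of range) that counts illegal values after a grade:

import numpy as np

def illegal_counts(y, cb=None, cr=None):
    """Count codes outside the broadcast-legal ranges (16-235 luma, 16-240 chroma)."""
    report = {"luma outside 16-235": int(((y < 16) | (y > 235)).sum())}
    for name, plane in (("Cb", cb), ("Cr", cr)):
        if plane is not None:
            report[f"{name} outside 16-240"] = int(((plane < 16) | (plane > 240)).sum())
    return report

ramp = np.tile(np.linspace(16, 235, 720), (480, 1))   # perfectly legal luma ramp
boosted = 126 + (ramp - 126) * 1.2                    # contrast boost about mid grey
print("before:", illegal_counts(ramp))
print("after :", illegal_counts(np.round(boosted)))

Run that on a rendered frame and you get a yes/no answer on legality even when the picture "looks fine".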