August 2nd, 2006, 07:25 PM | #1 |
New Boot
Join Date: Jul 2006
Location: Chihuahua, MEXICO
Posts: 8
|
IRE settings
OK... new to the JVC HD100...
I'm testing Paolo's TC3 calibration settings... what are the best recommendations for the IRE setting? I've heard that 0 is best for producing DVDs... is that true? What's the difference between 7.5 and 0? Regards |
August 2nd, 2006, 09:27 PM | #2 |
Wrangler
Join Date: Aug 2005
Location: Toronto, ON, Canada
Posts: 3,637
|
Hi Enrique, and welcome to dvinfo.
It sounds like you have had experience with the DVX100. On the DVX100 the setup level affected the digital signal... even though it shouldn't have. The HD100 doesn't suffer from this problem. The digital signal is always at 0 IRE setup level. The option in the menu ONLY affects the NTSC analog output. So the answer is that your results will be the same no matter what setup level you set the camera to. It doesn't affect the digital signal.
__________________
Tim Dashwood |
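To put some numbers on Tim's point: in standard 8-bit digital video, 0 IRE black corresponds to luma code 16 and 100 IRE white to code 235, so a 7.5 IRE pedestal would only show up as black sitting around code 32. The quick sketch below (Python, purely illustrative, assuming Rec. 601 studio-range coding rather than anything from the camera's documentation) shows the mapping; per Tim's explanation, the recorded signal keeps black at code 16 regardless of the menu setting.

# Illustrative sketch: mapping IRE levels to 8-bit studio-range luma codes
# (assumes Rec. 601 coding: 0 IRE -> 16, 100 IRE -> 235).

def ire_to_8bit(ire):
    """Return the 8-bit luma code for a given IRE level (studio range)."""
    return round(16 + (ire / 100.0) * (235 - 16))

print(ire_to_8bit(0))    # 16  -> digital black, what goes to tape
print(ire_to_8bit(7.5))  # 32  -> where black would sit if setup were recorded
print(ire_to_8bit(100))  # 235 -> reference white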
August 2nd, 2006, 09:39 PM | #3 |
New Boot
Join Date: Jul 2006
Location: Chihuahua, MEXICO
Posts: 8
|
IRE settings
Thank you so much....
|
December 18th, 2006, 09:28 PM | #4 |
Major Player
Join Date: Jun 2005
Location: Louisville, CO
Posts: 204
|
Hey Tim... are you sure?
I just had a shoot today where the client requested 0 IRE... now, the manual says that the setting in the menu only affects the analog output, and it did (he was happy with the output on the field monitor). However, the LCD on the HD100 was unaffected by this change. I told him that I didn't think the 0 IRE was going to tape. When changing the setup from 7.5 to 0, I would have expected it to update the LCD too... right? |
December 18th, 2006, 09:59 PM | #5 |
Trustee
Join Date: Jun 2006
Location: Burbank
Posts: 1,811
|
I will give a short explanation. As Tim says above, the IRE setting only affects the analog output signal.

In this regard, it affects the appearance of the video on an analog device being fed by that analog signal. Analog devices in NTSC North America are calibrated to receive an analog signal with 7.5 IRE setup in order to display correctly. PAL devices, and NTSC devices outside North America, are calibrated to accept an analog signal at 0 IRE. Some professional monitors are switchable to receive either a 7.5 or a 0 IRE signal; otherwise, the correct IRE signal needs to be sent to the device.

In the digital realm, however, everything is 0 IRE. For example, if I shoot NTSC DV or HDV, the signal is 0 IRE, and it is written to tape at 0 IRE. When I edit, it is all 0 IRE. If I am viewing while editing on an NTSC monitor that is calibrated for 7.5 IRE, I would want the analog signal sent to that monitor to be at 7.5 IRE, and I can probably do this in the software or in the device that sends the analog signal out to the monitor. In any case, the actual video I am editing is still 0 IRE.

All DVDs, since they are digital, are also 0 IRE. So I would take my edited footage that was shot (by default) at 0 IRE and burn it to a DVD without change, at 0 IRE. When I play the DVD to an analog TV, the DVD player will change the analog output to 7.5 IRE. If I play the DVD to a digital TV, the IRE stays at 0. Only when the video goes from the digital realm to an analog device does the IRE change, and only in NTSC North America.

The issue most often becomes a problem when taking old VHS into an NLE: the analog VHS is at 7.5 IRE and needs to be changed to 0 IRE. The opposite is true if you are making VHS dubs of a digital timeline. An NLE like Liquid PRO (and I'm sure others with breakout boxes) allows you to set the analog output (as does the HD100) to 7.5, for output to a VHS deck, for example. Analog-to-digital converters such as those made by Canopus also have the setting. However, some digital cameras, as Tim notes above, record 7.5 IRE setup to tape, and this is not correct.

If a client is asking for 0 IRE, chances are he doesn't understand what he wants and someone has confused him. There are two responses to this client that I can think of: (1) "Oh, sure, no problem, I'll do that for you" (since that's what he's getting anyway); or (2) "Oh really? How are you going to use the video that you need to be that specific about it?" (and then maybe you will get more information to understand what he's talking about, what he needs, and for what reason. The video will still be 0 IRE, but you may learn that he has some plans that it would be helpful for you to know about ahead of time... like he's going to make a VHS dub and send it to Europe, having been told it should be 0 IRE, but not realizing it also has to be PAL.) |
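One place where this matters in practice is the VHS-capture case above: if setup ends up baked into the digital file, black sits at roughly code 32 instead of 16 in 8-bit luma, and you would scale it back down before editing. Here is a minimal sketch of that correction (Python, assuming 8-bit Rec. 601 studio-range luma; the numbers and the helper function are illustrative, not taken from any particular NLE or converter):

# Illustrative sketch: removing a 7.5 IRE pedestal from 8-bit luma values
# captured from an analog source. Assumes Rec. 601 studio range:
# black = 16, white = 235, so 7.5 IRE is roughly 16 code values.
BLACK, WHITE = 16, 235
SETUP = round(7.5 / 100.0 * (WHITE - BLACK))  # ~16 codes

def remove_setup(y):
    """Map code 32 (7.5 IRE black) back down to 16 while keeping white at 235."""
    y = max(BLACK + SETUP, min(WHITE, y))  # crush anything below the pedestal
    return round(BLACK + (y - BLACK - SETUP) * (WHITE - BLACK) / (WHITE - BLACK - SETUP))

print(remove_setup(32))   # -> 16  (black restored to digital black)
print(remove_setup(235))  # -> 235 (white unchanged)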
December 18th, 2006, 10:18 PM | #6 |
Major Player
Join Date: Jun 2005
Location: Louisville, CO
Posts: 204
|
Thanks Jack! Very well explained.
FYI: the client is using the video directly on the internet, so it would stay in the digital world, being reproduced (largely) on digital monitors. So I assume he wanted to make sure he was recording the correct black levels as seen on PC monitors, not NTSC. That mostly makes sense. What I don't understand is why, when I changed the output to show 0 IRE, my LCD looked different from the external monitor. If I was recording 0 IRE and the external monitor was showing 0 IRE, then why did my internal LCD look more like 7.5? (Unless my LCD is out of adjustment??) Curious minds want to know. Thanks! |
December 18th, 2006, 10:46 PM | #7 | |
Trustee
Join Date: Jun 2006
Location: Burbank
Posts: 1,811
|
Quote:
In your case, exactly what external monitor (brand and model number) were you using? With the camera, the most accurate view you get is from the zebra bars, which tell you what the highlights are doing. Otherwise, the camera itself is not a very good way to judge what the picture looks like. |
|
December 18th, 2006, 10:56 PM | #8 |
Major Player
Join Date: Apr 2005
Posts: 512
|
Since LCDs work by blocking light from a backlight rather than emitting it directly, and there's only so much light they can cut out, they tend to have horrible black levels. This can create the illusion that setup level is being added to your video. It's not. It's just because LCDs suck.
|
December 25th, 2006, 10:24 PM | #9 | |
Regular Crew
Join Date: Aug 2006
Location: San Francisco, USA
Posts: 52
|
Quote:
Note: this question isn't just for Stephen-- I was just quoting him. Thanks. |
|
December 26th, 2006, 12:11 AM | #10 |
Major Player
Join Date: Apr 2005
Posts: 512
|
Of course a pro-grade LCD will be better than a consumer grade TV, but a pro-grade CRT will be better than a pro-grade LCD. I was also referring in the first place to the rather low-grade LCDs used in the viewfinders on consumer and prosumer cameras.
|
December 26th, 2006, 01:03 AM | #11 | |
Trustee
Join Date: Apr 2006
Location: United States
Posts: 1,158
|
Quote:
All that said, as far as 0 vs. 7.5 blacks goes: most video capture devices, when set to 7.5, ADD setup. If your VTR is already at 7.5 on its analog output, you have double setup added! So normally you capture with the 0 IRE setting because that represents unity gain: 0 = 0 and 7.5 = 7.5.

If your client's content is going to NTSC broadcast, 7.5 is a factor during editing/color correction. If not, it doesn't matter. For computer use, 0 is fine and it will look better, and even for DVD it's OK because all modern TVs have no problem with 0 blacks. It's a throwback to 50+ year-old technology and is of very limited relevance if you don't do broadcast TV, and even then it might not matter - it depends on the network. Plenty of 0 IRE is on TV; the question is whether the station just sends it out, or whether they have an auto-correcting proc amp on the line that messes with it and "auto-legalizes" it to 7.5.

I've just recently had a broadcast client request 0 blacks... and it's a major national network. They liked the way it looked.

Steve Oakley |
|
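To make the "double setup" arithmetic above concrete, here is a small sketch in 8-bit terms (Python; it assumes Rec. 601 luma with black at code 16 and treats the capture device's 7.5 mode as simply adding another pedestal - purely illustrative, not modeled on any specific VTR or capture card):

# Illustrative sketch: why capturing at the 0 IRE setting is "unity gain".
# Assumes 8-bit Rec. 601 luma: black = 16, one 7.5 IRE pedestal ~ 16 codes.
BLACK = 16
PEDESTAL = 16  # roughly 7.5 IRE in 8-bit studio-range luma

vtr_black = BLACK + PEDESTAL           # tape/VTR output already carries setup: ~32

capture_at_0   = vtr_black             # unity gain: 7.5 stays 7.5 -> ~32
capture_at_7_5 = vtr_black + PEDESTAL  # device adds setup again   -> ~48

print(capture_at_0)    # 32 -> correct it once in the NLE
print(capture_at_7_5)  # 48 -> "double setup", washed-out blacks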
December 26th, 2006, 03:08 PM | #12 | |
Major Player
Join Date: Apr 2005
Posts: 512
|
Quote:
A national network may request 0 IRE material because digital TV does not have setup anymore. However, setup would be added for analog NTSC broadcast. |
|