View Full Version : A technologist's view of HDV
Douglas Spotted Eagle February 9th, 2005, 04:07 PM It's not just the bandwidth, it's a cost factor at a number of levels, plus it's not an "official" standard at this point. It'll get there, but will it be in the next 10 years? Likely not.
Were it that anyone had the wisdom 20 years ago to realize that the convergence of computer and television was going to happen, *maybe* it could have been planned for differently. Would have been nice, eh?
Kevin Shaw February 9th, 2005, 04:11 PM <<<But what about 1080p30 provisional, for 1080p60 in the future? >>>
Two problems with this: one is that video shot at 30 progressive frames per second has an odd "strobing" look which many people find distracting, and the other is that 1920x1080 resolution is simply too demanding for much of today's technology. Sure, someday it will be no problem to process and display 1080/60p video, but it's going to be slow and painful getting to that point. We would have been better off doing 720/60i as a precursor to 720/60p, because we could get there a lot quicker. Instead we're using 1080/60i and struggling to deal with it, when a lot of expensive big-screen TVs can't even properly display that resolution. The end result is that 1080/60i is going to become the de facto standard in the U.S. for at least the next several years, and that's a shame.
Davi Dortas February 9th, 2005, 04:43 PM <<<-- Originally posted by Kevin Shaw : <<<But what about 1080p30 provisional, for 1080p60 in the future? >>>
Two problems with this: one is that video shot at 30 progressive frames per second has an odd "strobing" look which many people find distracting -->>>
Huh? Most episodic programs on TV are shot in either 24P or 30P. Star Trek: TNG was shot in 30P because it made editing easier than removing and adding pulldown with 24P. Unless by "many people" you mean just yourself, in which case you could be correct; as it stands, such a broad statement is simply untrue.
Kevin Shaw February 9th, 2005, 04:52 PM <<<Huh? Most episodic programs on TV are shot in either 24P or 30P. Star Trek: TNG was shot in 30P because it made editing easier than removing and adding pulldown with 24P. Unless by "many people" you mean just yourself, in which case you could be correct; as it stands, such a broad statement is simply untrue. >>>
Okay, maybe this is only a problem with prosumer-priced video equipment, but it's a widely discussed phenomenon nonetheless. It is arguably the primary reason why we use interlaced video in the first place!
Gabriele Sartori February 9th, 2005, 05:38 PM <<<-- Originally posted by Davi Dortas :
Huh? Most episodic programs on TV are shot in either 24P or 30P. Star Trek: TNG was shot in 30P because it made editing easier than removing and adding pulldown with 24P. Unless by "many people" you mean just yourself, in which case you could be correct; as it stands, such a broad statement is simply untrue. -->>>
Don't be confused by frame rate vs shutter rate. The film industry shoots at 24 fps, but they have a blade giving the effect of 48P. I have a JVC HD1 and I get a very similar effect when I keep my shutter at 1/60 even when I shoot at 30fps. It is not the perfect cure, but it works miracles, just as the double shutter works miracles in the movie industry.
Brad Bodily February 9th, 2005, 05:47 PM <<<-- Originally posted by Kevin Shaw : Okay, maybe this is only a problem with prosumer-priced video equipment, but it's a widely discussed phenomenon nonetheless. It is arguably the primary reason why we use interlaced video in the first place! -->>>
It’s my understanding that we went interlaced in the first place to save bandwidth without ("noticeably") sacrificing vertical resolution.
I don't think 30fps has any inherent problem with 'strobing'. I think you'll find a lot of strobing problems relate more to how the shutter rate relates to the frame rate (and as frame rate goes up, this should be less of a problem). Heck, I've watched 24p material on HDTV and it looked fine. 30p is 25% faster than that and much better suited for smooth motion on a 60i/60p display.
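The shutter-rate/frame-rate relationship being argued over here can be put in numbers. A minimal sketch (the `shutter_angle` helper is illustrative, not from any camera API): exposure time expressed as degrees of a film-style 360-degree rotary shutter, where 180 degrees is the classic film look.

```python
# Shutter "angle" relates exposure time to frame duration:
#   angle = 360 * exposure_time / frame_time = 360 * frame_rate / shutter_speed
# Illustrative sketch only; compares film-style exposure with fast shutters.

def shutter_angle(frame_rate: float, shutter_speed: float) -> float:
    """Exposure expressed as degrees of a 360-degree rotary shutter."""
    return 360.0 * frame_rate / shutter_speed

# Film convention: 24 fps exposed for 1/48 s -> the classic 180-degree look.
print(shutter_angle(24, 48))             # 180.0

# 30p shot with a 1/60 s shutter keeps the same 180-degree motion blur,
# which is why it avoids the harsh look of a very fast shutter.
print(shutter_angle(30, 60))             # 180.0

# 30p with a 1/250 s shutter: ~43 degrees -> crisp frames, stroboscopic motion.
print(round(shutter_angle(30, 250), 1))  # 43.2
```

The sketch makes Brad's point concrete: "strobing" tracks the exposure fraction of each frame, not the frame rate itself.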
Toke Lahti February 10th, 2005, 08:04 AM <<<-- Originally posted by Gabriele Sartori: Don't be confused by frame rate vs shutter rate. The film industry shoots at 24 fps, but they have a blade giving the effect of 48P. I have a JVC HD1 and I get a very similar effect when I keep my shutter at 1/60 even when I shoot at 30fps. It is not the perfect cure, but it works miracles, just as the double shutter works miracles in the movie industry. -->>>
Now who's confused?
Double-exposing a single film frame?
Double blades are in the projectors...
24p with a fast shutter is not the same as 48p.
Gabriele Sartori February 10th, 2005, 09:06 AM <<<--
Now who's confused?
Double-exposing a single film frame?
Double blades are in the projectors...
24p with a fast shutter is not the same as 48p. -->>>
I know, my post was typed quickly and I didn't write it well, but I didn't exactly write that the blade is in the film camera. I was talking about the result, and yes, you are right, it is in the projector; but tell me who watches a movie without a projector. Also, the result wouldn't change if they put the blade in the camera.
Also, about the shutter in the video camera, I just said that "I have a very similar effect. It is not the perfect cure" — do you understand what I wrote? We have discussed this on the forum many times; don't put words in my mouth that I didn't say. I didn't say that the double shutter is exactly like 48p. I just said that it gives a similar effect. Oh well, I need a lawyer when I write something from now on. :-)
Barry Green February 10th, 2005, 09:59 AM The double shutter doesn't give anything like the effect of 48p.
Movie projectors use a two-bladed or three-bladed shutter to even out the flicker that's caused when the film gets advanced to the next frame.
Each frame is held on the screen for 1/24 of a second, regardless of how many blades the shutter has.
If the projector had only one blade, you'd see a bright picture for most of the time, followed by pitch-black while the shutter closed and the film advanced. This causes flicker on the movie screen. To "even it out", they came up with the idea of adding more blades to the shutter, so you see a more even pattern of light/dark.
But the effect is *nothing* like having 48fps or 72fps or anything like that. It's 24fps. The shutter is used for flicker reduction, not for anything related to motion enhancement.
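Barry's projector arithmetic can be sketched in a few lines. A toy illustration (the `projector_timing` helper is hypothetical): adding blades raises the flash rate the eye sees, but each film frame is still held for the full 1/24 of a second.

```python
# Sketch of the point above: extra shutter blades raise the flash rate
# (reducing perceived flicker) but do not change how long each frame is shown.

def projector_timing(fps: int, blades: int):
    frame_duration = 1.0 / fps   # each frame stays on screen this long
    flash_rate = fps * blades    # light pulses per second seen by the eye
    return frame_duration, flash_rate

# One blade: 24 flashes/s -> visible flicker.
# Two blades: 48 flashes/s; three blades: 72 flashes/s -> flicker fuses away.
for blades in (1, 2, 3):
    dur, flashes = projector_timing(24, blades)
    print(f"{blades} blade(s): frame shown 1/{round(1 / dur)} s, {flashes} flashes/s")
```

Note that `frame_duration` never depends on `blades`: the motion sampling stays 24 fps no matter how many flashes the shutter produces.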
Balazs Rozsa February 11th, 2005, 08:18 AM <<<-- Originally posted by Brad Bodily: It’s my understanding that we went interlaced in the first place to save bandwidth without ("noticeably") sacrificing vertical resolution. -->>>
I think the TV set itself played a vital role in the decision. When HDTV formats were introduced in the US, CRTs were the main target. An interlaced CRT needs to display only half the lines of a progressive one, so it is easier to make a 1080i TV than a 720 60p one, not to mention 1080 60p. Even 1080i TVs have higher manufacturing costs than SD TVs. For CRTs, 1080i is a good compromise between price and quality, and for a 1080i display a 1080i signal is the best. Now the EBU says that the ratio of progressive to interlaced displays is shifting fast toward progressive. Interlaced material is not as well suited to progressive displays as it is to interlaced ones. Why stick with interlace when, in the very near future, progressive displays will dominate the market (at least in Europe)?
Some say that interlaced converts well to other formats, but in my experience this is not the case. Here in Hungary I very seldom see HD video, but when I go into a shop that has 100 Hz and normal TVs next to each other, the 100 Hz pictures look much softer. This is why I think converting from an interlaced format to other formats does not give very good results. And I think this is why they rarely put 100 Hz and normal TVs next to each other.
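The CRT line-rate argument above can be sketched numerically. A rough illustration assuming 60 fields or frames per second and counting only active lines (the `lines_per_second` helper is illustrative; blanking overhead is ignored):

```python
# Interlacing halves the lines drawn per refresh, so the scan-line rate
# (and hence the deflection/video bandwidth a CRT needs) drops by half.

def lines_per_second(active_lines: int, refresh_hz: int, interlaced: bool) -> int:
    lines_per_pass = active_lines // 2 if interlaced else active_lines
    return lines_per_pass * refresh_hz

print(lines_per_second(1080, 60, interlaced=True))   # 32400  (1080i60)
print(lines_per_second(720, 60, interlaced=False))   # 43200  (720p60)
print(lines_per_second(1080, 60, interlaced=False))  # 64800  (1080p60)
```

By this crude measure 1080i60 is the least demanding of the three, which matches the claim that a 1080i CRT is easier to build than a 720 60p one, let alone 1080 60p.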
Gabriele Sartori February 11th, 2005, 09:08 AM <<<-- Originally posted by Balazs Rozsa : when I go into a shop that has 100 Hz and normal TVs next to each other, the 100 Hz pictures look much softer. This is why I think converting from an interlaced format to other formats does not give very good results. And I think this is why they rarely put 100 Hz and normal TVs next to each other. -->>>
What you are saying may be true, but the example is not the best. 100 Hz TVs have cheap electronics to do the conversion; it is not a good conversion. In theory, if you apply the right math, any format can be converted into another format without visible degradation, provided the new format has enough bandwidth and resolution. Still, too often we see scaling done by decimation, and interlaced-to-progressive conversion done by throwing away every other field. Both tricks are followed by integration with some amount of low-pass filtering, and the results are pretty bad.
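The cheap trick Gabriele describes can be sketched on a toy 1-D column of pixel values (all names and numbers here are illustrative, not from any real deinterlacer): dropping a field and line-doubling it versus at least interpolating the missing lines.

```python
# Toy comparison: crude deinterlacing by field decimation + line doubling
# vs. simple vertical interpolation. One image column, interleaved fields.

column = [10, 50, 12, 52, 14, 54, 16, 56]  # odd/even field samples interleaved

def decimate(col):
    """Keep one field and line-double it (the cheap trick criticized above)."""
    field = col[::2]
    out = []
    for v in field:
        out.extend([v, v])  # repeat each kept line; detail in the dropped field is lost
    return out

def average(col):
    """Keep one field, fill missing lines by averaging vertical neighbours."""
    field = col[::2]
    out = []
    for i, v in enumerate(field):
        out.append(v)
        nxt = field[i + 1] if i + 1 < len(field) else v  # repeat at the bottom edge
        out.append((v + nxt) / 2)
    return out

print(decimate(column))  # [10, 10, 12, 12, 14, 14, 16, 16]
print(average(column))   # [10, 11.0, 12, 13.0, 14, 15.0, 16, 16.0]
```

Both versions throw away the second field entirely; real motion-adaptive deinterlacers do much better, which is the "right math" the post alludes to.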