View Full Version : 60i vs 30p vs 24p
Stephen Byrd April 8th, 2008, 09:57 AM I hope I don't start a firestorm with this question, but since I'm still learning about this stuff, please be kind :)
I've searched and read about all of these shooting modes, but is there a "superior" choice among these 3? I know the true answer is "whatever you like", although I want to make sure I'm not confused about a couple of things...
Let's say I want to shoot something (like a trip) where I only have ONE chance to get the material. Is it better to shoot in 30p/60i and convert to 24p later, or does shooting 24p on the camera natively produce better results rather than conversion from 60i/30p? If it's simply less post production rendering, then I would likely want to shoot at a higher rate. That way I can do whatever I want with it later. The one I'm most confused about is 60i vs 30p. I have a Canon HF100 and realize that 30p is a new feature, but what does that really do for me when comparing to 60i? Again, I've tried to search articles for this on this site (and others) but haven't found a definitive answer. Please feel free to send links if you don't feel like explaining all of the details. Thanks in advance for any help!
Ron Evans April 8th, 2008, 11:40 AM In my opinion the only reason to shoot 24p is if you want to transfer to film for distribution in a theatre or to a film festival. In this case it makes transfer to film at 24fps easier. Since 24p is the slowest frame rate, it has the most motion artifacts, or judder, which is poor for any fast movement, especially movement across the screen. Video shot at 24p needs to use film shooting techniques, NOT video approaches (pans, fast zooms and large depth of field are not good for 24p). An inexperienced user shooting at 24p will likely produce juddering images that border on unwatchable. The fastest frame rate, and thus the smoothest video, will come from 60i. For viewing on a CRT, 60i will be the best in all instances. 30p has advantages for PC viewing and for reducing the frame rate to 15p for internet use. It still has half the frame rate of 60i, so it will not be as smooth, but it is a little better than 24p. The smoothest video with high resolution will be 720p60, used by the sports networks to give both high definition and smooth motion. For a consumer shooting their holidays or family video, stay with 60i or 720p60. If you want that film look, use film techniques for shot composition and exposure of the scene, but in my mind there is no need to have the frame judder of film from 100 years ago.
You can tell I am not a fan of 24p!!!!
Ron Evans
Sean James April 9th, 2008, 01:17 AM Aren't the colors and the whole look different when shooting progressive?
I am trying to evaluate how much 30p would be worth to me.
Brian W. Smith April 9th, 2008, 05:45 AM Aren't the colors and the whole look different when shooting progressive?
I am trying to evaluate how much 30p would be worth to me.
according to canon:
http://www.usa.canon.com/consumer/controller?act=ModelInfoAct&fcategoryid=177&modelid=16186#ModelFeaturesAct
24p Cinema Mode
24p Cinema Mode enables all aspiring moviemakers to achieve a professional "film-look." You can change the camcorder's frame capture rate to 24p (recorded at 60i), which provides the appearance of the same frame rate as movie film. In addition, you can use the VIXIA HF10's CINEMA setting, which changes the color and tonal characteristics, evoking the look and feel of a movie shown in a theater. For added flexibility, these settings can be used together or independently.
30p Progressive Mode
This is a unique feature that only Canon offers in the consumer camcorder market. You would otherwise have to look to professional video cameras in order to have this sophisticated tool at your command. In addition to the standard interlaced video frame rate of 60i, you may choose to set the VIXIA HV30 to capture video in 30p which is (30 progressive frames) particularly useful for footage to be used on the Internet. In addition, this setting gives enhanced quality to still images captured after recording. Excellent for action shots and sports.
Ron Evans April 9th, 2008, 06:55 AM Frame rate is just that: how many frames are exposed in a period of time (normally expressed as frames per second). The more frames, the smoother the motion. Colour and exposure will be the same for the same relative exposure times, controlled by the gamma curves and colour matrix of the sensor. If you don't change these elements they will be the same for all frame rates. There are of course other implications for depth of field due to iris changes; however, with the same iris opening these will be the same too. Remember, what you are doing is taking many still photos one after the other and playing them back at a rate that fools our eyes/brain into thinking it's fluid motion. Too few photos and our eyes/brain don't get fooled and view them as a fast slide show. At more than about 50 frames per second we start to perceive the images as being in motion. To get the so-called "film" look, consumer cams change the frame rate AND the colour characteristics of the sensor. Without those colour changes, a frame rate change to 24p or even 30p would just introduce frame judder, in my mind a defect!!! If you want the colour and exposure effects of film you can do this at 60i with the right manually controlled cameras.
The problem with 24p in video is that NTSC TVs refresh at 60Hz, and 24 doesn't divide into 60 evenly!!! So it is necessary to play with playback to get all the frames to line up in sync with 60Hz. With newer LCDs that refresh at 120Hz, and some plasmas that refresh at 72Hz, this is possible. In the cinema with film this doesn't happen. The film projector moves the film through the gate at 24fps, and a shutter with normally 3 or more blades exposes each frame while it is stationary in the gate, leading to a 72Hz flicker rate, faster than TVs in either North America or Europe. This doesn't stop the frame judder of 24fps, but it does solve the flicker issue with our eyes/brain. Shooting video at 24p for transfer to film to be shown on a film projector makes perfect sense: cheaper to shoot and edit, etc. But to shoot 24p to show directly on TV just doesn't make any sense to me, since it is a poor emulation of what would be seen from a film projector.
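To make the 3:2 pulldown cadence concrete, here is a small Python sketch. The frame labels are made up purely for illustration, and real pulldown also alternates which field (top or bottom) leads, which this sketch ignores:

```python
# Sketch of 3:2 pulldown: 24 progressive frames per second are stretched
# across 60 fields (30 interlaced frames) by holding source frames for
# 3 fields, then 2 fields, alternately.

def three_two_pulldown(frames):
    """Expand a list of progressive frames into a 3:2 field cadence."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 3 if i % 2 == 0 else 2   # hold pattern: 3, 2, 3, 2, ...
        fields.extend([frame] * repeat)
    return fields

fields = three_two_pulldown(["A", "B", "C", "D"])
print(fields)       # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields))  # 10 -- so 24 frames yield exactly 60 fields per second
```

Four source frames become ten fields, which is why 24fps fits the 60Hz field rate only through this uneven repetition, and why motion judders.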
Mario, I am not sure what you mean by tape being smoother. Do you mean smooth motion or smooth image edges? This has nothing to do with whether the recording medium is tape or HDD; it has to do with frame rate and how the camera is set up for edge enhancement etc. Your approach of downloading to PC every day is a good one and would allow you to even look at a pure flash-based cam as well.
Ron Evans
Stephen Byrd April 9th, 2008, 09:48 AM OK, most of this is making sense. I guess I'm still interested in the pros/cons of 60i vs 30p. Is 30p just a "number thing" that makes the HD cam sound good but isn't used much in normal shoots? For instance, I'm planning to shoot an upcoming safari and want to make sure I'm using the best mode available. Should I be taking advantage of 30p, or is 60i "to the eyes" a better mode?
Ron Evans April 9th, 2008, 10:39 AM Stephen, I assume animals will be moving? Fast, in uncontrollable directions? So there will be subject and camera movement if you are going to pan to follow (a really bad idea for 24p, lots of judder)? 60i will have twice the effective frame rate. Watched on a normal TV, 60i will have smoother motion than 30p (it has twice the number of images to capture movement of subject and/or camera, even if those are effectively half the vertical resolution, since they are field captures rather than full frames). If you want really smooth motion in HD then you should look at 720p60 cameras.
30p makes the software task of creating 15p video for the web easier, though this is hardly a great challenge from 60i!!!
Ron
Aaron Courtney April 9th, 2008, 11:15 AM Hey Stephen,
Regarding 60i vs. 30P, you're still only getting 30 frames per second with either mode. The difference is that with 60i, you're capturing two fields (odd scan lines, then even scan lines) separated by 1/60 sec (1/50 sec in PAL), which must then be recombined into a complete frame (de-interlaced) in order for a progressive display to correctly present the video. With 30P, you're capturing a complete frame (just like with a still film camera) from the same instant in time, notwithstanding any rolling shutter vs. CCD technicalities.
When we were all watching interlaced television sets, 60i was the obvious choice because that's how the displays inherently functioned. Obviously, the consumer electronics world has shifted gears and everyone is buying progressive sets. I think it was a couple of years ago that progressive sets outsold interlaced displays for the first time. And there's no going back either, since the CRT has effectively been supplanted by PDPs, LCDs, and whatever else is coming down the pike. So it is safe to say that we are now living in a progressive world, not an interlaced one. Unfortunately, we are still dealing with this interlace mess in broadcast television and video acquisition.
The reason I say it's a mess is because it is impossible for a progressive set to perfectly de-interlace an interlaced-acquired video source. This is straight from the video processing chipset manufacturers. Any way you cut it, you're losing resolution somewhere, regardless of whether you employ sophisticated motion-adaptive, pixel-adaptive, or whatever adaptive algorithm. It simply doesn't work, and that's exactly what Yves Faroudja said upon selling his company (one built entirely on the prospect of high end de-interlacing technology) to Genesis.
The ONLY way for a progressive display to perfectly de-interlace interlaced footage is if the footage was acquired while shooting progressively. In this instance, either the display or the video playback device merely needs to employ a simple weave de-interlace algorithm to recombine the two fields into their original progressive frame, because the entire frame was captured at the same instant in time, meaning there is no movement between the fields. The problem, as many professional testers are discovering, is that most of the advanced motion-adaptive de-interlacing video processing chipsets used today have proven incapable of detecting the lack of inter-field movement in interlace-flagged video, so they screw up the de-interlacing of 30P video.
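A weave de-interlace really is that simple when the source is progressive. A minimal Python sketch, with frames modeled as lists of scan-line strings purely for illustration:

```python
# Weave de-interlace: recombine odd/even scan lines into one frame.
# When both fields were captured at the same instant (progressive
# acquisition), this reconstructs the original frame losslessly.

def split_fields(frame):
    """Split a progressive frame into top (even) and bottom (odd) fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into a full frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

original = ["line0", "line1", "line2", "line3"]
top, bottom = split_fields(original)
assert weave(top, bottom) == original  # lossless for progressive sources
```

With true interlaced capture the two fields come from different instants, so a plain weave produces combing on anything that moved, which is why displays fall back on the lossy adaptive algorithms described above.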
Apparently, this would not have been an issue with HD-DVD because the spec fully supported 1080/30P whereas BD does not - only supports 1080/24P and 1080/60I. So even if you are able to encode your BD at 1080/30P, you still have to flag it as 1080/60I and you're then at the mercy of the playback chain to ignore the flags and instead determine no interframe movement and correctly weave your interlaced stream and then frame double to get to 60 Hz.
So in a nutshell: native 24P video will be handled correctly by some displays (3:3, 4:4, 5:5) while others will have to use 3:2 pulldown (not ideal from a perfectionist POV); 30P is a total crapshoot as it stands today; and 60i is handled universally but will never be perfectly displayed on progressive sets.
Stephen Byrd April 9th, 2008, 11:16 AM Thanks for the info, everyone. I think I will most definitely be using 60i to capture on my trip. Since I just bought the HF100, I'll have to wait a while before I venture into the 60p HD market :)
Ron Evans April 9th, 2008, 04:05 PM Aaron, you raise an interesting point about progressive displays. Unless they are given a compatible input, they are likely not to look as smooth to our eyes as CRTs. As you say, inputs driven at 30p, with each frame repeated twice to get the 60Hz refresh, have a chance of displaying judder. CRTs have the advantage of displaying movement between fields, emulating a 60fps image to our eyes for a 60i input. Although I have mainly Sony equipment, I really feel that 720p60 would have been a better system for everyone to move to until we have the capability to get to 1080p60. I am getting more disappointed with cable TV, as the judder/stutter is getting worse on most programs. I don't know where the issue is, but it even occurs on my CRTs, so it isn't just my 1080p plasma.
Ron Evans
Sean James April 9th, 2008, 04:37 PM according to canon:
http://www.usa.canon.com/consumer/controller?act=ModelInfoAct&fcategoryid=177&modelid=16186#ModelFeaturesAct
24p Cinema Mode
24p Cinema Mode enables all aspiring moviemakers to achieve a professional "film-look." You can change the camcorder's frame capture rate to 24p (recorded at 60i), which provides the appearance of the same frame rate as movie film. In addition, you can use the VIXIA HF10's CINEMA setting, which changes the color and tonal characteristics, evoking the look and feel of a movie shown in a theater. For added flexibility, these settings can be used together or independently.
30p Progressive Mode
This is a unique feature that only Canon offers in the consumer camcorder market. You would otherwise have to look to professional video cameras in order to have this sophisticated tool at your command. In addition to the standard interlaced video frame rate of 60i, you may choose to set the VIXIA HV30 to capture video in 30p which is (30 progressive frames) particularly useful for footage to be used on the Internet. In addition, this setting gives enhanced quality to still images captured after recording. Excellent for action shots and sports.
It sounds like the 24p mode would look the same as 60i, and the 24p cine mode adds image adjustments I could also do in Final Cut Pro.
The reason I am asking: I owned a first-generation DVX100, and 24p footage looked so much better than 60i. Light and mood were represented better, and the colors were much nicer; those were my impressions, anyway.
I wonder if these differences still apply, or if technology has advanced and made the image characteristics of interlaced, 24p pulldown, and 30p progressive identical?
Dave Rosky April 9th, 2008, 06:41 PM It sounds like the 24p mode would look the same as 60i
I think this would depend on the television display. On TV's that can shift their refresh rate to 72Hz or 96Hz, do inverse pulldown, and display the 24P material as 3:3 or 4:4, the 24P material might look as good as or maybe even better than deinterlaced 60i depending on the material. But TV's that cannot do that will have to deinterlace the 60i which contains the pulled down 24P, and it won't look as good.
Sean James April 9th, 2008, 07:23 PM I think this would depend on the television display. On TV's that can shift their refresh rate to 72Hz or 96Hz, do inverse pulldown, and display the 24P material as 3:3 or 4:4, the 24P material might look as good as or maybe even better than deinterlaced 60i depending on the material. But TV's that cannot do that will have to deinterlace the 60i which contains the pulled down 24P, and it won't look as good.
I'm a bit confused here.
Don't TVs get just one kind of signal? Or, in the case of a video clip coming from a Blu-ray disc, a signal in one particular compression format?
Why would they have to shift their refresh rate?
And regarding the pulldown, I didn't even know TVs do that. I thought this was just something that's done in the camera to create 24p while recording in 60i to tape.
OK, I'm a bit more than just a bit confused here.
Ron Evans April 9th, 2008, 08:01 PM Unfortunately, TVs don't just get one type of signal. Depending on the TV, they can deal with 480i (really 480 60i, standard definition), 480p, 720p, 1080i and 1080p, where the "p" can mean 30p or 60p, over various interfaces from composite to HDMI. How the TV responds to these inputs may differ depending on the interface, and certainly on the brand of TV. A lot of variants. This is made more complicated by what feeds the TV. DVD players can also decode what is on the disc, depending on how they are set up for the TV they are connected to and its capabilities, which represents a wonderful opportunity to have the DVD player and TV set up in a non-optimum fashion!!!! Sometimes the DVD player has the better decoding and sometimes the TV does; particular interfaces will allow selection or not. For instance, a new 120Hz LCD would be better off decoding a 24p or 30p signal from a DVD itself, rather than letting the DVD player do it, and will create a smooth interpolated video image (creating extra frames that are not in the original) that is likely smoother than the frame judder of an actual film projector, since it interpolates images rather than just increasing the flicker rate!!! With HDMI, the player and TV communicate and are supposed to transmit data appropriate to the display's capabilities.
Ron Evans
Aaron Courtney April 9th, 2008, 10:11 PM Why would they have to shift their refresh rate?
It's not that they have to; it's a feature that is more commonly being implemented by the CE manufacturers. As was said earlier in this thread, you're not watching strict 24fps in a movie theater anymore - I don't know the exact history, but I'm sure someone does, LOL. I think most facilities employ frame doubling IIRC (so 48 fps) for a better visual experience.
Because 24fps is so embedded in feature films, CE manufacturers and video processing chipset manufacturers have taken it upon themselves to embed native support of 24fps in some equipment to eliminate pulldown problems. So, with the right playback chain, you can view your BD movie in native 24fps glory, where the player outputs 1080/24P and the display (preferably) frame triples, quadruples, or quintuples that stream to hit 72Hz, 96Hz, or 120Hz, thereby eliminating the 3:2 pulldown issue altogether. But the implementation is currently all over the map. Very few displays do this correctly at this point in time, regardless of what the marketing material says - you can read the reviews; it's comical how some CE companies are handling this - total joke.
Sean James April 9th, 2008, 11:15 PM I wonder if there is a good source to tank up on these technical basics.
Dave Rosky April 10th, 2008, 12:15 PM I wonder if there is a good source to tank up on these technical basics.
I didn't find a single place with a clear explanation of everything, but after looking around a lot, here is a really terse summary of the basics:
- Most newer TV's are progressive with a 60Hz refresh rate.
- A few TV's (e.g., Pioneer Kuro, but I can't guarantee that) have a few additional refresh rates that help with smoother (i.e., non-pulldown) display of 24P. These are usually either 72Hz or 96Hz, which are multiples of 24.
- A very few (maybe none at this point) can refresh at 120Hz, which is sort of a magic number because it is a multiple of both 24 and 30.
- A TV that refreshes at 60Hz will deinterlace 60i material, and then display each deinterlaced frame twice in a row to get the 60Hz refresh rate. This is why on most progressive TV's, unlike older CRT's, 60i material doesn't necessarily have smoother motion than 30P.
- A TV that can only refresh at 60Hz can only play 24P that has been pulled down to 60i, which normally happens in the DVD player. 2 out of every 5 frames will be interlaced in a way that contains a field from one 24P frame and a field from another 24P frame, so they don't necessarily deinterlace well, which is why such material won't look as good as 24P material on a TV that can change its refresh rate and display 24P with a "pure" pulldown like 3:3 or 4:4.
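The "2 out of every 5" figure can be checked with a quick sketch: pair up the ten fields of one 3:2 pulldown cycle into five interlaced frames and count the mixed ones. The field labels are hypothetical; A-A-A-B-B-C-C-C-D-D is the standard 3:2 cadence:

```python
# Pair the 10 fields of one 3:2 pulldown cycle into 5 interlaced frames,
# then count how many frames mix fields from two different 24P sources.

cadence = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
pairs = [tuple(cadence[i:i + 2]) for i in range(0, len(cadence), 2)]
mixed = [p for p in pairs if p[0] != p[1]]

print(pairs)       # [('A','A'), ('A','B'), ('B','C'), ('C','C'), ('D','D')]
print(len(mixed))  # 2 -- exactly 2 of every 5 frames mix two source frames
```

Those two mixed frames are the ones a deinterlacer can mangle, which is why inverse telecine (reversing the cadence before display) looks better than blind deinterlacing.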
I'm a newbie to this too, so anyone please correct me if I'm wrong about any of this - I'd love to know so I can understand it more fully.
Aaron Courtney April 10th, 2008, 12:54 PM Looks pretty good, Dave. I'll only add that, as some people have found out with the Sony Bravias in particular, you have to turn OFF the 120Hz smooth motion in order for the display to natively process 24P material, which obviously is quite illogical, LOL. Also, there are issues with "forcing" the entire chain to properly handle 24P, which only creates more confusion regarding this technologically complicated topic.
What would be absolutely ideal (outside of the stupid BDA revisiting its spec to reflect current trends in videography) is if every progressive display had the intelligence in its video processing chipset to natively process every conceivable format. Then you wouldn't need to spend $1K-2K on a killer BD player with all the bells and whistles. All you would need, essentially, would be a $50 player capable of reliably transferring what is on the physical media over the wire to the display, which would then process it accordingly - in essence, nothing more than a dumb transport mechanism a la a $15 PC CD-ROM drive. This would also prevent any AVR from screwing up the video stream over HDMI on its way to the display (another serious issue people are beginning to discover).
Maybe we'll get there someday, but I'm not holding my breath.
One other thing that I'll add that I found particularly interesting is that 1080/30P is actually found in the ATSC standard for broadcast television. So perhaps there is hope after all.
Ron Evans April 10th, 2008, 02:53 PM Another unknown is what the cable companies do to the signal before they send it down the cable!!! At first I thought I was disappointed with my new Panasonic plasma because of the juddering image, only to look carefully at the CRT in the other room and find the same issue; I hadn't noticed it there because that set is only 24", viewed from a distance, rather than 42" viewed closer.
Ron Evans
Dave Rosky April 10th, 2008, 03:15 PM Looks pretty good Dave. I'll only add that, as some people have found out with the Sony Bravia's in particular, that you have to turn OFF the 120Hz smooth motion in order for the display to natively process 24P material, which obviously is quite illogical, LOL.
That is kind of weird. I thought the main advantage of 120Hz is that, since it is a multiple of both 24 and 30, it can display 24P, 30P, and deinterlaced 60i (and of course even true 60P if there is ever source material), all using simple frame-duplicating pulldowns like 4:4, 5:5, etc. That way, TVs can be built with only one refresh rate (120Hz), reducing cost and complexity. It sounds like Sony is using 120Hz to do something different, like getting smoother motion by interpolating frames. I haven't seen that set yet personally.
What would be absolutely ideal (outside of the stupid BDA revisiting its spec to reflect current trends in videography), is if every progressive display had the intelligence in its video processing chipset to natively process every conceivable format.
That would be a really good idea. Like you said, I think the TV is the correct place for this, since it is the end of the display chain and has control over its refresh rate. I'm not sure, but I think the Pioneer Kuro comes close. I believe (that is, I have heard) that it can recognize 2:3 pulldown, do the inverse telecine, and display it as 3:3 at 72Hz. I don't know if it can take direct 24P and 30P, but if it can, it would be a pretty universal display device (it might well support those, given your comment about 1080/30P in the broadcast standard). Unfortunately, though, as long as there are also dumber (and cheaper) sets out there, DVD and BD players will still have to be capable of doing some of the processing.
Aaron Courtney April 10th, 2008, 03:53 PM Ron, I think we have to trust the broadcasters when they claim 720/60P and hope they maintain that progressive image through their entire chain. Perhaps you're seeing compression artifacts on the larger set. I have both an interlaced HDTV (Sony 34XBR970) and a PDP (Sammy 50"). Of course, everything looks better on the 34" Sony compared to the plasma, partly because CRT's are just plain better at everything beyond logistics, LOL, but also because the compression artifacts inherent in broadcast television are much less noticeable because of the size difference, even if I'm sitting <6ft from the TV.
Dave, I have never heard the 120Hz feature referencing 30P video. It's always been in relation to 24P and 60i material. The little trick about turning OFF the motion feature I discovered while reading a professional review of the LCD. Also, other reviewers and owners have confirmed it. And of course, it came up while discussing native support for 24P film material, not 30P video.
Of all the 120Hz capable displays, Samsung, if memory serves me, had the most horrifically convoluted way of handling 24P. I think the display first telecined and interlaced 24P to get to 60i, then de-interlaced, and then frame quadrupled to get to 120Hz, or something along those lines. What a joke! Whatever it was, in the end you essentially lost support for the whole 24P (sans pulldown) native thing anyway. You just shake your head and wonder what these companies are thinking!?
It truly is caveat emptor today in CE.
Aaron Courtney April 10th, 2008, 04:03 PM Unfortunately, though, as long as there are also dumber (and cheaper) sets out there, DVD and BD players will still have to have the capability to do some of the processing.
On the surface, this is no big deal, right? But think about how you plan on connecting your AVR to your BD player in order to take advantage of the new lossless codecs (DTS-MA, Dolby TrueHD) - it's HDMI only, right? - then both the AVR and the display (via EDID) must be capable of accepting the BD player's output. Otherwise, you'll have to revert to the lowest common denominator between the AVR and the display, which will likely not be the preferred selection/format.
See, the deeper you dig into this whole format mess, the more things start to unravel and the more chaos the end user is left to deal with because of the shortsightedness of CE manufacturers.
Sean James April 10th, 2008, 09:27 PM I didn't find a single place with a clear explanation of everything, but after looking around a lot, here is a really terse summary of the basics:
- Most newer TV's are progressive with a 60Hz refresh rate.
- A few TV's (e.g., Pioneer Kuro, but I can't guarantee that) have a few additional refresh rates that help with smoother (i.e., non-pulldown) display of 24P. These are usually either 72Hz or 96Hz, which are multiples of 24.
- A very few (maybe none at this point) can refresh at 120Hz, which is sort of a magic number because it is a multiple of both 24 and 30.
- A TV that refreshes at 60Hz will deinterlace 60i material, and then display each deinterlaced frame twice in a row to get the 60Hz refresh rate. This is why on most progressive TV's, unlike older CRT's, 60i material doesn't necessarily have smoother motion than 30P.
- A TV that can only refresh at 60Hz can only play 24P that has been pulled down to 60i, which normally happens in the DVD player. 2 out of every 5 frames will be interlaced in a way that contains a field from one 24P frame and a field from another 24P frame, so they don't necessarily deinterlace well, which is why such material won't look as good as 24P material on a TV that can change its refresh rate and display 24P with a "pure" pulldown like 3:3 or 4:4.
I'm a newbie to this too, so anyone please correct me if I'm wrong about any of this - I'd love to know so I can understand it more fully.
Thanks for this.
That means that if you want to play 24p on a modern, progressive TV, it would be reformatted twice: once from 24p to 60i, and then to 60Hz (is that like 60p?).
It also looks like 60i is pretty much over.
So what about 30p footage on a 60Hz display? Should be OK, or not?
Sean James April 10th, 2008, 09:30 PM What would be absolutely ideal (outside of the stupid BDA revisiting its spec to reflect current trends in videography), is if every progressive display had the intelligence in its video processing chipset to natively process every conceivable format.
If Blu-Ray doesn't do that, we may see another format coming, or not?
I have read a few comments on computer forums that go in that direction: Blu-ray has won - for NOW.
Brian Boyko April 10th, 2008, 10:18 PM From the filmmaker's perspective, I'd say that you probably want to shoot at 30p instead of 60i in 95% of all situations, including NTSC television playback. 30p is a standard that gives you the nice, clear look of progressive video with a framerate that handles motion better than 24p.
Additionally, 30p converts to 60i very, very easily, and I don't think there would be much problem with interlacing the footage for broadcast.
And finally, what comes across as "motion blur" in 30p is exactly the same stuff that comes across as "interlacing artifacts" in 60i. That is, if you're moving the camera too fast to capture the subject within 1/30th of a second, you'll get interlacing lines in 60i (as the subject will have switched position in that fraction of a second)
But what about that other 5%? Well, in a word: Sports. If you absolutely know there is going to be fast action - split second action - you are going to want to get as many frames as possible in that 1 second; even if it means you're really only using half the resolution. 60i also does slow motion better than 30p - though you lose vertical resolution by "deinterlacing" the picture, you can get 60 "blended" frames with 60i, and slow that down to as low as 1/4 the speed before people start complaining about it being a slideshow.
So, why 24p?
The advantages of 24p are threefold: It gives you a "film-like" look that can look more professional if you do it right. This introduces motion blur that gives the video an... air of 'unreality.' And while you do have to deal with motion blur, you do NOT have to deal with interlacing problems, which look ugly. Motion blur - from time to time - can look elegant. Just try not to use it for handheld shots. In fact, 24p should probably almost always be on a tripod or dolly.
Additionally, 24p converts to 25p, and from there to 50i, more easily than 60i. This is important when considering a release in PAL country television, and it's one of the reasons I chose to film my documentary about NZ's politics in 24p. A 4% speedup (barely noticeable) and it's a conversion that takes care of itself.
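The speed-up arithmetic behind that "4%" is easy to check; a 90-minute runtime here is just an example figure:

```python
# Playing 24fps footage back at 25fps (PAL) shortens the runtime by
# the ratio 24/25; the speed increase is 25/24 - 1, about 4.17%.
# Audio pitch rises by the same ratio unless corrected.

src_fps, pal_fps = 24.0, 25.0
speedup = pal_fps / src_fps - 1.0
print(f"{speedup:.2%}")              # 4.17%

runtime_90min = 90 * src_fps / pal_fps
print(f"{runtime_90min:.1f} min")    # a 90-minute film runs 86.4 min
```

Each 25p frame then maps one-to-one onto a 50i frame pair, which is why no pulldown cadence is needed at all.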
Finally, if you hope for a theatrical release, 24p transfers to 16mm or 35mm film easily.
Now, there's no such thing as a wrong choice and there are ways to get the 24p look from 60i footage with ~$200 computer programs.
Aaron Courtney April 10th, 2008, 10:41 PM Thanks for this.
That means that if you want to play 24p on a modern, progressive TV, it would be reformatted twice: once from 24p to 60i, and then to 60Hz (is that like 60p?).
Not quite. The ideal method would be as we have already discussed. The BD film is authored as 1080/24P, the player recognizes 24P and passes it natively via HDMI to the progressive display which also recognizes 24P and merely frame triples (72Hz), quadruples (96Hz) or quintuples (120Hz) the original progressive frame video stream. You have to get out of the 60Hz rate if you want to eliminate judder from pulldown. Natively handling 24P material means neither the player nor the display is interpolating frames. The display, in this case - as in all other cases that I've read - is simply flashing the same frame delivered to it via the BD player 3x, 4x, or 5x generally depending on the manufacturer's design.
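The refresh rates in that list are exact integer multiples of 24Hz, which is the whole point; a quick check:

```python
# A display whose refresh rate is an integer multiple of 24Hz can simply
# flash each 24p frame n times with no pulldown judder. 60Hz is not a
# multiple of 24, hence the 3:2 pulldown workaround.

for hz in (60, 72, 96, 120):
    n, rem = divmod(hz, 24)
    if rem == 0:
        print(f"{hz}Hz: flash each 24p frame {n}x")
    else:
        print(f"{hz}Hz: not a multiple of 24, needs pulldown")
```

120Hz is the most flexible of the bunch, since it is also a multiple of 30 (4x) and of the 60Hz field rate (2x).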
It also looks like 60i is pretty much over.
Hardly. It's firmly embedded in both the ATSC spec and the industry (not consumer) chosen HDM technology.
So what about 30p footage on a 60Hz display? Should be OK, or not?
As I've said either in this thread or the one in the HV20 forum, it's a total crapshoot at this point, depending entirely on the video processing chipset in the BD player. It seems easy enough, right? Well, it ain't panning out that way in practice, as the professional testers are finding out. I think HQV makes a BD torture-test disc that you can purchase to evaluate how your display chain will handle 1080/30P video flagged as 60i, in both BD and HD-DVD (probably irrelevant now, unfortunately). BTW, they also make the venerable Silicon Optix HQV Reon chipset that is pretty much the best out there today, and one which handles 1080/30P video on BD with aplomb.
Sean James April 11th, 2008, 02:20 AM From the filmmaker's perspective, I'd say that you probably want to shoot at 30p instead of 60i in 95% of all situations, including NTSC television playback. 30p is a standard that gives you the nice, clear look of progressive video with a framerate that handles motion better than 24p.
Additionally, 30p converts to 60i very, very easily, and I don't think there would be much problem with interlacing the footage for broadcast.
And finally, what comes across as "motion blur" in 30p is exactly the same stuff that comes across as "interlacing artifacts" in 60i. That is, if you're moving the camera too fast to capture the subject within 1/30th of a second, you'll get interlacing lines in 60i (as the subject will have switched position in that fraction of a second).
But what about that other 5%? Well, in a word: Sports. If you absolutely know there is going to be fast action - split second action - you are going to want to get as many frames as possible in that 1 second; even if it means you're really only using half the resolution. 60i also does slow motion better than 30p - though you lose vertical resolution by "deinterlacing" the picture, you can get 60 "blended" frames with 60i, and slow that down to as low as 1/4 the speed before people start complaining about it being a slideshow.
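Sean's slow-motion arithmetic can be checked with a quick numeric sketch (variable names invented for illustration): deinterlacing 60i yields up to 60 half-height images per captured second, so a 30fps timeline only runs out of unique source images somewhere below half speed.

```python
# Each captured second of 60i yields 60 field-derived images after
# deinterlacing (at reduced vertical resolution). Slowing playback
# spreads those images across more output frames:
capture_rate = 60   # images per captured second (from 60i fields)
timeline_fps = 30   # playback frame rate

for speed in (1.0, 0.5, 0.25):
    images_per_frame = capture_rate * speed / timeline_fps
    print(f"{speed:>4}: {images_per_frame:.2f} source images per output frame")
# 1.0 -> 2.00 (discard or blend), 0.5 -> 1.00 (true slow motion),
# 0.25 -> 0.50 (each image shown twice -> starts to look like a slideshow)
```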
So, why 24p?
The advantages of 24p are threefold: It gives you a "film-like" look that can look more professional if you do it right. This introduces motion blur that gives the video an... air of 'unreality.' And while you do have to deal with motion blur, you do NOT have to deal with interlacing problems, which look ugly. Motion blur - from time to time - can look elegant. Just try not to use it for handheld shots. In fact, 24p should probably almost always be on a tripod or dolly.
Additionally, 24p converts to 25p, and from there to 50i, more easily than 60i does. This is important when considering a television release in PAL countries, and it's one of the reasons I chose to film my documentary about NZ's politics in 24p. A 4% speedup (barely noticeable) and it's a conversion that takes care of itself.
Finally, if you hope for a theatrical release, 24p transfers to 16mm or 35mm film easily.
Now, there's no such thing as a wrong choice and there are ways to get the 24p look from 60i footage with ~$200 computer programs.
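The 24p-to-25p conform Sean mentions is just a uniform speedup; a quick sketch of the numbers (illustrative only):

```python
# Conforming 24p to PAL 25p: play every frame at 25 fps instead of 24.
film_fps, pal_fps = 24.0, 25.0
speedup = pal_fps / film_fps            # each second of film plays in 24/25 s
print(f"speedup: {(speedup - 1) * 100:.2f}%")   # ~4.17%, barely noticeable

# Runtime of a 90-minute 24p piece after conforming to 25p:
runtime_min = 90 * film_fps / pal_fps
print(f"runtime: {runtime_min:.1f} min")        # 86.4 min
```

Each 25p frame then splits into two fields to make 50i, which is why the conversion "takes care of itself."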
24p is always mentioned as the "film look".
I just wonder if the p as progressive isn't more important than the 24.
I mean, the shorter exposure time seems to speak a lot in favor of 30p.
Ron Evans April 11th, 2008, 01:57 PM Just found this reference to displays that manage 24p correctly and thought it was of interest in this thread.
http://forum.blu-ray.com/showthread.php?t=5155
Ron Evans
Sean James April 11th, 2008, 04:33 PM Glad the best displays are among them: the Sony LCDs and the Panasonic plasmas.
Ken Ross April 11th, 2008, 07:35 PM It also looks like 60i is pretty much over.
Not even close. 60i will be around for a very very long time.
Aaron Courtney April 11th, 2008, 09:14 PM LOL, I was wondering when someone was going to link to that thread!
Sean James April 12th, 2008, 07:59 PM Not even close. 60i will be around for a very very long time.
But it looks like we are getting more and better options, aren't we?
Couldn't it be that we still have so much 60i around because it has been around for such a long time? Given the infrastructure, comparable to a person's habit? So even if there were better options around, the more economical one prevailed (for the moment)?
Aaron Courtney April 12th, 2008, 10:52 PM Sean, check the link I provided in the HV30 thread. Apparently, this is straight from the horse's mouth. IMO, 1080/60I (actually, any interlaced format) should never have been approved by the ATSC committee; nor should it have been included in the BDA spec. Clearly, there were factors at work in these decisions that were not in the best interests of consumers or the advancement of technology. Oh well, it is what it is.
Ken Ross April 13th, 2008, 07:17 AM Sorry duplicate post...deleted.
Ken Ross April 13th, 2008, 07:18 AM IMO, 1080/60I (actually, any interlaced format) should never have been approved by the ATSC committee; nor should it have been included in the BDA spec. Clearly, there were factors at work in these decisions that were not in the best interests of consumers or the advancement of technology. Oh well, it is what it is.
It's called 'bandwidth' or lack thereof. Broadcasters barely have enough bandwidth for high quality, high bitrate 1920X1080 @60i, so it's not reasonable to think they would have had anything close to what would have been necessary for 60p at the same resolution.
Look at how Directv, Dish, Fios etc. are struggling to add more HD @60i. So that was, I'm sure, a prime motivating factor.
For me personally, when it comes to acquisition, I do not like 24p or 30p due to poor motion handling relative to 60i. When my cam has this feature, I totally ignore it.
Dave Rosky April 14th, 2008, 01:50 PM It's called 'bandwidth' or lack thereof. Broadcasters barely have enough bandwidth for high quality, high bitrate 1920X1080 @60i, so it's not reasonable to think they would have had anything close to what would have been necessary for 60p at the same resolution.
The problem is that on modern progressive displays, 60i doesn't give the same motion advantage as it used to on CRTs, because it must be deinterlaced to 30P, and it doesn't save any bandwidth over 30P.
Because of this, I personally think 60i should eventually go away as CRTs go away. It is now a solution to a problem that doesn't really exist anymore. It should be replaced by 30P for normal usage and, for fast motion, by *true* 1080/60P, where 60 *full* frames per second are recorded and can be displayed on a 60Hz monitor without reducing it to 30P via deinterlacing. Yes, it will take more bandwidth to do that, but on progressive displays, it's the only way to get 60Hz motion. While bandwidth is still a limited resource for broadcasting, with Blu-ray discs and camcorders there isn't such an issue, although it would require a really fast processor.
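Dave's bandwidth point can be made concrete with raw (uncompressed) luma pixel rates — a rough sketch only, ignoring chroma subsampling and codec gains, with a function invented for illustration:

```python
# Raw luma sample rates for the formats under discussion (uncompressed,
# before chroma subsampling and compression; purely illustrative).
def pixels_per_second(width, height, rate, interlaced=False):
    # An interlaced stream carries `rate` fields of height/2 lines each.
    lines = height // 2 if interlaced else height
    return width * lines * rate

fmt = {
    "1080i60": pixels_per_second(1920, 1080, 60, interlaced=True),
    "1080p30": pixels_per_second(1920, 1080, 30),
    "1080p60": pixels_per_second(1920, 1080, 60),
}
for name, pps in fmt.items():
    print(f"{name}: {pps / 1e6:.1f} Mpixels/s")
# 1080i60 and 1080p30 carry the same raw pixel rate (~62.2 Mpx/s);
# true 1080p60 doubles it (~124.4 Mpx/s), hence the bandwidth objection.
```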
Kevin Shaw April 14th, 2008, 03:03 PM The problem is that on modern progressive displays, 60i doesn't give the same motion advantage as it used to on CRTs, because it must be deinterlaced to 30P...
But given that most consumer-grade HDTVs have a 1080i input (presumably designed to take a 60 field per second input), what exactly do they do with that signal? For example, if the HDTV simply displayed each field twice for alternating odd/even lines, wouldn't that yield a visual result similar to an interlaced CRT?
In any case, I have edited 60i material which looks fine when played from a PS3 to a 1080p LCD, so whatever's being done to accomplish that is working. I like the smooth motion of 60i for the projects I do and see problems with some 30p and 24p footage from digital video cameras, so for general-purpose use 60i seems to have a place yet in the modern world.
Ken Ross April 14th, 2008, 03:47 PM The problem is that on modern progressive displays, 60i doesn't give the same motion advantage as it used to on CRTs, because it must be deinterlaced to 30P, and it doesn't save any bandwidth over 30P.
Because of this, I personally think 60i should eventually go away as CRTs go away. It is now a solution to a problem that doesn't really exist anymore. It should be replaced by 30P for normal usage and, for fast motion, by *true* 1080/60P, where 60 *full* frames per second are recorded and can be displayed on a 60Hz monitor without reducing it to 30P via deinterlacing. Yes, it will take more bandwidth to do that, but on progressive displays, it's the only way to get 60Hz motion. While bandwidth is still a limited resource for broadcasting, with Blu-ray discs and camcorders there isn't such an issue, although it would require a really fast processor.
I don't think the public would tolerate 30p video Dave. The stutter is just too much of an issue...it sure is for me. It's why I don't bother with it.
I just don't see much of an issue with 60i at all on a high quality fixed pixel display. Now if you're talking about 60p, that's a different story. But good luck seeing that for a long long time in broadcast.
Aaron Courtney April 14th, 2008, 04:23 PM I'll go ahead and add some more fuel to this fire. I am in complete agreement with Dave. Interlacing, as a format, was established to overcome some technical hurdles that no longer exist if you really get down to it. I don't think there's anyone who really believes interlacing is superior to progressive shooting - hey, go out and take half of a still photo, wait 1/60 sec and then take the second half, and then try your best to combine them while smearing the interframe movement to get rid of all the motion artifacts. This is exactly what is happening with today's progressive displays. Obviously, this is not how interlaced televisions of yesteryear functioned - although, as the MIT expert noted in that link I provided, interlacing never really worked as advertised even with interlaced displays.
For everyone who complains about 30P video, I highly doubt you have actually watched 30P under the ideal conditions - BD player ignores 60i flags, realizes it's dealing with no interframe motion (progressive), so it weaves the two fields together to recreate the original progressively acquired frame, frame doubles to hit 60Hz, and then outputs to 1080p/60 display.
To my knowledge, there's only ONE BD player on the market - Sammy BDP1200 - that will do this correctly, although it has a host of other problems not related to video processing, so it's not a popular model among videophiles.
I have no idea what would happen if someone broadcast a 1080/30P signal and your 60Hz PDP/LCD tried to tune it in. But, it's in the ATSC spec, although I suppose that doesn't guarantee perfect compatibility.
Aaron Courtney April 14th, 2008, 04:36 PM But given that most consumer-grade HDTVs have a 1080i input (presumably designed to take a 60 field per second input), what exactly do they do with that signal? For example, if the HDTV simply displayed each field twice for alternating odd/even lines, wouldn't that yield a visual result similar to an interlaced CRT?
I don't believe progressive displays work that way because they can't display fields, only frames. So the display receives the first field, stores it in its buffer, receives the second field comprising frame 1, de-interlaces the two fields, in an attempt to progressively display the original frame (won't ever work if the frame came from interlaced video, can work if frame came from progressive video), then, I presume, it flashes frame 1 twice to match 30 fps to 60 Hz refresh rate.
So it's a "fake" 60fps, just like 3:3 pulldown on a Pioneer Kuro (72 Hz) is a "fake" 72fps from 24fps film material. In the case of the 60i video, you're simply repeating the same frame twice. I would bet that if people who complain about 30P video would watch frame doubled 30P on a 1080/60P, not one "motion" complaint would be raised. Conversely, if everyone had to watch 60i at 30fps on progressive displays (exactly what is happening with 30P video improperly decoded by BD+progressive display chain), there would be an uproar.
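The playback chain Aaron describes — weave the two fields of a progressively shot frame back together, then frame-double to feed a 60Hz display — can be sketched as a toy model (lists stand in for scanlines; all names are invented for illustration):

```python
# Toy model of weave deinterlacing plus frame doubling for a 60 Hz panel.
def weave(top_field, bottom_field):
    """Interleave odd/even scanlines back into one progressive frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)      # lines 0, 2, 4, ...
        frame.append(bottom_row)   # lines 1, 3, 5, ...
    return frame

def frame_double(frames):
    """Repeat each 30p frame twice to match a 60 Hz refresh rate."""
    return [copy for frm in frames for copy in (frm, frm)]

# One 4-line progressive frame split into its two fields:
top, bottom = [[0], [2]], [[1], [3]]
frame = weave(top, bottom)          # -> [[0], [1], [2], [3]]
sixty_hz = frame_double([frame])    # same frame flashed twice per 1/30 s
print(frame, len(sixty_hz))
```

Weave is lossless only because the two fields came from the same instant; with true interlaced capture, the fields are 1/60s apart and weaving them smears motion, which is Aaron's point.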
Kevin Shaw April 14th, 2008, 04:43 PM Conversely, if everyone had to watch 60i at 30fps on progressive displays...there would be an uproar.
But I'm doing exactly this and it looks fine to me, so the de-interlacing must be working reasonably well. Conversely, I regularly see footage on the news now with obvious motion-judder issues, so I wonder what's going on there...?
Ken Ross April 14th, 2008, 04:54 PM For everyone who complains about 30P video, I highly doubt you have actually watched 30P under the ideal conditions - BD player ignores 60i flags, realizes it's dealing with no interframe motion (progressive), so it weaves the two fields together to recreate the original progressively acquired frame, frame doubles to hit 60Hz, and then outputs to 1080p/60 display.
To my knowledge, there's only ONE BD player on the market - Sammy BDP1200 - that will do this correctly, although it has a host of other problems not related to video processing, so it's not a popular model among videophiles.
If true, then perhaps that's the problem. Good luck with only the Sammy 1200 being able to do that. I wouldn't touch the Samsungs with a 10' pole, but that's me. But I sure as heck haven't seen smooth motion handling with 30p from a consumer camcorder hooked up directly to the display. Additionally, I don't like the idea of having to hassle with software to improve things I get right away with 60i. Each to his own I guess.
I also wonder at times if people have seen high quality 60i broadcasts on a high quality 1080p plasma. On my 1080p 60" Pioneer Elite Kuro, motion is beautiful and interlaced artifacts are minimum and most times simply absent. This is a far cry from the interlaced artifacts we used to get under our old NTSC system. In fact, I much prefer 60i and its higher resolution to 720p broadcasts. Yes, I'll take 60p over all of them, but we're a long way from there.
Aaron Courtney April 14th, 2008, 04:54 PM Kevin, yes, the de-interlacing of the progressive displays is "good enough" - no doubt, it's probably as good as it could ever get if you buy the best video processing tech today. BUT, it will never be "perfect", as in shooting progressively for display on progressive televisions. Now, you're talking native language to the display - no complicated conversions required, which result in nominal to significant reductions in original resolution.
It's like buying an E85 vehicle or whatever they're called and instead of using that fuel, installing some sort of converter to allow you to keep filling up with reg. gasoline which screws up your fuel economy. It doesn't make any sense, LOL!
Aaron Courtney April 14th, 2008, 05:03 PM This is a far cry from the interlaced artifacts we used to get under our old NTSC system.
I am going to have to disagree with you here, Ken. IMO, there were no artifacts present because we were watching interlaced footage on interlaced televisions! There was no de-interlacing. Everything was speaking the same language.
I am in the fortunate position to own both an interlaced HDTV and a couple progressive HDTV's (PDP & LCD). If I watch SD DVD's (480i material) on the progressive sets, I can spot interlaced artifacts - easy to see the jaggies on the edges of circular objects as the camera is panning across them. If I watch that same DVD on the Sony interlaced CRT, presto, jaggies are gone! Sure, the image is "softer" but I'd rather have that than distracting artifacts.
Dave Rosky April 14th, 2008, 05:26 PM I don't believe progressive displays work that way because they can't display fields, only frames. So the display receives the first field, stores it in its buffer, receives the second field comprising frame 1, de-interlaces the two fields, in an attempt to progressively display the original frame (won't ever work if the frame came from interlaced video, can work if frame came from progressive video), then, I presume, it flashes frame 1 twice to match 30 fps to 60 Hz refresh rate.
So it's a "fake" 60fps, just like 3:3 pulldown on a Pioneer Kuro (72 Hz) is a "fake" 72fps from 24fps film material. In the case of the 60i video, you're simply repeating the same frame twice. I would bet that if people who complain about 30P video would watch frame doubled 30P on a 1080/60P, not one "motion" complaint would be raised. Conversely, if everyone had to watch 60i at 30fps on progressive displays (exactly what is happening with 30P video improperly decoded by BD+progressive display chain), there would be an uproar.
I think this post really hits the nail on the head. What many people don't realize, and what I didn't realize either until after a lot of internet searching, is that modern, non-CRT TVs are strictly progressive and don't display separate fields separately in time. They buffer the fields and then deinterlace them. When you watch 60i on an LCD TV, you are actually watching 30P in 2:2 pulldown, because the TV has deinterlaced your 60i and then plays each deinterlaced frame twice in a row. As far as I have seen, all non-CRT TVs work that way, at least all common ones. Because the two fields are not displayed separately, you lose the effect of 60 fps motion even though the display's refresh rate is 60Hz.
I don't yet have an LCD or plasma TV (I am in the market for one), but the above fact makes me wonder why a lot of people say that 30P video looks bad. It must indeed look bad, or people wouldn't say that; but the only thing I can think of is that possibly a lot of TVs somehow don't handle a true 30P signal properly and that causes display problems, because other than some sort of problem like that, 30P and deinterlaced 60i should look quite similar.
Historically, interlace was introduced mainly to solve the trade-off between flicker and phosphor persistence in CRT tubes. Phosphors can be made short or long persistence. Long persistence phosphors don't flicker at low refresh rates, but they tend to smear motion. OTOH, a phosphor with a persistence short enough to display 30 fps without smearing would have too much flicker in bright ambient light. So, the compromise trade-off was to interlace the video to increase the effective display rate to 60 Hz, eliminating flickering without taking any extra bandwidth than 30 fps. Now that displays store frames and fields in RAM, this persistence issue goes away, even for plasma TVs (which still use phosphors, but in a different manner than CRTs).
Ken Ross April 14th, 2008, 06:55 PM I am going to have to disagree with you here, Ken. IMO, there were no artifacts present because we were watching interlaced footage on interlaced televisions! There was no de-interlacing. Everything was speaking the same language.
I am in the fortunate position to own both an interlaced HDTV and a couple progressive HDTV's (PDP & LCD). If I watch SD DVD's (480i material) on the progressive sets, I can spot interlaced artifacts - easy to see the jaggies on the edges of circular objects as the camera is panning across them. If I watch that same DVD on the Sony interlaced CRT, presto, jaggies are gone! Sure, the image is "softer" but I'd rather have that than distracting artifacts.
Yeah, we will certainly agree to disagree on this one! I've never seen any NTSC TV that didn't produce interlaced artifacts. This was a very well known issue with our NTSC system...I'm really surprised you never saw this.
As far as fixed pixel devices such as plasmas, I think you may be living a bit in the past. Most of the modern plasmas of today have virtually perfect deinterlacing. This is a very easy accomplishment for any decent plasma of today. Take a look at Gary Merson's tests on this and you'll see the vast majority of HDTVs he tested did perfectly in this regard.
My Pioneer presents virtually no interlaced artifacts with 60i. So for me this is simply a non-issue. This is why I prefer the significantly higher resolution of 1920X1080i broadcasts to 720p broadcasts. But again, each to his own.
Oh, and one last thought. Not many people are aware that over 90% of scenes on TV are static or nearly static in nature (including sports!). So even the issue of fast movement as a plus for 720p is somewhat exaggerated.
Aaron Courtney April 14th, 2008, 11:06 PM We may be missing each other with semantics. I'll give you a specific example. I dropped by my parents' house with one of my kids last year. They have a really nice Sony XBR(3?) 52" LCD that was playing Cars to help keep my kid in check while we visited. We also have the DVD so I started to watch a bit more closely. One of the scenes in the middle of the movie begins with McQueen in court. During the opening of that scene, the picture pans across the statue in front of the courthouse. At one point in that pan, you can see quite a bit of artifacting present in the grill of that statue, which I showed my dad. It's when the local law enforcement says, "The Radiator Springs court is now in session" or something along those lines. I remember the audio better because we went back and forth comparing the de-interlacing capabilities of the display vs. the upconverting DVD player and the dialog got embedded into my brain, LOL!
Inside the courtroom, the picture at one point pans across the VW van. The circular emblem of the VW displayed severe jaggies during that pan (as one would expect). I apologized to my dad for potentially ruining his future viewing experience by showing him what to look for in interlaced video playing on progressive televisions, and of course began to explain that one of the primary attractions of HDM (format war was undecided at the time) was to finally be able to get away from authoring interlaced discs.
When I got home, I played the same scenes on my PDP. My 50" plasma did not exhibit as severe artifacts probably because it is only a "720P" rez display, but they were there nonetheless. I then played the same scenes on my 34XBR970 interlaced HDTV CRT. Perfection. Although the TV could not physically compete with the raw rez of the 1080P LCD, the picture was much more pleasing and completely artifact-free.
http://www.hqv.com/technology/index1/deinterlacing.cfm?CFID=&CFTOKEN=50840332
I consider the above required reading on the subject of de-interlacing in today's world of consumer electronics. It also shows the degree of variability among video processing technology currently used in both display and playback devices. As I've said before, anyone who is interested can pick up one of their de-interlacing torture test discs and see how their playback chain holds up.
Ken Ross April 15th, 2008, 05:02 AM Aaron, I know exactly what you're talking about. But you are missing the fact that there were all kinds of nasty artifacts with our NTSC system, many of them interlaced artifacts. There were many test scenes on calibration discs once DVDs came out that vividly pointed these interlaced artifacts out. This really is a very well known issue with NTSC.
As far as what you see, it's virtually absent with a combination of a good DVD player and plasma. I've got a Panasonic BD30 Blu Ray player and together with my Pioneer plasma, I simply don't see this kind of issue.
As I said, Gary Merson did an excellent write up on this and most of the better plasma displays of today do a perfectly good job of deinterlacing. LCDs exhibit many of their own weird artifacts including motion artifacts, which is why I'd never use an LCD as my primary viewing display.
For me nothing beats the reality and overall picture quality of a good plasma, to say nothing of a far better, much more immersive home theater experience than a small CRT. Most reviewers of the new Pioneer Kuro plasmas have called them the best TVs ever, and I don't think they'd get that reputation if they displayed lots of interlaced artifacts.
Ron Evans April 15th, 2008, 05:53 AM I can relate to this discussion too. Until last fall I had a Sony HiScan 1080i CRT that I got almost solely to watch the output from my FX1, since the results for normal cable TV were not that good compared to the lower resolution JVC iArt TV we have in the family room. Just before Christmas I decided to treat myself to a new 40" 16x9 set and at first got a Samsung LN-T4069, one of the new 120Hz 1080P LCDs. I immediately disliked it the moment I got it home!!! I subsequently changed it for a Panasonic 42" 1080P plasma which I like much better, though to be honest, for normal SD programming from cable the iArt CRT is far superior. My grandson's Cars DVD plays fine on both the iArt and from the PS3 to my Panasonic over HDMI. What I find interesting is that my own HDV, DV and AVCHD seem to play much smoother, and with apparently higher resolution for the AVCHD and HDV, on the Panasonic than even the few Blu-ray discs that I have (they are concert videos, so likely not from 24p film?). Certainly the AVCHD video is like looking through a window when viewing outdoor shots of my grandson.
Upconversion from the PS3 is also of interest. Some SD discs like the first Norah Jones are wonderful but others are just awful!!! The combination of scaling/de-interlacing in all the playback and display chain can lead to a lot of problems that I do not think were as evident in a consistent interlace world. The combination of using 24p video shot badly, compressed for transmission over cable as interlace and then displayed on a low cost LCD can be almost unwatchable!!!!
Ron Evans
Ken Ross April 15th, 2008, 11:10 AM Ron, you bring up a couple of good points. All DVDs are not created equal. I can remember a few years back when people were talking about 'banding' with fixed pixel displays. True, many displays did have banding issues, but what many people missed was that often the DVD was the culprit!
I'd put the 'offending' DVD on my 34" Panasonic CRT HDTV and saw the same banding I saw on my Fujitsu plasma of that time! But a well mastered DVD (more common today than a few years ago), on a good plasma, should not show any more noticeable artifacts than a CRT.
I really don't want to get in to the CRT vs plasma argument, but having had a few CRT HDTVs (including a Zenith 64" RP HDTV with 9" guns and the 34" directview Panasonic), I find the better plasmas produce far superior picture quality. Issues such as misconvergence, focus, linearity, purity and many others are totally missing in plasmas. That together with the far greater, more immersive size of many plasmas makes this a 'no-brainer'....at least for me. ;)