View Full Version : 1080 vs 720, i vs p
Jonnie Lewis December 12th, 2012, 07:22 AM Hi all,
I know this is a topic that gets covered regularly - I've been doing some reading online but I've managed to confuse myself thoroughly. I understand that 1080 and 720 are simply resolutions, and that i and p are simply ways of displaying the same picture.
Looking at my export settings in Premiere Pro, I can choose between 1080i and 1080p. I'm confused about how this relates to being displayed on a screen for a couple of reasons.
I've read that TV stations only broadcast in interlaced, so what happens if I give them a 1080p file? And what about the viewer - they can choose 1080p or 1080i on their television sets too. Where is it decided whether an image is interlaced or progressive?
Also, how does the resolution relate to the size of a screen? If I was to buy a 60" TV, surely a 1920x1080 image would have to be stretched up to fit it and it would look crap?
Would there be any adverse effect on the quality of the footage if I was to film in 1080p, and place it into a 720p sequence within Premiere Pro? It would obviously have to be shrunk - would that have a negative effect on image quality?
If so... How is it that 1080p looks just as good on a 42" television as it does on a 27" television? Surely the 27" television has to shrink the image to fit it on the screen?
I hope that's not too confusing a post... it's something I'm struggling to get my head around.
Thanks in advance,
Jonnie.
Lee Mullen December 12th, 2012, 08:04 AM To me, progressive looks more natural, similar to the way the eye perceives light. Interlaced looks fake and shiny, studio-like and glittery. :D
Dave Blackhurst December 12th, 2012, 11:43 AM You will find 1080 LCD displays, down to about 22" from what I've seen, but there are also "HD" displays that are ACTUALLY 720 - and if you get close, the difference is usually pretty obvious. You need to (carefully) review the specs for any given display, as a 720 display will indeed accept a 1080 signal, but will then "munch and crunch" it onto a 720 panel... you've apparently already surmised that the results may vary...
Ultimately, the goal IMO is to stay at the highest possible resolution/data rate as far into the production/display process as possible - progressive video should produce MORE data to work with than interlaced in any given capture device... more data = potentially more overall detail, and when and if the time comes to downsize or compress, you should have more input to feed into the program that is to do the compression/sizing - this of course may vary, but typically, the better the source, the better the output.
Think of it in terms of "you can't recreate what wasn't captured", although there are some ways to try really hard, and with motion files, you can potentially span the timeline to recreate SOME additional data points.
The key to understanding i vs. p is remembering that MOVING subjects will shift between "frames" or parts of frames - so with i you've got some potential for offset between frames (think of the typical "mouse teeth" you see in some poorly processed i video), where with p you should be dealing with complete frames, and only motion from frame to frame. Again, more complete "data" to work with.
Hope this helps with understanding the concepts!
David Heath December 12th, 2012, 01:30 PM I've read that TV stations only broadcast in interlaced, so what happens if I give them a 1080p file? And what about the viewer - they can choose 1080p or 1080i on their television sets too. Where is it decided whether an image is interlaced or progressive?
There are three main attributes to a TV signal - resolution, scanning type, and frame rate. It's usual to now express them in a form (no of lines)(scan type)/(framerate) - so a full descriptor will be something like 1080i/25. Hence simply talking of "interlace" or "progressive" only tells half the story. In terms of "look" the no of lines will mainly determine the resolution, the scan type and framerate will together determine the way motion is portrayed.
It used to be the case that framerates were spoken of if progressive, FIELDrates if interlace. The standards bodies decided some years ago that correct nomenclature should be to always speak of framerates, even for interlace - hence the most common systems broadcast are 1080i/25 (Europe) and 1080i/30 (US/Japan). You will frequently hear them still referred to as 1080/50i and 1080/60i - same thing, but old terminology.
In the UK and most of Europe, it's true that the main transmission system is 1080i/25, so yes, interlaced. To answer your question, there are actually three types of "scan type" possible: interlaced, progressive, or psf - "progressive, segmented frame". It may be worth looking at Progressive segmented frame - Wikipedia, the free encyclopedia (http://en.wikipedia.org/wiki/Progressive_segmented_frame). In brief, psf is a way of sending a progressive video over an interlaced system.
Point is that the difference between p and psf is simply one of the order in which the lines are presented. In each case the data is the same - and it should be possible to transparently reorder and change from one to the other without loss.
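If it helps to see that "reordering" concretely, here's a minimal sketch in Python (NumPy arrays standing in for frames; the function names are purely illustrative):

```python
import numpy as np

def segment_frame(frame):
    # Split one progressive frame into two "segments"/fields:
    # one holds the even-numbered display lines, the other the odd ones.
    top_field = frame[0::2, :]
    bottom_field = frame[1::2, :]
    return top_field, bottom_field

def weave_fields(top_field, bottom_field):
    # Re-interleave the two segments back into a single progressive frame.
    frame = np.empty((top_field.shape[0] + bottom_field.shape[0], top_field.shape[1]),
                     dtype=top_field.dtype)
    frame[0::2, :] = top_field
    frame[1::2, :] = bottom_field
    return frame

# A stand-in 1080-line luma "frame" of random values
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
top, bottom = segment_frame(frame)
print(np.array_equal(frame, weave_fields(top, bottom)))  # True - the reordering is lossless
```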
Also, how does the resolution relate to the size of a screen? If I was to buy a 60" TV, surely a 1920x1080 image would have to be stretched up to fit it and it would look crap?
It's more down to viewing angle than screen size, so the distance at which the screen is being viewed is as important as the screen diagonal. There's been a lot of research done, and based on about a 3 metre viewing distance it concludes that, for 1080p and up to about a 60" screen, the 1080 system resolution is better than the eye can resolve. Go for a bigger screen, or view more closely, and the eye is capable of seeing more. Hence the choice of 1080 for most TV broadcast, and the desire to move to 4K for digital cinema - the numbers weren't just plucked out of nowhere, there is science behind them.
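As a rough illustration of that viewing-distance argument (the ~1 arcminute acuity figure is only the commonly quoted rule of thumb, and the numbers are ballpark):

```python
import math

def pixel_angle_arcmin(diagonal_in, lines, viewing_distance_m):
    # Angle subtended by one pixel row of a 16:9 screen at the given distance, in arcminutes.
    height_in = diagonal_in * 9 / math.hypot(16, 9)      # screen height from the diagonal
    pixel_pitch_m = (height_in / lines) * 0.0254          # one pixel row, in metres
    return math.degrees(math.atan2(pixel_pitch_m, viewing_distance_m)) * 60

# 60" screen viewed from 3 m: 1080 lines sit below ~1 arcmin acuity, 720 lines do not
print(round(pixel_angle_arcmin(60, 1080, 3.0), 2))   # ~0.79 arcmin
print(round(pixel_angle_arcmin(60, 720, 3.0), 2))    # ~1.19 arcmin
```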
Would there be any adverse effect on the quality of the footage if I was to film in 1080p, and place it into a 720p sequence within Premiere Pro? It would obviously have to be shrunk - would that have a negative effect on image quality?
Yes, on output the footage would have to be rendered to 720, so would lose resolution. The caveat to that is assuming the detail was there in the first place. Many lower end cameras (pretty well all DSLRs for example) may give a 1080 output, but may not have as much inherent resolution as the 720 system is capable of preserving.
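If you want to see the downscaling operation itself outside Premiere, here's a minimal sketch using OpenCV (the file names are hypothetical; Premiere does the equivalent internally when it renders a 720p timeline):

```python
import cv2  # OpenCV; only to illustrate the resize step itself

frame_1080 = cv2.imread("frame_1080p.png")   # hypothetical exported 1920x1080 frame

# Scale down to 1280x720; area averaging is a reasonable filter choice for downscaling
frame_720 = cv2.resize(frame_1080, (1280, 720), interpolation=cv2.INTER_AREA)

cv2.imwrite("frame_720p.png", frame_720)
```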
Tim Polster December 12th, 2012, 05:18 PM Complex topic Jonnie. This has made our jobs a lot more difficult over the past 5-8 years.
First start with the available broadcast options:
Standard Definition - US 720x480 60i, EU 720x576 50i - Always Interlaced
High Definition - US 1280x720 60p, 1920x1080 60i, & 1920x1080 24p (certain satellite providers only)
I am not sure about the EU :) but we know 1920x1080 50i is a choice!
As Dave mentioned, movement will play a large part in the perceived image quality. If you have a static camera and no subject movement, 1080i will look as good as 1080p because the frames are duplicates of each other. If you are moving the camera, the scene will start to show mis-matches.
But remember, you can shoot in a framerate and deliver in another format. So these "standards" are just wrappers of the footage. 24p material has been delivered over interlaced signals forever.
Where I find all of this stuff important is when you are deciding what format to shoot a project in. The framerates are where you can get into trouble, like shooting ice hockey in 24p. You will have way too much judder compared to 50p/60p. You can still deliver that material in a broadcast format but the judder will remain.
So it is important to reverse engineer your project and choose your framerate first, then the resolution. My overall point is that the TV format is almost of no concern because whatever you shoot can be delivered. It is how you shoot it that makes it look good or not.
Also, in my experience 720p looks fantastic on a 1080p television. A lot better than 1080p60... because you cannot broadcast in 1080p60 - yet! So do not get caught up in numbers and specs so much. Quality is what matters. My father has a 720p plasma and I have a 1080p plasma from the same maker (Panasonic). I can see a little more detail on my TV but they both look great.
Hope this is not rambling on!
Steve Game December 12th, 2012, 06:13 PM Better perceived picture quality from progressive full frames, compared with interlaced video at the same frame resolution, only necessarily follows if the data bandwidth is greater for the progressive video. In other words, more image data requires more bitrate.
It's no coincidence that the current AVCHD camcorders that shoot 1080p/50 require higher bitrate streams when recording at that frame rate compared with either 1080p/25 or 1080i/25. This may or may not be an issue in the wide open spaces of Blu-ray storage, but when it comes to squeezing video streams into multiplexed broadcast channels, the temporal effects of interlacing can be less objectionable than MPEG-2 or MPEG-4 compression artifacts. The BBC for example only allows a bit budget of about 9 Mbps for each of the four (soon to be five) channels in the HD multiplex. When films are televised, the original 24fps film is scanned 4% fast and broadcast as 1080psf/25, which is seen by viewers as progressive.
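The "4% fast" figure is easy to check for yourself:

```python
film_fps = 24.0
tv_fps = 25.0

speedup = tv_fps / film_fps - 1
print(f"{speedup:.1%} faster playback")           # 4.2%

# A 120-minute film therefore runs about 115 minutes when broadcast at 25fps
print(f"{120 * film_fps / tv_fps:.1f} minutes")   # 115.2
```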
Interestingly, the view at the BBC seems to be that it is better to reduce the source image resolution and compress it less in the transmission chain, than try to maintain a high detail in the picture and squeeze it harder to fit the available bandwidth. The UK HD terrestrial broadcasts and their DBS equivalent services are originated as 1440x1080 frames. The original HDCAM studio equipment had this resolution, but much of that has since been replaced with 'full HD' kit. 1440 samples per line is still used though, despite the availability of broadcast statmux kit that could accept the full 1920 samples per line.
I fully agree with David Heath on the real resolution of cameras, especially consumer products including SLRs used to produce video. They have abysmal resolution, yet the sales hype makes big news of the 1080-line recording capability. Most SLRs struggle to achieve a genuine 600 LPPH image, and even that comes with a healthy dose of sub-sampling artifacts and moire patterning.
Les Wilson December 12th, 2012, 10:31 PM ....Also, how does the resolution relate to the size of a screen? If I was to buy a 60" TV, surely a 1920x1080 image would have to be stretched up to fit it and it would look crap?... How is it that 1080p looks just as good on a 42" television as it does on a 27" television? Surely the 27" television has to shrink the image to fit it on the screen?
Full HD resolution is 1920x1080; HD (720p) is 1280x720.
A 1080 TV showing HD content is displaying it in 1920x1080 resolution no matter if it's 42" or 27". Period. The image is not "shrunken" in terms of resolution. The size and density of the pixels vary, not the number of them.
Leon Kolenda December 20th, 2012, 05:42 PM Why would you shoot in 720p/60p/30p/24p instead of 1080p? I have read somewhere that 720p can be a very good-looking spec.
Leon
Trevor Dennis December 20th, 2012, 09:11 PM Why would you shoot in 720p/60p/30p/24p instead of 1080p? I have read somewhere that 720p can be a very good-looking spec.
Leon
It might be to support a special function, like a faster frame rate you want to slow down in post for smoother slow-mo. Higher frame rates might force you to a lower resolution, but you can upscale 720 footage in a 1080 project, and what you lose by upscaling is more than made up for by the smoother slow-mo.
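As a rough worked example of that trade-off (the rates here are just for illustration):

```python
shoot_fps = 60        # e.g. 720p60 overcranked capture
timeline_fps = 24     # conform (interpret) the clip at the timeline rate in post

print(f"{shoot_fps / timeline_fps:.1f}x slow motion")        # 2.5x
print(f"{1080 / 720:.2f}x upscale to fill a 1080 timeline")  # 1.50x
```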
Ervin Farkas December 20th, 2012, 10:09 PM Why would you shoot in 720p/60p/30p/24p instead of 1080p? I have read somewhere that 720p can be a very good-looking spec.
Leon
There has been a ton of discussion on this topic a few years back when HD television became mainstream. Side by side comparisons revealed very small (if any) difference between the two. While 720P is a smaller resolution, it provides full frames; 1080i has more resolution but only half pictures. Some differences are only evident with certain types of source material.
Personal preference is also a factor. Some people simply hate interlaced video because it looks like TV, while progressive looks a lot like film. Sure, 1080P is the best of the bunch, but that comes at a price: significantly higher bitrate (limited in case of TV transmission); in other words 720P at higher bitrate might look better than 1080i at the same bitrate. Then again... the best is what looks best to YOUR eyes.
David Heath December 21st, 2012, 07:03 PM There has been a ton of discussion on this topic a few years back when HD television became mainstream. Side by side comparisons revealed very small (if any) difference between the two.
I have to disagree - a lot of the early comparisons turned out to be meaningless, because they were hardware comparisons rather than system comparisons. A few years ago only a minority of screens were "true HD" (1920x1080 resolution); the majority were something like 1366x768. Feed such a screen 720p and 1080i signals and you won't see much difference - but that's down to the display, not the system. Similarly, a few years ago many cameras, even expensive ones, were not capable of true 1920x1080 - they would give a 1080 signal, but effectively one upscaled from a lower definition starting point.
All that has changed. Cameras with true 1920x1080 resolution are commonplace now, even at relatively low price points, and 1920x1080 screens are now the norm. Use those for a comparison and the difference is far from small, believe me.
While 720P is a smaller resolution, it provides full frames; 1080i has more resolution but only half pictures. Some differences are only evident with certain types of source material.
As far as 1080i goes, in the horizontal direction it has significantly better resolution, period. Vertically it's more complex and yes, it will depend on movement and how the camera signal is derived. Engineers refer to a "Kell factor", which may be seen as how much interlaced scanning reduces the vertical resolution compared to the number of lines per frame. Typically, 1080i is regarded as having a resolution around 750-800 lpph - so better than a 720p system, but not by as much as might first be thought.
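To put a rough number on that (the 0.7 factor below is only the commonly quoted rule of thumb, not a fixed constant):

```python
active_lines_1080i = 1080
interlace_factor = 0.7   # rule-of-thumb reduction from interlaced scanning; varies with motion and camera processing

effective_lpph = active_lines_1080i * interlace_factor
print(effective_lpph)    # 756 lpph - in the 750-800 range quoted above, versus 720 for a 720p system
```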
Personal preference is also a factor. Some people simply hate interlaced video because it looks like TV, while progressive looks a lot like film. Sure, 1080P is the best of the bunch, but .......
If people hate interlace (because it looks like TV) then what they are disliking is the motion rendition - the smooth motion given by 50 images per second. When you say that "progressive looks a lot like film" I assume you are referring to 24p or 25p, and the "film look" is the jerkiness that the lower image rate gives?
But 720p systems are normally 720p/50 or 720p/60 - and as such have the same fluid motion appearance as 1080i. Casual viewing wouldn't be able to tell 1080i/25 and 720p/50 apart in the sense of "looks like TV" or "looks like film" - they are both showing 50 images per second, each captured 1/50s apart. Conversely, 1080p/25 will have the "film look" to motion - and may be carried over the 1080i transmission system as 1080psf/25, which is exactly how most drama-type shows are transmitted on European HD channels, whilst the same transmission system carries true 1080i/25 for sport, news and similar programming.
All this was gone into in great depth some years ago by the BBC research department - see http://downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP092.pdf . The underlying science is very sound - but the conclusion was out of date almost before it was published. They hadn't anticipated just how big affordable screens would become, and how quickly, and based a "720p is good enough" recommendation on 37" screens being the norm for quite a while to come. In practice, 42" became the figure to go by, possibly bigger still, hence the same research points to a 1080 system. (As the BBC did indeed adopt.)
Darryl Jensen January 28th, 2013, 01:31 PM I like to shoot in 1080p and final edit to 720p. That allows me to recompose, zoom, tilt and pan in post-production without losing resolution. Seems to work for me. I agree 'p' looks more natural than 'i'.
Tim Polster January 29th, 2013, 12:47 AM Why would you shoot in 720p/60p/30p/24p instead of 1080p? I have read somewhere that 720p can be a very good-looking spec.
Leon
You have to separate 1080p & 1080i. If you are going to Blu-ray you only have the choice of 1080p24 or 1080i60, so 1080p30 & 1080p60 are not in the delivery spec; they will be converted to another format upon playback. (I have heard some players will play 1080p60, but it's nothing to count on for consumer delivery.)
So one would choose 720p60 to get 60p, as it is the only way to get 60 progressive frames and deliver them in their true state.
Also, bit for bit, 720p gets less compression at the recording stage than 1080p. If you have a 50 Mbps bucket, 1080p will fill it up faster than 720p. In my view, 720p mode looks more refined than the 1080p modes on the sub-$15,000 cameras, especially on the sub-$6,000 cameras. I often shoot in 720p60 and it looks great.
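To put numbers on the "bucket" idea (real codecs also exploit temporal redundancy, so this is only a back-of-envelope comparison):

```python
def pixels_per_second(width, height, fps):
    return width * height * fps

rate_720p60 = pixels_per_second(1280, 720, 60)
rate_1080p60 = pixels_per_second(1920, 1080, 60)
print(rate_1080p60 / rate_720p60)   # 2.25 - over twice the pixels to squeeze into the same bucket

bucket_bps = 50_000_000
for name, rate in (("720p60", rate_720p60), ("1080p60", rate_1080p60)):
    print(name, round(bucket_bps / rate, 2), "bits per pixel before compression")
```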
Ervin Farkas January 29th, 2013, 11:31 PM Here's a solid article to document my point: on your average TV, there is no visible difference between 720P and 1080i.
Read the numbers.
Why 4K TVs are stupid | TV and Home Theater - CNET Reviews (http://reviews.cnet.com/8301-33199_7-57366319-221/why-4k-tvs-are-stupid/)
A short quote: "A 720p, 50-inch TV has pixels roughly 0.034 inch wide. As in, at a distance of 10 feet, even 720p TVs have pixels too small for your eye to see. That's right, at 10 feet, your eye can't resolve the difference between otherwise identical 1080p and 720p televisions."
Read the entire article... fascinating numbers about the physics of the human eye.
Just to clarify, I am not talking about studio monitors or projectors.
Jonnie Lewis February 7th, 2013, 07:08 AM Thanks for all the replies on this. Apologies that I've not replied sooner.
It's certainly a complicated topic and I appreciate all the detailed responses.
Most televisions have settings of 1080p and 1080i to select from. If broadcasters are sending signals in 1080i, what is the television doing when I set it to 1080p? Presumably it doesn't do anything and it's reserved for things like Blu-ray players?
Similarly, my friend has his PC hooked up to his TV for use as a monitor. Despite living in the UK, I noticed that his TV was displaying at 60 Hz. I'm assuming this is because PCs work differently from broadcast signals and it probably has a frame rate of 30fps+?
I think I can feel my brain melting - does anyone know of any good books that I could pick up regarding the different video formats and their uses? I think it'd be nice to have as a reference - reading posts across various different forums often gets me more confused! So much terminology...
Thanks again.
Sareesh Sudhakaran February 8th, 2013, 05:47 AM Most televisions have settings of 1080p and 1080i to select from. If broadcasters are sending signals in 1080i, what is the television doing when I set it to 1080p? Presumably it doesn't do anything and it's reserved for things like Blu-ray players?
It depends on the panel. My panel changes automatically depending on the input. Some panels upsample SD to make it bearable (mine does). Whatever the case, I have manually set my television to show 1080p60, but obviously it has a mind of its own. When I play games, it shows 1080p60. When I watch PAL DVDs, it upsamples SD and displays 720p50.
Similarly, my friend has his PC hooked up to his TV for use as a monitor. Despite living in the UK, I noticed that his TV was displaying at 60 Hz. I'm assuming this is because PCs work differently from broadcast signals and it probably has a frame rate of 30fps+?
This is similar to what I have, except my computer is an HTPC. Same thing, really. Modern panels mostly go up to 60p, and the higher end ones go to 120 Hz (120p) - which 3D gamers use.
I think I can feel my brain melting - does anyone know of any good books that I could pick up regarding the different video formats and their uses? I think it'd be nice to have as a reference - reading posts across various different forums often gets me more confused! So much terminology...
Thanks again.
You can start by reading my blog! To answer your previous questions:
I've read that TV stations only broadcast in interlaced, so what happens if I give them a 1080p file?
You can't. The BBC, e.g., only officially accepts interlaced masters. Of course, if they wanted it, they would accept progressive. I've read they are going to shoot their new mega-series in 4K, for which there is no interlacing standard.
Where is it decided whether an image is interlaced or progressive?
It can be 'decided' in many places. The only way to know for sure is to study each signal for changes as it passes through a device. Professional devices are used because they claim to tell it like it is, though in reality some compromises are always made.
Also, how does the resolution relate to the size of a screen? If I was to buy a 60" TV, surely a 1920x1080 image would have to be stretched up to fit it and it would look crap?
Yes, it would. At the same resolution, the bigger the TV, the further back you have to go, but after a point you lose the advantage of resolution. The same happens when you get closer. Every size-resolution combination has a zone of best 'viewability'. You can read What is 4K Television? | wolfcrow (http://wolfcrow.com/blog/what-is-4k-television/) for starters. Then I recommend reading the THX specifications for cinema, which are written lucidly enough. A little trigonometry, and you'll be gold.
Would there be any adverse effect on the quality of the footage if I was to film in 1080p, and place it into a 720p sequence within Premiere Pro? It would obviously have to be shrunk - would that have a negative effect on image quality?
That depends on what you call 'image quality'. Will it make that beautiful flower worse? No, but it will lose some resolution. Most people wouldn't notice, simply because most people don't sit at the right distance from their television sets. This is one reason why standard definition is dying a slow death (among other important factors like lack of bandwidth, money, etc).
Also, 1080p is compressed more than 720p, even on the web. This 'extra' compression reduces perceptible resolution. It's all about perception.
If so... How is it that 1080p looks just as good on a 42" television as it does on a 27" television? Surely the 27" television has to shrink the image to fit it on the screen?
No, the resolution is the same. Each pixel is a box. The number of boxes remains the same; the size of the box changes. In fact, all things being equal, a 27" panel at 1080p has a higher pixel density than a 40" panel at 1080p, and is a better television. Simple test: walk into an electronics store and view them side by side. See which ones let you get closer, and at what point they look the same.
No rocket science, just basic trigonometry. Hope this helps.
Jonnie Lewis February 11th, 2013, 03:29 AM That was a wonderful answer, Sareesh. Really appreciate it!
I'll check out your blog now and add it to my RSS feeds :)
David Heath February 13th, 2013, 11:50 AM Here's a solid article to document my point: on your average TV, there is no visible difference between 720P and 1080i.
Read the numbers.
Why 4K TVs are stupid | TV and Home Theater - CNET Reviews (http://reviews.cnet.com/8301-33199_7-57366319-221/why-4k-tvs-are-stupid/)
A short quote: "A 720p, 50-inch TV has pixels roughly 0.034 inch wide. As in, at a distance of 10 feet, even 720p TVs have pixels too small for your eye to see. That's right, at 10 feet, your eye can't resolve the difference between otherwise identical 1080p and 720p televisions."
Read the entire article... fascinating numbers about the physics of the human eye.
Sorry, but those numbers are incorrect, and whilst much of the rest of his article is correct, that error leads him to completely false conclusions.
It's easy enough to show. For the 720p, 50" TV he refers to, that will be 50" across the diagonal, so for 16:9 aspect ratio, that gives a height of 30" and a width of 40". (By Pythagoras' theorem.) Now if it's 720p, there must be 720 pixels vertically in those 30", or 24 per inch, which equates to a pixel spacing of 0.042" - NOT 0.034".
I agree with his other figures about the limiting resolution of a typical human eye (1 minute of arc, or 0.035" at 10 foot viewing distance), but that means the eye is capable of resolving more detail than a 720p screen is capable of giving at that distance. Redo the figures with 1080p, and 1080 in 30" means 36 per inch, or a spacing of 0.028". So better than human vision.
And that's the maths and science behind why 720p is not good enough for 50" screens at normal viewing distances, whilst 1080p is generally regarded as OK up to about 60".
In ball-park terms, at 10 foot viewing, 720p is normally regarded as good enough up to about 37-40" screens, but you'll see an improvement with 1080 with bigger screens. 1080p should be good enough up to about 50-55" - bigger than that, and there will be a noticeable difference with 4k.
Practically, it's better to oversample rather than be right on the resolution limit, so 4K may start paying dividends even around the 50" mark. Certainly the CNET conclusion (".....at 10 feet, your eye can't resolve the difference between otherwise identical 1080p and 720p televisions") is simply not true.
Alister Chapman February 21st, 2013, 03:47 AM You can't. The BBC, e.g., only officially accepts interlaced masters. Of course, if they wanted it, they would accept progressive. I've read they are going to shoot their new mega-series in 4K, for which there is no interlacing standard.
That's a little misleading. The BBC only accepts masters that are recorded/encoded using an interlaced signal structure. That includes both conventional interlace and PsF. PsF is progressive wrapped in an interlace frame; PsF is progressive. A huge percentage of BBC HD production is progressive, the same for Discovery and Nat Geo etc. All of the BBC, Nat Geo and Discovery productions that I've worked on in the last 3 years have been progressive.
As web delivery of content by broadcasters becomes more and more significant (and in the future may well overtake traditional broadcasting over the air) less and less interlace will be used.
The majority of HD content is watched on progressive display devices that have to use tricks like bob-deinterlacing. Many productions are edited on computer systems that only have progressive computer monitors for display. I think that interlace is something that will gradually disappear.
Sareesh Sudhakaran February 21st, 2013, 04:15 AM That's a little misleading. The BBC only accepts masters that are recorded/encoded using an interlaced signal structure.
From the horse's mouth (http://www.bbc.co.uk/guidelines/dq/pdf/tv/TechnicalDeliveryStandardsBBCv3.pdf):
All material delivered for UK HD TV transmission must be:
1920 x 1080 pixels in an aspect ratio of 16:9
25 frames per second (50 fields) interlaced - now known as 1080i/25.
colour sub-sampled at a ratio of 4:2:2
The HD format is fully specified in ITU-R BT.709-5 Part 2.
2.1.1 Origination
Material may be originated with either interlaced or progressive scan.
Interlaced and progressive scan material may be mixed within a programme if it is required for editorial reasons or the nature of the programme requires material from varied sources.
2.1.2 Post-production
Electronically generated moving graphics and effects (such as rollers, DVE moves, wipes, fades and dissolves) must be generated and added as interlaced to prevent unacceptable judder.
2.1.3 Film motion or ‘film effect’
It is not acceptable to shoot in 1080i/25 and add a film motion effect in post-production. Most High Definition cameras can capture in either 1080i/25 or 1080p/25. Where film motion is a requirement, progressive capture is the only acceptable method.
You can shoot progressive or interlaced, but only interlaced masters are officially accepted.
I agree with you about interlacing disappearing. The UHDTV standards make no room for it.
Ron Evans February 21st, 2013, 08:33 AM I wonder when we will actually move to true 1920x1080 50/60P, since almost all the consumer cameras can now do this, as well as my latest GoPro!!! There is a difference and it would make display a lot easier.
Ron Evans
Alister Chapman February 21st, 2013, 02:50 PM Sareesh, Paragraph 2.1.1: Material may be originated with either progressive or interlace scan.
If you shoot progressive and place it in an interlaced stream, what do you get? PsF. An interlaced master that contains progressive material is PsF; the content on the tape is still progressive. It hasn't become interlaced - only the single progressive frame is now split into odd and even lines. The temporal motion of the footage is still the same: there is no temporal difference between the fields. The delivery master may well be a tape or file that has 50 fields per second, but if the two fields contain the odd and then even lines from material that was originated progressively, then this is PsF.
Most BBC documentary production is progressive, almost all drama is progressive.
Sareesh Sudhakaran February 21st, 2013, 09:29 PM Sareesh, Paragraph 2.1.1: Material may be originated with either progressive or interlace scan.
If you shoot progressive and place it in an interlaced stream, what do you get? PsF. An interlaced master that contains progressive material is PsF; the content on the tape is still progressive. It hasn't become interlaced - only the single progressive frame is now split into odd and even lines. The temporal motion of the footage is still the same: there is no temporal difference between the fields. The delivery master may well be a tape or file that has 50 fields per second, but if the two fields contain the odd and then even lines from material that was originated progressively, then this is PsF.
Most BBC documentary production is progressive, almost all drama is progressive.
Agreed! Are we disagreeing or agreeing? :)
Alister Chapman February 22nd, 2013, 03:33 AM Agreeing, if we both agree that the BBC do accept productions that are shot, edited and produced in progressive. It is only the delivery file or tape that must use PsF. As PsF is still progressive, the BBC do accept progressive files; you just can't send them a straight-from-the-camera 25p frame-only file.
To me your original statement sounded like you were saying the BBC don't accept progressive programmes.
Sareesh Sudhakaran February 22nd, 2013, 06:03 AM Not at all. We agree 100%!
Petter Flink February 25th, 2013, 04:43 PM Sorry, but those numbers are incorrect, and whilst much of the rest of his article is correct, that error leads him to completely false conclusions.
It's easy enough to show. For the 720p, 50" TV he refers to, that will be 50" across the diagonal, so for 16:9 aspect ratio, that gives a height of 30" and a width of 40". (By Pythagoras' theorem.) Now if it's 720p, there must be 720 pixels vertically in those 30", or 24 per inch, which equates to a pixel spacing of 0.042" - NOT 0.034".
A small mistake in your calc.
On a 16:9 display the sides cannot be in the ratio 3:4 with a hypotenuse of 5.
At a 16:9 ratio the sides will be 24.5" and 43.6" for a 50" diagonal.
And 720 pixels over 24.5" gives a pixel size of 0.034".
I agree with his other figures about the limiting resolution of a typical human eye (1 minute of arc, or 0.035" at 10 foot viewing distance), but that means the eye is capable of resolving more detail than a 720p screen is capable of giving at that distance. Redo the figures with 1080p, and 1080 in 30" means 36 per inch, or a spacing of 0.028". So better than human vision.
And that's the maths and science behind why 720p is not good enough for 50" screens at normal viewing distances, whilst 1080p is generally regarded as OK up to about 60".
In ball-park terms, at 10 foot viewing, 720p is normally regarded as good enough up to about 37-40" screens, but you'll see an improvement with 1080 with bigger screens. 1080p should be good enough up to about 50-55" - bigger than that, and there will be a noticeable difference with 4k.
Practically, it's better to oversample rather than be right on the resolution limit, so 4K may start paying dividends even around the 50" mark. Certainly the CNET conclusion (".....at 10 feet, your eye can't resolve the difference between otherwise identical 1080p and 720p televisions") is simply not true.
David Heath March 3rd, 2013, 12:06 PM A small mistake in your calc.
On a 16:9 display the sides cannot be in the ratio 3:4 with a hypotenuse of 5.
At a 16:9 ratio the sides will be 24.5" and 43.6" for a 50" diagonal.
And 720 pixels over 24.5" gives a pixel size of 0.034".
Hmmm. The only thing I can disagree with there is the word "small"! :-) Having had it pointed out, I can only say I'm guilty as charged, and it's a pretty big mistake by me! My apologies to all.
If I have any excuse, it's that my (incorrect) working seemed to agree with conclusions that have been generally found in practice, so maybe I didn't check the figures as closely as if they had seemed to show something surprising. As illustration, the best research I've come across was done a while back by BBC R&D and is online at http://downloads.bbc.co.uk/rd/pubs/whp/whp-pdf-files/WHP092.pdf .
Then, they came to the conclusion that 720p should be good enough for general broadcast - but were basing that on the assumption that screen sizes for the home were likely to be around 37" typically, up to about 42" max. In practice, that was very pessimistic, home screens are often larger than 42" now and the conclusion was out of date almost before the document was published (though not the test results).
Look at figure 9 and it shows that up to a 40" screen only 15% of observers could see a benefit to 1080 over 720. That increases to nearly 30% for a screen size of 42" and nearly 50% for a 50" screen.
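For anyone who wants to redo the corrected arithmetic themselves, here's a quick sketch (the 1 arcminute acuity figure is the commonly quoted value; the function names are only illustrative):

```python
import math

def pixel_pitch_inches(diagonal_in, vertical_pixels):
    # Vertical pixel pitch of a 16:9 panel
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # 50" diagonal -> ~24.5" high, ~43.6" wide
    return height_in / vertical_pixels

def eye_limit_inches(viewing_distance_ft, arcmin=1.0):
    # Smallest detail a typical eye resolves at this distance, assuming ~1 arcminute acuity
    return viewing_distance_ft * 12 * math.tan(math.radians(arcmin / 60))

print(round(pixel_pitch_inches(50, 720), 3))    # 0.034" - Petter's figure
print(round(pixel_pitch_inches(50, 1080), 3))   # 0.023"
print(round(eye_limit_inches(10), 3))           # ~0.035" at 10 feet
```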
Alister Chapman March 4th, 2013, 04:18 PM I was lucky enough to spend a weekend using an 84" 4K Sony Bravia to show and demo some 4K footage and HD footage.
Up close (within 4m) the difference between the 4K and HD was clear to see, but further away the difference became harder to see. 4m is pretty close to such a big screen.
However, even when at the back of the room, about 8m to 10m away there was something incredibly "real" about the 4K image that wasn't there with HD material. I would not say the footage looked sharper, or that I could see more detail, but many in the room could sense something different about the footage when it was shown as 4K compared to HD. It really was like looking out of a window. Perhaps it was just the scale, but I've never had such a distinct sensation of looking out of a window with 4K projection. I get to play with the TV again next week at CabSat.