April 21st, 2006, 01:37 PM | #1 |
Trustee
Join Date: May 2005
Location: Saint Cloud, Florida
Posts: 1,043
1080p a joke?
I came across this article today. Some interesting commentary about the uselessness of 1080p. What do you think?
http://theinquirer.net/?article=31089
__________________
www.facebook.com/projectspecto |
April 21st, 2006, 02:29 PM | #2 |
Major Player
Join Date: Oct 2003
Location: Portland OR
Posts: 227
Putnam may have CRT blinders on...
I imagine that virtually all 1080i/p sets sold henceforth will be LCD or plasma panels, which will use a 60 Hz refresh (or 50 Hz if the media demands). All interlacing means is that each 60th of a second either the odd or the even rows are updated. Sony's proof-of-concept BD-ROM, as I recall, was done in 24p, and Blu-ray players will be expected to add frames to get to 30i or 30p depending on how the player is set (or what it discovers from the HDMI dialogue). I believe his comments on horizontal scan rates are entirely CRT-centric.
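For what it's worth, here's a toy sketch of how 24p frames might get repeated out to a 60-fields-per-second cadence (classic 2:3 pulldown). It's only an illustration of the general technique, with function names of my own invention - I'm not claiming any particular Blu-ray player does exactly this internally.

[code]
# Sketch: 2:3 pulldown, mapping 24p frames onto 60 fields per second.
# Illustration of the general technique only, not any player's actual logic.

def pulldown_2_3(frames):
    """Map progressive frames (24/s) onto (frame, field-parity) pairs (60/s)."""
    fields = []
    cadence = [2, 3]                      # 2 fields from one frame, 3 from the next
    for i, frame in enumerate(frames):
        for _ in range(cadence[i % 2]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

# Four film frames become ten fields: 24 frames/s * 10/4 = 60 fields/s.
print(pulldown_2_3(["A", "B", "C", "D"]))
[/code]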
April 21st, 2006, 03:43 PM | #3 |
Major Player
Join Date: Sep 2004
Location: Phoenix, AZ
Posts: 493
Hey, 1080p makes sense. LCDs and plasmas are progressive devices, so displaying an interlaced format requires some voodoo. You want a progressive format... but...
There is no such thing as 1080p content right now. It's 720p or 1080i. If you had a native 1080p display, you'd either be upscaling 720p or field-doubling 1080i. So, what's the point? The 1080p display may be compatible with whatever standard eventually emerges for 1080p... or it may not. It's hard to tell, and it may not support the refresh rate required at that time. 1080i is for interlaced displays (tubes).

The big mistake was forging HDTV with TWO "standards," and letting one of them be INTERLACED. Pointless, considering progressive video is backward-compatible with interlaced displays, but interlaced video is not forward-compatible with progressive displays without a lot of degrading manipulation. Of course, those decisions were set in stone a long time ago... if you ask me, 1080i is the joke.
__________________
Owner/Operator, 727 Records
Co-Founder, Matter of Chance Productions
Blogger, Try Avoidance |
April 21st, 2006, 04:08 PM | #4 |
Trustee
Join Date: May 2005
Location: Saint Cloud, Florida
Posts: 1,043
Yeah, I never quite understood that either.
__________________
www.facebook.com/projectspecto |
April 21st, 2006, 05:00 PM | #5 | |
Inner Circle
Join Date: Jan 2006
Posts: 2,699
Quote:
Briefly, the origination is true 1080p/25: true progressive and full 1080 resolution, though with the motion rendition of film. That is then recorded/delivered in an interlace-compatible form, for full compatibility with a true interlace signal, whilst preserving the true progressive nature. For a true progressive frame, the lines of a frame may be numbered and transmitted 1,2,3,4,5,6...1078,1079,1080. Converted to psf, these identical lines are then transmitted 1,3,5...1077,1079 (new field) 2,4,6...1078,1080. Conversion between the two is obviously transparent - a matter of reordering lines, NOT involving any scaling or destructive manipulation. 1080p/50 should be the ultimate goal. 1080psf/25 and 1080i/25 in the meantime seem a sensible interim (chosen according to subject - drama or sport, say) - transmission-wise they are equivalent.
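To make the line-reordering point concrete, here's a minimal sketch (my own illustration, with my own helper names, not anything from the article or the text above) showing that splitting a progressive frame into two psf fields and weaving them back together is a lossless round trip:

[code]
# Sketch: progressive <-> psf (progressive segmented frame) is pure line
# reordering -- no scaling or interpolation involved.

def split_to_psf(frame):
    """frame is a list of 1080 scanlines; return (odd_field, even_field)."""
    return frame[0::2], frame[1::2]        # lines 1,3,5,... and 2,4,6,...

def weave_to_progressive(odd_field, even_field):
    """Reassemble the original progressive frame from the two fields."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.extend([odd_line, even_line])
    return frame

frame = [f"line {n}" for n in range(1, 1081)]
odd, even = split_to_psf(frame)
assert weave_to_progressive(odd, even) == frame   # round trip is transparent
[/code]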
April 21st, 2006, 09:12 PM | #6 |
Major Player
Join Date: Apr 2006
Location: UK
Posts: 204
Joshua,
Agreed, 1080i is an abomination. Interlacing is an analog trick used to reduce bandwidth for a given static resolution; it has no place in a compressed digital world. You can compress a progressive frame into a given amount of data much more easily than two fields of it separately, and for MPEG, with its picture frames and movement frames (I- and P-frames), it makes even less sense.
April 21st, 2006, 09:25 PM | #7 |
Major Player
Join Date: Jan 2006
Location: Sydney, Australia
Posts: 275
Interlaced is no abomination....
Some of us who work in the Australian TV industry use interlaced because that is the medium in which television is delivered... and I disagree that progressive is easier and better to compress. With progressive, you record and store every frame. With interlaced, you only need to record the changes. However, I could be wrong... But despite that, there is a reason why we have interlaced: there are some major broadcasting companies out there who would despise progressive because of the demands it places on their signal and bandwidth.
April 21st, 2006, 09:48 PM | #8 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
Marvin is correct, though. Progressive works much better with inter-frame compression like MPEG-2. You don't need to double the bit rate to get 1080p60 at the same quality as 1080i60, plus you get a higher-rez picture since there's no need for interlace filtering.
Interlace is poor compression by modern standards, and should be consigned to the dustbin of history.

Graeme
__________________
www.nattress.com - filters for FCP |
April 21st, 2006, 10:44 PM | #9 |
Major Player
Join Date: Apr 2006
Location: UK
Posts: 204
|
Thanks Graeme, but I'd go further:
"Interlace is poor compression by modern standards" Its throwing away half the picture information to obtain half the data rate, and in that sense its not really compression at all. Any compression has better performance. Leo, "Some of us who work in the Australian TV industry use interlaced because that is the medium to which television is delivered..." This is the same as arguing that since most of a channels input is delivered in standard def now, they should not bother moving to high def ever. Either you want to improve things, or you are happy with stasis. "With Progressive, you record and store every frame. With interlaced, you only leend to record the changes." This is nonsense. "There is a reason why we have interlaced, there are some major broadcasting companies out there who would dspise Progressive based on the difficult nature it has on their signal and bandwidth." You are missing the point, there is no reason why a progressive channel would need to take more bandwidth. 1080p24, 1080p25 or 1080p30 could be broadcast in the same space as 1080i48/50/60 and then displayed on a TV with a duplicate frame, the same thing as we see at the cinema. Fast moving images at worst could be broadcast at 540p and *displayed* at scaled up to 1080p, which makes more sense than displaying an interlaced 1080i signal at 50 fields a second. This would take the same bandwidth using the same compression system as interlaced. But going to the root of the compression system, there is no reason to only present the codec with 540 lines of progressive information, as 1080i is doing. More advanced compression system should deliver improved practical resolution given a 1080p input, even at 50 or 60 fps than a 1080i system can deliver in the same bandwidth. 1080i is throwing away visual information the CODECs never get a chance to compress. With a clever compression system there is no reason why a 1080p50/60 signal could not be compressed to exceed the performance of a 720p broadcast in exactly the same bandwidth, with nothing more than downsampling you can get a 720p50 picture from 1080p50 and then compress that to get potentially the same quality as a 720p50 broadcast, so a more advanced compression system should have no problems exeeding the quality (however slightly) of 720p50 in the same bandwidth given the 1080p50 image. The thing that bothers me the most is that while computer technology, silicon fabrication methods, HD storage doubles in capability every 18months or so, TV is essentially doing the headless chicken dance due to legacy formats and snake oil. We had 1000 line TV being broadcast in the 1940's! Ok, it was 4:3 (As far as I am aware) we are now 60 years on. In that time computers have gone from ENIAC type buildings full of valves being clocked in the kilohertz range, to 64bit CPUs in the region of 5GHz, with total computation in the ballpark of one thousand billion times faster, and the cutting edge of TV is still 1000 interlaced lines. Ok, as far as the end user is concerned, its now in colour and in 16:9 but TV technology seems to move with glacial speed and if the past is anything to go by we'll be stuck with whatever finally emerges as the dominant broadcast method (In the UK its looking to be 1080i even for sport) for the 40 years it took to replace the last system. The people that shape TV technology seem to lack vision. Its considered ok to produce something only fractionally better than the current system even though it will likley take decades before the next standard comes along. 
While most people in the industry recognise 1080p is the hottest thing since microwaved apple pie, the TV manufacturers have responded by covering their products' performance with lies ("Ready for 1080p" = the Sony lie - the Bravia has no 1080p input method). In a few years, when 1080p capability, whether in 24/25/30 or 50/60, becomes the standard for optical sensors intended for HiDef cameras, and the data rates become feasible with off-the-shelf PC hardware, the broadcast industry will be hamstrung by the limitations of the current interlaced format in its standards, its equipment and in the homes of consumers.
April 22nd, 2006, 06:51 AM | #10 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
I certainly see what you're getting at. For instance, you can get a better picture at the same data rate by, instead of doing, say, 4:2:2 or 4:2:0 or 4:1:1, simply compressing the Cb and Cr more strongly but leaving their resolution at 4:4:4. Again, it's the difference between compressing strongly and just throwing away.
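As a rough illustration of how much each subsampling scheme throws away up front - my own numbers, counting samples only, before any actual compression:

[code]
# Samples kept per 1920x1080 frame under common subsampling schemes,
# versus keeping 4:4:4 and letting the codec spend fewer bits on chroma.
WIDTH, HEIGHT = 1920, 1080
luma = WIDTH * HEIGHT

# (horizontal chroma divisor, vertical chroma divisor)
schemes = {"4:4:4": (1, 1), "4:2:2": (2, 1), "4:2:0": (2, 2), "4:1:1": (4, 1)}

for name, (h_div, v_div) in schemes.items():
    chroma = 2 * (WIDTH // h_div) * (HEIGHT // v_div)   # Cb + Cr planes
    total = luma + chroma
    print(f"{name}: {total:,} samples/frame ({100 * total // (3 * luma)}% of 4:4:4)")
[/code]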
What people forget about interlace is that it must be strongly vertically filtered to stop flicker, and this kills resolution. Graeme
__________________
www.nattress.com - filters for FCP |
April 22nd, 2006, 01:19 PM | #11 |
Regular Crew
Join Date: Mar 2006
Location: Zephyr Cove, NV & Anchorage, AK
Posts: 82
While I agree with all of the technical arguments against interlacing, I have to admit that the end result is not always consistent with the technical theory espoused here.
Consider American football on CBS (1080 60i) versus Fox (720 60p). In our case we are using an HD TiVo satellite receiver (DirecTV, NFL Sunday Ticket HD). We watch the games on our post-production projector (NEC iS8-2K), which is actually a 2048x1080 DLP digital cinema projector, projecting on a 16' Stewart Films GrayHawk screen.

The CBS games (1080 60i) look infinitely better than the Fox games (720 60p). The 720p games look blocky and low-rez, while the 1080i games are great. I don't see any more strange motion artifacts on CBS than Fox (there are plenty on both). If anything, Fox has more annoying motion artifacts. In the early days Fox had better slow-mo than CBS, but now I can't tell the difference (except that CBS has higher rez).

We are running the 1080i from the satellite receiver through a high-end scaler (which de-interlaces, among other things) and sending 1080 60p to the projector. Maybe that helps. Of course the scaler also does the best job it can up-rezzing 720p to 1080p. Perhaps Fox is taking other shortcuts that reduce picture quality, but at least for American football I will certainly take 1080 60i over anything else currently being broadcast. Of course I would love 1080 60p broadcasts (or 4K 4:4:4 60p!), and agree with the comments about the slow TV industry. Of course, it is nothing compared to the slow film industry (as if 24p were somehow a virtue!).

Another error in the article cited at the top of this thread: he says "there are no off-the-shelf broadcast cameras that can handle 1080p/60", which is not true. Sony's standard (and very successful) HDTV cameras (HDC-1000 & HDC-1500) do in fact support 1080 60p, although I doubt any major broadcaster is using them that way, except maybe for slow-mo effects.
April 22nd, 2006, 01:35 PM | #12 |
Inner Circle
Join Date: Mar 2003
Location: Ottawa, Ontario, Canada
Posts: 4,220
Graeme, can you explain what you mean by:
"What people forget about interlace is that it must be strongly vertically filtered to stop flicker, and this kills resolution." I may be incorrect in assuming that interlace signals and CRT displays are really part of a system. The combination of our eyes/brain and the CRT fools us into thinking that the image is detailed, lacks flicker etc. Inputing this interlaced image onto a progressive display results in having to manipulate the image into a progressive format. The electronics involved in doing this governs how well the picture looks but is likely to be worse than on a CRT. With a progressive image its possible to take a frame and analyse quality etc. Grabing a field from an interlaced image is only half the vertical detail and yes the other half is thrown away for this scan ( and the alternate half for the next scan) and as such trying to grab a frame from an interlaced image is more a measure of how the deinterlacing algorith performs than the image itself!!! I think the downside at the moment is that for broadcast TV a 4x3 CRT will likely give the best picture and for watching DVD's a 16x9 progressive will be the best. Means one really needs two TV's!!!! Since I really dislike the stutter of the film look I can't wait for 1080p60 myself if I am still around if and when this happens. Ron Evans |
April 22nd, 2006, 01:51 PM | #13 | |
Inner Circle
Join Date: Jan 2006
Posts: 2,699
Quote:
Assuming your observations are accurate, it is conceivable that the differences are caused by other factors, and that the interlace/progressive difference itself adds little to which looks better, one way or the other.
April 22nd, 2006, 02:16 PM | #14 |
Regular Crew
Join Date: Mar 2006
Location: Zephyr Cove, NV & Anchorage, AK
Posts: 82
David, I agree that there are many other factors besides progressive versus interlace. The whole chain as deployed in the real (economic) environment is what matters. Now there are rumours that DirecTV is going to be down-rezzing everything to get more channels out of their satellites. Oh well.
April 22nd, 2006, 04:10 PM | #15 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
Well, I have some FOX native DVCproHD 720p60 on my system, from when they were using Film Effects, and it's by no means blocky or low rez. It's very nice indeed. You can't really compare how two totally different broadcasters do things, because there are too many variables.
One may look better in your home, but because you don't know the route it took to get to your TV, and also how your TV handles progressive and interlaced, you can't compare. I understand that you want to say, "but I have to live with the practical reality of how it looks in my home", and that's not in doubt. The issue is showing how a broadcaster could do better by ditching ancient interlace for better, more modern technology.

As for interlace filtering: when the camera captures the image from the CCD, the vertical resolution of each field is filtered down to about 70% of the full vertical resolution to stop interlace interline-twitter artifacts. That means a 1080i system doesn't have a vertical resolution of around 1080 lines, but more like 760. So each field has about 380 lines of resolution, not 540. A true progressive 720p system has a vertical resolution of around 720 lines, and a 1080p system around 1080 lines - that's quite a bit more resolution than the equivalent interlaced system.

Graeme
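P.S. To put rough numbers on that filtering point (the ~0.7 factor is just the approximate figure quoted above, not a value from any standard):

[code]
# Worked example of the effective-vertical-resolution argument.
# The 0.7 interlace filtering factor is approximate.
INTERLACE_FILTER_FACTOR = 0.7

def effective_vertical_resolution(lines, interlaced):
    return lines * INTERLACE_FILTER_FACTOR if interlaced else lines

print("1080i frame:", round(effective_vertical_resolution(1080, True)))      # ~756
print("1080i field:", round(effective_vertical_resolution(1080, True) / 2))  # ~378
print("720p frame: ", effective_vertical_resolution(720, False))             #  720
print("1080p frame:", effective_vertical_resolution(1080, False))            # 1080
[/code]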
__________________
www.nattress.com - filters for FCP |