View Full Version : 1080i vs 1080p vs 720p



Steve Mullen
January 18th, 2007, 07:46 PM
It's a lot more than just 'reality shows'. You can add to that most HD documentaries & theme shows (virtually everything on HDNet), nature shows (Discovery HD, National Geographic HD, etc.), talk shows such as Leno, Letterman... I could go on.

Nope, interlaced HD is far from going away. I myself much prefer the look of reality that 60i brings to the screen. To my eyes there's nothing like looking at an HDTV and feeling as if you're looking through a window. Progressive just doesn't do that for me. Each to his own. :)

While you are correct that there is no 30p video being used for HDTV -- you are simply wrong about 60i. The best HD is 60p, which is why FOX, ABC, the DOD, and NASA use it. These folks care about perfect motion rendition, which is why they use 60p. This is why the EBU recommended only 50p for Europe.

Douglas Spotted Eagle
January 18th, 2007, 07:55 PM
While you are correct that there is no 30p video being used for HDTV -- you are simply wrong about 60i. The best HD is 60p, which is why FOX, ABC, the DOD, and NASA use it. These folks care about perfect motion rendition, which is why they use 60p. This is why the EBU recommended only 50p for Europe.

So you're saying Fox and ABC are broadcasting 60p?

Ken Ross
January 18th, 2007, 07:59 PM
I'll disagree with you there, Steve. I prefer the full 1920X1080i as opposed to the 720p programming. In fact, most people I've seen comment much prefer football programming on CBS & NBC as opposed to ABC or FOX. You may improve the issue of interlaced artifacts with 720p, but the fact is that with HD, interlaced artifacts are nowhere near as bad as they were with NTSC. Additionally, the vast majority of people simply don't see the difference in motion handling between events on FOX and similar events on CBS & NBC.

The big gain with 1920X1080i is the significantly greater detail. If you don't have a full rez HDTV, you can easily miss that. But to be perfectly honest, I can easily see the difference in detail & sharpness between FOX and CBS. The average show on CBS is significantly sharper than FOX.

By the way, most inside sources I've read have stated that the decision to use 720p was made more for economy than quality. 720p requires less bandwidth than 1080i. Viewer reaction is simply more favorable to 1080i than 720p, and I'm sure it's the added sharpness that does it.

The big change will be when/if we ever get 1920X1080p....THAT will be the flat out best.

Thomas Smet
January 18th, 2007, 11:45 PM
I'll disagree with you there, Steve. I prefer the full 1920X1080i as opposed to the 720p programming. In fact, most people I've seen comment much prefer football programming on CBS & NBC as opposed to ABC or FOX. You may improve the issue of interlaced artifacts with 720p, but the fact is that with HD, interlaced artifacts are nowhere near as bad as they were with NTSC. Additionally, the vast majority of people simply don't see the difference in motion handling between events on FOX and similar events on CBS & NBC.

The big gain with 1920X1080i is the significantly greater detail. If you don't have a full rez HDTV, you can easily miss that. But to be perfectly honest, I can easily see the difference in detail & sharpness between FOX and CBS. The average show on CBS is significantly sharper than FOX.

By the way, most inside sources I've read have stated that the decision to use 720p was made more for economy than quality. 720p requires less bandwidth than 1080i. Viewer reaction is simply more favorable to 1080i than 720p, and I'm sure it's the added sharpness that does it.

The big change will be when/if we ever get 1920X1080p....THAT will be the flat out best.

720p uses almost the same amount of bandwidth as 1080i. Yes, 1280x720 is fewer pixels per frame, but at 60 frames per second the pixel rate is also very high. In fact, 1080i only uses 1/8th more raw bandwidth than a 720p stream. That is not a huge savings.

I don't know how you can say 1080i football games look better. I guess that is your view of it. I recently watched two games, one on ABC and one on NBC, and I flipped back and forth between the two. The 1080i had a lot of compression artifacts in the grass while the 720p was almost perfect. The levels of detail were very close as well. 1080i may have a slight horizontal resolution advantage, but really, to most people they look to have the same amount of detail. My wife even asked me the other day why ABC looks better than NBC. She couldn't care less about this stuff, so I had never told her before what the difference was. She noticed right away, however, that ABC had a cleaner, artifact-free image while all the other stations can fall apart easily. About the only thing that really does look better on 1080i is still graphics such as the score graphics. Even those don't look better to me, just slightly sharper. The 720p graphics look perfectly fine to me, and in fact in motion-graphic opening sequences with particles and flares and a lot of fast-moving elements, the 720p broadcast looked almost perfect with no artifacts at all.
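The 1/8th figure above checks out as raw pixel arithmetic (a quick sketch only; actual broadcast bitrates depend on compression, and the variable names here are just illustrative):

```python
# Raw pixel throughput of the two ATSC HD formats, before any compression.
p720_rate = 1280 * 720 * 60    # 720p60: full frames, 60 per second
i1080_rate = 1920 * 1080 * 30  # 1080i60: 60 fields/sec = 30 full frames/sec

print(p720_rate)               # -> 55296000
print(i1080_rate)              # -> 62208000
print(i1080_rate / p720_rate)  # -> 1.125, i.e. exactly 1/8 more raw pixels
```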

Steve Mullen
January 19th, 2007, 12:11 AM
So you're saying Fox and ABC are broadcasting 60p?

Yes. And all NASA and DOD video is also 720p60. Obviously, NASA needs max temporal resolution with no interlace artifacts.

The EBU is going to push for only 50p once Sony gets its products to 1080p50. Sony is already shipping switchable 1080i and 1080p studio cameras.

That's why 3ClearVid is so important to Sony. The CMOS and EIP all run at 1080/60p; it's only the recording system that needs to be enhanced. First comes "Full HD" AVCHD, already spec'd at 1920x1080/60i at 24Mbps. This arrives in 2007. (Not speculation -- I have the spec from Japan.)

Douglas Spotted Eagle
January 19th, 2007, 12:24 AM
Yes. And all NASA and DOD video is also 720p60. Obviously, NASA needs max temporal resolution with no interlace artifacts.

The EBU is going to push for only 50p once Sony gets its products to 1080p50. Sony is already shipping switchable 1080i and 1080p studio cameras.

That's why 3ClearVid is so important to Sony. The CMOS and EIP all run at 1080/60p; it's only the recording system that needs to be enhanced. First comes "Full HD" AVCHD, already spec'd at 1920x1080/60i at 24Mbps. This arrives in 2007. (Not speculation -- I have the spec from Japan.)

Brian Kitchens, engineer for ABC Western Regional office: "If we're broadcasting 720p60, it's news to me."

Steve Mullen
January 19th, 2007, 12:35 AM
I don't know how you can say 1080i football games look better. I guess that is your view of it. I recently watched two games, one on ABC and one on NBC, and I flipped back and forth between the two. The 1080i had a lot of compression artifacts in the grass while the 720p was almost perfect. The levels of detail were very close as well. 1080i may have a slight horizontal resolution advantage, but really, to most people they look to have the same amount of detail. My wife even asked me the other day why ABC looks better than NBC.

You've stated it very well! Compression artifacts on NBC and CBS, as well as all the premium 1080i channels, are horrible. And certainly not worth the very slight increase in resolution of 1080i over 720p. Plus, we know more than 50% of HDTVs fail to deinterlace well -- so their vertical resolution is less than when viewing 720p. (So much for Full HD.)

Oh, yes -- DirecTV cuts 1920 to 1280 and so do some cable companies. Which means there's a good chance folks aren't really seeing any increased resolution with 1080.

Yes, studio cameras offer 1920, but everything shot with HDCAM -- which is the kind of stuff we shoot -- is 1440 versus 1280. That's a nearly invisible difference.

Moreover, a 720p60 broadcast has room for one SD channel. There is no room in 1080i, yet many stations are adding one or even two SD sub-channels. So the desire for profits cuts 1080i quality.

Steve Mullen
January 19th, 2007, 01:21 AM
Brian Kitchens, engineer for ABC Western Regional office: "If we're broadcasting 720p60, it's news to me."

Leave it be Spot -- ABC has been 720p60 from day 1. As has FOX and ESPN1 and ESPN2.

"Good HDTV: It's More Than a Numbers Game

by Randy Hoffner
ABC Television Network

Since digital HDTV broadcasting began, we have heard a lot of discourse about the two HDTV scanning formats that are used by broadcasters: 1080i and 720P.

The other HDTV scanning format, 720P, is a progressively-scanned format. Each 720P line is made up of 1,280 pixels, and there are 720 lines in each frame. 720P is typically transmitted at about 60 full frames per second, as opposed to 1080i's 60 half-frames per second. This affords 720P some significant advantages in picture quality over 1080i, advantages such as improved motion rendition and freedom from interlace artifacts.

The advocates of 1080i HDTV support their cause with a flurry of numbers: 1080 lines, 1920 pixels per line, 2 million pixels per frame. The numbers, however, don't tell the whole story. If we multiply 1920 pixels per line times 1080 lines, we find that each 1080i frame is composed of about two million pixels. 1080i advocates are quick to point out that a 720P frame, at 1280 pixels by 720 lines, is composed of about one million pixels. They usually fail to mention that during the time that 1080i has constructed a single frame of two million pixels, about 1/30 second, 720P has constructed two complete frames, which is also about two million pixels. In fact, if the horizontal pixel count of 1080i is reduced to 1440, as is done in some encoders to reduce the volume of artifacts generated when compressing 1080i, the 1080i pixel count per second is less than that of 720P.
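The article's pixel counts can be verified directly (a sketch; these are the raw per-second counts the article itself is using, not broadcast bitrates):

```python
# Pixels delivered per second by each format, per the article's arithmetic.
per_sec_720p  = 1280 * 720 * 60   # two ~1M-pixel frames per 1/30 sec
per_sec_1080i = 1920 * 1080 * 30  # one ~2M-pixel frame per 1/30 sec
per_sec_1440  = 1440 * 1080 * 30  # 1080i with the horizontal count cut to 1440

print(per_sec_720p, per_sec_1080i, per_sec_1440)
# -> 55296000 62208000 46656000: the 1440-wide variant does fall below 720P
```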

Another parameter 1080i advocates use to advance their cause is resolution. Resolution is the ability to preserve the separate components of fine detail in a picture, so that they may be discerned by the viewer. But picture quality is not dependent on resolution alone. Numerous studies of perceived picture quality reveal that it is dependent on brightness, color reproduction, contrast, and resolution. Color reproduction is identical in all HDTV scanning formats, and may thus be disregarded as a factor. A typical study assigns the following weights to brightness, contrast, and resolution:

Contrast 64%

Resolution 21%

Brightness 15%

Resolution, then, is only a factor, and not the largest factor, in the determination of the subjective quality of a television picture.

Television pictures move, so when we consider resolution, dynamic resolution is typically a more important factor than static resolution. Similarly, a moving 1080i picture may have its vertical resolution reduced to around 540 lines. Thus, the real vertical resolution of a 1080i picture dynamically varies between the limits of almost 1080 lines and almost 540 lines, depending on the degree and speed of motion. This resolution degeneration in interlaced scanning has been well known for many years, and its degree is quantified by application of the interlace factor, which effectively specifies dynamic vertical resolution as a percentage of the total number of lines in an interlaced frame. Progressive scanning does not have this problem, and the dynamic vertical resolution of a 720P picture is very close to 720 lines under any conditions of motion.

Results of testing done by the Japanese broadcaster NHK in the early 1980's indicate that picture quality achieved with interlacing is nearly equivalent to that achieved from progressive scanning with only 60 percent of the number of scanning lines, which is an interlace factor of 0.60. This finding agrees with the 1967 study, and also with another study that was published back in 1958. What this means to the HDTV viewer is that the vertical resolution of any HDTV pictures that have a vertical motion component is better in 720P than in 1080i. Based on the above findings, progressively-scanned images equivalent to the observed dynamic vertical resolution of 1080i may be achieved using only 648 lines. If we want to play a numbers game, 720P has better dynamic vertical resolution than 1080i by 72 lines.
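The interlace-factor arithmetic above reduces to a couple of lines (a sketch of the article's own numbers):

```python
# NHK's finding: interlaced scanning delivers roughly the dynamic vertical
# resolution of a progressive picture with 60% as many lines.
interlace_factor = 0.60

dynamic_1080i = 1080 * interlace_factor  # effective lines of a moving 1080i picture
print(dynamic_1080i)                     # -> 648.0
print(720 - dynamic_1080i)               # -> 72.0, 720P's dynamic advantage in lines
```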

Horizontal motion also causes artifacts when interlaced scanning is used. Depending on its speed, horizontal motion in interlaced scanning generates distortions that range from serrated edges, through blurriness, to double images in the extreme case.

But wait, there's more! The resolution impairments of interlace, plus the fact that progressive scanning affords far better motion rendition than interlaced scanning, make it apparent that a football game, for example, would be much more enjoyable in 720P than in 1080i. Add to this its freedom from other well-known interlace artifacts such as visibility of scanning lines, line crawl, and flickering aliases, and it quickly becomes clear that 720P is equal to, if not better than, 1080i in the representation of real-world, moving television images.

We have seen that interlaced scanning was born as a compromise to conserve analog bandwidth; a compromise that results in picture impairments and artifacts. A DTV broadcast is limited not by analog bandwidth but by digital bandwidth: the critical limitation is on the number of digital bits per second that may be transmitted. In order to broadcast DTV pictures, their bit rate must be aggressively reduced by digital compression to fit within the broadcast channel or pipeline that is available. The digital bits representing HDTV pictures must be compressed by a ratio that averages around 70 to 1 in order to fit into the 19 megabit-per-second DTV transmission channel. This creates a "funnel effect": for each 70 bits that enter the funnel's large end, only a single bit passes through the small end of the funnel into the transmission channel. Digital compression technology is improving rapidly, but it has been consistently observed that 720P HDTV pictures may be compressed much more aggressively than 1080i pictures before they become visually unacceptable. In fact, compression of 1080i pictures routinely generates visible artifacts, particularly when the pictures contain fast motion or fades to or from black. These artifacts cause the picture to degenerate into a blocky, fuzzy mosaic that may be observed frequently in 1080i broadcasts. The stress level to the HDTV broadcast system caused by bit rate reduction is much lower for 720P, and blockiness artifacts are seldom observed in 720P broadcast pictures. It may be expected that 720P will always lead 1080i in compressibility and freedom from compression artifacts, because progressive scanning is by its nature superior in the area of motion estimation. This gives it a "coding gain" relative to interlaced scanning, and the result will always be delivery of the same picture quality at a lower bit rate.
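The ~70:1 funnel figure is roughly reproducible from nominal link rates (a ballpark sketch; the 1.485 Gbps HD-SDI figure is my assumption, includes blanking, and is not necessarily the article's exact basis):

```python
# Ballpark check of the "funnel effect" compression ratio.
hd_sdi_bps = 1.485e9   # nominal uncompressed HD-SDI serial rate (assumed basis)
atsc_bps   = 19.39e6   # ATSC payload bitrate in a 6 MHz broadcast channel

print(round(hd_sdi_bps / atsc_bps))  # -> 77, in the ballpark of "around 70 to 1"
```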

We saw previously that the real vertical resolution of 720P pictures is better than that of 1080i pictures. It is also true that the additional horizontal resolution that 1080i boasts cannot be displayed on any currently available consumer HDTV display of any technology. Fortunately for the viewer, it is not necessary to the enjoyment of HDTV. An instructive illustration is the much-admired digital cinema, where micromirror projectors are used to project theatrical features onto screens that may be 50 feet or more wide. The horizontal resolution capability of these projectors is 1280 pixels, the same as that of 720P, and we have not heard anyone complain that digital cinema has inadequate horizontal resolution.

All these advanced displays are inherently scanned progressively, and 720P may be displayed on all of them without the potentially image-degrading de-interlacing step."

AND MORE:

"What production format does FSN (FOX) HD use?
FSN HD broadcasts games in the 720p format. The 720p format presents a picture with 720 vertical pixels and 1280 horizontal pixels. The 'p' stands for progressive scanning, which takes 60 sharp, complete pictures per second, producing spectacular moving images. In fact, because of its ability to render ultra-sharp pictures of subjects in fast motion, 720p provides excellent resolution for HD sports programming."

AND MORE:

"ESPN HD, launched March 30th, 2003 is a high-definition simulcast of the cable television network ESPN, both owned by Disney that broadcasts 24 hours a day, 7 days a week. It uses the 720p standard because the progressive 'p' nature of that signal is thought by some to be better for the fast fluid motion seen in sports."


AND MORE:

The European Broadcasting Union (EBU) recommends to its members to use 720p50 for emission with the possibility of 1080i50 on a programme-by-programme choice and 1080p50 as a future option. Sveriges television in Sweden broadcasts in 720p50.

Case closed.

Ken Hodson
January 19th, 2007, 03:12 AM
Archaic is archaic. Should Sony make an interlace-only Blu-ray? Why not? Why is all digital media inherently progressive? Hmm.
I don't see any modern TV of quality being interlaced, so why?
Film is progressive. Digital is progressive, so let's convert it to interlaced? Please, help my logic.
Are PCs' digital outputs interlaced? Why?
Is my LCD monitor interlaced?
Is my plasma interlaced?
Is my projector interlaced?
What single part of the future is interlaced? (Besides bandwidth-starved HDTV.)
Tell me how to buy interlaced, please!!
Whose content is derived from all sources, including film (progressive)?
Everyone is buying progressive. Why are the capture cams so slow on the realization? Huh? Of course: marketing!!!!

Ken Ross
January 20th, 2007, 03:50 PM
Actually, I'm pretty sure National Geographic does 30P. They shoot on Varicam and deliver 30P for the US. I know someone who works there. My guess is that Discovery is the same. Just because it is delivered in 1080i doesn't mean it was shot in 1080i. You can show a progressive scan image on an interlaced format (just not the other way around). To me 60i looks too live. But that's an aesthetic opinion, of which there are many.

I think it's not necessarily progressive that you don't like, but the lesser temporal detail of a 30P image. 60P will replace 60i at some point, which will give the same window effect but not have the problems that interlacing introduces.

Brett, in actuality both Discovery HD & NG HD use both 30p and 60i. Some shows are shot one way and others another way. But it's very easy to tell which is which while watching. However, I certainly do prefer the look of 60i...to me it's the greatest window effect on TV.

Since I haven't seen 60p, I'll reserve judgement on that. I'm sure it's a very subjective thing.

Alex Leith
January 20th, 2007, 04:00 PM
At the same resolution, 60p would presumably look even MORE realistic than 60i, given that interlace only gives you half the vertical resolution every 60th of a second, whereas progressive gives you the full raster.

720p60 and 1080i30 give you approximately the same number of "fresh" pixels each second.

I know the BBC are suggesting producing in 1080p25 and 1080i50 depending on the nature of the show.

Personally I don't think "i" has too much life left, given that television audiences are getting smaller, major broadcasters are likely to shift distribution increasingly towards on-demand, internet-based delivery, and most HD flat-panel televisions can't even display interlaced pictures without deinterlacing them first anyway!

Douglas Spotted Eagle
January 21st, 2007, 12:32 AM
"i" has a *lot* of life left. A lot of life. It's not going anywhere soon. PsF will be more likely to take its place, as the two can work well together.
But either way, with all the broadcast support in place for a while to come, "i" broadcasts will be with us. The web is but a *very* small destination for watching content, and until televisions and computers converge more conveniently, they'll be separate.
I wish that weren't so, but it is...
Acquiring in "i" might not have such a future left, however...

Steve Mullen
January 21st, 2007, 02:03 AM
Actually I'm pretty sure National Geographic does 30P. They shoot on Varicam and deliver 30P for US. I know someone who works there. My guess is the Discovery is the same. Just because it is delivered in 1080i doesn't mean it was shot in 1080i.

I'm not sure where this idea of ANYONE using 30p for broadcast comes from. They don't! It would look horrible, which is why FOX gave up the idea of going DTV using 480p30.

This idea even appeared in a Panasonic PR. I caught the error. Here is their reply:

`We queried Craig Piligian, and indeed he clarified what's in the story, to wit:'


"We used to shoot Chopper with the Varicams at 720p/60. Since the Varicam is a variable frame rate camera, it was more than we needed in that respect, but it did not shoot 1080i.

When Panasonic came out with the HDX900, we found it to be the perfect combination of a great HD picture at an affordable price. Since we deliver a 1080i master, we decided to shoot at 1080i to avoid converting and maintain a clean path.

We are now shooting at 1080i/60 with the HDX900 and are very pleased with the picture quality, ease of use, and affordability."

`So, eagle eye, you're right on the money! All the best, P.'

--------


"Since I haven't seen 60p, I'll reserve judgement on that. I'm sure it's a very subjective thing."

If you watch ABC, ESPN, and FOX -- you have only been watching 60p.

-----------------

There are only two ATSC formats used in the USA and their proper designations are:

1080/60i or 1080i60 -- there is no such thing as 1080i30.

720/60p or 720p60

24fps movies are sent with 2-3 pulldown for both 1080/60i and 720/60p.

Some are thinking of transmitting film at 1080/24p as it will fit in a 6MHz channel. This would support displays that can use 24p, 48p, or 72p.
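The 2-3 pulldown mentioned above has a simple cadence: every 4 film frames become 10 video fields, which is how 24fps material rides in a 60-field stream (a minimal sketch):

```python
# 2-3 pulldown: alternate emitting 2 and 3 fields per film frame.
film_frames = ["A", "B", "C", "D"]  # four consecutive 24fps film frames
cadence = [2, 3, 2, 3]              # fields emitted for each frame

fields = []
for frame, count in zip(film_frames, cadence):
    fields.extend([frame] * count)  # repeat the frame for its field count

print(fields)       # -> ['A', 'A', 'B', 'B', 'B', 'C', 'C', 'D', 'D', 'D']
print(len(fields))  # -> 10 fields per 4 frames: 24fps * 10/4 = 60 fields/sec
```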

---------------

The BBC is not going to use 1080p25 for live video -- only for movies that are 25p. Live will be 1080i50 until it can go to 1080p50.

----------------

DSE seems to have removed my quotes from ABC, ESPN, and FOX that confirmed their use of 720p60.

Tony Tremble
January 21st, 2007, 04:06 AM
What has 'i' vs 'p' got to do with V1 vs XH-A1?

Steve you are very quick to tell other people to stay on topic. Perhaps you should take a piece of your own advice.

Thanks

TT

Alex Leith
January 21st, 2007, 04:41 AM
The BBC is not going to use 1080p25 for live video -- only for movies that are 25p. Live will be 1080i50 until it can go to 1080p50.


That's not entirely true. Whilst delivery is still going to be 1080i50, the BBC are recommending that any programming that would benefit from a film "look" is acquired as 25p.

More programming in the UK is p25 (25psf) than i50. Films, dramas, soaps, documentaries, sitcoms, even gameshows.

Web is but a *very* small destination for watching content, and until televisions and computers converge more conveniently, they'll be separate.

You're right, but I don't think computers and televisions will merge 'cause we use them in different contexts. However, content delivery systems are already merging, and the technology that plays downloaded content on your TV (like Apple TV, etc) is going to quickly become much easier (and more integrated) into home entertainment systems.

We spend more time on the internet than watching TV each week, and broadcasters want to find new ways of getting our attention because they want the revenues.

(sorry Tony - this is still off topic... but I just can't help myself)!

Steve Mullen
January 21st, 2007, 05:12 AM
That's not entirely true. Whilst delivery is still going to be 1080i50 the BBC are recommending that any programming that would benefit from a film "look" are acquired as 25p.

More programming in the UK is p25 (25psf) than i50. Films, dramas, soaps, documentaries, sitcoms, even gameshows.


Of course film and episodic programming is 25p there, as it is 24p here. But I can tell you there are NO HD soaps, taped documentaries, sitcoms, game shows, news, or sports done in 24p here or anywhere else in the world. This must be something British. :)

A USA network tried broadcasting a major award show in HD 24p "film look" and the response was fast and angry: Absolute crap! Impossible to watch! Made me sick! I saw the show and the comments were correct. Motion judder is a horrible artifact.

Those of us with JVC HD1 and HD10 have had to live with motion judder for many years. Every camera review claimed -- somewhat unfairly -- that the video was unusable. Obviously, not many cameras were sold.

Moreover, in Asia ALL narrative programming -- in fact everything except the broadcast of film -- is done on video, and 50i/60i is the only format used. When I was in a Korean airport I watched an HD video drama shot in 1080/60i. It looked great! The belief in using a "film look" is confined to the USA and Europe.

In fact, in the USA 24p is only used for episodic dramas in prime time, not for any other HD programming. And the reason may be less about esthetics than the fact that 24p can very easily be converted to ANY format, which is good for worldwide sales.

Ali Husain
January 21st, 2007, 05:54 AM
Could someone comment on the dynamic range of the V1? Is it greater than the A1's?

i was just looking at the footage posted here:

http://www.dvinfo.net/conf/showthread.php?t=84373

the footage in this link:

http://geekstudios.com/demos/shoeshine.m2t

shows pretty impressive dynamic range and colors (maybe a bit oversaturated). I don't know if it's been CC'd, but I haven't seen any XL-H1 or A1 footage, CC'd or not, that looks like that. Comments?

thanks!

Alex Leith
January 21st, 2007, 08:33 AM
Sony claims that the V1 has "improved" dynamic range thanks to the CMOS imaging sensors it uses, which are better than CCDs (the sensor type used by the A1) at handling light latitude.

However, the CMOS isn't the only link in the chain that determines the amount of latitude a camera can display in an image.

The Canon XH-A1 has a reasonably impressive dynamic range (for this class of camera) of 8.3 stops.

Looking at comparative images, the V1 undoubtedly has a greater range, but I would imagine the difference is maybe less than a stop. I would be very interested to see test data.

Where the V1 does excel is how it handles near-blown out objects. The A1 does a reasonable job of handling highlights, but the V1 hangs on to detail in bright objects longer than the A1.

However, neither of these cameras display the range that larger cameras can, and I don't think any camera has yet come anywhere near the 16 stop potential that 8bit formats have.

Tony Tremble
January 21st, 2007, 08:53 AM
Alex

When you say A1 do you mean Sony HVR-A1 or Canon Xh-A1?

The jury is still out with me on whether it is the CMOS chips alone that are the reason for the greater latitude. It is more likely a combination of the EIP and CMOS that really makes the difference: since every pixel can be addressed, more targeted image processing can be achieved.

The Contrast Enhancer does find more detail in darker areas, even with black stretch engaged. The effect can be subtle, but in hard light it is a godsend, taking the image beyond the typical video look.

I think CCD is in its last generation, judging by the beauty of the V1 image.

TT

Alex Leith
January 21st, 2007, 09:03 AM
I was talking about the Canon XH-A1 (I've just edited my post to make that clear).

I agree with you that CMOS is undoubtedly mature enough to use in videography. And the image from the V1 does (generally) look great.

However, all I was debating was whether the dynamic range in the V1 is actually vastly improved over CCD cameras.

Douglas Spotted Eagle
January 21st, 2007, 09:09 AM
I'm not sure where this idea of ANYONE using 30p for broadcast comes from. They don't! It would look horrible, which is why FOX gave up the idea of going DTV using 480p30.

This idea even appeared in a Panasonic PR. I caught the error. Here is their reply:

`We queried Craig Piligian, and indeed he clarified what's in the story, to wit:'


"We used to shoot Chopper with the Varicams at 720p/60. Since the Varicam is a variable frame rate camera, it was more than we needed in that respect, but it did not shoot 1080i.

When Panasonic came out with the HDX900, we found it to be the perfect combination of a great HD picture at an affordable price. Since we deliver a 1080i master, we decided to shoot at 1080i to avoid converting and maintain a clean path.

We are now shooting at 1080i/60 with the HDX900 and are very pleased with the picture quality, ease of use, and affordability."

`So, eagle eye, you're right on the money! All the best, P.'

--------


"Since I haven't seen 60p, I'll reserve judgement on that. I'm sure it's a very subjective thing."

If you watch ABC, ESPN, and FOX -- you have only been watching 60p.

-----------------

There are only two ATSC formats used in the USA and their proper designations are:

1080/60i or 1080i60 -- there is no such thing as 1080i30.

720/60p or 720p60

24fps movies are sent with 2-3 pulldown for both 1080/60i and 720/60p.

Some are thinking of transmitting film at 1080/24p as it will fit in a 6MHz channel. This would support displays that can use 24p, 48p, or 72p.

---------------

The BBC is not going to use 1080p25 for live video -- only for movies that are 25p. Live will be 1080i50 until it can go to 1080p50.

----------------

DSE seems to have removed my quotes from ABC, ESPN, and FOX that confirmed their use of 720p60.

DSE didn't remove your quotes. DSE removed your personal comments about another poster and moved this thread. If anything has been removed, it was removed by another moderator, but that quote seems to be there.

Tony Tremble
January 21st, 2007, 09:16 AM
However, all I was debating was whether the dynamic range in the V1 is actually vastly improved over CCD cameras.

I know you were. :)

Even doing a side-by-side shoot between the XH-A1 and HVR-V1, you still wouldn't be able to tell how much of the difference, if any, is due to DSP/EIP technowizardry!

It would be interesting to find out though with some hard evidence not some glib statement like the usual marketing bumf recital.

TT

Benjamin Eckstein
January 21st, 2007, 09:17 AM
Not entirely on topic, but since things have strayed a bit, and now we are back onto the CMOS sensors.....

How would the V1 and Z1 match up in a multi-cam environment? I shoot a lot of docs and corporate work with a company that uses Z1s. I generally take their gear out, but if I get the V1, will it match well enough with stuff shot with the Z1 and its CCD sensor?

Ken Ross
January 21st, 2007, 09:27 AM
"Since I haven't seen 60p, I'll reserve judgement on that. I'm sure it's a very subjective thing."

If you watch ABC, ESPN, and FOX -- you have only been watching 60p.



Steve, what I meant to say is that I haven't seen 1920X1080p. :)

Ken Ross
January 21st, 2007, 09:45 AM
I don't know how you can say 1080i football games look better. I guess that is your view of it. I recently watched two games, one on ABC and one on NBC, and I flipped back and forth between the two. The 1080i had a lot of compression artifacts in the grass while the 720p was almost perfect. The levels of detail were very close as well.

There are many variables, Tom, such as your display and service provider. My opinion is shared by most others on a different forum where this type of thing is discussed endlessly. Most truly prefer the additional detail in 1920x1080i.

Now, the reason I say much depends on your service provider and display is that I used to have DirecTV and I too saw many compression artifacts. I originally thought this was the broadcast until I realized how much compression was being done by DirecTV itself. I switched to Verizon's FiOS and it's truly remarkable how much less compression I see now. It's almost non-existent. This has served to further illustrate to me the additional detail in the 1920X1080 picture vs the 1280X720p picture. Those with full-rez HDTVs that are capable of reproducing all of the 1920X1080i signal say the difference is even greater on those sets. Some people with 720p displays that may not downconvert 1080i so well will naturally prefer the 720p broadcasts. So yes, I fully stand behind my feeling that 1080i is the preferred medium for live broadcasts and documentaries when your display and service provider can adequately show the quality of the signal.

Ken Ross
January 21st, 2007, 10:08 AM
The above explanation of 720p by Randy Haffner is interesting but also self-serving. I've seen equally impressive discourse by engineering heads of CBS and NBC as to the advantages of 1920X1080i...so it's not quite so simple.

One thing Randy seems to leave out is the fact that with virtually all live broadcasts, static or near static shots (yes, even sports!) are the rule. Very fast motion is NOT the rule with any sport. So this mitigates, to some degree, the 'p' advantage. With the static and near static shots, the greater detail in 1920X1080 is very obvious (at least to me and many many others who discuss this ad nauseam). The other thing that Randy leaves out is that the weighting factors may well be true for "Joe Six Pack", but videophiles can see beyond simple contrast. Just as many people preferred the look of many EDTVs over HDTVs for the simple reason that many EDTVs provided greater contrast, videophiles saw beyond that. If Randy's weighting factors were so true, why even bother with the expense of full-rez 1080p HDTVs when rez is 'relatively unimportant'? A 1080p display will show no improvement on a 720p broadcast since the current crop of 720p & 768p displays are already 'good enough' to show all that is present in the 720p broadcast. However a 1080p display will show added detail from a 1920X1080 broadcast that current non-1080p displays are missing. No, it goes beyond this.

The other thing that's simply ignored is the quality of the deinterlacer & scaler in the display itself. A high quality display with high quality scalers and deinterlacers will provide a 1920X1080 picture that's almost completely devoid of interlaced artifacts. When people hear 'interlaced artifacts', they immediately think back to the days of NTSC. Those artifacts ARE horrible. It is a totally different ballgame with HD. I've got a Fujitsu plasma that only does 1366X768 and I'll tell you that interlaced artifacts are extremely rare on any live sporting event that broadcasts in 1920X1080. The Fujitsu plasmas are known for top quality signal processing. Yes, I've seen some low quality displays that show many artifacts due to poor scalers and deinterlacers that rob resolution. But to say that interlaced artifacts always originate with the broadcasters is simply not the case. You need to look at all elements in the chain. As I've said before, simply switching from the horrid Directv broadcasts to the pristine FIOS broadcasts is like day and night.

And of course if we talk about film, 720p broadcasts will never ever equal the quality of movies broadcast in 1920X1080.

But this is not to say that 720p is not capable of stellar images, it is, but I'm simply saying I prefer the greater detail of 1920X1080. IMO it's what HD is all about....sharper, more detailed imagery.

So IMO, I'll take the occasional interlaced artifacts for the greater detail I see almost all the time in 1920X1080. And as I've said, owners of full rez displays contend the difference is even greater with those displays.

Thomas Smet
January 21st, 2007, 11:28 AM
I don't use direcTV. I have Charter Cable for my HD provider. If the best in my area looks this bad for 1080i then HD really has a long way to go yet because only the top markets are at high quality.

Ken, just how close are you actually sitting in front of your TV? There was a chart recently that talked about viewing distance and HD resolutions, and pretty much any TV 50" or smaller with a viewing distance of 10' or more would gain nothing at all by using any resolution higher than 720p. It is my understanding that most HDTVs sold are either 42" or 50" and sit in living rooms with at least an 8' or 10' viewing distance.
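The viewing-distance rule of thumb above can be sanity-checked with a quick back-of-the-envelope calculation, assuming the common approximation that the eye resolves about one arcminute per line (a simplification; real acuity varies by viewer and content):

```python
import math

def resolvable_lines(diagonal_in, distance_in, aspect=16 / 9):
    """Vertical lines a ~1 arcminute/line eye can resolve on this screen."""
    height = diagonal_in / math.hypot(aspect, 1)          # screen height in inches
    angle_deg = 2 * math.degrees(math.atan(height / (2 * distance_in)))
    return angle_deg * 60                                 # arcminutes ~= lines

# 50" 16:9 screen viewed from 10 feet (120 inches)
print(round(resolvable_lines(50, 120)))   # ~700 -> 720p is about the limit

# Move to a 65" screen at 8 feet and 1080 lines become worthwhile
print(round(resolvable_lines(65, 96)))    # ~1130
```

On these numbers, a 50" set at 10' resolves roughly 700 lines, which is why the chart concluded 720p is "enough" at typical living-room distances; a bigger screen or a closer seat changes the answer.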

John Miller
January 21st, 2007, 11:52 AM
When people hear 'interlaced artifacts', they immediately think back to the days of NTSC. Those artifacts ARE horrible.

I'm still living in the past, then (along with a lot of others, I'm sure). I have only SD NTSC and PAL equipment. Except when I view video on a PC's non-interlaced monitor or I pause playback, I've never seen 'interlaced artifacts'.

Are you saying that standard programming that is watched as it should be (i.e., not pausing or viewing on a progressive display) is full of artifacts?

Ken Ross
January 21st, 2007, 12:11 PM
I don't use direcTV. I have Charter Cable for my HD provider. If the best in my area looks this bad for 1080i then HD really has a long way to go yet because only the top markets are at high quality.

Ken, just how close are you actually sitting in front of your TV? There was a chart recently that talked about viewing distance and HD resolutions, and pretty much any TV 50" or smaller with a viewing distance of 10' or more would gain nothing at all by using any resolution higher than 720p. It is my understanding that most HDTVs sold are either 42" or 50" and sit in living rooms with at least an 8' or 10' viewing distance.

Tom, I've heard some horrendous things about Charter Cable, so I'm not surprised your HD looks so bad. You can't believe the picture quality on FIOS...it truly is everything people are saying about it. Coming from Directv it was a shocker to say the least. All the artifacts are essentially gone.

I'm sitting about 8' from my 50" Fujitsu. When I go 1080p, which I will, I'll be up at a 60" (new Pioneer shown at CES with new gen blacks) or possibly a 65" Fujitsu. So I agree that to appreciate the benefits of 1080p, you need a larger screen and a suitable viewing distance. At my viewing distance I should be able to see the full-rez benefits of 1080p. But my point was that the benefits are there. Reading the ABC blurb, one would be led to think there are essentially no benefits to this added rez.

Ken Ross
January 21st, 2007, 12:13 PM
I'm still living in the past, then (along with a lot of others, I'm sure). I have only SD NTSC and PAL equipment. Except when I view video on a PC's non-interlaced monitor or I pause playback, I've never seen 'interlaced artifacts'.

Are you saying that standard programming that is watched as it should be (i.e., not pausing or viewing on a progressive display) is full of artifacts?

Yes John, with our NTSC system, many broadcasts suffered quite badly from interlaced artifacts when viewed on a standard NTSC TV. Once pointed out, you'll see them forever. You can certainly see them with a larger screen or relatively close viewing distance. As I said, at my same distance with HDTV, these artifacts are essentially gone...with 1080i or 720p.

Tony Tremble
January 21st, 2007, 12:25 PM
Ken, just how close are you actually sitting in front of your TV? There was a chart recently that talked about viewing distance and HD resolutions, and pretty much any TV 50" or smaller with a viewing distance of 10' or more would gain nothing at all by using any resolution higher than 720p. It is my understanding that most HDTVs sold are either 42" or 50" and sit in living rooms with at least an 8' or 10' viewing distance.

I was going to mention this too.

The "Full HD 1080P" is another corporate led initiative to get people to buy a new set when they really don't need to.

In tests with the most popular HDTV size screens and at normal viewing distances consumers are not seeing the difference between 1080i and 1080P and that's on 1080P screens. The more the image is moving the less the resolution is important and other factors dominate. But we know that because we know that HDV/MPEG et al work.

I read another report on 2k vs 4k in cinemas and it came to the conclusion that 4k was OTT because the vast majority of screens in multiplexes are far too small to gain any benefit whatsoever. There will be only a handful of people who'll ever see 4k. But even then, as soon as the image moves, the brain's ability to "see" the detail drops like a stone and the image might as well be 2k!

TT

John Miller
January 21st, 2007, 12:30 PM
Yes John, with our NTSC system, many broadcasts suffered quite badly from interlaced artifacts when viewed on a standard NTSC TV. Once pointed out, you'll see them forever. You can certainly see them with a larger screen or relatively close viewing distance. As I said, at my same distance with HDTV, these artifacts are essentially gone...with 1080i or 720p.

Curious. I usually have a critical eye and must just be blind to them (NTSC). I'm sensitive to other odd-ball vision things - such as the red LEDs on transmission towers (I can tell if they are strobed or continuous) and a similar trend with cars' brake lights. And, when I now see PAL material, I find the image too flickery. I must have become used to 59.94i vs 50i.

Ken Ross
January 21st, 2007, 04:13 PM
I was going to mention this too.

The "Full HD 1080P" is another corporate led initiative to get people to buy a new set when they really don't need to.

In tests with the most popular HDTV size screens and at normal viewing distances consumers are not seeing the difference between 1080i and 1080P and that's on 1080P screens. TT

Well Tony it's not just the difference between 'i' and 'p', it's more the difference between flat panels that are the norm today (1366X768) vs full rez (1920X1080) panels that are just coming out. In fact today there are only 2 full rez plasmas on the market, a 65" Panasonic and a 50" Pioneer. There are of course many full rez LCDs. People often get confused when they see the 'p' in the 1080p thinking that all they're gaining is going from interlaced to progressive. The fact is that virtually all LCDs & plasma panels sold are progressive, but of the lower, 1366X768 resolution. Fujitsu and Hitachi make an interlaced plasma (ALIS) that does 1080X1080, but that's the only one of its breed that I'm aware of.

So you can see if you're in the plasma market (where the best flat panel pictures are), going to 1080p means you're really going up considerably in the resolution of the display.
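The size of that jump is easy to quantify with simple arithmetic on the pixel counts (a rough sketch; it ignores subpixel layout and scaling quality):

```python
# Raw pixel counts for the three panel classes under discussion
panels = {
    "720p broadcast":      (1280, 720),
    "typical '768' panel": (1366, 768),
    "full-rez 1080 panel": (1920, 1080),
}

for name, (w, h) in panels.items():
    print(f"{name}: {w * h / 1e6:.2f} megapixels")

# Going from a 1366x768 panel to full rez roughly doubles the pixels:
print(1920 * 1080 / (1366 * 768))   # ~1.98
```

So "going to 1080p" from today's common panels is nearly a 2x increase in displayed pixels, not just a change from interlaced to progressive input handling.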

Ken Ross
January 21st, 2007, 04:15 PM
Curious. I usually have a critical eye and must just be blind to them (NTSC). I'm sensitive to other odd-ball vision things - such as the red LEDs on transmission towers (I can tell if they are strobed or continuous) and a similar trend with cars' brake lights. And, when I now see PAL material, I find the image too flickery. I must have become used to 59.94i vs 50i.

I guess it's kind of like the sensitivity that some people have to 'rainbows' in DLPs. Some people see it and it drives them nuts and others simply don't see them at all. I wish I didn't see all these artifacts...it makes finding the ideal display so much more difficult.

Tom Roper
January 21st, 2007, 05:26 PM
This is a very interesting discussion, many good observations from the perspective of personal experience, but also exactly how wrong characterizations are made about a technology as a whole. I'm always surprised how many people shooting HDV are viewing it only on PC/Mac monitors.

In tests with the most popular HDTV size screens and at normal viewing distances consumers are not seeing the difference between 1080i and 1080P and that's on 1080P screens.

Very few have actually seen the true native 1080p60. 1080p24/25/30 is the norm. It's the same bitrate as 1080i60 or less. Upscaling the frame rate to 1080p60 for newer high resolution progressive displays does not alter the fact.
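On raw pixel rates alone, 1080p24 does carry less data than 1080i60 (a rough sketch; real encoded bitrates depend heavily on the codec and encoder settings):

```python
# Raw luma-sample rates per second, ignoring chroma subsampling and blanking
p24 = 1920 * 1080 * 24   # 1080p24: 24 full frames/sec
i60 = 1920 * 540 * 60    # 1080i60: 60 fields/sec, 540 lines each
p60 = 1920 * 1080 * 60   # true native 1080p60, for comparison

print(p24 / i60)   # 0.8 -> 1080p24 is ~80% of the 1080i60 pixel rate
print(p60 / i60)   # 2.0 -> native 1080p60 would double the 1080i60 rate
```

This is why film-rate "1080p" fits in today's pipes while true 1080p60 does not.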

The other thing that's simply ignored is the quality of the deinterlacer & scaler in the display itself. A high quality display with high quality scalers and deinterlacers will provide a 1920X1080 picture that's almost completely devoid of interlaced artifacts.

While true, you can't control what type of monitor the viewer is using. This makes progressive the safer bet for the largest audience. Personally, I prefer 1080i60 right now for the same reasons as you.

Thomas Smet
January 22nd, 2007, 12:30 AM
But how much of this is really a greed factor? Most consumers right now are fine watching DVD on their HDTVs. They really do look very good if they are done right and are progressive to begin with. Let's face it, most of us are image freaks but the other 98% of the people on the planet are not. In a way it is almost like the SACD thing. Yes it is better but guess what, most humans couldn't give a rat's rump. In fact most people cannot tell the difference and they never will.

To me HD should have been a way to move away from the restrictions we have had with NTSC and boost the quality to look good on a large screen display. 720p can and does do that. Yes, most plasmas are only 1366x768 and most consumers love them to death if they are using them in the correct way.

The way I look at it is that if many consumers are happy with a 480p DVD, then why couldn't 720p be enough of a quality boost to move us into the future? 720p would have been more than enough to impress most consumers. I mean this is sort of like saying a Honda CRV isn't good enough and you have to have a Hummer or you don't really have an SUV.

Of course certain forms of 1080p "could" have more detail but the whole point is that maybe only 5% of the whole world actually has the right stuff to really see the difference. Of that 5%, maybe only a small portion know how to actually see that difference. Most consumers will not compare side by side either, and I bet you if you broadcast something in 1080i on Monday and then broadcast it as 720p on Tuesday they would never tell the difference.

As for cable providers, this is how silly this really is. Ken, you say that 1080i can look great but only with a certain provider. This should tell you right away that 1080i is harder to deal with or it would all look good. The fact is that 1080i is tough to encode and decode with the same level of quality as 720p. Yes, a darn good encoder may be able to pull off some sweet stuff but not every broadcaster can do this. I have no idea how many HD providers there are but to make this simple let's just say there are 10. That means that maybe only 1 in 10 people ever get to see 1080i HD at a decent level of quality.

Resolution is not everything.

I also never said my HD looked bad from Charter. In fact I think it looks very good. Maybe it isn't the best but I have been somewhat happy with the quality so far. My 720p looks perfect. Even if they reduce the bitrate or use a so-so encoder, that kind of makes the whole point here. 720p is easier to keep at a consistent level of quality no matter who broadcasts it or how, and that is what I thought a broadcast standard should be. 1080i can be all over the place and no consumer can ever be sure what they are going to get.

Ken Ross
January 22nd, 2007, 01:15 PM
As for cable providers, this is how silly this really is. Ken, you say that 1080i can look great but only with a certain provider. This should tell you right away that 1080i is harder to deal with or it would all look good. The fact is that 1080i is tough to encode and decode with the same level of quality as 720p. Yes, a darn good encoder may be able to pull off some sweet stuff but not every broadcaster can do this. I have no idea how many HD providers there are but to make this simple let's just say there are 10. That means that maybe only 1 in 10 people ever get to see 1080i HD at a decent level of quality.

Resolution is not everything.


Tom, in discussing Directv, it is a well known fact that they butcher both their HD signal and their SD signal. Trying to fit so many channels into a narrow pipe just doesn't work. This forces them to use huge amounts of compression and to downrez as well as reduce the bitrate. Reducing the bitrate affects both 720p and 1080i. It results in artifacts on both. Sudden changes in lighting, scenes with explosions or fireworks, all result in a pixellization that is independent of 1080i or 720p. They have well deserved the reputation of being the "HD Lite" provider. In fact there are lawsuits under way contending they are not delivering the product promised. I myself have watched them go downhill during the 10+ years I had them. As soon as I had an alternative, I switched.

As we move into the very near future, all displays sold will be 1080p. So like it or not, we're going there. Why not? Yes, I agree with the fact that resolution isn't everything, but if the standard allows 1080p, why not go there? To me it would be foolish not to achieve the highest quality we can within the standard we're working with. But as I say, like it or not, it will happen and is currently happening. In about 3-5 years it will be hard for you to find a 720p display....and IMO that's a good thing.

Thomas Smet
January 22nd, 2007, 03:09 PM
Again it is a greed thing. Yes, 1080p may look better but very few people will ever notice. 720p would have been more than good enough, but 1080i always sounded more impressive to people who didn't really know better. Clearly 1080i is interlaced so it is harder to do, and now the holy grail is 1080p60, which is exactly the same as 720p60 just with a little bit more detail. Yes, perhaps in 3 years there will not be any more 720p displays, but a lot of that has to do with marketing and the fact that it just sounds better. Why not go with it? Because it costs the consumer more and many of them may not even be able to notice the difference. It is just a way to make more money. The technology keeps moving up before people can even get into it.


Regardless of whether DirecTV is bad or not, there are a lot of people that have it. Starting this year they are going to have somewhere between 100 and 150 HD channels. A lot of consumers would rather have the choice of 100 HD channels than be limited to 20 from a cable provider, even if those 20 do look better. It's like VHS vs Beta. Beta was better but VHS won because they had 2 hour tapes. The rule of the consumer world is that they don't always go for what's better but for what is either cheaper or what gives them more for the money. Yes, the bitrate may be lower and it will harm both 1080i and 720p, but the 1080i will suffer more at the reduced bitrate, which means for now a lot of cable providers should have stayed with 720p until the world of 1080p60 was here.

Of course I like 1080p. I also like 720p and find it to be fine, and I enjoy it very much on my 50". Perhaps if I had a larger TV I would have wanted a 1080p display, but like it or not I am in the norm in thinking a 50" TV is big enough for me right now.

Marco Wagner
January 22nd, 2007, 05:53 PM
I now own 4 HDTVs, two of which are identical and the other two different (a 32" LCD and a 42" plasma). 1080i looks best on ALL 4 displays. Putting a 720p set next to another set showing 1080i with the same ball game on, the 720p looks subpar. When a friend brought over some tape shot at 720p AND 1080i, I was amazed at how much better the 1080 looked. He was shooting a local short track race, loads of movement.

720p just doesn't do it for these eyes. It just seems like a hyper 480...

1080i gives that "ah, now that looks nice!" feel to it. I really don't think there'd be a huge visually noticeable difference going to 1080p from i. Sounds like the "You must buy a digital TV" ploy from a couple years ago. The picture never got better on a digital vs. the 5 year old POS I had at that time. In fact it got worse in some cases, but that is a different thread.


my $.02

David Heath
January 22nd, 2007, 06:24 PM
It's like VHS vs Beta. Beta was better but VHS won because they had 2 hour tapes. The rule of the consumer world is that they don't always go for what's better but for what is either cheaper or what gives them more for the money.
Sorry, but Beta wasn't better - at least not per se. The truth behind the myth is that Beta machines were always high quality and expensive, and came from few manufacturers; VHS machines came in a variety of qualities and prices from a variety of manufacturers - JVC licensed the rights.

A friend and I had the chance to test two comparably priced (and expensive) Beta and VHS machines in a lab in the early days, and we were both pretty surprised how comparable they were for picture quality. (Though the Beta machine didn't seem to have as good a drop out compensator.) Subsequently, I saw some pretty ropey VHS machines - but they were far cheaper than any Beta machine on the market.

You have to compare like with like. There was very little difference between VHS and Beta as formats, though I seem to recall VHS tended to be generally ahead with enhancements like hi-fi sound and long play.

John Miller
January 22nd, 2007, 06:48 PM
You have to compare like with like. There was very little difference between VHS and Beta as formats, though I seem to recall VHS tended to be generally ahead with enhancements like hi-fi sound and long play.

Sony was ahead of the game with HiFi audio on Betamax - JVC followed more than a year later, copying the same technique - depth multiplexing.

http://en.wikipedia.org/wiki/Betamax suggests that Betamax offered many "firsts" ahead of JVC.

Ken Hodson
January 22nd, 2007, 07:02 PM
When a friend brought over some tape shot at 720p AND 1080i I was amazed at how much better the 1080 looked. He was also shooting a local short track race, loads of movement.

Just a question for your observations: what was the cam that shot the dual 1080i and 720p footage? And being that there was loads of movement and you were more impressed with an interlaced image, I am guessing the 720p wasn't shot at 60p?

Ken Ross
January 22nd, 2007, 07:03 PM
Again it is a greed thing. Yes, 1080p may look better but very few people will ever notice. 720p would have been more than good enough, but 1080i always sounded more impressive to people who didn't really know better. Clearly 1080i is interlaced so it is harder to do, and now the holy grail is 1080p60, which is exactly the same as 720p60 just with a little bit more detail. Yes, perhaps in 3 years there will not be any more 720p displays, but a lot of that has to do with marketing and the fact that it just sounds better. Why not go with it? Because it costs the consumer more and many of them may not even be able to notice the difference. It is just a way to make more money. The technology keeps moving up before people can even get into it.


Regardless of whether DirecTV is bad or not, there are a lot of people that have it. Starting this year they are going to have somewhere between 100 and 150 HD channels. A lot of consumers would rather have the choice of 100 HD channels than be limited to 20 from a cable provider, even if those 20 do look better. It's like VHS vs Beta. Beta was better but VHS won because they had 2 hour tapes. The rule of the consumer world is that they don't always go for what's better but for what is either cheaper or what gives them more for the money. Yes, the bitrate may be lower and it will harm both 1080i and 720p, but the 1080i will suffer more at the reduced bitrate, which means for now a lot of cable providers should have stayed with 720p until the world of 1080p60 was here.

Of course I like 1080p. I also like 720p and find it to be fine, and I enjoy it very much on my 50". Perhaps if I had a larger TV I would have wanted a 1080p display, but like it or not I am in the norm in thinking a 50" TV is big enough for me right now.

I think you will find that providers such as Verizon with FIOS will begin to put pressure on other providers for enhanced quality. If FIOS can do it, it can be done. Yes, they started with a brand new infrastructure, but the others will have to catch up as the marketing becomes more intense. Already FIOS is saying they provide the best picture....and they do. Wherever FIOS has shown up, there has been mass defection from Directv HD subs. I don't believe we should cater to the lowest common denominator. I don't believe in 'good enough'. Why not go for the best? I CAN see the difference with the additional detail of 1080i, and 1080p will only be better. I CAN see the difference in the gorgeous transfers on many HD DVD movies. So yes, many out there in Joe Six Pack land couldn't care less, but that doesn't mean the industry should cater to this lowest common denominator.

If we did, we'd still be driving Model Ts.

Ken Ross
January 22nd, 2007, 07:05 PM
I now own 4 HDTVs, two of which are identical and the other two different (a 32" LCD and a 42" plasma). 1080i looks best on ALL 4 displays. Putting a 720p set next to another set showing 1080i with the same ball game on, the 720p looks subpar. When a friend brought over some tape shot at 720p AND 1080i, I was amazed at how much better the 1080 looked. He was shooting a local short track race, loads of movement.

720p just doesn't do it for these eyes. It just seems like a hyper 480...

1080i gives that "ah, now that looks nice!" feel to it. I really don't think there'd be a huge visually noticeable difference going to 1080p from i. Sounds like the "You must buy a digital TV" ploy from a couple years ago. The picture never got better on a digital vs. the 5 year old POS I had at that time. In fact it got worse in some cases, but that is a different thread.


my $.02

Couldn't agree more Marco and this is the common refrain from true videophiles. AVS forum is filled with people who agree that 1080i just provides more of the window effect that we all strive for. 720p is nice, but it just doesn't give quite that same degree of detail that HD is all about. And I agree that the difference going from 720p to 1080i is probably greater than going from 1080i to 1080p.

David Heath
January 23rd, 2007, 04:29 AM
http://en.wikipedia.org/wiki/Betamax suggests that Betamax offered many "firsts" ahead of JVC.
An interesting article - it does say that there were differences between NTSC and PAL variants; most notably, I never remember the one-hour limit applying to Beta here. Maybe there was a significant quality VHS/Beta difference in NTSC machines (at the expense of a time limitation for Beta), which never applied with PAL? I first remember VHS machines appearing in the UK late 1978/early 1979, so a couple of years after the NTSC machines came out, and the comparison we did must have been approximately a year later.

I do remember a service engineer telling me that a big problem with a particular range of Beta machines was a stock fault with the power supply. They weren't any more or less reliable overall, I believe, but they tended to fail at the time of max current - in FF or Rewind. Since the tape wound laced up (unlike VHS), and access to the power supply was via the tape deck, repair inevitably meant destruction of the tape - a big problem at the time as rental tapes were vastly more expensive in real terms than now. In the end his shop just stopped stocking Betamax machines.

Marco Wagner
January 23rd, 2007, 06:26 PM
Just a question for your observations: what was the cam that shot the dual 1080i and 720p footage? And being that there was loads of movement and you were more impressed with an interlaced image, I am guessing the 720p wasn't shot at 60p?


Z1U for the 1080 and I forget what he said the other tape was shot on but he did say it was progressive 60. I want to say Panny, but I seriously forget.

Ken Hodson
January 23rd, 2007, 11:09 PM
Well that was my point essentially. Two cams will look different, and being 60p as you say, the Panny is a good guess. It also happens to be the softest HD cam. Observations are good, but un-referenced comments of apples to oranges comparisons don't give credibility to the "i" vs. "p" argument.

Steven White
January 24th, 2007, 09:36 AM
Here's my 2 cents:

1080p will win. 1080i is interim, and 720p already lost. Why?

First off, if you go into a big box store, any screen that has a 1080p resolution has it on a big fancy-pants sticker. "Full HD 1080" or some other such gimmick. If it doesn't have it, you ask why. Eventually, all the screens are going to be 1080p. Why bother with a 720p screen?

Next up, 1080p24 is better than 720p24 hands down. I have a 17" 1920x1200 computer monitor on which I watch HD content. I can tell the difference on a screen this small between 1280x720 and 1920x1080. I can see the softness in 720p, the downsampling errors, etc. In 1080p resolution I can see the grain of the film, how well it was focussed - pretty much all the imperfections of the flick. I like that. All HD-DVD and Blu-Ray content produced by major studios is going to come out as 1080p24, and most consumers will want TVs that can play it properly. (Bring on 120 Hz displays!)

The only application where 720p reigns supreme is in 720p60 content. And here it's got to fight 1080i. Well, 1080i can be adaptively deinterlaced to have comparable vertical resolution to 720p. What makes this worse is that the primary source of 60i/p content is sports broadcasters... but there aren't any real quality controls in the broadcast industry. There are different encoders, different scalers, different source cameras... Not to mention the broadcaster can bit-starve each HD channel in order to have more channels. Invariably the image broadcast is lower quality than it could be.

I wouldn't buy a TV or choose a format based upon how well it can handle badly handled data... I'd base it on how it can excel. If bandwidth goes up, it may pave the way for 1080p60... at which point 720p will be a distant memory.

-Steve
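The adaptive-deinterlacing point above can be sketched minimally: a deinterlacer chooses per pixel between "weave" (full vertical detail, but combing on motion) and "bob" (no combing, but only one field's worth of detail). A toy NumPy illustration of those two building blocks, not a broadcast-grade algorithm:

```python
import numpy as np

def weave(top_field, bottom_field):
    """Interleave two 540-line fields into a 1080-line frame.
    Perfect for static scenes; moving edges show 'combing'."""
    h, w = top_field.shape
    frame = np.empty((h * 2, w), dtype=top_field.dtype)
    frame[0::2] = top_field      # odd/even line interleave
    frame[1::2] = bottom_field
    return frame

def bob(field):
    """Line-double one field: no combing, but only 540 lines of real detail."""
    return np.repeat(field, 2, axis=0)

# Two 540-line fields from a notional 1080i frame
top, bottom = np.random.rand(2, 540, 1920).astype(np.float32)

print(weave(top, bottom).shape)   # (1080, 1920)
print(bob(top).shape)             # (1080, 1920)
```

An adaptive deinterlacer blends between these two per pixel based on a motion estimate, which is why deinterlaced 1080i lands somewhere between 540 and 1080 lines of effective vertical resolution depending on how much the picture is moving.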

Tom Roper
January 24th, 2007, 01:46 PM
Ken makes great points as always yet I've never heard anyone say it better than Steve just did, that you buy it for what the format can be at its best, not to avoid what it is at its worst.

Peter Malcolm
January 24th, 2007, 02:12 PM
I have a 17" 1920x1200 computer monitor on which I watch HD content.
But... 17" computer monitors can't display 1920x1200. It's unheard of for a 22" monitor to get that high, unless you're speaking in terms of laptops.

Anyways, televisions with 2160p are on the way. Quad Full High Definition. Be very afraid ;)