View Full Version : Mjpg 24p 1280x720


Anna Uio
June 17th, 2009, 09:35 AM
Hi all,

Anyone know if one of the GH1 modes is Mjpg 24p 1280x720? Is it pulldown? How's the quality, i.e. how bad are the compression artifacts?

Thanks,
Anna

Ian G. Thompson
June 17th, 2009, 09:56 AM
The MJPEG is actually 30p @ 30Mbps. It does not look bad at all to me. This cam's 24p uses AVCHD @ 17Mbps, but the image has lots of issues with artifacts. There seem to be some workarounds, though. The cam also has 60p @ 17Mbps, which is more robust than its 24p mode, but you take a hit in resolution.

Anna Uio
June 17th, 2009, 11:28 AM
Strange... I wonder why they have Mjpg 30p and not also 24p. The bitrate would be lower, or they could up the quality. Seems like a feature that would appeal to a lot of people.

Cheers,
Anna

Steve Mullen
June 18th, 2009, 08:14 PM
The MJPEG is actually 30p @ 30Mbps. It does not look bad at all to me.

In the prototype, the MJPEG looked much better than the AVCHD clips. It also edits very easily. The 30.0fps rate shouldn't make a difference unless one took long shots.

PS: I'm not sure there are ways to improve the 24p video unless Pana makes a firmware change. If it's the case that they limited the bit-rate on 24p to half that of 60p -- that seems like a bug. Unless the AVCHD encoder's compression limit is somehow frame-size dependent.

The "hit" in resolution depends on your point of view. 720p60 has half the spatial resolution of 1080p24 but over twice the temporal resolution. So if you ask if there is a QUALITY hit -- the answer is NO. You simply trade one type of "resolution" for another.

Ken Ross
June 20th, 2009, 08:33 AM
I've found, judging from broadcast HD, that I much prefer 1080i to 720p. Since most displays are now progressive in nature, the 1080i gets converted to 1080p anyway. Yes, on occasion, you may still see some minor artifacts with 1080i, but the greatly increased resolution of 1920X1080 vs 1280X720 more than makes up for that in my opinion. If you view on a large screen HDTV, the differences can be very significant. If you have a smaller screen, or one that is not 1080p, the differences will be less significant.

Keep in mind too that over 95% of material you see on TV is static in nature, or nearly so, thus the advantages of 720p's progressive nature are relatively minor.

As an aside, I should have my GH1 sometime next week. I'm looking forward to this camera!

Steve Mullen
June 20th, 2009, 04:11 PM
Since most displays are now progressive in nature, the 1080i gets converted to 1080p anyway.

With a 1920x1080 DLP or many plasma HDTVs you do indeed get 1920x1080p. However, with all but a couple of LCD HDTVs -- you SEE only 1920x540 (540p) when viewing 1080i. Because the vast majority of HDTVs are LCDs -- "most" people see a more balanced resolution when viewing 720p, since they SEE 1280x720. In fact, they SEE roughly the same 1Mpixels no matter which format they view, because 1280x720 and 1920x540 carry about the same amount of information.

Thus, the idea that there is a difference in favor of 1080i is pure marketing. With the push toward very flat HDTVs, DLPs are now rare. And plasma is all but dead. That leaves LCDs -- and it's the cheap LCDs that are selling the most units. So it's fair to say that within a year only a few will have HDTVs that can show 1920x1080p.

Unfortunately, to SEE the extra 1 million pixels from 1080i they will get all the traditional interlace nasties: line-twitter and flicker, interlace artifacts, and far less clear motion. Bottom line -- 1Mpixels is all most people will ever see. Shooting and recording 2Mpixels makes folks feel better -- due to marketing -- but in the end 1Mpixels is all that's needed for an audience.

The horrible interlace artifacts are why the EBU took the stand that Europe should move to progressive as fast as possible. Sony is already shipping 1080p60 cameras and switching gear to meet the needs of Europe. This will require a distribution shift to H.264 and, given our economy, it could be a decade before we in the USA get 1080p60. (1080p24 is far simpler and can be done now.)

PS 1: The only LCDs that can show 1920x1080 are 2 models from Samsung, and they are 120Hz units. In other words, given HOW LCDs work -- an LCD must at minimum run at 120Hz. However, how do you get 120fps from 30fps or 60fps? Therein lies the rub. The process of upscaling MOTION can introduce artifacts. Getting a clean 120fps from 60p is much simpler than from 60i.

PS 2: All this favors DLP and a few high-end plasmas.

Ken Ross
June 20th, 2009, 04:21 PM
Steve, I couldn't disagree with you more and I've had both types of displays (720p, 1080p, LCD & plasma). Tested results from totally independent testing magazines consistently show a far higher resolution with 1920X1080. This is not even in dispute. There is just no question that the measured resolution and picture sharpness is far higher with 1920X1080 vs 1280X720p. Watch any good broadcast in 1080i vs 720p and you'll see much more detail in the 1080i broadcast.

My Pioneer Kuro 60" shows FAR more detail than any of my prior 768p Fujitsus ever could. The Fujitsu was considered one of the best back then and the Pioneer is considered the best HDTV ever made.

As far as artifacts are concerned, I suggest you look at a high quality plasma like a Pioneer. You'll be hard pressed to see ANY artifacts, regardless of whether it's 720p or 1080i broadcasts.

I don't think you've seen modern displays if you still think that 1080i is ridden with artifacts. It simply isn't so.

I also have no idea where you're getting your info regarding only 2 LCDs capable of 1920X1080...there are FAR more than that. My goodness, Sony, Samsung, Toshiba, lower end brands....many out there capable of that resolution.

Steve Mullen
June 20th, 2009, 07:46 PM
Steve, I couldn't disagree with you more and I've had both types of displays (720p, 1080p, LCD & plasma). Tested results from totally independent testing magazines consistently show a far higher resolution with 1920X1080. This is not even in dispute. There is just no question that the measured resolution and picture sharpness is far...

Exactly the reverse -- truly independent testing for the past 3 years has shown the deinterlacer (used only for 1080i) in the MAJORITY of HDTVs fails to deliver more than 330- to 580-lines of VIDEO INFORMATION. That's because "bob" (line-doubling) is used.

http://www.hdguru.com/will-you-see-all-the-hdtv-resolution-you-expected-125-2008-model-test-results-hd-guru-exclusive/287/

Moreover every scientific test has shown greater perceived resolution from 720p over 1080i. These tests are why the EBU voted against interlace no matter the resolution. Moreover, in the ATSC meetings every computer company voted against interlace. See this Broadcast Engineering story:

Restricted Content - Has HD officially arrived in Europe? (http://login.broadcastengineering.com/wall.aspx?ERIGHTS_TARGET=http%3A%2F%2Fbroadcastengineering.com%2Fmag%2Fbroadcasting_hd_officially_arrived%2Findex.html)


Interlace delivers an inferior picture: artifacts and when displayed, no greater REAL resolution. It is a form of compression that was needed because MPEG-4 was not yet available. There is NO support for interlace anywhere except from those who believe they see more because the marketing NUMBERS are bigger. For example, "The rule seems to be that big numbers must be good, because they impress the public. In the move to HD, many networks chose 1080i over 720p, even though careful tests from bodies like the EBU proved that 720p pictures looked better, plus they had the advantage that compression of progressive scanning is more efficient."

http://broadcastengineering.com/viewpoint/count-pixels/index.html

And, the reason is simple -- even though fields (60i) are converted to frames (60p) by the deinterlacer (a process that often does NOT work well) the VIDEO INFORMATION in each 1920x1080 frame is 1 Mpixel -- exactly the same as from 720p60 which doesn't need to be deinterlaced. Thus, every second, 60 HALF pictures are shown from 1080i while 60 FULL pictures are shown from 720p. Obviously, the eye sees P as having more resolution than I.

The only exceptions are found with certain high-end plasmas (your Kuro -- no longer made, because only a few would buy them), the 150-inch Panasonic plasma, and a few large Samsung plasmas (which may be discontinued by now).

These few sets use an adaptive deinterlacer which -- ONLY for static pictures -- delivers nearly all 1080-lines using "weave." They use "bob" on those areas with motion. So moving objects have HALF vertical resolution. In theory, because the eye sees objects in motion as blurred, the loss of vertical resolution isn't too noticeable. But clearly an image made up of some objects with FULL vertical resolution and some objects with HALF vertical resolution has less QUALITY than a solid 1280x720 image.

Moreover, adaptive deinterlacers need to gracefully switch between bob and weave AND need to switch between interlace and progressive (film). This is nearly impossible to do without switching artifacts.
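For readers who haven't met the two terms, here is a minimal sketch of "bob" versus "weave" on a frame stored as a list of rows (illustrative only; real deinterlacers interpolate and filter rather than naively copying lines):

```python
# Minimal bob vs. weave deinterlacing; a frame is a list of rows.
# Illustrative only -- real hardware interpolates, it doesn't just copy.

def weave(top_field, bottom_field):
    """Interleave two fields into one full frame. Full vertical
    resolution, but it combs on motion because the two fields were
    captured 1/60 second apart."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # even line, from the top field
        frame.append(bottom_row)  # odd line, from the bottom field
    return frame

def bob(field):
    """Line-double a single field into a full frame. No combing,
    but only half the lines (540 of 1080) carry real information."""
    frame = []
    for row in field:
        frame.append(row)  # real line
        frame.append(row)  # duplicate fills the missing line
    return frame
```

A 1080i field is 1920x540, so bob() outputs 1080 rows but only 540 rows of actual video information -- the source of the low line counts discussed above.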

Bottom-line -- it is irrelevant what you claim you see. Industry experts differ strongly. The technology of interlace inherently does NOT deliver high-quality video. At the camera there is a 25% loss of vertical resolution from the sensor. This low-pass filtering is necessary to TRY to prevent line-twitter. However, watch a vertical pan of the stands in a game and you'll see the attempt does not work. Twitter is horrible. Now watch ESPN or FOX. No ugly twitter!

Then, unless you own one of a few "good" HDTVs, the deinterlacer cuts the ALREADY LOWERED BY 25% vertical resolution by HALF. And, even if you own one of the few better HDTVs, objects in motion lose HALF vertical resolution.

LCDs have a different problem. Only ONE 120Hz Samsung delivers near 1080-lines of motion VIDEO RESOLUTION. All other 120Hz LCDs on the market failed to deliver more than about 580-lines, and 60Hz LCDs generally delivered only 330-lines. Yes -- only 330-lines.

This situation may be because Samsung put an adaptive deinterlacer only in their top-of-the-line 120Hz HDTV. Or it may be the way LCDs write to a panel -- only half the rows get NEW information with each refresh. Thus, doubling the number of refreshes doubles the number of lines that can be presented. If it is the former -- then 720p will be perfect. If it is the latter -- then both 1080i and 720p will be cut to 540p60 unless one buys one of the top-of-the-line 120Hz Samsungs. (I'm investigating which reason is true.)

My Broadcast Engineering story at:
http://files.me.com/dvcinlv/75cegw

And, my story in Broadcast Engineering at:
http://broadcastengineering.com/test_measurement/hd-monitors/index.html

Ken Ross
June 20th, 2009, 09:42 PM
Gary Merson's testing was done almost two years ago. I know Gary, he lives near me and had done an ISF on an old Panasonic CRT HDTV I had owned.

What you may not be aware of is that Gary did a subsequent test much later and found that the state of deinterlacing had MUCH improved in almost ALL HDTVs he tested. Modern deinterlacers do not achieve their end result by simply bobbing. In fact, Gary's later testing found far greater resolution even for motion testing. Those lower-rez days are largely over. Horizontal and vertical resolution is FAR FAR higher than what you quoted for a good quality 1080p HDTV. Your numbers make very little sense, Steve, since they're almost the same as those achieved for SDTV. Now please please, don't tell me I'm seeing very little more resolution with my HDTV on 1080i than I did with my old standard-def TV. C'mon now, there isn't a person who's seen any HDTV I've ever owned (or any HDTV anyone ever owned) that wouldn't laugh out loud at that comment.

The Pioneer Kuro has been measured significantly higher with a 1080i source, as have many other HDTVs Gary has tested. In fact, resolution down to the pixel level has been achieved...the FULL 1920X1080. I am most assuredly not alone in seeing much greater detail in 1080i broadcasts as opposed to 720p. The eyes see it and the measurements prove it. Again, the vast majority of broadcasting is static or nearly static (yes, even sports) in nature, and so even with TVs that don't do a great job of deinterlacing, artifacts are not significant.

720p requires much less bandwidth and it was that reason that some broadcasters chose it. However, more chose to go with 1080i as opposed to 720p.

And Steve, of course computer companies voted against interlace, it's contrary to their best interests! They live in a progressive world so to speak. :)

Steve Mullen
June 21st, 2009, 03:18 AM
Gary Merson's testing was done almost two years ago. I know Gary, he lives near me and had done an ISF on an old Panasonic CRT HDTV I had owned.

What you may not be aware of is that Gary did a subsequent test much later and found that the state of deinterlacing had MUCH improved in almost...

The Pioneer Kuro has been measured significantly higher with a 1080i source, as have many other HDTVs Gary has tested...

Gary's last test was published on 2008 models, and if you read my link you know it reports that deinterlacing did get better from 2007. IF 2009 models test better, we'll know when he publishes those tests. In any case, let's assume the LCDs do improve, because they have been forced to move from 60Hz to 120Hz to solve motion problems. With only 330- to 580-lines of rez in 2008 -- they'll have to improve a LOT to really display anything near 1080-lines.

You are also ignoring the EBU studies showing that 720p60 delivers a BETTER picture than does 1080i. And you are ignoring that 1080-line sensors running at 1080i are filtered to about 810-lines. There are NO full-resolution 1080i sources other than test instruments. That's why in the real world you can't see the difference between 720-lines and 810-lines. In fact, unless your deinterlacer can deliver 90% or more of the vertical resolution on a 1080i test signal -- a number almost impossible to achieve -- monitors fed 1080i will display LESS than 720-lines.

So despite the claims of FullHD -- real-world video programming on 99% of HDTVs will have vertical resolution no better than 720p. And in the case of all but a very few camcorders, the CCDs and CMOS chips can NOT deliver even 1000-lines, let alone 2000-lines, of horizontal resolution. So, other than $50K to $100K video cameras -- nothing records anything near FullHD. Most camcorders are hard pressed to record 1280x720.

Only film transfers on BD are really FullHD -- and these are PROGRESSIVE. But, that's not the topic of this GH1 thread.

Lastly, your Kuro did reach 900-lines under motion, and if you read my BE stories you know it is the ONLY HD monitor I recommend. So I fully agree your Kuro delivers more resolution than almost any other HDTV. Unfortunately, none of these plasmas are being built anymore. Folks couldn't see enough difference to pay their high price.

So, now only the 150-inch Panasonic and one Samsung reaches this level of performance. How many folks have 150-inch plasmas at home?

ESPN, FOX, ABC, and NASA did not choose 720p without good reasons. The EBU did not recommend 720p50 without good reasons. In fact, I know of no study that has ever recommended 1080i. CBS and NBC chose it because the BIG NUMBERS allowed their marketing to claim it was the best. FullHD is a marketing tool. Like Red's "4K" -- which ain't 4K as the term is used by the rest of the industry.

PS: when we get 1080p60 which will neither be filtered before recording nor deinterlaced before display, there really will be 2Mpixels of information to be seen. Behind closed doors I've seen Sony 1080p50 shot in Europe for IBC and presented on an OLED monitor. This truly will blow away 720p50 -- which is exactly why Sony is aiming it at Europe. They are saying skip 720p50 and go directly to 1080p50.

Ken Ross
June 21st, 2009, 09:29 AM
I think it's a bit arbitrary to say that CBS, NBC, PBS, CNN, HBO, Showtime etc. etc. went to 1080i because of 'marketing'. You mean to tell me that these guys somehow fell prey to marketing, and ABC, FOX (lesser networks with less capital to work with) and ESPN were the only 'honest' guys? Not buying it, Steve, nor do my eyes or my logic.

As I said before, much of your information was outdated, including your belief that only ONE manufacturer, Samsung, was providing 1920X1080 displays. There have been many more manufacturers than that providing full rez displays for several years. And the limitation has nothing whatever to do with refresh rates, it has to do with pixel density. Higher refresh rates in LCDs are being used to provide a 'smoother' image with motion and get around motion lag which has been an issue with LCDs for many years.

Hell, I could clearly see a better, more fully resolved picture on 1080i vs 720p with my 768p Fujitsu plasmas! It wasn't just the 1080p Pioneer that showed me the picture was better, although the Pioneer certainly magnified the differences tremendously. The Pioneer simply demonstrated more vividly that the 1080i SIGNAL was superior to the 720p. Better displays will show the differences to a greater degree, but it clearly shows there are differences that are far from subtle. Good deinterlacing is no longer the sole property of high end displays, many mid displays have very good deinterlacing.

Good displays show it clearly and no advertising, marketing hype or outdated papers with outdated findings can change that. Japan also chose this resolution and I don't think they're prone to this 'marketing hype' either. This is not to say that 720p can't display a beautiful picture, it can. But there is simply no getting away from the fact that there is FAR more detail in a 1920X1080i signal than 1280X720p...unless the math I grew up with has changed...and that's not change I can believe in. The 1080i signal requires more bandwidth for a good reason: it carries more information! I've also read that one of the reasons NASA chose 720p was bandwidth constraints. And if you think about it, it makes perfect sense. For NASA, bandwidth IS a critical issue.

There have been tons of people who see the same thing, and they're not the least bit swayed by 'marketing hype'. In fact you hear almost no marketing hype at all from any network anymore advertising 'our 1080i is better than their 720p'. They don't have to, it shows. Honestly, for the life of me, I have no idea how anyone could not see the difference unless they're watching on a small HDTV or a pretty poor one. Virtually everyone I know with an HDTV has commented on it at one time or another. Some are not even aware of why it is and have no idea of the underlying resolution differences.

The same holds true for camcorders. There is an obvious increase in detail going from a 720p camcorder to a 1080i unit. Yes, neither will resolve a full-rez signal due to other limitations, but you'd have to be nearly blind not to see the greater detail in the 1080i signal, which will generally have both a higher horizontal and vertical component in detail.

I should also point out at this time that some of the 'artifacts' people complain about with broadcast HDTV have nothing to do with the nature of the signal, but rather with multicasting! Multicasting has been one of the true detriments to high-quality broadcast HDTV. When a broadcaster decides to multicast on 4 or more different channels, it plays total havoc with the available bandwidth for the main HD channel. This has nothing to do with 'i' or 'p'. But I've seen many people blame the format for what is really the result of multicasting.

Steve, again, you need to appreciate the fact that 95% of all broadcast material is static or nearly static in nature. A fact the computer guys don't like to mention, since they will forever push for everyone to conform to their progressive world. That fact alone virtually eliminates any of the old 'artifact arguments'. Several years ago I had a conversation with one of CBS' top engineers and he mentioned to me how misunderstood this concept was. He emphasized how even with sports, the vast majority of the programming is nearly static in nature.

Emmanuel Plakiotis
June 21st, 2009, 06:29 PM
Steve,

I can confirm that while at IBC two years ago, I read in the free IBC magazine the EBU's finding -- based on real-life tests on consumer sets -- that 720p looked better than 1080i.

Your argument is also supported by the fact that even in their latest camcorder (the 800), Sony halved the vertical resolution of the slow motion (1920X540 at 60P). It was not a hardware limitation. They knew that nobody would see the extra lines on their TVs, and at the same time they protected their digital cinema gear.

In the same manner, NTSC SDTV is effectively only 240p, and that illustrates the real differences between the resolution of the formats. It also explains why 480p (especially uncompressed) looked so good compared to HDTV (even when transferred to film).

Ken Ross
June 21st, 2009, 06:49 PM
As I've said, processing has improved considerably in HDTVs. The resolution is NOT half when displaying & viewing 1080i. Test after test after test has proved this. It's time we stop living in the past, folks. If you believe 720p looks better to you, then watch 720p. But I've found most diehards for 720p are diehard computer people. That industry lives in the world of 'progressive' and wants everyone else there. I'd more than welcome 1080p for all sources, but until that happens (and it won't happen for a long time), I'll gladly take 1920X1080.

But there are tons of people out there that have seen both and agree there is significantly more detail in 1080i. I would have to be blind when watching my 60" Pioneer Kuro to not see the additional detail.

Please also keep in mind that when viewing MOVIES, this entire argument is meaningless. You will get the full 1920X1080 (or whatever the movie's actual telecined resolution was) when viewing a BD movie. If 720p were so wonderful, then movies too would have been mastered at 720p. That is most certainly not the case. If 720p were so wonderful, we'd see HIGHER priced 720p camcorders relative to their 1080i counterparts. This is not the case, and the professional broadcast arena's highest-priced models are 1080.

As for 480p, yes it can look good if the upconversion is done well. But one only needs to compare that same movie to a BD release to see the huge difference. If you can't see that difference and see it easily, it's time to get a new HDTV or, at the very least, a larger HDTV. I just don't know what else I can say, but it seems that many people here are not seeing the full glory our HDTV standard has given us...and that's a pity.

At any rate, I'm surely looking forward to the arrival of my GH1...I don't know how we got off on this tangent. Fortunately for everyone, the Panny shoots in both modes. However, it looks like there is insufficient bandwidth for smooth motion in the 1080p mode according to some. We shall see.

Steve Mullen
June 21st, 2009, 07:21 PM
As I said before, much of your information was outdated, including your belief that only ONE manufacturer, Samsung, was providing 1920X1080 displays. There have been many more manufacturers than that providing full rez displays for several years.

Ahhh. I see why you are so confused. You are following the "logic" of most consumers who believe that what counts is the number of pixels. This is why companies adopted the FullHD marketing approach. Make panels with 2 million pixels and call it FullHD. (The same marketing is used with cameras: make cameras with 2 million pixels and call them FullHD.)

What engineers know, and what Gary blew the whistle on, is that it's the amount of INFORMATION that's fed to the panel (and recorded from the sensors) that is the real limiter of what you see.

Since I have enough knowledge on this topic to have my information published in the highest-level video engineering magazine worldwide, aren't you a little over your head in this discussion? Since BE publishes my information to educate video engineers, what are the chances that, were I wrong on any of this, I wouldn't receive instant feedback from people like Larry Thorpe ("inventor" of CineAlta while at Sony)? Who, by the way, not only writes for BE -- he reads my stuff.

You have the links by which you could actually educate yourself. As long as you want to talk about what you see and believe -- this exchange is a waste of time.

PS1: BACK TO THE GH1. The reason folks report seeing more resolution from 1080p24 than from 720p60 is that 24p is recorded with 2-3 pulldown. The virtue of pulldown is that once a GOOD deinterlacer (software or hardware) detects pulldown -- it switches from video to film deinterlacing. It can now use the cadence to reconstruct perfect 24p frames. These frames have ALL the information in the original 24fps frames -- unlike interlace video frames, where the information can NEVER be fully reconstructed.
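Here is a rough sketch of what that cadence reconstruction amounts to (the A/B/C/D frame labels are the usual telecine convention; cadence detection -- the hard part -- is assumed to have already happened):

```python
# Sketch: 2-3 pulldown and its inverse, with frames as labels.
# Real inverse telecine works on fields with parity and motion checks;
# simple equality is enough for this labeled toy example.

def telecine_23(frames):
    """Spread four 24p frames (A B C D) across ten 60i fields: 2,3,2,3."""
    fields = []
    for frame, repeats in zip(frames, [2, 3, 2, 3]):
        fields.extend([frame] * repeats)
    return fields

def inverse_telecine(fields):
    """Collapse the repeated fields back into the original frames.
    Every field of a film frame came from the same instant, so the
    rebuilt 24p frames are perfect -- unlike true 60i video."""
    frames = []
    for field in fields:
        if not frames or frames[-1] != field:
            frames.append(field)
    return frames

fields = telecine_23(["A", "B", "C", "D"])
print(fields)                    # ['A','A','B','B','B','C','C','D','D','D']
print(inverse_telecine(fields))  # ['A','B','C','D'] -- perfect 24p again
```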

These 24fps frames are then repeated in a 2-3 pattern to yield 60p, or repeated 3 times in a Kuro (72Hz), or repeated 5 times for a 120Hz LCD, or repeated 10 times for a 240Hz LCD.

The Kuro was the only monitor that could CORRECTLY present film without 2-3 pulldown, because 3X is the highest acceptable repeat rate for film. Once 5X or 10X is used, "film" looks like "video." And 60p viewers get 2-3 pulldown, which is not what you see in a theater.

PS2: 720p passes through deinterlacers and only needs to be scaled up to 1920x1080. And if the video is 720p25 or 720p30, each frame is repeated twice to generate 50p or 60p -- just like when film is projected.

720p25 and 720p30 are the only video formats that, when viewed on a flat-panel monitor, create the same experience as watching film in a theater! In fact, with a 120Hz LCD that displays a black frame after each video frame (MOST DO NOT DO THIS) you'll see something similar to what a projector throws onto a movie screen! (frame > black > frame > black)

1 2 3 4
1b1b2b2b3b3b4b4b
OR
11bb22bb33bb44bb
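
The same two patterns, generated (a toy sketch; "b" stands for a black 120Hz slot, and which pattern a given set uses depends on the panel):

```python
# Toy generator for the two 120Hz black-frame-insertion patterns above.
# Each 30p source frame gets four 120Hz display slots; 'b' = black frame.

def frame_black_alternating(frames, slots_per_frame=4):
    """frame > black > frame > black -- like a projector's shutter."""
    out = []
    for f in frames:
        out.extend([f, "b"] * (slots_per_frame // 2))
    return "".join(out)

def frame_then_black(frames, slots_per_frame=4):
    """Hold the frame for half the slots, then black for the rest."""
    out = []
    for f in frames:
        half = slots_per_frame // 2
        out.extend([f] * half + ["b"] * half)
    return "".join(out)

frames = ["1", "2", "3", "4"]
print(frame_black_alternating(frames))  # 1b1b2b2b3b3b4b4b
print(frame_then_black(frames))         # 11bb22bb33bb44bb
```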


PS3: This is why folks who do NOT like the look of film hate 720p25 and 720p30. The motion judder is that of film. Which is why they want to shoot 720p50 and 720p60.
CONVERSELY, if you really want the look of film projected in a theater, you should shoot 720p25 or 720p30 -- not 1080p24.

Khoi Pham
June 21st, 2009, 08:17 PM
All I can say is 1080i has much more details and more wow factor than any 720P material I've ever looked at, don't care what numbers you come up with Steve. (-:

Steve Mullen
June 21st, 2009, 11:45 PM
All I can say is 1080i has much more details and more wow factor than any 720P material I've ever looked at, don't care what numbers you come up with Steve. (-:
It ain't "numbers." It's an understanding of HOW things work. Plus, measurements to confirm or refute what you see. Plus, let's face facts -- there are indeed a very few people in any field who have "trained" eye/ears. These few have eyes/ears that are accepted as "golden" and so their opinions are accepted. These people are able to publish their opinions and get paid for them. Do you really believe you are in this elite group?

If you are convinced in your limited personal experience that you see more "wow" in CBS over ESPN then that's your "reality."

However, the fact people believe in all kinds of gods and herbs (etc.) without one shred of scientific proof only shows how faulty personal beliefs can be and why test and measurement based upon sound science is the only way to get past "opinion" to "facts."

Would you fly in a jumbo jet assembled by a group that based its design upon their own untested "opinions"? There's a reason why, when it comes to life and death, we turn to engineers who aren't afraid to TEST their beliefs and use MEASUREMENTS. When your life depends on it, suddenly "numbers" becomes a requirement. Video is no different, except it's not life and death.

Ken Ross
June 22nd, 2009, 04:59 AM
All I can say is 1080i has much more details and more wow factor than any 720P material I've ever looked at, don't care what numbers you come up with Steve. (-:

Khoi, this is what amazes me. We are led to believe that we are not really seeing what we're seeing! We are told "As long as you want to talk about what you see and believe -- this exchange is a waste of time"! Funny, I always thought that 'seeing WAS believing'. So now you must 'train' yourself to convince yourself that the softer, less detailed picture is actually showing more detail. I have a very very good eye for detail, color and artifacts. I've been in HD as long as HD has been available. I have never been prone to 'marketing B.S.' and I know my eyes are discerning enough to see beyond the B.S. But I also know that the computer world has tried to sell us a bill of goods because they want everyone in their 'progressive' world. They've always wanted the consumer TV world to come to them. Their idea of 'convergence' was for TV manufacturers to design according to THEIR standards. I've said it before, I'd love to have 1920X1080p, but it isn't going to happen any time soon in the broadcast world. Steve ignores the fact that 1920X1080i requires far more bandwidth than 720p. It does so for a good reason: it carries MORE INFORMATION.

What amazes me more is the fact that people who have absolutely no idea about the relative numbers of different channels' resolution will repeatedly say "how come ABC doesn't look as sharp as CBS or NBC?". I've heard it from people myself countless times! These people have no clue whatsoever that ABC is 720p and CBS & NBC are 1080i! But again, somehow their eyes have deceived them!

Yet these people have not been 'sold a bill of goods' by the networks! They have no idea what the numbers or claims are, they simply see a sharper, more detailed picture!!! According to Steve, they too are wrong.

And Steve, please please, I'm well aware of the fact that pixel resolution is not necessarily related to horizontal or vertical resolution. Give me a break, I know a bit more than you think I do. I just had this discussion with a video friend several weeks ago who felt that a 1920X1080 camcorder would yield a more detailed picture than a 1440X1080 cam. I showed him that my Z5 (1440X1080) yielded a decidedly sharper, more detailed picture than my XR500 (1920X1080). However, I would defy ANYONE who has watched a typical HDNet broadcast (1080i), using their standard HD videocameras, to show me an ABC show that has the same detail. You can't, because it simply doesn't exist. HDNet has been considered the picture quality 'Gold Standard' for many years in HD broadcasting. This ain't 720p, gentlemen. Interesting too that HDNet (1080 equipment) was selected by NASA to provide all their launch video.

But as far as this general discussion, I will tell you again, your numbers are OUTDATED. First you claimed that only Samsung was producing 1920X1080 panels (you mentioned NOTHING about resolution). You were wrong. Then you said none of these displays were showing more than 400-500 lines of resolution and when I pointed out the fact that your info was outdated, you then said the Kuro was the only one that could deinterlace properly and yes, its resolution was decidedly higher. That's still not entirely accurate. The Kuro has better processing than other panels, but many other panels are fully capable of showing more detail than is contained in a 720p picture.

I suggest you read some more current reviews over the past 1-2 years and you will see that plasmas from Panasonic (according to Gary Merson e.g.) as well as others at the time of testing, are now also deinterlacing properly and providing far more than 720p resolution.

Sorry Steve, seeing IS believing and the TESTED numbers DO prove it.

Khoi Pham
June 22nd, 2009, 09:52 AM
Same here, I have been watching HD since the beginning, back when you had to buy an HD receiver. I bought a Zenith HD receiver that could pick up HD over the air and DirecTV, and yes, I completely agree with you that HDNet is the best quality HD channel out there, and it ain't 720P.
Steve, funny how you compared CBS to ESPN. I always thought that ESPN HD sucks big time, even BEFORE I knew that they broadcast in 720P, and half the time they don't even have real HD -- 4X3 SD material with graphics on the side to fill out the 16X9 screen.
And you seem to contradict yourself when you were trying to explain why people think the 1080 mode on the GH1 has more detail than the 720 mode. Even with the crippled codec missing B-frames and a low bitrate, the 1080 mode still has more detail than 720P, but it has more artifacts when there is more movement. Why is that? Because there is more detail in 1080 mode, and with the low bitrate and no B-frames, it could not handle the stream with more information -- therefore you are getting more artifacts in 1080P than in 720P.
Yes, I know you are a big-time writer and all, and going up against you is suicide, but I know what I see and I ain't seeing what you are saying.

Ken Ross
June 22nd, 2009, 11:22 AM
Yes, I know you are a big-time writer and all, and going up against you is suicide, but I know what I see and I ain't seeing what you are saying.

Nor are many others that believe their eyes without having any inkling as to the underlying numbers. 1920X1080 carries far more information and therefore requires far more bandwidth. That is a simple fact. But there are some that continue to live in the past using outdated HDTVs and processors as a reason why 720p is better.

Steve used the airline industry as a reason as to why you trust engineers. Seems to me some horrific engineering went into planes like the DC-10 that cost many people their lives. Listen to engineers all the time? Nope. Especially when what you see so totally contradicts what their 'reality' is. 95% of programming is static or nearly static in nature, so even there a relatively poor deinterlacer will STILL show more rez than a 720p signal.

That's something the 720p guys don't like to talk much about...and when you do, they just ignore it.

Xavier Plagaro
June 24th, 2009, 05:38 AM
Wow, very nice tech thread! ;-D

It shows what a messed-up world we live in. We need some organization to come down to earth and force everybody to use a couple of broadcast standards, like 1080p24 for fiction and 720p60 for reality. Wouldn't the world be an easier place for us??

Steve Mullen
June 25th, 2009, 06:15 PM
Wow, very nice tech thread! ;-D

It shows what a messed-up world we live in. We need some organization to come down to earth and force everybody to use a couple of broadcast standards, like 1080p24 for fiction and 720p60 for reality. Wouldn't the world be an easier place for us??

You've got it right! Progressive 1080 is very different from interlace 1080. Progressive 1080 -- either 1080p24 or 24fps carried within 60i -- can deliver to the screen a full 2 million pixels of video, assuming the deinterlacer can sense the pulldown for the latter. (No deinterlacing is needed for pure 1080p24.) So for movies, 1080p24 is much better than 720p60. For live/taped video, which has no pulldown to be sensed -- most deinterlacers can NOT obtain more resolution than 720p60 delivers.

1) 1080p24 is a legal ATSC standard, but it's not used -- except by DISH and DirecTV when you order 1080p24 movies.

2) 2-3 pulldown is added to film to make 1080i60 or 720p60. When you watch on most flat-panels, 720p60 is shown as sent. (A Kuro is able to remove pulldown to get back 24p and then show it at 72Hz.)

3) With 1080i60, the HDTV's deinterlacer needs to correctly sense the 2-3 pulldown. (An amazing percentage fail to be able to do this -- so they treat film as video.) If a deinterlacer senses pulldown -- it reconstructs the original 24p and then adds 2-3 pulldown to convert 24p to 60p. (A Kuro is able to remove pulldown to get back 24p and then show it at 72Hz.)

PS1: Although many HDTVs fail to sense pulldown, once sensed, the cadence tells the deinterlacer EXACTLY how to construct PERFECT 24p. Which means when you watch movies via 1080i60 -- you really do get 2 million pixels of video. Which is why HBO, etc. use 1080i.

Conversely, when VIDEO (e.g., sports) is sent via 1080i60 -- there is NO way to PERFECTLY convert 60i to 60p. Which is why so many HDTVs don't try, and thus present video with only 330- to 580-lines of resolution.

So, for those who bought all but a few very expensive HDTVs -- when you watch 1080i60 VIDEO, what you are seeing on your 2-million-pixel screens is at best almost 1 million pixels of video, and at worst only slightly more than 1/2 million pixels of video. When you watch 720p, you ALWAYS see 1 million pixels of video.
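To pull the paths in this post together, here is a hypothetical decision sketch (the line counts are the round figures quoted in this thread, not measurements of any particular set, and a film-aware set is assumed to fall back to bob on true video):

```python
# Hypothetical summary of the delivery paths described above.
# Line counts are the figures quoted in this thread, not lab data.

def effective_vertical_lines(source, deinterlacer):
    """source: '720p', 'film-in-1080i60', or 'video-1080i60'
    deinterlacer: 'bob', 'adaptive', or 'film-aware'"""
    if source == "720p":
        return 720          # passes through, never deinterlaced
    if source == "film-in-1080i60" and deinterlacer == "film-aware":
        return 1080         # cadence sensed, perfect 24p rebuilt
    # Everything else gets treated as interlaced video:
    if deinterlacer == "adaptive":
        return 1080         # static areas only; moving areas drop to ~540
    return 540              # bob: one field, line-doubled

for src in ("720p", "film-in-1080i60", "video-1080i60"):
    for deint in ("bob", "adaptive", "film-aware"):
        print(f"{src:16} {deint:10} -> {effective_vertical_lines(src, deint)} lines")
```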

TO SEE THE TEST RESULTS DATA on 2008 HDTVs:

http://www.hdguru.com/will-you-see-all-the-hdtv-resolution-you-expected-125-2008-model-test-results-hd-guru-exclusive/287/

THERE IS NO "NEWER" PUBLISHED DATA -- CONTRARY TO ERRONEOUS CLAIMS

Let's assume you didn't buy a really cheap FullHD LCD, so you get about 580-lines of resolution -- less than the 720-lines from 720p. So you get less vertical resolution, but more horizontal resolution -- 1920 versus 1280. So you see about the same amount of TOTAL detail.

However, the person watching 720p will see motion 2X more accurately than from 1080i60. Which is why ESPN, etc. use 720p60. For MOST HDTVs -- all viewers will see 2X greater motion resolution and about the same spatial resolution from 720p60. (Which is why the GH1 is of interest to me.)

PS: if you order a 1080p24 movie, unless the dish box can output 1080p24 AND your hdtv can input 1080p24 -- the box will add 2-3 pulldown to make 1080i60 or 720p60. This is the same thing that happens with a BD player.

=============

BACK TO THE GH1: if the AVCHD codec worked well for 1080p24 AND you didn't want high motion accuracy -- you would shoot 1080p24. And, as I've said repeatedly, because it is progressive video with 2-3 pulldown, of course you will see more DETAIL than if you shot 720p. (Assuming a deinterlacer correctly senses pulldown.)

However, since the 720p60 codec works better, you can shoot 720p60 and simply drop into a 720p24 Timeline -- and you'll have a 24fps movie. (Although the 1/60th shutter-speed is a bit too fast.)

HOWEVER, if you really want a film look -- 720p30 is the best choice. You can select a 1/30th shutter when shooting Motion JPEG. 30fps with 1/30th provides the low motion accuracy filmmakers love.

And, if you've learned anything from this long thread you now know that 30p does NOT need 2-3 pulldown removed before editing.

Because it's not interlaced, it doesn't need to be deinterlaced in your HDTV. You get a full 1 million pixels of information, which is fine because filmmakers tend to like "soft" video.

And, when viewed, each frame is shown twice, which is the same as when film is projected. You'll get the motion judder filmmakers love. (Of course, you'll need to control motion just as filmmakers do, otherwise you'll get way too much judder.)

Ken Ross
June 25th, 2009, 08:57 PM
As I've stated several times, getting an HDTV that correctly deinterlaces is NOT limited to 'the most expensive sets'.

As Gary said in his last test, "This year's sets fared much better with 96% of the 125 HDTVs passing". This is why I say it's time to stop living in the past. Your chances of buying a new HDTV that deinterlaces properly are excellent...essentially 96%! :)

Add to that the fact that about 95% of broadcast material is static or nearly static in nature, and you have the ingredients for a great 1920X1080 picture that will significantly surpass the detail available in 720p.

Moving on to the actual thread subject, I received my GH1 a couple of days ago, and it is one superb camera!!! From a still camera standpoint, the success rate of properly exposed & focused pictures is higher than any camera I've had in the past. The white balance on this camera is truly excellent and surpassed what I've seen from both Nikon and Canon. The tonal quality of the picture is just great as is the detail.

The movie mode is truly wonderful. The 720p, on my 60" Pioneer, is really very very nice with buttery smooth motion. It's not a match for my Sony Z5, but hey, this is still primarily a DSLR. As for the 1080p mode, unfortunately I'm not a big fan of 24p video, and the stuttering that comes along with 24p is distracting to me. But the detail that's available in this mode (1920X1080) is truly amazing. Here you're combining some really great dynamic range, superb color and fantastic detail to create a really eye-popping image...but damn that 24p!!! Anything with significant motion in the frame falls prey to the stuttering. There is also some blurring of fine detail as you pan across the frame. The encoder, codec and bitrate may just not be up to the task of holding all this fine detail together with movement. But suffice it to say that if you intend to use this camera as you would a typical camcorder, you might be disappointed with the 1080p mode.

This is why I've avoided the 24p mode in any videocamera I've ever owned. Now if they released a similar model with 1920X1080 @ 60i...now that would be a show stopper in my opinion. But the 720p video is as good as or better than any DSLR video I've yet seen. On a side note, Panasonic has already released a firmware upgrade, but I can't honestly say I've seen much of a difference with it.

Right now the biggest problem with this camera is simply getting one!

Khoi Pham
June 26th, 2009, 09:23 AM
Hi Ken, when you said the 720P mode in the GH1 is no match for your Z5, did you mean in detail compared to the Z5 in 1080i? Could you tell a big difference if I cut between the two cameras on a 1080i timeline? How is the F4 stock lens compared to the Z5 in low light? Can you still get good shallow DOF at F4?
Thanks.

Luke Tingle
June 26th, 2009, 12:09 PM
Ken, did you buy from a US retailer?

Ken Ross
June 26th, 2009, 01:42 PM
Hi Ken, when you said the 720P mode in the GH1 is no match for your Z5, did you mean in detail compared to the Z5 in 1080i? Could you tell a big difference if I cut between the two cameras on a 1080i timeline? How is the F4 stock lens compared to the Z5 in low light? Can you still get good shallow DOF at F4?
Thanks.

Khoi, in terms of detail at 720p, the Z5 does better at 1080i. There is definitely more detail and an overall 'tighter' picture. I would expect that since the Z5 produces such a nice picture and I wouldn't expect a DSLR to surpass it...certainly not at 720p.

However, in terms of color and dynamic range, the GH1 certainly holds its own and surpasses the Z5 when used at 1080p. Forgetting the motion issue, the color and dynamic range are simply better on the GH1 than the Z5. Of course it's not easy to 'forget' about the motion issue, it's there.

So you'd have to shoot at 720p if you want to mix these two cams. I think it could be done, but it will take some work in post to achieve it.

The low light of the GH1 is better than I expected. In video mode it's not quite as good as the Z5, but it's not far off. I don't think that's where you'd run into trouble.

Luke, I bought the camera from a dealer in Canada (Henry's). They had just gotten 6 or 7 units in when I saw it posted on another forum. I immediately called and ordered from the last two that were available. They were sold out in 3 hours. My order with Panasonic was sitting there until yesterday and was further delayed to at least mid to late July. Panasonic has done a horrific job with the ordering process. My order went from Panasonic direct to "Shopatron"...good luck.

Steve Mullen
June 26th, 2009, 06:15 PM
As Gary said in his last test, "This year's sets fared much better with 96% of the 125 HDTVs passing".

Add to that the fact that about 95% of broadcast material is static or nearly static in nature, ...

Now I see why you are so insistent that video is essentially STATIC -- even though you clearly point out you hate 24p because of MOTION judder. Gary's test of STATIC video (test #1) did show an improvement in the percentage of deinterlacers passing THIS test.

However, that's his old test and not the important NEW one. He now MEASURES vertical resolution when there is MOTION in one part of the picture -- test #2. (A tracking shot, pan, or zoom -- all impart motion even if the subject is static.)

This is where the MEASURED resolution showed the vast majority of the sets have a vertical resolution under 600-lines -- less than from 720p.

To try to support your OPINION, you have completely misrepresented to the readers of this thread what Gary published. Anyone who wants to SEE this misrepresentation:

Will You See All The HDTV Resolution You Expected? 125 2008 Model Test Results - HD GURU Exclusive - HDGURU.Com (http://www.hdguru.com/will-you-see-all-the-hdtv-resolution-you-expected-125-2008-model-test-results-hd-guru-exclusive/287/)

========

So, for those who shoot all scenes with NO motion whatsoever in every shot -- then it is true that 96% of the 2008 deinterlacers will use Weave deinterlacing and feed 1080-lines to the display.

For everyone else -- what you see depends on which hdtv you own. For example, if you buy a 150-inch Panasonic you'll see a whole screen with almost 1080-lines of resolution. Same with the Kuro and the top of the line Samsung.

If you don't own these -- then 1080i60 will MEASURE at under 600-lines.

* The worst case will be a cheap HDTV that, upon the presence of motion, switches the entire frame from Weave to Bob.

* The best case will be an HDTV that, upon the presence of motion, switches only the object(s) that are in motion from Weave to Bob. Now the "static" areas will MEASURE almost 1080-lines while the "motion" areas will MEASURE under 600-lines.

So the question is -- "are you feeling lucky?" If you shoot 1080i60, there will be a very few folks who get up to the full vertical resolution. Most will see UP TO 1080-lines in static areas but only UP TO half that in areas with motion. With 720p they will always see 720-lines.

++++++

Why the UP TO? Because all interlace recording is done after sending the 1080-lines through a low-pass filter, which cuts recorded vertical resolution to about 800-lines. Progressive recording does not require this filtering, and so the full 720-lines are recorded.

Gary's tests used a signal generator that delivered 1080-lines of interlace video. That is more than recorded media can carry! So, for example, when Gary's TESTS show vertical resolution is cut by 50% -- you need to multiply this 50% times the 800-lines in recorded 1080i. Now the real difference between 1080i and 720p emerges. And remember, this is true of even 720p30 video! The only advantage of 60p over 30p is 2X greater motion accuracy.
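As a worked example of that multiplication, using the round figures from this thread:

```python
# Worked example with the round numbers quoted in this thread.
sensor_lines = 1080
recorded = sensor_lines * 0.75      # low-pass filter: ~810, "about 800"
bob_factor = 0.5                    # a bob deinterlacer keeps one field
seen = recorded * bob_factor
print(recorded, seen)               # 810.0 405.0
# Roughly 400-odd lines of 1080i actually reach the eye under motion,
# versus a constant 720 lines from 720p, which is neither filtered
# this way nor deinterlaced.
```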

+++++++

Remember, my comments about 1080i are NOT valid for 1080p24 -- IF, and it is a big IF, an HDTV can correctly sense 2-3 and/or 2-2 pulldown. 720p24 and 720p30 are always displayed correctly, because the deinterlacer simply passes the signal through.

Bottom-line: what your audience will see when you shoot any 1080-line format is subject to huge variations downstream. For example, your camera can't compress 1080i efficiently -- which we see with the GH1. (Even though the sensor is shooting 24p -- the compression system is inputting interlaced frames.)

And when broadcast, 1080i60 is often greatly compressed to allow carrying a second or third SD channel. (NBC allows this, CBS tries to prevent this.) This need not be done with 720p60, since it naturally (because it is more efficiently compressed) leaves 4Mbps free for multicasting.

And, we shouldn't forget the conversion of P to I is simple -- but the conversion of I to P is very very hard. When would this occur? Anytime 1080i video is sent to a station using 720p. And, anytime a conversion between 50Hz and 60Hz video is done.

Once you understand video technology -- you see why, for example, NASA shoots its own HD in 720p60.

Ken Ross
June 26th, 2009, 07:59 PM
Once again Steve tries to tell us 'don't believe your eyes, believe ME'. Steve, have a little respect for the vast majority of people with 1080p that DO see the benefits of the higher resolution of 1920X1080i broadcasts. They all don't have the most expensive HDTVs as you imply, yet they STILL see the benefits of 1920X1080.

Speaking of misrepresentation Steve, you clearly imply that if you don't have a 150" Panasonic or the expensive Pioneers, you won't get resolution greater than 720p is capable of. This is CLEARLY false and very misleading. Your own link to Gary's testing shows models from LG (not too expensive Steve), 42" Panasonics, 50" Panasonics, 65" Panasonics and the Pioneers, are ALL capable of vertical resolution greater than 720p with motion. Also, as a sidenote, Panasonic has always been the most prolific manufacturer of plasma HDTVs and Gary's tests show they pass his motion testing. So there are many many Panasonics out there that owners are enjoying and these were not the most expensive HDTVs on the market. And folks, keep in mind we've been discussing VERTICAL resolution. In Steve's world horizontal resolution doesn't exist. Why? Because it doesn't fit his 720p world and 720p argument.

Horizontal resolution is SIGNIFICANTLY greater with a 1920X1080 signal than it is with a 720p signal. But Steve omits this on a consistent basis. When he points out my statement of 'static, or nearly static', he ignores the fact that this still implies that even vertical resolution will be higher than 720p with some movement...hence the 'nearly static' comment I made. The fact remains that 95% of broadcast material IS static or nearly static and therefore WILL display vertical resolution in excess of 720p even if the deinterlacing isn't perfect. The horizontal resolution will be greater regardless. This is one of the reasons why owners of 1080p HDTVs so often prefer the 1920X1080i broadcasts. It's not their imagination that they are seeing a sharper, more detailed picture.

But again, you don't need to read articles, charts or Steve's words or mine. Your EYES will tell you the truth and the vast majority of comments I've read from owners of 1080p HDTVs attest to the fact that they DO see a sharper, more detailed HD picture with 1920X1080i broadcasts. I believe MY eyes, not someone's campaign for 720p.

And the other simple fact is that if you have a properly designed HDTV, you WILL see significantly greater horizontal AND vertical resolution with 1920X1080i. Unlike Steve, I believe in getting the display equipment that will enable me to enjoy this better picture. I don't believe in subscribing to the 'lowest common denominator'. We have a better standard and a properly designed HDTV will show that better standard.

Using Steve's logic, we should all stick to AM radio since AM radios are cheaper. You can get better sound quality, but it will cost you more. As with most anything else, you can get a better HDTV picture, but that too will cost you more. But even the relatively cheaper Panasonics will yield more resolution than 720p is capable of.

Oh, and once you understand bandwidth, you'll better understand why NASA went to 720p. But Steve omits the fact they went with HDNet's 1920X1080i equipment to capture all the fine details of critical launches just in case something were to go wrong.

Steve Mullen
June 27th, 2009, 09:39 PM
Speaking of misrepresentation Steve, you clearly imply that if you don't have a 150" Panasonic or the expensive Pioneers, you won't get resolution greater than 720p is capable of. This is CLEARLY false and very misleading. Your own link to Gary's testing shows models from LG (not too expensive Steve), 42" Panasonics, 50" Panasonics, 65" Panasonics and the Pioneers, are ALL capable of vertical resolution greater than 720p with motion.

What you fail to mention is that ALL of these are PLASMA -- which is a nearly dead technology. Of the three companies you mention (LG, Panasonic, and Pioneer) -- Pioneer has already stopped ALL production of plasma HDTVs.

Of the 2 LGs -- the 60-inch fails the 2-3 pulldown test, leaving a 50-inch. This looks like a very good HDTV, but you need to consider this: "On Thursday, a report in Digitimes claimed that LG would pull out of the plasma business this year, and sell it to Changhong, a Chinese company." LG has denied this report.

The best choice, not mentioned, is a Vizio PLASMA -- however, it doesn't do well on the bandwidth test. But a refurbished one is really cheap at under $800. However, note:

"February 11, 2009 — Vizio has just announced it will get out of the plasma business altogether, due in large part to the overwhelming popularity of LCDs and the vanishing price gap between the two technologies."

Of the 4 Panasonics tested -- 3 fail the 2-3 pulldown test. The only one to pass measures only 850-lines -- not really much more than 720 when you consider the limit of 1080i recorded media is about 800-lines.

Bottom-line, there is only 1 company certain to produce plasma into the future -- and with a 75% failure rate on sensing 2-3 pulldown (Panasonic has almost never listed this as a feature on ANY of its plasmas -- and even the 150-inch fails this test!) this is not a brand for watching movies.

Without plasmas to buy -- there are only LCDs and DLP. While DLP is far better -- the LCDs, other than the top-of-the-line 120Hz Samsung, all show only about 1 field of vertical resolution.

Think of the controversy this way -- all but one LCD displays only 1 field of 1080i each 1/60th second. That's 1 Mpixels of VIDEO information. In the same 1/60th second, 720p60 delivers 1 Mpixels of VIDEO information. Exactly the same amount of VIDEO information! That's why I don't need to count H. rez.

Moreover, if you look at my first post I never claimed 720p was better. You took it that way in your rant after my post.

I said, "The "hit" in resolution depends on your point of view. 720p60 has half the spatial resolution of 1080p24 but over twice the temporal resolution. So if you ask if there is a QUALITY hit -- the answer is NO. You simply trade one type of "resolution" for another."

So, since you want to rant -- this thread is yours.

Ken Ross
June 28th, 2009, 08:54 AM
No Steve, there is no desire on my part to 'rant', but rather to present the facts accurately. You clearly implied in several posts that 720p was the 'preferred' standard and looked better.

I refuted that by correctly mentioning that most owners of 1080p displays (ANY...no specific brand, no specific technology) report a better, sharper, more resolute picture while watching 1080i broadcasts. You can spin this any way you like, but it seems that the vast majority of people prefer 1080i, many of whom have no idea of the underlying numbers. So their reaction is clearly not the result of 'advertising' (which, by the way, you hear nothing of anymore). I get the distinct impression you haven't spent much time watching a quality 1080p display, otherwise I don't think you'd feel the way you do.

By the way, Panasonic owners are also in this mix for reporting a better picture while watching 1080i broadcasts...it's certainly not just Pioneer owners!

So let's just agree to disagree on this and move on. We've both hijacked this thread long enough.

Ian G. Thompson
June 29th, 2009, 12:49 PM
Well, in terms of detail, this test conducted by Jack Daniel Stanley shows that the GH-1 has lots of detail in 720p...even more than the HPX170 @ 1080p. I thought this was a very good test.

GH1 - Frame Rate Conversion Tests - 720 60p & 30p to 24p vs. HPX 170 1080 24p - DVXuser.com -- The online community for filmmaking (http://www.dvxuser.com/V6/showthread.php?t=175852)

Ken Ross
June 29th, 2009, 02:02 PM
I saw that, Ian. The thing that's too bad (at least from my perspective) about the GH1 is that there is far more detail in the GH1's 1080 24p video than in its 720p output, but motion stutter makes it unacceptable to me. When you view a static scene at both outputs, it drives you nuts to see how much potential there is in this camera.

It's the same issue I've had with any cam I've owned that does 24p. To my eyes I'll never understand why some people like this motion issue. I understand the desire to replicate the 'look of film', but if this is what the movies looked like, I'd never go out to see them.

Steve Mullen
June 30th, 2009, 04:07 AM
Well, in terms of detail, this test conducted by Jack Daniel Stanley shows that the GH-1 has lots of detail in 720p...even more than the HPX170 @ 1080p. I thought this was a very good test.

Ian, when I shot all three codecs at PMA and viewed the results -- to me, M-JPEG had far more clarity -- which I thought was really sad since I wanted p60. At PMA the prototype shot true 1080p24 with no pulldown. So when Pana went with adding 2-3 pulldown I could see no advantage -- and still don't.

Now about the no-B-frame explanation and the supposed "overload" caused by 1080p24. FullHD has 2 million pixels that need to be compressed 24 times per second. Let's round this up to 30 times per second. 720p60 is 1 million pixels that must be compressed 60 times per second. Both are the same load. So, in fact, 1080p24 is fewer pixels to compress per second -- it places slightly LESS load on the encoder.
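Bits-per-pixel makes the asymmetry plain (a rough sketch; 17Mbps is the rate discussed in this thread, and 8.5Mbps stands in for the reported halved 24p rate):

```python
# Rough bits-per-pixel comparison at the rates discussed in this thread.

def bits_per_pixel(bitrate_mbps, width, height, fps):
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(f"720p60  @ 17 Mbps:  {bits_per_pixel(17, 1280, 720, 60):.3f} bits/pixel")
print(f"1080p24 @ 17 Mbps:  {bits_per_pixel(17, 1920, 1080, 24):.3f} bits/pixel")
print(f"1080p24 @ 8.5 Mbps: {bits_per_pixel(8.5, 1920, 1080, 24):.3f} bits/pixel")
# 0.307 vs 0.342 vs 0.171 -- at equal bitrates the two modes get
# comparable bits per pixel; halving the 24p rate starves it.
```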

However, when 2-3 pulldown is added, "split" (judder) frames are created that are inherently difficult to compress. (Harder than interlace. And interlace is harder to compress than progressive because, under motion, each field has the same objects in different locations.) So this GREATER load might well balance out the fewer pixels per second, making both 720p60 and 1080p24/60i about the same "load."
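Here is a minimal sketch of the 2-3 cadence (standard pulldown math, nothing GH1-specific) showing where those split frames come from:

    # 2-3 pulldown: 4 film frames (A, B, C, D) become 10 fields,
    # i.e. 5 interlaced video frames.
    film = ["A", "B", "C", "D"]
    cadence = [2, 3, 2, 3]  # fields emitted per film frame

    fields = []
    for frame, count in zip(film, cadence):
        fields.extend([frame] * count)

    # Pair consecutive fields into interlaced video frames.
    video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
    print(video_frames)  # ['AA', 'BB', 'BC', 'CD', 'DD']
    # 'BC' and 'CD' mix two different film frames -- 2 of every 5 encoded
    # frames are the hard-to-compress "split" frames.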

Therefore, both should encode to equal bit-rates. But from what I've read, 720p60 gets 2X the data-rate of 1080p24/60i. That's crazy! And wrong!

The only way I can see this happening is if someone designed the encoder based on a huge logic error: "OK -- 24fps is half of 60fps, so for equal quality, 60fps must get the full 17Mbps and 24fps only needs 8Mbps."

At CES, I reported the GH1 video looked horrible. Even on slow pans, tree leaves just turned into mush. I wondered how no one at Pana noticed how bad the video looked.

It's possible that the encoder is an off-the-shelf LSI chip that has no modifiable firmware. Japanese companies typically build an initial quantity -- like 20K units -- and will sell them without any changes until they're gone. Then they phase in a new part.

PS: Without mentioning names, I was blown away when -- on a whim -- I compared my Casio EX-F1 to a 3-chip camcorder that sells for many, many times more money. My F1 provided far more clarity!

I have to wonder if the lens on a still camera that shoots 6MP to 25MP images doesn't inherently have a really high MTF at relatively low video resolutions. So I suspect that "clarity" -- which looks different than "video" resolution -- may well be better with still cameras. Moreover, these cameras' chips are already well beyond Red's 4K!

Steve Mullen
June 30th, 2009, 04:09 AM
I understand the desire to replicate the 'look of film', but if this is what the movies looked like, I'd never go out to see them.
BRAVO, Ken!

Ken Ross
June 30th, 2009, 05:29 AM
Therefore, both should encode to equal bit-rates. But from what I've read, 720p60 gets 2X the data-rate of 1080p24/60i. That's crazy! And wrong!

Steve, are you sure about that? I was under the impression that both 720p & 1080p (using AVCHD) were getting the same bitrate on the GH1, but I could be wrong.

As far as the small still cam looking better than the pricier 3-chip videocam goes, I think there are other factors. One is color accuracy, and it seems the digital still cams do better on average than the typical camcorder.

The other issue is the size of the chips. Generally a good digital still camera will have a larger chip (even if it's 1 vs. 3). The larger chip generally gives the camera greater dynamic range than a typical consumer/prosumer video camera.

I can see this, for example, in comparing my GH1 to my Z5. As good as the Z5 is, the GH1 has a greater ability to capture a larger tonal range and seems to come closer to capturing a scene as you saw it. On the other hand, since I'm limited to 720p on the GH1 and the Z5 can produce smooth 1440X1080 (even though it only captures about 900 lines of horizontal resolution), I still prefer the overall 'look' of the Z5 for video. Of course, other issues such as the Z5's far superior autofocus, power zoom and a host of other image-adjustment parameters make the Z5 a superior tool for capturing video.

Now if the GH1 were able to produce smooth 1080p motion and had a better autofocus, all bets would be off. ;)

Ian G. Thompson
June 30th, 2009, 10:37 AM
Steve, are you sure about that? I was under the impression that both 720p & 1080p (using AVCHD) were getting the same bitrate on the GH1, but I could be wrong.
This is true...both those modes are @ 17Mbps. I think what Steve might have meant was the GH-1's MJPEG mode, which is actually 720/30p @ 30Mbps. It actually looks decent and is much easier to color-correct.

At PMA the prototype shot true 1080p24 with no pulldown. So when Pana went with adding 2-3 pulldown I could see no advantage -- and still don't.

I agree.

But as far as the 720/60p getting twice the data rate….it does not. It's the same as the 24p. And the mud is a non-issue in 720/60p. Do you think the interlacing (or pulldown) is contributing to the 1080/24p mud? You barely see any in 60p at all. It only happens during some high motion with high detail in 24p. This is typically where interlace artifacts show up in other cams' footage that was not de-interlaced correctly. I understand the issue of lacking B-frames, but then again the Canon 5D II also lacks B-frames....but is set to a higher data rate.

Ken Ross
June 30th, 2009, 12:53 PM
I don't think this 'mud' issue is well understood. On another site someone did a test and commented that he saw 'mud' in stationary objects, but not moving objects! I saw exactly the same thing.

In a shot from my window, grass that was stationary looked muddy, yet moving tree limbs and leaves were perfect. So I think something else is going on here.

Steve Mullen
June 30th, 2009, 10:18 PM
I don't think this 'mud' issue is well understood. On another site someone did a test and commented that he saw 'mud' in stationary objects, but not moving objects! I saw exactly the same thing.

In a shot from my window, grass that was stationary looked muddy, yet moving tree limbs and leaves were perfect. So I think something else is going on here.

I posted "But from what I've read, 720p60 gets 2X the data-rate of 1080p24/60i." because when I skimmed thru the 1000 posts are remember several posts saying the saw 108024 running -- even motion -- only about 8Mbps while 720p60 was at 17Mbps.

If those posts were wrong and both are at 17Mbps AND the mud is NOT only on motion -- then it looks like the "issue" is encoding 2M pixels versus 1M pixels.

Since the first step of encoding is moving a kernel over the image -- if the encoder CPU is speed-limited -- it may use a coarser search, which would cause both a loss of fine detail AND poor motion handling. In which case there is no firmware fix. They'll have to wait for a 2X or 4X faster encoder CPU. Of course, that will consume far more power and generate far more heat.
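Nobody outside Panasonic knows how the GH1's encoder actually searches, but a toy block-matcher shows why a coarser search step saves compute at the cost of detail and motion accuracy (purely illustrative -- the function and its parameters are my own):

    import numpy as np

    def block_motion_search(ref, block, top, left, radius=8, step=1):
        # Find the motion vector for an NxN block by minimizing the sum of
        # absolute differences (SAD) over candidate positions in `ref`.
        # A larger `step` visits far fewer candidates -- cheaper, but it
        # can miss the true match, costing fine detail and motion quality.
        n = block.shape[0]
        best_sad, best_vec = float("inf"), (0, 0)
        for dy in range(-radius, radius + 1, step):
            for dx in range(-radius, radius + 1, step):
                y, x = top + dy, left + dx
                if 0 <= y <= ref.shape[0] - n and 0 <= x <= ref.shape[1] - n:
                    sad = np.abs(ref[y:y+n, x:x+n].astype(int)
                                 - block.astype(int)).sum()
                    if sad < best_sad:
                        best_sad, best_vec = sad, (dy, dx)
        return best_vec

    # step=1 checks 17 x 17 = 289 positions per block; step=4 checks only
    # 5 x 5 = 25 -- roughly a tenth of the work.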

Wacharapong Chiowanich
July 6th, 2009, 06:17 AM
Steve, I live in a PAL area where ALL the flat panel TVs (LCDs and plasmas) sold here are 50Hz or a multiple of this, 100Hz-200Hz for higher-end models. Will the 1080/24p output from the GH1 or from other sources such as Blu-ray discs (Hollywood movies encoded in 24p) for that matter look worse on these screens or the same as it does on the 60Hz-multiple screens in the US? Unlike 24p-to-60i/p, I find the math of converting 24p to either 50i or 50p hard to fathom.

Wacharapong

Steve Mullen
July 6th, 2009, 07:10 PM
Steve, I live in a PAL area where ALL the flat panel TVs (LCDs and plasmas) sold here are 50Hz or a multiple of this, 100Hz-200Hz for higher-end models.
In 2008 there was only one LCD HDTV that could display more than 600 lines of V rez. It used LED backlights that could be turned off briefly to replicate the erase cycle in plasmas.

There are two ways of getting 100Hz from 50Hz: repeat each frame twice or interpolate a new frame between each pair. The latter is done by motion-compensated interpolation, which SMOOTHS motion. It is often called "dejuddering." This works fine for 50p and 50i -- although interpolation artifacts can be nasty. 200Hz most likely has to use motion-compensated interpolation.
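To illustrate the two strategies (a bare-bones sketch -- real sets use motion-vector-based interpolation, not the naive blend below):

    import numpy as np

    def repeat_to_100hz(frames_50p):
        # Frame repetition: show each 50p frame twice. No new information,
        # no artifacts, and motion portrayal is unchanged.
        out = []
        for f in frames_50p:
            out += [f, f]
        return out

    def interpolate_to_100hz(frames_50p):
        # Stand-in for motion-compensated interpolation: insert a frame
        # between each pair. A real set estimates motion vectors; this
        # plain 50/50 blend just cross-fades, but doubling the temporal
        # samples is the same idea -- and it is what smooths ("dejudders")
        # motion.
        out = []
        for a, b in zip(frames_50p, frames_50p[1:]):
            out += [a, (a.astype(float) + b) / 2]
        out.append(frames_50p[-1])
        return out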

There is a nice side-effect of motion-compensated interpolation on this one LCD -- the interpolator actually generates ALL the displayed images, and these images have 1,000 lines of measured vertical resolution. Turn off the interpolator and resolution drops in half. (This is why you want a plasma or DLP and not an LCD. However, if you do buy an LCD it must be illuminated by LEDs -- otherwise rez drops to about 330 lines.)

=======

The issues get more complex with both 1080p24/60i and 1080p24. The latter ONLY comes in via HDMI from a BD player that has 1080p24 output, and the HDTV must accept 1080p24. Since each frame has 1080 lines, one would hope the LCD would repeat each frame 5 times and display 120fps that would measure 1,000 lines. Unfortunately, this critical resolution test was not performed!

With 1080p24/60i, the reviews all note that turning the interpolator OFF lets film look more like "film," but the downside of a 60Hz display is a loss of about half the vertical resolution. Given these comments, it's reasonable to assume that the interpolator does "dejudder" film. But one needs the interpolator to get 1,000 lines of V rez.

==========

Frankly, I don't find it surprising that reviewers devote a paragraph to resolution tests performed on 60i, WHERE THE INTERPOLATOR DOESN'T HURT MOTION. This supports the manufacturers' marketing of 120Hz and 240Hz.

Wacharapong Chiowanich
July 6th, 2009, 09:21 PM
Now, I think I understand the options for 24p materials on 60Hz-multiple LCD TVs but what about 24p on 50Hz-multiple (PAL) LCD TVs? How could an LCD TV interpolate 24p output from a Blu-ray player into 50/100/200Hz display? Does something more with respect to spatial or temporal resolution have to give?

Steve Mullen
July 6th, 2009, 11:10 PM
Now, I think I understand the options for 24p materials on 60Hz-multiple LCD TVs but what about 24p on 50Hz-multiple (PAL) LCD TVs? How could an LCD TV interpolate 24p output from a Blu-ray player into 50/100/200Hz display? Does something more with respect to spatial or temporal resolution have to give?

1) Repeat frames and run at an exact multiple of 24, such as 72Hz (the best -- only done by the Kuro) or 120Hz. (See the arithmetic sketch after this list.)

2) Add 2-3 pulldown to 24p to get 60p, then repeat each frame. (Bad)

3) Use an interpolator to create the necessary frames. (Bad)
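The arithmetic behind those three options is simple divisibility -- nothing manufacturer-specific:

    # Which refresh rates can show 24p by plain frame repetition?
    # An exact integer multiple of 24 means every film frame is held for
    # the same number of refreshes: no pulldown judder, no interpolator.
    for hz in (50, 60, 72, 100, 120, 200, 240):
        if hz % 24 == 0:
            print(f"{hz:>3} Hz: exact multiple -- repeat each frame {hz // 24} times")
        else:
            print(f"{hz:>3} Hz: not a multiple -- needs pulldown or interpolation")

    #  50 Hz: not a multiple -- needs pulldown or interpolation
    #  60 Hz: not a multiple -- needs pulldown or interpolation
    #  72 Hz: exact multiple -- repeat each frame 3 times
    # 100 Hz: not a multiple -- needs pulldown or interpolation
    # 120 Hz: exact multiple -- repeat each frame 5 times
    # 200 Hz: not a multiple -- needs pulldown or interpolation
    # 240 Hz: exact multiple -- repeat each frame 10 times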

Fundamentally there is only one monitor -- the Pioneer Kuro -- that for sure does 24p with a refresh rate under 100Hz (needed to keep film looking like film), doesn't use an interpolator (which causes artifacts and "smooths" motion), and has 1,000 lines of V rez. Plus it has a great black level.

Since they are no longer made -- you'll have to look at a 60Hz DLP. Or we need tests run on the new LED-based LCDs running at 120Hz. (Anything higher does nothing positive!)

PS 1: to clarify -- we want the "true" 24fps "sampling" judder but we want to eliminate the "false" 2-3 pulldown judder needed to record 24p in 60i.

PS 2: you may not feel the 100Hz refresh itself causes film to look like video. I do, because I come from a 60Hz world. However, as I said in my first post, I do NOT know if the changed "look" comes from repeating every frame twice or from interpolating the tween frames.

It's hard for a visitor to tell HOW the LCDs they see actually work. Maybe you can shed some light. Do all 100Hz sets look the same to you? Do some make 25fps film look like video? How do they work?

Steve Mullen
July 6th, 2009, 11:12 PM
Now, I think I understand the options for 24p materials on 60Hz-multiple LCD TVs but what about 24p on 50Hz-multiple (PAL) LCD TVs? How could an LCD TV interpolate 24p output from a Blu-ray player into 50/100/200Hz display? Does something more with respect to spatial or temporal resolution have to give?

I just realized you asked about 24p in 50Hz. Your HDTVs also run at multiples of 60Hz -- which is why you also get to see NTSC material.
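For what it's worth, the two standard ways the 50Hz world handles 24p material (classic PAL practice, not anything GH1-specific) boil down to simple arithmetic:

    # 1) 4% speed-up: play the film at 25fps, then show each frame twice
    #    on a 50Hz panel (2:2). Motion stays perfectly even; runtime and
    #    audio pitch shift by about 4%.
    speedup = 25 / 24 - 1
    print(f"speed-up: {speedup:.1%}")  # 4.2%

    # 2) Keep 24fps and use an uneven cadence: 50/24 is not an integer,
    #    so most frames are shown twice and a few get a third refresh,
    #    which adds a small periodic judder.
    base, extra = divmod(50, 24)  # base=2, extra=2
    print(f"{extra} of every 24 frames get {base + 1} refreshes")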