720 vs. 1080 debate



Graeme Nattress
April 26th, 2005, 07:00 AM
It's practically impossible to tell SD and HD apart on a small screen, and in the article I mentioned, the writer was sitting at the back of the room and couldn't see anything wrong with the image - because the large projection screen was far away and appeared small.

HD is only better than a lower resolution format if you're sitting up close to the screen, or the screen is big.

Now what makes things worse is that most people have small screens that they sit a fair distance away from, and in that situation, even if they put in an HDTV instead of their SDTV, they'd see no benefit.

What would provide a better picture for the majority of viewers is lower compression on digital transmissions and more care taken in the production chain to optimize picture quality. Again, better quality SD would benefit the vast majority of viewers more than HD.

That's why the adoption of HD will be driven by the home cinema crowd - those with the video projectors and big screens - and they want an HD disc they can view HD material with.

Graeme

Thomas Smet
April 26th, 2005, 09:09 AM
Once again I have to agree with Graeme. I think even many of us pros have a hard time telling what is going on with HD resolution. Look at the Sony Z1, for example. I think it is a great camera with a very good picture on a TV, and I might still get two of them depending on what happens with the Panasonic camera by October. The fact, though, is that the camera only captures 960 x 1080 interlaced. If you are watching on an LCD or plasma, that gets deinterlaced, giving you somewhere between 540 and 1080 lines depending on who you argue with. I actually think it is more like 540, since every other line is basically fake. So in terms of raw progressive pixel detail we only get 960 x 540, which isn't all that much higher than SD.

Just like Graeme said, I think the reason the Z1 looks great (and it does) is that even many of us pros are not used to seeing that much detail. Some people say the Z1 looks just as good as a CineAlta camera, but I do not agree; I think it looks that good to most of us because we can't really tell right now. I always use the example of when the Sony VX1000 came out. Before that, all we had to use were analog cameras, and most of us in that price range were using S-VHS. The VX1000, being digital, blew us away - all of a sudden that camera looked better than a 1/2" S-VHS camera. Today, looking back at the quality of the VX1000, we just think it's decent. A better example is the Canon XL1. The chips are smaller on that camera compared to other 3-chip cameras. When it first came out everybody loved the picture quality, and a lot of people still do. By today's standards, however, most of us can now see the softness and slight lack of detail of the Canon XL1.

The Canon XL1 is still SD DV, but it uses smaller chips - just like the Sony Z1 is still HD HDV and uses smaller chips. The difference is that we can tell the XL1 isn't as good as a 2/3" camera, while we have a much harder time telling the Z1 from a CineAlta. I do not think many of us are sensitive enough yet to tell 1280x720 images from 1920x1080 ones - unless, as Graeme pointed out, we are watching on an 80" TV. Of course, on a plasma or LCD we will hit the pixel limits of the display before we hit those of the format.
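
A quick back-of-the-envelope sketch of that pixel arithmetic (Python; the 960x540 entry is the pessimistic assumption above, not a measured camera figure):

```python
# Rough pixel counts for the formats being argued about. The 960x540
# line assumes the view that deinterlacing keeps only ~540 "real" lines;
# actual cameras and deinterlacers will vary.
formats = {
    "NTSC SD (720x480)":          720 * 480,
    "Z1 chips as progressive":    960 * 540,
    "720p frame (1280x720)":      1280 * 720,
    "HDV 1080 frame (1440x1080)": 1440 * 1080,
    "Full HD frame (1920x1080)":  1920 * 1080,
}
sd = formats["NTSC SD (720x480)"]
for name, px in formats.items():
    print(f"{name:28s} {px:9,d} px ({px / sd:.2f}x SD)")
```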

Color space and compression stand out more to me right now than resolution. Heck, I even like those EDTV sets; they actually look very nice at 42" even though they are only SD. This is why I always felt there should be a format that gives us 853 x 480 but with 4:4:4 color and very little compression. Right now, watching that kind of video on a 42" EDTV would be pretty good.

My whole point in this rambling post is that if we pros think a 960x1080i camera gives us HD that's just as good, what do the consumers of the world think? I mean, we only recently convinced them that DVD was worth the extra cost.

Graeme Nattress
April 26th, 2005, 09:19 AM
Well, I can see 4:4:4 1920x1080 on my LCD, so that's not a limiting factor. CRTs are now the limiting factor - they really "gloss over" details. I tell people about the 3D graphics I did for Panavision for NAB 2001: at the time I didn't have the rendering power to do 1920x1080, so I rendered them at 3/4 resolution. I could tell the difference in After Effects on my cinema display, but after going to HDCAM tape and being displayed on a CineAlta CRT monitor, you couldn't tell that I'd not rendered at full resolution. The CRT monitor also made it hard to tell the difference between keying live off the camera at 4:4:4 or 4:2:2 and keying off the HDCAM tape at 3:1:1. I could just about tell the difference, but I was watching the footage on a loop all day long....

Graeme

Barry Green
April 26th, 2005, 02:06 PM
That's why the adoption of HD will be driven by the home cinema crowd - those with the video projectors and big screens - and they want an HD disc they can view HD material with.
That and sports. Football and basketball really benefit from the HD experience.

Kevin Shaw
April 26th, 2005, 10:31 PM
Thomas: your numbers are questionable because (a) the Sony HDV cameras use pixel shift to generate 1440 horizontal pixels during recording, and (b) the issue of "faking" half the lines due to interlacing should be compared to SD cameras recording at a measly 480i resolution.

In any case, all the numbers are moot once you play HDV footage on an HDTV, and it's definitely good enough to get people's attention. I suspect it will be a very long time before most consumers can tell the difference between HDV footage and anything better on typical HDTVs, but they can all immediately see the difference between HDV and SD. That's what will make the HDV format a success, in spite of its limitations and competition from more expensive options.

Regarding the original subject of this thread, I doubt most consumers will be able to tell the difference between 1080i and 720p footage, especially as we move increasingly toward progressive-scan TVs.

Paul Rickford
April 27th, 2005, 04:14 AM
I have no idea of the resolution, or whether I am indeed watching 'true' 1080, but HDV taken on my FX1 and projected 7ft wide with a Sony HS20 LCD projector knocks my socks off! The detail and smooth colours make the image come alive - I can't stop watching it!
I'm now having a problem watching standard DV, which up to a few weeks ago I would have said looked OK; now it looks terrible. (As an aside, standard DV - or HDV converted to standard DV - from the FX1 does not hold up quite as well at this projected size as footage taken on my VX2100.)
So however those resolution figures may stack up, Sony have killed DV for me. As one of my friends said, 'WOW - MOVING SLIDES, how did you do that?'

Paul

Richard Firnges
April 29th, 2005, 04:43 AM
Hello,

this is a very good discussion, but I think one important point in this debate has not been mentioned yet. Most members seem to see the question of 1080i versus 720p only from the back of the camera and from the present situation. I do video only as a hobby; mainly I am a movie fan. And from that point of view I think at the moment 1080i is the better way to go.
Why? That is easy: for distributing existing movies I see no advantage in 60p (or 50p in PAL countries, where I live), since all existing movies are 24p (or 25p in PAL distribution). Especially in PAL countries, where no 2:3 pulldown for framerate conversion is needed, it is very easy to deinterlace the 50i to 25p because the source is progressive. So the bandwidth reserved for a 50p broadcast would be wasted on repeating each frame twice (or you broadcast 720p at 25p in PAL; I don't know how they will do it in 60 Hz countries, because as far as I know 2:3 pulldown works only with 60p or 60i).
From my own experience I know that interlacing has its problems, especially in editing. But if we go to 720p right now, I think a future migration to 1080p (at 50 or 60 frames) will never happen. If we go to 1080i now, however, a migration to 1080p will be feasible.
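
A minimal numpy sketch of why that recovery is so painless when the 50i stream genuinely carries a progressive source: the two fields of each frame simply weave back together, losslessly.

```python
import numpy as np

def weave_fields(top: np.ndarray, bottom: np.ndarray) -> np.ndarray:
    """Rebuild one progressive frame from the two fields of a 50i signal,
    assuming the source was 25p (progressive segmented frame)."""
    h2, w = top.shape
    frame = np.empty((h2 * 2, w), dtype=top.dtype)
    frame[0::2] = top       # even lines carried by the top field
    frame[1::2] = bottom    # odd lines carried by the bottom field
    return frame

# Round-trip check: split a progressive frame into fields and reassemble.
frame = np.random.randint(0, 256, size=(1080, 1920), dtype=np.uint8)
top, bottom = frame[0::2], frame[1::2]
assert np.array_equal(weave_fields(top, bottom), frame)  # lossless
```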

Greetings

Richard

Graeme Nattress
April 29th, 2005, 05:30 AM
But.... 24p is part of the HDTV specifications, is it not, so that a broadcaster could send out 24p as whole frames now and not have to add pulldown frames? I don't know if any do that, but it's part of the spec.

The problem is that 1080p, as currently done, is embedded as part of an interlaced 1080i signal. The chances are, therefore, that it has been filtered as any interlaced signal would be, and so has no more vertical resolution than 720p, even though it's carrying a progressive image.

1080p would be super - as would 4k in the home - but to see the full benefit of 1080p you'd need either a very large screen or to be sitting rather close to a normal-sized screen, and I don't think either is much of a reality for most of the population. It would be great for the home cinema fan, but that would be about it.

Graeme

Richard Firnges
April 29th, 2005, 06:24 AM
Hello Graeme,

of course 4K or even 2K would be better than 1K or 0.7K. But we have to keep in mind that once a standard is established, it will stay for some decades. So I favour a TV norm that has some room for further development, and that room I see in 1080i but not in 720p. 720p, as good as it might be in comparison to SD (PAL or NTSC), is in my eyes a little too fixed on today rather than on tomorrow. I'm quite sure that in only half a decade 1080p will be feasible.
Of course, it would be a poor design if displays didn't recognise a 25p signal embedded in a 50i format and do proper deinterlacing.

Richard

Graeme Nattress
April 29th, 2005, 06:54 AM
Well, all HD displays are progressive these days. Nobody is really making CRTs any more, and they can't display anywhere near HD resolution anyway. That's why any interlaced format is just plain daft. For interlace to work (read earlier in this thread), the image has to be filtered vertically, which means that even if you embed 25p in 50i it still won't have full 1080p vertical rez, only 1080i vertical rez - otherwise it would flicker on any 1080i display (which are still around). So it doesn't offer any more real vertical resolution than 720p. And because 720p is progressive, it compresses better than 1080i, and as it uses fewer pixels, for the same bandwidth you get a better overall picture. Also, 720p will, arguably, be no worse off than 1080i for an uprez to full 1080p.
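
The raw pixel rates behind that bandwidth point, as a rough sketch (full 1920-wide rasters assumed; HDV and HDCAM actually record 1440):

```python
# Pixel-rate comparison, 50 Hz case (the 60 Hz case scales identically).
# 1080i sends 50 fields/s of 1920x540; 720p sends 50 full frames/s.
rate_1080i = 1920 * 540 * 50    # ~51.8 Mpx/s, interlaced
rate_720p  = 1280 * 720 * 50    # ~46.1 Mpx/s, progressive

print(f"1080i50: {rate_1080i / 1e6:.1f} Mpx/s")
print(f"720p50:  {rate_720p / 1e6:.1f} Mpx/s")
# 1080i pushes ~12% more pixels per second, yet after the vertical
# filtering interlace needs, its usable vertical detail is comparable.
```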

Progressive really is the way to go, and if we have to accept the smaller "resolution" of 720p rather than the bigger number of 1080i, that's not going to mean worse pictures in the home, and it will not impede any move to full 1080p.

And yes, going 1080p from the get-go would be better still. But really, the problem is broadcasters, and the lack of quality in digital transmissions. There's very little point in "getting HDTV" if all you can see is macroblocks, quilting and mosquitos.

Graeme

Richard Firnges
April 29th, 2005, 07:56 AM
Hello Graeme,

to make it clear: I definitely prefer progressive over interlaced; that is beyond question. But for the MOMENT I see a better perspective for 1080i as an INTERMEDIATE step to 1080p.
When I compare DVB-S or DVB-T with DVDs, I definitely see large differences in picture quality; if you only count pixels, there should be none. Broadcasters tend to sacrifice datarate (quality) for more channels (they call it "content"). I have seen DVB-T broadcasts that had VCD quality. Even most DVDs you can buy are sloppily encoded, if you stretch SD to its limits. And that is the main reason I argue for a bigger format: providers (TV stations and disc publishers) tend to be sloppy, because sloppy usually means cheap. If you have a system that is actually too big for most uses, at least you as a consumer don't suffer too much from this sloppiness. If you watch a very good DVD and compare it with the usual TV broadcast (analog or digital), you will see a big difference. I think it will be the same with HDTV. Most of the people in this forum work with good equipment and try to produce good quality, but the question is what will be delivered to the consumer. As an engineer I know that it is easier to achieve "good" results in "rich" environments; to do the same thing within a smaller frame needs much more skill. For this reason, in consumer reality the difference between progressive and interlaced, 720p or 1080i, is more academic than meaningful. Of course there will be outstanding products, but the average will be 50% under the possible limit....
That may sound silly, but we need the bigger format to compensate for the sloppiness of the providers. Therefore I would prefer to wait until 1080p is feasible, or accept 1080i as an intermediate format that makes the migration to 1080p easier.

Richard

Graeme Nattress
April 29th, 2005, 08:41 AM
But 720p, being more compact (and 1080i having no more real vertical detail), does compress more easily than 1080i, and hence will provide a better picture in the home for the same data rate.

Indeed, a bigger format will mean more compression and less quality for the viewers at home. 1080i takes more data than 720p but does not offer any real, meaningful quality improvement, because interlace is a very sloppy form of compression.

Personally, I'd argue that using MPEG-2 you'd get a better image overall by giving 10 Mbps to an SD broadcast (anamorphic widescreen, of course) than twice that data rate (i.e. 20 Mbps) to a 1080 HD broadcast, and 90% of people - those who don't have an HDTV - would benefit from the improved SD picture. As you say, many SD digital broadcasts are VCD quality, and quite frankly that's not acceptable; by the same argument, for the vast majority of viewers an overcompressed HD image would be better replaced by improved SD broadcasts at full maximum DVD quality.
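
As a sketch of the per-pixel budgets those figures imply (PAL timings assumed):

```python
# Bits per pixel for the two alternatives described above (MPEG-2, PAL).
sd_rate, sd_px = 10e6, 720 * 576 * 25     # anamorphic SD at 10 Mbps
hd_rate, hd_px = 20e6, 1920 * 540 * 50    # 1080i50 at 20 Mbps

print(f"SD:    {sd_rate / sd_px:.2f} bits/pixel")   # ~0.96
print(f"1080i: {hd_rate / hd_px:.2f} bits/pixel")   # ~0.39
# The SD service gets well over twice the bit budget per pixel.
```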

Graeme

Barry Green
April 29th, 2005, 06:03 PM
But.... 24p is part of the HDTV specifications, is it not, so that a broadcaster could send out 24p as whole frames now and not have to add pulldown frames?
720/24p and 1080/24p are both accepted broadcast standards as defined by the ATSC.

Steve Crisdale
April 29th, 2005, 07:14 PM
Personally, I'd argue that using MPEG-2 you'd get a better image overall by giving 10 Mbps to an SD broadcast (anamorphic widescreen, of course) than twice that data rate (i.e. 20 Mbps) to a 1080 HD broadcast, and 90% of people - those who don't have an HDTV - would benefit from the improved SD picture. As you say, many SD digital broadcasts are VCD quality, and quite frankly that's not acceptable; by the same argument, for the vast majority of viewers an overcompressed HD image would be better replaced by improved SD broadcasts at full maximum DVD quality.

Graeme

That's the dilemma that Australia now faces. We have a current Parliamentary committee looking into the 'Slow uptake of Digital TV'.

It's becoming very clear that if the broadcasters - whatever reason ends up being used to justify their actions - degrade the data rate to the point where consumers (not aficionados) can't see a verifiable advantage to DTV, let alone HDTV, purchases of 'old technology' display units will continue.

Without display units capable of showing even 1080i in the majority of a nation's homes, why would any DTV provider worry about pandering to the outcries of the early adopters - the precious few who can perceive how poor the service has become? Early HD broadcasts here in Oz looked amazing, because the signal stream was being pumped much closer to the full data rate... but that has gradually changed, with piggy-backed program streams, and data streams and interactivity for those with the appropriate STB and hand controller... Bandwidth, after all, is something the broadcasters have paid for, so they need a commercial return.

The average Joe out there doesn't give a deuce of rat's droppings about how good the game looks on the analog set that he "ain't gonna sell 'cause some smarmy suited sales guy tells me this is better than what you got at the moment... sir". He does want to see all the news and sports results running across the bottom or top of the screen, though... so the broadcaster provides that service, and as part of the bandwidth allocation that extra stream has to come from somewhere.

Strangely enough, advertisers want to put their money where the greatest return is, and while we HD early adopters may honestly believe that's us... I've got news: it ain't us. Until the percentage of TV-owning homes with HDTV displays is around the 40-50% mark, HD technology advances will most likely tread water or arrive incrementally. 1080p would be one of the last elements to be implemented, and its viability for the majority of TV-viewing households, even if and when 1080i capability reaches saturation, is even more doubtful.

The fact is... some parts of the HD delivery chain are far more advanced than others, and that accentuates the gap: links like our HDV cameras - in capability, cost, options and availability - represent HD acceptance and outcomes beyond expectation, while the uptake of DTV and HDTV display devices represents the complete converse.

Until the 'average' consumer has an HDTV set up in their lounge, those of us who argue the advantage of 720p over 1080i and vice versa will provide entertainment value for visitors to the forum; but it'll be about as relevant to reality as most political debate, i.e. diddly-squat.

I believe those who presently own an HDV camcorder have an opportunity to circumvent at least one of the major excuses being thrown about for the poor picture quality and data rates of Aussie DTV/HDTV. "Not enough HD content" is the most commonly trumpeted anthem. Well, if the broadcasters have the content because HD10U or FX1/Z1 shooters are providing it, they ain't got no excuse.

So rather than huff and puff about whether 720p or 1080i is better... how about shooting HDV, regardless of whether it's 1080 or 720, and getting the stuff to a broad-bloody-caster - maybe then the HD revolution can truly be ascribed to the abilities of those who realised its promise. Nah... what am I thinking!! George Lucas, or maybe Jimmy Cameron, deserves the sole credit for the HD viewing revolution... because they're visionaries!!!

Richard Firnges
May 2nd, 2005, 02:41 AM
Of course, at the moment 720p (especially at 24 or 25 fps for movie broadcast) is more feasible than 1080 (i or p). Take MPEG-2 PAL DVDs, for example: you need bitrates between 6 and 8 Mbps (without sound) for the best quality, so the bitrate of the FX1, which works at 25 Mbps, is in the same area. For full 1080 at 25 fps you therefore need at least 30 Mbps of bandwidth (or more); for 720 at 25p (not 50p!) the equivalent would be 15 Mbps. So from this standpoint you are probably getting a better overall picture with 720 if you only consider broadcast bandwidth. But there will also be disc storage like Blu-ray, and on that kind of media the storage of 1080 movies is doable.
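
Those estimates can be sanity-checked by holding the bits-per-pixel of a top-quality DVD constant and scaling to the HD rasters - a rough sketch, assuming PAL timings:

```python
# Scale a "best quality" 7 Mbps PAL DVD to HD by constant bits/pixel.
bpp = 7e6 / (720 * 576 * 25)                 # ~0.68 bits/pixel

for name, w, h in [("720/25p", 1280, 720), ("1080/25p", 1920, 1080)]:
    print(f"{name}: ~{bpp * w * h * 25 / 1e6:.0f} Mbps at DVD quality")
# ~16 Mbps for 720/25p and ~35 Mbps for 1080/25p, in line with the
# 15 and 30+ Mbps estimates above.
```
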
I myself doubt that TV stations are really willing to provide quality (that's why I gave up watching movies on TV) - not even in 720p. They don't do it today in SD, so why should they do it in HDTV? In comparison, their "HDTV" probably wouldn't be much better than the best anamorphic SD from DVDs. So from a practical point of view, broadcast TV will always be inferior to discs (or whatever the medium will be). Of course, it will be better than broadcast SD....

But on discs we can have, in the foreseeable future, enough storage room for the high bitrates. Downconverting is always easier than the other way around, so we should aim now for the larger picture. With a large enough frame and a high enough bitrate we would have a system that doesn't operate at its limits. And we should never forget that all existing movies are 24p, so 720p at 50 or 60 fps is not the optimum, since the higher time resolution is not needed.

Greetings Richard

Shannon Rawls
May 2nd, 2005, 08:53 AM
the HVX200 should have eliminated this discussion.

*smile*

- Shannon W. Rawls

Steve Crisdale
May 2nd, 2005, 05:32 PM
the HVX200 should have eliminated this discussion.

*smile*

- Shannon W. Rawls

Are you saying, Shannon, that the HVX200 would have settled the argument over which is - beyond any doubt, reasoning or blind faith - the superior format?

Isn't that a little like saying science should have settled the discussion on evolution? ;)

Besides, you know how much fun it is to watch all the different reactions to something that gets posted!! :)

If there weren't followers of either camp... imagine how dull it would be around here!!

Shannon Rawls
May 2nd, 2005, 05:36 PM
Besides, you know how much fun it is to watch all the different reactions to something that gets posted!! :)

HA!! ain't that the truth! *smile*

I'm sayin the HVX200 will allow people to have the best of both worlds. *CHEESE*

Mercedes Benz or BMW???

Nobody could decide which was better, so I got a Lexus. *smile*

MAC or PC???

Nobody could decide which was better, so I got em both!

720p or 1080i????

heck...just buy a HVX200!!!!

*smile*

- Shannon W. Rawls

Tommy James
May 9th, 2005, 12:11 PM
JVC knew, when they introduced the world's first consumer high-definition video camera, that there would be naysayers claiming 720p is not real high definition. So at the same time they introduced their line of 1500i super-high-definition televisions that upconvert all signals, 720p and 1080i alike, to 1500i. Note that this became the perfect complement to JVC HD camcorders, because 1440i is the native upconversion of 720p.

Javier Gallen
July 25th, 2005, 05:30 PM
The fact is that if you grab a frame from a movie on a 1080i HDTV, downsize it to 720p, and then resize it back up to 1080i, the result is two IDENTICAL images.

That makes me think...
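
That round trip is easy to try at home - a minimal Pillow sketch, where "frame_1080.png" stands in for a frame grab of your own:

```python
# Round-trip test: 1080 -> 720 -> 1080, then measure the difference.
from PIL import Image, ImageChops

original = Image.open("frame_1080.png").convert("RGB")  # hypothetical grab
down = original.resize((1280, 720), Image.LANCZOS)      # 1080 -> 720
back = down.resize((1920, 1080), Image.LANCZOS)         # 720 -> 1080

diff = ImageChops.difference(original, back)
print("max per-channel error:", [hi for (lo, hi) in diff.getextrema()])
# If little detail above ~720 lines survived the broadcast chain, the
# difference is near zero and the two frames look identical, as described.
```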

Ken Hodson
July 25th, 2005, 06:26 PM
Well, 1080 is a bit of an anomaly in that, except for the most recent CineAltas, no cam was actually capturing that resolution, so choosing it as a broadcast standard makes little sense to me. As well, all but the most recent consumer PCs have had a hard time even working with footage of that resolution. In fact, if it weren't for the archaic adoption of interlace, this format would still be largely unusable even now.
It's 2005 and we still have people trumpeting an interlaced format? Ya got me!

Tommy James
July 25th, 2005, 08:09 PM
The reason Sony promotes 1080i is that 1080i is a bigger number than 720p, and bigger numbers sell more cameras. Sony's support of 1080i has nothing to do with picture quality. So far it has been a very successful strategy, but Sony knows it won't be able to get away with interlace forever, so soon it will start promoting 1080p.

Javier Gallen
July 26th, 2005, 04:20 AM
The reason Sony promotes 1080i is that 1080i is a bigger number than 720p, and bigger numbers sell more cameras. Sony's support of 1080i has nothing to do with picture quality. So far it has been a very successful strategy, but Sony knows it won't be able to get away with interlace forever, so soon it will start promoting 1080p.
Good point.

Until we have 1080p, I'll be more interested in 720p.

Thomas Smet
July 26th, 2005, 05:55 AM
OK guys, I am doing a test right now to compare 720p and 1080i. This test is in no way meant to show what to expect from specific cameras, but to give a general idea of the detail in each format.

I created a scene in 3D Studio Max with objects of fairly high detail. I then rendered it as 1280x720 uncompressed, and again as 1440x1080 interlaced uncompressed.

I next brought those images into Shake to compare the scaling results. What I have found so far is what Graeme has been trying to say: there is almost no difference in detail between 1280x720 scaled up to 1440x1080p and 1440x1080i deinterlaced to 1080p. Depending on how you deinterlace and scale, the 720p image is actually a little sharper. The 720p also has the advantage of not having any aliased edges from deinterlacing. Even with the sharpness the same, the 1080i version has blocky edges on thin details, which, at least to me, makes the 720p look slightly better overall. They are, however, very close, and only the most anal engineer would be able to tell; it may be even harder to tell on an actual TV. For film-out, however, those steppy edges could show up a little more on the 1080i, where 720p gives a clean, even image.

At this point I would say the "only" advantage 1080i has over 720p HDV is the 60-fields-per-second motion. If your target is film or simulated film, then this doesn't matter.

This test of course does not take into consideration lens quality and other factors that degrade the image coming off a camera. If a 1/3" lens can't resolve more than 1280x720 anyway, then the results of this test would favor 720p even more. "If" the lens on a specific camera actually can deal with the native resolution of that camera, then this test still shows there isn't much difference between the two current formats.

I will try to post the results of this test later. I am currently working on a 3D resolution chart to render out in the different formats.
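
For anyone who wants to reproduce the comparison outside Shake, here is a rough numpy/Pillow sketch of the two pipelines (simplified to square-pixel 1920x1080, with a plain "bob" standing in for Shake's deinterlacers):

```python
import numpy as np
from PIL import Image

def upscale_720p(frame: Image.Image) -> Image.Image:
    # Path A: plain resize of a progressive 720p frame up to 1080p.
    return frame.resize((1920, 1080), Image.LANCZOS)

def bob_1080i(frame: Image.Image) -> Image.Image:
    # Path B: keep the top field (540 lines), interpolate back to 1080.
    top_field = np.asarray(frame)[0::2]
    return Image.fromarray(top_field).resize((1920, 1080), Image.LANCZOS)

def mse(a: Image.Image, b: Image.Image) -> float:
    # Mean squared error against a common 1080p reference render.
    return float(np.mean((np.asarray(a, np.float32) -
                          np.asarray(b, np.float32)) ** 2))
```

Feeding both paths renders of the same scene and scoring each against a true 1080p reference with mse() gives one crude number for the difference being eyeballed here.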

Kevin Dooley
July 26th, 2005, 06:43 AM
At this point I would say the "only" advantage 1080i has over 720p HDV is the 60-fields-per-second motion.

While it has yet to be implemented in an HDV product (other than through the new JVC's uncompressed outputs), 720/60p is a recognized standard for HD and, more specifically, is called for in the HDV specs... this negates 1080i's advantage, as 720/60p should be as sharp as (or sharper than) 1080/60i for the reasons you've already stated....

Mark Grant
July 26th, 2005, 07:53 AM
There is almost no difference in detail between 1280x720 scaled up to 1440x1080p and 1440x1080i deinterlaced to 1080p.

Good for you. Now, rather than throwing away half of the resolution of your 1080i footage, try watching it on a 1080i display, or deinterlace the fields and play them at twice the frame rate.

Why would anyone in their right mind choose to deinterlace 1080i footage just to make 720p look better? 1080i/30 vs 720p/30 will give you 70% more pixels on an interlaced display, or 85% of the pixels _and twice the frame rate_ on a progressive display.

About the only reason I can see to deinterlace would be to output to film. So why are you even comparing it that way?
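
The arithmetic behind those percentages, assuming the 1440x1080 HDV raster:

```python
# Pixel counts per delivered image for 1080i vs 720p.
frame_1080i = 1440 * 1080     # one interlaced frame (both fields)
frame_720p  = 1280 * 720      # one progressive frame
field_1080i = 1440 * 540      # one field, as shown on a progressive set

print(f"{frame_1080i / frame_720p:.0%}")   # ~169%, i.e. ~70% more pixels
print(f"{field_1080i / frame_720p:.0%}")   # ~84%, at twice the image rate
```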

Thomas Smet
July 26th, 2005, 08:23 AM
Yes, film-out is exactly right. My whole point, along with a good portion of this thread, is comparing 720p to 1080i in terms of getting to 1080p.

I was not saying that 1080i is bad, but rather that they are both just as good as each other. I even stated that if you like the higher-framerate motion of 60i, then 1080i has the clear advantage. If, however, you prefer progressive footage, then 720p may have the advantage.

1080i also gives you the option of creating 720p at 60p. That, however, doesn't mean anything if you are shooting 24p; there is no such thing as 48p as a double-rate version of 24p.


It is going to be tough, in my opinion, to get a 60p HDV format as we know it right now. To jump from 30p to 60p would require double the bandwidth - a data rate of at least 38 Mbit/s if you wanted to keep the same level of quality as 30p. Current DV tapes would have a hard time dealing with 38 to 50 Mbit/s. The only ways I could see a 60p HDV version working would be hard-drive-only recording, a new tape format, or double the compression level. Those extra 30 frames per second have to go somewhere, and bumping up to 25 Mbit/s for 60p would not, in my opinion, be enough to handle double the data.

Greg Boston
July 26th, 2005, 08:49 AM
It is going to be tough, in my opinion, to get a 60p HDV format as we know it right now. To jump from 30p to 60p would require double the bandwidth - a data rate of at least 38 Mbit/s if you wanted to keep the same level of quality as 30p. Current DV tapes would have a hard time dealing with 38 to 50 Mbit/s. The only ways I could see a 60p HDV version working would be hard-drive-only recording, a new tape format, or double the compression level. Those extra 30 frames per second have to go somewhere, and bumping up to 25 Mbit/s for 60p would not, in my opinion, be enough to handle double the data.

You forgot to mention P2 technology - writing to flash memory in a RAID configuration - which could very well give 60p capability in the not-too-distant future. Where technology is concerned, "never say never" is my mantra. After all, there were many in the early 1980s who said we would 'never' be able to go faster than 1200 baud on standard copper phone lines, and yet I am sitting here typing over a 1.5 Mbps up / 3 Mbps+ down phone line. My line has been tested and could handle close to 6 Mbps.

Just food for thought.

-gb-

Thomas Smet
July 26th, 2005, 11:58 AM
I was using the term hard drive to mean any type of recording other than tape. This could include P2, hard drives, optical media, or direct capture over FireWire.

My main point is that with tape itself it will be hard to get 60p. It can happen; it just will not be as easy.

Greg Boston
July 26th, 2005, 12:09 PM
I was using the term hard drive to mean any type of recording other than tape. This could include P2, hard drives, optical media, or direct capture over FireWire.

My main point is that with tape itself it will be hard to get 60p. It can happen; it just will not be as easy.

Gotcha! Just one of those little communication glitches - I thought you literally meant hard drives only. Yeah, for now, tape is pretty much out of the question until they figure out a way to make head gap smaller and/or tape transport faster.

-gb-

Ken Hodson
July 26th, 2005, 12:15 PM
Thomas, good point about the lens.
Are we to believe that companies are somehow using lenses twice as good to match the resolution? My belief is that JVC struggles to provide an adequate lens for 720p, never mind trying to fit a prosumer cam with a 1080i lens; the costs just don't work out.
Sony has taken the Intel route: speed sells, or at least bigger numbers do, regardless of whether your prosumer cam even comes close to capturing half of the advertised 1080i. Logic would say that a lens capable of that should cost more than the cam! I have no respect for a company that pixel-shifts and then up-rezzes a 960x1080 capture to 1080 strictly for marketing reasons. Although it is still a mighty nice cam ;>)

Ken Hodson
July 26th, 2005, 12:34 PM
" tape is pretty much out of the question until they figure out a way to make head gap smaller and/or tape transport faster."

I think they will just double the capture rate to 38/50 Mbps, like DV50 - old tech for new HDV. Combine that with 2/3" chips and there will be no need to buy a Varicam. Heck, it will be sucking away some of the CineAlta's market.

Barry Green
July 26th, 2005, 12:36 PM
It is going to be tough in my opinion to get a 60p HDV format as we know it right now. To jump from 30p to 60p would require double the bandwidth.
Or double the compression. The HDV spec calls for 60p while still maintaining the 19 Mbps data rate. Actually, if you want to get technical, the new 24p HDV mode of the HD100 *is* a 60p data rate - it's 24p with 2-3 frame duplication carried within a 60p data stream, at 19 Mbps.
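
That 2-3 duplication cadence is easy to picture - a minimal sketch:

```python
# 24 source frames become 60 stream frames by writing each source frame
# alternately twice and three times (the 2-3 duplication cadence).
def pulldown_24_to_60(frames):
    out, repeat = [], 2
    for frame in frames:
        out.extend([frame] * repeat)
        repeat = 5 - repeat          # 2, 3, 2, 3, ...
    return out

assert len(pulldown_24_to_60(range(24))) == 60   # 12 x (2 + 3) = 60
```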

The only way I could see a 60p HDV version would either be to hard drive only, new tape format, or double the compression level.
Yep - double the compression level. And we'll see it implemented next year, according to JVC's schedule -- they say June 2006 for the GY-HD7000U, the 2/3" 3-CMOS camera. It's listed as supporting 720/50p and 720/60p HDV.

Thomas Smet
July 26th, 2005, 01:30 PM
Barry, are you sure the 24p will be carried in a 60p stream? I thought the MPEG-2 could be written to tape as 24p. Although this does make sense, since that is how the analog output would work. So that might mean 24p isn't slightly better in quality than 30p, but may be slightly worse.

I always thought that for the HD7000U they would actually raise the bitrate and not the compression. If they just raise the compression, it is going to be hard to sell this as a high-end $27,000 camera.

If they bumped the data rate up to 25 Mbit/s, that would give us around the compression of a 4.7 Mb/s DVD, which is OK but not perfect. If it stays at 19.7 Mb/s to keep the uncompressed audio track, we are looking at the equivalent of a 3.7 Mb/s DVD. Yuck!
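
Those DVD equivalences follow from holding bits-per-pixel constant - a quick check, assuming 720/60p and the NTSC DVD raster:

```python
# Translate a 720/60p bitrate into its bits-per-pixel-equivalent DVD rate.
hdv_px = 1280 * 720 * 60          # 720/60p pixel rate
dvd_px = 720 * 480 * 30           # NTSC DVD pixel rate (approx.)

for rate in (25e6, 19.7e6):
    dvd_equiv = rate / hdv_px * dvd_px
    print(f"{rate / 1e6:.1f} Mbps 60p ~= {dvd_equiv / 1e6:.1f} Mbps DVD")
# ~4.7 and ~3.7 Mbps respectively, matching the figures above.
```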

Could they maybe run the tape to cover 1.5x the area, as with DVCAM, to get 37.5 Mb/s with only 40 minutes of recording time?

Barry Green
July 26th, 2005, 02:05 PM
Barry, are you sure the 24p will be carried in a 60p stream? I thought the MPEG-2 could be written to tape as 24p.
The HDV specification doesn't allow for 24p. HDV 720p modes are 25p, 30p, 50p and 60p. So to do 24p and still call it "HDV", they'd have to carry the 24p within a 60p stream using 2-3 frame duplication.

Also, it is my understanding that the Lumiere folks have footage on tape from the HD100, and that they've said that it is indeed a 60p data stream.

So that might mean 24p isn't slightly better in quality than 30p, but may be slightly worse.
Not necessarily. MPEG-2 can be famously efficient with duplicate frames. It's possible that the totally-duplicate frames may not impact the compression one bit (well, okay, they have to take up at least one "bit"!) ;)

But because the frames are exact duplicates, I expect that MPEG-2 will be extremely efficient, and I would expect that the 24p would actually have more bits to spread around to the actual frames than even 30p does.

I always thought that for the HD7000U they would actually raise the bitrate and not the compression.
They may. But if they do, it won't be HDV. And the specs on their website say "HDV recording" and list 60p.

They do have plans for ProHD XE, which includes a "higher bitrate". But that's not HDV; that's a new format. HDV 720p is defined as 19 megabits at either 25p, 30p, 50p or 60p. Anything outside that would be, by definition, outside the format.

The JVC specs do list 1080i recording, but there it does *not* say HDV; it says "MPEG-2". So perhaps that's where the higher-bitrate format would be used?

If they just raise the compression, it is going to be hard to sell this as a high-end $27,000 camera.
It's a year away, and anything can change between now and then. A simple terminology change on the spec sheet could make all the difference (i.e., they could change the wording to say "HDV: 720/25p, 720/30p; ProHD XE: 720/50p, 720/60p"). I wouldn't read too much into it at this point.

Could they maybe run the tape to cover 1.5x the area, as with DVCAM, to get 37.5 Mb/s with only 40 minutes of recording time?
Not and still call it HDV. That would be a new format (just as DVCAM is a different format from DV). So if they change the name of the format, perhaps that would be possible - and they do plan on ProHD XE. But if they do that, it will not be HDV; it wouldn't be compatible with anything bearing an HDV logo, so they'd have to make a new format and call it something new. And that's not what the currently existing spec sheet says, that's all I'm saying. As it's worded now, they're saying 60p in HDV, and that means 19 Mbps.

Greg Boston
July 26th, 2005, 02:20 PM
Barry,

Isn't that why JVC is calling their format 'ProHD' - so that they can work around the HDV spec limitations? I was thinking the same of Panasonic, but they are just bringing a high-end camera format down to an entry-level camera with P2.

Just wondering...

-gb-

Barry Green
July 26th, 2005, 02:39 PM
Originally that's what we thought -- that ProHD was a new format. But then Dave Walton clarified and said that ProHD is *not* a format, that the format is indeed HDV.

ProHD is their take on the idea that they're making professional gear that uses HDV - pretty much the same idea as behind the JVC DV500, the first in their "Professional DV" lineup. It still used regular DV; it wasn't a new format, but they gave it a different name to differentiate it from their consumer gear.

I believe that's the same thing they're doing here. The ProHD name is a name for the product line, not a different format.

Thomas Smet
July 26th, 2005, 03:53 PM
Thanks for the data Barry.

Do you know how much more of a challenge it is going to be to capture the 24p video? I did not notice whether any current HDV tool can capture and remove duplicate frames on the fly. I did notice that CineForm mentioned support for the HD100 when it comes out, but that's about it.

Any editing tool will have to not only be able to pull out the duplicate frames but also add them back in to record back to tape.

Barry Green
July 26th, 2005, 04:00 PM
I'm certain that all the HDV editors will need an update, but I'm also pretty sure it'll be minor. Lumiere is apparently already at work on it, and I'm sure CineForm will be able to implement it quickly.

Once they update it, Vegas will probably do it on the fly, like they do with 2-3-3-2 and 2-3 DV footage, so the user probably won't even notice that anything happened.

Thomas Smet
July 26th, 2005, 05:44 PM
If ProHD is not any different, how can they add a second set of uncompressed audio? Or was that always in the 720p specs and they just finally made a camera that can do it?

Barry Green
July 26th, 2005, 07:24 PM
They haven't. The HD100 only records two tracks of compressed audio. There is no provision for 4 tracks on the HD100.

Four tracks were talked about as a possible add-on to ProHD XE, but they are not part of ProHD or HDV.

Greg Gelber
August 2nd, 2005, 09:26 AM
Pushing the boundaries of compression just to promote 1080i, when you are capturing an image that wouldn't fill 720p, only sells cams to the sheep who need big numbers; it isn't an advancement.

WOW!!! You just sealed it for me. My line of freelance work doesn't and never will involve sending shows out for broadcast air or film premieres, so compared to my SD projects I will be very happy with 720p. Thanks...

Douglas Spotted Eagle
August 2nd, 2005, 09:34 AM
WOW!!! You just sealed it for me. My line of freelance work doesn't and never will involve sending shows out for broadcast air or film premieres, so compared to my SD projects I will be very happy with 720p. Thanks...

Not to continue what has already been an impassioned debate....
But if you don't like the upsample of DV to 720, wait'll you upsample 720p to 1080p. 1080i looks a lot nicer when resampled/deinterlaced to 1080p than 720p does, and while there are those saying we won't see 1080p for a long time... the displays have already started shipping, and so has the hardware. We're not that far off.
So for the short term you'll likely be happy with 720p, but the grail is 1080p(60).

David Kennett
August 2nd, 2005, 03:48 PM
Not to continue what has already been an impassioned debate....
But if you don't like the upsample of DV to 720, wait'll you upsample 720p to 1080p. 1080i looks a lot nicer when resampled/deinterlaced to 1080p than 720p does, and while there are those saying we won't see 1080p for a long time... the displays have already started shipping, and so has the hardware. We're not that far off.
So for the short term you'll likely be happy with 720p, but the grail is 1080p(60).

I agree 1080p60 would be great (what's not to like?). I would just be surprised if cable, broadcast, satellite - whatever - will be willing to "spend" the additional data rate (double that of 1080i30 or 720p60) to deliver it. I know compression is getting better, but there will always be pressure to deliver an optimum picture at the lowest data rate.

If you have read some of my posts elsewhere, you know that my logic follows thus:
1. CRT is the only really "true" interlaced display.
2. CRTs are disappearing.
3. Why create a picture in a format which MUST be converted to display it?

I would like to observe the comparison you make between 720p60 and 1080i30 converted to 1080p60. Such a test would require the highest-quality source material (we're not judging the relative merits of different cameras or display technologies, remember).

I would be willing to bet that in 10 years virtually all of the display devices in people's homes will be 1280x720 or less. The CRT will have pretty much disappeared. It is possible that 1080p60 could be delivered on blue-laser DVD to a somewhat limited audience (if they've decided by then!). An even higher resolution could be developed for theater use - perhaps the final nail in film's coffin.

But then that's just what I think will happen.

Thomas Smet
August 2nd, 2005, 04:24 PM
Spot, I actually do not agree that 1080i going to 1080p will look much better than 720p to 1080p.

Since it isn't very easy to compare the camera formats directly right now, I have been doing some testing in 3D Studio Max to compare quality between 720p and 1080i converted to 1080p, and between 720p and 1080i converted to 720p.

I have so far tried 4 different deinterlacing methods, and for the most part there is very little to no quality difference between the two. I have rendered with many different antialiasing filters to try to better approximate what a camera would give us. Even using super-sharp 3D-rendered images there is little to no change, and the more camera-like the images get in terms of softness, the closer the two formats become. If the aim is 24p or 30p there is very little difference; if 60-frame motion is needed, then 1080i does have the advantage.

720p scaled up can get just a tiny bit softer but with very smooth results across the whole image.

1080i deinterlaced can be a tiny bit sharper, but with either jagged edges or missing pixels on tiny objects and objects at a slight angle. If you use interpolation when you deinterlace, the 1080i can get just as soft as the 720p (it depends on which software you use).
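
For reference, two of the classic approaches in minimal numpy form (only a sketch; production deinterlacers are motion-adaptive and far smarter):

```python
# "weave" keeps both fields interleaved (sharp, but combs on motion);
# "bob" keeps one field and interpolates the missing lines (smooth, softer).
import numpy as np

def weave(frame: np.ndarray) -> np.ndarray:
    return frame.copy()                      # fields left as recorded

def bob(frame: np.ndarray) -> np.ndarray:
    out = frame.astype(np.float32)
    # Overwrite bottom-field lines with the average of their neighbours
    # (edge rows left untouched for simplicity).
    out[1:-1:2] = (out[:-2:2] + out[2::2]) / 2
    return out.astype(frame.dtype)
```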

I am testing the images in Shake with 6 different scaling methods and 4 different deinterlacing methods.

I agree that 1080p would look better but 1080i is not in my opinion much better than 720p.

I will try and post some of my results in a few days. I want to study this a little bit further first.

Joe Carney
August 2nd, 2005, 04:47 PM
It will be a while before 1080p is broadcast, but don't forget the home theater market, where you won't be limited by your local cable company. 1080 at 24p and 48p is already available on the high end, and just like technology in the past, it's only a matter of time before it's affordable for a sizeable minority. Since Blu-ray supports 1080p, I'd say give it a couple of years. But the time to plan is now. I'm hoping the technology to uprez 720p with great quality is around soon too.

Barry Green
August 2nd, 2005, 05:08 PM
It will be a while before 1080p is broadcast
1080/24p or 1080/30p could be broadcast today, if broadcasters wanted to. Those are ATSC-sanctioned broadcast standards, with the same (or lower) bandwidth requirements as 1080/60i.

The question mark seems to be about 1080/60p, which would seem to be years away from being broadcast, if ever. 1080/60p is not in the ATSC specifications, so it wouldn't work with any ATSC-compatible tuners.

Douglas Spotted Eagle
August 2nd, 2005, 05:22 PM
I would be willing to bet that in 10 years virtually all of the display devices in people's homes will be 1280x720 or less. The CRT will have pretty much disappeared. It is possible that 1080p60 could be delivered on blue-laser DVD to a somewhat limited audience (if they've decided by then!). But then that's just what I think will happen.

I'll take that bet, gladly - I just hope I'm around in 10 years. I'll go so far as to suggest that it will take less than 10. The first 30 shipments of 1080p displays were gone before they even hit the ground, if you accept what Jon Peddie Research has to say, along with most of the other research/advisory firms. I know that I went to B&H a month back to see the one model they had on display. They'd sold it.

Tom,
My point isn't that 1080i is better than 720p. It is indeed sharper for fast motion, but it has its own issues as well. However, I *have* taken 720p and upsampled it to 1080p, and taken 1080i and cross-sampled it to 1080p (30), and viewed both on large and small LCDs and on a Sony Qualia projector. 1080i converted to 1080p definitely looks better than 720p upscaled.
I have in my hands now a Grass Valley LDK 6000, courtesy of Lonnie Bates at the national repair center for Grass Valley. We've been doing some fun tests with it, based more around color than anything else. Calibrating these cams is definitely a learning experience. That said, I'm not even using one of the Sony cams to make my comparisons; I'm using the front end of a ViperCam.

Thomas Smet
August 2nd, 2005, 05:38 PM
Spot, I would love to compare results as soon as I finish my tests. I am sure using a high-end camera can give different results from what I am doing. I will try to post my results in a few days.

What methods are you using to deinterlace? I will agree that a really good (but slow) deinterlacer will give good results depending on how much motion there is, but most people would never think of using something like that except on short projects. I wouldn't want to do it on a two-hour, three-camera concert, for example.

I also wasn't saying one was better than the other; I actually think they are pretty much the same. I agree with you that the 1080i is a little bit sharper, but with some badly aliased edges; 720p is a little softer, but a nice smooth overall image. I guess it all depends on how sharp you like your footage.

I will try to get the results up soon.

Bill Binder
August 10th, 2005, 11:59 PM
Since after seven pages of responses no one has said it yet: this kind of topic has been hammered to death for YEARS on the AVS forums - not so much from the perspective of recording hardware, but plenty from the perspective of display technology and what "looks" better. So if you've never checked out the AVS forums and you're interested in HDTV, go take a look... (BTW, I have no relationship with them at all, but it's a great forum to check out.)