View Full Version : 720 vs. 1080 debate



Thomas Smet
March 29th, 2005, 07:59 PM
A lot of people are busy comparing cameras that either are or are not a reality at this point. How about the two different HD formats?

I haven't seen very many debates over which one people like better for certain projects.

Seeing as everybody is picking the heck out of the cameras themselves it would be nice to get some debate over which is better.

I personally love progressive video but hate the fact that 720p really isn't that much larger than 480p. Considering that some hardcore film people think even 1080p HD still isn't good enough for film, 720p would be even worse.

1080i gives great resolution for TV, but when deinterlaced it has detail close to 720p.

For film, would the extra horizontal resolution of an interpolated 1080i source be better than a clean, lower-resolution 720p source?

At first I was very excited about the new Panasonic HD camera but then I cooled down a bit. 720p is much better than 480p, but I always thought of 720p/24p for film work as the poor man's HD format. I am in no way knocking 720p, but the fact is that 720p's main advantage over 1080i HD video is the increased frame rate of 60p. This gives ultra-realistic motion. If you were to shoot both 1080i/p and 720p at 24 fps, 720p would be much lower in terms of quality.

When it comes to cameras they are just tools to me and it doesn't matter which one I have. I know at the end of the day no matter which one I use I could tell the same story. The thing that has me stuck between HD cameras however is the HD format itself.

Graeme Nattress
March 29th, 2005, 09:45 PM
Interesting, and there's no "right" answer. Personally, I see interlaced video as a rather outdated form of compression that, since all modern display devices are inherently progressive, should be left in the past.

However, treating interlace as compression and good de-interlacing as decompression, 1080i60 will give you a reasonably nice 1080p30, retaining somewhere between 50% and 70% of the resolution a natively progressive 1080p30 camera would capture. So you would get 50% greater horizontal resolution than 720p30, but that would be your only real advantage.
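
To put rough numbers on that trade-off, here's a quick Python sketch (purely illustrative; the 50-70% retention range is the estimate above, not a measured figure):

def effective_pixels(width, height, vertical_retention=1.0):
    # Approximate "useful" pixel count per frame after de-interlacing losses.
    return int(width * height * vertical_retention)

p720_native = effective_pixels(1280, 720)            # native progressive frame
p1080_low   = effective_pixels(1920, 1080, 0.5)      # pessimistic de-interlace
p1080_high  = effective_pixels(1920, 1080, 0.7)      # optimistic de-interlace

print(f"720p native:          {p720_native:,} pixels per frame")
print(f"1080i de-interlaced:  {p1080_low:,} to {p1080_high:,} pixels per frame")
print(f"Horizontal advantage: {1920 / 1280:.2f}x")   # the "50% greater" figure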

But, 720p should compress better for a given data rate than 1080i60, so there could be an advantage there...

And then it comes down to the optics and DSP etc. of the camera, its professional controls and such.

Sounds like a fun decision to make!

Graeme

Shannon Rawls
March 29th, 2005, 10:28 PM
HORIZONTAL RESOLUTION.....

ya know, nobody ever talks about Horizontal Resolution...

1440 pixels from left to right.....is a lot of freakin' pixels!

Graeme, how important is that to the eyeball? Do we see vertical information more than horizontal information? And is that why we only talk about the ups and downs?

I just bought a Dell 19" LCD HD monitor today to use in the field. It boasts 1280 x 768 (WXGA) Native Resolution.

For instance.....I am contemplating whether I'm going to get one of those P2 cameras from Panasonic. However, time after time I read people saying "WELL, TODAY'S HD TV'S ARE ONLY 720, SO A PANASONIC 720p CAMERA IS PERFECT!". Yeah, that's true, BUT they always FAIL TO MENTION that that same HD TV is also 1280 pixels WIDE!!! and that Panasonic camera only does 960 pixels wide!!!! *dumb look*

Now...JVC.....JVC is perfect for 1280x768 HD TV because JVC's cameras are 1280x720!!! and progressive or interlaced, Sony uses 1440 fat belly horizontal pixels. But for Panasonic, you never hear anything about them skinny horizontal pixels it uses maxing out at 960.

And I wonder why?

Is it because horizontal resolution is not as important?

- Shannon W. Rawls

Barry Green
March 30th, 2005, 01:23 AM
BUT they always FAIL TO MENTION that that same HD TV is also 1280 pixels WIDE!!! and that Panasonic camera only does 960 pixels wide!!!! *dumb look*
... but... Shannon... 1080i TVs are 1920 pixels wide, and your beloved HDV camera is only 1440 pixels wide... which, by the way, come off of CCDs that are only 960 pixels wide. You know that, right?

Back on topic:
There has been endless debate over 1080i vs. 720p. Someone just pointed out an exhaustive article on Walter Graff's site that talked about it. What it all comes down to is that both systems pump about 60 million pixels per second through the TV. Both look absolutely amazing on a high-def television, when displayed at their native resolution.

What you will not hear debated is 1080/24p vs. 720/24p -- that would be a silly argument. 1080/24p is obviously much superior to 720/24p, as a format (although there's plenty of argument about which produces a nicer image to look at, the CineAlta or the VariCam, but that's beyond the scope here). And 1080/24p and 1080/30p are legitimate broadcast standards in the US, ensconced in the ATSC specifications (as are 720/24p and 720/30p).

So the main argument comes down to 1080i vs. 720p. And yes, that has been discussed ad infinitum, a google search should bring up volumes of debate.

Graeme Nattress
March 30th, 2005, 05:12 AM
Well pointed out, Barry. I was just comparing 1920x1080 (which hardly any camera actually records) with 1280x720, which some cameras do record but, on the other hand, hardly do justice to.

What I was alluding to with my comments on compression is that you have an issue of pixel quality and how accurate those pixels are. Obviously, you could take a VHS image and interpolate it up to 1080i, and say it's a high definition image. And indeed it is, but none of those pixels will be in any way accurate.

Even when compression isn't producing the typical visible artifacts of mosquito noise, quilting, noise pulsing, false grain, differential movement etc., it's still there, adding noise, both temporal and spatial, to the image, and the effect of that is, to me, a general reduction in the overall smoothness of the moving image. This type of compression artifacting is very hard to see without an uncompressed comparison image and the benefit of movement, but it is there and I do tend to see its effects. Practically all video formats suffer from it - it's everywhere to one extent or another.

I think the factor that outweighs the 720p vs. 1080i question is the ability of the camera operator and the controls of the camera they're working with. Indeed, these factors can also bring the quality of DV above any HD format, if you've put a good DV camera with great controls in the hands of an expert and the HD camera is being operated by a lesser person. I'd prefer to watch a genius shoot VHS than some also-ran shoot HDCAM.

Graeme

Thomas Smet
March 30th, 2005, 11:18 AM
Yes we are comparing 720p to 1080i but at the same time all the HD cameras that will be coming out soon only give us 24p or 30p for 720p.

Even if there have been other debates elsewhere, I thought that since so many people were wasting energy debating the new cameras, maybe we should try to shift the focus away from the hardware and onto the end format.

I also thought about what Shannon said about current HD monitors only really giving us 720p resolution right now. How long is it going to be before we do have true 1080i/p TVs? Maybe the main reason both 1080i and 720p from current HD cameras look equally good is because we aren't seeing 1080i in its full glory. If we did have a true 1080 TV, how would 720p look blown up compared to the 1080?

I think for broadcast 720p is great right now and will be for a while.

About the actual chip size for the Sony camera: 1080i on its own as a format may have certain advantages over 720p, but what about 1080i coming from the Z1? The chips are 960 by 1080. After de-interlacing you end up, in theory, with 960 x 1080 carrying the same detail as 960 x 810 (1080 times 0.75). If the Panasonic uses 960 x 720 pixels, this equals almost the same amount of raw detail. The Sony does use pixel shift to get a little more detail, but only for horizontal resolution. Now if the JVC camera is going to have true 1280 x 720 pixels on the chips, then the raw image would have more detail than the de-interlaced, pixel-shifted Sony at 1440 x 1080.
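
A quick sketch of that arithmetic (Python, purely illustrative; the chip dimensions and the 0.75 de-interlacing factor are the figures quoted above, treated as approximations):

DEINTERLACE_FACTOR = 0.75   # the "1080 times 0.75" assumption from above

sony_z1   = 960 * 1080 * DEINTERLACE_FACTOR   # effective detail after de-interlacing
panasonic = 960 * 720                         # native progressive chip, as quoted
jvc       = 1280 * 720                        # native progressive chip, as quoted

for name, pixels in [("Sony Z1 (de-interlaced)", sony_z1),
                     ("Panasonic (960 x 720)", panasonic),
                     ("JVC (1280 x 720)", jvc)]:
    print(f"{name:24} ~{int(pixels):,} effective pixels")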

Having 1920 x 1080 raw pixels would be very nice, even interlaced, but with the Sony we are nowhere near that.

There is also the debate over 24 fps. Even though many hate 24p, the fact is that if you are transferring to film we are sort of stuck with it right now. Considering that the JVC and Panasonic cameras will both have true 24p, whatever pixel data we get out of the camera is what we will have. The image will not be interpolated further to get to 24p. If you were to shoot 25p/i with the Sony and shift the audio 4%, you would still have to spend time processing the video and audio for that change.
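
For anyone curious what that 4% conform actually works out to, a small sketch of the arithmetic (Python; nothing camera-specific, just the 25-to-24 maths):

import math

# Conforming 25p footage to 24p: every frame is kept, playback is slowed,
# so the audio must be slowed (and optionally pitch-corrected) to match.
speed_factor = 24 / 25                           # 0.96, i.e. roughly a 4% slowdown
slowdown_pct = (1 - speed_factor) * 100
pitch_drop_semitones = 12 * math.log2(25 / 24)   # pitch drop if audio isn't corrected

print(f"Playback speed factor:  {speed_factor:.3f} ({slowdown_pct:.0f}% slower)")
print(f"Uncorrected pitch drop: {pitch_drop_semitones:.2f} semitones")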

This actually makes any decision even harder because now all three HD cameras seem to be on par in terms of detail. Of course we will not know for sure until the other two cameras come out.

One other thing I would like to point out is that 720p is a heck of a lot easier to edit right now. Even though in theory both have around the same data rate, which is about the same as DV, the fact is that any processing of the footage is done at the pixel level, and 1080i/p needs a lot of horsepower. The system I use now can easily handle a couple of streams of 720p in real time.

Steve Crisdale
March 30th, 2005, 04:46 PM
<<<-- Originally posted by Thomas Smet : Now if the JVC camera is going to have true 1280 x 720 pixels on the chips then the raw image would have more detail than the de-interlaced pixel shifted SONY at 1440 x 1080. -->>>

Putting mathematical theory aside for a moment here....

Without getting caught up in pixel numbers or any other pseudo logical rationale for one format being better than another, the 1080i FX-1e clips look significantly better on my Aquos HDTV LCD than the 720p clips from the JVC HD10u.

I'm sure I'm going to get the "but that LCD isn't true 1080 resolution, so it's re-sampling the image" or "that HDTV can't support 1080i PAL and 720p NTSC" and "the HD10u isn't a 3 chip camcorder" and all the usual bollocks I've heard many times before.

Theoretically certain insects shouldn't be in the air.... but they are, and in this case the result is the same; namely, the proof is in the pudding.

Graeme Nattress
March 30th, 2005, 04:54 PM
But the JVC was probably the worst (and at the time, cheapest) HD camera ever made, and certainly doesn't have the resolution in its one chip to do justice to its 720p designation, and although the FX1 is 1080i, it certainly doesn't have the resolution that a high-end 1080i camera would have. The argument of 720p vs. 1080i isn't helped by comparing the worst 720p and 1080i cameras around. (Not that they're bad, but every other high-def camera costs about 10 times the price and they all look significantly better.....)

Graeme

Bill Pryor
March 30th, 2005, 05:31 PM
Steve, you're comparing a single chip camera to a 3-chip one. If they both recorded to the same format, the Sony would still look better. Regardless of the format, the recording can never make the picture look any better than what the chips produce.

Graham Hickling
March 30th, 2005, 06:56 PM
>>Theoretically certain insects shouldn't be in the air....

The myth persists that science says a bumblebee can't fly. Indeed, this myth has taken on a new life of its own as a piece of "urban folklore" on the Internet.......No one "proved" that a bumblebee can't fly. What was shown was that a certain simple mathematical model wasn't appropriate for describing the flight of a bumblebee.

http://www.sciencenews.org/articles/20040911/mathtrek.asp

Thomas Smet
March 30th, 2005, 11:25 PM
lol

Steve Connor
March 31st, 2005, 02:18 PM
Very simple answer to this one as far as I am concerned, BOTH have their place in broadcasting.

Michael Struthers
March 31st, 2005, 02:31 PM
Well, with the new Panasonic, maybe now the discussion is 720/60p vs 1080/24p.

Barry Green
March 31st, 2005, 03:22 PM
Or is it a discussion at all? With the new Panasonic, you can have it all in one camera: 1080/24p, 1080/30p, 1080/60i, 720/24p, 720/30p, 720/60p...

Ken Hodson
March 31st, 2005, 10:57 PM
All of the new HD DVDs that are out (T3, Matrix Reloaded, Attack of the Clones) are 720p. As a distribution format it takes some decent hardware to watch 720p, never mind 1080i. 1080p is pushing all tech at the moment. I think it is safe to say high-bandwidth 720p (HD DVD etc.) looks much better than low-bandwidth (19 Mbps) 1080i TV. As a low-cost acquisition format (HDV), 720p gives far more data per pixel than 1080i. Pushing the boundaries of compression just to promote 1080i, when you are capturing an image that wouldn't even fill 720p, is only there to sell cams to the sheep who need big numbers; it isn't an advancement. If the industry settled on 720p we would be much better off. The future isn't all resolution. Resolution plus lack of compression plus frame rate = video advancement. Watching 1080i with overdone compression, in an interlaced format, at yesterday's frame rate was such a sideways move. But as always, bigger numbers sell.
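
One way to make the "data per pixel" point concrete is to divide a fixed bitrate by pixels per second. A rough Python sketch (the 19 Mbps figure and the full-raster pixel counts are illustrative assumptions only):

BITRATE_BPS = 19_000_000   # the 19 Mbps broadcast figure mentioned above

formats = {
    "1080i60 (1920x1080, 60 fields/s)": 1920 * 540 * 60,   # pixels per second
    "720p60  (1280x720,  60 frames/s)": 1280 * 720 * 60,
    "720p30  (1280x720,  30 frames/s)": 1280 * 720 * 30,
}

for name, pixels_per_second in formats.items():
    print(f"{name}: {BITRATE_BPS / pixels_per_second:.2f} bits per pixel")
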
My 2 compressed cents.

Thomas Smet
March 31st, 2005, 11:57 PM
thanks for the input guys.

I agree with Barry...

With the new Panasonic camera it isn't really an issue anymore because we can shoot whatever we need to.

Ken Hodson
April 1st, 2005, 12:04 AM
Ah the magic bullet. Well we will wait and see.

Graham Hickling
April 1st, 2005, 01:01 AM
I'd take 720p60 over 1080i any day.

Being that it's 2005, interlaced footage of any sort should be led down to the barn and quietly put out of its misery.

Thomas Smet
April 1st, 2005, 09:39 AM
I agree, Graham. Two of the things I hate the most about video are interlaced frames and pixel shift. And with NTSC I also hate non-square pixels and 29.97 fps instead of 30.

Interlaced is a form of compression that tries to get more detail into a certain bandwidth. Nothing in the real world is interlaced.

Ron Evans
April 2nd, 2005, 09:13 AM
Nothing in the real world is 24p either!!!!

Ron Evans

Ken Hodson
April 2nd, 2005, 03:19 PM
I'll stake out the middle ground. We have two eyes that make one image. That's sort of interlaced. But our eyes see whole frames, so that would be progressive. Sort of dual-field progressive?

Graham Hickling
April 2nd, 2005, 03:38 PM
I'd say our eyes are closest to two progressive cameras mounted side-by-side, outputting stereoscopic images for later viewing on our new glasses-free 3d monitors:

http://www.stereographics.com/products/synthagram/synthagram.htm

(Note that this is an LCD monitor - so there's no interlacing going on here either...)

Giroud Francois
April 2nd, 2005, 04:03 PM
Purely from a marketing point of view...
Producing displays (LCD/plasma... CRTs are dead) is a matter of pixels,
or rather a matter of pixels per dollar.
So imagine a company producing a display with 1920x1080 pixels when half of them will go unused half of the time.
It seems we are likely to see more 1280x720 displays downconverting 1080i.
It is cheaper and easier to produce, and you can still stick an HD-compatible logo on it.

Ron Evans
April 2nd, 2005, 10:32 PM
Actually, 24 fps film from a projector and interlaced video both fool our eyes. Both stem from the economics of the technology at the time of introduction: the best way to fool the eye/brain into perceiving fluid motion at the best price.

The film projector normally has a three-blade shutter that flickers the image so our eyes think they are seeing 72 fps, but there are only 24 unique images. (The screen doesn't flicker, but it does stutter, since our eyes perceive flicker below about 60 images a second.)

Interlaced video takes advantage of two forms of latency: the phosphors in the CRT, and the combined latency of our eyes and brain. At 60 fields a second, moving parts of the image are of course displaced by a 60th of a second between the two fields. This is where our brains come into play and morph these together into fluid motion (unless the image is relatively static and contains thin lines that don't really move; our brain sees that as flicker!). At 60 fields a second the interlace flicker is right on the limit, and the CRT phosphors decay more slowly than the next field is written, at least in theory.

To display 1080i there need to be 1080 horizontal lines, since both fields are displayed (for a very short time) together, one set decaying while the other set is being written; they are not the same 540 lines displayed twice. 720p doesn't cut it for a 1080i or 1080p source, but a 1080i or 1080p display will happily show 720p (with bars!). Any attempt to fill the screen from either of these approaches will degrade the video. Witness the terrible display of normal SD video on any HD set (creating more resolution from rubbish creates a wonderful display of rubbish!). If you shoot 1080i, get a 1080i display; likewise for 720p; and in my experience SD is best displayed on a good SD display.

Progressive displays still need to get above the eye's flicker rate, and still use the latency of the display during the flicker cycles. They just use different means of doing this, but they ALL fool our eyes; it's just how they do it that differs between CRT, LCD, etc. At the moment CRTs have the edge in the combination of resolution and refresh rate, but they are big and heavy. For me the vision is progressive capture above the flicker rate, i.e. 1080p60 or above. Until then, I love my FX1 and Sony HiScan 1080i TV.

Ron Evans

Graeme Nattress
April 3rd, 2005, 08:03 AM
But 24fps on a DLP projector doesn't flicker. It looks great - as does 24p on an LCD display.

Also, to complain about NTSC uprezzed to HD on an HD display assumes that the SD NTSC has been broadcast, and quite frankly, that we see a picture at all is quite amazing after all the compression it's gone through. If you were to uprez the SD master to HD before broadcasting, it would look a world different from what it looks like with the TV uprezzing it.

Indeed, 720p more than "cuts it" for a 1080i source, and even blowing up 720p to 1080p will be invisible to most viewers.

Indeed 1080p60 will be awesome, but nobody shoots it, and hardly anything displays it. However, when things change it will be nice.

Graeme

Prech Marton
April 21st, 2005, 12:57 AM
I have some IMAX HD DVDs: Speed, Coral Reef, etc.
Each disc contains both the 720p and the 1080p version of the film,
but the file sizes are almost identical (2 GB).
What does this mean?
Does the 720p version show fewer MPEG artifacts?
And is the 1080 version sharper because of the resolution, but with more MPEG blocking?

Peter Wiley
April 21st, 2005, 11:49 AM
http://www.highdef.org/library/index.htm might be helpful in sorting some of these issues out.

Kevin Shaw
April 21st, 2005, 04:39 PM
Theoretical arguments aside, from a practical standpoint we would all be MUCH better off right now shooting 720p at 60 frames per second and delivering finished videos at either 720p60 or 720p30. Interlaced video is about to die quietly because the TV and computer screen manufacturers are reportedly phasing out all tube-based displays in favor of more modern digital solutions, almost all of which will be progressive-scan by nature. And a lot of even the most expensive HDTVs are only 720p native resolution anyway (or less), because that's what's affordable to produce using today's technology. Plus 720p is a lot easier to edit and play back on today's computers than 1080i, such that it is currently the most plausible format for actual delivery of HD video projects.

So we have a situation today where 720p is a practical end-to-end solution and 1080i is not, and by the time we have the technology to do 1080i well the displays will all be 720p. Personally, I think 720p is arguably a better format anyway, and it makes no sense to me that we're still talking about interlaced video when debating what format to use for the next 50-100 years or so. As someone else said, let's take that horse out to the barn and put it out of our misery.

Ben Buie
April 21st, 2005, 11:27 PM
One thing to worry about is editing; editing compressed interlaced video is no fun over multiple generations. Interlaced video is much more subject to compression and interlacing artifacts over time than progressive video.

For me it is just one less thing to worry about. We can record, edit, and deliver everything progressive and not even think about it.

Also if you are pulling stills from your video for marketing progressive is nice.

Let's not forget that 720p (921,600 pixels) is still nearly 3 times (2.7 times to be exact) the pixel resolution of 480p (345,600 pixels); it isn't a "small increase". For broadcast, where most people are used to 480i, the difference is even larger.

The primary reason we haven't "upgraded" to the FX1 or the Z1 yet is that I refuse to go back to interlaced. I'm very interested in both the JVC and the Panny though, because of their progressive modes. 1080/30p or 1080/24p would be sweet; of course we would have to upgrade our editing computers once again :)

Ben

Barry Green
April 22nd, 2005, 12:44 AM
Let's not forget that 720p (921,600 pixels) is still nearly 3 times (2.7 times to be exact) the pixel resolution of 480p (345,600 pixels); it isn't a "small increase". For broadcast, where most people are used to 480i, the difference is even larger.
Nearly 3x in spatial resolution, yes, but also 2x as high in temporal resolution. You're getting almost 6x as many pixels per second pushed through a television screen with 720p as you are with NTSC. And, if I may geek out for a moment, once you factor in the Kell factor, that number goes even higher, more like 7x as many discernible pixels per second. 720p's real HD, no doubt about it.
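
A back-of-the-envelope check of those ratios (Python; treating NTSC as 720x480 at 60 fields and applying a ~0.7 Kell/interlace factor only to the interlaced format are assumptions for illustration, not a rigorous model):

ntsc_pixels_per_sec = 720 * 480 * 30    # 480i60: 60 fields of 720 x 240
p720_pixels_per_sec = 1280 * 720 * 60   # 720p60: 60 full frames

raw_ratio  = p720_pixels_per_sec / ntsc_pixels_per_sec
kell_ratio = p720_pixels_per_sec / (ntsc_pixels_per_sec * 0.7)

print(f"Raw ratio:        {raw_ratio:.1f}x")    # roughly the "almost 6x" figure
print(f"With Kell factor: {kell_ratio:.1f}x")   # roughly the "more like 7x" figure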

Dennis Wood
April 22nd, 2005, 08:02 AM
Barry, perhaps you could explain your point. I was under the impression that 480i/1080i would have a temporal resolution of 60 samples per second, and 720 30p (or 480 30p) would have a spatial resolution corresponding to their actual measured frame resolution. Interlaced signals give higher temporal resolution than progressive, but less spatial resolution per frame. I'm assuming this is why interlaced footage makes for better slow mo.

With 480p, vertical resolution on the progressive frames is 33% higher than 480i interlaced frames due to losses incurred in row pair summation in the interlace process. If the same is true of 1080i, actual measured resolution should be somewhere around 1080-33% = 724 lines. Horizontal resolution, if displayed properly, would definitely be higher for 1080i over 720P as the same filter loss is not an issue....if the 1440 lines are actually displayed.

I sourced a paper copy of this article ("Understanding Camera Resolution," Broadcast Engineering, August 1999), which concluded that progressive resolution is almost twice that of interlaced image resolution for moving objects.

If TVs are only displaying 1280 pixels across, then I would theorize that 1080i and 720p should look pretty similar as far as resolution goes. If you are shooting 720 60p, then you now have the same temporal resolution and very similar spatial resolution to 1080i. If you take William E. Glenn's findings into account, then 720 60p should look "better" as far as our perception goes.

So how does this theory work out in practice?

Barry Green
April 22nd, 2005, 02:02 PM
So how does this theory work out in practice?
Exactly as you would expect. 720p and 1080i both look like "HD". A native 1920 signal being displayed at 1920x1080 does look sharper than a native 1440 signal being displayed at 1920. I wasn't talking about 720/30p, which isn't really shot or broadcast in the US; I was talking about 720/60p.

Of course, when talking about lower frame rates, the argument becomes absolutely moot. 1080/24p will stomp 720/24p all day long. 1080/30p will stomp 720/30p in every case. So the argument is relevant when comparing the full broadcast standards, 1080/60i vs. 720/60p. (or, when comparing NTSC or PAL against 720/60p).

What I was talking about is that I've seen *several* people posting that 720p isn't "really" HD, because it's only about 20% more lines than PAL. And that's just wacky talk. Because you have to take into account the higher refresh rate, plus because of the native progressive system, every pixel is individually discernible, you end up with 6x as many pixels per second of discernible pixels as opposed to a PAL system. And on NTSC, which is lower vertical resolution, it's more like 7x as many pixels.

Both 1080i and 720p end up pushing approximately 60 million pixels through the TV set every second. 720p doesn't suffer from resolution loss due to interlacing, and it updates a full frame rather than a half-size field. Both provide for the "reality" look by providing 60 updates per second, but while 1080 updates half its frame vertically, 720 updates its entire frame.
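
The "roughly 60 million pixels per second" figure is easy to sanity-check (Python; full-raster numbers, ignoring what any particular camera actually records):

p1080i60 = 1920 * 540 * 60   # 60 fields/s, each 1920 x 540 -> ~62.2 million px/s
p720p60  = 1280 * 720 * 60   # 60 full frames/s             -> ~55.3 million px/s

print(f"1080i60: {p1080i60 / 1e6:.1f} million pixels per second")
print(f"720p60:  {p720p60 / 1e6:.1f} million pixels per second")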

My point isn't that one is better than the other, the point is that they're both fully legit "HD", and that the overall viewing experience is quite comparable between the two.

Dennis Wood
April 22nd, 2005, 10:21 PM
Goes to show how much I know about HD...I was under the incorrect impression that 720 30p (not 60p) was the standard. So your point is well taken...sounds like 720 60P is pretty much the equivalent of 1080i. I've had a 51" Sony HDTV for 3 years now...it's never seen actual HD yet :-(

Brandon Greenlee
April 23rd, 2005, 08:32 AM
So at what point are most tv's going to be capable of displaying 1080i and/or 1080p?

Is this still a while off?

Kevin Shaw
April 23rd, 2005, 08:49 AM
Brandon: many inexpensive rear-projection HDTVs are theoretically 1080i displays, but I've seen comments to the effect that even this may be a bit of a technological trick. And I think there are some HDTV projectors now which have true 1080p resolution, but those are not something which will be widely used by consumers any time soon. Per my earlier post, the current trend seems to be toward digital displays which are mostly 720p or lower resolution, because 1080p displays are too expensive to produce for now.

In any case, it's currently much easier to distribute HD content at 720p resolution than at either 1080i or 1080p, so when you add all this up 720p makes sense as today's best delivery format. 1080p would obviously be better, but by the time that becomes practical there will be tens of millions of 720p and 1080i displays in use which people are not going to rush to replace. As far as I'm concerned, 720p will be the preferred output standard for the next several years, even if 1080i delivery options become more commonplace.

Steven White
April 23rd, 2005, 09:19 AM
So at what point are most tv's going to be capable of displaying 1080i and/or 1080p?

A lot of the HDTV CRTs, like the 34" ones from Toshiba and Sony, are 1080i native displays. Some of the rear-projection CRTs are 1080i as well. I don't think many of the DLP and LCD projection-type displays or the plasma/LCD panels are there yet - most reside at 1280x768 resolution. There are a few 1920x1080 displays coming out in the $10k+ market (Sony has a 1920x1080p projector for $30k)... but it seems like the display manufacturers are looking to get to 1080 native displays soon.

Barry Green
April 23rd, 2005, 02:43 PM
I've got one of those Sony 34" XBR sets, and native 1080 is indeed spectacular on it, as is native 720. Both formats look incredible on this CRT.

However, one glance at the offerings at Best Buy, Circuit City, etc., will let you know: consumers don't want CRTs. The vast majority of sets on display are flat panels (whether plasma or LCD, etc.), and those types of sets have discrete pixels, so they'll be optimized for one resolution or the other. Right now, most of them are optimized for 720p (although there are some 1024x1024 plasmas out there! And plenty of them are actually EDTVs, with a resolution of 864x480).

Dell has the 2405FPW flat-panel LCD computer monitor which is 1920 x 1200; a tiny bit of letterboxing gets you 1920x1080. More will be forthcoming.

Khoi Pham
April 23rd, 2005, 02:51 PM
"So at what point are most tv's going to be capable of displaying 1080i and/or 1080p?"

3 more months.
Go to this link
http://www.weva.com/cgi-bin/newsreader.pl?op=render&type=o&storyid=2427

Steve Crisdale
April 23rd, 2005, 05:42 PM
"So at what point are most tv's going to be capable of displaying 1080i and/or 1080p?"

3 more months.
Go to this link
http://www.weva.com/cgi-bin/newsreader.pl?op=render&type=o&storyid=2427


Boy, is that being overly optimistic or what!!! I'd suggest the true answer to the question of when MOST TVs are going to be capable of native 1080i/p is a hell of a lot further off...

Sure, there have been demonstrations of native 1920x1080 resolution panels, but the purchase price for the immediate future - and that's at least 3 months - is astronomical. That sort of rules out mass take-up of native 1080i/p display devices for an indeterminate period beyond 3 months.

Given that the current state of play indicates there isn't anywhere near market saturation by the current breed of HDTVs, why in the name of sanity would most HDTVs being watched in 3 months' time be the sort that are, today, only available to a select and privileged few?

Steven White
April 23rd, 2005, 05:56 PM
Well, "most TVs" is wrong... but I disagree that the price will be astronomical. DLP technology is one of the cheapest ways to get half-decent HD displays. While the 1920x1080 plasmas are going to be extremely expensive, the projection technologies are a lot more efficient and simpler to manufacture. The reasons I haven't bought an HDTV yet are:

a) I don't have room for a 34" CRT (the depth is just too big)

and

b) None of the plasmas or DLP projection TVs have been 1920x1080p. There's no way I'm buying a TV that isn't native to what my camcorder puts out.

Khoi Pham
April 23rd, 2005, 06:45 PM
Oh yeah, I meant there will be much more to choose from in 3 months, not most tv will be it 1080p, right now there is only 2 to choose from.

Steve Crisdale
April 23rd, 2005, 07:57 PM
Well, "most TVs" is wrong... but I disagree that the price will be astronomical. DLP technology is one of the cheapest ways to get half-decent HD displays. While the 1920x1080 plasmas are going to be extremely expensive, the projection technologies are a lot more efficient and simpler to manufacture. The reasons I haven't bought an HDTV yet are:

a) I don't have room for a 34" CRT (the depth is just too big)

and

b) None of the plasmas or DLP projection TVs have been 1920x1080p. There's no way I'm buying a TV that isn't native to what my camcorder puts out.

So you wouldn't buy a current device that may well do an outstanding job of scaling 1080 to the device's native resolution? What if the price of a good scaling HDTV meant that you could at least enjoy what's currently available right now, while waiting for the 1080-native sets to infiltrate the market?

My 2c's worth.... I'd much rather enjoy seeing my FX-1e clips on my Sharp Aquos LCD HDTV than wait for something that may not look heaps better... and if it does, I'll buy a 1080-native set when they're cheap enough.

While we may feel like the world of HD technology is advancing at breakneck speed (which still isn't fast enough for some!!), for the 'Average Joe', they couldn't give two craps whether they had native 1080 or not.... They just want a TV set that will 'work'. No need to do anything other than turn the bludger on and watch the pretty pictures.

When 'Average Joes' dip their toes into buying a new TV, they'll buy something affordable, that works now and with PQ that looks good to their eyes.

So, while it's nice to dream of where this technology appears to be heading and how soon, I'd prefer to be a little cautious. Retailers sure ain't gonna love someone telling them to trash all their current stock, just to replace it with the next "big thing". The financial bottom line is what drives corporations at this level of technology. I'm sure they really don't want to see their investment in 720i/p-capable devices with 1080 scaling just vanish before their eyes.

Steven White
April 23rd, 2005, 08:33 PM
So you wouldn't buy a current device that may well do an outstanding job of scaling 1080 to the device's native resolution?

Hm... let's check out my current system:

http://s94963366.onlinehome.us/HDRFX1/spiffsetup.jpg

Eek. I guess the answer is a definitive "no" considering my 13" television hasn't even been upgraded to a halfway decent SD CRT. I can wait. I can usually settle for pure crud until I'm sure I'm getting exactly what I want.

Ken Hodson
April 25th, 2005, 12:30 AM
Considering you settle for that, any 720p TV should exceed all of your expectations for the next 20 years. I see you had no problems adapting to modern HD sound, though.

Kevin Shaw
April 25th, 2005, 06:53 PM
"you could take a VHS image and interpolate it up to 1080i, and say it's a high definition image..."

You could, but that would be fraudulently deceitful, since it wouldn't contain an HD level of source detail. However, some videographers have already threatened to do just this to try to get around having to buy any kind of HD camcorder. Bad idea.

"I think the factor that outweighs 720p or 1080i is the ability of the camera operator and the controls of the camera they're working with. Indeed, these factors can also bring the quality of DV above any HD format, if you've put a good DV camera with great controls in the hands of an expert, and the HD camera is being operated by a lesser person. "

While that's ultimately true in terms of the content, I doubt anyone will confuse SD source material with HD/HDV source when played on a proper HDTV display. Or to put this another way: does anyone here really believe that you could pass off a 640x480 still image when compared side-by-side with a 1920x1080 still image? For any subject with any real detail in it?

Graeme Nattress
April 25th, 2005, 07:07 PM
Although, I just read, in the latest issue of HiFi News, an article by Barry Fox about a big "HD" event put on by Sony in London, UK. They showed lots of wonderful HD material and wowed the audience with how good HD looked. Only later did Barry find out that Sony had not used an HD projector, and that the image everyone thought was HD was barely above SD in resolution.

Graeme

Luis Caffesse
April 25th, 2005, 08:33 PM
That's classic Graeme!
You just made my night with that story.
:)

Kevin Shaw
April 25th, 2005, 08:42 PM
That is a good story, but it still doesn't address what's going to happen when people start seeing true HD output and then comparing that to SD content on their own HDTVs. I only have to show a few seconds of HDV footage to get people to sit up and take notice.

Steve Crisdale
April 25th, 2005, 08:46 PM
Although, I just read, in the latest issue of HiFi News, an article by Barry Fox about a big "HD" event put on by Sony in London, UK. They showed lots of wonderful HD material and wowed the audience with how good HD looked. Only later did Barry find out that Sony had not used an HD projector, and that the image everyone thought was HD was barely above SD in resolution.

Graeme

Hey!! It's not so funny... well not really...er, maybe a little bit.

I remember going to a big Electronics Expo with some people who knew I had HD gear, and they were pointing out SD stuff as though it was HD.

I came to the realisation that there are massive numbers of people out there who haven't realised that they're blind yet!! They couldn't spot a nuance if it sat up and spat in their faces... (of course, you must know what a nuance is to be able to spot one!!)

So for the majority of the population, EDTV and SD digital widescreen look good enough to their poor unconditioned eyes, and 1080 would appear to be no advantage despite its extra expense.

Whilst my eyesight may be deteriorating given my advancing years... it'll take decades before it would ever be bad enough to accept upsampled crap as the 'Real McCoy'.

Luis Caffesse
April 25th, 2005, 09:08 PM
"I came to the realisation that there's massive numbers of people out there who haven't realised that they're blind yet!!"

I completely agree.
Not only that, I think we underestimate how little the majority of people know/care about these issues.

Many of the people I know were under the impression that DVDs were HD until I told them differently.