April 25th, 2005, 07:07 PM | #46 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
Mind you, I just read in the latest issue of HiFi News an article by Barry Fox about a big "HD" event held by Sony in London, UK. They showed lots of wonderful HD material and wowed the audience with how good HD looked. Only later did Barry find out that Sony had not used an HD projector, and that the image everyone thought was HD was barely above SD in resolution.
Graeme
__________________
www.nattress.com - filters for FCP |
April 25th, 2005, 08:33 PM | #47 |
Major Player
Join Date: Apr 2004
Location: Austin, Texas
Posts: 704
|
That's classic Graeme!
You just made my night with that story. :)
__________________
Luis Caffesse Pitch Productions Austin, Texas |
April 25th, 2005, 08:42 PM | #48 |
Inner Circle
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
|
That is a good story, but it still doesn't address what's going to happen when people start seeing true HD output and then comparing that to SD content on their own HDTVs. I only have to show a few seconds of HDV footage to get people to sit up and take notice.
|
April 25th, 2005, 08:46 PM | #49 | |
Major Player
Join Date: Jan 2004
Location: Katoomba NSW Australia
Posts: 635
|
Quote:
I remember going to a big Electronics Expo with some people who knew I had HD gear, and they were pointing out SD stuff as though it was HD. I came to the realisation that there are massive numbers of people out there who haven't realised that they're blind yet!! They couldn't spot a nuance if it sat up and spat in their faces... (of course, you must know what a nuance is to be able to spot one!!) So for the majority of the population, EDTV and SD digital WS look fine to their poor unconditioned eyes, and 1080 would appear to be no advantage despite its extra expense. Whilst my eyesight may be deteriorating given my now advancing years... it'll take decades before my eyesight would ever be bad enough to accept upsampled crap as the 'Real McCoy'. |
|
April 25th, 2005, 09:08 PM | #50 |
Major Player
Join Date: Apr 2004
Location: Austin, Texas
Posts: 704
|
"I came to the realisation that there's massive numbers of people out there who haven't realised that they're blind yet!!"
I completely agree. Not only that, I think we underestimate how little the majority of people know/care about these issues. Many of the people I know were under the impression that DVDs were HD until I told them differently.
__________________
Luis Caffesse Pitch Productions Austin, Texas |
April 26th, 2005, 07:00 AM | #51 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
It's practically impossible to tell SD and HD apart on a small screen. In the article I mentioned, the writer was sitting at the back of the room and couldn't see anything wrong with the image, because the large projection screen was far away and therefore appeared small.
HD is only better than a lower-resolution format if you're sitting up close to the screen, or the screen is big. What makes things worse is that most people have small screens that they sit a fair distance away from, and in that situation, even if they put in an HDTV instead of their SDTV, they'd see no benefit. What would give the majority of viewers a better picture is lower compression on digital transmissions and more care taken in the production chain to optimize picture quality. Again, better-quality SD would benefit the vast majority of viewers more than HD. That's why the adoption of HD will be driven by the home cinema crowd - those with the video projectors and big screens - and they want an HD disc from which to view HD material. Graeme
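Graeme's point about screen size and seating distance can be put into rough numbers. The sketch below is illustrative only: it assumes the common 1-arcminute rule of thumb for visual acuity and an arbitrary 28-inch picture width; real perception is more complicated than this.

```python
import math

def max_useful_distance(screen_width_in, horizontal_pixels):
    """Farthest distance (in inches) at which adjacent pixels are
    still resolvable, assuming ~1 arcminute of visual acuity."""
    pixel_size = screen_width_in / horizontal_pixels
    one_arcmin = math.radians(1.0 / 60.0)
    return pixel_size / math.tan(one_arcmin)

# Illustrative 28" picture width at SD and HD pixel counts
for label, px in [("SD (720 wide)", 720), ("HD (1920 wide)", 1920)]:
    feet = max_useful_distance(28.0, px) / 12.0
    print(f"{label}: extra detail only visible closer than ~{feet:.1f} ft")
```

With those assumptions, HD's extra horizontal detail on a 28-inch-wide picture stops being resolvable beyond roughly four feet, which is consistent with the argument that small screens viewed from across the room gain little from HD.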
__________________
www.nattress.com - filters for FCP |
April 26th, 2005, 09:09 AM | #52 |
Trustee
Join Date: Mar 2004
Location: Milwaukee, WI
Posts: 1,719
|
Once again I have to agree with Graeme. I think even many of us pros have a hard time telling what is going on with HD resolution. Look at the Sony Z1, for example. I think it is a great camera, and it does have a very good picture on a TV, and I might still be getting two of them depending on what happens with the Panasonic camera by October. The fact, though, is that the camera only does 960 x 1080 interlaced. If you are watching on an LCD or plasma, that will get de-interlaced, giving you somewhere between 540 and 1080 lines depending on who you argue with. I actually think it is more like 540, since every other line is basically fake. So in terms of raw progressive pixel detail we only get 960 x 540, which isn't all that much higher than SD.
Just like Graeme said, I think the reason the Z1 looks great (and it does) is that even many of us pros are not used to seeing that much detail. Some people say the Z1 looks just as good as a CineAlta camera, but I do not agree. I think to most of us it does look as good because we can't really tell right now.
I always use the example of when the Sony VX1000 came out. Before that, all we had to use were analog cameras; most of us in that price range were using S-VHS. The VX1000, being digital, blew us away. All of a sudden that camera looked better than a 1/2" S-VHS camera. Today, if we look back at the quality of the VX1000, we just think it's decent. A better example is the Canon XL1. The chips are smaller on that camera compared to other 3-chip cameras. When it first came out, everybody loved the picture quality. A lot of people still do. By today's standards, however, most of us can now see the softness and the small lack of detail of the Canon XL1.
The Canon XL1 is still SD DV, but it uses smaller chips, just as the Sony Z1 is still HD HDV and uses smaller chips. The fact, though, is that we can tell the XL1 isn't as good as a 2/3" camera, whereas we have a harder time between the Z1 and a CineAlta camera. I do not think many of us are sensitive enough yet to tell between images that are 1280x720 and 1920x1080 - unless, as Graeme pointed out, we are watching on an 80" TV. Of course, on a plasma or LCD we will hit the pixel limits of that device before we hit those of the format. Color space and compression stand out more than resolution to me right now. Heck, I even like those EDTV sets; they actually look very nice at 42" even though they are only SD. This is why I always felt there should be an HD format that gives us 853 x 480 but with 4:4:4 color and very little compression. To me, right now, watching that kind of video on a 42" EDTV would be pretty good.
My whole point to this rambling post is that if we pros think a 960x1080i camera gives us HD that is just as good, then what do the consumers of the world think? I mean, we only recently convinced them that DVD was much better for the extra cost. |
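The back-of-envelope arithmetic in the post above can be written out explicitly. Note that the 0.5 "interlace penalty" is the poster's worst-case assumption (every other line interpolated), not a measured figure; applying it to SD as well as HD is the comparison a later reply insists on.

```python
def effective_pixels(width, height, interlaced=False, penalty=0.5):
    """Rough progressive-equivalent pixel count. The 0.5 interlace
    penalty is an assumption, not a measurement."""
    lines = height * (penalty if interlaced else 1.0)
    return int(width * lines)

formats = [
    ("SD DV, 480i",        effective_pixels(720, 480, interlaced=True)),
    ("Sony Z1, 960x1080i", effective_pixels(960, 1080, interlaced=True)),
    ("720p",               effective_pixels(1280, 720)),
    ("1080p",              effective_pixels(1920, 1080)),
]
for name, px in formats:
    print(f"{name}: ~{px:,} effective pixels")
```

Even by this pessimistic count, the Z1 delivers about three times the effective detail of 480i SD once the same penalty is applied to both.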
April 26th, 2005, 09:19 AM | #53 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
Well, I can see 4:4:4 1920x1080 on my LCD, so that's not a limiting factor. CRTs are now the limiting factor - they really "gloss" over details. I tell people about the 3D graphics I did for Panavision for NAB 2001, and that at the time, I didn't have the rendering power to do 1920x1080, so I did them 3/4 resolution. I could tell the difference in After Effects on my cinema display, but after going to HDCAM tape and being displayed on a Cinealta CRT monitor, you couldn't tell that I'd not rendered at full resolution. The CRT monitor also made it hard to tell the difference between keying live off the camera at 4:4:4 or 4:2:2 compared to keying off the HDCAM tape at 3:1:1. I could just about tell the difference, but I was watching the footage on a loop all day long....
Graeme
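The 4:4:4 / 4:2:2 / 3:1:1 comparison Graeme describes comes down to how many chroma samples survive per pixel. A quick sketch of the arithmetic, treating HDCAM's 3:1:1 with the same J:a:b formula as the others, which is a simplification:

```python
def samples_per_pixel(j, a, b):
    """Average samples per pixel for J:a:b chroma subsampling:
    1 luma sample per pixel, plus two chroma channels each
    carrying (a + b) samples per 2*J-pixel block."""
    return 1 + (a + b) / j

for name, jab in [("4:4:4", (4, 4, 4)), ("4:2:2", (4, 2, 2)),
                  ("4:2:0", (4, 2, 0)), ("3:1:1", (3, 1, 1))]:
    print(f"{name}: {samples_per_pixel(*jab):.2f} samples/pixel")
```

So 3:1:1 keeps only about 1.67 samples per pixel versus 3 for 4:4:4 - largely invisible on a CRT, as Graeme found, but it matters for keying.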
__________________
www.nattress.com - filters for FCP |
April 26th, 2005, 02:06 PM | #54 | |
Barry Wan Kenobi
Join Date: Jul 2003
Location: North Carolina
Posts: 3,863
|
Quote:
April 26th, 2005, 10:31 PM | #55 |
Inner Circle
Join Date: Sep 2004
Location: Sacramento, CA
Posts: 2,488
|
Thomas: your numbers are questionable because (a) the Sony HDV cameras use pixel shift to generate 1440 horizontal pixels during recording, and (b) the issue of "faking" half the lines due to interlacing should be compared to SD cameras recording at a measly 480i resolution.
In any case, all the numbers are moot once you play HDV footage on an HDTV, and it's definitely good enough to get people's attention. I suspect it will be a very long time before most consumers can tell the difference between HDV footage and anything better on typical HDTVs, but they can all immediately see the difference between HDV and SD. That's what will make the HDV format a success, in spite of its limitations and competition from more expensive options. Regarding the original subject of this thread, I doubt most consumers will be able to tell the difference between 1080i and 720p footage, especially as we move increasingly toward progressive-scan TVs. |
April 27th, 2005, 04:14 AM | #56 |
Regular Crew
Join Date: Jul 2004
Location: POOLE, UK
Posts: 158
|
I have no idea of the resolution, or if indeed I am watching 'true' 1080 HDV taken on my FX1, projected 7 ft wide with a Sony HS20 LCD projector; all I know is that the image knocks my socks off! The detail and smooth colours make the image come alive. I can't stop watching it!
I'm now having a problem watching standard DV, which up to a few weeks ago I would have said looked OK; now it looks terrible. (As an aside, I would say that standard DV, or HDV converted to standard DV, taken on the FX1 does not hold up quite as well at this projected size as footage taken on my VX2100.) So however those resolution figures may stack up, Sony have killed DV for me. As one of my friends said: 'WOW - MOVING SLIDES, how did you do that?' Paul |
April 29th, 2005, 04:43 AM | #57 |
Regular Crew
Join Date: Apr 2005
Posts: 45
|
I share a different view
Hello,
this is a very good discussion, but I think one important point in this debate has not been mentioned yet. Most members seem to see the question of 1080i or 720p only from the back of the camera and from the momentary situation. I do video only as a hobby; mainly I am a movie fan. And from that point of view I think at the moment 1080i is the better way to go.
Why? That is easy. For distributing existing movies I see no advantage in 60p (or 50p in PAL countries, where I live), since all existing movies are 24p (or 25p in PAL distribution). Especially in PAL countries, where no 2:3 pulldown for frame-rate conversion is needed, it is very easy to deinterlace the 50i to 25p because the source is progressive. So the bandwidth reserved for 60p broadcast is wasted on repeating each frame twice (or you broadcast 25p in 720p; I don't know how they will do it in 60 Hz countries, because as far as I know 2:3 pulldown only works with 60p or 60i).
From my own experience I know that interlacing has its problems, especially in editing. But if we go to 720p right now, I think a future migration to 1080p (at 50 or 60 frames) will never happen. If we go to 1080i now, a migration to 1080p will be feasible.
Greetings Richard |
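Richard's claim that progressive-source 50i deinterlaces trivially can be shown with a toy example: splitting a 25p frame into two fields and weaving them back together is lossless, assuming no vertical filtering was applied in between (an assumption a later post in this thread challenges).

```python
def split_fields(frame):
    """Split a frame (a list of rows) into top and bottom fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into a full progressive frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = [f"row{i}" for i in range(8)]   # one 25p frame, 8 toy rows
top, bottom = split_fields(frame)       # carried as two 50i fields
assert weave(top, bottom) == frame      # recovered with no loss
```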
April 29th, 2005, 05:30 AM | #58 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
But... 24p is part of the HDTV specifications, is it not? So a broadcaster could send out 24p as whole frames now and not have to add pulldown frames. I don't know if any do that, but it's part of the spec.
The problem is that 1080p, as currently done, is embedded as part of an interlaced 1080i signal. The chances are, therefore, that it has been filtered as any interlaced signal would be, and so it has no more vertical resolution than 720p, even though it's carrying a progressive signal. 1080p would be super - as would 4k in the home - but to see the full benefit of 1080p you'd need either a very large screen, or to be sitting rather close to a normal-sized screen, and I don't think either is much of a reality for most of the population. It would be great for the home cinema fan, but that would be about it. Graeme
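The pulldown that native 24p broadcast would avoid looks like this: 2:3 pulldown spreads four film frames across ten video fields (five 60i frames), so some video frames mix two different film frames. A minimal sketch:

```python
def pulldown_23(frames):
    """Map 24p frames onto 60i fields using the 2:3 cadence."""
    fields = []
    cadence = [2, 3]  # successive frames contribute 2, then 3 fields
    for i, frame in enumerate(frames):
        for _ in range(cadence[i % 2]):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

fields = pulldown_23(["A", "B", "C", "D"])  # four film frames
print(len(fields), "fields ->", len(fields) // 2, "video frames")
```

Ten fields from four frames matches the 24-to-30 frame-rate ratio; sending 24p as whole frames, as the spec allows, skips this step entirely.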
__________________
www.nattress.com - filters for FCP |
April 29th, 2005, 06:24 AM | #59 |
Regular Crew
Join Date: Apr 2005
Posts: 45
|
Hello Graeme,
of course 4k or even 2k would be better than 1k or 0.7k. But we have to keep in mind that once a standard is established, it will stay for some decades. So I favour a TV norm that leaves some room for further development. That room I see in 1080i, but not in 720p. 720p, as good as it might be in comparison to SD (PAL or NTSC), is in my eyes a little too fixed on today rather than on tomorrow. I'm quite sure that in only half a decade 1080p will be feasible. Of course, it would be a poor design if displays didn't recognise a 25p signal embedded in a 50i format and didn't do proper deinterlacing. Richard
April 29th, 2005, 06:54 AM | #60 |
RED Problem Solver
Join Date: Sep 2003
Location: Ottawa, Canada
Posts: 1,365
|
Well, all HD displays are progressive these days. Nobody is really making CRTs any more, and they can't display anywhere near HD resolution anyway. That's why any interlaced format is just plain daft. For interlace to work (read earlier in this thread) the signal has to be filtered vertically, which means that even if you embed 25p in 50i, it still won't have full 1080p vertical rez, but only 1080i vertical rez - otherwise it would flicker on any 1080i display (and those are still around) - so it doesn't offer any more real vertical resolution than 720p. And because 720p is progressive it compresses better than 1080i, and as it uses fewer pixels, for the same bandwidth you get a better overall picture. Also, 720p will, arguably, be no worse off than 1080i for uprezzing to full 1080p.
Progressive really is the way to go, and if we have to accept the smaller "resolution" of 720p rather than the bigger number of 1080i, that's not going to mean worse pictures in the home, and it will not impede any move to full 1080p. And yes, going 1080p from the get-go would be better still. But really, the problem is broadcasters and the lack of quality in digital transmissions. There's very little point in "getting HDTV" if all you can see is macroblocks, quilting and mosquito noise. Graeme
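The bandwidth side of this argument rests on the raw numbers: pixel rates for 720p50 and 1080i50 are in the same ballpark, so the formats compete on how well they compress rather than on raw size. The sketch below ignores compression-efficiency differences, which is exactly where the progressive format is argued to win.

```python
def pixels_per_second(width, height, rate, interlaced=False):
    """Raw pixel rate; for interlaced formats, 'rate' counts fields,
    each carrying half the frame's lines."""
    per_image = width * height * (0.5 if interlaced else 1.0)
    return int(per_image * rate)

p720  = pixels_per_second(1280, 720, 50)                    # 720p50
i1080 = pixels_per_second(1920, 1080, 50, interlaced=True)  # 1080i50
print(f"720p50 : {p720:,} pixels/s")
print(f"1080i50: {i1080:,} pixels/s")
```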
__________________
www.nattress.com - filters for FCP |