View Full Version : great interview about the equipment-z1-hvx-xl h1 , etc.



Kurth Bousman
February 23rd, 2006, 11:41 AM
http://www.freshdv.com/2006/02/fresh-exclusive-interview-with_18.html

looks like others feel the z1 is still THE camera to buy ! Kurth

Les Dit
February 23rd, 2006, 12:36 PM
I don't have much respect for an article that uses the F word within the first few sentences.
The guy doesn't like 720P because it's 'medium definition'? He may be the type of guy that buys a car based on how high the speedometer is labeled in MPH.
-Les




Ash Greyson
February 23rd, 2006, 01:24 PM
Not a very credible source... just one guy's opinion, not an expert or anything. No hits on IMDB, and the title of his current production loses all cred with me.



ash =o)

David Saraceno
February 23rd, 2006, 05:01 PM
I don't have much respect for an article that uses the F word within the first few sentences.

I agree with you there.

But I won't dismiss the meat of the comments because of one word.

Dylan Couper
February 23rd, 2006, 08:42 PM
"720p is f--king retarded"

Ah, good technical feedback. I wasn't sure about 720p until the author put it in those terms for me. :)

Kurth Bousman
February 23rd, 2006, 09:33 PM
Hey guys - his language is colorful and the man appears to have used the cameras. I can't believe that in 2006 language would be an issue. If he wasn't a credible user, I don't believe he would have been interviewed on the site. Kurth

Dylan Couper
February 23rd, 2006, 10:04 PM
Unprofessional language does not lend credibility to one's opinion.

Jack D. Hubbard
February 23rd, 2006, 10:21 PM
Exclusive of the language, he does make some important points that are worth thinking about, particularly the P2 storage issue.

Ash Greyson
February 23rd, 2006, 11:53 PM
He has valid points and opinions... he has some experience but he is not the authority on such matters. I am not a fan of the P2 workflow myself but he is a little abrasive for me.


ash =o)

Philip Williams
February 24th, 2006, 06:52 AM
We've had some discussions with Josh about his interview over at dvxuser. Seems like a nice enough fellow and you gotta respect him for showing up at the forum.

Frankly, I just didn't like the interview because he is talking about his particular workflow and projects. For his work, the Z1 is the best choice. That's certainly fine, but he then goes on to poo-poo EVERY other camera and/or format. Is 720P really "retarded"? For EVERYONE??

I asked him why he felt that adding a Firestore to the HVX was too cumbersome for a one man crew. He replied that it's too heavy and that even handholding a 5 pound camera becomes tiring after a while (some paraphrasing from memory there; the dvxuser thread has exact quotes). Alright, again, that's all very true for his style of work. But what about people who want the HVX feature set and work on tripods 90% of the time? That's more typical for my workflow, so I'd have no problem with the HVX and an external drive.

Anyway, it's funny: there are a lot of people that don't like P2, and they seem to be pointing to this interview to back up their opinions. Not liking P2 is certainly fine. But trashing the HVX/P2/Firestore product and every other camcorder out there is pointless. Can't people just say "it doesn't work for my needs"? Do we have to call formats we don't like "retarded"? Is the XLH1 really "stupid"? Should Shannon Rawls, Michael Pappas, Barry Green and the countless other professionals on this board immediately trade in their "stupid", "retarded" and "asinine" camcorders for Z1s?

I like the Z1, XLH1, HD100 and HVX. Regardless of your project style, there's almost certainly an excellent camera in that group to cover it, at prices that mere mortals can actually justify. What's to complain about? If you don't like one of them, just don't buy it. It's really, really just that simple.

www.philipwilliams.com

Steven Thomas
February 24th, 2006, 08:19 AM
What a joke that was....

I'm amazed how anyone can post anything on a website and have people read it as gospel.

Thomas Smet
February 24th, 2006, 09:56 AM
What an idiot this guy is. He must think he's some kind of god.

He talks about 720p as only medium definition and that 1080i is high definition. If he only knew that in most cases 720p offers more detail than his Z1 that shoots 1080i.

Based on his article, it sounds to me like he really has no proof for his theories, since he doesn't own all of the cameras. He just happens to own a Z1 and is shooting his mouth off based on the specs of the other cameras.

Yes, in a perfect world a perfect 1080i might have more detail than a perfect 720p, but the fact is that most 1/3" 1080i cameras do not have any more detail than the HD100.

One interesting thing he mentions is how his station shoots HDCAM but edits DVCPRO HD. This means he is getting 1280x1080i worth of detail in the end. On digital displays this equates to 1280x540 per field, compared to his @%&*! 720p, which is 1280x720 with no aliasing artifacts.

In a perfect world (to this guy) everybody would use a CRT HDTV and then he may have a point.

I personally don't really care if he and his TV station think 1080i is better and offers more true HD resolution. At the end of the day 1080i is harder to compress, and I would rather have a slightly softer but clean image than an image with more detail (which it doesn't have) and compression blocks all over the place. I would love to know which bitrate his station uses for broadcast.

This whole article seems to be more of an excuse for "him" buying his Z1.
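The per-field arithmetic in the post above can be sketched in a couple of lines (a back-of-envelope check only; whether per-field counts are the right way to compare formats is exactly what the rest of this thread debates):

```python
# Back-of-envelope check of the per-field figures above. DVCPRO HD
# stores 1080i on a 1280-wide raster; at any instant an interlaced
# display is painting one field, i.e. half the lines.
def active_samples(width, total_lines, interlaced):
    """(width, lines) present in a single field (or a progressive frame)."""
    return width, (total_lines // 2 if interlaced else total_lines)

print(active_samples(1280, 1080, interlaced=True))   # (1280, 540) -- one DVCPRO HD 1080i field
print(active_samples(1280, 720, interlaced=False))   # (1280, 720) -- a full 720p frame
```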

David Saraceno
February 24th, 2006, 10:46 AM
Hey guys - his language is colorful and the man appears to have used the cameras. I can't believe that in 2006 language would be an issue. If he wasn't a credible user, I don't believe he would have been interviewed on the site. Kurth

Profanity is profanity.

Not only that, the words describe absolutely nothing. If you can't articulate your views without profanity, you lose credibility.

More than that, you communicate nothing.

Kurth Bousman
February 24th, 2006, 12:11 PM
I guess I should have discounted Catcher in the Rye, Catch-22, Dostoevsky and countless other masterpieces and authors that use "profanity". Oops, guess I can't see 3/4 of modern cinema either. People that take issue with his opinion based on technical reasons, good for you. That's an argument I can get into. People that are voicing their opinion based on his language...
thank god for the freedom of the internet. Kurth


Kurth Bousman
February 24th, 2006, 12:28 PM
sorry for the double post - it was not meant as a double meaning- problems with my wifi connection only - Kurth

Ed Hill
February 24th, 2006, 02:16 PM
I'm sure the Z1 is good for his workflow. But it wouldn't be my first choice for my work, which is documentary, music video, commercials and the rare low-budget feature. His comments make him sound like someone who owns the Z1 but has only read the specs for other cameras.

I seem to remember reading the specs for the FX1/Z1: it uses 960x1080 chips with pixel shift to achieve 1440x1080. So I don't like the fact that its CCDs are not native 1440x1080. Still, I've seen some good products created with the Z1, like a short narrative called "Windsor Knot".

Compare the actual Z1 CCDs at 960x1080 to the HD100 CCDs at 1280x720. Hmmm, not that big a difference. Then the JVC also has the dedicated switches for all functions, true manual control for shutter speed/aperture, and the same shoulder mount I'm used to with pro cameras.
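In raw photosite terms, the chip comparison above works out like this (a quick sketch using the figures quoted in this post; pixel shift synthesizes extra horizontal samples from offset chips, it doesn't add photosites):

```python
# Raw photosite counts per CCD, using the figures quoted above.
sensors = {
    "Sony Z1, native chip":   960 * 1080,   # pixel-shifted to a 1440x1080 raster
    "JVC HD100, native chip": 1280 * 720,
}
for name, count in sensors.items():
    print(f"{name}: {count:,} photosites")
```

About 1.04 million vs. 0.92 million per chip, roughly a 12% difference.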

Now compare the Panasonic HVX with the P2 cards. Sure, I'd love to have the variable frame rates and record in DVCPRO 50 or 720p30. More image data is better.

But I'm not gonna pay almost $1800 for an 8 GB P2 card to record only 16 mins of video. That's worse than the 20 minute field tapes I used to use with BetacamSP! So then I have to buy an expensive hard drive or laptop and take the time to dump the footage to disk. Sure, maybe in two years, when or if P2 cards drop below $100. The current P2 system seems impractical. I would be better off using the Panny camera with a dedicated direct-to-hard-drive system.

Apparently WSB-TV 2 here in Atlanta has bought 2 of the HD100's. Court TV uses it for their "Hollywood Heat" show, and WireImage.com is doing celebrity videography with the HD100. You're aware of The Rage horror film shoot. National Geographic channel broadcast Madagascar wildlife segments shot by Andrew Young.

Until 1080 progressive monitors become widespread, I am quite content with 720p and the JVC HD100 we bought this week. I expect it will work well for the next 2 years or so.

Ed

Kurth Bousman
February 24th, 2006, 04:24 PM
Ed, I believe that's the point. Most of us are only using 1280x720 monitors or projectors. When 1920x1080 displays become much more widespread, 720p should be marginalized to some extent. He makes good points about P2's impracticality and about the H1's high price. If I was buying a camera today, I still think the FX1's the best deal for the money, although, from all the clips I've seen, the Canon's got the best image but for 3x the price. And I have to admit, the HVX 60p footage is absolutely gorgeous. The Canon's probably the most futureproof, but you need a 42" 1920x1080 LCD to see any difference, and those are at least 5 years away from my budget. Kurth

Tom Roper
February 24th, 2006, 11:09 PM
In a purchase decision, due diligence is the best advice. In the case of these cams, there is ample opportunity to preview clips beforehand.

I think Josh's advice was pretty good. A few people got rankled because someone parsed a comment to validate a purchase decision, and others followed on by attacking the credibility of the opiner.

When Josh says you should let your budget and workflow dictate the camera choice instead of the other way around, I don't see how anyone can argue.

He says some controversial things like people will use 24p because they are afraid their work will suck otherwise. But he is not alone if he thinks some of the cams are overpriced for the marginal difference.

Jack D. Hubbard
February 25th, 2006, 12:58 AM
Not to put too fine a point on it, but he does require us to think about the issues of Z1, P2, 720 etc. Don't get lost in the profanity or the bombast; the guy does have some experience, and that is what we all contribute to this forum. It might be offensive, but what he says is worth considering.

Joel Aaron
March 7th, 2006, 01:06 PM
The profanity doesn't bother me, but he clearly doesn't have an awareness that progressive frames are much easier to deal with in post for a variety of manipulations.


Regarding 720p vs. 1080i: I'm not knowledgeable about the details of shooting in 720p and then converting to 1080i, but here's a point I gleaned from someone else on another forum:

"720p to 1080i is a down, not up, conversion. 1080i is actually 540 lines, not 1080. It's just that every other set of 540 lines is offset to land between the lines of the adjacent set. Theoretically, this gets you a little extra resolution at the expense of some flicker artifacts. So, 1080i is pretty much equivalent to about 700 progressive lines.

The conversion can look near perfect if it's done by the best broadcast quality equipment, or it can look a bit soft if it's just the cheap conversion built into a consumer CRT HD set."

If anyone can verify or contradict this I'd be interested to know the truth.
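One way to sanity-check the quoted claim is with the so-called interlace factor. The ~0.7 figure below is a commonly cited rule of thumb, not a number from this thread, and real results depend heavily on the deinterlacer:

```python
# Rough sanity check of the quoted "1080i ~ 700 progressive lines" claim.
# The ~0.7 "interlace factor" is a commonly cited rule of thumb for the
# perceived vertical resolution loss of interlaced scanning.
INTERLACE_FACTOR = 0.7

def effective_lines(total_lines, interlaced):
    """Perceived progressive-equivalent vertical line count."""
    return total_lines * (INTERLACE_FACTOR if interlaced else 1.0)

print(effective_lines(1080, interlaced=True))   # ~756, the same ballpark as "about 700"
print(effective_lines(720, interlaced=False))   # 720.0
```

So the quote's "about 700 progressive lines" is at least consistent with the usual rule of thumb.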

Also - almost ALL the consumer HD equipment out there is 720p native, that's going to take many years to see change. RED will arrive long before then.

In short, I think I'd rather have killer 720p than average 1080i despite being perceived as "retarded" by established pixel geniuses like this interviewee.

(sorry for dragging up an older thread - just noticed the date... oh well, still interesting)

Douglas Spotted Eagle
March 7th, 2006, 01:56 PM
Also - almost ALL the consumer HD equipment out there is 720p native, that's going to take many years to see change. RED will arrive long before then.



Not by a long shot is this so. Upon what are you basing your statement?
Regarding cams, there are 2 that are 720p, and six that are 1080i. Displays? 20 million 1080p (native) displays sold in 05, 55 million expected sold between Jan 06 and Jan 07 (1080) according to CE Daily (March 06) and Peddie Research.

Joel Aaron
March 7th, 2006, 02:35 PM
Not by a long shot is this so. Upon what are you basing your statement?
Regarding cams, there are 2 that are 720p, and six that are 1080i. Displays? 20 million 1080p (native) displays sold in 05, 55 million expected sold between Jan 06 and Jan 07 (1080) according to CE Daily (March 06) and Peddie Research.

I think I phrased it wrong - there aren't many 1920x1080 native TVs out there. They are mostly 1280x720 or less.

HDTVs are all capable of displaying 1080i one way or another - I agree... but the point noted in the previous post was that 1080i doesn't appear to contain more detail than 720p, and in fact could contain less. That's my overall point... so feel free to go down that road if you happen to be a pixel guru yourself and have information to the contrary.

Here's an interesting article that covers the formats:
http://reviews.cnet.com/4520-6449_7-6361600-1.html

I guess I'm not convinced everybody that's been acquiring 720p with a Varicam has been using a "retarded format".

David Saraceno
March 7th, 2006, 02:46 PM
I guess I should have discounted Catcher in the Rye


Profanity in Catcher in the Rye was integral, at times, to the story.

You might want to let us know how his profanity is integral to his view on video cameras for an interview on the internet?

Dylan Pank
March 7th, 2006, 03:38 PM
I think we're getting caught up here in something that's started to happen on the net: bloggers inflating their importance by interviewing each other.

He has an opinion, that's for sure, and he's welcome to it, but it's not really backed up by much other than bluster and self-regard.

Matt Davis
March 7th, 2006, 04:32 PM
Bloggers inflating their importance by interviewing each other.

Exactly - 'and pop will eat itself'.

If you can pardon the weird metaphor ("so weak it's almost a fortnight"), there appear to be too many spectacle wearers and not enough opticians. "My glasses are great" yell the wearers, yet others try them on and shout back "these suck". Cut to the opticians...

Douglas Spotted Eagle
March 7th, 2006, 04:34 PM
First of all, of course people who have been acquiring at 720p are not "retarded." That statement in itself speaks volumes about credibility.
Second, there is a significant difference in what 720p contains vs 1080i contains on either a 1080i or 1080p display.
Up until very recently, there haven't BEEN 1080 displays, so of course there haven't been visible differences.
You have two axes of rez: spatial and temporal. Everyone keeps getting hung up on spatial only, probably because it's the easiest way to compare images with stills. But pictures aren't stills.
Temporally, 1080 is significantly more information. Spatially, while there are indeed more definitive lines of horizontal resolution in a progressive-originated, full-raster 720p image (which the HVX doesn't begin to approach), the story doesn't stop there. You simply cannot take the approach of "720p contains more horizontal information and is therefore better." That's ridiculous. It's like taking 540 lines of resolution, shifting it, and calling that 4:2:2 HD. If I record VHS to an HDCAM deck, does that make it 4:2:2, or make it HD, or still yet, make it 1920x1080? I guess so, but then the question becomes whether it's usable or not. The first part of the question is somewhat cut and dried, but the second part is pretty subjective.
Back to point, both formats have merit. 1080i certainly is substantially different, and in some ways, superior. Displays shipping today, and for many, many years to come, will all be 1080.
Put 720p on a native 1080p display, and put a 1080i image on the same screen. If you don't see a difference, then anything we might be able to subjectively discuss is moot anyway.
Bloggers, evangelists, etc may all input their opinions, but as others have commented...they're just opinions. Most of them form their opinions by numbers, not experience or testing.
Bottom line, what does the eye see and perceive.

Joel Aaron
March 7th, 2006, 06:33 PM
Bottom line, what does the eye see and perceive.

Right, exactly! There are a lot of pixelwars and format bashing going on - but you just have to look at the images and decide. That's kinda tough to do for a lot of us... but making decisions based on a spec sheet or marketing hype isn't going to work these days.

Douglas Spotted Eagle
March 7th, 2006, 06:40 PM
No, you can't make a decision based on marketing hype.
But at the same time, be SURE you're looking at all things equally.
For example...I was recently in a major retailer's store in NYC. One of the buyers there was dead set on convincing me why the 720p cam was better than the 1080 cam. He pulls me into a back room where they're feeding both cams into a BVM monitor via component. Ummmm, first of all, component out doesn't relate to what's going to tape. And that's the first thing that counts.
OK, so we then record to tape, and then send the recordings out. Indeed, the shots of rez charts showed the 720p a little ahead of the 1080. Except the BVM monitor is only 960 lines. We took the same two cams/tapes out to the floor and put the images on an SXRD monitor, which is a native 1920x1080 monitor, and the poor guy was shocked. He'd been running around so excited at his discovery, only to have it dashed to the floor.
In other words, compare formats with the final display resolution, not a compromised resolution. This way, everything is as close to equal as possible.
The two things that matter most, IMO, are image sensor resolution and DSP quality. From there, you get to start making some decisions.

Joel Aaron
March 7th, 2006, 06:47 PM
The two things that matter most, IMO, is image sensor resolution, and DSP quality. From there, you get to start making some decisions.

Plus there's dynamic range, motion blur, noise, artifacting under different conditions etc. - resolution is just one thing.

Not sure what cameras you're talking about here - but the JVC and Canon stuff I've seen looks awfully good to me.

David Heath
March 7th, 2006, 07:22 PM
"720p to 1080i is a down, not up, conversion. 1080i is actually 540 lines, not 1080..........

The conversion can look near perfect if it's done by the best broadcast quality equipment, or it can look a bit soft if it's just the cheap conversion built into a consumer CRT HD set."
A lot of these comparisons become meaningless if frame rates aren't defined. "720p" is usually taken to mean 720p/50, and "1080i" is normally taken to mean 1080i/25 in 50 Hz countries. (And using EBU nomenclature - the latter means 25 frames, 50 fields.)

In this case the two are by and large considered comparable in quality, so perhaps "cross conversion" may be a better way to describe it. But 1080 production here is not limited to 1080i/25; most drama etc. is made 1080p/25 - progressive, highest resolution, and many producers actually prefer the 25fps temporal look. Most importantly, it can be transmitted as if it were 1080i/25, when it is properly described as 1080psf/25.

Conversely, "720p" can also be 720p/25, and here - provided the front end is capable of delivering - 1080p/25 must obviously be capable of better results than 720p/25. That's not to say there is anything wrong with 720p. There was never anything 'wrong' with 16mm film, but 35mm is obviously 'better'......
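For reference, the EBU-style names above can be tabulated (a sketch only; full-raster luma figures are assumed for simplicity, whereas HDV 1080 actually records a 1440-wide raster):

```python
# The EBU-style format names above: fields per frame and raw luma
# throughput, assuming full 1280/1920 rasters for simplicity.
formats = {
    #  name        (width, lines, frames_per_sec, fields_per_frame)
    "720p/50":    (1280,  720, 50, 1),
    "1080i/25":   (1920, 1080, 25, 2),
    "1080p/25":   (1920, 1080, 25, 1),
    "1080psf/25": (1920, 1080, 25, 2),  # progressive content carried as segmented fields
}
for name, (w, h, fps, fields) in formats.items():
    print(f"{name:10s} {fields} field(s)/frame, {w * h * fps / 1e6:.2f} Mpixel/s")
```

The raw sample rates come out close (46.08 vs. 51.84 Mpixel/s), which is one reason 720p/50 and 1080i/25 are considered broadly comparable.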

Graeme Nattress
March 7th, 2006, 08:46 PM
I can excuse profanity, but I can't excuse technical inaccuracies....

720p has at least the same vertical resolution as 1080i - fact.
horizontal resolution is theoretically higher in 1080i, but in practice, you don't get very much advantage....

720p60 IMHO uprezzes to 1080p60 much better than 1080i60 does... Making it actually more futureproof, not less....

720p is NOT medium definition. To say so shows a complete lack of understanding of how interlaced video works, and its limitations.

The RED camera is not a big IF. If you don't know that by now, which rock have you been hiding under?

It's much harder to compress an interlaced stream than a progressive one. 1080p30 is easier on the codec than 1080i60. 1080p24 is easier still.

Graeme

Douglas Spotted Eagle
March 7th, 2006, 08:58 PM
720p60 IMHO uprezzes to 1080p60 much better than 1080i60 does... Making it actually more futureproof, not less....


Can't buy that one, not even by half. Having spent a lot of time with both Qualia projectors, Qualia 70" display, and countless 1080p displays, having fed virtually every format and resolution from every HD camera currently available and some that aren't shipping yet (including ENG cams that you'll see at NAB 06') my eyes see a very different story. I'm not the only one. I've made the conversions using several tools, both hard and software.
Not digging at 720p at all. Just saying that I can't agree.
720p is a square pixel format. Compression doesn't care about square or non-square, but of course, compression cares about interlaced vs progressive. However, square pixels have to be converted to non-square pixels, lines have to be nearly doubled, so it's not a 2:1 conversion like it is with 1080i converted to 1080p 30, and same holds more or less true for 1080p 30 converted to 1080p 60.
720p to 1080p 30 requires both horizontal and vertical pixel shift, whereas 1080i does not, it's just a temporal shift.

Just realized your comment about the RED camera. Re-read my post before jumping on my backside. I'm well aware of the camera. My disagreement is with the comment that "most displays are 720p". That's a bogus statement from either side of the fence. I wasn't at all referring to RED.

Ken Hodson
March 8th, 2006, 03:53 AM
I think that since this discussion was sparked by this farce of an interviewer's statement that 720p is "medium rez", we should get away from the display argument and focus back on the capture aspects the person was referring to. As far as capture is concerned, the difference is not 1920x1080i compared to 1280x720p but 1440x1080i compared to 1280x720p. When you calculate in the inherent resolution loss from being interlaced, the two resolutions become amazingly close. Uprezzing to 1920x1080p will not suddenly give more real resolution to either. If you are watching Cinealta 1920x1080-derived material, well then - but we are talking HDV in this forum.

Graeme Nattress
March 8th, 2006, 07:30 AM
Not digging at 720p at all. Just saying that I can't agree.
720p is a square pixel format. Compression doesn't care about square or non-square, but of course, compression cares about interlaced vs progressive.

But full 1080p is also a square pixel format. HD was designed to be square pixel from the start, but then people started subsampling the raster and making a mess of that ideal. Ouch.

However, square pixels have to be converted to non-square pixels, lines have to be nearly doubled, so it's not a 2:1 conversion like it is with 1080i converted to 1080p 30, and same holds more or less true for 1080p 30 converted to 1080p 60.

Being an exact line multiplier is not a quality factor. You have to look at many rows of pixels to re-create the new ones you're inserting, so 2:1 scaling offers no real advantage. Indeed, the complexity of doing a proper de-interlace on 1080i60 far outweighs the complexity of scaling 720p60 to 1080p60 by a vast margin. Also, scaling can be, if you want it to be, a simple engineering task based upon commonly understood scaling principles - pick a polynomial function and just do it - whereas de-interlacing is, quite frankly, a black art, and much more computationally expensive.

720p to 1080p 30 requires both horizontal and vertical pixel shift, whereas 1080i does not, it's just a temporal shift.

Scaling in two dimensions is easy, temporal stuff is very hard in comparison.

When I'm running the code I'm working on, I often use 720p footage, but displayed as 1080p - works very well in my programming environment. All you get is a slight softness, but it really passes well for the 1080p, whereas in any conversion of 1080i to 1080p you're introducing artifacts from the de-interlacing which, even with the best de-interlacers, are visible, and visible as something other than softness, so they do tend to stand out.
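The asymmetry Graeme describes can be sketched in a few lines. This is a grayscale toy example under stated assumptions: a naive "bob" stands in for real (much harder) deinterlacing, and the function names are illustrative, not from any library:

```python
import numpy as np

# Toy illustration: scaling 720 lines to 1080 is one spatial
# interpolation, while a naive "bob" deinterlace must rebuild a full
# frame from a single 540-line field (halving vertical detail).

def scale_vertical(frame, new_lines):
    """Linear interpolation along the vertical axis (a basic scaler)."""
    old_lines = frame.shape[0]
    positions = np.linspace(0, old_lines - 1, new_lines)
    lo = np.floor(positions).astype(int)
    hi = np.minimum(lo + 1, old_lines - 1)
    t = (positions - lo)[:, None]
    return frame[lo] * (1 - t) + frame[hi] * t

def bob_deinterlace(field, frame_lines):
    """Naive 'bob': stretch one field to full frame height.
    Real deinterlacers must do far more (motion, twitter, cadence)."""
    return scale_vertical(field, frame_lines)

frame_720 = np.random.rand(720, 1280)    # one progressive 720p frame
field_1080 = np.random.rand(540, 1920)   # one field of a 1080i frame

print(scale_vertical(frame_720, 1080).shape)    # (1080, 1280)
print(bob_deinterlace(field_1080, 1080).shape)  # (1080, 1920)
```

The point of the sketch: both paths end at 1080 lines, but the 720p path interpolates between real neighboring lines, while the interlaced path starts from half the vertical information per instant.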

Graeme

Douglas Spotted Eagle
March 8th, 2006, 08:43 AM
so 2:1 scaling offers no real advantage.

I'll leave that one to the engineers to argue deeply. However Poynton, Sony, and Grass Valley's engineers have all offered different information. Poynton is the only person who I've seen lay it out clearly, and I'm terrible with math. I can follow it, but can't create it. JVC uses 2:1 scaling as part of their marketing message, but obviously, that can't be accepted any more than Panasonic's 4:2:2 HVX message either, so I'll gracefully bow out of the debate here. All I can go on is what my eyes tell me, what engineers have shown me, and by what sensibility suggests to me.

Graeme Nattress
March 8th, 2006, 08:58 AM
Well, all I can say is that people who make decent hardware standards converters have done superb jobs with NTSC to PAL, and that's a 1.18518518519 scaling factor - nothing nice at all. The 3/2 scaling factor for 720p to 1080p is as "nice" a number as 2/1, and although you're scaling 720p in both directions, most practical 1080i formats need scaling horizontally also, as well as de-interlacing.
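Those ratios check out as exact fractions (the 1.18518518519 figure above matches the 576/486 active-line ratio between PAL and NTSC):

```python
from fractions import Fraction

# The scaling ratios discussed above, as exact fractions.
print(Fraction(1080, 720))        # 3/2   -- 720p frame to 1080p frame
print(Fraction(1080, 540))        # 2     -- one 1080i field to a 1080p frame
print(Fraction(576, 486))         # 32/27 -- NTSC (486) to PAL (576) active lines
print(float(Fraction(576, 486)))  # 1.1851851851851851
```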

Of course, shooting 1080p60 is best of all.... :-)

Although you just have 2x scaling to take 1080i60 to 1080p60, you have to have a very high quality de-interlace, and you've got to account for that interlace twitter where, as anyone who has ever single-field stepped through video knows, the image seems to jump up and down half a pixel as you step forwards; accounting for that in your conversion is a big challenge. It can work on general video, but if you get any sharp architectural lines, it can be very distracting indeed. You'd need to get into temporal motion-vector style algorithms to account for that, and those, although good, are seldom perfect in a software environment, never mind a real-time one.

So, the problems in de-interlacing, separating fields and avoiding vertical twitter probably outweigh any scaling issues by a factor of about 10:1. 720p60 to 1080p60 involves only simple scaling. 1080i60 to 1080p60 involves scaling and de-interlacing, a significantly harder problem.

No need to debate further - what YOU see is often more important than anything else, but I would strongly suggest that a 1080i60 to 1080p60 conversion is very non-trivial compared to the simple scaling needed to get 720p60 to 1080p60.

Graeme

David Kennett
March 8th, 2006, 10:07 AM
Let's see... ten years from now the only 1080i native displays (CRT) will all be in museums. If we shoot 1080i, it MUST be converted, no matter what it is shown on. If we shoot 720p, it can be shown native on many displays, or converted to 1080p on others. If we shoot 1080p, it can be shown native on many displays, or converted to 720p on others.

Conversion is never a good thing!

Interlaced scanning was DESIGNED for the CRT in an analog world. Not only did it solve a flicker problem, but it provided a good compromise between spatial and temporal resolution. Today, digital processing could provide a much better methodology.

Graeme Nattress
March 8th, 2006, 10:12 AM
Totally agreed David - interlace is dead. Allowing HD to have interlace was a very bad move IMHO.

Graeme

Kevin Shaw
March 8th, 2006, 10:22 AM
No need to debate further - what YOU see is often more important than anything else, but I would strongly suggest that a 1080i60 to 1080p60 conversion is very non-trivial compared to the simple scaling needed to get 720p60 to 1080p60.

Right, the important thing is to do whatever conversions you need to do and assess visually how the results look. DSE has reported here that he's finding 720p HDV source material to be insufficient for customers with the best 1080p displays, but 1080i HDV source is converting nicely to 1080p30.

Graeme Nattress
March 8th, 2006, 10:36 AM
Ah, but converting 1080i60 to 1080p30 is quite a lot simpler than converting 1080i60 to 1080p60.

And there's no 720p60 HDV camera, but 720p30 HDV does indeed convert to 1080p30 very easily and looks great. I'd really say that if 720p30 doesn't scale well to 1080p30 on your projector, then its scaler isn't that good.

Getting rid of interlace should have been the first step to HD, not the last!!

Graeme

Dylan Pank
March 8th, 2006, 10:43 AM
Graeme, that's post number 1000! do you win a prize or something?

Isn't there also the issue that interlacing leaves behind additional artefacts from the MPEG2 compression that otherwise would not be there? I've found de-interlacing Z1 footage to 25p often leaves behind ugly little blotches that aren't necessarily there in CF25. I've tended to go back to CF25 these days in any "film look" situation, despite the res drop, because of this.

Ironically, I find the artefacts worse in plain colour areas rather than in contrasty, detailed areas.

Ken Hodson
March 8th, 2006, 11:27 AM
Right, the important thing is to do whatever conversions you need to do and assess visually how the results look. DSE has reported here that he's finding 720p HDV source material to be insufficient for customers with the best 1080p displays, but 1080i HDV source is converting nicely to 1080p30.

Considering the FX1/Z1 arguably captures less detail/true resolution than the 720p HD100, and then has to be de-interlaced for 1080p, I find the math/logic very hard to buy.

Graeme Nattress
March 8th, 2006, 12:13 PM
Dylan, I think I have to get to 1080p(osts) to get a prize!

MPEG2 compression is tough on interlace, but depending on the de-interlace algorithm I too find it can reveal artifacts that were less visible when interlaced - I don't think it creates them though.

Ken, agreed that the 720p from the HD100 has more real detail than you get from the Z1/FX1.

Graeme

Kurth Bousman
March 8th, 2006, 12:24 PM
...now that's what I was hoping for when I posted that link - a real discussion. Now I can go back and reread it (the thread, except for the first page) and maybe learn something. Thanks esp. to Douglas and Graeme for digging in and giving us more to think about. Kurth

Graeme Nattress
March 8th, 2006, 12:34 PM
It's always a good discussion, and it's great that you're getting multiple educated points of view. Douglas really knows his stuff, and so do I, and that we disagree should be taken as that there's not a simple answer to the question, not that either of us is necessarily wrong. If our comments stop and make you think, then that's the best result by far :-)

Graeme

K. Forman
March 8th, 2006, 01:05 PM
Unprofessional language does not lend credibility to one's opinion.
C'mon... Even George Lucas drops the F bomb. I remember right after Return of the Jedi came out, he was saying, "What was I thinking? It's nothing but a bunch of F'ing muppets!"

Douglas Spotted Eagle
March 8th, 2006, 01:38 PM
Getting rid of interlace should have been the first step to HD, not the last!!


I agree in theory, but in practice, progressive wasn't even on the plate when the HD spec was developed, nor when the Grand Alliance proposed it, nor when it was officially accepted.

Additionally, had the Grand Alliance had their sh** together, we never would be hearing of 720p: 720p was intended as an interim, and as history would have it, it will indeed be an interim format, which is the main reason I'm mostly a 1080 guy.

We do have three 720 cams, we use them a lot, and we have scaled their footage a lot. We first noticed problems on a 60" SXRD monitor at Government Expo, when one of our major clients needed vid for their tradeshow. They took the 720p-acquired and -delivered footage, viewed it on their Samsung 70", and told us they were unhappy. We reshot the same footage using 1080, they were happy, and it ended up on 10 screens at CES in January. That whole real-world process is what caused us to dig in deeply and really stretch what we could do, even buying two different 1080 displays and borrowing a Qualia projector to rip into the footage. (This client is seriously important to us, obviously.) I will say that in the process, I've learned that scaling isn't as easy as one might think. While scaled vid looks great on displays smaller than 40", its flaws become exponentially more visible past 60", and worse still when blown up with a large projector and screen. Especially look at the edges, which are somewhat hidden by bezels on many displays.

Additionally, the discussion of "real" vs. "perceived" becomes one of opinion, and at the end of the day we're really stuck with that. On paper, what Panasonic has done with the HVX simply shouldn't work. But perceptually, it's a very good camcorder. So what is "real?" Given the DSP in both camcorders mentioned, either one can be shown to best the other, depending on a number of variables.

Like Graeme says, the mere fact that we can't agree on what we're seeing or doing says a lot. I'll stick a dig in here, though, and point out that what we see on our screens is identical to what Poynton has written about in several papers and in his "bible" of HD. I have to chuckle, because when you see guys like Faroudja commenting that you can't deinterlace footage all that well while selling deinterlacing devices, and you see Teranex, S&W, Miranda, etc. all coming on with very well-made products... some manufacturer has an agenda in the industry.

1080 is the future, whatever we end up debating about semantics, processes, or the actualities of the media conversions.

Kevin Shaw
March 8th, 2006, 01:57 PM
Considering the FX1/Z1 arguably captures less detail/true resolution than the 720p HD100, and then has to be de-interlaced for 1080p, I find the math/logic very hard to buy.

If I recall correctly, resolution-chart tests show the FX1/Z1U having a slight edge over the HD100U in terms of actual recorded detail. Not nearly as much as the theoretical difference in overall resolution would suggest, but enough to make 1080i workable for either 720p or 1080p output. It makes more sense to me that 1080i works as a successful compromise than to expect 720p to upsample well to 1080p.

Thomas Smet
March 8th, 2006, 03:26 PM
Are we not kind of comparing different cameras here, instead of really just the formats?

The only true way to test the formats is to take a scene from the same camera but in both formats. I know this is impossible right now.

One thing I have done, however, is build a scene in 3D Studio Max to match a real-world setting. I then rendered it at 720p and 1080i with the intent of converting both to 1080p. The 720p was slightly softer but had far fewer artifacts. The scene I rendered had many thin-line objects such as power lines and low-angle edges.

One thing that was hard for me to test, however, is the interlace filtering that interlaced cameras always apply. Any filtering of the image I did quickly killed any extra detail the 1080i had over the 720p. I know this isn't a good test in terms of what cameras do, but it does tell me what can be done with raw images of certain resolutions and aspect ratios. My conclusion is that 720p is softer but much cleaner. I would personally rather have a slightly softer image with no missing thin details or aliased edges. I think people are looking for way too much detail. Who really wants to see every single blemish and imperfection?


Douglas, what cameras were you using for the footage your client saw on those 60" displays? Perhaps it was more a certain look of the 1080i cameras and nothing to do with the 720p format. You can take four different 1080i cameras and they will all look very different. The JVC HD100, for example, does have a slightly smoother, more filmic look, which some people used to the crispness of video might not like. That doesn't mean it has less detail, just a different look and style. If you compared a high-end 2/3" 1080i camera to a prosumer 1/3" 720p camera, well, no kidding there was a boost in detail between the two images. If you compared an HD100 to a Z1, I cannot see how there could be more detail.

Considering a lot of people think a progressive-scan SD DVD is HD on a 60" HDTV, I think it is going to be a long time before people start choosing a 1080 image over a 720 image (unless somebody just sells them the numbers first).

There are only two areas where I consider 1080i HDV to have an advantage:

1. Smoother motion at 60/50 fields per second. Hopefully this will change when we get 720p60.

2. Even though 1080i has more compression per pixel, the compression artifacts are not blown up as much. MPEG-2 artifacts at 720p can show up more when scaled up to 1080p.
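
[Editor's note: point 2 above can be put in rough numbers. The rates below are the nominal HDV video bit rates (roughly 19.7 Mbps for 720p, 25 Mbps for 1080i); treat this as a back-of-envelope sketch, since real encoders vary frame to frame.]

```python
# Bits per recorded pixel for the two HDV flavors, assuming 30
# frames per second (1080i delivers its 30 frames as 60 fields).
bpp_720p  = 19.7e6 / (1280 * 720 * 30)   # 720p raster
bpp_1080i = 25e6 / (1440 * 1080 * 30)    # 1080i raster

print(round(bpp_720p, 3), round(bpp_1080i, 3))  # 0.713 0.536
```

So 1080i HDV does spend fewer bits on each pixel, but because its raster is larger, any given artifact covers a smaller fraction of the final 1080p frame than a 720p artifact scaled up to the same size.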


For those interested in the points Graeme and Douglas have made, you should check out Steve Mullen's take on 720p HDV vs. 1080i HDV. Most of his tests show many 1080i cameras delivering less detail than the 720p from the HD100.