View Full Version : XDCAM EX vs JVC GY-HD250



Thomas Smet
November 12th, 2007, 12:04 PM
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?



New ones perhaps, but most of the HDTVs out there are the older 1080i variety. And what format are broadcasters delivering content in? 1080i! So footage shot 720p gets converted to 1080i and DOES NOT look as good. I've done conversions from 1080i to 720p (60i and 60p) and most people do not notice a quality hit (it is minimal), but try the opposite (720p60 to 1080i60) and a lot of people notice a softening of the image.

I will give you that slo-mo looks better at 720p. So if slow motion is needed, shoot 720p for those parts but 1080 for all the rest! That's what I do...

Also, 1080i footage deinterlaced by a 1080p television looks fantastic! If a TV can do the math on the fly, imagine what software algorithms can accomplish! I'd say shooting 1080i now is "future proofing" for whenever 1080p60 becomes a broadcast format.

I also agree that 1080i is less bit-efficient than 720p. But the difference is negligible.

As far as the camera itself goes, 1/2" chips tend to give better color reproduction, latitude and of course nicer depth of field qualities. All of this translates to a more professional image compared to the 1/3" chip cameras. I think all future prosumer cameras will HAVE to be 1/2" chip cameras in order to combat the low-light problem. It really is a physics problem at this point and Sony is pushing the competition to this arena...which we will all benefit from!

Wrong. Not all HD is broadcast as 1080i. That is a myth. ABC, FOX and ESPN are 720p.

Of course 1080i creates an optical illusion of solid frames, but that doesn't change the fact that it is still 540-line fields alternating. Of course this makes it look like 60p, but it can have detail flickering if those details are too small. In order to make sure you do not have details that are 1 pixel thick you have to use low-pass filtering, which in the end brings the sharpness down. The image I posted shows that. The fact that it is interlaced doesn't reduce the level of detail at all; it is the low-pass filtering and the jagged edges that are the problem with 1080i. If you do not agree with what I have done, please do your own test and show me I am wrong. I have yet to see any examples of how 1080i is that much better than 720p.

Evan Donn
November 12th, 2007, 07:16 PM
1080i is NOT 540 lines of resolution. I hate how people continue to perpetuate this myth. Although it may be a trick of the eye, 1080i TO HUMANS, not on a sheet of paper or freeze frame, shows a higher resolution image. And real world quality is what matters in the end...right?

Actually the passage you quoted was not equating 1080i to 540 lines of resolution - you missed the third number, which indicates the samples per second. So 50 x 540 x 1440 translates to 50 samples (fields) per second, each 540 horizontal lines of 1440 pixels. A frame, composed of two fields, will have more than 540 lines of resolution, but fewer than 1080 due to the low-pass filtering, as Thomas mentioned. However, each individual field cannot have more than 540, because that's how many horizontal lines it contains.
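
To make the field/frame distinction concrete, here is a tiny sketch (my illustration, not Evan's; it assumes numpy and a stand-in array) of how two 540-line fields weave into one 1080-line frame:

```python
# Each field samples every other line of the frame, so a single field
# never carries more than 540 lines of vertical detail.
import numpy as np

frame = np.arange(1080 * 1440).reshape(1080, 1440)  # stand-in for a 1080 image

field_odd = frame[0::2, :]   # lines 0, 2, 4, ... -> 540 x 1440
field_even = frame[1::2, :]  # lines 1, 3, 5, ... -> 540 x 1440

rebuilt = np.empty_like(frame)
rebuilt[0::2, :] = field_odd    # weave the two fields back together
rebuilt[1::2, :] = field_even

assert field_odd.shape == (540, 1440)
assert (rebuilt == frame).all()  # only holds when nothing moved between fields
```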

Adam Reuter
November 12th, 2007, 08:40 PM
Actually the passage you quoted was not equating 1080i to 540 lines of resolution - you missed the third number, which indicates the samples per second. So 50 x 540 x 1440 translates to 50 samples (fields) per second, each 540 horizontal lines of 1440 pixels. A frame, composed of two fields, will have more than 540 lines of resolution, but fewer than 1080 due to the low-pass filtering, as Thomas mentioned. However, each individual field cannot have more than 540, because that's how many horizontal lines it contains.

Again, on paper 720p has higher resolution at any given instant, but can humans really discern 60 frames per second? I won't dispute that for sports and high-motion work it is best. But for those landscape/vista shots, that's when 1080's extra detail counts. And that's why Discovery HD Theatre broadcasts in that format.

When blended together we get 1080 lines. On a good enough TV (Pioneer Elite series or something) 1080 lines definitely look better than 720. Sitting closer to the TV allows you to see an even greater difference. Even on the 55" projection HDTV that I have at home, FOX and ABC look softer than NBC and CBS broadcasts. This was BEFORE I found out how they broadcast. Originally I thought Comcast was giving those channels less bandwidth, but now I know why the picture looks softer. I probably wouldn't notice the difference on a 40" or smaller HDTV though.

Like Jody Elred said, try converting 1080i to 720p and then try the reverse. 1080 yields better resolution. And since there are reports that this camera has 1,000 lines of resolution on the charts, users should definitely benefit from shooting in the 1080i format.

But really, perhaps all this debate is moot when we look at surveys. Contrast ratio is the number one factor in how picture quality is perceived. The 1/2" chip offering should help that a lot. And number two is color saturation/accuracy. Hopefully Sony gets these right!

Kevin Shaw
November 12th, 2007, 11:04 PM
Even when you can have 100% true 1080p there just isn't that much of a huge difference.

Sure there is: full 1080p has 2.25 times the resolution of 720p, which in turn has 2.67 times the resolution of SD video. If you can see the difference between 720p and SD, you'll be able to see the difference between 1080p and 720p on a 1080p display.
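
For reference, Kevin's ratios check out as plain arithmetic (a quick sketch of mine; the 720x480 NTSC frame for SD is my assumption):

```python
full_hd = 1920 * 1080    # 2,073,600 pixels
hd_720  = 1280 * 720     #   921,600 pixels
sd      = 720 * 480      #   345,600 pixels (NTSC)

print(full_hd / hd_720)  # 2.25 -> 1080p over 720p
print(hd_720 / sd)       # 2.67 -> 720p over SD
```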

And enough of the simulated photo-based examples: either compare footage from actual cameras or wait until someone else gets a chance to do so. Thanks.

John Bosco Jr.
November 13th, 2007, 05:20 AM
Sometimes I wonder...

How many people actually know that 720p50 is far superior to 1080i50? Or how many even know that 1080i is NOT full HD? And yet - it is just a matter of the easiest mathematics:



Totally wrong. I'm in NTSC land, but the concept is still the same. You cannot measure interlace as progressive because it's not progressive. It splits the frames to achieve the 1080 lines of resolution. The shortcomings of interlace lie in fast motion; that's where progressive shines. But, basically, 1080 is bigger than 720. You can't argue the numbers; however, I wouldn't classify 1080i as far superior, but it is superior.

Oh, by the way, just because a particular codec processes the pixels differently doesn't mean it's not full HD. 1440 x 1080i will display on a monitor as 1920 x 1080i.

1080i is the best we have until the broadcast bandwidth problems are solved, and 1080p becomes the high def standard.

Alex Leith
November 13th, 2007, 05:43 AM
But 1080i isn't superior... Interlace is a dinosaur from the 1930s when tube technology couldn't scan full-resolution images fast enough to give the perception of a moving image. All new LCD technology is progressive and has to de-interlace interlaced signals to display them... and how well that is done really depends on the set.

Mathematically 1920 x 1080i has the edge over 720p, but 720p has the edge over 1440 x 1080i.

1920 x 540 x 60i = 62.2 million pixels per second
1280 x 720 x 60p = 55.3 million pixels per second
1440 x 540 x 60i = 46.7 million pixels per second
720 x 240 x 60i = 10.4 million pixels per second
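
A quick script to reproduce the figures above (my sketch; for the interlaced formats a "sample" is one 540- or 240-line field):

```python
# width x lines-per-sample x samples-per-second
formats = {
    "1920x1080 60i": (1920, 540, 60),
    "1280x720  60p": (1280, 720, 60),
    "1440x1080 60i": (1440, 540, 60),
    "720x480   60i": (720, 240, 60),
}
for name, (width, lines, rate) in formats.items():
    print(f"{name}: {width * lines * rate / 1e6:.1f} million pixels per second")
# 62.2, 55.3, 46.7 and 10.4 -- matching the list above
```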

However, what does it look like? Personally I think all progressive images look preferable to similar spec interlaced images. Our eyes do not see interlace, and surely the ultimate goal is to match the image to our eyes? Although I find 720p60 a touch unnerving because the motion seems a little too fluid... It takes away the dreamy 25p haze!

It's interesting how it's partly cultural. More North Americans seem attached to interlacing than Europeans. I suspect it has to do with the fact that we Europeans have been seeing 25p (or psf) every time we watch a film on TV for the past 60 years, whereas there hasn't been any historical broadcasting of 30p (or real 24p) in North America, given that films for 60Hz regions get that awful cack-handed 2:3 pulldown. (See, it's cultural.)

John Bosco Jr.
November 13th, 2007, 05:47 AM
What some of you don't understand is that the networks' decision on which high-definition standard to go with was not much different from some of our arguments. For example, the U.S. networks CBS and NBC overwhelmingly decided to go with 1080i simply because the top executives could not accept 720p as high definition. ABC and Fox felt that progressive would be better for the majority of their shows, which are sports. Although they were hoping for 1080p, they felt that the faster motion of sports would view better using 720p.

David Heath
November 13th, 2007, 06:06 AM
....... the networks' decision on which high-definition standard to go with was not much different from some of our arguments.
Timing is also important, with HD broadcasting becoming a reality far earlier in the States than most of the rest of the world (except Japan).

Screens with 1920 horizontal resolution have only become a reality at sensible prices in the last year or so, and the same goes for cameras - for 60Hz standards, HDCAM is 1440 horizontal, DVCProHD (1080) is 1280. Only now is 1080 starting to mean 1920, if you like. So the networks' arguments may well have gone along the lines of "why go with a format whose undeniable advantage (horizontal resolution) can't really be taken full advantage of in practice?"

Fair point - but things have moved on. HD blue laser discs are full 1080p, and many screens sold now are 1920x1080 - and likely to approach 100% before very long. Differences that once didn't matter now increasingly do, and the danger for ABC and Fox is that more and more people WILL notice differences as screen and camera technology progresses.

But hopefully the current 1080p/25 and 1080i/25 mix will eventually give way to 1080p/50, and then everybody will be happy.

David Parks
November 13th, 2007, 07:31 AM
Quote: From ABC executive on decision to go with 720p...

"They usually fail to mention that during the time that 1080i has constructed a single frame of two million pixels, [1/25] second, 720p has constructed two complete frames, which is also about two million pixels. In a given one-second interval, both 1080i and 720p scan out about 60 million pixels. The truth is that, by design, the data rates of the two scanning formats are approximately equal, and 1080i has no genuine advantage in the pixel rate department. In fact, if the horizontal pixel count of 1080i is reduced to 1440, as is done in some encoders to reduce the volume of artifacts generated when compressing 1080i, the 1080i pixel count per second is less than that of 720p".

This 720p vs. 1080i argument has been going on a while now. The truth is that in the late 90's different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.

The amazing thing is now the EX does both. Life is good.

Tim Polster
November 13th, 2007, 08:09 AM
This 720p vs. 1080i argument has been going on a while now. The truth is that in the late 90's different networks signed contracts with either Sony, in the 1080i camp, or Panasonic, in the 720p camp.

Bingo.

When they signed those contracts nobody had HD televisions in their houses and the equipment to produce in HD was such an investment that the only way to convince networks to use it was to have the manufacturers foot most of the bill.

Hence Sony or Panasonic got to choose the format because they were paying for it.

I really don't think pointy-headed pixel peeping (that we like to talk about) had much bearing on the choice for the execs. IMHO

Joel Chappell
November 13th, 2007, 08:19 AM
With the EX1 capable of both 720p and 1080p, among other formats, does that give it an edge over the 250 in this one area? Or does the JVC claim that 720p is here for the foreseeable future, plus the ability to easily "uprender" 720p to 1080i, have enough merit to even it up a bit?

David Heath
November 13th, 2007, 09:41 AM
If you want 24/25fps motion ("film-look") then the EX should have a big edge theoretically because of the 1080p/24(25). It's when you want 50/60Hz motion that the argument gets more complicated. I don't think anyone will argue 720p/25 is superior to 1080p/25, if you're starting with 1920x1080 chips.

Thomas Smet
November 13th, 2007, 12:04 PM
Sure there is: full 1080p has 2.25 times the resolution of 720p, which in turn has 2.67 times the resolution of SD video. If you can see the difference between 720p and SD, you'll be able to see the difference between 1080p and 720p on a 1080p display.

And enough of the simulated photo-based examples: either compare footage from actual cameras or wait until someone else gets a chance to do so. Thanks.

At least I try to be honest and back up what I say instead of just saying it and expecting everybody to believe I am right just because I say so.

Kevin, you are thinking with numbers and not with how it looks. Resolution is not everything and the bigger an image gets the harder it is to notice the extra detail. You cannot just say 2.25 times larger and leave it at that because that tells nobody how it will actually look.

Take a 1080i sample and down-convert it to 720p. Sure, you may notice a slight detail loss, but not 2.25 times' worth. If you do not trust my sample, which comes from a higher-quality source than real 1080i footage would, then do the test yourself. Sure, this is a photo, but that actually gives more of an edge to 1080 since the source is such high quality.

Thomas Smet
November 13th, 2007, 12:13 PM
If you want 24/25fps motion ("film-look") then the EX should have a big edge theoretically because of the 1080p/24(25). It's when you want 50/60Hz motion that the argument gets more complicated. I don't think anyone will argue 720p/25 is superior to 1080p/25, if you're starting with 1920x1080 chips.

Well that depends. My sample shows that while 1080p may have an edge in detail, it isn't a huge edge like some would like to think.

There is also the fact that if you are shooting 24p/25p in 720p you will get much better compression compared to the 1080p flavor. 720p at 24p uses 2.5 times fewer frames than at 60p, so the 35 Mbit/s will result in much higher quality compression. The 1080p sample would need a bitrate higher than 60 Mbit/s to get the same compression quality as the 720p sample. So sure, maybe the 720p will have less detail, but it will have better compression. If you are shooting an action movie I would rather use 720p because it should be very hard to break the codec. 720p also has the option of shooting slow-motion segments, which is something you can't really do with 1080p. I would much rather have all my film's shots at the same level of detail and stick with 720p for a consistent look than alternate between the two resolutions.

720p at 24p/25p gives an artist a lot more creative options and lighter compression at only a slight loss in detail.
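
A back-of-envelope check of the bitrate argument (my arithmetic, with the 35 Mbit/s XDCAM EX HQ rate as the only given):

```python
BITRATE = 35e6  # bits per second

def bits_per_pixel(width, height, fps, bitrate=BITRATE):
    return bitrate / (width * height * fps)

bpp_720p24 = bits_per_pixel(1280, 720, 24)  # ~1.58 bits per pixel
bpp_720p60 = bits_per_pixel(1280, 720, 60)  # ~0.63 bits per pixel

# Bitrate 1080p24 would need to match 720p24's bits-per-pixel budget:
needed = bpp_720p24 * 1920 * 1080 * 24      # ~79 Mbit/s, i.e. "higher than 60"
print(f"{bpp_720p24:.2f} vs {bpp_720p60:.2f} bpp; 1080p24 needs {needed / 1e6:.0f} Mbit/s")
```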

Chris Hurd
November 13th, 2007, 12:15 PM
...you are thinking with numbers and not with how it looks. Resolution is not everything and the bigger an image gets the harder it is to notice the extra detail. You cannot just say 2.25 times larger and leave it at that because that tells nobody how it will actually look.

This is indeed the truth, and it is a fundamental concept that it is the mission of this site to impress upon all who visit here. I try to go out of my way to promote this single fundamental understanding to everyone who wanders through this site.

It's never about the numbers, or it's only partially about the numbers, and numbers always take a back seat to how the image actually looks. Plus there are other factors that need to be considered, such as viewing distance: there is a point where the viewer's distance from the screen renders moot any difference in "resolution." Thanks, Thomas.

John Bosco Jr.
November 14th, 2007, 05:30 AM
Timing is also important, with HD broadcasting becoming a reality far earlier in the States than most of the rest of the world (except Japan).

Screens with 1920 horizontal resolution have only become a reality at sensible prices in the last year or so, and the same goes for cameras - for 60Hz standards, HDCAM is 1440 horizontal, DVCProHD (1080) is 1280. Only now is 1080 starting to mean 1920, if you like. So the networks' arguments may well have gone along the lines of "why go with a format whose undeniable advantage (horizontal resolution) can't really be taken full advantage of in practice?"


True. However, broadcast stations knew what they were getting into with HD, as Sony and Panasonic demonstrated the two options, 1080i and 720p, on full HD monitors, 1920 x 1080i and 1280 x 720p. In fact, the monitors were CRTs. Panasonic later demonstrated 720p on huge, motion-picture-like video screens and Sony on plasma. However, what was not known at that time was that 720p60 would be broadcast; 720p30 was the standard then. I still don't think that would have changed the networks' view on which standard to go with.

John Bosco Jr.
November 14th, 2007, 05:41 AM
Well that depends. My sample shows that while 1080p may have an edge in detail, it isn't a huge edge like some would like to think.

There is also the fact that if you are shooting 24p/25p in 720p you will get much better compression compared to the 1080p flavor. 720p at 24p uses 2.5 times fewer frames than at 60p, so the 35 Mbit/s will result in much higher quality compression. The 1080p sample would need a bitrate higher than 60 Mbit/s to get the same compression quality as the 720p sample. So sure, maybe the 720p will have less detail, but it will have better compression. If you are shooting an action movie I would rather use 720p because it should be very hard to break the codec. 720p also has the option of shooting slow-motion segments, which is something you can't really do with 1080p. I would much rather have all my film's shots at the same level of detail and stick with 720p for a consistent look than alternate between the two resolutions.

720p at 24p/25p gives an artist a lot more creative options and lighter compression at only a slight loss in detail.

That is the best argument yet for 720p, and considering that U.S. broadcast TV compresses to 19 Mbit/s using long-GOP MPEG-2, compression artifacts will have some impact. Good point.

Of course, I don't know how we got started on 1080i vs 720p, as the EX1 shoots both.
I believe the EX1 is a better camera than the JVC 250, mainly because of the sensor size and higher bit rate. I feel the 250 is overpriced for what it is.

David Heath
November 14th, 2007, 05:51 AM
True. However, broadcast stations knew what they were getting into with HD, as Sony and Panasonic demonstrated the two options, 1080i and 720p, on full HD monitors, 1920 x 1080i and 1280 x 720p.
Maybe, but at the time the decisions were being made, full HD monitors weren't even available to consumers, and that possibility seemed a very long way off. Now they are close to being the norm. Hence, even if a difference was seen, it would be quite understandable for it to be ignored on the basis that it couldn't be noticeable to any consumer. Which until recently has been true.

But I agree with previous posts that too much shouldn't be read into some network decisions; they are not always made for the best technical reasons.

Kevin Shaw
November 14th, 2007, 08:14 AM
Hey gang, the EX1 will be shipping in a few days and we can start looking at some actual footage to compare that, instead of speculating about it based on technical specs. :-)

Does Blu-ray support 720p playback at 60 fps?

Tom Vaughan
November 14th, 2007, 07:58 PM
Does Blu-ray support 720p playback at 60 fps?

Yes, it does... encoded in MPEG-2, VC-1 or AVC (H.264).

Tom

Werner Wesp
November 28th, 2007, 11:16 AM
Uprezzing 1080i50 to 1080p50 in a scene with a lot of movement requires deinterlacing first, and therefore yields lower-quality 1080p50 than uprezzed 720p50 does. If the original 1080i footage isn't deinterlaced first, you get jagged edges on movement.
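
For anyone wondering what "deinterlacing first" involves, here is a minimal bob-deinterlace sketch (my illustration, assuming numpy and a grayscale frame as a 2-D array); interpolating the missing lines is exactly where the quality loss Werner describes comes from:

```python
import numpy as np

def bob_deinterlace(frame):
    """Split one interlaced frame (h x w, h even) into two progressive frames."""
    h, w = frame.shape
    out = []
    for parity in (0, 1):                # top field, then bottom field
        field = frame[parity::2, :].astype(np.float32)
        full = np.empty((h, w), dtype=np.float32)
        full[parity::2] = field          # keep the lines we actually have
        interp = 0.5 * (field[:-1] + field[1:])  # average vertical neighbours
        if parity == 0:
            full[1:-1:2] = interp        # fill odd lines 1, 3, ..., h-3
            full[-1] = field[-1]         # bottom edge: repeat nearest line
        else:
            full[2::2] = interp          # fill even lines 2, 4, ..., h-2
            full[0] = field[0]           # top edge: repeat nearest line
        out.append(full)
    return out                           # one 1080i frame -> two (softer) 1080p frames

# e.g. 25 interlaced frames in, 50 progressive frames out:
# p50 = [f for i_frame in i50_frames for f in bob_deinterlace(i_frame)]
```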

Anyhow, I think a lot of people get me wrong. Interlacing is a very nifty trick. It solved the original problem of tubes years and years ago, for instance. It is just an old-fashioned standard nowadays with virtually no benefits, since all flat-panel TVs and monitors are natively progressive. They do accept 1080i, but have to deinterlace it to show it on their screens. Interlacing is only seen in full quality on CRTs, because that is the screen technology it matches natively.

Since virtually all equipment is progressive, codecs are better adapted to progressive, and so on - why retain the old standard that was only invented because TVs couldn't handle full images at once (i.e. progressively)? How many people who own a 1080i camcorder actually have a 1080i-capable CRT monitor, I wonder...

The mathematics are simple enough; I don't have to repeat them. 720p50 packs over 46.0 million pixels per second - 1080i50 packs theoretically just under 38.9 million pixels per second*. I know which footage I'd rather have on my NLE, especially if I want to do a lot of time-stretching effects.

* In reality it is even less, because if 1080i resolved all 1080 lines, horizontal details 1 pixel thick would cause severe interlace flicker. To avoid that and keep the image steady, interlacing smears details out over several lines - resulting in a small Gaussian-filter effect - working out to a resolution loss of about 30%. More realistically, 1080i50 packs only about 28.0 million pixels per second.
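
Werner's adjustment, as arithmetic (my sketch; the 0.72 factor is one reading of his "about 30%" loss, chosen to land on his 28.0M figure):

```python
raw_1080i50 = 1440 * 540 * 50            # 38.88 million pixels per second
raw_720p50  = 1280 * 720 * 50            # 46.08 million pixels per second
effective_1080i50 = raw_1080i50 * 0.72   # ~28.0 million after low-pass filtering
print(f"{raw_720p50 / 1e6:.1f}M (720p50) vs {effective_1080i50 / 1e6:.1f}M (1080i50)")
```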

All this combined makes it difficult to believe 1080i is still here. Then again, why would Sony offer only 1080i on older models but add 720p capability to recent ones?

720p and 1080i are both standards that'll be gone soon enough and make way for a 1080p50/60 standard, but in the meantime, I'd suggest shooting in 720p50/60 - and if you really want 1080i50/60, you can always DOWNsample your original 720p50/60 to 1080i50/60...
(720p50 to 1080i50 is indeed downsampling: 46M to 28M pixels per second.)

David Heath
November 28th, 2007, 11:25 AM
All absolutely true if you need 50 or 60 Hz motion. But for most drama etc, 25Hz motion is chosen for aesthetic reasons, and then we're not talking about 1080i/25, but rather 1080p/25. But yes, the sooner 1080p/50 sweeps both 1080i AND 720p away, the better.

Werner Wesp
November 28th, 2007, 11:29 AM
All absolutely true if you need 50 or 60 Hz motion. But for most drama etc, 25Hz motion is chosen for aesthetic reasons, and then we're not talking about 1080i/25, but rather 1080p/25. But yes, the sooner 1080p/50 sweeps both 1080i AND 720p away, the better.

True. Since 1080p24, 25 and 30 are possible, those are superior to 720p24, 25 or 30. What the camera will deliver in those modes remains to be seen, but 1080p is the best you'll get, provided the camcorder can deliver it and the desired frame rate for your purpose is available in that mode.

Kevin Shaw
November 28th, 2007, 03:04 PM
My understanding is that the EX1 has a 1080p recording mode, so if it maintains full resolution at that setting it should look pretty good compared to the JVC cameras. And there are other reasons to expect the EX1 to be stiff competition, so it's not just about pixel counts and frame rates.

Thomas Smet
November 29th, 2007, 12:06 AM
I have done a few tests now with motion video. I have created many samples of animated video, including simple fast-moving objects and scenes with a lot of 3D-rendered hair and fur. In the end, with proper filtering, the 1080i versions and the 720p versions, both played back upscaled to 1080p60, look almost exactly the same. I would share these samples, but I cannot think of a way to share 1920x1080 60p video with you guys. All the tests were done in special playback software I wrote so it could run the video at 60p.

I would say the 1080i was about 1% sharper, which was kind of an illusion really. The reason it looks sharper is the interlacing. Because the video is interlaced you end up with 1080 unique lines of detail that are all different. With progressive video the lines tend to blend into each other, making a more natural-looking image. With interlaced video, however, you do not see this blend as much because of how the fields are split. 1080i does work very well for HD, and while there can be edge artifacts, they do not show up as much as some would think. 1/60th of a second is pretty fast, and before you can notice anything missing, the missing part is replaced by the next field, and so on. The beauty of 1080i, however, is that even a single field still looks like a 1080 image. A single field with the lines alternating will look much sharper than a 1920x540 rendered image, so you cannot say 1080i only has 1920x540 per field. Sure, it has that many pixels, but they are spread out better so they look more detailed.

Steve Mullen
November 29th, 2007, 12:09 AM
My understanding is that the EX1 has a 1080p recording mode, so if it maintains full resolution at that setting it should look pretty good compared to the JVC cameras. And there are other reasons to expect the EX1 to be stiff competition, so it's not just about pixel counts and frame rates.

It seems lots of folks are throwing around the term "1080p" without realizing that the only two usable frame rates are 25p -- ONLY in R50 -- and 24p -- ONLY in R60. And either frame rate is only usable where low frame rates are desired.

When smooth and clear motion capture is needed, 1080p is not available. You'll have to shoot 720p50 or 720p60.

So there remains a conflict between fine detail (1080) and clear motion (progressive). The only thing that has changed is that before we had to select Sony or JVC -- while now we can select between formats on the EX1.

Whoever can get 1080p50 and 1080p60 in production -- gets the prize. (I suspect JVC will announce at NAB 2008 as they are committed to progressive.)

What's odd is that the EX1's CMOS chips and DSP are already working at 50Hz/60Hz. It simply isn't recording it. Which is strange because solid-state recording shouldn't be a limiting factor.

Kevin Shaw
November 29th, 2007, 01:22 AM
When smooth and clear motion capture is needed, 1080p is not available. You'll have to shoot 720p50 or 720p60.

So the EX1 has a 720p60 recording mode like the JVC cameras and a full-raster 1080p24 mode, both using higher bandwidth? Sounds like healthy competition to me...

Werner Wesp
November 29th, 2007, 05:54 AM
I have done a few tests now with motion video. I have created many samples of animated video, including simple fast-moving objects and scenes with a lot of 3D-rendered hair and fur. In the end, with proper filtering, the 1080i versions and the 720p versions, both played back upscaled to 1080p60, look almost exactly the same. I would share these samples, but I cannot think of a way to share 1920x1080 60p video with you guys. All the tests were done in special playback software I wrote so it could run the video at 60p.

I would say the 1080i was about 1% sharper, which was kind of an illusion really. The reason it looks sharper is the interlacing. Because the video is interlaced you end up with 1080 unique lines of detail that are all different. With progressive video the lines tend to blend into each other, making a more natural-looking image. With interlaced video, however, you do not see this blend as much because of how the fields are split. 1080i does work very well for HD, and while there can be edge artifacts, they do not show up as much as some would think. 1/60th of a second is pretty fast, and before you can notice anything missing, the missing part is replaced by the next field, and so on. The beauty of 1080i, however, is that even a single field still looks like a 1080 image. A single field with the lines alternating will look much sharper than a 1920x540 rendered image, so you cannot say 1080i only has 1920x540 per field. Sure, it has that many pixels, but they are spread out better so they look more detailed.


This is why you can't use animated graphics as a base. When you resampled them to 1080i, probably no anti-flicker filtering was added. You haven't checked your 1080i on a 1080i CRT, I suppose.

Furthermore, most motion graphics don't stress MPEG compression the way real footage does, since there's no higher-order correction for redundant information between fields.

1080i has just 1920x540 per field (or 1440x540). On a CRT without a deinterlacing filter it seems to have 1920/1440x1080 per frame, but it is still 1920/1440x540 per field.

If you really want to test: stitch a few 720p50 videos together to get 1080p50. Then downsample to 720p50 and to 1080i50. The difference is there. I'd say it is quite something; others don't mind or don't see it. Perception is subjective, and it needs to be said that both 720p and 1080i look rather good (especially compared to SD, even PAL).
When you slow down the 1080i and the 720p it is a whole different game: no one fails to see the difference.

Thomas Smet
November 29th, 2007, 04:01 PM
This is why you can't use animated graphics as a base. When you resampled them to 1080i, probably no anti-flicker filtering was added. You haven't checked your 1080i on a 1080i CRT, I suppose.

Furthermore, most motion graphics don't stress MPEG compression the way real footage does, since there's no higher-order correction for redundant information between fields.

1080i has just 1920x540 per field (or 1440x540). On a CRT without a deinterlacing filter it seems to have 1920/1440x1080 per frame, but it is still 1920/1440x540 per field.

If you really want to test: stitch a few 720p50 videos together to get 1080p50. Then downsample to 720p50 and to 1080i50. The difference is there. I'd say it is quite something; others don't mind or don't see it. Perception is subjective, and it needs to be said that both 720p and 1080i look rather good (especially compared to SD, even PAL).
When you slow down the 1080i and the 720p it is a whole different game: no one fails to see the difference.

I have tested all of this. My graphics all had filtering at three strengths, from mild to heavy. I even tried different ways of filtering the video, such as blur filters and even downscaling slightly and scaling back up to simulate lower detail. None of my tests had sharp, crisp graphics, because I always design my graphics with some level of filtering to reduce DCT-based image artifacts when compressed. The graphics from 3D Studio Max used the video anti-aliasing method, which simulates video filtering and is actually very soft in detail. In fact, some people may even say I filtered a little too much, which is why I also tried this a few times with less filtering.

What I have found is that even if you start with totally unfiltered video, the 1080i version still isn't that much more detailed, and they both end up looking pretty much the same in terms of detail. Unfiltered, though, the 1080i looked like garbage because it flickered all over the place, while the 720p still looked very clean.

I have even stitched together uncompressed live SD captures to create fake HD, but this can only go so far because SD cameras tend to filter more than what you need with HD. It is also an almost useless test for motion, because you have no way to capture 1080i60 and 720p60 at the same time. I mostly use this method to test encoders and processing software.

As much as I hate 1080i, it really isn't fair to call it 1440x540. If you create an image that is 1440x540 pixels and then one that is 1440x540 with blank lines alternated in between, the second one will look sharper. 1080 lines with every other line duplicated will look sharper than 540 lines blown up to 1080. It is all about the illusion of detail, and that is exactly what interlacing provides. Thinking of interlaced video purely in terms of pixel counts just doesn't work, because there are so many optical-illusion factors in play.
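
The two test images Thomas describes are easy to construct (a sketch of my own, assuming numpy; the arrays are stand-ins, and the perceived sharpness difference is of course something code can't show):

```python
import numpy as np

field = np.random.rand(540, 1440)   # stand-in for one field's worth of detail

solid = field                       # a plain 1440x540 image

spread = np.zeros((1080, 1440))     # the same pixels on alternating lines,
spread[0::2, :] = field             # ...the way a lone field is displayed

# Both arrays carry exactly 540 lines of real data; only the spacing
# differs, which is why the "spread" version reads as a 1080-line image.
```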

Steve Mullen
November 29th, 2007, 05:31 PM
So the EX1 has a 720p60 recording mode like the JVC cameras and a full-raster 1080p24 mode, both using higher bandwidth?

And, a 720p24 mode.

That's why the ball is now in JVC's court.

They could do a handheld version of the HD250 and/or push the art with 1080p50 or 1080p60. Europe needs 1080p50 immediately!

Or, give us ProHD 422.

Or, all of the above.

A hard disk will handle the higher data rates needed for these.

Thomas Smet
November 30th, 2007, 12:19 AM
I am pretty sure 1920x1080 at 60p is not possible with MPEG-2, so a move to a very high level of AVC would be needed to get that type of video.

While we may at some point get some funky camera to record a funky form of 60p, you guys will be waiting a very long time for a delivery option for that type of video. If you could shoot that type of video, you would have to choose to deliver it as 1920x1080 60i or 1280x720 60p, in which case you would have been better off just shooting in that format to begin with.

For the very tiny quality boost you would get, I just don't see how spending twice the amount of bandwidth is worth it. No TV station is going to eat up double the bandwidth just for that tiny boost in quality. Most consumers, who are still very happy with DVD, are going to be perfectly happy with 720p60 and 1080i60 for a very long time yet.

This whole 1920x1080 60p thing is just insane and is more of a sick fantasy than anything of any great use, other than to waste money.

David Heath
November 30th, 2007, 04:20 AM
While we may at some point get some funky camera to record a funky form of 60p, you guys will be waiting a very long time for a delivery option for that type of video. ...........

For the very tiny quality boost you would get, I just don't see how spending twice the amount of bandwidth is worth it. No TV station is going to eat up double the bandwidth just for that tiny boost in quality. ........

This whole 1920x1080 60p thing is just insane and is more of a sick fantasy than anything of any great use, other than to waste money.
Sorry - I disagree, and the latest research has shown that the additional bandwidth required to transmit 1080p/50 is nothing like twice as much as for 1080i/25. Although the initial data rate may well be twice as much, it compresses far better, so the COMPRESSED data rate is far less than that would suggest.

The BBC in the UK has just released a detailed report about its plans to start a full HD service next year - http://www.bbc.co.uk/bbctrust/assets/files/pdf/consult/hdtv/pvt_final_conclusions.pdf - and one section is relevant to technical issues.
"3.39

We think it likely that broadcasters may face a variety of pressures, including regulatory, to move towards the 1080p picture resolution – capability for which is not available in current consumer display equipment. But we would expect that future transmission of 1080p pictures would be backwards compatible with current display equipment. So there should not arise an issue of consumer disadvantage. Nonetheless, we would expect the BBC Executive to make any move towards 1080p with sensitivity towards the current choices facing consumers who equip themselves to receive HD."
From which it seems they see a move to 1080p/50 as highly likely - the only concern being not to render existing home displays obsolete.

Werner Wesp
November 30th, 2007, 09:56 AM
As much as I hate 1080i, it really isn't fair to call it 1440x540. If you create an image that is 1440x540 pixels and then one that is 1440x540 with blank lines alternated in between, the second one will look sharper. 1080 lines with every other line duplicated will look sharper than 540 lines blown up to 1080. It is all about the illusion of detail, and that is exactly what interlacing provides. Thinking of interlaced video purely in terms of pixel counts just doesn't work, because there are so many optical-illusion factors in play.

Once more: at 1/50th (1/60th) of a second it IS 1440x540. Due to the line offset it looks like somewhat more (roughly 750 lines) when playing; I never denied that. But every FIELD is just 1440x540.
Since 720p50 has no fields, just full frames, it has full resolution every 1/50th (1/60th) of a second.

Stitching together SD can never work, because there's no 50p mode in SD (except with the JVC HD100). You have to stitch some 720p50 together to get 1080p. Never try to get decent progressive images out of interlaced ones - you always lose quality; that's just the raw laws of physics.

Werner Wesp
November 30th, 2007, 10:02 AM
Sorry - I disagree, and the latest research has shown that the additional bandwidth required to transmit 1080p/50 is nothing like twice as much as for 1080i/25. Although the initial data rate may well be twice as much, it compresses far better, so the COMPRESSED data rate is far less than that would suggest.

Indeed: all, or virtually all, compression schemes work far more efficiently with progressive images, so you can have fewer compression artifacts in 1080p than in 1080i without doubling bandwidth - although the larger the bandwidth, the better, of course.

Obviously 1080p50 and 1080p60 are no sick fantasy - they are the logical next standard and a serious improvement over 720p50 (and an even bigger improvement over 1080i50). It is definitely no slight improvement. Furthermore, it is a logical standard that will fit all LCD and plasma displays, as they are pushing that new standard (known to regular consumers as "Full HD").

Kevin Shaw
November 30th, 2007, 10:18 AM
I am pretty sure 1920x1080 at 60p is not possible with MPEG-2, so a move to a very high level of AVC would be needed to get that type of video.

The Convergent Design XDR recorder discussed here recently includes a 1080i/p recording mode using MPEG-2 compression at variable data rates of 50 or 100 Mbps. http://www.dvinfo.net/conf/showthread.php?t=106861&highlight=hd-sdi+recorder

I'm not sure why we're discussing 1080i versus 720p here given the original topic of this thread, which is the XDCAM EX versus the JVC HD-250U. The EX can record 720p60 like the JVC (but at higher bandwidth) or record in 1080 formats with twice the real-world resolution, so it's all-around better than the JVC by any technical measure. If you like shoulder-mounted cameras that's a different discussion, and form factor is an important consideration in picking a camera.

David Heath
November 30th, 2007, 11:48 AM
Indeed: All or virtually all compression schemes work far more efficiently with progressive images, therefore you can have less compression artifacts in 1080p then in 1080i, withouth doubling bandwith - ...........
I don't think it's just the progressive nature. Go to higher resolution frames and it's possible to use higher levels of compression, even when both systems are progressive - hence it still shouldn't be 2x the 720p/50 data rate after compression.
........the original topic of this thread, which is the XDCAM EX versus the JVC HD-250U. The EX can ....., so it's all-around better than the JVC by any technical measure. If you like shoulder-mounted cameras that's a different discussion, .......
All true, and once accessories like radio mic receivers and camera lights start to be used, the JVC form factor starts to take on even more significance. It's also pretty easy to add a Firestore to a JVC camera, and still end up with an ergonomic package, and some users may find that tape+tapeless combination a big draw.

Thomas Smet
December 1st, 2007, 02:27 AM
The Convergent Design XDR recorder discussed here recently includes a 1080i/p recording mode using MPEG-2 compression at variable data rates of 50 or 100 Mbps. http://www.dvinfo.net/conf/showthread.php?t=106861&highlight=hd-sdi+recorder

I'm not sure why we're discussing 1080i versus 720p here given the original topic of this thread, which is the XDCAM EX versus the JVC HD-250U. The EX can record 720p60 like the JVC (but at higher bandwidth) or record in 1080 formats with twice the real-world resolution, so it's all-around better than the JVC by any technical measure. If you like shoulder-mounted cameras that's a different discussion, and form factor is an important consideration in picking a camera.

Yes, but that is for 1080i (60i) or 1080p at 24p, 25p or 30p. Nowhere does it say it can encode 1080p60.

Werner Wesp
December 1st, 2007, 11:56 AM
MPEG2 is scalable to 1080p.

Kevin Shaw
December 1st, 2007, 06:54 PM
Nowhere does it say it can encode 1080p60.

And neither can the JVC, so I don't see how that's relevant to this comparison. In any case, the EX1 works well and makes a fine alternative to other HD cameras in the same price range, depending on your particular needs.

Thomas Smet
December 2nd, 2007, 12:02 PM
I was commenting on the fact that no camera still using MPEG-2 will be able to do 1080p60. This includes HD broadcast as well. So if JVC or Sony sticks with MPEG-2, you will not see 1080p60. Also, until we can make Blu-ray and HD DVD discs with VC-1 and AVC encoding, you will have no way to deliver 1080p60.

So while in a few years 1080p60 may be nice, it is pointless to think about it right now. This is why it is a fantasy: until we can shoot with it and make use of it, it is a fantasy. Sure, maybe some people are working on a solution, but it isn't ready yet, so it is pointless to get all worked up over a format that isn't even around. When it does arrive, it may be a while before any of us can deliver in such a format.

This is relevant because I am trying to point out that, for now, we can use 720p60, 1080i60 or 1080p30. So we should all go back to thinking about those formats and stop thinking about 1080p60.

Werner Wesp
December 2nd, 2007, 12:11 PM
I was commenting on the fact that no camera still using MPEG-2 will be able to do 1080p60.

This is ridiculous. You need a well-adapted bandwidth (less than double, indeed, but still matched to the raw datastream), but there's nothing in MPEG-2 that prohibits encoding 1080p50 and/or 1080p60. What's more, MPEG-2 will probably be a preferred format for 1080p, because codecs without temporal efficiency would eat up too much bandwidth.

1080p50 or 1080p60 in MPEG-2 with a 6 (or 12) frame GOP and a bandwidth of 50-100 Mbps would look very, very good indeed.
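
Rough feasibility math for that suggestion (my arithmetic, not anything from the MPEG-2 spec):

```python
pixels_per_second = 1920 * 1080 * 50   # 103.68 million for 1080p50
for mbps in (50, 100):
    print(f"{mbps} Mbit/s -> {mbps * 1e6 / pixels_per_second:.2f} bits per pixel")
# ~0.48-0.96 bits per pixel, in the same ballpark as 1440x1080i50
# XDCAM HD at 35 Mbit/s (~0.9), so the numbers are not outlandish
```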

Kevin Shaw
December 2nd, 2007, 05:35 PM
Once more: at 1/50th (1/60th) of a second it IS 1440x540. Due to the line offset it looks like somewhat more (roughly 750 lines) when playing; I never denied that. But every FIELD is just 1440x540.

On the EX1 in HQ 1080i mode each field would be 1920x540, and in 1080p mode each frame would be 1920x1080. In 720p60 mode the still-frame resolution would be the same as a JVC HDV camera, but with slightly more bandwidth per frame.

After shooting some footage on the EX1 this weekend I'm hungry for even higher resolution: when can I get 4K at 60 fps? ;-)