View Full Version : Sony X70 4K - Lowest bit rate in the industry!
Noa Put July 22nd, 2015, 03:21 PM Chill out and stop thinking about numbers so much. I bet you use a PC because you can hit better numbers for cheaper? Am I right?
Some people like talking numbers. I'm not one of them, but I do respect those who do; maybe you should too?
In short, no one cares about your whining!!!
Maybe some do? You might not agree with Cliff's statements, but he just has an opinion on the matter, like so many others do. You can either join the discussion in a civilized manner or stay out, because at this moment you are only adding noise with your insults.
Ron Evans July 22nd, 2015, 07:11 PM Cliff is clearly a Sony fan, as he has at least two Sonys, and would obviously like Sony to up the data rate on the X70. Not sure if he thinks Sony will listen to him, but I doubt it. It is either in their plans or not. As previously stated, I think the main purpose of the X70 is as an HD 10-bit 4:2:2 camera, for which its data rate is fine. To truly meet the XDCAM 4K features (Z100, FS7, F5, etc.) it would need to go to 50/60P and at least 150 Mbps like the PXW-Z100 or the FDR-AX1; in my mind that just is not going to happen. The mistake Sony made, I think, is putting 4K on it at all. I have an AX100 and an AX1 and can tell you the image quality from the AX1 at 60P 150 Mbps is better than either of these cameras shooting 30P at 60 or 100 Mbps. If you want to see leaves or grass move smoothly you need frame rates faster than 25/30P. I got the AX100 for its potential resale value rather than the CX900, and only shoot HD on it, which is lovely and better than my NX5U!
Ron Evans
Cliff Totten July 22nd, 2015, 08:01 PM Sony fan? I'm embarrassed to say that I'm a Sony "tool" or Sony "puppet"!!
I have now owned 16 different Sony models in the last 20 years. This includes Handycam, NXCAM, XDCAM and Alpha models.
My current lineup is EX1r, X70, A7s and AX100. My new RX10-II will be here on Friday. My AX100 goes on eBay next week, and my A7s will be replaced by the A7s-II when that hits the streets. I'm thinking IBC time or in Q4.
My complaints to Sony about 100 Mbps are mostly just me telling Sony to be consistent, that's all.
If they are going to give a $900 Handycam 100 Mbps, then please give at least the same thing to a $2,500 pro model too.
Sony has stated that they are working on it but won't give any promises. They know that a 60 Mbps cap is very unpopular because of this consistency mismatch.
Jody Eldred July 22nd, 2015, 08:20 PM I truly wonder if ANYONE here has actually done the tests that I have done with this camera? Have you, Cliff?
This isn't personal (it really isn't), so there are no feelings to be hurt here, and I intend no one any ill will. I just want the truth to be told and errors corrected.
People slam codecs and bit rates and all kinds of stuff, CMOS imagers with "rolling shutter", claiming that those are terrible, awful, horrible, unthinkable factors that render certain cameras unacceptable for professional use. They usually do this with ZERO real-world tests, and certainly not tests under rigorous and exacting conditions such as the demanding 4K color grading theater at Roush Media (with one of the best colorists in Hollywood), as well as ShapeShifter Post in Hollywood with Senior Colorist Randy Coonfield. I am fortunate to live in Los Angeles and work with VERY demanding DPs, directors, producers, and colorists (Oscar and Emmy winners) who will tear to shreds any camera system that comes their way if they dislike it for any reason. This is the crucible and I live in it.
In the 39 years I've worked professionally as a D.P. (with one national Emmy under my belt), I have owned the following: Sony D600 Betacam SP, Sony PD100, Sony PD150, Sony F900, Sony F900R, Sony F350 XDCAM HD, Sony Z1U, Sony V1U, Sony EX1, Sony EX3, Sony 3D1U (3D camera), Sony FS700, Sony F55, Sony PXW-X70, and a fleet of GoPros. I was Sony's primary U.S. tester for most of these cameras, plus the PDW-700 and F800, and have used them in a wide range of programs such as ABC World News Tonight, GMA, Nightline, 20/20, Dateline NBC, 48 Hours, PBS "Frontline", Oprah, National Geographic, CBS dramas "J.A.G", "NCIS", and "NCIS: Los Angeles", NBC drama "Medium", feature films, network comedies, and commercials. I was in the Iraq War with Peter Jennings and Diane Sawyer, the L.A. Riots with CNN, devastating fires with ABC News, have shot underwater for CBS News, JAG, NCIS, Nat Geo, and Sony, worked with US presidents and hundreds of celebrities such as Michael Jackson, Cher, Christie Brinkley, Cindy Crawford, Sharon Stone, Oprah, even Charles Manson... too many others to even remember. I've worked extensively on the open ocean, in jungles in Southeast Asia, in alpine environments, in Death Valley.
And yes, I've shot weddings too.
Many thousands have seen my presentations at events like NAB, at Sundance, at DV Expo, and SATIS in Paris. They will tell you that my sole job is to share my experiences with these different camera platforms in REAL WORLD conditions... what works and what doesn't work so well... what challenges I encountered... the mistakes I made and how I overcame them (if I did.)
I am your Crash-Test Dummy. I am your Consumer Reports. I am the guy that has used just about every camera system out there (including the first RED camera) and has made all the mistakes you can make!
When I talk about a camera system like the X70, I have a broad base of comparative analysis. It does not make me right, only experienced. I take my work seriously and am concerned ultimately about one thing, "Does this camera system deliver what I need it to in real-world conditions for my client's demands and my audience's expectations?"
Specs, numbers, codecs, algorithms, bit depth, data rates are all good starting points. But the speed at which technology advances often makes images recorded at last month's higher data rate inferior to this month's lower one. I've witnessed that firsthand for a good decade now. I've had engineers call me out in front of several hundred attendees stating that there's no way the camera can look good with a 35 Mbps data rate at 4:2:0 and the networks will never air that. (Except of course the top-rated drama on television, "NCIS", which utilized some of my footage in the show's opening montage, 35 Mbps, 4:2:0...)
Bottom line: use anything you like. It does not matter one whit to me. Feel free to ignore the experiences I've had, the mistakes I've made, and real-world tests under demanding conditions. Doesn't matter to me. My job is not to sell cameras or anything else. My job is simply to be the best visual storyteller I can be using the best tools for each job, and to share that with anyone who's interested. And to be a truth-teller and dispel rumors so my colleagues can be better visual storytellers too.
We stand on the shoulders of those who came before us. I've been very fortunate to have shared company with some of the greatest visual storytellers of all time. I hope a little of their magic has been imparted to me, as I hope I've been able to impart some of that to others along the way as well.
Paul Anderegg July 22nd, 2015, 08:58 PM Jody, I am sure (somewhat sure) you would agree that a properly calibrated camera (colorimetry) with a crappy codec looks better than an ugly scope picture on the best codec available!
My number one wish to Santa is for Sony to stop putting their greens halfway to yellow. :-P
Paul
Jody Eldred July 22nd, 2015, 09:42 PM It's all about how the image looks in the medium it needs to look right in (theater, home, projection at event or house of worship, internet, smartphone, ad display, etc.)
How. It. Actually. Looks.
As for a Sony camera leaning yellow, just adjust the tint on your monitor!
FIXED!
;-)
Noa Put July 23rd, 2015, 01:09 AM My current lineup is EX1r, X70, A7s and AX100.
You do have the X70 and the AX100, so why don't you place them side by side and shoot something that will stress the codec? It would take just one frame grab from each camera to prove what you have been trying to say here for three pages :)
Noa Put July 23rd, 2015, 01:30 AM "In the 39 years I've worked professionally as a D.P. (with one national Emmy under my belt), I have owned the following: [...]"
And yes, I've shot weddings too.
haha, that last sentence cracked me up, especially after your impressive experience summary. I bet the weddings were the hardest part of your career? :) And why do you have an Emmy under your belt? Is that not very uncomfortable? (Edit: don't take this comment personally, I just didn't expect that someone with your background would bother doing weddings.)
I am not a specifications guy; I judge my cameras based on what I see. When the Sony RX10 came out, some gear-tech sites trashed it because the AVCHD codec was supposed to be useless, something Sony addressed later on by adding the XAVC-S codec. I honestly couldn't see an issue with the "old" 28 Mbps codec; it was fine for me and for my paying clients. Of course I could see codec break-up when I took frame grabs from highly detailed moving images and blew them up in Photoshop, but who watches a film like that anyway?
I also believe you can never win an argument based on specifications only, but by shooting in real-world scenarios and comparing cameras side by side.
Cliff Totten July 23rd, 2015, 06:17 AM "I truly wonder if ANYONE here has actually done the tests that I have done with this camera? Have you, Cliff? [...]"
That is a fantastic resume you have, I must say!
Simple question to Jody:
If you asked your colorist, "I'm shooting some awesome stuff next week. My PXW-X70 can do 60 Mbps or 100 Mbps... which one would you prefer I bring it to you in?" (Let's say you were shooting a car chase/crash/explosion scene for next season's NCIS.)
What would he tell you to use?....
Why?....
Simple question for all:
If Sony upgraded your X70 and it could shoot in either 60 Mbps or 100 Mbps (like its consumer sibling, the AX100, does), which bit rate would you use? (Let's say you were on vacation in Yellowstone shooting a grizzly attacking a wolf.)
Why?.....
CT ;-)
Noa Put July 23rd, 2015, 07:02 AM Simple question for all:
If Sony upgraded your X70 and it could shoot in either 60 Mbps or 100 Mbps (like its consumer sibling, the AX100, does), which bit rate would you use? (Let's say you were on vacation in Yellowstone shooting a grizzly attacking a wolf.)
Why?.....
My AX100 can do 60 and 100 Mbps in 4K, yet I shoot 60 Mbps all the time. Why? Because I compared both and to my eye they look the same, so I save space on my cards yet am still able to deliver an IQ that surpasses that of my 1080p cameras.
My GH4 can do various high bitrates in 1080p, yet I always shoot at 50 Mbps IPB. Why? Because I compared them and they look the same to me, so again I save space and get an IQ that suits my and my clients' needs.
If I were shooting high-detail scenes with lots of motion and had high IQ demands from my clients, I would shoot at the highest bitrate possible; if I had a camera like the X70 and the 4K option did not meet my needs, I would simply sell the camera and buy one that does.
I don't see any need to complain about something that might never change. With Sony you can only hope they might give you a firmware update, but they move in mysterious ways: maybe they have an X70 Mark II in mind that can do 4K 50p at 100+ Mbps, maybe they will provide a firmware update, or maybe they will just leave it as is and move in another direction depending on sales.
Just get a camera that fits your needs now and start shooting instead of dreaming of that pot of gold at the end of a rainbow. :)
Ron Evans July 23rd, 2015, 08:35 AM To support Noa: the only difference I see between 30P at 60 or 100 Mbps (on the AX100 or the AX1) is when I pan, and then both are awful compared to 60P at 150 Mbps. Fixed on a tripod, I see no difference between 60 Mbps and 100 Mbps. As mentioned before, 30P is too slow for me, so I only shoot at 60P anyway.
Ron Evans
Noa Put July 23rd, 2015, 08:55 AM I have a simple question for Cliff; have you even shot 4K with the X70, and if so, can you show us why it's a codec that doesn't deliver? So no numbers this time, just something visual that shows that the 60 Mbps codec from the X70 is no match for a 100+ Mbps codec from any of those other 12 cameras you mentioned at the beginning of this thread.
Or maybe just the AX33: it can do 100 Mbps, but I have read that owners have been underwhelmed by the IQ of this camera compared to an AX100. Can we say that the X70 is even worse, because it only does 60 Mbps?
Cliff Totten July 23rd, 2015, 09:33 AM Noa, I'm not saying it's terrible or anything of the sort. Guys, I'm not saying the AX100 or the X70 is unusable, ungradable garbage. And yeah, the AX33 at 100 Mbps won't look as good as the AX100/X70 at 60 Mbps. The AX100/X70's front end is WAY better than the AX33's... no codec bitrate at the end of the chain can "fix" that image. Yes, we ALL know this.
Some of you are completely missing my point. When a codec gets its image input, it discards TONS of information. The H.264 implementation that XAVC uses is about as good as the H.264 toolset allows it to be. Sony does a great job with its encoders.
So with that said, more image data is being discarded at 60 Mbps than at 100 Mbps. The compression ratio is significantly higher at 60 Mbps. The bits-per-pixel count is lower than it is at 100 Mbps. (Sorry, Noa, for using "numbers" on you here.)
60 Mbps holds up well on static scenes. I have tested 60 vs. 100 in light-motion testing. I have not done it on fast, complex motion scenes. But yes, I will before I sell my AX100.
Here... I did this a couple of months ago. This is a very low-complexity scene with just tree branches moving. Yes, 60 Mbps holds up well, but this is literally as "static" as it gets, so it's not stressing the 60 very much at all.
Sony AX100 - 60Mbps vs 100Mbps in 4K - YouTube
I suppose I can do a light-and-dark shady scene with a lot of motion to see how fast the two break. Remember, bits are always allocated to the midtones first and shadows get the least, so shadows are usually the first areas of a codec to break up and macroblock.
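For reference, the bits-per-pixel gap Cliff describes is easy to put numbers on. A minimal Python sketch, assuming the published figures are average video bitrates:

    # Bits-per-pixel comparison for UHD 30p at 60 vs 100 Mbps.
    # Assumes the published figures are average video bitrates.
    def bits_per_pixel(bitrate_mbps, width, height, fps):
        return bitrate_mbps * 1_000_000 / (width * height * fps)

    for rate in (60, 100):
        bpp = bits_per_pixel(rate, 3840, 2160, 30)
        print(f"{rate} Mbps UHD 30p: {bpp:.3f} bits/pixel")
    # -> 60 Mbps:  0.241 bits/pixel
    # -> 100 Mbps: 0.402 bits/pixel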
Jody Eldred July 23rd, 2015, 01:15 PM Cliff, until you stop talking numbers and start talking real-world results, we have nowhere else to go in this conversation.
As to which codecs my colorists would prefer, that would be 16-bit 4K F55 Raw (@ 23.98) or S-Log, or 16-bit F65 Raw.
The 4K F55 would be 960 Mbps and the F65 could be as high as 5 Gbps.
Jody Eldred July 23rd, 2015, 01:26 PM BTW, we compared 4K AX100 footage @ 100 Mbps to 4K 60 Mbps footage from the X70 in that same 4K grading suite. Two DPs and the senior colorist all agreed that the footage recorded in the X70's XAVC-L codec was superior in most respects, and inferior in none.
Some of the footage was captured with the two cameras side-by-side shooting the exact same shots.
WHAT MATTERS: the images from both cameras looked very, very good and were more than adequate for the productions being shot with them.
(My friend who owns the AX100 loves it, but after seeing my X70 and the 4K tests he is strongly considering selling his and buying an X70. Better features, deeper menu, more advanced picture profiles, a seemingly superior recording codec, etc.)
Cliff Totten July 23rd, 2015, 02:05 PM Cliff, until you stop talking numbers and start talking real-world results, we have nowhere else to go in this conversation.
As to which codecs my colorists would prefer, that would be 16-bit 4K F55 Raw (@ 23.98) or S-Log, or 16-bit F65 Raw.
The 4K F55 would be 960 Mbps and the F65 could be as high as 5 Gbps.
I like those numbers you are talking there! Do you think they mean anything to a colorist? Prolly not because "numbers" like that don't matter....right?
You have S-LOG listed. What CODEC do they want their S-LOG in and does that "really" matter in your opinion?
Cliff Totten July 23rd, 2015, 02:12 PM BTW, we compared 4K AX100 footage @ 100 Mbps to 4K 60 Mbps footage from the X70 in that same 4K grading suite. Two DPs and the senior colorist all agreed that the footage recorded in the X70's XAVC-L codec was superior in most respects, and inferior in none.
Some of the footage was captured with the two cameras side-by-side shooting the exact same shots.
WHAT MATTERS: the images from both cameras looked very, very good and were more than adequate for the productions being shot with them.
(My friend who owns the AX100 loves it, but after seeing my X70 and the 4K tests he is strongly considering selling his and buying an X70. Better features, deeper menu, more advanced picture profiles, a seemingly superior recording codec, etc.)
I love both my AX100 and X70 too. They both give great results. I'm selling my AX100 only because the X70 is taking that role (palm-corder) away from the AX100 now.
"Two DPs and the senior colorist all agreed that the footage recorded in the X70's XAVC-L codec was superior in most respects, and inferior in none."
Can you clarify that a little bit? "Superior" to what? "Inferior in none"... none of what?
Noa Put July 23rd, 2015, 02:32 PM Can you clarify that a little bit? "Superior" to what? "Inferior in none"... none of what?
The answer is in the same sentence you took this quote from: it was compared to the AX100's 100 Mbps codec.
Cliff Totten July 23rd, 2015, 02:46 PM They are the same codec at two different bitrates. The codec is H.264 for both; XAVC-S, -L and -I all use the same codec (with different frame rates, bit rates, and inter and intra modes).
Noa Put July 23rd, 2015, 03:29 PM So? What are you trying to say now? Jody said the X70 codec looked superior to the AX100's, even though it has a lower bitrate, so can we conclude the 60 Mbps bitrate from the X70 is more than adequate to compete with a 100 Mbps codec from "other 4K" cameras?
Cliff Totten July 23rd, 2015, 05:13 PM Look, if Jody's position is that Sony's XAVC offers no quality advantages above 60 Mbps, then so be it. What can I say to that? It's his opinion. He has his own reasons.
Sony designed many XAVC modes that go WAY up the ladder in bit rate (with 60 Mbps at the very bottom). Sony markets them as higher and higher in quality as they go up the list. They also charge more for each mode as it goes higher into the model lineup. It appears that 100 Mbps is the new "basement" or "low standard" for 4K/UHD cameras from as low as $499 on up. I'll bet a lot of money that Sony will never make another 4K camera at ANY price that has a maximum bit rate below 100 Mbps (especially when their direct competition offers 100-150 Mbps). Yeah, I know... absolute crazy talk, right?
I don't think Sony actually realizes how good their 60 Mbps is?? They certainly don't know that it's just as good as their 100 Mbps. Stupid Sony, I guess.
Me? I agree with Sony's engineers and XAVC designers. The science behind higher and higher bit rates, in MY opinion, is on Sony's side. Sony built it; they know XAVC best... in my opinion. (But maybe Jody and his people know things that Sony doesn't? Possibly.)
I'm also certain the Moving Picture Experts Group will agree that higher h.264 bit rates exhibit fewer and fewer artifacts and capture motion more accurately. This is why they increased the level restrictions over the years for h.264. I do not believe they did this because it does nothing for quality. But hey, again: more radical, crazy talk on my part, I guess.
For me, I'd rather shoot at the same bit rate that their "cheap and REALLY cheap" 4K models now shoot in today. (No, I don't want the AX33's sensor; I just want its h.264 bit rate.) Is that a terrible thing to want? Ridiculous and horrible of me? What an absurd proposal on my part! How dare I ask this of an XDCAM camcorder?
I don't doubt Jody or his associates, but if you polled colorists around the world, the general response would be: "The more information the better." (Matching codec for codec, of course.)
The guy doesn't agree with me or Sony. What do you want me to say to that?
I suppose we are just beating a dead horse at this point. There is no way that anybody is going to convince me that much lower bit rates are just as good as higher ones within the same compression algorithm. Yes, if we compare H.264 vs. H.265 HEVC, then yes, I will agree there. But this is all within the SAME MPEG H.264 LIBRARY.
One thing is certain: neither one of us needs the other's "approval" of our opinions. I think we will both sleep perfectly well tonight.
Sony has stated they are working on this, and I have a "feeling" (or hope) that they will succeed. Competition and market forces are pressuring Sony on this.
I just hope Jody doesn't persuade them that it's not necessary after all.
lol ;-)
Jody Eldred July 23rd, 2015, 07:09 PM I own an F55 and can shoot 4K Raw all day long.
My X70 is merely another tool I can use in many situations, and I know exactly what to expect from it because I've done the tests. I've posted videos of the results here and on the X70 Shooters FB page. Do your own or ignore mine or whatever. Makes no difference to me. Sony knows they have a terrific product and my tests have further validated that to them. (Yes, to SONY. I have done real-world testing and evaluation of a dozen camera systems for them over the past decade-plus, some of which can be found on their websites. I have met with the Japanese designers on many occasions and am proud to have had input into some of their recent high-end camera systems.)
Just sharing my experiences. And in my 31 years in Hollywood, this adage has served me well:
"A man with an argument is no match for a man with an experience."
;-)
Cliff Totten July 23rd, 2015, 07:58 PM For the record, I love my X70. That's not the problem.
And all this experience you have collected in your career gives you the ability to declare that significantly higher h.264 bit rates do nothing for image quality or color grading?
Then why in the world would you ever want to shoot with your "F55 in 4K raw all day long"?
Your "experience" goes against all the "experience" that Sony and MPEG have.
Here is some more crazy talk from me:
"Higher h.264 bit rates = more image data = more information to work with in post"
or...
"Lower h.264 compression ratios = more image data captured"
or...
"Higher h.264 bit rates allow for more bits to be allocated into the shadows of an image and less macro blocking as well as banding"
or
"Higher h.264 bit rates allows smaller block encoding of moving pixels and allows finer and more accurate motion detail and entropy estimation"
Yeah,...radical, weird crazy talk that is all completely FALSE!! Your "experience" says that I am completely dead wrong on this. In fact, I'm clearly displaying my lack of experience here. MPEG and Sony would CLEARLY not agree with these statements I have listed here!
Where do I get this stuff from? lol
;-)
Cliff Totten July 23rd, 2015, 08:11 PM Whoa... whoa... let me hit the brakes here.
If this is the real "Jody Eldred" and not some "impostor"...
I just Googled you... I KNOW YOU! We have met before! I have talked to you several years in a row at the NAB Sony booth!! Many, MANY times actually!
Hey, for the record, I think you ARE a pretty cool guy. Sorry I did not recognize your name until I Googled you. Dang! Nah man, I dig you and respect what you have done with Sony. You are actually one of the cooler guys to talk to at the Sony booth. No B.S., I mean this sincerely, I really do.
I don't take back my stand on bit rates and maybe I have been a bit too sarcastic at times but I do certainly respect your work and experience.
When I see you next year at NAB, you might want to kill me but I'll still want to shake your hand. I'm totally cool with you. Sorry, I didn't know this was you all this time. If I had Googled you earlier, I would not have changed my position but I would have changed my tone.
I hope we can just laugh at this next year. ;-)
Christopher Young July 23rd, 2015, 08:44 PM So with that said, more image data is being discarded at 60 Mbps than at 100 Mbps. The compression ratio is significantly higher at 60 Mbps.
This compression vs. bit rate argument has been around for a long time and was discussed to death many times in the early days of XDCAM 35-Mbit vs. 50-Mbit, yet their corresponding compression ratios were very much the same. As Jody says, "How does it look?" If it holds up in post, especially when transferred to a higher-level codec, works in the real world, and everybody is happy with the result, that should be the final arbiter.
I've shot countless hours of broadcast on both the 50-Mbit and 35-Mbit XDCAM flavors and never had an issue with networks here and overseas, and subjectively it's very hard to tell the difference in picture quality given that both come off correctly set-up cameras. I've lost count of the number of hours we have shot in 35-Mbit that has been rendered to 50-Mbit for delivery to TV, and never once had a query on quality.
As Alister wrote way back in 2009:
"4:2:2 = (1920×1080 + 960×1080 + 960×1080) x30(fps) x8 (bits) = 995Mb/s. Divide by 19.9 and we get 50Mb/s
4:2:0 = (1920×1080 + 960×1080) x30(fps) x8(bit) = 746Mb/s. Divide by 21.3 and we get 35Mb/s
So from this we see that the compression ratio for EX is 21:1 and for XDCAM HD 422 20:1. This is extremely close and in terms of compression artifacts means there will be little, if any, difference between the two."
HD Warrior » Blog Archiv » XDCAM 422 v XDCAM EX by Alister Chapman (http://www.hdwarrior.co.uk/2009/12/15/xdcam-422-v-xdcam-ex-by-alister-chapman/)
In the real world I have found this to be pretty true, so we can't judge everything on bit rate and compression ratios alone.
It is also well known that Sony have their own proprietary compression algorithms that comply with the H.264 standards. Therefore we are only surmising when we say there must be a difference between the 60 and 100-Mbit bit rates re artifacts, when we don't know whether Sony is using a different proprietary algorithm for the bit rate compression in the AX100 vs. the X70. On that basis alone it is almost impossible to come up with an objective assessment of the differences between the two bit rates.
I've long ago given up nit-picking over the numbers. If it was all based on numbers then the GoPro would never have had a look-in in the broadcast world, yet they are everywhere like a plague. As Jody says, if it looks good, feels good and works okay for the project you are working on, run with it.
Chris Young
CYV Productions
Sydney
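Alister's sums in the quote above are easy to verify. A quick Python check of the same arithmetic:

    # Reproduces Alister Chapman's compression-ratio sums quoted above.
    fps, bits = 30, 8
    planes_422 = 1920*1080 + 960*1080 + 960*1080  # Y + Cb + Cr at 4:2:2
    planes_420 = 1920*1080 + 960*1080             # Y + combined chroma at 4:2:0
    raw_422 = planes_422 * fps * bits / 1e6       # ~995 Mb/s uncompressed
    raw_420 = planes_420 * fps * bits / 1e6       # ~746 Mb/s uncompressed
    print(f"XDCAM HD 422 at 50 Mb/s: {raw_422 / 50:.1f}:1")  # ~19.9:1
    print(f"XDCAM EX at 35 Mb/s:     {raw_420 / 35:.1f}:1")  # ~21.3:1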
Cliff Totten July 23rd, 2015, 08:56 PM I want to be really clear on something here...and maybe this has gotten lost in my CODEC complaints.
I strongly believe the Sony PXW-X70 is EASILY the best "palm-corder" on the market today. Canon's offering is not even close. JVC's model is nice, but I'll take Sony's larger 1-inch-type sensor and front end over that JVC model ANY day (even though the JVC sports 150 Mbps h.264).
I am VERY happy I bought it and I'm very happy I bought the 4k upgrade too.
My writing is NOT a "bitch session" about a camera I hate... far from it. It will be a great tool of mine for many years to come. (Or until Sony replaces the model, and I buy that one. hehe)
I am a die-hard Sony freak, and overall they have gotten more money from me than I want to think about. Sony is the ONLY company I buy camcorders or photo cameras from. I work for a global media company and I buy TONS of Sony products at work too.
With all this said clearly: I do get bothered at times when I see cheaper Handycams and Alphas receive features that more expensive models don't get. The new RX10-II has 100 Mbps AND SLOG-2... this bothers me. A while back the NEX-5 got peaking and the far more expensive VG10 was not allowed to have it. I called Sony and protested and complained... do you know what happened? At the very end of the VG10's sales life, as it was being discontinued and the new VG20 was being released, Sony released a tiny VG10 firmware update. Guess what the firmware did? Added peaking! (No, I can't take credit for that. A lot of customers demanded it on the internet and in blogs and forums.)
That's the only reason why I do this. If GM lets the Tahoe have leather seats but locks its Escalade sister down to cloth seats only, there is something wrong with that.
I just hate seeing features in cheap cameras that are locked out of much more expensive ones. (I know consumer Alpha, Handycam and Pro departments debate this stuff internally within Sony too)
I'm sure that you know the FS7 vs. F5 scandal...That wasn't fair and Sony fixed that with an upgrade option for F5 customers. Good job on that.
That's all I'm trying to say here. Sony, please try to be "feature consistent" as cameras go up the ladder.
I can already see this coming. Sometime down the road, Sony will have two 4K brother-and-sister models: an XDCAM "pro" version and a Handycam "consumer" version. What is going to happen? The Sony consumer division will somehow give the Handycam version 4K 60P, and the more expensive XDCAM will be stuck with only 30P for a long while. It sounds funny... but it's certainly quite possible! haha
Cliff Totten July 23rd, 2015, 09:08 PM "This compression vs. bit rate argument has been around for a long time... As Jody says, if it looks good, feels good and works okay for the project you are working on, run with it." [...]
I agree with everything you say here, Chris. I'm not going to name names here, but... the media company that I work for set up an HD deliverables standard with "Gold", "Silver" and "Bronze" tiers. Sony was very unhappy with where we placed 35 Mbps XDCAM EX and really wanted us to change our standard. We didn't. At a SMPTE meeting in our office, Sony sent their CTO (I won't type his name, but he was EXCELLENT, btw) and even he tried to pitch XDCAM 4:2:2 and EX to the group. He even brought monitors and gear with him for his demo. He showed us phase-inverted, canceled A/B shots to inspect artifacts... really cool stuff! In the end, we still didn't elevate XDCAM EX in our standard (because of the "numbers"), and I think Sony was really upset by that.
Meanwhile, I know for a fact that we were airing a major show with boats and drama on the water that was shot mostly on HDV (25 Mbps, 1440x1080, non-square pixel) tape at that time. (Go figure?) Ehh... maybe I have said too much already ;-)
Anyhoo....I agree with you on your post Chris.
Ron Evans July 24th, 2015, 06:18 AM I think if Sony made the prosumer/professional difference just SDI, timecode in/out, genlock and world-camera specs at each price point (which would be feature differences), it would make everyone's life a lot easier, including Sony's.
Ron Evans
Cliff Totten July 24th, 2015, 09:16 AM I'm perfectly OK with a paid-upgrade sales model. I would certainly pay additional money for a second 100 Mbps license. I would certainly pay for an SLOG-2 X70 upgrade if the price was reasonable.
Speaking of SLOG-2... we are now seeing SLOG-2 make its way deep into the lower consumer markets. Multiple Sony Alphas and Cybershots now have it.
Who would have thought we'd all see the day when my grandmother could go to a Best Buy and get her camera with an SLOG option on it?
It's funny how cheap consumer cameras get things that many pro-market cameras are not allowed to have.
SLOG-2... wonderful dynamic-range recording! Get it on our high-end Super 35 models or get it in the cheapest low-end market! (Not available on any mid-level pro-market camcorder.) $8k and higher, or in several consumer models at $2.5k and below.
Strange?...
David Heath July 24th, 2015, 09:51 AM Sony designed many XAVC modes that go WAY up the ladder in bit rate (with 60 Mbps at the very bottom). Sony markets them as higher and higher in quality as they go up the list.
Well, not quite. Yes there are many XAVC modes, and yes they go up in bitrate..... but a large reason for that is to accommodate I-frame only versions of the codec. Such don't NECESSARILY give better quality, but there are reasons why they may be desired, right at the acquisition stage.
And unlike long-GOP, their bitrates are directly proportional to framerate, so up the framerate and ........
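To illustrate David's framerate point with made-up numbers: an all-intra encoder spends a bit budget on every frame independently, so its bitrate scales linearly with framerate. The per-frame budget below is hypothetical, not Sony's actual XAVC-I figure:

    # All-intra bitrate scales linearly with framerate, since every frame
    # is coded on its own. The per-frame budget here is hypothetical.
    per_frame_bits = 2_000_000
    for fps in (24, 30, 60):
        print(f"{fps} fps -> {per_frame_bits * fps / 1e6:.0f} Mbps all-intra")
    # 24 fps -> 48 Mbps, 30 fps -> 60 Mbps, 60 fps -> 120 Mbps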
As Alistair quoted way back in 2009:
"4:2:2 = (1920×1080 + 960×1080 + 960×1080) x30(fps) x8 (bits) = 995Mb/s. Divide by 19.9 and we get 50Mb/s
4:2:0 = (1920×1080 + 960×1080) x30(fps) x8(bit) = 746Mb/s. Divide by 21.3 and we get 35Mb/s
So from this we see that the compression ratio for EX is 21:1 and for XDCAM HD 422 20:1. This is extremely close and in terms of compression artifacts means there will be little, if any, difference between the two."
The maths is correct as far as it goes, but I'm afraid the reasoning is wrong on several levels.
Most importantly, you don't need twice the bitrate to code 4:2:2 chroma as you do 4:2:0. The reasoning is similar to why, when you compress still images, file sizes for a given quality level don't scale linearly with image size. If you've got a 4:2:0 chroma image, you can effectively "guess" at the in-between values ("guessed" 4:2:2). So to code the true 4:2:2, you only need the values of how different the true values are from the guesses, which can be represented by a lot less data than the original 4:2:0 chroma signal.
Secondly, Alister assumes the same compression ratio for luminance and chrominance, and for MPEG2 this is not necessarily true. It's frequently the case that the chroma signals get far harder compression than luminance; this is why, when banding is sometimes seen (especially on digital broadcast TV), it's most noticeable on areas of plain saturated colour and more rarely on more monochrome scenes.
So you can't take a simple sum of the total numbers of chroma and luma pixels, compare it with the total bitrate, and reach any meaningful conclusion. It's far more complex than that.
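David's "guess plus correction" point can be illustrated with a toy numpy sketch. This is not the actual H.264/XDCAM prediction scheme, just a demonstration that rows interpolated from a 4:2:0 plane leave only a small residual to code:

    # Toy illustration: code 4:2:2 chroma as a 4:2:0 "guess" plus residual.
    import numpy as np

    rng = np.random.default_rng(0)
    # Fake a smooth-ish 4:2:2 chroma plane (960 wide, full 1080 rows)
    chroma_422 = np.cumsum(rng.normal(0, 1, (1080, 960)), axis=0)

    chroma_420 = chroma_422[::2]          # keep every other row (4:2:0)

    guess = np.empty_like(chroma_422)     # rebuild 4:2:2 by interpolation
    guess[0::2] = chroma_420
    guess[1:-1:2] = (chroma_420[:-1] + chroma_420[1:]) / 2
    guess[-1] = chroma_420[-1]

    residual = chroma_422 - guess         # what actually needs coding
    print(f"plane std:    {chroma_422.std():.2f}")
    print(f"residual std: {residual.std():.2f}")  # far smaller: far fewer bits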
Cliff Totten July 24th, 2015, 10:49 AM Off-topic question, but I can't resist.
"The reasoning is similar to why when you compress still images, for a given quality level the file sizes don't scale linearly with image size."
I don't know the answer to this at all, but maybe you do...
Two files:
1.) If you encode a 500x500 frame-size image with a specific h.264 setting and achieve "X" quality.
2.) If you encode a 1000x1000 frame with the same exact h.264 settings to achieve the same "X" quality.
A.) Will file #2 require 2 times the bit rate to achieve the same "X" quality? (because its frame is 2x larger?)
B.) Does H.264 "gain" higher and higher efficiency levels as frame size increases? (So maybe it only needs 1.5 times the bitrate instead of 2 times to achieve "X" quality)
C.) I believe that H.264 operates in fixed block sizes. Or is this wrong, and are these block sizes (4x4 or 8x8 or 16x16, etc.) infinitely scalable? I know that h.265 operates with different block sizes, and that is supposedly a major advantage over h.264 (one simple reason of many). If block sizes are fixed, then that would suggest a more linear or proportional scaling bit rate???
I'm not sure about any of these questions but maybe you or Jody or anybody else knows?
Craig Seeman July 24th, 2015, 11:57 AM Might I suggest someone do a short recording using the AX100 and X70 in 4K and open and compare the files in MediaInfo. While it won't tell the whole story, it will tell you what OTHER differences there are besides bit rate. Things like entropy coding and GOP structure can result in a lower bit rate file looking better than a higher bitrate file.
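A minimal sketch of that comparison using the third-party pymediainfo wrapper; the package, and the filenames, are assumptions here:

    # Print every video-track field where two clips differ, per MediaInfo.
    # Assumes: pip install pymediainfo (wrapper around the MediaInfo library).
    from pymediainfo import MediaInfo

    def video_track(path):
        for track in MediaInfo.parse(path).tracks:
            if track.track_type == "Video":
                return track.to_data()
        return {}

    a = video_track("ax100_clip.mp4")  # hypothetical filenames
    b = video_track("x70_clip.mp4")
    for key in sorted(set(a) | set(b)):
        if a.get(key) != b.get(key):   # bit rate, GOP, entropy coding, etc.
            print(f"{key}: {a.get(key)!r} vs {b.get(key)!r}")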
David Heath July 24th, 2015, 12:13 PM Two files:
1.) If you encode a 500x500 frame-size image with a specific h.264 setting and achieve "X" quality.
2.) If you encode a 1000x1000 frame with the same exact h.264 settings to achieve the same "X" quality.
A.) Will file #2 require 2 times the bit rate to achieve the same "X" quality? (because its frame is 2x larger?)
And - like with a lot of things(!) - I don't think the answer is simple.....
Firstly, I think you mean "Will file #2 require 4 times the bit rate to ......", don't you? You're quadrupling the number of pixels, so "common sense" would seem to indicate you'd need 4 times the bitrate?
The answer is generally taken to be "no", but it depends a lot on what variables change. You're changing the image resolution - but are you changing the size of the viewed image as well? If your original image is viewed in a frame 8"x8", the answer will vary depending on whether file#2 is also viewed as 8"x8" - or scales up proportionally to 16"x16".
That's important when you think of the difference between 4:2:0 and 4:2:2 chroma sampling. In such a case you are far more analogous to "fixing the size of frame" - as the 1920x1080 luminance aspect is unchanging. And it then follows that although there is a doubling of chroma samples, it shouldn't necessarily be thought of as needing a doubling of bitrate for equivalence - it should be less.
Jody Eldred July 24th, 2015, 12:53 PM Might I suggest someone do a short recording using the AX100 and X70 in 4K and open and compare the files in MediaInfo. While it won't tell the whole story, it will tell you what OTHER differences there are besides bit rate. Things like entropy coding and GOP structure can result in a lower bit rate file looking better than a higher bitrate file.
As reported earlier (in this unnecessarily very long thread) I've done that with a colleague's AX100 and my 4K X70 in a 4K color grading suite at Roush Media in Burbank, CA. Projected in 4K on a 15-foot screen. They both looked very good (for what they are... I'm used to seeing F55, RED, or Alexa 4K up there.) The X70 graded better, chroma noise was a bit less, the compression appeared to be superior as there was slightly more digital artifacting in the AX100.
I have no way of sharing this as we did not have the opportunity for outputting, much less doing so in a comparative side-by-side edit. (You'd not be able to evaluate it online anyway due to compression and the wide variance in monitoring.) I have posted some 4K X70 footage that Roush Media graded on this website. It'll give you some idea of what you can do. Here again is the link:
First 4K Sony PXW-X70 Color Grade, by Jody Eldred - YouTube
Jody Eldred July 24th, 2015, 12:56 PM "Whoa... whoa... let me hit the brakes here... I hope we can just laugh at this next year. ;-)" [...]
No prob. Hope to see you there.
Ron Evans July 24th, 2015, 01:26 PM Can you imagine the reaction 10 years ago if someone said those images were from cameras costing around $2,000?
Ron Evans
Craig Seeman July 24th, 2015, 02:44 PM Jody, Cliff doesn't believe eyeballs. That's why I mention a comparison using MediaInfo. That will provide the "numbers" Cliff loves. My own hunch is there might be things other than the data rate that are different which may show why a lower data rate file can be better than a higher data rate file.
Cliff Totten July 24th, 2015, 06:40 PM Might I suggest someone do a short recording using the AX100 and X70 in 4K and open and compare the files in MediaInfo. While it won't tell the whole story, it will tell you what OTHER differences there are besides bit rate. Things like entropy coding and GOP structure can result in a lower bit rate file looking better than a higher bitrate file.
Good idea. Will do that in the next day or so!
Edit: I had the files and exported the metadata from MediaInfo as text. I included the AX100 @ 100 Mbps too.
Cliff Totten July 24th, 2015, 07:05 PM And - like with a lot of things(!) - I don't think the answer is simple.....
Firstly, I think you mean "Will file #2 require 4 times the bit rate to ......", don't you? You're quadrupling the number of pixels, so "common sense" would seem to indicate you'd need 4 times the bitrate?
The answer is generally taken to be "no", but it depends a lot on what variables change. You're changing the image resolution - but are you changing the size of the viewed image as well? If your original image is viewed in a frame 8"x8", the answer will vary depending on whether file#2 is also viewed as 8"x8" - or scales up proportionally to 16"x16".
That's important when you think of the difference between 4:2:0 and 4:2:2 chroma sampling. In such a case you are far more analogous to "fixing the size of frame" - as the 1920x1080 luminance aspect is unchanging. And it then follows that although there is a doubling of chroma samples, it shouldn't necessarily be thought of as needing a doubling of bitrate for equivalence - it should be less.
Yeah... total brain fart on my part. Good catch.
I'm thinking of the general scalability of h.264 (either 4:2:0 or 4:2:2). Let me put it a better way than my original bone-headed explanation:
If you took 15 Mbps h.264, 8-bit, 4:2:0 at High profile with CABAC, at whatever long-GOP structure you want, at 1920x1080 frame size, that would give you a certain quality level.
Wait! Do we agree that 15 Mbps at the above specs for 1920x1080 is not too good? (I hope so... it's actually way below AVCHD's h.264, High profile, CABAC, 24 Mbps.)
Now, take this video, multiply its bitrate and frame size by four, and you get 60 Mbps UHD. NOW! Take your 70-inch UHD TV, get a piece of cardboard, and cover the bottom half of your screen as well as the top-left quarter. This leaves you watching a single quarter-size "1080" quadrant of your 4K video. When you watch this, will you see video that equals the same "15 Mbps" of quality and artifacts that you see in a normal 1080 video??
The short question: is h.264 "proportionally" scalable in quality as you increase its frame size and bit rate?
Or does it GAIN or LOSE bit rate efficiency as you increase or decrease its frame size along with the bitrate?
I do not know the answer. I only suspect that because it operates on fixed block calculations, you MUST increase the bitrate TOGETHER with a proposed frame size increase, in a direct "linear" or "proportional" fashion. Both factors must increase "evenly" together??
So, UHD h.264 @ 60 Mbps = FOUR 15 Mbps "1080" quadrants on your UHD TV???
Again, yes, yes, yes, this is speculation on my part. No doubt. But I have wondered about h.264's scalability and efficiency for years.
Anybody know? Anybody care?.. or maybe it's just too geeky of me to think about. lol
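Cliff's quadrant premise, put in numbers under the purely linear scaling he is asking about (whether real encoders actually behave this way is exactly his open question):

    # Under purely linear scaling, one 1080 quadrant of a 60 Mbps UHD
    # stream gets a 15 Mbps share, i.e. the same bits/pixel by construction.
    uhd_rate = 60e6
    quadrants = (3840 * 2160) / (1920 * 1080)     # = 4.0
    print(f"per-quadrant share: {uhd_rate / quadrants / 1e6:.0f} Mbps")  # 15
    print(f"bits/pixel (both):  {uhd_rate / (3840 * 2160 * 30):.3f}")    # 0.241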
Cliff Totten July 24th, 2015, 07:15 PM Jody, Cliff doesn't believe eyeballs. That's why I mention a comparison using MediaInfo. That will provide the "numbers" Cliff loves. My own hunch is there might be things other than the data rate that are different which may show why a lower data rate file can be better than a higher data rate file.
My eyeballs tell me that my X70's 4K looks good today. I think Sony will make it look better in the 100 Mbps codec update they are working on. (Hopefully they succeed.)
I wonder why Sony is even trying this?
Official Sony statement: "We are looking to support a higher bit-rate recording mode than 60 Mbps for 3840x2160 XAVC-L in the future."
What is Sony thinking here?? I don't think Sony understands yet that nobody wants it and it's really not needed? ;-)
Are we really arguing against the reasons for something that Sony is trying to do?
On a completely different note: I just took my new RX10-II for a 30-minute spin. The new stacked 1-inch-type sensor has NO rolling shutter in 4K!... it's GONE! Whoa! SLOG-2 looks fantastic, highlight handling is a HUGE improvement... dynamic range is EASILY way better than the previous 1-inch-type models I have used (RX10, AX100 and X70) with SLOG-2 turned on. (Rec. 709 highlights seem the same as previous 1-inch-type models.) Way to go, Sony... you fixed it!!
The high speed frame rates are spectacular for a cheap $1,300 camera.
This thing is a little MONSTER!
Christopher Young July 25th, 2015, 09:30 AM Most importantly, you don't need twice the bitrate to code 4:2:2 chroma as you do 4:2:0.
David ~
Agreed, and I believe that is why Sony settled on 50-Mbit for its XDCAM MPEG-2 422 Profile@High Level as opposed to its 35-Mbit 420 version, MPEG-2 Main Profile@High Level.
Secondly, Alister assumes the same compression ratio for luminance and chrominance, and for MPEG2 this is not necessarily true.
This I believe to be correct for MPEG-2 420 in the transmission chain, but that is at a considerably lower bit rate than typical camera compression rates. I believe Alister to be correct with regard to Sony's proprietary camera 422@HL and MPEG-2 Main Profile@High Level implementations, as they use the same compression algorithm for both luminance and chroma. The same goes for Canon's and Convergent Design's use of these codecs, as they were licensed from Sony. In Alister's mathematical examples, I think he was keeping it reasonably simple to understand for those who want an overview of compression schemes relating to cameras, without drowning them.
With regard to artifacts: even with the newer DBS transmission systems based on DVB-S2, we are now seeing 420 HD bit rates between 8~19 Mbps, with a mean average of 12 Mbps, for both MPEG-2 and MPEG-4 AVC transmissions, so it is hardly surprising we are seeing banding artifacts. With many broadcasters now using rate shaping to fit three 1080i channels on a 31.6 Mbps DBS transponder, I think we will see more of these compression banding artifacts. Without laboring the point, I think comparing low bit rate transmission compression artifacts with some of the later compression schemes used in more recent camera designs is probably not the best example to put forward.
Chris Young
CYV Productions
Sydney
David Heath July 25th, 2015, 04:18 PM This I believe to be correct for MPEG-2 420 in the transmission chain, but that is at a considerably lower bit rate than typical camera compression rates. I believe Alister to be correct with regard to Sony's proprietary camera 422@HL and MPEG-2 Main Profile@High Level implementations, as they use the same compression algorithm for both luminance and chroma. The same goes for Canon's and Convergent Design's use of these codecs, as they were licensed from Sony.
I confess I don't know whether the compression ratios in XDCAM for luminance and chrominance are the same or not. They may use the same algorithms but does that automatically mean the ratios will be the same? The point I was trying to make is that it's dangerous to assume they are - they need not be for MPEG2, and in various implementations they are not.
In Alistair's mathematical examples I think he was keeping it reasonably simple to understand for those who want an overview of compression schemes relating to cameras but without drowning them.
But "keeping it simple" shouldn't mean saying anything incorrect, and I'm afraid here it does. It "proves" that the basic compression ratios between XDCAM EX and 422 are the same (the only difference between them is the 4:2:0/4:2:2 aspect) and regardless of the truth about differing luminance/chrominance compression ratios, there is still the fact that doubling the number of chrominance samples won't mean a doubling of bitrate. And that's before we even start to consider any other factors that may differ between them.
It's similar to the comparisons made some time ago between XDCAM EX when it first came out and HDV. Same basic codec (MPEG2), and XDCAM EX not that much higher in bandwidth... True as far as it goes, but totally failing to take into account less quantifiable factors, of which I believe the dynamic bitrate allocation between intra and difference frames in XDCAM was one major factor.
Without laboring the point I think to compare low bit rate transmission compression artifacts with some of the later compression schemes used in more recent camera designs is probably not the best example to put forward.
I was trying to illustrate a different point - that there is no need whatsoever for luminance and chrominance compression within a codec to have the same ratio, and in some cases they don't. Hence taking the combined number of samples per frame and dividing by bitrate doesn't (by itself) prove very much.
David Heath July 25th, 2015, 04:43 PM If you took 15 Mbps h.264, 8-bit, 4:2:0 at High profile with CABAC, at whatever long-GOP structure you want, at 1920x1080 frame size, that would give you a certain quality level.
Wait! Do we agree that 15 Mbps at the above specs for 1920x1080 is not too good? (I hope so... it's actually way below AVCHD's h.264, High profile, CABAC, 24 Mbps.)
I'll say again, beware of simple headline numbers.
As far as AVC-HD goes, I understood that the 24Mbps figure (for 720p, 1080p/25 and 1080i/25) is a peak value, not an average. I seem to remember that when the first Panasonic AVC-HD cameras came out, the average values being reported were more around the 17-19Mbps mark.
And that was for early implementations, not using anything like all the tricks that AVC-HD is capable of. So my first question to you: is this 60Mbps XAVC implementation you are unhappy about an AVERAGE of 60Mbps or a PEAK of 60Mbps? Because without being absolutely crystal clear about that, this entire debate becomes meaningless!
Secondly, for AVC-HD we may be talking about interlace (1080i/25) - with what you're referring to (XAVC for 4K) it's progressive. It has to be - 4K is always progressive. And progressive lends itself better to compression than interlace.
Thirdly, XAVC is a more advanced codec than AVC-HD. It's more recent - so requires a more recent decoder version - but that allows it to be more complex than earlier H264 incarnations (like AVC-HD).
Take all that together and it becomes feasible that a quadrant of such a 60Mbps QFHD image *MAY* indeed be better than a single AVC-HD image. Frankly, there are so many variables that I wouldn't like to predict. All I will really say is DON'T rely on "simple" headline numbers.
(And note also that, as Chris says, digital broadcasting of 1080i/25 is indeed using bitrates averaging around 12Mbps, and I can't remember seeing any noticeable artifacting on BBC One HD at home for a long time. OK, they've got statmuxing with other channels in the multiplex to help get it so low, along with expensive broadcast encoders, but the implementation is unlikely to be as sophisticated as XAVC (to maintain backwards compatibility with all existing decoders), and it's for an interlaced system.)
Wacharapong Chiowanich July 25th, 2015, 07:28 PM I think the question of whether the published 60Mbps for XAVC, either S or L, is the peak or the average sustainable bit rate is the key. I myself still have no idea whether it is one or the other, after all this time!
Ron Evans July 26th, 2015, 08:34 AM I have seen the following.
AX1 QFHD 60P 150Mbps shown as 155Mbps in Catalyst Browse, and 149 to 152 in the VLC player tools
AX1 QFHD 30P 60Mbps shown as 59 in Catalyst Browse, and 59 in VLC
AX1 QFHD 30P 100Mbps shown as 99 in Catalyst Browse and VLC
AX100 QFHD 30P 60 and 100 show the same as the AX1
AX100 HD, nominally 50Mbps, shows as 50 to 53 in VLC and 53 in Catalyst Browse
The same clips were used, of course, to compare Catalyst Browse and VLC. I'm not sure whether Catalyst Browse just gives the maximum data rate; one can watch the data rate change as the clip plays in VLC.
I'm not sure whether these mean the 60 and 100 are maximums or not, but the 50 for HD and the 150 for 60P QFHD are not maximums, as I have observed higher figures in several clips. I have not shot XAVC-S HD on the AX1 to check, but will at some point; I expect it is just like the AX100.
Ron Evans
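(A tool-independent cross-check of such readings - a minimal Python sketch; the clip name and duration are hypothetical, and the result includes audio and container overhead, so it slightly overstates the video-only rate:)

import os

def avg_bitrate_mbps(path, duration_seconds):
    # Whole-file average: total bits divided by running time.
    return os.path.getsize(path) * 8 / duration_seconds / 1e6

# e.g. for a hypothetical two-minute clip:
# print(avg_bitrate_mbps("C0001.MP4", 120.0))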
Wacharapong Chiowanich July 26th, 2015, 07:07 PM Looks like we could say that, at least for Sony cameras, the published XAVC bit rates actually mean "around xxxMbps". They are defined as neither peak nor average, but they seem safe enough to refer to for the sake of argument since the variances, as Ron has found, are pretty small.
Definitions aside, my experience shooting with my AX100 at both 60Mbps and 100Mbps over the past 2-3 months has yet to reveal any clear proof that the higher bit rate files show any noticeable IQ improvement in moving shots (either the camera itself moving, or shooting moving subjects). In static shots, or shots with little movement, they look exactly identical.
Ron Evans July 26th, 2015, 07:27 PM I am not a big fan of slow frame rates, being particularly sensitive to judder. On a very slow pan at 30P I can tell the difference between 60Mbps and 100Mbps on both the AX100 and the AX1, but it is almost insignificant - noticeable if the pan is across grass or leaves. There is also very little difference between the two cameras. For a relatively still frame there is no difference that I can see between 60 and 100Mbps, as one would perhaps expect; the 30P motion cadence is more dominant. For me the big improvement comes from moving to 60P at 150Mbps.
Ron Evans
Cliff Totten July 26th, 2015, 08:23 PM In the side-by-side test I posted a page earlier in this thread, there are only "mild" differences between 60 and 100Mbps on the AX100. However, all those test scenes were shot from a tripod, pointed at palm trees in a very gentle breeze, so obviously both bit rates were relatively close; neither one was stressed much.
I must admit that I'm a little surprised at how many people had a strong reaction to this topic. I figured we'd all jump on the bandwagon with our pitchforks and shout for 100Mbps. I guess it didn't really happen that way, and some disagreed completely with the general concept that less compression is a good thing.
I also had a really difficult time explaining to people that I think 60Mbps looks "OK" (they think I hate it... I don't), but that I strongly agree with Sony's attempt to add higher bit rates to the X70. With the rest of Sony's models and the rest of the industry doing 100Mbps, the X70 was left all alone at 60Mbps.
I think Sony is attempting to do the right thing with the X70. (I guess some see it differently)
I guess we can all agree to disagree. That is what makes this forum so much fun.
CT;-)
David Heath July 27th, 2015, 02:47 PM I must admit that I'm a little surprised at how many people had a strong reaction to this topic. I figured we'd all jump on the bandwagon with our pitchforks and shout for 100Mbps. I guess it didn't really happen that way, and some disagreed completely with the general concept that less compression is a good thing.
It depends which way you look at it. Higher compression may indeed be seen as a "good thing" in the respect that it means smaller files and lower data rates - which together mean smaller memory cards will do and card speed is less critical, so much lower per-GB costs as well. But yes, go too far and quality may suffer. And the question becomes where you draw the line. That's the question constantly confronting designers and manufacturers: compress too much and people complain about the quality; compress too little and people complain about media costs.
It's easy enough to simply say "less compression is a good thing" - but higher bitrates come at a cost (some rough numbers are sketched below). If the extra bitrate makes a noticeable difference, well, fair enough; but if any difference is marginal then you have to strike a cost/benefit deal.
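To make the media-cost side concrete, a rough sketch (assuming the nominal rates hold throughout, and ignoring audio and container overhead):

for mbps in (60, 100):
    gb_per_hour = mbps / 8 * 3600 / 1000   # Mbit/s -> MB/s -> MB/hour -> GB/hour
    print(mbps, "Mbps is about", round(gb_per_hour), "GB per hour")
# 60Mbps works out to roughly 27GB/hour; 100Mbps to roughly 45GB/hour.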
For my own part, I was really just trying to get across the general concept that drawing strong conclusions from headline numbers is dangerous. The more I find out about the whole subject, the more I realise I don't know - but I do now realise that simplistic assumptions can be seriously wrong. Which is why I haven't given any direct opinion on your fundamental question - I don't know the answer.
In the very first post you said:
Considering that the rest of the industry and all other Sony cameras shoot at 100Mbps or higher... Literally ALL Panasonic, JVC and Canon 4K cameras shoot at 100Mbps or higher (because, realistically, every manufacturer knows it's very necessary).
But I put it to you - what if investigations showed that the 60Mbps was an average, and the 100Mbps of the other cameras was their peak value? Indeed, what if it turned out that for that "100Mbps" codec the AVERAGE rate was only, say, 50Mbps? Would you still be pushing for it?
Or what if the 60Mbps footage had a different GOP structure - more difference frames between I-frames?
Yes, those scenarios are unlikely to be true - I'm certainly not saying they are - but... it just shows how "reasoning" from headline numbers COULD lead to very wrong conclusions. That's the point.
And the same goes for comparing one quadrant of the 4K image with a 1920x1080 image. Factor in that XAVC bitrates are averages while AVC-HD's are peaks, then add in the fact that 4K is progressive (so will compress better) and that a modern XAVC encoder is almost certain to be more efficient than an early AVC-HD one, and the conclusions become very different. (And there may well be other factors I'm not accounting for, which may work either way.)
What you're asking for (100Mbps instead of 60Mbps, all else equal) may well give an improvement - but if it's only very slight, then is it worth it? That's why we're not picking up the pitchforks and marching... :-)
Greg Boston July 27th, 2015, 03:11 PM I guess we can all agree to disagree. That is what makes this forum so much fun.
Indeed, as long as it's done with tact and respect. Good discussions about the merits or downsides of a subject are often enlightening to both sides of the table. It's just sad to see how despicably it often plays out on social media or as comments under news stories.
-gb-