View Full Version : Will 4K cameras be practical for weddings?



Jeff Harper
June 10th, 2014, 12:04 AM
Joining this thread late, but it's a good one. I'm excited about 4K, but have reservations insofar as I haven't shot or edited anything yet!

I've been watching 4K for almost two years now at the local cinema. They've been using 4K at The Esquire and at The Kenwood Theatre and it's fantastic. You do get spoiled. Is there a difference? I don't know; it looks pretty darned good. On some films there seems to be no difference, but on others it's like looking through a window, very, very clear.

I have a 4K cam on order but of course am apprehensive about the workflow. Storage not too much of an issue, since it's only one camera.

I'm more concerned with editing, obviously. My workflow will be thrown off by the likely need to create proxies.
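For what it's worth, a proxy workflow is mostly a batch-transcode job. Here's a minimal Python sketch of the idea (the file names, proxy size, and quality settings are just placeholder assumptions, not anyone's actual settings) that builds an ffmpeg command for making a lightweight 720p H.264 proxy from a 4K clip:

```python
from pathlib import Path

def build_proxy_command(source: Path, proxy_dir: Path) -> list[str]:
    """Build (but don't run) an ffmpeg command that transcodes a
    4K clip down to a lightweight 720p H.264 proxy for editing."""
    proxy = proxy_dir / (source.stem + "_proxy.mp4")
    return [
        "ffmpeg", "-i", str(source),
        "-vf", "scale=1280:-2",           # 1280 wide, height keeps aspect
        "-c:v", "libx264", "-crf", "23",  # plenty of quality for cutting
        "-c:a", "copy",                   # leave the audio untouched
        str(proxy),
    ]

print(" ".join(build_proxy_command(Path("clip001.mp4"), Path("proxies"))))
```

Run that over a card's worth of clips, cut with the proxies, then relink to the 4K originals for the final render.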

The ability to zoom in on the hi-rez footage is intriguing. As a solo shooter I find it very appealing, at least on paper. The potential flexibility is great.

No downside to ordering a 4K camera. If it doesn't work out, I send it back.

Similar arguments to the ones I've seen here were made when HD was new. I hated the very idea of it. I didn't want to have to buy new equipment. I had just settled into having three matching cameras and was very happy with my VX2100s and PD150 when along came the FX1, FX7 and FX1000. I jumped on the FX1000 and was pretty happy with it after I got used to the inferior low-light ability.

Today I'm purchasing $2K cameras instead of $3500 cams. The market has changed. The lines between consumer and pro cams have blurred, at least for me.

"They" have been working on 8K for years. I have read that 4K is the reasonable limit for quality, that the human eye can't resolve much more detail than 4K offers. I don't know if that's true, but I'm hoping it is.

But the argument for 8K will probably be that 4K is not truly full 4K, and that 8K will fill in the gaps to make it truly stunning. I don't know. I do believe that 4K is here already (it's in the theatres) and there is no need to upgrade yet. It won't be long before my AX100 will be outdated, but for now I hope it works out for a year or two.

Dave Blackhurst
June 10th, 2014, 06:05 AM
Jeff -

4K is already past the resolution of my nekkid eyeball, contact corrected vision! My 4K capable computer setup is actually quite nice, and it took my eyes a week or so to adjust to how sharp it is in comparison to my old dual 1920x1080 setup. Actually think it gave me headaches at first!

Don't worry about storage, my old WD "Green" 5400 drives I use to "archive" seem to be able to deliver enough throughput! Storage is cheap, even fast SDXC cards are not bad, and I actually just stuck an "SDHC" (albeit class 10 UHS1) card I already had in the AX100, and it works just fine!

I had forgotten that I had experimented a bit with Vegas 12 and editing while still running W7 - while Vegas still needs a reinstall, I was able to launch from the .veg file... and previewing at VERY acceptable quality (set to auto) was no problem. I've got ZERO concerns about being able to edit without issue!

Doing a 1080 crop yields a very good looking image on the preview screens, certainly as good as, if not better than a "native" 1080 frame. Pan/crop/scan is definitely a viable proposition - making this a good way to replace at least a couple "HD" cameras for event work - just for giggles I decided to try inputting 720 (lower rez HD) and 480 (SD) in the height settings in crop... you could probably get away with those on a lower rez project! OK, so SD rez isn't super pretty, but....it never really was...
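To put rough numbers on that reframing room, here's a quick sketch, assuming a 3840x2160 UHD source like the AX100 delivers:

```python
def crop_headroom(src_w, src_h, dst_w, dst_h):
    """How much room is left to pan/reframe when cutting a dst-sized
    window out of a src-sized frame, plus the max punch-in factor
    before you'd have to upscale."""
    return src_w - dst_w, src_h - dst_h, min(src_w / dst_w, src_h / dst_h)

# UHD source delivering 1080p: a full 1920x1080 of pan range, 2x punch-in.
print(crop_headroom(3840, 2160, 1920, 1080))  # (1920, 1080, 2.0)
# Delivering 720p from the same source: 3x punch-in before upscaling.
print(crop_headroom(3840, 2160, 1280, 720))   # (2560, 1440, 3.0)
```

So on a 1080 timeline you can punch in 2x with no upscaling at all, which is where the "one 4K camera covers for two or three HD ones" argument comes from.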

You'll like the AX100, I was almost ready to abandon "video" cameras once I got the RX10, but I quite like the AX100. Makes me wish for an extra right arm so I could shoot both at once!

Not sure what your system specs are, but I collected all the bits for a new system core, reused my HDD's, PS, and case ( so... new motherboard with the intel integrated 4K graphics, CPU and memory). I only started the adventure when I found a 39" Seiki 4K TV could be had under $400, no $2-3K monitor in my budget!! Long story short, if the AX100 can be budgeted, you probably can put together a cheap computer upgrade similar to mine without the piggy bank squealing too loudly!

Of course there's a good deal of tweaking to get things working - graphics drivers have been "interesting", but I think the latest one I got from Intel finally got things looking pretty smooth!

Moving to being able to work with 4K is actually going better than I expected, and with selling off a few HD cams and my old computer core, I may actually make a few extra bucks!



I'm not even sure that there ARE consumer cameras anymore - they are rapidly being replaced by phones and tablets - I think Sony at least understands they HAVE to return to professional features for their higher end offerings, even if they are at price points that at one time were considered "consumer" - there's still an "enthusiast" market, but the expectations from those buyers are WAY higher than your average "consumer", who somehow is "happy" with the output from that phone or tablet!

Jeff Harper
June 10th, 2014, 08:47 AM
Well Dave, I'm very very fond of my CX900, and since the AX100 is essentially the same camera I'm pretty sure I'll love it also.

I really appreciate you sharing your experiences.

Tell me, Dave, do you edit multicamera projects with 4K? I can see one track of 4K footage working OK, but I'm wary of it bogging down a multicamera project.

Camera arrives tomorrow so I'll try it out right away on a current project.

Craig McKenna
June 10th, 2014, 02:15 PM
Still waiting for my GH4 to arrive. After reading the reviews, I believe that I may also trade in my GH3 and pick up a second GH4... or more likely, buy an extra GH4 and keep the GH3 as my third camera.

Despite this, it is my very humbling understanding that my 4TB G-RAID Thunderbolt drive offers nowhere near enough storage once my footage is converted to Apple ProRes. Therefore, I think it's unlikely that I'd be able to shoot 4K unless I pick up a 16TB server as my main storage and use my G-RAID to edit from. Then I'd do the backups that everyone else seems to do here with the small portable HDDs.

Ultimately, video calls for so much more storage than our photographer counterparts need. It's for this reason that I think that whilst 4K will be practical for weddings, it'll only be truly practical once 64TB is available for your HDD - even if we're talking a server. Otherwise, our footage is going to grow so large over a wedding season that we're really going to struggle for space. I may be alone in thinking this - as I'm still an amateur (but with a Comp Sci background) - but I just don't see it happening in my workflow any time soon.
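The storage worry is easy to put back-of-the-envelope numbers on. A quick sketch - the camera bitrates match figures mentioned in this thread, while the ProRes rate is only a ballpark assumption on my part:

```python
def gb_per_hour(mbps):
    """Convert a video bitrate in megabits per second to gigabytes
    of storage per hour of footage (decimal GB, as drives are sold)."""
    return mbps / 8 * 3600 / 1000  # Mb/s -> MB/s -> MB/hour -> GB/hour

for label, rate in [("AVCHD 28 Mbps", 28),
                    ("GH4 4K 100 Mbps", 100),
                    ("ProRes 422 UHD, ~500 Mbps (ballpark)", 500)]:
    print(f"{label}: {gb_per_hour(rate):.1f} GB/hour")
```

Roughly 12.6 GB/hour for 28 Mbps AVCHD versus 45 GB/hour for 100 Mbps 4K - so camera-native 4K is only about 3-4x the storage, but transcoding everything to ProRes is what really eats a 4TB drive.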

That said, it's exciting... but I'm more excited for 1080p 100Mbps 24p/30p. I think I'll be shooting this over the course of the next few years... and I doubt that I'll be downscaling from 4K. Then again, you're all years ahead of me, so maybe I would adopt sooner if I had your experience.

Ultimately though, it's a costly transition and I'm not the type to skimp on quality in any part of my workflow - probably the reason that I'll never make any money if I did turn pro.

Dave Blackhurst
June 10th, 2014, 05:16 PM
@Jeff -
Only have one 4K camera so far, though another AX100 at a cheap enough price might tempt me - I'm hoping for an RX10Mk2 with 4K, though that may be a bit optimistic since the RX100M3 only got high bitrate 1080p.

I suspect that multiple cams could slow things down a bit (heck even the switch from 17Mbps to 28Mbps was noticeable with multi-cam), but only in the initial layout - once you get the cuts so you have one "stream" to deal with, shouldn't be a big issue. I'll know more once I get this new build 100% stable and functional - just seems to be down to squishing little bugs now, so should be soon!



@Craig -
I'm guessing you are planning to create uncompressed files to work with? 4K files are bigger, but storage options are a lot bigger now, and as 4K becomes a bit more common, it should be possible to work with the XAVC S files "native" - Hardware and drivers are still in flux from my experience so far, but I'm at the point where it's already obvious that working with 4K on this new system will be as easy as HD was on my old one.


As a very practical matter, I expect to be working with 28Mbps files as a minimum, and 50Mbps files, either 1080/60p or 4K/30p, in the VERY near future. You're talking about even higher bitrates from the GH4...

The bottom line is that we demand higher quality, and that takes more data points - more digital bits that have to be dealt with SOMEHOW... it's NOT just 4k, it's "better" HD too!

Ivan Hurtado
June 11th, 2014, 10:06 AM
4K on a global-shutter CMOS would give you the ability to leave the stabilizer at home. Just pop out the monopod and crop the image after stabilizing in post if needed. Perfect for one-man bands.

I'm also now considering the AX100 paired with the Ninja Star. It would give me a 1080p ProRes backup and still be portable enough. The benefit of the internal ND is that it leaves the front free for just a polarizer or an ultracon (or both) instead of a fader. Does anybody know if you can have just one option in auto? As in just gain?

Steven Davis
June 11th, 2014, 12:35 PM
A couple of years ago I stood in line before the Sprint store opened up. It was the only time I've stood in line for technology. Two days later I took the phone back because I figured out it did not have the functionality I needed.

And 4K feels like that, sort of like the lemmings who stand in line at the Apple store for days waiting for the newest gizmo. That's not to say 4K isn't here to stay; personally, I'm going to be patient till the dust settles. I'll stand on the sidelines a bit till I see what Apple will do with FCPX and, to a lesser degree, Sony's Vegas, both of which I use.

I bought two Canon XF300s this year, and did so after they'd built a reputation for solid performance. Is there a time to buy 4K? Probably, but I'll wait a bit.

Just my two cents.

Bob Drummond
June 11th, 2014, 06:49 PM
Again, I ask the question no one will answer: If 2K is fine for Avengers 2, when will anyone reading this actually need 4K?

Fun? Yes, maybe. But anyone who thinks they NEED it is suffering from Gear Acquisition Syndrome.

Mark Whittle
June 11th, 2014, 07:32 PM
Craig, why are you converting to ProRes?

What are you editing on, FCPX or Premiere? If so does editing native not work?

Just curious.

Dave Blackhurst
June 11th, 2014, 09:12 PM
@ Ivan - as near as I can tell, yes you can control individual settings... just poke the appropriate button on the lower left side of the cam and adjust.


@ Bob - I remember when Bill Gates supposedly remarked that no one would ever need more than 640K of RAM...

I'd prefer higher bitrate, sharper, cleaner video. Since the cost is not excessive over buying something only capable of HD, and the technical side of it is doable, why not? I bought a cheap 4K TV for a monitor - I have 4x the workspace vs. 2x "HD" monitors, am getting more done, my photos look phenomenally better, it's easier on my eyes, and oh yeah, the 4K video looks HUGELY better than the HD...

Benefits of cropping to reframe are obvious, and I've actually DE-acquired gear in the process, as a single 4K camera can easily cover for 2-3 HD cameras in a multicam shoot scenario... after I get done selling off gear I won't need anymore, I actually will come out ahead in the $$ department!

Did I "need" the upgrade... I suppose not, but if you tried to take it away, I do believe I'd have to hurt you! There are HUGE benefits I did NOT expect in the 4K upgrade process...

I don't NEED to eat a nice breakfast of steak and eggs, a bowl of oatmeal will "do"... but the choice is sort of obvious if you've had both and the option is available and affordable/feasible within budget.

SO what was your point again?

Ivan Hurtado
June 11th, 2014, 11:22 PM
Again, I ask the question no one will answer: If 2K is fine for Avengers 2, when will anyone reading this actually need 4K?

Fun? Yes, maybe. But anyone who thinks they NEED it is suffering from Gear Acquisition Syndrome.

2K is fine with careful shooting. I'm sure Whedon doesn't have to apply stabilization in post when he can go around with a Steadicam team. For us it may mean the difference between a usable shot and one with the photographer in the middle, or a shake-fest trying to get an unexpected moment that would otherwise be unwatchable.

We buy things based on future experiences, and as Dave said, if it means fewer cameras to carry, fewer tripods, no stabilizer, and a positive return from selling gear while it's still worth more than it will be next year, then sure - I don't need 4K for the same reason I don't need a new camera. But I want it. Now. Cue Queen.

Nigel Barker
June 12th, 2014, 12:18 AM
I'm not convinced by the argument that you can replace 2-3 HD cameras with a single 4K camera & then just crop in post to reframe. The shots will still be from one camera position & won't look like three cameras, just one that zooms in & out. Even then it's not going to look the same as a single camera with shots taken at different focal lengths, as it will all need to be deep DoF.

Same goes for stabilisation. Software stabilisation only goes so far & is no substitute for shooting wobble-free in the first place. It can do a good job but often there are weird distortion effects on the background as a side effect.

Noa Put
June 12th, 2014, 02:18 AM
It's clear that for wedding shooters, and specifically solo shooters, 4K does have advantages in the cropping department, but I'm with Nigel: it has its limitations. It won't replace multiple cameras, only add extra framing options. In continuous recordings a 2-camera setup gives you versatility in choosing extra focal lengths. Even though the DoF won't change, cropping can come in handy to cut out certain parts on unmanned cameras (like when you are standing in the frame yourself), or just to get a better frame when your camera was not set up completely straight or the framing was off - you can easily correct framing mistakes that do occur.

I would be interested, though, in seeing a ceremony shot with just one camera set wide, using the cropping ability to simulate a multicam setup. For instance, I have a wedding next week in a very small chapel that seats only 25 people. I have been told I can only shoot from the back from a fixed position, and I wonder what that would look like in 4K, shooting wide and doing my thing in post. 4K in a 1080p project will give obvious advantages to solo shooters, but I think we should be careful not to become lazy and think we can just do it all in post. 4K will give you extra options in post, but I don't think I would change my shooting style; I would just be happy knowing I can cover up framing mistakes and have extra framing options.

Stabilising in post is also something that will improve certain shots, but it will introduce other unwanted artifacts, and it will never match the natural-looking stabilisation of my Sony CX730s.

Nigel Barker
June 12th, 2014, 02:48 AM
In 1920x1080 I quite often reframe by introducing a slight crop to improve the image (as per Noa, to exclude junk from round the edges of the frame so it's more aesthetically pleasing) & I acknowledge that you will have more scope to do that with 4K, but it's not going to be the game changer that some proponents imagine.

Dave Blackhurst
June 12th, 2014, 03:12 AM
Let's take the typical stage show with multiple "zones", or wedding shoot with a "wide", a tight shot here and there on the B&G, and let's toss in a couple speakers or musicians or whatever up front, but off to one side or the other.... Sort of like the shot Noa is looking forward to!

Sure, you can try to be on top of who will be where and when (we know that's always predictable)... OR have a wide shot that can be cropped in as needed. DoF does not NEED to be shallow, or DSLR like - you want to have the shot. Presuming your camera placement and height is right, ONE 4K camera will cover.

This frees the single op up for moving and getting other angles and even some shallow Dof creative stuff if you want. Or get lazy, I suppose <wink>. If you do get lazy, well, your "post" will be "interesting"...


I don't know that I'd want to rely on stabilizing in post - 4K (especially 30p) is much more sensitive to motion, and presents some challenges when shooting - there is SOME downside to the high level of detail. And for whatever reason RS/skew are worse with the increase in resolution - probably due to the longer read time for more "lines". I'm sure this will improve over time, anything available today is "1st generation", and if there weren't some potential "gotchas" in the mix, it would be quite a surprise.


A 4K cam is just another tool. Potentially useful, with the right approach. Or could be useless if it doesn't fit your particular style. Practical may be different for different people...

Bob Drummond
June 14th, 2014, 02:17 PM
....A 4K cam is just another tool. Potentially useful, with the right approach. Or could be useless if it doesn't fit your particular style. Practical may be different for different people...

Well said. I agree completely. But I do disagree with this idea that you need to switch to 4K if you don't want to be left behind.

I don't think it is fair to take the switch from 1080p (or 2K) to 4K and compare it to the switch from SD to HD. They're not really comparable. In the US at least, "SD" mostly meant interlaced and 4x3 and 480 (or fewer) lines of resolution.

Now let's talk HD and above. On a modern television in the 50 to 60 inch range, the difference between 720p and 1080p is often difficult to discern. On a cinema sized screen, most of us would be hard pressed to see the difference between a 2K image and a 4K image unless we watched them side-by-side. Hell, 35mm projection, hailed as the gold standard of image quality for most of our lives, would have been at 2K or less once it got to the release print stage.

If I get "left behind" with HD and 2K, I'll be left behind with the Lord of the Rings and everything shot on the Arri Alexa. That's fine company for me.

Dave Blackhurst
June 14th, 2014, 06:19 PM
Have you actually LOOKED at 4K samples vs HD?? On any sort of proper display (even a 1920x1080 one), the difference is obvious.

Even downrezzed, starting with 4K the HD ends up looking better and sharper... one might call it "really good HD", since many "HD" cameras really aren't even resolving full HD...

I don't buy for a second that "the difference between 720p and 1080p is often difficult to discern" unless there are lots of other reasons that the image quality is not there.

I can easily see the difference between a 720 LCD panel and a proper 1080 one, most everyone either can already spot it, or be educated quickly to where the difference cannot be unseen. Sure, to someone unaware, it's just another image, but if you actually happen to care about image quality...


As I've already stated, when I was considering 4K the big question mark was what the heck to use as a monitor - 4K monitors and TVs are bloody expensive! I had heard about the Seiki 39" before, and seen it around $400... that's not TOO pricey, and I snagged a lightly used one for less. It took a lot of tweaking and adjustment, but it's noticeably sharper than my old 1920x1080 monitors. I'd rank it a HUGE improvement, both easier on the eyes and better for productivity from having more desktop space to work with. It cost a LOT less than ONE of my old monitors did many moons ago. All in all, a VERY good investment even if I wasn't trying to get 4K video capability in the upgrade!!

So tell me again how it's difficult to discern the difference?? Unless your eyes are a LOT worse than mine (not too likely), I find that hard to comprehend...


If your argument is that older content shot in 2K or HD or SD, or black & white is still valid, that's fine, I love old B&W TV reruns and bad old SciFi movies... CONTENT does not become less entertaining or worthwhile just because it was recorded with what was best and available at the time...

But let's say for a moment you are doing some "once in a lifetime" thing - do you want it memorialized on a cell phone, or something better? Either way, it's "captured", right? So who cares whether it was shot in SD, HD, 4K, 8K or holo-cam? Does it really matter?

Bottom line, just SHOOT! But use the best tools available...


I'm not going to say that anyone HAS to jump to 4K, but experience tells me it's coming up fast, and my eyes tell me it's an entirely different viewing experience (one everyone can appreciate, unlike 3D). To invest in a camera to upgrade/replace one (or two) already in use that can be sold for a good part of the upgrade cost makes sense to me. And I'll get better video in the bargain.

Bob Drummond
June 14th, 2014, 07:26 PM
Dave, for the seventeenth time, I agree with you that 4K is an interesting tool that may be of use, depending on your workflow, style, gear, etc. And I also understand the benefits of shooting at a higher resolution than you deliver. My Canon C100s downsample a 4K sensor to create a lovely 1080p image.

And I'm not talking about 720p "panels" vs. 1080p ones. I'm saying sitting about 10 feet away from my decent Samsung 55" 1080p LCD TV, comparing a blu-ray copy of a film to a high-bitrate 720p rip and a 1080p rip of the same film--the differences are negligible. Yes, the blu-ray looks best, the 1080p rip looks better, and the 720p rip looks good. They all look good. There is not a dramatic difference.

In a cinema, who the heck knows what we're seeing? Even if the theater advertises that it has 4K projectors, you're probably watching something shot in 2K. And if it was shot at a higher resolution, chances are it was finished in 2K. Once in a blue moon you may actually see a movie that was shot, mastered and projected in 4K. But you wouldn't know for sure unless you scoured the internet for articles about each film's post-production pipeline.

But they all look good.

Edward Calabig
June 14th, 2014, 08:08 PM
4K will eventually be practical, but the lack of decent cameras for it right now makes me think it's not quite there yet. A RED is not practical to bring to a wedding, and the image from the GH4 still looks like "video", with ugly highlights, and doesn't grade well. The A7S will require an external recorder.

I'm excited for 4k but it doesn't fit the profile for a practical video camera for weddings... yet.

Steve Burkett
June 15th, 2014, 02:43 AM
4K will eventually be practical, but the lack of decent cameras for it right now makes me think it's not quite there yet. A RED is not practical to bring to a wedding, and the image from the GH4 still looks like "video", with ugly highlights, and doesn't grade well. The A7S will require an external recorder.

I'm excited for 4k but it doesn't fit the profile for a practical video camera for weddings... yet.

Well, as I'm using a GH4 very successfully and am very pleased with both footage and grading, I'm going to have to disagree with you on that point. 4K in the right situation has proved very useful, so is it practical for weddings? Definitely. Given the extra workflow and rendering time, I'd hardly use it for weddings if it weren't. Now, whether 4K will take off as a future standard for video, this I can't say.

Jeff Harper
June 15th, 2014, 06:36 AM
How familiar this thread is beginning to sound. It feels like Groundhog Day. That's the problem with getting old. Everything becomes so familiar and repetitive.

Most of us do agree that the question is not whether 4K is coming but when. Correct?

It was fun this morning to look up some old SD vs HD threads. As I recall I was on the wrong side of that argument. In my defense I felt threatened. I had a huge investment in my Sony 4:3 cameras.

Anyway, the following quote is from 2006:


Anyway don't shoot in HD unless you're ready to switch and definately don't watch said footage until you're ready to edit in it. You pretty much get ruined after that :).

Agreed. Chris' statement applies today. Don't shoot in 4K until you're ready to switch, because you will be ruined after that :).

Also from 2006:

We shoot anywhere from 1 to 6 weddings a week and nobody has asked for 16x9.

To us 16x9 is overrated unless you are going to the actual movie house.

Our studio just picked up another pd170.

:)


Only Videographers and technofiles care about this type of stuff....lol

The average bride and groom, just want it to look better than the 1-chip camera that they use.

SD is the way to go.

I think because we as videographers get so caught up in the technology that we think our clients actually care. My clients could care less about 16:9

Just ask Glen Elliot. I know for a fact some of his best stuff was shot 4:3 with a Sony.

It seems like it was only yesterday that the debate between SD & HD and 16:9 vs 4:3 raged.

My money is tight; I am doing very few weddings and will be doing fewer every year. But once I shot with a 4K camera, I sent back my new HD camera and got a second 4K. It's just very cool to have the flexibility in post.

Looking ahead, it's sad to think that the window for having the cropping advantage of 4K is limited. Once everyone has 4K televisions, we'll no longer be able to take advantage of the cropping that 4K offers when producing 1080 videos, at least not the way we can now.

Jeff Harper
June 15th, 2014, 07:06 AM
My favorite quote from 2006:

I...have yet to "see" a difference.

When it comes down to it the resolution issue is probably the LAST thing about HD that would factor into reasoning for a switch and paying a prime for the newer technology.

Noa Put
June 15th, 2014, 09:54 AM
It will come eventually for me, only not this year or next. It took me 3 years to go from SD to HD, and by the time I did, all the workflow problems had been ironed out. There was plenty of user experience to be found online, and for every problem that ever happened a solution or workaround had been found, so if you had an issue an answer was just around the corner.

A big advantage from a business point of view (when you do weddings) is that if you wait, prices of current cameras will come down; if you buy such a camera just before a new one comes out, you can save quite a lot of money. The same applies to your PC. My current PC gives me a very fast workflow for 1080p, but it will have more problems with a few layers of 4K. The one I use now is about 1.5 years old and will have to last me at least 1.5 years more; by then a midrange PC will have the same performance as a high-end one now, at a lower price, so you again save money. By then you will also know any hidden camera issues from the many user responses, so you can make a better judgment on buying a camera.

I have to say that these past weeks, after seeing all those GH4 and AX100 videos from users, I almost pressed the (pre)order button. But then I had a conversation with myself and asked whether I really NEED a 4K camera right now or just WANT one. As the answer was the latter, I decided to invest my money in other things I need in the meantime.

So investment-wise, 2016 would be the best time for me to replace my PC, camera bodies, TV and media players that can stream 4K and make the complete switch. I would save the most money and have an easy transition and happy clients, even though I know that for every trailer I show here over the next 2 years, many will say it's not that sharp. :)

Bob Drummond
June 15th, 2014, 10:11 AM
I really don't think it's fair to compare the SD-to-HD transition to the transition from HD to 4K+. SD was nowhere near the quality of large screen cinema projection. 4K on the other hand is greater actual resolution than standard cinema projection.

But I'll happily accept 4K in my next camera. I just don't think it is ever going to make me a better videographer or help me book one extra wedding.

Jeff Harper
June 15th, 2014, 11:44 AM
The transition time for 4K will take a few years.

The transition will be marked by less grief than what occurred during the transition from SD to HD, because the transition is simpler, IMO. Many videographers have been through the previous transition and know what questions to ask when considering the move.

Dave Blackhurst
June 15th, 2014, 06:04 PM
My computer upgrade was overdue (yep, well over 3 years...), so it came down to how much of a boost I could get without killing the piggy... the main "hitch" was the "monitor", no $2-3K+ "budget" for THAT item!! I've described my cheap solution... the rest (CPU/MB/Memory) was not terribly bad, and when I sell off the old and still decently zippy core, it will be even closer to painless.


As far as cameras go, let's take a quick look - 6-9 months ago, the "price of entry" for a 4K capable cam was around $4K, IIRC... 3-6 months ago, that dropped to $2K... a few weeks from now the just announced FZ1000 will drop that to $900... By CES, there will no doubt be a flurry of cameras (such as they may be, consumer grade stuff to be sure!) breaking the $500 price point...

I don't factor in tablets and cell phones, those should be sub $500 too... but look at that price "curve" (actually looks like jumping off a cliff with a good run and no parachute!). Do you think 4K will be in the consumer mindset by this time next year? ALL that is really needed is a simple and relatively reasonably priced delivery format/media, and CONTENT (which may well be provided more by "unknown producers" than in the past?). The ability to CREATE 4K content is ALREADY pretty much there, and when 4K TV's drop... and they will... it will happen fast - consumers already expect resolutions above "HD" on their teeny tiny screens!!

I do have some doubts whether the "big boys" learned anything from the HDDVD v. BluRay debacle where NO ONE "won", and BR still suffers low adoption... it is possible to kill any good thing with greed and stupidity. But presuming that 4K content is available, one way or another, it should catch on fairly quickly.
Even if big screens at high prices aren't the "path", phones and tablets with ultra high resolution screens can easily be the demand "pull" in the market.


Jeff correctly observes that once you get 4K working, you definitely DO NOT want "good old HD" as much... it's nice having the additional sharpness and pixels to work with.

THAT SAID... similar to how you "could" play back SD on an HD screen, usually badly (and probably in the wrong aspect ratio), and have "acceptable" results, it is quite possible to play back HD on a 4K screen! AND IT LOOKS PRETTY GOOD. Not as good as native 4K, but "good enough" - and 4K on a 1080 screen seems to look better than 1080 footage too...

Yes, "HD" can now appear a bit "soft" (I'm finding the 50Mbps XAVCS 1080 60p mode in the AX100 to be better in this respect), but certainly "acceptable" for MOST situations. There are also going to be different qualities of "4K" I suspect - the samples from the FZ1000 don't measure up to the AX100 from what I've seen, despite similar or "better" specs. But it's pretty obvious the feature is going to be the next push.


With tech, it's always a challenge of "when to jump". It's hard to gauge when exactly the "old tech" is sufficiently slower or lower quality than the "new tech". SOMETIMES the "new" is nothing more than hype, but other times there is a very real quantifiable difference. I'm with Noa on usually trying to wait a bit, but cameras like the RX10 and AX100 replaced other aging cameras in ways that made them make sense to grab "open box" deals as soon as they popped up. They are proving to be good buys, as they are better "tools" than what they replace, just as with the computer upgrade.

So too with the "early adoption" bugs - yep, there are a "few"... I think I've stumbled over them all for the last 2-3 weeks, but am happy with the results. The sharper screen and images I'm working with now are very satisfying - and yes, older and lower resolution video and stills very much "show their age" by comparison.

I guess having managed to pull off 4K capability on the cheap with VERY easily seen improvements and a number of unexpected side benefits, I'm scratching my head a bit at how it could NOT be "practical"... the AX100 is proving to be the first video camera in quite a while that actually is fun and makes me want to shoot with it. It IS more of a challenge as 4K is not terribly forgiving to "bad camera technique", but that's OK...

Then again, I still have a number of "HD" cameras and still will shoot them when they best suit the situation, at least until upgrades with 4K are available!

SO -
Practical = YES!
Need = maybe?
Should you = depends!

I would suggest that 4K is inevitable, and if you're still shooting SD, maybe it's time...

Jeffrey Butler
August 14th, 2014, 01:05 PM
For weddings, today - right now - 4K is, without question, practical. You may have a different question, but for me: 1 4K camera = 2 HD cameras. Just do the math and use the tools that help you deliver a better product.

I'm replacing 2 JVC HM100s in a ceremony situation with 1 Lumix GH4. I'm so stoked about that math. In fact, when I rented a second one for the reverse angle they threw in the "reframing options package" for free.

***REFRAMING and wide/tight options are key here and why "practical" is a no brainer. In fact, just because there's a balcony, I rented a 3rd Lumix as well.

It's about options and the ability to do things in post to deliver your product. 4K has overwhelmingly more options when cutting to an HD timeline. If you don't care and your clients don't care and 4K won't "make your job easier" and "your product better" then don't shoot it.

But 4K is practical and doable today for weddings - and for me it's because of the GH4. I would not be shooting a RED wedding. That flavor of 4K is not practical for me. And for most of the wedding, at this point and because I just picked up a C100, I won't be shooting 4K the entire time. Just using it when it's "practical" (or awesome, or useful, or very helpful).

For those interested, I'll be shooting 3xGH4s and 1xC100. The C100 is primary; the GH4s will be relegated to the ceremony. And delivery is HD files as well as DVD.

Noa Put
August 14th, 2014, 01:47 PM
I recently posted a thread with some example images of the crop advantage here: http://www.dvinfo.net/forum/wedding-event-videography-techniques/524451-advantages-cropping-4k.html

I"m 6 weddings behind in my editing and only recently started to edit the footage I shot some time ago with my gh4 and in about 3 wedding edits my ax100 will be joining the edit party. Fun times :)