View Full Version : Editing Hd Footage


Joseph De Leo
November 24th, 2006, 01:59 PM
I'm the proud owner of a Sony HC1. My question, and hopefully you'll have some input, is:

What is the best computer system setup to smoothly handle and edit HD Video?

I currently have a P4 2.4 GHz, 2 GB of DDR RAM, a 256 MB Nvidia video card, and 160 GB & 80 GB hard drives.

When I try to capture, the computer chokes and the footage stutters, even on playback.

I need to know the proper components to build a kick-ass HD editing PC. My budget is no more than $1500.

Timothy D. Allen
November 24th, 2006, 06:09 PM
The biggest thing, in my opinion, is to upgrade your video card. My guess is that you're using an Nvidia card made for gaming, such as a GeForce. I would suggest an Nvidia Quadro FX; it's a workstation card, made specifically for our kind of work rather than gaming. You'll probably pay the same amount for a 128 MB workstation card as you would for a 512 MB+ gaming card, but that's not an issue: workstation cards don't need that much RAM when their processors are faster and streamlined for actual video rendering and the like.

I had a high-memory ATI video card that I figured was going to do the trick, no problem! It played all my games great, but when it came to running design software like Avid, Combustion and Photoshop, my video playback was sluggish and I had a lot of "window tearing" (when you drag a window across your screen and the top part gets there before the bottom). After doing a lot of reading, I found that a workstation card was the answer.

I would say do a little more research on the two types of cards before you take my word for it, but I've spent a lot of time reading up on the differences between them and what really matters in a video card as far as editing and compositing are concerned.

Regardless of whether this is what's causing your current problem, I'd say a workstation card like the Quadro FX is a must for a "kick ass" HD editing system.

Best of luck, and I hope that helps….

Peace,

Joseph De Leo
November 24th, 2006, 10:13 PM
Thanks Tim. That info does help. Do you have any other tips as far as other system requirements go: processors, hard drives, motherboards, etc.?

What kind of complete computer system are you using?

Larry Price
November 25th, 2006, 03:47 AM
I would echo Tim's comments about video cards. Get the best "pro" card you can afford. The Nvidia Quadro FX series seems to lead the market at the moment, though the ATI FireGL cards are supposed to be decent. You might want to look at the Matrox Parhelia cards as well. If you were to go with Adobe Premiere Pro, it's said to work particularly well with the Parhelias. I have Matrox Millennium cards (a P750 and a P650; they're just below the Parhelia line in terms of performance and capabilities) in two of my systems and they've been strong, stable performers, though I've not used them for video editing.

If you know which editing software you intend to use, you might be able to get a good idea of what kind of system you need by visiting the software's web site and looking for the system requirements and recommendations. Hardware requirements vary considerably depending on your choice of editors. For example, once I'd decided to go with Avid Xpress Pro Studio Complete, I followed Avid's recommendations and bought an HP xw8200 workstation with dual 3.6 GHz Xeons and a Quadro FX 1400 video card. I've also got 2 WD Raptor 150's in a RAID 0 configuration for the system drive, and will be getting a Medea VideoRaid RT3 for the video storage. Of course, such a system is considerably beyond your stated budget of $1500. I needed a high-end system because of the demands of the Avid suite. There are other very capable editors that have much more modest hardware requirements. From your post, I'm assuming you already have an editing package, so check the web and find out what the manufacturer recommends.

I've just begun learning all this stuff myself, but I'm happy to share what I learn as I go along. I'm a network admin by trade, and I've discovered that the world of video production is changing as rapidly as the IT industry, so I've essentially doubled the amount of learning and keeping up I have to do! Good thing I enjoy it!

Good luck to you, and let us know what you end up using.

Aloha!

Douglas Spotted Eagle
November 25th, 2006, 08:38 AM
With many video apps, the quality of the video display card makes no difference whatsoever. Some apps take advantage of the GPU; many don't. Therefore, an expensive video card may not be important at all. Be *sure* you know which app you're using first, and then be aware of what that application recommends. As an Avid and Vegas user, I'm using dual-head cards that are 2-3 years old, with only 128 MB of video RAM.

Joseph De Leo
November 25th, 2006, 09:41 AM
Thanks guys for your input. Douglas what kind of system are you using?

Timothy D. Allen
November 25th, 2006, 05:03 PM
Thanks Tim. That info does help. Do you have any other tips as far as other system requirements go, processors, harddrives, and motherboards etc...

what kind of complete computer system are you using??

Right now I'm on an AMD dual-core 4400, 2 GB of RAM, two 300 GB hard drives set up in a RAID, and some ATI card I got at Fry's. This has served me well, but I'm moving into a production department that runs completely on Macs, so my new system is on order and includes an Nvidia Quadro FX 4500 512MB with Stereo 3D (2 x dual-link DVI), which I'm pretty excited about.

I would agree with Larry that you should find out what the system requirements are for the specific software you'll be using, but mainly look at the specs. A lot of companies, like Avid, have 'deals' with PC makers like HP to recommend their gear. So I'd say take all the specs and build your own, and you'll end up paying less. I specced out my PC on Alienware for about $5k and built it for $2.5k.

Joseph De Leo
November 26th, 2006, 05:53 PM
Will do, thanks Tim. I'll let you know what happens

Jamie Hellmich
November 30th, 2006, 12:05 PM
Joseph,

Your computer sounds very similar to mine, and I have no problems with HDV using Vegas 7.0b; there is some preview stutter if I set the preview window at too high a resolution.

MAKE SURE you have no other programs running in background or otherwise.

Shut down all antivirus, monitoring, updating, or any other programs running. I even disable my wireless adapter. The only thing on my task bar when capturing or rendering is the clock, and I'd shut that down too if I could.

Also, having the preview window too large or at too high a resolution will cause the preview to stutter. Even when it does, the captured or rendered video is generally unaffected.

The hard drive you capture and render to should not have any programs installed on it, or anything reading from it, while capturing or rendering. I use an 80 GB drive for video and picture files only. No long-term storage or program use.

You don't say what software you are using, but your system should be sufficient to work with HDV, perhaps even better with a processor upgrade.

I did just order a 3.0 GHz P4 processor for my system today to replace the 2.26 GHz. I'm limited right now by a Socket 478 motherboard, so 3.0 GHz was all I could find. Should be a good $80 spent.

Jamie

Joseph De Leo
December 1st, 2006, 07:31 AM
Jamie

I'm using Vegas Video 7.0b as well.

Dennis Hingsberg
December 4th, 2006, 01:48 PM
What file format are the files in when capturing? Actually, what is the best way to capture HD to a PC?

Do you guys think a single 3.4 GHz dual-core processor with 1.0 GB of RAM will do the trick? The video card is a dual ATI Radeon X1300 Series card.

Lastly, from your experience, what's the best downconverting (i.e., HD to SD) process?

Can't wait for my next HD project which should be soon!


Thanks...

Steven Bills
December 4th, 2006, 06:14 PM
I think that the best process for doing HD to SD is simply rendering it to SD... If that was your question.

640x360, Bit rate: 150-250, depending, Sorenson 3 @ 50%, IMA 4:1 Audio, and 'fast start' for streaming.

That is what I use, and the movies turn out great, at a manageable file size too.
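For a sense of how bit-rate settings like those translate into file size, here is a rough sketch. It assumes the "150-250" figure above is in kilobits per second, which is an assumption on my part; some encoders quote kilobytes instead.

```python
# Rough file-size estimate for a streaming render.
# Assumption: bit rate is in kilobits per second (kbps); audio ignored.
def estimate_size_mb(bitrate_kbps: float, duration_s: float) -> float:
    """Return the approximate file size in megabytes."""
    return bitrate_kbps * duration_s / 8 / 1000  # kilobits -> megabytes

# A 3-minute clip at 250 kbps:
print(round(estimate_size_mb(250, 180), 1))  # ~5.6 MB
```

Double the bit rate and the estimate doubles, which is why a small bump in the bit-rate setting matters much more than tweaking the frame size for streaming delivery.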


SB

Dennis Hingsberg
December 5th, 2006, 12:09 PM
Thanks. This has definitely been a handy thread. I'm going to shoot some HD soon and try editing on my new machine - I just bought a Firewire card so I'm basically ready to go!

Dennis Hingsberg
December 11th, 2006, 09:39 AM
This question is for Tim or Larry...

With your dual-processor systems, at what frame rate and output resolution are you able to preview your editing timeline? Please also include the setting, i.e., Preview/Best or Full/Auto, etc. (This is if you're using Vegas; I'm not sure whether Avid displays playback frame rate.)

As far as "how powerful a system has to be for HD": I've noticed that capturing HD footage does not seem to be as intensive as playing back from a timeline at full frame rate (i.e., 29.97 fps). Throw a filter or two on your footage and then watch your PC bog down and grind to a halt.

So the real question for the guys using dual-processor PCs with the Nvidia cards is: what's the highest output you can get out of your PC before maxing out the CPU?

A killer dual-core system with 2 GB of RAM and a high-end Nvidia FX card is pretty useless if you can only preview your timeline at a 320x180 Preview/Good setting.

Mikko Lopponen
December 11th, 2006, 02:18 PM
...Do not buy a Quadro or similar workstation card for editing. It won't change anything; it'll just burn a hole in your pocket.

Dennis Hingsberg
December 11th, 2006, 02:47 PM
Mikko, can you elaborate a little on that then? What do you recommend for smooth editing and playback of HD?

See, this is where I (and likely others trying to build a solid HD system) get confused about the hardware specs needed for a good system.

Some say dual core, some say dual processor... and some say it's all in the card.

Having said that, an article at http://www.videoguys.com/HDV.html#system highly recommends the Quadro FX cards, since their graphics memory bandwidth ranges from 17.6 GB/sec to as much as 40 GB/sec.

Dennis Hingsberg
December 11th, 2006, 04:30 PM
...and to follow up on my last post, I also just read that Sony Vegas 7 does not use the GPU power available in Quadro FX cards... so anything like an FX540 or similar will do.

Use Premiere Pro, however, and it's a different story: you need the GPU power.

Okay, @#% it, I'm buying a Mac ;)

Mikko Lopponen
December 12th, 2006, 12:35 AM
Use Premiere Pro however and it's a different story you need the GPU power.

No, you don't. Quadro cards are basically hardware-accelerated OpenGL 2.0 cards. They are for 3D work. Premiere needs processor power, but some plugins (which you will never use) are 3D-accelerated.

After Effects utilises OpenGL more.

What you really need is lots of memory and a great CPU.

It's funny that the HDV handbook article is recommending Quadro cards over gaming cards when they are basically rebadged gaming cards with different drivers. How much bandwidth do you really need? It's not like they're texturing game worlds; they're just displaying one picture at a time.

Larry Price
December 12th, 2006, 04:15 AM
This question is for Tim or Larry...

With your dual processor systems, what frame rate and output resolution are you able to preview your editing timeline and please also include the setting ie. preview/best or full/auto, etc.. (this is if you're using Vegas, I'm not sure if AVID displays playback framerate)

As far as "how powerful a system has to be for HD" I've noticed that capturing HD footage does not seem to be as intensive as playing back from a timeline at full frame rate (ie. 29.97). Throw a filter or two on your footage and then watch your PC bog down and grind to a halt.

So the real question for the guys using dual processor PC's with the Nivida cards is what's the "highest" output you can get out your PC before taxing out the CPU?

A killer dual core 2GB of RAM high end Nividia FX card is pretty useless if you can only preview your timeline at a 320x180 preview/good setting.

Dennis,

I can't give you any numbers yet because I'm still buying all the pieces! I just received Avid Xpress Studio Complete today and hope to have everything installed by the weekend. When I bought my workstation, I just went with one of Avid's "qualified" systems (HP xw8200) and bumped it up to the fastest CPUs available at the time I placed the order (3.6 GHz Xeon), and went with 4GB of RAM (Avid recommended 2GB).

As far as the Nvidia Quadro cards go, I went with the Quadro FX 1400. Avid supposedly makes heavy use of OpenGL, especially in the Studio suite. I think that's why they push the Quadros so hard. At least that's what I've read in various places.

Once I have everything up and running, AND teach myself how to use all these nifty new toys, I'll be happy to let you know what this thing will do.

Aloha,

Larry

"So many hobbies, so little time..."

Dennis Hingsberg
December 12th, 2006, 02:00 PM
Cool Larry - if you get some numbers please post 'em! That would be great...

Conrad Gibbs
December 29th, 2006, 09:59 AM
No, you don't. Quadro cards are basically hardware-accelerated OpenGL 2.0 cards. They are for 3D work. Premiere needs processor power, but some plugins (which you will never use) are 3D-accelerated.

After Effects utilises OpenGL more.

What you really need is lots of memory and a great CPU.

It's funny that the HDV handbook article is recommending Quadro cards over gaming cards when they are basically rebadged gaming cards with different drivers. How much bandwidth do you really need? It's not like they're texturing game worlds; they're just displaying one picture at a time.
I disagree. Quadro cards are not rebadged gaming cards with different drivers; otherwise people would just install those "drivers" on the cheaper GeForce cards to obtain Quadro features.

Differences...

There are notable differences between Quadro and GeForce cards, and great effort is put into creating the right solution for each market.


Anti-aliased points and lines for wire frame display

A unique feature of Quadro GPUs is support for anti-aliased lines in hardware, which has nothing in common with the GeForce's full-scene anti-aliasing. It works for lines (but not for shaded polygons) without sacrificing performance or taking extra video memory for over-sampling. Since this feature is standardized by OpenGL, it is supported by most professional applications.


OpenGL logic operations

Another unique feature of Quadro GPUs is support for OpenGL logical operations, which can be implemented as the last step in the rendering pipeline before content is written to the frame buffer. For example, workstation applications can use this functionality to mark a selection with a simple XOR function. When this is done in hardware, the significant performance loss a GeForce adapter would incur does not happen. OpenGL can be used with either consumer or workstation adapters.

The most common applications for GeForce adapters are full-screen OpenGL games. CAD applications work with OpenGL windows in combination with 2D-elements.


Up to eight clip regions (GeForce supports one)

A typical workstation application contains 3D and 2D elements. And while view ports display window-based OpenGL function, menus, rollups and frames are still 2D elements. They often overlap each other. Depending on how they are handled by the graphics hardware, overlapping windows may noticeably affect visual quality and graphics performance. When windows are not overlapped, the entire contents of the color buffer can be transferred to the frame buffer in a single, continuous rectangular region. However, if windows do overlap, transfer of data from the color buffer to the frame buffer must be broken into a series of smaller, discontinuous rectangular regions. These rectangular regions are referred to as "clip" regions.


GeForce Hardware supports only one clip region which is sufficient for displaying menus in OpenGL. Quadro GPUs support up to 8 clip regions in hardware, keeping up the performance in normal workflow using CAD/DCC applications.


Hardware accelerated clip planes

Clip planes allow specific sections of 3D-objects to be displayed so that users can look through the solid objects for visualizing assemblies. For this reason, many professional CAD/DCC applications do provide clip planes. The GPU of the Quadro family supports clip-plane acceleration in hardware - a significant improvement in performance when they are used in professional applications.


Optimization on Memory usage for multiple graphics windows

Another feature offered by the GPUs of the Quadro family is Quadro memory management optimization, which efficiently allocates and shares memory resources between concurrent graphics windows and applications. In many situations, this feature directly affects application performance and offers considerable benefits over the consumer-oriented GeForce GPU family.

The graphics memory is used for the frame buffer, textures, caching and data. Nvidia's Unified Memory Architecture (UMA) allocates memory resources dynamically instead of keeping a fixed size for the frame buffer. Instead of wasting unused frame buffer memory, UMA allows it to be used for other buffers and textures. As applications demand more memory for quad-buffered stereo or full-scene anti-aliasing, managing resources efficiently becomes a more important issue.

Support for two-sided lighting

Quadro hardware supports two-sided lighting. Non-solid objects may display triangles from their "backside" when viewing the objects from the inside. Two-sided lighting prevents the lighting effect from dropping to zero when the object surface normal points away from the lighting source. As a result, these "backward-facing" triangles will remain visible from all possible viewing angles.


Hardware overlay planes

The user interface of many professional applications often requires elements to be drawn interactively on top of a 3D model or scene. The cursor, pop-up menus or dialogs will appear on top of the 3D viewport. These elements can damage the contents of the covered windows or hurt their performance and interactivity.


To avoid this, most professional applications use overlay planes. Overlay planes allow items to be drawn on top of the main graphics window without damaging the contents of the windows underneath. Windows drawn in the overlay plane can contain text, graphics etc - the same as any normal window.


The planes also support the transparency function, which when set allows pixels from underneath the overlayed window to show through. They are created as two separate layers. This prevents possible damage to the main graphics window and it also improves performance. Likewise, showing an overlayed window as transparent with graphics inside allows items in the user interface to be drawn over the main graphics window.


Clearing and redrawing only the overlayed window is significantly faster than redrawing the main graphics window. This is how animated user-interface components can be drawn over 3D models or scenes.


Support for quad-buffered stereo for shutter glasses

The Quadro GPU family supports quad-buffered stereo; the GeForce GPU family does not. Quad-buffered stereo is OpenGL functionality that does not depend on any special stereo hardware to show the effect. Two pictures, both double-buffered, are generated. Display is done alternately or interlaced, depending on the output device.


Many professional applications like 3ds Max, SolidWorks or StudioTools allow users to view models or scenes in three dimensions using a stereoscopic display. It can be done by a plug-in (as in SolidWorks), an application driver (like MAXtreme in 3ds Max), an external viewer (like QuadroView for AutoCAD-based products), or by the application itself. Stereoscopic display gives an overview of complex wireframe constructions, makes walkthroughs much more realistic and impressive, or simply improves the display of large 3D scenes. Stereo support in the Quadro GPU family significantly benefits professional applications that demand stereo viewing capabilities.


Unified Driver Architecture

Quadro GPUs provide several additional features and benefits for professional optimization and certification in applications.


Application Optimization


Nvidia works closely with workstation application developers, including Alias, Adobe, Autodesk, Avid, Bentley, Dassault, Discreet, MultiGen-Paradigm, NewTek, Nothing Real, Parametric Technology Corp. (PTC), SDRC, Softimage, SolidEdge, SolidWorks, and Unigraphics, to ensure that every application takes full advantage of the features provided by the GPUs and that graphics driver performance is fully optimized.


Certification

Quadro drivers undergo rigorous in-house quality and regression testing with various workstation applications. By testing new workstation drivers against numerous applications, higher quality drivers can be released.

Mikko Lopponen
January 1st, 2007, 04:47 PM
with different drivers, otherwise people would just be installing these "drivers" to the cheaper Geforce cards in order to obtain Quadro features.

People were doing just that. A new BIOS and, wallop, a GeForce had turned into a Quadro.

http://www.hardwaresecrets.com/article/72

That article is from a couple of years ago.

Timothy D. Allen
January 1st, 2007, 06:07 PM
Interesting. This is the first thing I've seen that says anything remotely close to this.

I'm not completely convinced, but I'm sure going to do some more research.

Graham Hickling
January 1st, 2007, 10:52 PM
I kinda skimmed this thread, but I don't see Joseph mentioning what NLE(s) he plans to use. The issue of whether a high-end graphic card will be helpful or not depends entirely on that.

Most (all?) of the items on Conrad's list will have no bearing whatsoever on Premiere Pro performance (for example). In that case, as Mikko says, it's better to spend the $$ on a faster processor with more cores.

AfterEffects makes greater use of OpenGL for some (but certainly not all) of its features, and other programs like Vegas and Liquid have their own different functionalities.

Joseph De Leo
January 2nd, 2007, 03:23 PM
I will mostly be using Vegas 7.

Conrad Gibbs
January 3rd, 2007, 07:56 AM
People were doing just that. A new bios and whallup, a geforce had turned into a quadro.

http://www.hardwaresecrets.com/article/72

That article is from a couple of years ago.

The last GeForce you can apply SoftQuadro to is the GeForce 6800; beyond that it's unworkable...

With the Quadro FX, it's the added 3D capabilities and features and the certified, stable drivers you're paying for.

You'll generally find that both Quadro and GeForce will provide excellent performance. The Quadro however will provide additional performance characteristics in OpenGL applications. The finely tuned drivers will provide application specific optimizations for you, as well as enhanced anti-aliasing support.

The Quadro cards also support additional OpenGL calls which may not be supported in the GeForce drivers. These additional functions are mainly utilized in hardware-based final-frame rendering, such as Maya's hardware renderer and Nvidia Gelato. There have been notable image-quality differences between Quadro and GeForce in these respects.

One additional thing to note is that, regardless of these advantages, you will be paying a premium for hardware support and quality assurance. Additional testing is generally done on each GPU and board that will be classified as a Quadro, to ensure that it functions properly both when you get it and years down the road. Support, I believe, is lifetime as well.

Joseph De Leo
January 8th, 2007, 12:16 PM
Happy New Year

I finally upgraded to a dual-core E6700, 2 GB of RAM, a 500 GB hard drive and an Nvidia Quadro FX 1400, and it is absolutely awesome! My new system really flies.

However, I have noticed a new little issue, and this concerns Vegas Video 7. I've been in post-production on my first feature film for the better part of last year, and I've noticed that Vegas does not seem to handle footage shot under red lighting: I see severe pixelation on the footage as well as some artifacts. I thought it might be a codec issue, but now I'm not sure.

To see a sample of what I’m talking about- view my trailer, here’s the link:

http://www.deleoproductions.com/prosandcons.htm

Have any of you experienced any of this?

Dennis Hingsberg
January 8th, 2007, 12:55 PM
Hi Joe, checked out your trailer - great stuff... liked the actress too!

I don't think the issue has anything to do with Vegas. I remember an old post I read on this with regard to SD video and the DV format. The issue is a limitation of the DV codec itself; in particular, the way the chroma (color) information is sampled versus the luma (brightness) information. (YUV information is sampled at 4:1:1 for NTSC DV.)

The technical explanation (taken from www.adamwilt.com/DVvsMJPEG.html ) is the downsampling that takes place going from the RGB color of your image to 4:4:4 YUV color space, which then gets downsampled to 4:1:1 for use by the DV algorithm. Basically, three quarters of your color information is thrown away, and the result can be pixelated edges around saturated colors.
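The downsampling described above can be sketched in a few lines. This is a toy illustration, not the actual DV codec: 4:1:1 keeps one chroma sample for every four luma samples across a scanline, and a simple decoder just repeats it. A hard chroma edge that doesn't land on a 4-sample boundary comes back shifted and blocky.

```python
import numpy as np

# One scanline of a chroma channel with a hard red-to-black edge
# between samples 5 and 6 (values are arbitrary illustration units).
chroma_row = np.array([0.0, 0, 0, 0, 0, 0, 200, 200, 200, 200, 200, 200])

subsampled = chroma_row[::4]               # 4:1:1: keep every 4th sample
reconstructed = np.repeat(subsampled, 4)   # naive decoder repeats each one

# After the round trip the edge has moved to the next 4-sample boundary
# (between samples 7 and 8), producing the blocky chroma edge.
print(reconstructed.tolist())
```

The luma channel keeps full resolution, which is why the blockiness shows up only around strongly colored (especially red) areas rather than in overall detail.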

Many SD cameras have this issue, including the XL1, XL2, VX2000 and PD150... it's mainly observed when shooting the color red.

A fix I have read about (if your software lets you do it) is a 4-pixel horizontal blur on the chroma channels of your video. If your software doesn't support that directly, you may have to separate your channels, perform the blur, then recombine the channels into one picture. There is probably an easy way to do this.
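A minimal sketch of that fix, under the assumption that the "4-pixel horizontal blur" means a width-4 box average applied to each row of the chroma plane only (the function name and edge-padding choice here are mine, not from any NLE):

```python
import numpy as np

def hblur_chroma(chroma: np.ndarray, width: int = 4) -> np.ndarray:
    """Box-blur each row of a 2D chroma plane horizontally.

    Width 4 matches DV's 4:1:1 horizontal chroma subsampling.
    Rows are padded with their edge values so borders don't darken.
    """
    left = (width - 1) // 2
    right = width - 1 - left
    padded = np.pad(chroma, ((0, 0), (left, right)), mode="edge")
    kernel = np.ones(width) / width
    return np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="valid"), 1, padded)

# One chroma row with a hard edge; the blur ramps it over ~4 pixels,
# softening the stair-step instead of leaving a sharp jagged boundary.
row = np.array([[0.0, 0, 0, 0, 200, 200, 200, 200]])
print(hblur_chroma(row)[0].tolist())
# [0.0, 0.0, 50.0, 100.0, 150.0, 200.0, 200.0, 200.0]
```

Because only chroma is touched, perceived sharpness (carried by luma) is unchanged; the blur just hides the blocky color transitions.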

I just found this free resource by VASST explaining how to do it: www.vasst.com/search.aspx?text=Chromablur but I think you have to sign-up to read it.

Let me know how you make out with this.

Joseph De Leo
January 8th, 2007, 01:57 PM
Thanks man! see ya tonight!

Mikko Lopponen
January 9th, 2007, 03:30 AM
The Quadro cards also support additional OpenGL calls which may not be supported in the GeForce drivers.

Yeah, but it's just a driver thing. There's no reason a 8800 geforce couldn't be able to technically perform all those little tricks. If memory bandwidth is the be-all-end-all then the 8800 series has something going on as the gtx has about 86.4 gb/s. Compared to the highest quadro fx 5500 which has about 33.6 gb/s.

But ofcourse memory bandwidth doesn't correlate to "business" performance whatever that is. Video editing doesn't really utilise those specific opengl calls anyway. They are for 3d-work.

Dennis Hingsberg
January 9th, 2007, 02:04 PM
So Joe and I got together last night to do some HDV testing on the new crazy fast PC and some HDV footage...

Here are some results:

We dropped two video clips, 1440x1080 60i 29.97 fps and similar in size, onto the Vegas 7 timeline. One clip was captured directly to the hard drive in the M2T format using Vegas. The second clip was captured with a trial version of Cineform (Prospect HD, 1.5kbps, 24-bit, 29.97).

Using the "Preview/Best" viewer setting:

Test One: Both files played perfectly on the timeline without any change in preview frame rate.

Test Two: Added yellow tint, brightness/contrast and glow filters. The M2T played back at about 23 fps average; the Cineform file was lower, at about 19 fps average.

Conclusion: WTF???

Isn't Cineform supposed to yield a 4x increase in performance? Anyone have any ideas we can try?

Luis de la Cerda
January 25th, 2007, 01:48 AM
Using NewTek's SpeedEdit on my 2.33 GHz Core 2 Duo MacBook Pro 15", with the external display connected to a Dell 24" 2407 monitor, I can do two streams of HDV with color correction, a DVE applied to the second stream, and a title overlay, all in real time, monitoring at full resolution. :) I just love it :)