View Full Version : Quad Core 2.4Ghz vs Core 2 Duo 3.0Ghz


Jon McGuffin
August 2nd, 2007, 10:05 PM
Curious (David) or anybody else...

What kind of performance differences are we seeing with these two CPU's? Currently putting a new system together, will be editing via Vegas and then buying another copy of NEO HD.

Currently both CPU's are identical in price. One has 4 cores at 2.4Ghz, the other has 2 cores at 3.0Ghz and a 1333Mhz FSB. I'll be building in a 3 disk RAID 0 setup but I mainly want something that will give me strongest playback performance on the timeline editing Cineform .AVI files.

Anybody have any idea which way would be better? I do realize that rendering on the Quad will probably be quicker, but I'd sacrifice some rendering performance for raw timeline performance. I've found editing multiple video streams simultaneously while applying transitions and effects to my HDV source video to be a little slower than I'd like on my current E6600 system.

Any advice or help is greatly appreciated,

Jon

David Newman
August 3rd, 2007, 09:09 AM
I still slightly favor the 3GHz dual core with the 1333 FSB over the slower-clocked quads -- I'm sure that will change soon.

Jon McGuffin
August 3rd, 2007, 01:03 PM
Great... That was the angle I was kinda leaning towards with my gut as well. I'm sure, in time, that may not be the case, but perhaps by the time quad cores are really utilized, we'll be up to 3.0GHz+ quad cores for half the price the 2.4GHz chips are now...

In a world where it's easy to complain about a lot of things not being the way they should be... buying computers at these great prices, I think, often goes unappreciated...

Jon

Stephen Armour
August 3rd, 2007, 02:28 PM
Jon, you could always overclock. If your board and memory support it, the E6600 appears to overclock well. If we get a better cooling solution, we'll probably overclock our quad to 2.7 or 2.8 just to gain some editing speed. Supposedly it clocks to 3GHz fairly easily.

Makes me wish I had a V-8 (dual quad) now

Jon McGuffin
August 3rd, 2007, 05:11 PM
I can overclock, and that's an option, but in most cases I think I'd rather stick at stock speeds. If the processors were unlocked like the "Extreme" series, I'd be more inclined to do it by changing multipliers and bus speeds independently. I don't much care for "stressing" my systems to the point of having to rely on fans and very high quality power supplies.

I still live by the idea that a stable and reliable system is more important than a fast one.

From an overclocking standpoint, I'm pretty sure I'd have more luck with the E6850 than the quad core, as those chips are already drawing a lot of power and generating a lot of heat. An extra 600MHz on each of two cores is nothing to sneeze at, and though I *know* rendering times in Vegas will be faster on the quad, I can't help but think that day-to-day tasks, particularly on the Vegas timeline, would be improved by the dual core's higher frequency.

Jon

Richard Leadbetter
August 4th, 2007, 08:07 AM
You need to be really careful with heat when it comes to overclocking the Q6600 - even at stock it's a hot chip. You also need to get your BIOS options right, and you need to have the right motherboard.

I wouldn't recommend it personally.

Damon Gaskin
August 4th, 2007, 03:17 PM
That's interesting, guys, because my Q6600 idles at 25C and hasn't gotten over 42C since I've had it. That includes encoding (which, btw, has only used at most 42% of CPU power so far). One thing to keep in mind with any machine is case cooling; the Antec 900 case I have now lowered my CPU temps by at least 10C, from what I remember. And yes, it does overclock really easily. I just played with mine to see how high it would go without problems, and I easily took it to 2.86GHz on just air and the stock cooler; if I remember, it went up to 32C, and 44C when encoding. I currently have it at 2.6GHz and 26C while I type this.

I am not sure how the Cineform codec works, but I installed a Microsoft update (KB936357) that someone suggested on Newegg just as an experiment, and the machine is working even better than it originally was (which, to be honest, is really hard to believe). I can easily run Photoshop, Premiere, AE, and Soundbooth while burning a disc in Encore without any of the hiccups or glitches I experienced with my Pentium D. Yes, the clock speed is lower than the duals', but I would have to say these chips are extremely efficient in the way they are manufactured. I don't know, but I have never had a machine run like this before. It is simply amazing to me. To be honest, I have yet to see the processor go to 100% as it normally did with my previous processors.

It just depends on what you're going to do. If you're really going to multitask, the quads are amazing chips... I also definitely have not had any timeline slowdowns at all. But then again, I use Premiere, which does utilize all of the cores.

I just can't recommend the quads enough for editors, to be honest. As the other user suggested, make sure your motherboard supports it, but from what I understand, boards from most major motherboard manufacturers from the 965 chipsets onward support the quads. I am running an Abit AB9 and it required no BIOS updates at all. It was simply a drop-in and took only ten minutes to install.

Just my .02..

Bob Buchanan
August 4th, 2007, 04:59 PM
Adobe apps work better on quad cores:

http://www.extremetech.com/article2/0,1697,2049695,00.asp

To quote: "While the quad-core chip does well here, the four cores don't make as massive a difference as they did in the 3D rendering tests. The increase in performance is substantial, and if you're doing a ton of video editing and encoding, with lots of transitions and filters, you should see significant productivity increases."

Salah Baker
August 4th, 2007, 05:17 PM
Sorry, I am on quad cores, and no, Adobe does not scale.

Jon McGuffin
August 4th, 2007, 06:53 PM

Damon,

What editing software are you using? While rendering, your CPU should be pegged at 100% - after all, this is what you paid for when you bought a quad core CPU for rendering. Some applications are able to take advantage of quad cores better than others. I know Vegas has a good reputation for delivering rendering times that are almost 100% faster than dual cores' at the same frequency.

The real question here, though, is: what happens when a dual core CPU runs 600MHz faster on each core (E6850)? In addition to the raw speed increase, throw in a faster bus speed, newer stepping, lower power requirements, less heat, etc., and this has me doubting the quad core is the only way to go. Especially since rendering isn't something that really holds me up: either I'm rendering something that isn't very long, and waiting one minute versus 40 seconds just doesn't bother me, or it's a large project that I'll have the system do overnight either way.
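A rough way to frame this render-versus-clock tradeoff is Amdahl's law. The sketch below is illustrative only: the parallel fraction (0.98) and the assumption that performance scales linearly with clock speed are mine, not measurements from Vegas or any benchmark.

```python
# Amdahl's-law sketch of the quad-vs-dual render question (illustrative;
# the parallel fraction and linear clock scaling are assumptions).
def render_speedup(parallel_fraction, cores, clock_ghz, base_clock_ghz=2.4):
    """Estimated speedup relative to a single 2.4GHz core."""
    amdahl = 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)
    return amdahl * (clock_ghz / base_clock_ghz)

# A near-perfectly threaded render (as reported for Vegas) on a 2.4GHz quad:
quad = render_speedup(0.98, 4, 2.4)
# The same workload on a 3.0GHz dual core (E6850-style clocks):
dual = render_speedup(0.98, 2, 3.0)
print(f"quad: {quad:.2f}x, dual: {dual:.2f}x")
```

Under these assumptions the quad still wins for a well-threaded renderer, while for a task with a low parallel fraction (like single-stream timeline playback) the 600MHz clock advantage dominates - which is the intuition behind the whole thread.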


Jon

Mark Williams
August 4th, 2007, 07:37 PM
Since we are talking about the Q6600, what do you think about this setup from Dell at $969.00?

Dell 9200
Intel® Core™ 2 Q6600 Quad-Core (8MB L2 cache,2.4GHz,1066FSB)
Genuine Windows® XP Home Edition
2GB Dual Channel DDR2 SDRAM at 667MHz - 2 DIMMs
160GB Serial ATA 3Gb/s Hard Drive (7200RPM) w/DataBurst Cache™
Single Drive: 16X CD/DVD burner (DVD+/-RW) w/double layer write capability
20 inch E207WFP Widescreen Digital Flat Panel
256MB NVIDIA GeForce 8600GT-DDR3
Integrated 7.1 Channel Audio
IEEE 1394 Adapter

Jon McGuffin
August 4th, 2007, 09:36 PM

Well... that certainly looks like a good place to start. I'm pretty much convinced that you'll want a second 20" display for video editing work, and that 8600GT is a great video card as well. Also, the hard drive is just a start - consider adding a few extra drives and setting them up in a RAID configuration. Not bad, though..

Jon

Pete Bauer
August 5th, 2007, 07:23 AM
Overall, a pretty good value for dollar. A couple of personal opinions:

Consider going with XP Pro, though, as Home is missing a few things under the hood that you might want later in terms of computer admin capabilities.

FWIW, I'm also toying with the idea of building a new box with XP 64 bit, which some people seem to be liking, as long as they're careful to make sure 64 bit drivers are available for ALL hardware in the system. Might be more hassle than it is worth, though...don't know yet.

For hardware, now that systems with the 1333 FSB are out, I personally wouldn't buy a 1066 system for future HD editing use. My understanding is that Dell builds their own motherboards, so short of getting the specs from Dell, no way to know if the MB would support future 1333 processors.

For HD, you'll want to set up a RAID 0 for your data files. Make sure the MB supports RAID 0 or that you have the budget remaining to put in a RAID card and get a couple more disks.

Damon Gaskin
August 5th, 2007, 07:29 AM
Hi Jon. I am using CS3 Production Premium with the Matrox RTX2 version 3.0 drivers. To be honest with you, as I stated, with my previous CPUs it did go to 100% and stay there - but then again, a lot of my other applications did so also. With this chip, nothing ever gets it to 100% except when I ran some benchmarks, and I even had to run multiples of those, tweaking the settings to "force" them to stress the processor more. To be honest, from the reviews of other quad users on my favorite Newegg (lol), this is not uncommon at all - not just with Premiere and the Adobe software, but with most other applications too.

For example, with my previous processor, when I did a render, yes, it would shoot up higher for sure - but then again, it would also take much longer. So I am not sure, but I believe the processor is simply more efficient, as this is actually happening across the board with every program since purchasing the chip.

And to be honest, I am not sure of the comparison between the dual and the quad. I know I have read that they are recommending the quads strongly for editing and the higher-clocked duals for gaming; I am actually even seeing this on the Intel and Microsoft sites. And this may sound strange, but I would lean towards the route of the Mac Pro machines, since they have focused on getting more cores in their boxes versus higher GHz speeds, though they do have some at higher GHz. I simply know mine is cruising. I am rendering something in Premiere right now - a 1:19 timeline (a scale-to-frame-size and different-format video) - and the little counter says it is only going to take 30 minutes, all at 38% of processor usage. And yes, some programs are, I am sure, definitely more efficient than others.

One question for you though, Jon: you said you're not talking about rendering and the like, so what are you speaking of? The layering ability of the processor - like how many levels you can stack on top of one another? What format are you using, DV or HDV? I can test and see, though I don't think it's fair with the Matrox card; but I can try a native Adobe project, just to see. I know with the Matrox card for DV, I can now stack 11 layers. In my little test, I was able to do 8 video layers with both opacity changes and motion, creating static PIPs, and 3 titles on top of those.

With HDV, I was able to stack 3 layers of video (1080i captured via FireWire), 2 of which had opacity and motion changes creating PIPs. I also was able to add three titles without any hiccups. When I added the fourth layer of video, it did not prompt a render until I went to adjust the opacity of the clip or added the motion to create the third PIP on that fourth video layer. I also played around with the three titles, and after trying to add a ripple to the third title, I called it quits.

Throughout this test, I never had a stall, the system didn't require renders at all on the DV test, and the timeline was always responsive. I had Premiere, Firefox, Photoshop, and IE open writing this message, and the processor still only shot to 73 percent with a temp of 32C. When I exported the HDV multi-layer clip down to 50MB I-frame, it took the processor to 77% max and a temp of 37C, taking only 1 minute 28 seconds to export the 1 minute 31 second timeline (I deleted the fourth-layer clip since I wasn't using it anyhow). When I then brought this into Cinemacraft Basic for a two-pass 8.9MB encode, it took 41 seconds for the first pass and 51 seconds for the second. If my math is right, that is basically 2X realtime in total to export a realistic multi-layered timeline to a nice-quality format ready for burning. It was a small test for sure, but you could easily have substituted color correction for the opacity changes and PIPs. Before, I was looking at 6 to 9 times realtime for exporting and rendering HDV. With DV, forget about it! Exports out of Premiere have always been, since my X100 (though that wasn't I-frame, just regular DV), much less than realtime for the most part from the timeline, and with this processor it's no different. I actually exported a one-hour timeline the other day and it only took 20 minutes. The conversion to MPEG in Encore took another 20, and I was good to go. Still under realtime...

I am sharing all of this simply to show my workflow and how this processor has accelerated it. Saying these chips are faster is definitely an understatement for me, even with the cores never hitting 100 percent. My thought is: if it does what it is supposed to, faster, with less CPU utilization - which in turn generates less heat and requires less CPU cooling (none of the fancy Thermaltake and Arctic Cooling paperweights) - it doesn't matter to me that it doesn't "go to 100%". That just says it's more efficient with the same software.

That is on my system. I usually don't do heavy effects for my projects, but I have honestly discovered that with HDV I am going to have to upgrade my GPU, as the Matrox card depends on it - that is why I got the render prompt with the ripple effect. But back to the processor.

I once again don't know how the E6850 would compare to this, but I am also curious, and would like anyone who has purchased it to compare. I simply believe that either way, you're going to be fine. The newer chips are much more efficient and cooler - and my quad is the "supposedly" hot one with the B3 stepping. I have also read great things about the E6850, so I would think you're fine either way.

Also, see the little pic enclosed.. This was during the playback if I remember correctly..

D

Jon McGuffin
August 5th, 2007, 08:57 AM
Damon,

Thanks for the thourough (sp?) post.

Let me address a few points..

#1) I use Sony Vegas 7.0. Most people using the latest update of this software who also have a quad core CPU are reporting rendering times that are almost exactly 2X faster than what they had with a dual core (quad 2.4GHz vs. E6600 dual core 2.4GHz, for example). This is contrary to common experience with quad cores: most benchmarks for games, and even video editing tasks such as rendering with Windows Media Encoder or DivX, show sizeable performance improvements, but not the true "twice as fast" improvement Vegas has been able to achieve in rendering.

#2) I edit in HD (HDV), and a lot of my edits are multi-cam in nature, so I need to preview 4 simultaneous HDV videos on my secondary monitor and play switching camera operator on them while building my timeline. When previewing these 4 HDV images at quarter size on my monitor, playback is limited to about 15-20fps for each video. I'd like this to be the native 30fps I shoot the source material in. I've been told this is mostly a disk- and CPU-intensive operation, but frankly, I can't get a straight answer out of anybody. I do have the source video on two drives in a RAID 0 configuration.

#3) What I'm looking for is timeline performance. And when I say that, I mean that when I apply heavy color correction, transitions between frames, etc., I want the preview playback to be very smooth and retain the full desired frame rate. Right now on preview, my playback will be completely smooth until I hit a video with lots of changes, and then, naturally, the playback is more difficult for the computer and it slows a bit. Keep in mind, I do *not* use the temporary timeline rendering functions in Vegas. This is all "real-time" usage.

#4) When using Cineform .AVI files, I find that performance on the timeline is about 10-15% slower than editing the raw .m2t HDV files from the import. Cineform has acknowledged that playback performance of raw HDV files in Vegas 7 has been vastly improved, and the small performance hit while using Cineform is to be expected. I prefer working with Cineform files as my source, so anything I can do via hardware to win back that performance hit is great.
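On the multicam preview question in #2, a quick back-of-envelope check can narrow down disk versus CPU. HDV transport streams run at roughly 25 Mbit/s; the CineForm bitrate and RAID 0 read speed below are assumptions for illustration, not measurements:

```python
# Back-of-envelope check of whether four simultaneous streams are disk-bound.
# HDV is ~25 Mbit/s; the CineForm bitrate (80 Mbit/s) and RAID 0 sustained
# read speed (120 MB/s) are assumed figures, not measurements.
MBIT = 1_000_000

def aggregate_mb_per_s(streams, bitrate_mbit):
    """Total sustained read rate needed, in MB/s."""
    return streams * bitrate_mbit * MBIT / 8 / 1_000_000

hdv = aggregate_mb_per_s(4, 25)       # 4 raw HDV streams -> 12.5 MB/s
cineform = aggregate_mb_per_s(4, 80)  # 4 assumed-80Mbit CineForm streams -> 40.0 MB/s
raid0_read = 120                      # assumed two-drive RAID 0 sustained read, MB/s
print(hdv, cineform, raid0_read)
```

If the aggregate stream rate sits well below what the array can sustain, as in this sketch, the frame-rate cap is more likely decode (CPU) bound than disk bound.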

What I really need to do is find a Vegas user with a quad (easy) and then a Vegas user with an E6850 (harder, because it's brand new) and have them run a couple of files.

I'm not in a hurry to buy, but want to order something in the next couple of weeks ideally. As David said above, I still suspect that the 3.0Ghz Dual Core might offer me overall better performance than the 2.4Ghz Quad Core for what I'm looking to improve on...

Time will tell.. :)

Jon

Damon Gaskin
August 5th, 2007, 09:54 AM
Hmnnn... OK, let me ask you a question. Aren't you the Jon from Videoguys? If so, then you know what the X2 can do! You also know that many users (me included) have wished for Matrox to support another NLE, especially Vegas, as it is extremely powerful in its "native" state.

I love CS3, to be honest, and though it did not add a lot to the feature set for Premiere, on my machine it is much, much more stable, and the performance, even when I had the Pentium D, is much higher and tighter. If you are that Jon, it would be good for you guys to try to pick up a quad and test the difference it makes, because it really does make one. Not even native Premiere does what you're in need of, and because you want to run "realtime", I think you would need to look at some form of hardware acceleration to bridge the gap. These days, with the processors and the programs, the cards and accelerators speed up exactly these things to the point of the slowdowns being mostly nonexistent. I am not going to say that there isn't a set of issues when you first begin using any hardware card (contrary to what some believe), because you're adding something to your machine that originally wasn't there - but then you get past the learning curve of the card (which is strange; this one was an easy migration and not as picky as the X100).

I know this sounds crazy, but from everything you're asking, and what you're doing, the Matrox card would solve all of this. I know you probably won't do that, but it's the truth. With color correction, it is very rare that I ever have to render, even when using multiple layers. Dissolves never require rendering unless you get to, say, multiple layers with dissolves at the same points.

For the multicam, if it's anything like Premiere, you end up with one layer of video with the cuts, between which cross dissolves and other transitions can be added. If you're like me, using multicam a lot of the time to cut that single resulting timeline, you add titles, PIPs, and little animations on top of that to spice things up a bit, and by and large I don't hit a render necessity. My projects usually aren't that complex; after the multicam they end up with the above, maxing out at maybe two to three layers at most and two audio tracks. I also usually use colored mattes for dissolve-tos and fade-tos on top of cross dissolves, and they never require renders. It's just the fancier Matrox effects, with my graphics card, where I experience the framerate slowdowns you're speaking of, once in a blue moon.

And also, for the multicam, I think it is a disk-intensive operation; from what I have observed, anything using HDV files is very disk intensive. When I did that test of the three layers, even scrubbing the timeline, my HDD LED was lit up like a Christmas tree. So I think it's really HDD intensive, and it is something I am thinking about too, as I am just running a simple two-drive RAID 0. While they are still available, I am actually looking at the AB9 Pro motherboards, since I know the layout and they have more SATA connections, allowing me to run say a 4-disc array and split the burden.

But did you try to find a Vegas user on their forums, if they have any? I am sure there has got to be one or two.

But since this is Vegas, I will leave it be. I wish you the best and am sure you will find your ideal solution these days. CPUs are so much more powerful than they used to be that it's daunting. And sorry about the long post again..

BTW, what is (sp?)?

Damon

Jon McGuffin
August 5th, 2007, 01:14 PM
Nothing to be sorry about. (sp?) was inserted by me to basically say "How the heck do you spell this? Did I get it right?"

I'm not the same Jon who evidently works at the Videoguys.com. Wish I were though because if I were, I'd probably have my answer! :)

I'm going to go check the Sony Vegas forums now. I hate the way they're laid out, and it takes forever to find something, but with some luck I will in fact stumble across somebody else who has found the answer to this question.

Jon

Michael F. Grgurev
August 5th, 2007, 01:37 PM
Quick thought.

I'm not really heavily knowledgeable when it comes to hardware interaction when working within an editing timeline, but could the lack of full CPU usage in a quad core setup be a result of a bottleneck elsewhere in the system? Like in the hard drive setup, which as you mentioned might need an upgrade, or elsewhere.

Also, out of curiosity: what happens if you disable the application's use of one of the processors through Task Manager? Do the other CPUs kick up to a higher usage rate? And no, I'm not inferring that'd be better, although I'd be curious to see the performance differences if the other three processors did kick up, but still not all the way.
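This affinity experiment can also be scripted. On Windows it's the Task Manager "Set Affinity" dialog (or `start /affinity` from a command prompt); the sketch below shows the Linux-only equivalent from Python's standard library, just to illustrate the mechanism:

```python
# Sketch of the "disable a core and watch the others" experiment.
# os.sched_setaffinity / os.sched_getaffinity are Linux-only; on Windows
# the same thing is done via Task Manager's Set Affinity dialog.
import os

pid = 0  # 0 means "the calling process"
original = os.sched_getaffinity(pid)
print("allowed CPUs:", sorted(original))

# Restrict the process to CPU 0 only, as if the other cores were disabled:
os.sched_setaffinity(pid, {0})
assert os.sched_getaffinity(pid) == {0}

# Restore the original mask afterwards:
os.sched_setaffinity(pid, original)
```

Running an encode with the mask restricted, then watching per-core usage, would show directly whether the remaining cores climb toward 100% or whether something else is the limiter.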

Damon Gaskin
August 5th, 2007, 02:04 PM
LOL, Jon, it's funny that you asked that, because when I read your last post, I realized I will have to borrow the (sp?), and I also kinda scratched my chin at that word. Spelling used to bother me as a kid, but I think, with my old tired mind, that it is "thorough"? Who knows, man, but I get your point (you know, context clues) and we aren't English teachers here... LOL. If their forum is anything like this one and most of the forums out now, a quick search would probably answer it for you. I hope you do find this out, as now you have me really curious. I did, however, do some comparative specs on Tom's Hardware before I purchased the quad. You can check a few specs there, though they don't really answer your question for your application/concern. I don't know if this will help, but here is the link:

http://www23.tomshardware.com/cpu.html

And Michael, I thought about your question for a second, and I doubt it - simply because I am using the same hard drives I had prior to the quad. The only things I have upgraded since the days of 100% utilization are the processor, and I also went to 4GB of PC6400 RAM from 2GB of PC5300. So I wouldn't think this would be possible, at least from my quick read of the situation. For my machine at this time, to be honest, I think the largest bottleneck is the optical drives. But everything is much faster, so I don't think the hard drives are bottlenecks. Because of the performance, I think this processor really is that good (and I could be wrong, but from the reviews and feedback I have read on Newegg, this is pretty normal from what people are experiencing and stating for the varying applications they use; that, along with benchmarks and other reviews, led me to the purchase of the processor). But I wouldn't think that it's bottlenecked at all. If it is, I am afraid to see what it would be like if it were not. I could see it if it didn't perform as well as my previous processor, but it's performing well above and beyond the previous one.

I am certainly not bragging; I am amazed every time I do something on this machine now. Pretty much every program, with the exception of say CS3, is open as soon as I click it, which never happened before (which others have stated also). And even the CS3 applications only take maybe 10 seconds, if that, to open, compared to probably 30-40 seconds on the previous version. I even had to reload Windows, and that only took like 5 minutes. It is just insane to me, especially considering how cheap they are now, including the Duos. Add to that, this thing runs ice cold - it's the best of both worlds!

Ken Hodson
August 5th, 2007, 02:16 PM
I have read that to take full advantage of quad or greater CPUs, you need a lot of RAM. I watched a tutorial video for Nucleo 2 by GridIron Software - for those not familiar, this is render software that makes advanced use of multi-core systems for large speed gains. It was mentioned that to realize full potential you need 4GB minimum. They say 16GB minimum!! for an eight-core system. Ouch!! There are menu options to reduce the number of cores used, to alleviate system and memory drains. It makes sense if you think about dividing 2GB between 4 cores. The Intel systems also do not have an integrated memory controller like the new AMD quads will, so even at 1333 FSB there is a bottleneck (1066 FSB even worse). More RAM will help keep each core happier.
So the bottom line, IMO: first you need the right software to take advantage of the cores, then you need the memory to handle it. If you have that combo, you can really blow the doors off what was possible before.

http://www.gridironsoftware.com/NucleoPro/Default.asp?Page=NucleoPro_Tutorials
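The "dividing 2GB between 4 cores" point is easy to see as arithmetic - a trivial sketch, using the figures quoted above:

```python
# Naive per-core RAM split, using the figures from the post above.
def gb_per_core(total_gb, cores):
    """RAM each core gets if memory is divided evenly across cores."""
    return total_gb / cores

print(gb_per_core(2, 4))   # 0.5 GB per core on a 2GB quad system
print(gb_per_core(4, 4))   # 1.0 GB per core at the suggested 4GB minimum
print(gb_per_core(16, 8))  # 2.0 GB per core on the suggested 8-core config
```

The even split is of course a simplification - cores share one pool and one front-side bus - but it shows why the recommended minimums scale with core count.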

Michael F. Grgurev
August 5th, 2007, 02:34 PM
Fair enough. I was just wondering, because in the past on my P4 I wasn't experiencing full CPU usage when doing things far simpler, yet not realtime. I think it was because of my poor hard drive setup, which I think I've since taken care of. While yours is far from poor, my rationale in talking about a bottleneck wasn't that the processors perform worse, but that they perform so well that they demand more speed from the hard drives, given you're working with multiple video tracks. My rationales tend to be ill-informed guesses, though :}

Damon Gaskin
August 5th, 2007, 02:56 PM
Michael, in a way I think you're completely right, and I agree with you - but from the standpoint of the RAID and the other motherboard I mentioned. I believe pretty confidently that if I got another motherboard with another two connectors I could add to the RAID - making it, for example, a 1TB RAID 0 if I added two more 250GB hard drives - my layering capabilities would be even higher than mentioned above. How much higher, I am not sure, to be honest. But I do agree with you, and I really think the processor could handle it.

And I am not sure how heavy Jon wished to go with his layers. From what he stated, I am not sure if he was interested in anything larger than a simple RAID 0 or not, but this is definitely something to consider, as he stated he uses multicam a lot. I do also, and haven't experienced a problem, but better safe than sorry. I just recently began thinking about adding a few more drives for the same reason you are speaking of. The issue previously was always the CPU and RAM, IMHO, but now that those are strong as hell, my mindset is on the drives and their performance. It's like a vicious circle... LOL. A fun one, but still..

For a while I was also exporting to another drive (I still do, but this is a bulk drive I use for everything, such as motherboard drivers and the like; the only dedicated things on it were files for creating DVDs and exports from the timeline). I am not sure if this was really any faster, to be honest, because I added another DVD burner to the machine, which took up the last SATA slot.

These days, there is a lot to consider with HDV: the stronger processors, RAM performance, RAID, etc. It's definitely more performance, but a lot to consider that I - and I am sure others - didn't have to when we began building our machines.

But with more drives, I am sure it would scream even more. You have me thinking about it! I need to order that board before I no longer can...

D

Jon McGuffin
August 5th, 2007, 04:47 PM
I'm currently using a 640GB RAID 0 made up of two 320GB drives. In this next system, I'm going to go with three 320GB drives in RAID 0, which should increase performance - not to mention I'll be buying slightly better drives.

Jon

Glenn Thomas
August 10th, 2007, 06:42 AM
Apologies, I'm too lazy to read through this entire thread, but I would just like to know if I've understood this correctly: a newer dual core CPU running at, say, 3GHz on a 1333MHz FSB will enable better real-time performance until quads have better support, but a quad core CPU such as the Q6600 on a 1066MHz FSB would still enable faster rendering, providing the editing app is multi-threaded and can access all 4 cores?

Stephen Armour
August 10th, 2007, 09:54 AM
That's interesting, guys, because my Q6600 idles at 25C and hasn't gone over 42C since I've had it. That includes encoding (which, by the way, only uses at most 42% of CPU power so far). One thing to keep in mind with any machine is cooling in the case. The Antec 900 case I have now lowered my CPU temps by at least 10C, from what I remember. And yes, it does overclock really easily. I played with mine to see how high it would go without problems, and easily took it to 2.86GHz on air with the stock cooler; if I remember right it hit 32C idle and 44C when encoding. I currently have it at 2.6GHz and 26C as I type this.


Just my .02..
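For reference, those overclocked speeds follow directly from how Core 2 clocks work: core clock = FSB base clock x multiplier, and the Q6600's multiplier is locked at 9x, so overclocking means raising the 266MHz base. A quick sketch; the FSB values are back-calculated from the speeds quoted above, not settings anyone posted.

```python
# Core 2 clock = FSB base clock x multiplier. The Q6600's multiplier
# is locked at 9x, so overclocking means raising the 266MHz base.
# FSB values below are back-calculated from the speeds quoted above.

def core_clock_ghz(fsb_mhz, multiplier=9):
    """Resulting core clock in GHz for a given FSB base clock in MHz."""
    return fsb_mhz * multiplier / 1000.0

print(core_clock_ghz(266))  # stock, ~2.4GHz
print(core_clock_ghz(289))  # ~2.6GHz
print(core_clock_ghz(318))  # ~2.86GHz on air
```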

Damon, can you post your full system config? Our Q6600 runs a LOT hotter than that; almost double your temps, as a matter of fact. We've changed heatsink compound 3 times thinking it must have been bad, but this stock cooler really seems to be lousy. I can't even get the fan to run higher than 2500 rpm with this Intel D975XBX2 motherboard. We've got a fast RAID 0 setup for the previews, but are still getting audio stutters on the timeline.

We are editing in CS3 with CineForm Prospect at 1920x1080 resolution, often with multilayered HD video, so there is a huge amount of data running through the bus.
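To put a rough number on "a huge amount of data": a back-of-envelope sketch with an assumed per-stream rate. CineForm 1080 intermediates run in the tens of MB/s; the 15MB/s figure here is illustrative, not a codec spec.

```python
# Back-of-envelope disk/bus bandwidth for a multilayer 1080 timeline.
# per_stream_mb_s is an assumed figure; CineForm 1080 intermediates
# are typically in the tens of MB/s per stream, depending on quality.

def timeline_mb_s(layers, per_stream_mb_s=15.0):
    """Aggregate read bandwidth for n simultaneous video layers."""
    return layers * per_stream_mb_s

for layers in (1, 2, 4):
    print(f"{layers} layer(s): ~{timeline_mb_s(layers):.0f} MB/s sustained reads")
```

Even at these assumed rates, four simultaneous layers demand a sustained read rate that a single 2007-era drive can barely hold, which is where the RAID 0 preview array earns its keep.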

We're not happy with the Matrox Parhelia either, and are looking for a better card (any suggestions, anyone?).

Since we also need a new system for After Effects, I'm looking hard for a better setup. The quad is definitely better than our old dual Xeon, but I'm not seeing the big speed gains I'd imagined. Like I said, it makes me wish for a V-8 (dual quads), if that would help.

Jon McGuffin
August 10th, 2007, 12:37 PM
Apologies, I'm too lazy to read through this entire thread, but I'd just like to know if I've understood this correctly: a newer dual core CPU running at, say, 3GHz on a 1333MHz FSB will give better real-time performance until quads have better support, but a quad core CPU such as the Q6600 on a 1066MHz FSB will still render faster, provided the editing app is multi-threaded and can use all 4 cores?

Glenn,

The second part is definitely true. From a rendering standpoint, quad cores (in Vegas at least) are definitely faster than their dual core counterparts; roughly TWICE as fast as an equivalently clocked dual core.

The jury is still out on whether a 3GHz dual core gives better playback performance on the timeline than a 2.4GHz quad core. I suspect it may, but then again, if Vegas is *truly* taking advantage of all 4 cores, maybe the quad is better here as well. I do know several people with quads who have said that after applying color correction, sharpening, and other effects on an HDV timeline in Vegas, playback performance suffers. How much is unknown. I certainly see it with my dual core E6600.
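One way to frame the tradeoff is Amdahl's law: which chip wins depends on how much of the playback/render pipeline actually runs in parallel. A rough sketch; the parallel fractions are illustrative assumptions, not measurements of Vegas.

```python
# Amdahl's law sketch of dual 3.0GHz vs quad 2.4GHz: which wins
# depends on the fraction of the workload that actually parallelizes.
# The parallel fractions iterated below are illustrative, not measured.

def relative_speed(clock_ghz, cores, parallel_fraction):
    """Effective throughput, in arbitrary clock-weighted units."""
    serial = 1.0 - parallel_fraction
    return clock_ghz / (serial + parallel_fraction / cores)

for p in (0.5, 0.9, 0.99):
    dual = relative_speed(3.0, 2, p)
    quad = relative_speed(2.4, 4, p)
    winner = "dual" if dual > quad else "quad"
    print(f"parallel={p:.2f}: dual={dual:.2f} quad={quad:.2f} -> {winner}")
```

Under these assumptions the higher-clocked dual wins when only about half the work is threaded, and the quad pulls ahead once most of it is, which matches the "twice as fast rendering, unclear timeline playback" pattern people are reporting.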

I'm really trying to get this question answered but nobody seems to know.

Jon

Jon McGuffin
August 10th, 2007, 12:40 PM
Damon, can you post your full system config? Our Q6600 runs a LOT hotter than that; almost double your temps, as a matter of fact. We've changed heatsink compound 3 times thinking it must have been bad, but this stock cooler really seems to be lousy. I can't even get the fan to run higher than 2500 rpm with this Intel D975XBX2 motherboard. We've got a fast RAID 0 setup for the previews, but are still getting audio stutters on the timeline.

We are editing in CS3 with CineForm Prospect at 1920x1080 resolution, often with multilayered HD video, so there is a huge amount of data running through the bus.

We're not happy with the Matrox Parhelia either, and are looking for a better card (any suggestions, anyone?).

Since we also need a new system for After Effects, I'm looking hard for a better setup. The quad is definitely better than our old dual Xeon, but I'm not seeing the big speed gains I'd imagined. Like I said, it makes me wish for a V-8 (dual quads), if that would help.


Hmmm... Maybe try a new motherboard (Gigabyte or Asus, perhaps?), and find somebody with Vegas, a quad core, and Cineform NeoHD to see how performance there compares to what you're seeing in Adobe.

Jon

Glenn Thomas
August 11th, 2007, 12:11 PM
Jon, I'm running a dual core E6600 as well and trying to decide whether it would be worth upgrading to a Q6600, especially now that the prices have dropped.

Jon McGuffin
August 11th, 2007, 01:06 PM
I think if you're making a living with your editing machine, then time is money and the quad core would *definitely* be worth the upgrade for the "twice as fast" rendering times alone. If you're a hobbyist or "semi" professional, I'm not sure I'd bother upgrading at this point. We're probably no more than 9-12 months away from much faster and better quad cores, by which time much more software will actually take advantage of them.

I have an E6600 now, and if I weren't in the market for a second, new system, I wouldn't bother with the quad yet.

Damon Gaskin
August 12th, 2007, 05:12 PM
Hi Stephen and Jon.

Stephen:

System Specs:

Abit AB9 (non-Pro version)
Q6600 (old stepping) with stock Intel cooler
4GB OCZ PC6400
Nvidia 7600GS
Mushkin Enhanced power supply (550W)
System drive: Western Digital 36GB Raptor (or 80GB 7200.10 SATA)
Video RAID 0: 2x Seagate Barracuda (7200.10 SATA versions)
Export drive: 200GB Seagate 7200.10 SATA
Matrox RTX2 (3.0 drivers)
Ethernet Card
Antec 900 Case

That's pretty much my setup. To be honest, the case has a lot to do with the cooling; that along with cable routing, but mostly the airflow through the case. Before I moved to the Antec case it was indeed running warmer, but not nearly as warm as you're saying yours is. To be honest, my Pentium D ran the temps you're experiencing, but from what I've read I wouldn't expect a Core chip to run that hot at all. I also alternate the 80GB drive and the Raptor as system drives when I need to reinstall Windows (I haven't figured out which one is truly faster yet, because the Raptor's capacity gets eaten up quickly).

The fan on my heatsink doesn't go over 2220 rpm from what I can remember. A lot also depends on your ambient temp, which determines what you're working with as far as temps are concerned. I hope you don't take this the wrong way, but what kind of airflow do you have going through your case? And is your heatsink seated correctly? With those temps, to be honest, it doesn't sound like it is; then again, you're in Brazil, which from my geography is a pretty hot region. Does the fan wiggle when you attempt to move it by hand? Something really doesn't seem right.

But if your ambient is 100F, there's only so much cooling you can do with the stock HSF. You may also want to consider an aftermarket HSF if your ambient is really high or airflow through the case is poor. With a high ambient I wouldn't even think about using the stock HSF; to me that's just asking a lot (assuming your ambient really is that high).

Also, to be honest, I won't comment on rendering, simply because I'm using the Matrox editing board, so I don't run into the same rendering experiences you guys would have with software alone. But given those temps, you may want to consider some sort of upgraded cooling. Something I learned with my older AMDs and the Pentium D is that the cooler I kept them, the better they ran. I'm not sure what the thermal setup in your BIOS is, but maybe it's throttling because of the higher temps? It shouldn't be at that low a temp, but then again, if your rendering temps are getting into the 60s and 70s, I suppose it's possible.

All of these are just stabs in the dark, because it's not just the CPU HSF that keeps it cool; it's pretty much the whole setup, or at least that's my thought.

John Welsh
August 13th, 2007, 02:25 PM
I think if you're making a living with your editing machine, then time is money and the quad core would *definitely* be worth the upgrade for the "twice as fast" rendering times alone. If you're a hobbyist or "semi" professional, I'm not sure I'd bother upgrading at this point. We're probably no more than 9-12 months away from much faster and better quad cores, by which time much more software will actually take advantage of them.

I have an E6600 now, and if I weren't in the market for a second, new system, I wouldn't bother with the quad yet.

I read somewhere that they (Intel's 45nm chips) are coming in December. I think you'd better wait for presents under the Christmas tree ;D. There are also rumours flying (posted on the German site gamestar.de) that the new AMD quads are supposed to be very, very good and are expected by the end of the year. And I'd wait, since the current Q's are on the 1066MHz bus.

Tom's Hardware did a great CPU comparison; check their website for it.

regards

Jon McGuffin
August 13th, 2007, 04:08 PM
Yes, the 45nm chips ("Penryn") are coming out and may actually be here before December... So I think we should wait until then... but wait, when that time hits there will be the next CPU, then the next... Let's just never buy a computer again... :)

Two months ago was a bad time to buy a CPU, because of what Intel has just done. We're actually at a VERY good point to buy a CPU right now: prices won't change dramatically over the next 3-5 months, so your investment is pretty well protected. How would you like to have been one of those guys who dropped $650-$700 on a Q6600 just two months ago, only to see it at $285 today? Yikes. Those kinds of radical changes are behind us for now.

I (an AMD fan) don't have much confidence in AMD's ability to actually out-duel Intel for a long time. That's unfortunate, because AMD is a good company, and frankly they're to thank for what we now have from Intel. If you're in the market for a computer, I wouldn't hesitate; this is one of the best times in the past 3-4 years to get the most bang for the buck...

Jon

John Welsh
August 14th, 2007, 01:01 PM
I love AMD too; I have a 3000+ that runs at 2800.
I don't think you can say AMD will never beat Intel's Core. As I said, those are NEW AMD CPUs on a 45nm process. Maybe it will be the same as when the Athlon 64 came out, beating the Pentium Prescott (or whatever it was called) in almost all tests.

Yes, the prices won't drop until the new chips come; I even reckon they'll be more expensive in a few months. And when the new ones come, the old ones won't lose much in price (I think $20 at most for each, a bit more for the higher models). As some experts say, the 45nm chips will be monsters.

Ken Hodson
August 14th, 2007, 05:16 PM
AMD will also be pushing 8-core (octa-core) setups: 2x quad core. These should be fantastic compared to Intel's 8 cores, as Intel will still be struggling with their FSB limitations. Even at 1333MHz FSB, eight cores (and currently even four) will be choking for bandwidth. AMD, with their integrated memory controller, will again put the boots to Intel's older architecture. It will be A64 vs. P4 all over again. But AMD had better hurry, as Intel is selling a boatload of chips right now.