September 7th, 2012, 02:12 PM | #1
Wrangler
Join Date: May 2003
Location: Eagle River, AK
Posts: 4,100
nVidia GeForce vs Quadro on PC Creative Suite Systems
NOTE: This thread was split out as a new topic from http://www.dvinfo.net/forum/adobe-cr...ml#post1752091
Quote:
Mercury Playback Engine | Adobe Premiere Pro CS6 - Adobe.com

And of course the very simple text file hack that you mentioned does greatly expand the range of cards that will actually work with Mercury GPU acceleration, even if not officially certified. I wasn't sure of your meaning in the comment about 10-bit color. Can you confirm that, on a PC, a newer GeForce card will support 10-bit color via DisplayPort and/or dual-link DVI? I may be doing a build soon and might like to give GeForce a try vs. the older Quadro I'm using on my main editing box.
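For anyone who hasn't run across that hack: in CS5/CS6, Premiere simply checks the GPU name against a plain text file in its install folder. Here's a rough Python sketch of the idea. The path is the usual Windows CS6 default and the card name is only a placeholder, so verify both on your own system (GPUSniffer.exe in the same folder reports the exact name), and back the file up first:

[CODE]
from pathlib import Path

# Usual CS6 location on Windows -- adjust for your install (assumption).
CARDS_FILE = Path(r"C:\Program Files\Adobe\Adobe Premiere Pro CS6\cuda_supported_cards.txt")

# Your card's name exactly as the driver reports it; "GeForce GTX 670"
# is only a placeholder here.
MY_CARD = "GeForce GTX 670"

lines = CARDS_FILE.read_text().splitlines()
if MY_CARD not in lines:
    # Append the card so Mercury GPU acceleration becomes selectable.
    # (Writing under Program Files usually needs an elevated prompt.)
    CARDS_FILE.write_text("\n".join(lines + [MY_CARD]) + "\n")
    print(f"Added {MY_CARD!r}")
else:
    print(f"{MY_CARD!r} already listed")
[/CODE]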
__________________
Pete Bauer
"The most beautiful thing we can experience is the mysterious. It is the source of all true art and science." - Albert Einstein
Trying to solve a DV mystery? You may find the answer behind the SEARCH function ... or be able to join a discussion already in progress!
September 8th, 2012, 11:19 AM | #2
New Boot
Join Date: Aug 2012
Location: Dallas
Posts: 10
Re: CS6 Mac vs. Windows Difference
My GeForce GTX 670 can handle 10-bit color, kind of. I'll explain.

My previous graphics card was a Quadro CX. I use SpeedGrade for coloring. When working in SpeedGrade, both the Quadro and the 670 work fine. When rendering out of SpeedGrade and importing the DPX frame sequence into Premiere, both cards also work fine, and both render out of Premiere to the final format without trouble. However, when actually previewing the 10-bit DPX sequence on the timeline within Premiere, the 670 stutters and skips the footage a bit, while the Quadro plays it seamlessly.

I've been told the issue is the drivers. GeForce drivers are tuned toward DirectX and are meant for tearing through games, not editing video. Quadro drivers are tuned more toward OpenGL and OpenCL, which work well for editing video but chug a bit if you try to play a modern game. Unfortunately, given the market separation, it is unlikely that Nvidia will ever release a GeForce-priced card tuned more toward OpenGL and OpenCL than DirectX.
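By the way, if you want to double-check that the frames you're feeding Premiere really are 10-bit, the DPX header records the bit depth directly. A quick Python sketch; the byte offsets follow the published SMPTE 268M header layout, but treat them as an assumption and verify against your own files:

[CODE]
def dpx_bit_depth(path):
    """Return the bit depth of the first image element in a DPX file.

    Offsets assume the standard SMPTE 268M layout: magic number at
    byte 0, bit-size byte of image element 1 at offset 803.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic not in (b"SDPX", b"XPDS"):  # big- / little-endian DPX
            raise ValueError(f"{path} is not a DPX file")
        f.seek(803)
        return f.read(1)[0]  # bits per component (8, 10, 12, 16, ...)

# Example (hypothetical filename): expect 10 for a 10-bit graded frame.
# print(dpx_bit_depth("graded_0001.dpx"))
[/CODE]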
September 8th, 2012, 03:27 PM | #3
Wrangler
Join Date: May 2003
Location: Eagle River, AK
Posts: 4,100
Re: CS6 Mac vs. Windows Difference
John, thanks again for that insight, although I got more than I bargained for.
It is a little surprising, but much appreciated, to get a reliable end-user report from someone who has used both types of cards and found the GeForce giving degraded performance versus an older Quadro. Granted, yours is a single report, but it does call the conventional wisdom into question.

I was originally just wondering about the 10-bit color, since I have a Quadro FX 4800 (perfectly serviceable, but maybe getting a bit long in the tooth by now) and bothered to buy IPS monitors to make use of its 10-bit output. Some very knowledgeable folks have advocated GeForce over Quadro on the assumption that more cores for lower cost is better bang for the buck, that if it ain't crashing the drivers don't really matter, etc, etc. Not having heard complaints from the GeForce user base, and assuming that a much cheaper GeForce would outperform on the timeline and still give me 10-bit color, I was thinking about trying a newer GeForce in an upcoming build. But your experience makes me hesitate.
__________________
Pete Bauer
"The most beautiful thing we can experience is the mysterious. It is the source of all true art and science." - Albert Einstein
Trying to solve a DV mystery? You may find the answer behind the SEARCH function ... or be able to join a discussion already in progress!
September 9th, 2012, 02:57 AM | #4
Trustee
Join Date: Aug 2006
Location: Rotterdam, Netherlands
Posts: 1,832
Re: nVidia GeForce vs Quadro on PC Creative Suite Systems
John,
Your remark may be interpreted the wrong way. Output to a monitor is 8-bit on all GeForce cards, including the GTX 680. Output to a monitor is either 8- or 10-bit on all Quadro cards, depending on the port. Only if you have a true 10-bit monitor, like an Eizo ColorGraphic, HP DreamColor or Dell PremierColor, does it make sense to consider a Quadro. The bulk of even IPS panels are 8-bit, so in those cases there is no benefit from a Quadro card. On the contrary: Quadros are limited to two monitors max, while the GTX 6xx cards can handle four monitors.

For editing with PR, MPE performance is largely dependent on the memory bandwidth of the video card. The GTX 680 has a memory bandwidth of 192.25 GB/s, a Quadro 4000 has 89.6 GB/s, a Quadro 5000 has 120 GB/s and the Quadro 6000 has 144 GB/s. So for MPE performance in PR, you are best served with a GTX 680.

I have stated this on a site where I'm building and documenting a new system:

Video card
The much-touted Maximus solution, at least by Adobe, is an utter waste of money, because it requires a very expensive Quadro card plus a Tesla C2075 card that is even slower than a simple, two-generations-old GTX 470, for the simple reason that it lacks the CUDA cores and memory bandwidth to be faster. We have several Maximus solutions in the current PPBM5 benchmark, and despite prices of up to €6,000 for a Quadro 6000 plus a Tesla C2075, these solutions are easily outperformed by many GTX 470/480/570/580/670/680 cards that cost only a fraction. The only thing impressive about a Maximus solution is the price. The only reason to opt for one, despite the cost, is if you absolutely need 10-bit output to your very expensive 10-bit monitors; even then you only gain quality, not performance.

What determines video card performance for PR? Based on the information currently available, it is not so much the number of shaders or CUDA cores. If that were the case, the Kepler cards, with about three times the number of CUDA cores, would easily leave Fermi and older cards in the dust, but that is not what we see. The determining factor is memory bandwidth, which explains why the Kepler range is only slightly faster than Fermi and why Kepler is significantly faster than all Quadro cards.

Hope this clears things up a bit.
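P.S. If you want to verify these bandwidth numbers yourself, the arithmetic is simple: bandwidth = effective memory clock x bus width / 8. A quick Python sketch; the clock and bus figures are taken from published spec sheets, so double-check them before relying on the output:

[CODE]
# Bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000
cards = {
    # name: (effective memory clock in MHz, memory bus width in bits)
    "GTX 680":     (6008, 256),
    "GTX 470":     (3348, 320),
    "Quadro 4000": (2800, 256),
    "Quadro 5000": (3000, 320),
    "Quadro 6000": (3000, 384),
}

for name, (mclk_mhz, bus_bits) in cards.items():
    print(f"{name:12s} {mclk_mhz * bus_bits / 8 / 1000:6.1f} GB/s")

# GTX 680      192.3 GB/s   (the ~192.25 GB/s quoted above)
# GTX 470      133.9 GB/s
# Quadro 4000   89.6 GB/s
# Quadro 5000  120.0 GB/s
# Quadro 6000  144.0 GB/s
[/CODE]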
September 9th, 2012, 03:05 AM | #5
Major Player
Join Date: May 2005
Location: Stockholm - Sweden
Posts: 344
Re: nVidia GeForce vs Quadro on PC Creative Suite Systems
AFAIK, no GeForce card can output 10-bit color to a preview monitor, only 8-bit, while the Quadro cards can output 10-bit.

GeForce = if you don't need 10-bit preview. Quadro = if you need 10-bit preview. Both cards can handle 10-bit video internally, but only the Quadro gives you 10-bit output.

EDIT: Harm beat me...
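If you want to test what your own card and driver actually grant, you can request a 30-bit framebuffer and see what comes back. A rough Python sketch using the glfw and PyOpenGL packages; it assumes a compatibility-profile GL context (where GL_RED_BITS is still queryable) and says nothing about what the monitor itself can display:

[CODE]
import glfw
from OpenGL.GL import glGetIntegerv, GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS

# Ask the driver for a 10-bits-per-channel framebuffer.
glfw.init()
glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)  # no need to show the window

window = glfw.create_window(64, 64, "10-bit probe", None, None)
if window is None:
    raise RuntimeError("could not create a GL context")
glfw.make_context_current(window)

# Query what we were actually given; consumer drivers typically clamp to 8.
bits = [int(glGetIntegerv(ch)) for ch in (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS)]
print("R/G/B bits granted:", bits)

glfw.terminate()
[/CODE]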
__________________
/Roger
September 9th, 2012, 06:57 AM | #6
Wrangler
Join Date: May 2003
Location: Eagle River, AK
Posts: 4,100
|
Re: nVidia GeForce vs Quadro on PC Creative Suite Systems
Thanks for the additional info, gents. For both stills and video, I really want to stay with 10-bit output, so at least for me that settles the issue.
Even if my mid-cost IPS monitors actually display in 8-bit, the visual difference between them and a standard business/consumer display, both driven by the same Quadro card, is huge. The difference may owe more to the better panels than to the bit depth of the card's output, but I'd rather feed them the best signal my software and computer can provide.

I think the just-announced Kepler Quadro K5000 will be able to drive up to four displays, although it'll lighten the wallet. I'll be interested to see whether its performance in PPro measures up to expectations.
__________________
Pete Bauer
"The most beautiful thing we can experience is the mysterious. It is the source of all true art and science." - Albert Einstein
Trying to solve a DV mystery? You may find the answer behind the SEARCH function ... or be able to join a discussion already in progress!