Best GPU choice for the money

Getting started with a Blackmagic product? Ask questions here about setup and installation.

Bob Jackson

  • Posts: 8
  • Joined: Tue Sep 16, 2014 12:10 am

Best GPU choice for the money

Posted: Mon Feb 02, 2015 1:17 am

Hello,

First of all, I hope this is the right section for this question. I recently finished a new PC build for photo and video editing. I ended up buying a $30 GPU just to get the thing running, because this was right around the time the GTX 980 was coming out, and I wasn't sure what to get.

I will be using Magic Lantern for RAW video on my 5D Mark III, and grading in Resolve (Lite for now). I was set on getting a 980 at first, but I keep reading that the price jump from the 970 is not worth it at all. They have the same VRAM, and I'm told that the difference in the number of CUDA cores won't make much of a difference. On top of that, there are now a bunch of reports of the GTX 970 not being able to use all 4GB of its VRAM, and really slowing down past 3.5GB. These reports are mostly from gamers though. Will this issue cause problems in Resolve?

Also, now that the GTX 960 is out, I'm wondering if I should get two in SLI (but I heard Resolve Lite only uses one GPU?). And if I do upgrade from Lite and have two cards in SLI, does the number of CUDA cores stack, or does it work like the VRAM does?

Money is a bit tight, which is why I was happy to go with the 970 (maybe a superclocked EVGA or something), but now that there is this problem with the card not using all its VRAM I'm not sure what to get.

If anyone has any advice it would be much appreciated.

Other PC specs are:
i7-5930K
32GB DDR4
Asus X99 Deluxe mobo

James_Caldwell

  • Posts: 18
  • Joined: Sun Feb 01, 2015 11:42 pm

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 2:44 am

AMD/ATI is supposed to be releasing new GPUs with 3D VRAM that are massively faster than anything on the market, which will force Nvidia to drop prices until they get their own 3D-VRAM cards out later. But you know, that's the whole waiting game thing...

The best price-to-performance card in the high-end market right now is Nvidia's GeForce GTX 970, at around $330.

If your motherboard supports SLI, you can get two of these linked up for under $700 for CUDA processing.

Bill Underhill

  • Posts: 130
  • Joined: Sat Aug 09, 2014 3:30 am
  • Location: USA

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 4:57 am

I don't have a GPU recommendation, but I know for sure CUDA totals do not stack in SLI. If you've got 1000 CUDA cores per GPU, then you basically have two lanes of GPU power with 1000 CUDA cores per lane.

Even dual GPUs on a single card don't stack CUDA cores.

I'm pretty sure there are no video editing applications out there that take advantage of dual GPUs. Best to go with the most powerful single GPU you can get your hands on.

Bob Jackson

  • Posts: 8
  • Joined: Tue Sep 16, 2014 12:10 am

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 7:42 am

Thanks for the info about the CUDA cores, I wasn't 100% sure how that worked. So SLI 960s are out of the question. Any opinions on whether the 500 or so extra CUDA cores on the 980 are worth an extra $200?

adamroberts

  • Posts: 4538
  • Joined: Wed Aug 22, 2012 5:27 am
  • Location: England, UK

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 9:09 am

Resolve Lite can work with 2 cards but can use only one GPU for processing. So if you had 2 cards, one card would be used to drive your display(s) and the other is then freed up to do image processing.

If you only had one card then some of the card's resources would be used to drive the monitors, leaving less available for GPU processing.

Resolve (and most graphics apps) does not work with SLI. SLI is for gaming. Resolve uses the GPU in a different way to how a game engine uses it.

Can't comment on the 970 as I don't own one. I'm running 2 x 4GB 670 cards in my system.

James_Caldwell

  • Posts: 18
  • Joined: Sun Feb 01, 2015 11:42 pm

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 2:32 pm

adamroberts wrote: Resolve Lite can work with 2 cards but can use only one GPU for processing. So if you had 2 cards, one card would be used to drive your display(s) and the other is then freed up to do image processing.

If you only had one card then some of the card's resources would be used to drive the monitors, leaving less available for GPU processing.

Resolve (and most graphics apps) does not work with SLI. SLI is for gaming. Resolve uses the GPU in a different way to how a game engine uses it.

Can't comment on the 970 as I don't own one. I'm running 2 x 4GB 670 cards in my system.


Actually, SLI isn't really 'for gaming'. There are issues with anything requiring low latency because both of the cards have to sync up, i.e. 'micro-stutter' or microseconds of lag.

SLI is most beneficial in high-latency workloads where the CPU can feed and manage the GPUs. At best, two cards are roughly equivalent to 1.5 cards, which at $700 is much better than the top GPU out right now at well over $1000.

Node-based rendering, folding for scientific research, or bitcoin are better uses than gaming. This could be applied to rendering video or any complex pixel operations, especially with large data sets, i.e. 5K, 6K+ at high frame rates (120).

The only real question is the market/engineering call on the developer's end, i.e. BMD might prefer to come out with a specific hardware card that they deem more profitable, niche, or performance-tweaked.

But the future is volumetric pixels, or voxels, and all cameras will be 3D cameras. Post-production will literally be sculpting through shader operations. Soon after, we'd probably have node-based capturing where you would place a pole in the back to get a full 3D scan. Later that would probably be replaced with a flying drone.

The point is that, mathematically, pixel operations on video cards are the thing to embrace long-term, because much rendering and 3D/2D content will be blended together in the future (because, duh, there is no 2D anymore, only 3D).

David Green

  • Posts: 228
  • Joined: Thu Jul 03, 2014 8:12 pm
  • Location: BC Canada

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 5:00 pm

Bob Jackson wrote: Any opinions on whether the 500 or so extra CUDA cores on the 980 are worth an extra $200?


The 970 has 1664 cores, the 980 has 2048, which is 384 additional cores.
The 980's Base Clock, Boost Clock, and Fill Rate are also higher.
So the theoretical Compute performance of the 980 should be around 20% faster than the 970's.
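
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The clock figures are approximate reference boost clocks rather than measured values, and these are peak numbers only; real Resolve performance also depends on memory bandwidth and the rest of the system, which is why "around 20%" is a fair real-world expectation.

Code:
# Rough estimate only: approximate reference boost clocks, peak throughput only.
cores_970, cores_980 = 1664, 2048
clock_970, clock_980 = 1178, 1216   # MHz, approximate reference boost clocks

core_ratio = cores_980 / cores_970
peak_ratio = (cores_980 * clock_980) / (cores_970 * clock_970)

print(f"extra cores alone:            +{(core_ratio - 1) * 100:.0f}%")  # ~23%
print(f"cores x clock (peak compute): +{(peak_ratio - 1) * 100:.0f}%")  # ~27%
# Real-world gains are usually a bit lower than these peak figures.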

There is typically an advantage to having independent GUI and Compute GPUs with Resolve.
David R. Green - Demenzun Media Inc. - Author Composer Filmmaker Programmer

James_Caldwell

  • Posts: 18
  • Joined: Sun Feb 01, 2015 11:42 pm

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 10:27 pm

One of the things you have to consider when dealing with large data sets is that for $700 or less (970 SLI) you can get 8GB of VRAM to store all that data.

The equivalent 8GB card is a Quadro M6000 for about $2000.

A 980 has almost the same performance as an overclocked, water-cooled 970 (same amount of VRAM) but costs $234+ more.

The 980 is more for games and real-time performance.
----------------------------------------------------

Think of it this way:

980 = $534 4GB SD card

970 SLI = $700 8GB SD card with slightly faster read.

Which would you rather work with in video?

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Mon Feb 02, 2015 11:16 pm

Resolve ignores SLI, so it will only be seen as two cards with 4GB of VRAM each.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Bob Jackson

  • Posts: 8
  • Joined: Tue Sep 16, 2014 12:10 am

Re: Best GPU choice for the money

Posted: Tue Feb 03, 2015 1:42 am

It sounds like a 970 is the best choice for me right now. Hopefully later this summer I'll be able to afford a second card and run just the monitors from that. Thanks to everyone for the help.

David Green

  • Posts: 228
  • Joined: Thu Jul 03, 2014 8:12 pm
  • Location: BC Canada

Re: Best GPU choice for the money

Posted: Tue Feb 03, 2015 1:47 am

James_Caldwell wrote: One of the things you have to consider when dealing with large data sets is that for $700 or less (970 SLI) you can get 8GB of VRAM to store all that data.


+1 What Adam said.

SLI's goal is to divide the workload across multiple GPUs using distributed rendering of frames under DirectX or OpenGL.
In other words, alternate frame rendering, where GPU_0 processes frame_0 while GPU_1 processes frame_1.

This is not how Compute (OpenCL/CUDA) typically uses the GPUs.
Compute is more generalized mathematics; the objects being worked on don't have to be frames, so "distributed rendering of frames" is essentially meaningless to Compute.

To leverage multiple GPUs for Compute (OpenCL for example), command queues are created for each GPU device, and the workload is split across the kernels run on each device.
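
As a very rough sketch of what that looks like (Python with pyopencl, a made-up 'gain' kernel, and assuming pyopencl plus at least one OpenCL-capable GPU is present; this is not Resolve's actual code):

Code:
import numpy as np
import pyopencl as cl

# Eight fake single-channel UHD frames standing in for real footage.
frames = np.ones((8, 2160, 3840), dtype=np.float32)

kernel_src = """
__kernel void gain(__global float *img, const float g) {
    int i = get_global_id(0);
    img[i] *= g;
}
"""

# Compute sees discrete devices, not an SLI group: enumerate each GPU individually.
gpus = [d for p in cl.get_platforms() for d in p.get_devices(cl.device_type.GPU)]
ctxs = [cl.Context([d]) for d in gpus]             # one context per device
queues = [cl.CommandQueue(c) for c in ctxs]        # one command queue per device
progs = [cl.Program(c, kernel_src).build() for c in ctxs]

# Split the frames across the GPUs. Each device holds only its own share in its
# own VRAM, which is why two 4GB cards never behave like one 8GB card.
for i, frame in enumerate(frames):
    g = i % len(gpus)
    buf = cl.Buffer(ctxs[g], cl.mem_flags.READ_WRITE | cl.mem_flags.COPY_HOST_PTR,
                    hostbuf=frame)
    progs[g].gain(queues[g], (frame.size,), None, buf, np.float32(1.1))
    cl.enqueue_copy(queues[g], frame, buf)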

Therefore, an SLI setup is seen as individual discrete GPUs by Compute.
So two 4GB GPUs in SLI are treated as two 4GB GPUs; they are not treated as a single 8GB GPU with twice the number of compute cores.

And, for example, since 4K resolution plus certain effects with Resolve can require 6GB+ per GPU...
We can see where that leads us.
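
To get a feel for the numbers, here's a rough, hypothetical estimate; the node and buffer counts are made-up assumptions for illustration, not figures from Resolve itself:

Code:
# Back-of-the-envelope only; node/buffer counts are illustrative assumptions.
WIDTH, HEIGHT = 3840, 2160          # UHD frame
CHANNELS, BYTES_PER_CHANNEL = 4, 4  # RGBA at 32-bit float working precision (assumed)

frame_bytes = WIDTH * HEIGHT * CHANNELS * BYTES_PER_CHANNEL
print(f"one UHD float frame: {frame_bytes / 2**20:.0f} MiB")   # ~127 MiB

nodes = 6             # hypothetical grade with six correction nodes
buffers_per_node = 4  # hypothetical intermediate buffers per node
working_set = frame_bytes * nodes * buffers_per_node
print(f"rough working set: {working_set / 2**30:.1f} GiB")     # ~3 GiB

# Temporal effects (noise reduction, etc.) hold several whole frames at once,
# so the total climbs quickly toward the 6GB+ figure mentioned above.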
David R. Green - Demenzun Media Inc. - Author Composer Filmmaker Programmer

Leon Chen

  • Posts: 13
  • Joined: Tue Jan 27, 2015 12:37 pm

Re: Best GPU choice for the money

Posted: Tue Feb 03, 2015 4:47 pm

Sorry, but just out of curiosity: have AMD graphics cards been completely kicked out of the game?

OpenCL should work for AMD cards, but is it not good? Or did I just miss something? Thanks.

nathandumoulin

  • Posts: 135
  • Joined: Sun Dec 28, 2014 5:41 pm

Re: Best GPU choice for the money

Posted: Wed Feb 04, 2015 4:01 am

The recent controversy over the 970 and its supposed poor performance when using over 3.5GB has been completely blown out of proportion. The regression in speed occurs only within that last 0.5GB, which, when averaged over the other 3.5GB, is a marginal 3% loss. The 970 was rated as a great card prior to this debacle, and those benchmarks have not changed since. It was, and still remains, one of the best cards on the market at that price point (especially when factoring in the TDP).

Perhaps take a look at the FTW version, as it splits the difference between the reference 970 and the 980, and also has improved cooling performance.

Peter_r

  • Posts: 259
  • Joined: Thu Aug 30, 2012 4:46 am

Re: Best GPU choice for the money

Posted: Wed Feb 04, 2015 12:44 pm

It works fine for AMD/ATI, and they have 8GB single-GPU cards that scream along with Resolve.

David Green

  • Posts: 228
  • Joined: Thu Jul 03, 2014 8:12 pm
  • Location: BC Canada

Re: Best GPU choice for the money

Posted: Wed Feb 04, 2015 5:21 pm

AMD is fine for GPUs.
My Resolve computer has multiple AMD FirePro W-series video cards.
My 3D/VFX computer has an AMD FirePro W-series video card.
David R. Green - Demenzun Media Inc. - Author Composer Filmmaker Programmer

Emin Khateeb

  • Posts: 10
  • Joined: Sun Oct 18, 2015 8:17 pm
  • Location: Switzerland

Re: Best GPU choice for the money

Posted: Tue Dec 01, 2015 3:53 pm

Why is everybody recommending gaming GPUs when the Resolve documentation says that the GTX 980 is a UI-only GPU? I'm going to build a workstation next year for video editing only. Would you recommend a FirePro/Quadro, or should I go for the gaming GPUs? I want to use a single GPU for various reasons (less power consumption, less heat, etc.).

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Tue Dec 01, 2015 3:56 pm

The Quadros don't really have any extra features that Resolve uses over the GTX/Titan range, so you can get a much more powerful card for Resolve, with a lot of VRAM, for a lot less money.

If you want a powerful card then you have to sacrifice lower heat and power, as that's what it takes to get the computational power. The Quadros manage to use less power by having lower memory bandwidth and lower clock speeds, so even though a Quadro may have the same number of CUDA cores, the rest of the card tends to be slower.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Scott Smith

  • Posts: 959
  • Joined: Sat Apr 13, 2013 12:51 pm

Re: Best GPU choice for the money

Posted: Tue Dec 01, 2015 4:15 pm

I visualize the difference between the GeForce cards and the Quadro cards to be like the difference between a sports car and a tractor trailer truck. Wanna go fast? Sports car. Wanna haul a bunch of stuff? Big truck. The sports car will get you where you need to go, fast, and be much cheaper than a big truck. But if you have lots of serious hauling you need to do . . . nothing beats the truck.

I think the analogy also works for Pentium class processors vs Xeon class.
Scott R Smith
BMD Stuff I use: ATEM 2-M/E, 4 x ATEM PS 4K, Broadcast Videohub, 6 Hyperdeck Pros, 4 Hyperdeck Shuttles, Multidock, Smartscope Duo, Smartview, Intensity Extreme, Decklink Studio, and lots of Miniconverters and Open Gear Converters.

Lance Braud

  • Posts: 49
  • Joined: Wed Jul 29, 2015 3:07 am

Re: Best GPU choice for the money

Posted: Wed Dec 02, 2015 12:55 am

Emin Khateeb wrote: Would you recommend a FirePro/Quadro, or should I go for the gaming GPUs?


I'm currently dealing with this issue. Gaming GPUs will not output 10-bit color. If you have a 10-bit-capable monitor, you're going to want a FirePro/Quadro.

Resolve does use the cores of my current GPU for processing though, so I think if you have an 8-bit monitor you could go with any GPU.
Resolve Studio v16.2.3 | Nvidia GTX 1070 | v451.48 | Win10 x64 | AMD TR-1950X 16G RAM

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Wed Dec 02, 2015 5:21 am

When using Resolve you should really use a dedicated I/O card for monitoring output anyway, so 10-bit output from the GPU is not necessary for Resolve.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Emin Khateeb

  • Posts: 10
  • Joined: Sun Oct 18, 2015 8:17 pm
  • Location: Switzerland

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 8:50 am

It's not only about 10-bit color. I'm asking why you all recommend gaming GPUs while Blackmagic Design recommends only workstation GPUs. A FirePro W8100 has 8GB of video memory, while the most a gaming GPU can deliver at the moment is 6GB from a 980 Ti. The Titan X has 12GB for about the same price as a W8100, but what about the rendering performance?

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 9:52 am

For most people 6GB of VRAM is more than sufficient even when working with 4K. The GTX 980 Ti is only slightly less powerful than the Titan X when it comes to working with Resolve (2816 CUDA cores vs 3072); apart from that, the main difference is the RAM. So if you can afford the extra mainly just for the RAM, then fine, but most people recommend the GTX 980 Ti as it's the best card for the money: nearly all the power of the Titan X for a third less money.
Over here the GTX 980 Ti costs around £435 before tax for a cheaper one; the Titan X costs around £670 before tax.
The Quadro with equivalent power is the Quadro M6000, which costs just under £3300 before tax.

As to why people tend to go with Nvidia over AMD: probably because Resolve in the past has always worked better with CUDA, and AMD seem to have a poorer reputation with their drivers.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Emin Khateeb

  • Posts: 10
  • Joined: Sun Oct 18, 2015 8:17 pm
  • Location: Switzerland

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 10:46 am

This may be true for gaming GPUs. But the workstation GPU segment is a different world. Workstation GPUs target the professional market, whereas gaming GPUs are optimized for consumers. Different hardware, different drivers, different usage.

Gaming drivers are not optimized for CAD performance, as you can see in the following benchmark:
[benchmark chart image not included]

In a worst-case scenario like this, the monster Titan X will look like a cheap, crappy card with unoptimized drivers if it's used where it's not meant to be. The opposite also holds: a Quadro M6000 will perform worse in games than a Titan X. But to be honest, in most scenarios the Titan X is pretty close to the Quadro M6000.

Edit:
Oh, and by the way, right now I'm using a GTX 970, which really struggles with 4K raw as soon as I put more than three color-correction nodes on it. Heavy color correction is almost impossible.

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 1:24 pm

You are looking at the wrong scenario. Resolve is not a CAD program, and therefore doesn't use the same features. Resolve uses the CUDA cores, and it's already been proven that for use in Resolve the GTX 980 Ti and Titan X are the better cards to go with.

I have no idea why you would bring up a chart that has no relevance to usage in Resolve; all that does is confuse people. I quite clearly stated in an earlier post:
Adam Simmons wrote: The Quadros don't really have any extra features that Resolve uses over the GTX/Titan


As to the GTX 970, it's no real surprise that you get poorer performance. Its 4GB of VRAM is really 3.5GB of fast VRAM and 0.5GB of slower memory, and it also has a lot fewer CUDA cores.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Emin Khateeb

  • Posts: 10
  • Joined: Sun Oct 18, 2015 8:17 pm
  • Location: Switzerland

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 2:18 pm

At this point I want to refer to Adam Roberts' post:
adamroberts wrote: Resolve uses the GPU in a different way to how a game engine uses it.


What you're saying sounds all good and stuff, but if that's true, why does the Resolve configuration guide recommend workstation GPUs over consumer GPUs? (This is also my main question, which hasn't been answered yet.)

Look it up yourself (page 55):
http://documents.blackmagicdesign.com/D ... _Guide.pdf

They also write, "For processing UHD or 4K-DCI images we recommend 8GB or more GPU memory."
Or is that "Recommended for UI GPU" only referring to the Quadro K4200? I'm confused myself; I don't want to offend you or anything. I mean, it would be even better for me if I could buy a consumer GPU and get all the performance out of it for a lower price than a workstation GPU. But I don't want to make a mistake and then realize I could've got more if I had bought a Quadro/FirePro.

This was the reason I started googling for a good GPU, and then I stumbled upon this thread.
I just want my questions answered, which is the idea of a forum ;)

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 2:49 pm

Where in the config guide does it specifically say that it recommends workstation over consumer? The only thing I can see is a reference to the amount of VRAM on a card. In the guide, under the Nvidia section, I only see this:
Multiple alternative options are shown in no particular order


As to Adam's comment, you posted it out of context. He was referring to the fact that games can use graphics cards in SLI mode, whereas Resolve doesn't.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Emin Khateeb

  • Posts: 10
  • Joined: Sun Oct 18, 2015 8:17 pm
  • Location: Switzerland

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 3:01 pm

I'm repeating what I already wrote in my previous post: above the Quadro K4200, the Titan X and the 980, they wrote "Recommended for UI GPU". And now I don't know if they mean only the Quadro K4200 or everything below that line. Because if they mean only the Quadro K4200, why didn't they write it in brackets like they did with the FirePro W7000?

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Thu Dec 03, 2015 3:10 pm

They are referring to the K4200 card only; all the other Nvidia cards underneath that are much more powerful than the K4200.

I also don't see it written above the GTX 980, only above the K4200. Between the K4200 and the GTX 980 there's also the Titan X, and there's no way that would be recommended just as the GUI card.

As to why they didn't put it in brackets, no idea.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Jens Kapp

  • Posts: 12
  • Joined: Sun Jun 12, 2016 4:08 pm

Re: Best GPU choice for the money

Posted: Wed Jul 06, 2016 11:59 am

Is it better to have more CUDA cores or a higher GPU clock?
Put concretely:
Should I buy a GTX 1060 with ~1000 CUDA cores and a 1700MHz clock speed, or a
GTX 980 Ti with ~2000 CUDA cores but only a 1200MHz clock speed?
Thanks!
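
For reference, here's the raw arithmetic in Python using the figures above (peak compute is roughly 2 ops per core per clock). It ignores the architecture differences between Pascal and Maxwell and memory bandwidth, so it's only a starting point, not a prediction of Resolve performance.

Code:
# Raw peak-throughput arithmetic using the core/clock figures from the question.
def peak_fp32_gflops(cuda_cores: int, clock_mhz: int) -> float:
    # roughly 2 floating-point ops per core per clock (fused multiply-add)
    return 2 * cuda_cores * clock_mhz / 1000.0

print(f"GTX 1060 (figures above):   {peak_fp32_gflops(1000, 1700):.0f} GFLOPS")  # ~3400
print(f"GTX 980 Ti (figures above): {peak_fp32_gflops(2000, 1200):.0f} GFLOPS")  # ~4800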

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Wed Jul 06, 2016 12:14 pm

As the GTX 1060 hasn't been released yet, no one has tested it. I know the GTX 1070 runs the blur test using the candle test project on par with the Titan X, so that would imply that fewer but faster cores is better.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Gian Salucci

  • Posts: 25
  • Joined: Wed Mar 11, 2015 1:11 am

Re: Best GPU choice for the money

Posted: Wed Aug 24, 2016 9:45 am

People have been saying Resolve only uses one GPU... Doesn't Resolve Studio use ALL the available GPUs and CUDA cores? I added a second 980 and my render speed is much faster...

(Not running in SLI)

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Wed Aug 24, 2016 3:04 pm

Which 'people' are you referring to?
Users on this forum usually specify whether they are referring to Resolve Lite or Studio when saying what card usage you can get out of it.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Keenan Reed

  • Posts: 1
  • Joined: Tue Oct 25, 2016 12:05 am

Re: Best GPU choice for the money

Posted: Tue Oct 25, 2016 12:29 am

Resolve has the potential to use multiple servers and a great deal of computational power, including multiple GPUs (supposedly hundreds of GPUs). This is intended for the professional film and color industry, for companies that edit and grade major motion pictures. However, the ability to use multiple GPUs is a feature of the Studio edition of Resolve. Resolve Lite can only use one processing GPU and one graphics/UI GPU.

Another reason workstation GPUs cost so much is the intended use, and therefore the specific engineering and quality assurance that goes into each card. Nvidia's Quadro cards are engineered to run 24/7 for years with low heat and energy output, while boasting maximum VRAM, for projects rendering a multitude of complex nodes per card with seamless playback. I believe they typically have the same processing chips that the GeForce line uses, but the magic lies in the Quadro drivers and how well a card can be utilized by software through those drivers. All Nvidia workstation GPUs are quality-assured at the highest level, passing industry-level inspections that guarantee their performance in that context. The audience that workstation cards speak to isn't the at-home budget builder or small production company. It's the large color-grading agencies and render farms that have incredible budgets and will not be ordering small quantities at a time. This is for the colorist applying extensive, dense node edits to CinemaDNG raw and Apple ProRes files with immediate, hiccup-free playback, and not only faster but more consistent renders (I'm not actually entirely certain about the 'faster' part unless using multiple cards).

This is what I've researched but please correct me if I'm wrong.

My recommendation: if you're on a budget or are a beginner editor/color grader, stick with the GeForce line. If you aren't on a budget or are an advanced editor/grader, then you're probably looking at a Quadro. If you're intermediate, probably get a GTX 1080 at this point, unless you're willing and able to fork out $6,000 for a card that has 24GB of VRAM.

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Tue Oct 25, 2016 7:35 pm

I'd have to disagree. When it comes to Resolve there's no advantage to using Quadros. Apart from the extra VRAM on the M6000, P6000 and P5000, there's absolutely no reason to waste money on them.
Even for high-end work I'd be surprised if 12GB of VRAM isn't enough, and having built many Avid-validated systems over the years using Quadros, they are just as prone to failure as any other card. You shouldn't believe all the hype.

For the price of one Quadro P6000 you can buy three Titan X Pascals with change left over.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years

Ellory Yu

  • Posts: 4676
  • Joined: Wed Jul 30, 2014 5:25 pm

Re: Best GPU choice for the money

Posted: Thu Oct 27, 2016 1:57 am

Emin Khateeb wrote: I'm repeating what I already wrote in my previous post: above the Quadro K4200, the Titan X and the 980, they wrote "Recommended for UI GPU". And now I don't know if they mean only the Quadro K4200 or everything below that line. Because if they mean only the Quadro K4200, why didn't they write it in brackets like they did with the FirePro W7000?

Emin,
BMD only writes about what they have tested, and they are in no position to test every card. In fact most of the recommended hardware and GPUs are outdated. Don't get too caught up in prosumer versus consumer cards. Go with what works best for you and the workflow and workload you need it for. My workhorse GPU is an AMD Radeon R9 390X 8GB, and that puppy has not failed or slowed me down in my workflow. I use it for raw workflows from my BMPCC 1080p and my URSA 4K at 60 fps and it is just fine. My clips are shorts and features of up to 90 minutes with no hiccup. I use DR 12.5.2 Studio and the various effects (transitions, blur, and NR) and it has not choked. I believe it works just fine because of not just the GPU but how your entire workstation is configured (which includes memory, CPU, SSD storage and caching, OS configuration, software configuration, cache allocations, partitioning, ...) and a GPU that, again, provides what you intend to use it for. The slight advantage I have is that I am an IT pro by profession, so I know how to optimize systems for less. I've done it so many times over 28 years that I am not fazed by consumer-versus-professional product marketing hype. The gaming cards are, IMO, the best when it comes to performance-to-cost ratio. That's why I chose to go with the AMD R9 instead of a FirePro, or a 980/Titan instead of a Quadro.

As for SLI, having two or more cards is fine if you have the budget, but again, don't get too tied up with it thinking it will also be ideal for a 10-bit reference display. I use a DeckLink card for my reference monitor and suggest that be the route. Circling back to your thinking that if BMD does not have it in their manual it must not work, or something like that: you just need to read the thousands of posts here and on the BMD user forum. The majority of posts that talk about GPUs talk about the users' own 'gaming', or as you put it, 'consumer' GPUs, and rarely about one of the cards in the BMD manual.

Also, Adam knows his stuff with respect to BMD and DR. I recommend noting what he says. :)
URSA Mini Pro 4.6K G2, BMPCC 6K. iMac Pro 27” 5K Retina, 64gb, 1Tb SSD, 12Tb M.2 NVMe TB4 DAS, 36Tb HDD DAS, Vega 56 8gb GPU/ BM Vega 56 8gb eGPU, MacOS Sequoia+DVRS 19.1.4, BM Panel & Speed Editor. Mac Mini M2 Pro 10/16 cores, Sequoia+DVRS 20

Stephen Gyves

  • Posts: 2
  • Joined: Thu Oct 27, 2016 2:30 am

Re: Best GPU choice for the money

Posted: Thu Oct 27, 2016 3:25 am

After choosing the GPU that DaVinci will use, I'm hearing stuff about not sharing those GPUs with the graphics needed for the main operating display. How does this work? Do we now have a dedicated card for the main display, and not really use the outputs on the compute GPU (GTX 980 or whatnot) with DaVinci? So you'd only have your monitor plugged into the OS/GUI card?

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: Best GPU choice for the money

Posted: Thu Oct 27, 2016 6:10 am

The other thing about the 'guide' is that it's so outdated that at least half the GPUs in it are no longer available.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years
