Using an additional, dedicated GUI GPU?

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 5:18 pm

Is there any benefit in having an additional GPU dedicated for GUI only? Or is the GUI overhead so minimal that it does not matter?

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 5:27 pm

For the last few years BM has recommended one higher-end GPU over multiple lower-end ones.
X99-A II, 6850K 4.2 GHz, GTX 1070 FE (431.36), DDR4-2400 CL12-14-14 4x8 GB
Win 10 Pro 1809, Resolve Studio 16.1.0b.017

Michael Tiemann

  • Posts: 497
  • Joined: Fri Jul 05, 2013 11:22 am

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 5:28 pm

Here's a screenshot showing playback of R3D media for the first several seconds, followed by me opening up a 4-up meter display that includes the CIE scope. As you can see, it increases the load on the GPU by approximately 33%. Is that worth a whole GPU by itself? Not at the level of a Radeon Pro Vega 64. But a lesser GPU would see a correspondingly heavier load, which could warrant a dedicated GPU, as the immediately previous post suggests.

[Attachment: Screen Shot 2019-12-02 at 12.23.02 PM.png (274.53 KiB)]
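To put rough numbers on that trade-off (a back-of-the-envelope sketch with made-up load figures, not measurements from the screenshot):

```python
def gui_offload_helps(compute_load: float, gui_load: float,
                      headroom: float = 0.1) -> bool:
    """Rough heuristic: moving the GUI to its own card only pays off
    when compute + GUI load leaves less than `headroom` spare capacity.
    Loads are fractions of total GPU capacity (1.0 = saturated)."""
    return compute_load + gui_load > 1.0 - headroom

# A Vega 64-class card: 50% compute plus 17% for scopes -> plenty of headroom.
assert gui_offload_helps(0.50, 0.17) is False
# A smaller card near its limit: 80% compute plus 25% for scopes -> saturated.
assert gui_offload_helps(0.80, 0.25) is True
```

The same roughly one-third relative increase therefore matters far more on a card that is already near full load.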
MacOS Catalina Version 10.15.1
iMac Pro (2017)
3 GHz Intel Xeon W
64GB 2666 MHz DDR4
Radeon Pro Vega 64 16 GB
Decklink Duo card feeding FSI DM 240 Monitor

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 5:30 pm

Dan Sherman wrote:For the last few years BM has recommended one higher-end GPU over multiple lower-end ones.

I am not talking about getting lower-end GPUs for computation; I am asking about an additional GPU dedicated to the GUI.

Michael_Andreas

  • Posts: 1160
  • Joined: Sat Jan 05, 2019 9:40 pm
  • Real Name: Michael Andreas

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 5:53 pm

This YouTuber says that he got an improvement:


This is unknown territory for me. I understand that people sometimes have issues when using motherboard GPUs. But perhaps the YouTuber got away with it by (1) choosing the right magic settings in Resolve and (2) using two GPUs from the same manufacturer with a compatible driver.
_________________________________________________
DR Studio 16.1.1.005 - Win10 Pro 64b - i7-6700K@4GHz - 32GB RAM
- NVIDIA RTX 2070 8GB, "Studio" driver 431.86 (upgr fr GTX1070 11/22)
- C: Drive 250 GB SSD - Project/Cache Drive 1 TB SSD

Michael_Andreas

  • Posts: 1160
  • Joined: Sat Jan 05, 2019 9:40 pm
  • Real Name: Michael Andreas

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 6:11 pm

I'm a bit suspicious of his results, wondering if he had OBS running when he made his "before" measurements. With DR open but idle and 4 displays connected, my GPU is hardly working. Playing a video in VLC player adds to that. Playing a clip in DR adds to it. But how much of that load would be handled by the compute GPU and how much by the UI GPU in a dual GPU system?

Mario Kalogjera

  • Posts: 602
  • Joined: Sat Oct 31, 2015 8:44 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 7:06 pm

I once also entertained the idea that a separate GUI GPU would free up the "main" compute GPU. However, in my experience another, lesser GPU (from a different vendor) slowed down Resolve playback dramatically, even though it was used just for display.
I can't say if final rendering was affected, because I immediately removed the lesser GPU.

Asus Prime X370-Pro, Ryzen 1600X @4GHz, 32 GB G.Skill AEGIS DDR-4 3000MHz, Asrock Phantom Gaming RX 580 8GB, 120 GB system SSD, Media storage - "Always in motion is it", Windows 10 Pro+Resolve Studio 15.3+Fusion Studio 9.0.2

mpetech

  • Posts: 33
  • Joined: Wed Sep 04, 2013 9:52 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 8:25 pm

Like any modern media software, not everything is GPU or CPU dependent. Even a simple task like raising brightness involves varying degrees of CPU and GPU in different stages of the process: decode, transform, FX, encode, etc.

There are tasks in Resolve that are mainly CPU dependent, such as color correction.
Some tasks are mainly GPU bound: 3D, anything that involves geometry (resizes, shapes, etc.).

Your source codec can also affect the load. DNx and ProRes are currently CPU bound during decode, R3D is affected by the GPU, while H.264 depends on the encoding profile.

So the answer is "not so simple." The rule of thumb is that a faster CPU or GPU will always help, but not in the linear, clear-cut way you might assume.

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 9:03 pm

Cary Knoop wrote:
Dan Sherman wrote:For the last few years BM has recommended one higher-end GPU over multiple lower-end ones.

I am not talking about getting lower-end GPUs for computation; I am asking about an additional GPU dedicated to the GUI.


BM's response is still the same. If you go back through the forum you will see various responses from BM employees that recommend one "higher" card vs two "lesser" ones.

So for example, if you had a 2080 Ti currently, an RTX Titan would be better than two 2080 Tis. An RTX Quadro would be better than two RTX Titans, etc.

As a developer I can think of several reasons why this is the case, but BM has never given specifics.

Check out this Puget benchmark where they did a performance scaling analysis with up to 4 Titan Xps.
The gist of it is that performance scales really badly with multiple GPUs.
https://www.pugetsystems.com/labs/artic ... n-Xp-1060/

mpetech

  • Posts: 33
  • Joined: Wed Sep 04, 2013 9:52 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 9:12 pm

Dan Sherman wrote:
Cary Knoop wrote:
Dan Sherman wrote:For the last few years BM has recommended one higher-end GPU over multiple lower-end ones.

I am not talking about getting lower-end GPUs for computation; I am asking about an additional GPU dedicated to the GUI.


BM's response is still the same. If you go back through the forum you will see various responses from BM employees that recommend one "higher" card vs two "lesser" ones.

So for example, if you had a 2080 Ti currently, an RTX Titan would be better than two 2080 Tis. An RTX Quadro would be better than two RTX Titans, etc.

As a developer I can think of several reasons why this is the case, but BM has never given specifics.

Check out this Puget benchmark where they did a performance scaling analysis with up to 4 Titan Xps.
The gist of it is that performance scales really badly with multiple GPUs.
https://www.pugetsystems.com/labs/artic ... n-Xp-1060/


Generally, this is true. Just be careful with the Quadro cards. They will perform better than the same-generation RTX cards because they have more VRAM for large resolutions (6K or bigger). But normally the 2080 Ti has the best price-to-performance ratio. While the Titan RTX card is faster (single-digit % better), it is twice as expensive.

The significant advantage of a single faster card is less heat and fewer problems with PCIe lanes/slots.
Sticking two 2080 Tis in a case requires about a 1000-watt PSU, blower-style coolers, and excellent airflow. Don't use regular dual/triple-fan-style GPUs. They will produce too much heat, and the GPUs will throttle down, thus negating any benefits.

Matt Heere

  • Posts: 19
  • Joined: Tue Nov 21, 2017 2:18 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 10:34 pm

I think there's an oft-overlooked context to this question. Everyone responds with "one big GPU > two lesser ones". But I think it's a lot more common that someone invests in a new GPU and still has the old one sitting there. Do you leave it in place and have two, or are you better off with just the one?

Hopefully this will be me post-Xmas. If the shiny new RTX 2080 Ti shows up, I'll have a GTX 970 going begging. It's not worth selling, and I don't have another system to put it in. I have plenty of power and cooling in the case for both. The 2080 Ti will be such a massive improvement that I doubt it's worth the bother to leave the 970 in there, but I could, and theoretically I'd have more GPU available.

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 10:51 pm

Again, folks, this is not a question about having one faster card with more memory versus two inferior ones. How come it is so hard to make that clear?

I just want to know whether it would benefit my current configuration to add a card, say an RTX 2060, for the GUI only.

Uli Plank

  • Posts: 5203
  • Joined: Fri Feb 08, 2013 2:48 am

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 10:55 pm

I tested a GT 120 for GUI and a Titan X for compute in a 'classic' Mac Pro and found less than a 4% difference, and that was under version 15.
Resolve Studio and Fusion Studio under MacOS Mojave 10.14.6
iMac 2017 Radeon Pro 580 8 GB VRAM and 32 GB RAM
Mac mini 16 GB RAM plus eGFX Breakway Radeon RX 580

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 11:08 pm

Cary Knoop wrote:Again, folks, this is not a question about having one faster card with more memory versus two inferior ones. How come it is so hard to make that clear?

I just want to know whether it would benefit my current configuration to add a card, say an RTX 2060, for the GUI only.



It's not hard, you just aren't listening.

Here is a direct quote from a BM developer.
viewtopic.php?f=21&t=59799

Rohit Gupta
With Resolve 14, it's best you go with a single GPU which is used for both UI and compute. It will give you better performance than a GUI + Compute card generally.

Only exception being if you have multi GPU system (for example 4), it might be better to have a GUI card for display and 4 dedicated cards for compute.

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 11:10 pm

Dan Sherman wrote:
Cary Knoop wrote:Again, folks, this is not a question about having one faster card with more memory versus two inferior ones. How come it is so hard to make that clear?

I just want to know whether it would benefit my current configuration to add a card, say an RTX 2060, for the GUI only.



It's not hard, you just aren't listening.

Here is a direct quote from a BM developer.
viewtopic.php?f=21&t=59799

Rohit Gupta
With Resolve 14, it's best you go with a single GPU which is used for both UI and compute. It will give you better performance than a GUI + Compute card generally.

Only exception being if you have multi GPU system (for example 4), it might be better to have a GUI card for display and 4 dedicated cards for compute.

I have a multi-GPU system.

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 11:30 pm

Cary Knoop wrote:
Dan Sherman wrote:
Cary Knoop wrote:Again, folks, this is not a question about having one faster card with more memory versus two inferior ones. How come it is so hard to make that clear?

I just want to know whether it would benefit my current configuration to add a card, say an RTX 2060, for the GUI only.



It's not hard, you just aren't listening.

Here is a direct quote from a BM developer.
viewtopic.php?f=21&t=59799

Rohit Gupta
With Resolve 14, it's best you go with a single GPU which is used for both UI and compute. It will give you better performance than a GUI + Compute card generally.

Only exception being if you have multi GPU system (for example 4), it might be better to have a GUI card for display and 4 dedicated cards for compute.

I have a multi-GPU system.


Then you should have been more specific. Also note that he uses the word "might".

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Mon Dec 02, 2019 11:45 pm

Dan Sherman wrote:So for example, if you had a 2080 Ti currently, an RTX Titan would be better than two 2080 Tis. An RTX Quadro would be better than two RTX Titans, etc.

Unless you absolutely need the extra memory (or require ECC memory, for the Quadro), this argument is completely bogus: two 2080 Tis will perform vastly better than a single RTX Titan or RTX Quadro.

Dwaine Maggart

Blackmagic Design

  • Posts: 5587
  • Joined: Wed Aug 22, 2012 2:53 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 12:03 am

It's not completely bogus if your system motherboard does not support two full 16-lane PCIe slots in addition to however many more PCIe lanes the system requires.
Dwaine Maggart
Blackmagic Design DaVinci Support

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 12:42 am

Dwaine, how about helping to answer the question: would an additional card used exclusively for the GUI free up resources for the cards used exclusively for computation?

mpetech

  • Posts: 33
  • Joined: Wed Sep 04, 2013 9:52 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 1:10 am

Cary Knoop wrote:Dwaine, how about helping to answer the question: would an additional card used exclusively for the GUI free up resources for the cards used exclusively for computation?


If you have a low-end card, adding a big card would certainly benefit functions that are GPU bound. Uncheck the box "use GPU GUI..." in settings. It will force Resolve to use the 2nd card (with no monitor attached) strictly as the compute card. How much benefit? The answer is complicated and certainly not linear. There are too many factors to consider, like which function, codec, CPU, etc. And each element will get varying degrees of benefit from the new GPU. For example, an FX might get double the performance, but your 6K R3D might be choking the GPU anyway.

There are other complications too. Other apps may not detect your big bad new card unless it is the "primary" GPU. Make sure you have the PCIe lanes, cooling, and CPU power to feed your new GPU.
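One way to check empirically how much each card is actually doing (a sketch, not Resolve-specific; `nvidia-smi` is NVIDIA's standard CLI, though the parsing here assumes the exact query flags shown) is to sample per-GPU utilization during playback or a render:

```python
import subprocess

# Query one CSV line per GPU: index, GPU utilization %, VRAM used (MiB).
QUERY = ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"]

def parse_gpu_sample(csv_text: str) -> dict:
    """Parse one nvidia-smi CSV sample into {index: (util_pct, mem_mib)}."""
    stats = {}
    for line in csv_text.strip().splitlines():
        idx, util, mem = (field.strip() for field in line.split(","))
        stats[int(idx)] = (int(util), int(mem))
    return stats

def sample_gpus() -> dict:
    """Take a live sample (requires an NVIDIA driver on the machine)."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_gpu_sample(out.stdout)

# Canned example: GPU 0 (GUI card) nearly idle, GPU 1 (compute card) busy.
sample = parse_gpu_sample("0, 12, 1024\n1, 87, 7680\n")
assert sample == {0: (12, 1024), 1: (87, 7680)}
```

Sampling before and after toggling the preference shows directly whether any load actually moved off the compute card.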

Dwaine Maggart

Blackmagic Design

  • Posts: 5587
  • Joined: Wed Aug 22, 2012 2:53 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 1:28 am

I would follow Rohit's guidance which was posted above.

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 5:38 am

Dwaine Maggart wrote:I would follow Rohit's guidance which was posted above.

I am sorry, but that guidance is ambiguous.

kinvermark

  • Posts: 189
  • Joined: Tue Apr 16, 2019 5:04 pm
  • Real Name: Mark Wilson

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 5:56 am

@ Cary

I think you are going to have to test this yourself. The only positive evidence I have seen for this approach is the YouTube video posted above. The forum comments all seem to be reading off the same generic "policy" page, which makes me think no one has really done enough testing to say for sure.

I would also like to know (preferably before I chuck out all my old graphics cards :D )
Windows 10 PC. Intel i7 x980, 24GB RAM, AMD Rx580 8GB GPU, SSD & RAID HD array. Dual UHD monitors. Decklink 4k mini. Resolve Studio 16.1

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 7:20 am

kinvermark wrote:@ Cary

I think you are going to have to test this yourself.

It seems so. I must say I am rather disappointed by the lackadaisical responses, very much unlike the warm personal interactions I remember from the last NAB show in Vegas. That "guidance" seems to harp on the "better get one strong card instead of two weak cards" mantra, which is totally not relevant to my question. It also bothers me that one RTX Titan is promoted over two RTX 2080 Tis, which I find beyond absurd (with the exception of the case when you really need the extra memory, or if regulations require ECC memory for the Quadro). From a CUDA core perspective, the RTX 2080 Ti and the RTX Titan perform similarly.

I currently run 2x RTX 2080 Ti and am considering an additional card like an RTX 2060 or 2070 for the GUI only (and also for other non-Resolve operations).

Dwaine Maggart

Blackmagic Design

  • Posts: 5587
  • Joined: Wed Aug 22, 2012 2:53 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 8:29 am

Don't use a separate lower performance GPU as a GUI only card.

If you use multiple GPUs, they should all be the same type.

Michael_Andreas

  • Posts: 1160
  • Joined: Sat Jan 05, 2019 9:40 pm
  • Real Name: Michael Andreas

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 12:03 pm

Page 121 of the 16.1 user manual describes the GPU Configuration section of the Preferences panel. For the switch "Use Display GPU for Compute", it states:
..., if two GPUs are installed for image processing, this checkbox enables the shared use of the display GPU instead of dedicating it to just the DaVinci user interface.


Under what circumstances would someone want to uncheck this box? If the OP does Fusion work, which I understand uses only one GPU, should he uncheck this box to make sure the Fusion processing is not slowed down by the GPU that's driving the display?

To the OP: since you have multiple GPUs, you can play with the GPU Configuration settings to see what effect they have on performance. (Also, if you'd put your system configuration in your signature, people might realize that you already have multiple GPUs.)

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 3:10 pm

Dwaine Maggart wrote:Don't use a separate lower performance GPU as a GUI only card.

If you use multiple GPUs, they should all be the same type.

I am sure it must be me, but this statement defies all logic.

If you configure a card to be dedicated to the Resolve interface only, why would it matter if it is dissimilar to the computational cards?

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 3:55 pm

Cary Knoop wrote:
Dwaine Maggart wrote:Don't use a separate lower performance GPU as a GUI only card.

If you use multiple GPUs, they should all be the same type.

I am sure it must be me, but this statement defies all logic.

If you configure a card to be dedicated to the Resolve interface only, why would it matter if it is dissimilar to the computational cards?


  1. Mixing GPUs can cause driver-related issues for multiple reasons.
  2. If memory serves, Resolve will only consume as much VRAM as the GPU with the lowest amount. In other words, if you had multiple 8 GB cards and then a 4 GB card for the GUI GPU, Resolve will only ever assume it can use 4 GB.
  3. Most likely the GUI GPU is fed from the compute GPU(s). If you are running multiple high-res monitors off the GUI GPU, plus say GPU scopes, you could potentially bottleneck the compute GPUs because an underpowered GUI GPU can't keep up.
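If point 2 is right (I can't confirm it against BM documentation, so treat it as a recollection), the consequence of a mismatched GUI card would look like this:

```python
def usable_vram_per_gpu(vram_gb: list) -> int:
    """Sketch of the claimed behaviour: with mismatched cards in the pool,
    the per-GPU working set is capped by the smallest card's VRAM."""
    return min(vram_gb)

# Two 8 GB compute cards plus a 4 GB GUI card would cap the pool at 4 GB,
# halving the usable VRAM on the bigger cards.
assert usable_vram_per_gpu([8, 8, 4]) == 4
```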

Michael_Andreas

  • Posts: 1160
  • Joined: Sat Jan 05, 2019 9:40 pm
  • Real Name: Michael Andreas

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 4:09 pm

In the YouTube video I linked, the YouTuber got improved render times by adding a 1070 for the UI to a 2080 Ti for the render. Take it for whatever it's worth.

Cary Knoop

  • Posts: 1183
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 4:23 pm

Dan Sherman wrote:[*]Mixing GPUs can cause driver-related issues for multiple reasons.

People mix NVIDIA GPUs all the time; I've never heard of driver issues.

Dan Sherman wrote:[*]If memory serves, Resolve will only consume as much VRAM as the GPU with the lowest amount. In other words, if you had multiple 8 GB cards and then a 4 GB card for the GUI GPU, Resolve will only ever assume it can use 4 GB.

You are saying that Resolve takes the size of the GUI-only card into account to determine the compute cards' memory working set? That doesn't make any sense to me.

Dan Sherman wrote:[*]Most likely the GUI GPU is fed from the compute GPU(s).

Huh?

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 5:35 pm

Cary Knoop wrote:
Dan Sherman wrote:[*]Mixing GPUs can cause driver-related issues for multiple reasons.

People mix NVIDIA GPUs all the time; I've never heard of driver issues.


It happens more than you think or have heard; I've known people that have had several types of issues, the most common one being that a driver update drops support for an older GPU.

Think about what issues you might get when you mix and match GPUs that support different CUDA versions.

Cary Knoop wrote:
Dan Sherman wrote:[*]Most likely the GUI GPU is fed from the compute GPU(s).

Huh?


If you have separate dedicated compute and GUI GPUs, then the processing workflow would most likely be something like this.

The compute GPU is going to do all its processing first. When it's done, the data/frames it just processed will be passed to the GUI GPU to render the GUI. If the GUI GPU is underpowered for the task, it becomes a bottleneck.

Mario Kalogjera

  • Posts: 602
  • Joined: Sat Oct 31, 2015 8:44 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 7:24 pm

I'd say if you don't have a HEDT platform, the playback bottleneck would come from the fact that the two GPUs communicate at x8 bandwidth (or less). A single GPU only has to communicate within itself, i.e. render a frame and display it without sending it back over the PCIe bus to the other GPU. However, if a single GPU's VRAM is not enough to avoid swapping to RAM, a second GPU could possibly help free it up (serving as display buffer and GUI renderer) so that no swapping is required, which would be faster. Then again, the GUI/display GPU would need to be an equal or better 3D OpenGL performer than the compute GPU, and that's not likely.

With a single 8 GB GPU I have noticed that, even if VRAM is not nearly full, a lot of system RAM (up to 4 GB) is taken by Resolve for GPU shared memory.
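The bandwidth point can be put into ballpark numbers (all values are assumptions: a 4K DCI frame at 16 bits per channel RGBA, and roughly 7.9 GB/s of practical PCIe 3.0 x8 throughput):

```python
# Cost of shipping each processed frame from the compute GPU to a
# separate display GPU over PCIe. All figures are rough assumptions.
width, height = 4096, 2160
bytes_per_pixel = 8                               # RGBA, 16 bits/channel
frame_bytes = width * height * bytes_per_pixel    # ~71 MB per frame

pcie3_x8_bytes_per_s = 7.9e9                      # practical PCIe 3.0 x8
transfer_s = frame_bytes / pcie3_x8_bytes_per_s   # ~9 ms per frame

max_fps = 1 / transfer_s                          # transfer-limited ceiling
assert 100 < max_fps < 125                        # roughly 110 fps
```

So the copy alone would not cap 24p playback, but at roughly 9 ms per frame it eats a sizeable slice of the ~42 ms frame budget that the compute GPU also needs for processing.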

mpetech

  • Posts: 33
  • Joined: Wed Sep 04, 2013 9:52 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 8:00 pm

Again, the question is about using one GPU strictly for the GUI and a second card for DR's compute.

While it is true that VRAM usage is duplicated across all cards, the GUI GPU's VRAM is not in play for computing in the config stated above.

Second, the driver issue should not be an issue, since NVidia, AMD, Blackmagic, and other companies use a packaged all-in-one driver set. Unless you are using two GPUs separated by a decade, NVidia's drivers support GPUs probably even from before the 9xx series cards.

Why would the GUI GPU be a bottleneck when all it is doing is rendering the GUI and handling video overlay?

Dan Sherman

  • Posts: 980
  • Joined: Fri Jul 01, 2016 11:07 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 8:56 pm

mpetech wrote:
Second, the driver issue should not be an issue, since NVidia, AMD, Blackmagic, and other companies use a packaged all-in-one driver set. Unless you are using two GPUs separated by a decade, NVidia's drivers support GPUs probably even from before the 9xx series cards.


Um, that's not correct.

Many of us run the Studio drivers, and here is the latest Studio driver:
https://www.nvidia.com/Download/driverR ... 4342/en-us

Click the "SUPPORTED PRODUCTS" tab, and you will see that this driver supports nothing before the 10 series.

Take a look at the latest Quadro drivers: no support for anything other than Quadros. https://www.nvidia.com/Download/driverR ... 4337/en-us

mpetech

  • Posts: 33
  • Joined: Wed Sep 04, 2013 9:52 pm

Re: Using an additional, dedicated GUI GPU?

Posted: Tue Dec 03, 2019 10:20 pm

Dan Sherman wrote:
mpetech wrote:
Second, the driver issue should not be an issue, since NVidia, AMD, Blackmagic, and other companies use a packaged all-in-one driver set. Unless you are using two GPUs separated by a decade, NVidia's drivers support GPUs probably even from before the 9xx series cards.


Um, that's not correct.

Many of us run the Studio drivers, and here is the latest Studio driver:
https://www.nvidia.com/Download/driverR ... 4342/en-us

Click the "SUPPORTED PRODUCTS" tab, and you will see that this driver supports nothing before the 10 series.

Take a look at the latest Quadro drivers: no support for anything other than Quadros. https://www.nvidia.com/Download/driverR ... 4337/en-us


Understood. Studio drivers aside, my point was that driver issues between two different cards are not as significant as some would say. You can run the GeForce drivers and support cards back to the 600 series. And contrary to popular belief, GeForce and Quadro drivers can detect each card and install the correct drivers.


And if you are running Studio drivers, your GUI GPU isn't old. It has to be a 2xxx- or 1xxx-series card.
