2nd GPU performance improvements

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.
Offline
User avatar

Alessandro Caporale

  • Posts: 69
  • Joined: Thu Jan 09, 2014 2:12 pm
  • Location: Italy

2nd GPU performance improvements

PostWed May 04, 2016 12:36 pm

Yet another thread asking about multiple GPUs in the same system.
Right now I have a single Titan X 12GB GPU, and I'm thinking about buying a 980 Ti to speed up Resolve Studio tasks.
AFAIK, Resolve will only use the VRAM of GPU 1, while still taking advantage of the second GPU for processing.
Are the performance improvements noticeable? Is it worth the money?

Another point: I'm moving to a new X99 system and could choose (for the very same price) between an i7-5960X and an E5-2699v4.

Is Resolve faster with more cores or with higher clock speed? The new E5-2699v4 is a 22-core (44-thread) monster at 2.1 GHz with 55 MB of cache.

Thanks for your time
Offline

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: 2nd GPU performance improvements

PostWed May 04, 2016 12:45 pm

You might want to check those prices. Over here the 5960X is around £800 before tax, while the E5-2699v4 is close to £3000.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years
Offline
User avatar

Alessandro Caporale

  • Posts: 69
  • Joined: Thu Jan 09, 2014 2:12 pm
  • Location: Italy

Re: 2nd GPU performance improvements

PostWed May 04, 2016 12:49 pm

I work for a big IT corporation, which means I don't really pay anything myself; the company pays for it.
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostWed May 04, 2016 2:02 pm

GET TWIN GPU CARDS... (just because you have the money).

If you have the space, you can get an inexpensive GPU for the GUI only and two Titan X 12GB cards for compute.
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Offline
User avatar

Alessandro Caporale

  • Posts: 69
  • Joined: Thu Jan 09, 2014 2:12 pm
  • Location: Italy

Re: 2nd GPU performance improvements

PostWed May 04, 2016 2:12 pm

Thank you Walter,
Since there's no need for SLI, and since there's no way to use both cards' RAM together, what benefits would I get from running two Titan Xs instead of a Titan X plus a 980 Ti?
Offline

Sam Steti

  • Posts: 2497
  • Joined: Tue Jun 17, 2014 7:29 am
  • Location: France

Re: 2nd GPU performance improvements

PostWed May 04, 2016 2:30 pm

Hey Alessandro,

Resolve doesn't take SLI into account anyway. Just don't mix cards with different amounts of VRAM...
*MacMini M1 16 Go - Ext nvme SSDs on TB3 - 14 To HD in 2 x 4 disks USB3 towers
*Legacy MacPro 8core Xeons, 32 Go ram, 2 x gtx 980 ti, 3SSDs including RAID
*Resolve Studio everywhere, Fusion Studio too
*https://www.buymeacoffee.com/videorhin
Offline
User avatar

Simon Dayan

  • Posts: 158
  • Joined: Tue Feb 04, 2014 10:39 am
  • Location: West Hollywood, Los Angeles,CA

Re: 2nd GPU performance improvements

PostWed May 04, 2016 2:32 pm

You can add an AMD FirePro (W8100 or W9100) for the GUI; it can drive a 10-bit monitor and provides OpenCL for other software like Fusion or Premiere Pro, or for plugins that support OpenCL ;)
Offline
User avatar

Alessandro Caporale

  • Posts: 69
  • Joined: Thu Jan 09, 2014 2:12 pm
  • Location: Italy

Re: 2nd GPU performance improvements

PostWed May 04, 2016 2:51 pm

Sam: I mentioned SLI only because SLI requires twin cards, even though Resolve doesn't need it. May I ask why the cards need the same VRAM if Resolve only uses the RAM of the first GPU?

Simon: I'm on a 5K display and not every card works with it. An option that could bring me back to AMD is the upcoming Radeon Pro Duo card, which looks very interesting.
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostWed May 04, 2016 2:57 pm

I would use two Titan Xs and attach one to the GUI; Resolve will use them elegantly up to 8K if your SAN is fast (tried with our Mac here).
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Offline

Willian Aleman

  • Posts: 356
  • Joined: Thu Aug 23, 2012 10:08 pm
  • Location: NYC, USA

Re: 2nd GPU performance improvements

PostWed May 04, 2016 3:41 pm

waltervolpatto wrote:I would use two Titan Xs and attach one to the GUI; Resolve will use them elegantly up to 8K if your SAN is fast (tried with our Mac here).


It's my understanding that the GUI doesn't take advantage of the extra VRAM in the second card.
So, apart from compatibility between the GPU cards, and since Resolve uses the two cards independently, why invest in a second card as expensive as the first?
Willian Aleman
New York City
USA

Resolve Studio
iMac (Retina 5K, 27-inch, 2020)
MacOS: 10.15.7 (19H2)
processor: 3.8 GHz 8-Core Intel Core i7
Memory: 40 GB 2667 MHz DDR4
Graphic: AMD Radeon Pro 5700 XT 16 GB
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostWed May 04, 2016 3:45 pm

If you have two twin cards and you use both for processing, Resolve will benefit from both.

If you do not intend to use both for processing, then yes, use a lesser card for the GUI.
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Online
User avatar

Uli Plank

  • Posts: 21573
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 2nd GPU performance improvements

PostWed May 04, 2016 4:29 pm

I'm using an old GT120 for the GUI and a Titan X for processing, and it works very well.

I wonder if there is any disadvantage when an older CUDA card is no longer supported in Resolve 12.5 but is used for the GUI only?
Now that the cat #19 is out of the bag, test it as much as you can and use the subforum.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU
Offline

Willian Aleman

  • Posts: 356
  • Joined: Thu Aug 23, 2012 10:08 pm
  • Location: NYC, USA

Re: 2nd GPU performance improvements

PostWed May 04, 2016 4:36 pm

waltervolpatto wrote:If you have two twin cards and you use both for processing, Resolve will benefit from both.

If you do not intend to use both for processing, then yes, use a lesser card for the GUI.


Walter, thanks for clearing this up.
Willian Aleman
New York City
USA

Resolve Studio
iMac (Retina 5K, 27-inch, 2020)
MacOS: 10.15.7 (19H2)
processor: 3.8 GHz 8-Core Intel Core i7
Memory: 40 GB 2667 MHz DDR4
Graphic: AMD Radeon Pro 5700 XT 16 GB
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostWed May 04, 2016 7:42 pm

So, a couple of things around this. First, I have an AMD W5000 with 4GB of RAM, and I just built a new system where I put in an NVIDIA GTX 950 with 2GB of RAM. I assume the 950 with CUDA will be faster than the W5000 with OpenCL? I am quite confused about CUDA vs OpenCL. Does Resolve work better with CUDA, with OpenCL, or equally well with either? I also read that Fusion uses OpenCL (I am Windows 10 based). If Resolve works well with OpenCL, then it seems a stronger OpenCL card would be better than a CUDA-based card? I do want to use both Fusion and Resolve, so if Fusion doesn't do much with CUDA, then I assume an AMD card would be the way to go for OpenCL?

I also do a little gaming sometimes, hence why I was leaning towards the NVIDIA card, but if a gaming-capable AMD card with strong OpenCL performance lets me game decently as well as boost both Fusion and Resolve, I am fine going that way. I am waiting for the next-gen cards, hence the cheap GTX 950 for now until I can spend $500 or more on a decent card.

On that note, I am able to play some games pretty well on my AMD W5000 (or used to when that system worked). Would the W8100 be a better card for both Fusion and Resolve than, say, their 390X?

I am assuming AMD will refresh the FirePro line soon; it's been a few years, and with the new architecture coming out, I hope they are going to update the workstation cards. I still don't know what benefits they bring to Fusion/Resolve vs a consumer-level card that also gives great gaming performance.

Thanks.
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline

Rohit Gupta

Blackmagic Design

  • Posts: 1629
  • Joined: Wed Aug 22, 2012 5:00 am

Re: 2nd GPU performance improvements

PostThu May 05, 2016 12:23 am

Uli Plank wrote:I'm using an old GT120 for the GUI and a Titan X for processing, and it works very well.

I wonder if there is any disadvantage when an older CUDA card is no longer supported in Resolve 12.5 but is used for the GUI only?


Please lose your GT120 or use it only if you need it to troubleshoot a boot up issue. It's really slowing down the Resolve UI, especially interactive editing and playback.
Rohit Gupta

DaVinci Resolve Software Development
Blackmagic Design
Offline

Barry Weckesser

  • Posts: 40
  • Joined: Sat Feb 14, 2015 1:16 pm

Re: 2nd GPU performance improvements

PostThu May 05, 2016 9:40 am

I have two Titan Xs, and both are set to be used for compute (one also drives the GUI). I'm grading Sony XAVC-I 4K video files (SLog3). I bought the second card in hopes of improving rendering speed to an intermediate codec for Edius 8 (Grass Valley HQX 4K), but there was no change whatsoever (still the same low 3 fps). During rendering I watch the GPU utilization: one is at about 20% while the other is at 5% or less. I get the same 3 fps from another computer with a single NVIDIA Quadro 4200.

I presume the dual cards only help with multiple nodes, noise reduction, etc., but not with rendering.
VIDEOWORKSTATION:DellT7910,DualXeon10core,128GBRAM,2NVIDIATitanX12GB.
DecklinkExtreme4K.
MOBILE VIDEO WORKSTATION: Dell Precision7710,XeonE3-1535,64GB RAM2133;1TB-PCIeNVMe system drive,NVidiaM5000M-8GB,UHD Display;2-1TB PCIeNVMe drives-RAID0
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostThu May 05, 2016 3:00 pm

From my understanding, the free version only uses one GPU, but the Studio version can use SLI and multiple GPUs?
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 4:08 pm

Justin Jackson wrote:From my understanding, the free version only uses one GPU, but the Studio version can use SLI and multiple GPUs?


The Studio version will use multiple GPUs, but not SLI.
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostThu May 05, 2016 4:21 pm

How does it use multiple GPUs without SLI? I assume SLI gives the "serialized", combined performance of two (or three/four) GPUs, whereas what you are saying is that Resolve (and Fusion?) uses each GPU in the system in parallel, similar to multi-core CPUs?
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline

Willian Aleman

  • Posts: 356
  • Joined: Thu Aug 23, 2012 10:08 pm
  • Location: NYC, USA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 5:02 pm

waltervolpatto wrote:
Justin Jackson wrote:


The Studio version will use multiple GPUs, but not SLI.


How do you set up two GPUs for computation only in DR? I haven't found this setting yet. What I have is one GPU for computation and one for the display. I see no option for two compute cards.
Willian Aleman
New York City
USA

Resolve Studio
iMac (Retina 5K, 27-inch, 2020)
MacOS: 10.15.7 (19H2)
processor: 3.8 GHz 8-Core Intel Core i7
Memory: 40 GB 2667 MHz DDR4
Graphic: AMD Radeon Pro 5700 XT 16 GB
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 5:09 pm

Willian Aleman wrote:
waltervolpatto wrote:
Justin Jackson wrote:


The Studio version will use multiple GPUs, but not SLI.


How do you set up two GPUs for computation only in DR? I haven't found this setting yet. What I have is one GPU for computation and one for the display. I see no option for two compute cards.


Resolve will simply send frame 1 to GPU 1, frame 2 to GPU 2, frame 3 to GPU 1, and so on.
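Purely as an illustration of that round-robin pattern (not Resolve's actual internals; the GPU names and frame counts below are made up):

Code:
# Python sketch of round-robin frame dispatch across compute GPUs.
def dispatch_frames(frame_count, gpu_names):
    # Assign each frame number to a GPU in alternating (round-robin) order.
    return {frame: gpu_names[(frame - 1) % len(gpu_names)]
            for frame in range(1, frame_count + 1)}

for frame, gpu in dispatch_frames(6, ["GPU 1", "GPU 2"]).items():
    print(f"frame {frame} -> {gpu}")   # frame 1 -> GPU 1, frame 2 -> GPU 2, ...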
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Offline

Willian Aleman

  • Posts: 356
  • Joined: Thu Aug 23, 2012 10:08 pm
  • Location: NYC, USA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 5:17 pm

Thanks! Then, there is no special check box in DR for this to work?
Willian Aleman
New York City
USA

Resolve Studio
iMac (Retina 5K, 27-inch, 2020)
MacOS: 10.15.7 (19H2)
processor: 3.8 GHz 8-Core Intel Core i7
Memory: 40 GB 2667 MHz DDR4
Graphic: AMD Radeon Pro 5700 XT 16 GB
Offline
User avatar

Simon Dayan

  • Posts: 158
  • Joined: Tue Feb 04, 2014 10:39 am
  • Location: West Hollywood, Los Angeles,CA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 5:32 pm

Go To :
Preferences ---> Video I/O and GPU ----> Use Display GPU For Compute
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 6:09 pm

Simon Dayan wrote:Go To :
Preferences ---> Video I/O and GPU ----> Use Display GPU For Compute

this...
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Offline

Willian Aleman

  • Posts: 356
  • Joined: Thu Aug 23, 2012 10:08 pm
  • Location: NYC, USA

Re: 2nd GPU performance improvements

PostThu May 05, 2016 6:21 pm

Simon Dayan wrote:Go To :
Preferences ---> Video I/O and GPU ----> Use Display GPU For Compute


Walter and Simon, as always, thanks for your helpful info. This is the way I have it set up. What I didn't know was that DR was using both GPUs for computation; I thought it was one or the other. I have two GTX 770s. With this new info I'm going to install a third card just for the GUI.
Last edited by Willian Aleman on Fri May 06, 2016 3:23 am, edited 2 times in total.
Willian Aleman
New York City
USA

Resolve Studio
iMac (Retina 5K, 27-inch, 2020)
MacOS: 10.15.7 (19H2)
processor: 3.8 GHz 8-Core Intel Core i7
Memory: 40 GB 2667 MHz DDR4
Graphic: AMD Radeon Pro 5700 XT 16 GB
Offline

Blake LaFarm

  • Posts: 224
  • Joined: Fri Dec 13, 2013 8:48 am

Re: 2nd GPU performance improvements

PostFri May 06, 2016 12:18 am

waltervolpatto wrote:
Simon Dayan wrote:Go To :
Preferences ---> Video I/O and GPU ----> Use Display GPU For Compute

this...


Apologies in advance if I'm misunderstanding, but I would like to ask for a bit of clarification on this:

For optimal performance when running Resolve Studio on an HP Z840 with two Titan X cards, should the "Use Display GPU for Compute" checkbox be selected?

Or would that somehow cause Resolve to prioritize the display Titan X for compute over the Titan X that is dedicated to compute?

Thanks.
HP Z840 | Dual 10-Core Xeon 2.3 GHz | Dual TITAN Xp | 64 GB RAM | Media: PCIe SSD 2.5 GB/s
DeckLink 4K Ext 12G | Pocket UltraScope | Avid Artist Color | CalMAN Studio/C6-HDR
Resolve Studio 15.0.0B.043 | Fusion Studio 9.0.2 | DTV 10.9.12 | Win10 Pro 1803
Offline

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: 2nd GPU performance improvements

PostFri May 06, 2016 5:27 am

If it's working properly you should have it ticked; that way it gives the system extra resources to use for its internal rendering. When it's ticked you should see a performance boost, especially as you have two matching high-end cards.

When it's not ticked your system is just using one card for the rendering and the other one is used purely for the GUI, so you are wasting all that extra power by leaving it unticked.
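As a rough way to picture what that preference does (an assumption drawn from this thread, not Resolve source code; the card names are invented):

Code:
# Python sketch: which GPUs end up in the compute pool, per the behaviour described above.
def compute_pool(all_gpus, display_gpu, use_display_gpu_for_compute):
    if use_display_gpu_for_compute:
        return list(all_gpus)                          # every card does image processing
    return [g for g in all_gpus if g != display_gpu]   # display card handles the GUI only

print(compute_pool(["Titan X #1", "Titan X #2"], "Titan X #1", True))    # both cards compute
print(compute_pool(["Titan X #1", "Titan X #2"], "Titan X #1", False))   # only the second card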
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years
Offline

Barry Weckesser

  • Posts: 40
  • Joined: Sat Feb 14, 2015 1:16 pm

Re: 2nd GPU performance improvements

PostFri May 06, 2016 11:48 am

Adam Simmons wrote:If it's working properly you should have it ticked; that way it gives the system extra resources to use for its internal rendering. When it's ticked you should see a performance boost, especially as you have two matching high-end cards.

When it's not ticked your system is just using one card for the rendering and the other one is used purely for the GUI, so you are wasting all that extra power by leaving it unticked.

Any idea as to why there is absolutely no improvement in rendering performance (various delivery options) when using 2 Titan X cards for Compute vs 1 card? (I do have Studio and I have "use GUI for compute" ticked).
Last edited by Barry Weckesser on Fri May 06, 2016 12:39 pm, edited 2 times in total.
VIDEOWORKSTATION:DellT7910,DualXeon10core,128GBRAM,2NVIDIATitanX12GB.
DecklinkExtreme4K.
MOBILE VIDEO WORKSTATION: Dell Precision7710,XeonE3-1535,64GB RAM2133;1TB-PCIeNVMe system drive,NVidiaM5000M-8GB,UHD Display;2-1TB PCIeNVMe drives-RAID0
Offline

Willian Aleman

  • Posts: 356
  • Joined: Thu Aug 23, 2012 10:08 pm
  • Location: NYC, USA

Re: 2nd GPU performance improvements

PostFri May 06, 2016 11:55 am

Barry Weckesser wrote:
Adam Simmons wrote:If it's working properly you should have it ticked; that way it gives the system extra resources to use for its internal rendering. When it's ticked you should see a performance boost, especially as you have two matching high-end cards.

When it's not ticked your system is just using one card for the rendering and the other one is used purely for the GUI, so you are wasting all that extra power by leaving it unticked.

Any idea as to why there is absolutely no improvement in rendering performance (various delivery options) when using 2 Titan X cards for Compute vs 1 card?


I'm interested in the answer to this too.
In addition, is there any difference in playback when using two GPU cards for compute?
Willian Aleman
New York City
USA

Resolve Studio
iMac (Retina 5K, 27-inch, 2020)
MacOS: 10.15.7 (19H2)
processor: 3.8 GHz 8-Core Intel Core i7
Memory: 40 GB 2667 MHz DDR4
Graphic: AMD Radeon Pro 5700 XT 16 GB
Offline

Adam Simmons

  • Posts: 5510
  • Joined: Thu Apr 04, 2013 4:21 pm
  • Location: UK

Re: 2nd GPU performance improvements

PostFri May 06, 2016 12:43 pm

Barry Weckesser wrote:
Adam Simmons wrote:If it's working properly you should have it ticked; that way it gives the system extra resources to use for its internal rendering. When it's ticked you should see a performance boost, especially as you have two matching high-end cards.

When it's not ticked your system is just using one card for the rendering and the other one is used purely for the GUI, so you are wasting all that extra power by leaving it unticked.

Any idea as to why there is absolutely no improvement in rendering performance (various delivery options) when using 2 Titan X cards for Compute vs 1 card? (I do have Studio and I have "use GUI for compute" ticked).

By rendering I mean when it's performing real-time playback inside Resolve. I'm not sure how much it uses the GPUs when sending out to a final delivery file. I think you may have to wait for Dwayne to answer that one.
DVC Built Clevo P775DM3-G Laptop with UHD screen, 7700K CPU@4.9Ghz, Geforce GTX 1060 6GB GPU, G-Sync UHD screen, 500GB M.2 Primary, 1x 480GB SSD, 1x1TB M.2, 1x 2TB Video drives.
Building Bespoke Video Editing systems for over 16 years
Offline

Blake LaFarm

  • Posts: 224
  • Joined: Fri Dec 13, 2013 8:48 am

Re: 2nd GPU performance improvements

PostFri May 06, 2016 3:38 pm

Adam Simmons wrote:If it's working properly you should have it ticked; that way it gives the system extra resources to use for its internal rendering. When it's ticked you should see a performance boost, especially as you have two matching high-end cards.

When it's not ticked your system is just using one card for the rendering and the other one is used purely for the GUI, so you are wasting all that extra power by leaving it unticked.


Thanks for that information.
HP Z840 | Dual 10-Core Xeon 2.3 GHz | Dual TITAN Xp | 64 GB RAM | Media: PCIe SSD 2.5 GB/s
DeckLink 4K Ext 12G | Pocket UltraScope | Avid Artist Color | CalMAN Studio/C6-HDR
Resolve Studio 15.0.0B.043 | Fusion Studio 9.0.2 | DTV 10.9.12 | Win10 Pro 1803
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostSat May 07, 2016 5:17 pm

Hey all, this all sounds great, but I am a little lost on one aspect of it. I have a single NVIDIA card, and my motherboard has onboard video. The onboard output can do 4K, although I am only using HD now. Should I connect my monitor to the motherboard output, with the GPU used just for rendering and not connected to my monitor (i.e. m/b video out is for the GUI and the NVIDIA GPU is used for rendering)? If that is the case, then using my NVIDIA card for gaming (which happens less and less these days, but still from time to time) would not be possible, unless I use an A/B switch and flip to B (NVIDIA output) for gaming, but use A (m/b video) for Resolve/Fusion work?

Or am I totally misunderstanding this ability?
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline
User avatar

waltervolpatto

  • Posts: 10528
  • Joined: Thu Feb 07, 2013 5:07 pm
  • Location: 1146 North Las Palmas Ave. Hollywood, California 90038 USA

Re: 2nd GPU performance improvements

PostSat May 07, 2016 6:06 pm

Justin Jackson wrote:Hey all, this all sounds great, but I am a little lost on one aspect of it. I have a single NVIDIA card, and my motherboard has onboard video. The onboard output can do 4K, although I am only using HD now. Should I connect my monitor to the motherboard output, with the GPU used just for rendering and not connected to my monitor (i.e. m/b video out is for the GUI and the NVIDIA GPU is used for rendering)? If that is the case, then using my NVIDIA card for gaming (which happens less and less these days, but still from time to time) would not be possible, unless I use an A/B switch and flip to B (NVIDIA output) for gaming, but use A (m/b video) for Resolve/Fusion work?

Or am I totally misunderstanding this ability?


Yes, use the m/b for the GUI only and the other card for computation.
W10-19043.1645- Supermicro MB C9X299-PGF - RAM 128GB CPU i9-10980XE 16c 4.3GHz (Oc) Water cooled
Decklink Studio 4K (12.3)
Resolve 18.5.1 / fusion studio 18
GPU 3090ti drivers 512.59 studio
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostSat May 07, 2016 6:41 pm

Really? Wow, that just seems so odd to me. So in my case, the GPU just doesn't have anything plugged into it? Is the GUI not that demanding in terms of needing to update? It is an Intel HD Graphics 520... will that be fast enough to keep the UI smooth?

If I use the GPU as the only video out, does that slow down its use for rendering when it is also used for the GUI? Or does it not even use the GPU for rendering if it is the only output?
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline
User avatar

Gosha Gusev

  • Posts: 7
  • Joined: Mon Jan 02, 2017 10:56 pm
  • Location: Russia, Cherepovets

Re: 2nd GPU performance improvements

PostSun Jan 15, 2017 6:59 pm

Hi there, I am not sure I understand this exactly: if I have two graphics cards with different VRAM (for example a 1080 [8 GB] and a 1060 [3 GB]):
1080 — used for compute
1060 — used for GUI only
Is this the only way to use the graphics cards?
Can I also use the 1060 for GUI + compute, given that the cards have different VRAM?
Am I correct?
Offline

Sam Steti

  • Posts: 2497
  • Joined: Tue Jun 17, 2014 7:29 am
  • Location: France

Re: 2nd GPU performance improvements

PostTue Jan 17, 2017 2:21 pm

Gosha Gusev wrote:Hi there, I am not sure I understand this exactly: if I have two graphics cards with different VRAM (for example a 1080 [8 GB] and a 1060 [3 GB]):
1080 — used for compute
1060 — used for GUI only
Is this the only way to use the graphics cards?

Yes, this is the correct way; to achieve it you hook up the 1060 to the monitor and choose CUDA in the preferences...

Can I also use the 1060 for GUI + compute, given that the cards have different VRAM?
Am I correct?

Yes, you "can". If you have the Studio version, you can try ticking "use GUI GPU for compute" in the preferences, but in that case Resolve will use 3 GB max from both of your GPUs... The ceiling is set by the smallest card, if you like.
And therefore no, it won't use 8 GB, and certainly not 11 GB; you had understood it correctly ;)
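A crude way to picture that ceiling (my own illustration of the point above, not an official rule; the card names and sizes are just examples):

Code:
# Python sketch: with mismatched cards both used for compute, the usable per-frame
# memory budget is effectively capped by the card with the least VRAM.
def effective_vram_gb(compute_cards):
    return min(compute_cards.values())

print(effective_vram_gb({"GTX 1080": 8, "GTX 1060": 3}))  # 3 -> the 1080's extra 5 GB sits unused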
*MacMini M1 16 Go - Ext nvme SSDs on TB3 - 14 To HD in 2 x 4 disks USB3 towers
*Legacy MacPro 8core Xeons, 32 Go ram, 2 x gtx 980 ti, 3SSDs including RAID
*Resolve Studio everywhere, Fusion Studio too
*https://www.buymeacoffee.com/videorhin
Offline
User avatar

Jean Claude

  • Posts: 2973
  • Joined: Sun Jun 28, 2015 4:41 pm
  • Location: France

Re: 2nd GPU performance improvements

PostTue Jan 17, 2017 3:49 pm

Sam is right (Salut Sam). I asked myself the same questions and did some tests (see viewtopic.php?f=21&t=53584&p=308650&hilit=candle#p308650).

I still feel there is too much of a gap between the GTX 1060 (3 GB) and the 1080 (8 GB) in memory size, and it would be a shame to hold your 1080 back because of a 5 GB difference.

It is better to harmonize, even if the ideal is to have exactly the same cards. The best thing is to run a test and draw your own conclusions.

(Studio version only)
"Saying it is good, but doing it is better! "
Win10-1809 | Resolve Studio V16.1 | Fusion Studio V16.1 | Decklink 4K Extreme 6G | RTX 2080Ti 431.86 NSD driver! |
Offline
User avatar

Gosha Gusev

  • Posts: 7
  • Joined: Mon Jan 02, 2017 10:56 pm
  • Location: Russia, Cherepovets

Re: 2nd GPU performance improvements

PostWed Jan 18, 2017 11:42 am

Thank you, Sam.
Thank you, Jean.
At this rate my choice has fallen on two RX 480s :)
Offline

StephenNathan

  • Posts: 6
  • Joined: Sun Nov 20, 2016 3:13 am

Re: 2nd GPU performance improvements

PostThu Jan 19, 2017 9:30 am

If you want real performance, then start with an NVIDIA Quadro.

HOWEVER.

I have a feeling that (like for most of us) your disk speed is what's lacking in performance.

Do the science first. I've seen some older cards do heavy lifting :evil:
Offline
User avatar

Jean Claude

  • Posts: 2973
  • Joined: Sun Jun 28, 2015 4:41 pm
  • Location: France

Re: 2nd GPU performance improvements

PostThu Jan 19, 2017 6:06 pm

StephenNathan wrote:If you want real performance, then start with an NVIDIA Quadro.

HOWEVER.

I have a feeling that (like for most of us) your disk speed is what's lacking in performance.

Do the science first. I've seen some older cards do heavy lifting :evil:


Not sure... see
https://www.dcinema.me/nvidia-quadro-vs ... i-resolve/
:?
"Saying it is good, but doing it is better! "
Win10-1809 | Resolve Studio V16.1 | Fusion Studio V16.1 | Decklink 4K Extreme 6G | RTX 2080Ti 431.86 NSD driver! |
Offline
User avatar

Gosha Gusev

  • Posts: 7
  • Joined: Mon Jan 02, 2017 10:56 pm
  • Location: Russia, Cherepovets

Re: 2nd GPU performance improvements

PostFri Jan 20, 2017 2:21 am

StephenNathan wrote:If you want real performance, then start with an NVIDIA Quadro.

HOWEVER.

I have a feeling that (like for most of us) your disk speed is what's lacking in performance.

Do the science first. I've seen some older cards do heavy lifting :evil:


I am looking into this issue here:
viewtopic.php?f=18&t=55527
:)
Offline

Kel Philm

  • Posts: 607
  • Joined: Sat Nov 19, 2016 6:21 am

Re: 2nd GPU performance improvements

PostWed Mar 22, 2017 2:17 am

I have a GTX 1080 and I am looking to add a GTX 1060 (for GUI only) to my setup to improve performance in Resolve. Will adding the 1060 for GUI give much processing back to the GTX 1080? Is it worth it? Under this scenario I assume the 1080 will utilise the full 8GB?

I have fast drives/CPU and plenty of Ram and mostly use Resolve and Fusion.
Offline
User avatar

Erik Wittbusch

  • Posts: 482
  • Joined: Sun Aug 16, 2015 8:06 pm
  • Location: Duisburg, Germany

Re: 2nd GPU performance improvements

PostWed Mar 22, 2017 7:03 am

Using a GUI-only card doesn't add much to the performance, but it's worth a try.
I'd save for another 1080 and use both for compute.
Offline
User avatar

JPOwens

  • Posts: 1511
  • Joined: Fri Apr 12, 2013 8:04 pm
  • Location: Victoria, British Columbia, Canada

Re: 2nd GPU performance improvements

PostWed Mar 22, 2017 4:32 pm

Erik Wittbusch wrote:I'd save for another 1080 and use both for compute.


That's better advice.

You will have a better chance of a performance improvement if both the "horses" pulling the cart are the same size, because the big horse will always be limited to what the small horse can do.

Some day I will get a better handle on how the media is mapped out to the process GPUs' VRAM, but the conventional wisdom is that if you have dissimilar cards, tying them together to share UI and compute is a bad idea. Memory sharing is not cumulative. SLI is not an option for pooling memory in Resolve. Multiple compute GPUs do offer improved parallel processing. A separate UI GPU normally functions like the administration wing of a company overseeing the factory floor: not involved in making the product, but making sure the process is carried out as efficiently as instructed, so that it is waiting at the shipping dock when the delivery truck backs into the bay.

jPo, CSI
Offline

Kel Philm

  • Posts: 607
  • Joined: Sat Nov 19, 2016 6:21 am

Re: 2nd GPU performance improvements

PostWed Mar 22, 2017 6:54 pm

I had planned to use the 1060 for GUI only and the 1080 for compute only, which I assume would not cause the shared-memory issue since those are discrete functions. That said, I suspected the overhead of the UI would be pretty much insignificant anyway. I will follow the suggested advice and look at a second 1080. Thanks to you both.
Offline
User avatar

Craig Marshall

  • Posts: 949
  • Joined: Sun Apr 07, 2013 4:49 am
  • Location: Blue Mountains, Australia

Re: 2nd GPU performance improvements

PostWed Mar 22, 2017 9:00 pm

Justin Jackson wrote:Really? Wow, that just seems so odd to me. So in my case, the GPU just doesn't have anything plugged into it? Is the GUI not that demanding in terms of needing to update? It is an Intel HD Graphics 520... will that be fast enough to keep the UI smooth?

If I use the GPU as the only video out, does that slow down its use for rendering when it is also used for the GUI? Or does it not even use the GPU for rendering if it is the only output?


That's exactly how I have my system set up with Resolve (non Studio) - the i7 CPU's Intel GPU runs all monitors whilst the 16 lane GTX GPU is used exclusively for NLE & Resolve GPU applications.
4K Post Studio, Freelance Filmmaker, Media Writer
Win10/Lightworks/Resolve 15.1/X-Keys 68 Jog-Shuttle/OxygenTec ProPanel
12G SDI Decklink 4K Pro/Calibrated 10bit IPS SDI Monitor
HDvideo4K.com
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostTue Aug 22, 2017 3:30 am

So, months later, now that I realize this was replied to... I should use my m/b GPU for the video display, with nothing connected to the discrete GPU. At least for a pure editing rig with no other use (i.e. not used for gaming or anything). Is that right?
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline

Andrew Kolakowski

  • Posts: 9210
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 2nd GPU performance improvements

PostTue Aug 22, 2017 9:49 am

Barry Weckesser wrote:I have two Titan Xs, and both are set to be used for compute (one also drives the GUI). I'm grading Sony XAVC-I 4K video files (SLog3). I bought the second card in hopes of improving rendering speed to an intermediate codec for Edius 8 (Grass Valley HQX 4K), but there was no change whatsoever (still the same low 3 fps). During rendering I watch the GPU utilization: one is at about 20% while the other is at 5% or less. I get the same 3 fps from another computer with a single NVIDIA Quadro 4200.

I presume the dual cards only help with multiple nodes, noise reduction, etc., but not with rendering.


This is exactly the case: don't expect the GPU to help you with the actual decoding/encoding of ProRes, etc.
The latest Resolve Studio will use the GPU to decode some H.264 files, but only 4:2:0 8-bit ones (as far as I can tell). Packing two Titans into a machine with a relatively slow CPU won't help as much as you may think. The machine has to be balanced, taking into account the type of your main input assets; it's a somewhat different setup when you mainly work with prosumer HD camera sources than when you work with 6K RED.
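One way to see why the extra GPU didn't change Barry's 3 fps (a back-of-the-envelope model with made-up numbers, only to illustrate the bottleneck idea):

Code:
# Python sketch: delivery speed is capped by the slowest pipeline stage, so extra GPU
# compute doesn't help while CPU-based decode/encode of the codec is the slow step.
def render_fps(decode_fps, gpu_grade_fps, encode_fps):
    return min(decode_fps, gpu_grade_fps, encode_fps)

print(render_fps(decode_fps=3, gpu_grade_fps=12, encode_fps=10))  # 3 fps
print(render_fps(decode_fps=3, gpu_grade_fps=24, encode_fps=10))  # still 3 fps with twice the GPU power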
Offline

Justin Jackson

  • Posts: 670
  • Joined: Thu Apr 28, 2016 3:50 am

Re: 2nd GPU performance improvements

PostMon Aug 28, 2017 12:24 am

OK, so per your reply: would a Threadripper with 16 cores / 32 threads, 128GB RAM, a few NVMe drives, and two or three 1080 Tis work well? I was hoping to get near (or better than) real-time encoding of 4K video. My current 4-core i7 with a 1070 can render 4K to HD in almost real time. I would assume that with four times the cores and threads I could render 4K footage at near real-time speeds? Or is that wishful thinking? Of course I am doing very minimal FX, e.g. fade in/out and transitions between clips, nothing too fancy.
Custom DIY AMD1950x 16-core/32-thread, liquid cooled, 64GB 3600Mhz RAM, 950Pro-512GB NVMe os/apps, 2x500GB 850 Evo RAID 0 SATA3, Zotac 1070 8GB video, USB 3.1Gen2 RAID0 2x4TB, 2x2TB Crucial MX500 SSD SATA3.
Offline

Peter Chamberlain

Blackmagic Design

  • Posts: 13929
  • Joined: Wed Aug 22, 2012 7:08 am

Re: 2nd GPU performance improvements

PostMon Aug 28, 2017 1:23 am

There is some misinformation in this thread, so again for clarity:

The CPU is used to run the app, handle disk I/O, and compress and decompress codecs. If you have a Red Rocket for R3D files, that's the exception, as those clips are decompressed on the Rocket.

The GPU is used for image processing and the UI. You can use the same card for both, or have separate cards handle these separate tasks. If your cards are matched, i.e. two Titan Xp, two GTX 1080, two AMD FirePro W9100, two AMD R9 295X, etc., we recommend using both for compute, with the UI monitors connected to just one. If you add more GPUs, add the same model again.

RAM in one GPU is only used by that GPU. Processors in one GPU are only used by that GPU. If you have different GPU models with different specs, one of the GPUs will be faster and will complete its processing task sooner; it is therefore going to be waiting for the slow GPU. That won't stop it working, but it's not efficient.

If you have the same number of processors in each GPU but one GPU has more memory, then it can process higher-resolution images with more effects before filling its memory. The GPU with the smaller memory will exhaust its RAM sooner and you will see the GPU memory full message.

Resolve does not use SLI or other cross-GPU connections. We manage the GPUs directly.

So unbalanced GPUs in the best case waste processing performance of the faster GPUs and limit resolution and effects that can be processed down to the GPU with the smallest memory.
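To picture that waste (purely a toy model with invented per-frame timings, not Resolve internals):

Code:
# Python sketch: when a frame's work is split across mismatched GPUs, the frame is only
# done when the slowest card finishes, so the faster card idles for part of every frame.
def frame_time_ms(per_gpu_times_ms):
    return max(per_gpu_times_ms)

print(frame_time_ms([20, 20]))  # matched cards: 20 ms per frame
print(frame_time_ms([20, 35]))  # mismatched: 35 ms per frame; the fast card waits 15 ms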

If you have a low-power, low-memory GPU and would like to add a faster one, then, assuming the low-power GPU is sufficient to run Resolve by itself, make it the UI-only GPU, and get the fastest GPU with the most RAM you can afford to use just for processing.

If you are looking for more guidance, this is a good place to start.
viewtopic.php?f=21&t=62582
DaVinci Resolve Product Manager