Optimizing GPU Question

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Optimizing GPU Question

Posted: Sat Aug 06, 2022 8:40 am

I'm using DaVinci Resolve Studio V18.0.1 Build 3. My PC runs Windows 10 and has an integrated graphics card (GPU 0) and an NVIDIA GeForce GTX 1050 Ti graphics card (GPU 1). The BIOS is set up to use the NVIDIA card as the default GPU.

I often get GPU errors if I attempt to do anything other than light grading. One thing I noticed that really helps is closing any other apps that use the GPU; Chrome is guaranteed to make DR crash for me. But there are many services running in the background that also use the GPU and can't be shut off.

My question is about how to best optimize the GPU(s) I have in my system. Which should I select for DR and how should I partition them for the best DR performance?

Details
PC running Windows 10, 32 GB RAM
DaVinci Resolve Studio V18.0.1 Build 3
Integrated graphics card: Intel UHD Graphics 630, 1 GB RAM
Second graphics card: NVIDIA GeForce GTX 1050 Ti, 4 GB RAM
DR media storage, cache and proxy drive: 1 TB SSD

[Attachment: DR_GPU.jpg - My DR GPU settings]


I am assuming that the NVIDIA card is superior to the integrated graphics and that I should be selecting it as the GPU in DR. I have manually configured other applications to use the integrated GPU to reduce the load on the NVIDIA card. All drivers are up to date. Is this a reasonable approach? Is there some other optimization that could improve things?
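For what it's worth, the per-app assignment I did by hand (Settings > System > Display > Graphics settings) can also be scripted. Below is only a rough Python sketch based on my assumption that Windows 10 stores those choices as string values under the HKCU\Software\Microsoft\DirectX\UserGpuPreferences registry key, with "GpuPreference=1;" meaning power saving (integrated) and "GpuPreference=2;" meaning high performance (discrete); please double-check that on your own machine before relying on it.

Code: Select all
import winreg

# Hypothetical helper: pin an application to the integrated or discrete GPU by
# writing the same per-app preference that the Graphics settings page manages.
# Assumes the UserGpuPreferences key and the "GpuPreference=N;" value format.
def set_gpu_preference(exe_path, high_performance=False):
    value = "GpuPreference=2;" if high_performance else "GpuPreference=1;"
    key = winreg.CreateKey(
        winreg.HKEY_CURRENT_USER,
        r"Software\Microsoft\DirectX\UserGpuPreferences",
    )
    try:
        winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, value)
    finally:
        winreg.CloseKey(key)

# Example: push Chrome onto the integrated Intel GPU so the 1050 Ti stays free for Resolve.
set_gpu_preference(r"C:\Program Files\Google\Chrome\Application\chrome.exe")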

I understand the NVIDIA card I have is not high end, but I was never planning to do video editing when I got this PC. Also, the Wikipedia CUDA GPU support list shows a compute capability of 6.1 for this card, and I read that anything above 3.5 should be fine.
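If anyone wants to confirm the compute capability of their own card instead of looking it up, here is a small Python sketch; using the nvidia-ml-py ("pynvml") bindings is my own choice for illustration, not something Resolve requires.

Code: Select all
import pynvml

# Query the first NVIDIA GPU for its name and CUDA compute capability.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older pynvml versions return bytes
    name = name.decode()
major, minor = pynvml.nvmlDeviceGetCudaComputeCapability(handle)
print(f"{name}: compute capability {major}.{minor}")  # a GTX 1050 Ti should report 6.1
pynvml.nvmlShutdown()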

Thanks

Jones

Jim Simon

  • Posts: 30311
  • Joined: Fri Dec 23, 2016 1:47 am

Re: Optimizing GPU Question

Posted: Sat Aug 06, 2022 1:31 pm

Your settings are what I would choose and recommend.

But...4GB is kinda skimpy for VRAM.

I think your best option is to get a better card with more VRAM. I recommend nothing less than 8GB. I just picked up an RTX 3060 with 12GB for about $400.
My Biases:

You NEED training.
You NEED a desktop.
You NEED a calibrated (non-computer) display.

Andy Mees

  • Posts: 3259
  • Joined: Wed Aug 22, 2012 7:48 am

Re: Optimizing GPU Question

Posted: Sat Aug 06, 2022 1:48 pm

rschlierbeck wrote: Is there some other optimization that could improve things?
You don't mention anything about the source, edit and target resolutions you are working with, so it may be worth having a look at that and seeing what, if anything, you might be able to better control, in line with your existing hardware and expectations. For example, I've been working for some years now with a vanilla 6 GB GTX 1060. It is very far from state of the art (it's super slow compared to the latest and greatest), but I'm just churning out broadcast TV using mostly HD and 4K sources in an HD timeline for HD delivery... for that, my current 6 GB GPU memory footprint is more than adequate. If I were working on UHD timelines or larger, then I would probably want and/or need a card with more VRAM (and faster processing).

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Re: Optimizing GPU Question

Posted: Sat Aug 06, 2022 2:32 pm

Jim Simon wrote: Your settings are what I would choose and recommend.

But...4GB is kinda skimpy for VRAM.

I think your best option is to get a better card with more VRAM. I recommend nothing less than 8GB. I just picked up an RTX 3060 with 12GB for about $400.


Yes, I understand 4 GB is lacking. A better card is in my future, but for now I need to work with this one.

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Re: Optimizing GPU Question

Posted: Sat Aug 06, 2022 2:40 pm

Andy Mees wrote:
rschlierbeck wrote: Is there some other optimization that could improve things?
You don't mention anything about the source, edit and target resolutions you are working with, so it may be worth having a look at that and seeing what, if anything, you might be able to better control, in line with your existing hardware and expectations. For example, I've been working for some years now with a vanilla 6 GB GTX 1060. It is very far from state of the art (it's super slow compared to the latest and greatest), but I'm just churning out broadcast TV using mostly HD and 4K sources in an HD timeline for HD delivery... for that, my current 6 GB GPU memory footprint is more than adequate. If I were working on UHD timelines or larger, then I would probably want and/or need a card with more VRAM (and faster processing).


Sorry about the lack of information. I am working with 4K footage on HD timelines for editing. I generate proxies for all the clips I'm using in the media pool. Final renders are output at 4K 30 fps for YouTube, and those seem to work fine. It's the timeline editing that can have issues when I use any of the effects, like Contrast Pop for example. If I'm just doing simple grading, everything is fine.

At some point I'll look at a better graphics card but I need to manage with this one for now.

Jim Simon

  • Posts: 30311
  • Joined: Fri Dec 23, 2016 1:47 am

Re: Optimizing GPU Question

Posted: Sun Aug 07, 2022 2:19 pm

rschlierbeck wrote: I need to work with this one.
You might not be able to.
My Biases:

You NEED training.
You NEED a desktop.
You NEED a calibrated (non-computer) display.

Alex Silva

  • Posts: 435
  • Joined: Thu Jul 10, 2014 8:12 am

Re: Optimizing GPU Question

Posted: Sun Aug 07, 2022 5:21 pm

Why did you move to V18? V17 is still more reliable.

ZRGARDNE

  • Posts: 697
  • Joined: Sun May 16, 2021 12:32 am
  • Real Name: Zeb Gardner

Re: Optimizing GPU Question

Posted: Mon Aug 08, 2022 2:48 am

With my 4 GB GTX 1650 I came to realize that exporting 4K was impossible, so I only ever worked in 1080p timelines.

What resolution are you trying to export?

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Re: Optimizing GPU Question

Posted: Tue Aug 09, 2022 9:07 am

Jim Simon wrote:
rschlierbeck wrote: I need to work with this one.
You might not be able to.

That might very well be the case. Just investigating configurations that might make an improvement.

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Re: Optimizing GPU Question

Posted: Tue Aug 09, 2022 9:08 am

Alex Silva wrote: Why did you move to V18? V17 is still more reliable.

Rumors of performance improvements in V18. So far there is no difference in the incidence of GPU errors between the versions.

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Re: Optimizing GPU Question

Posted: Tue Aug 09, 2022 9:14 am

ZRGARDNE wrote: With my 4 GB GTX 1650 I came to realize that exporting 4K was impossible, so I only ever worked in 1080p timelines.

What resolution are you trying to export?

I'm exporting my final videos at 4K. So far exporting has not been an issue; it's only editing. I edit in 1080p timelines to keep things smooth and manageable. Light grading works fine, and most of the time that is all I need. On landscape scenes I like the Texture Pop or Contrast Pop effects, but I have to be very careful applying them, as this is where the GPU issues come into play. Again, light grading and sharpening in the color page are no problem.

rschlierbeck

  • Posts: 16
  • Joined: Sat Aug 06, 2022 7:54 am
  • Location: Galicia Spain
  • Real Name: Scott Hendershot

Re: Optimizing GPU Question

Posted: Tue Aug 09, 2022 9:17 am

So it seems like VRAM is the most critical component. Am I correct in thinking that the VRAM filling up is the real problem?

Thanks for all the help. I accept that I really need to get a better graphics card, but I'm trying to improve my understanding of the issues and how to address them. Sometimes throwing money at a problem is the answer, and sometimes it's just throwing money. I don't want to spend a grand on a new graphics card just to have the same problems.

Jones

Alex Silva

  • Posts: 435
  • Joined: Thu Jul 10, 2014 8:12 am

Re: Optimizing GPU Question

Posted: Tue Aug 09, 2022 9:30 am

You can check whether the VRAM is full with Windows Task Manager (Performance tab > GPU > Dedicated GPU memory).
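If you would rather log it over time than watch the graph, something like this rough Python sketch works too; the nvidia-ml-py ("pynvml") bindings are my own assumption for the example, not something Resolve needs.

Code: Select all
import time
import pynvml

# Poll the dedicated VRAM usage of the first NVIDIA GPU every few seconds.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 1024**2:.0f} / {mem.total / 1024**2:.0f} MiB")
        time.sleep(5)  # stop with Ctrl+C
finally:
    pynvml.nvmlShutdown()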

Uli Plank

  • Posts: 21767
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: Optimizing GPU Question

Posted: Tue Aug 09, 2022 11:32 am

Rather use GPU-Z (free).
Now that the cat #19 is out of the bag, test it as much as you can and use the subforum.

Studio 18.6.6, MacOS 13.6.6, 2017 iMac, 32 GB, Radeon Pro 580
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G

Gambitaw

  • Posts: 5
  • Joined: Thu Oct 20, 2022 5:00 pm
  • Real Name: Amir Yaakov Weiss

Re: Optimizing GPU Question

Posted: Tue Nov 01, 2022 9:58 am

Regarding a similar setup: I can't seem to choose the integrated card (I have an AMD one, which is not as bad as the Intel). DaVinci doesn't even let me select it, and when I change from CUDA to OpenCL it doesn't even open a project or let me choose the other card.

The reason is VRAM, of course; sometimes I would prefer to render on the bigger VRAM. Any clue?

Uli Plank

  • Posts: 21767
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: Optimizing GPU Question

Posted: Wed Nov 02, 2022 12:18 am

More details, please: which one is the other card, and how much VRAM do you have?
Now that the cat #19 is out of the bag, test it as much as you can and use the subforum.

Studio 18.6.6, MacOS 13.6.6, 2017 iMac, 32 GB, Radeon Pro 580
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G

Alex Silva

  • Posts: 435
  • Joined: Thu Jul 10, 2014 8:12 am

Re: Optimizing GPU Question

Posted: Fri Nov 04, 2022 10:47 pm

Intel integrated graphics is usually better than AMD's for DaVinci Resolve because of the Quick Sync video decoder/encoder. But if you don't have a good discrete card from NVIDIA or AMD, preferably with at least 8 GB of VRAM, your experience will be limited.
