5.3k GPU decoding support


samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

5.3k GPU decoding support

Posted: Fri Dec 16, 2022 11:33 pm

I've upgraded my GoPro and found that DaVinci Resolve can't GPU-decode the video, even though Adobe Premiere can.

The video is 10-bit 5.3K (5312x4648), and I can barely edit it: CPU usage on my 6-core CPU sits at 100%, and it mostly plays back at 10fps until it eventually stabilises at 30fps. That's with just one layer.

I'm running Resolve Studio. I've seen a few topics on this, and it seems that either Resolve can't handle anything above 4K, or it's deciding for me based on my system specs.

It has no issue with 10-bit 4000x3000; I get 5% CPU usage and 30% GPU usage on that type of video.

I have 16GB of RAM and an Nvidia 1080; I'd be curious whether upgrading either of these would make Resolve decide to GPU-decode the video.

CougerJoe

  • Posts: 330
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 12:54 am

I tried with a 30-series Nvidia card; it decodes fine.

Is it possible your 1080 doesn't decode 10-bit HEVC, or that it doesn't decode 10-bit HEVC above 4096x4096?
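
If you want to check that directly rather than guess, here's a rough sketch (untested, and it assumes you have the CUDA driver API plus the NVIDIA Video Codec SDK header nvcuvid.h installed) that asks NVDEC what it will accept for 10-bit 4:2:0 HEVC:

Code: Select all
/* Query NVDEC capabilities for HEVC Main10 (10-bit 4:2:0).
   Build roughly like: gcc nvdec_caps.c -o nvdec_caps -lcuda -lnvcuvid */
#include <stdio.h>
#include <cuda.h>
#include <nvcuvid.h>

int main(void) {
    CUdevice dev;
    CUcontext ctx;

    cuInit(0);
    cuDeviceGet(&dev, 0);            /* first GPU */
    cuCtxCreate(&ctx, 0, dev);       /* the caps query needs a current context */

    CUVIDDECODECAPS caps = {0};
    caps.eCodecType      = cudaVideoCodec_HEVC;
    caps.eChromaFormat   = cudaVideoChromaFormat_420;
    caps.nBitDepthMinus8 = 2;        /* 10-bit */

    if (cuvidGetDecoderCaps(&caps) == CUDA_SUCCESS) {
        printf("HEVC 10-bit 4:2:0 supported: %d\n", caps.bIsSupported);
        printf("Max size: %u x %u, max macroblocks: %u\n",
               caps.nMaxWidth, caps.nMaxHeight, caps.nMaxMBCount);
    }

    cuCtxDestroy(ctx);
    return 0;
}

If bIsSupported comes back as 1 with a max size above 5312x4648, the chip itself isn't the limit and it's something on Resolve's side.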

samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 1:01 am

CougerJoe wrote:I tried with a 30-series Nvidia card; it decodes fine.

Is it possible your 1080 doesn't decode 10-bit HEVC, or that it doesn't decode 10-bit HEVC above 4K?


Since it works perfectly in Adobe Premiere, PotPlayer and MPC-HC, where I can see in Task Manager that it's all on the GPU, it seems the 1080 has no issue with it.

Uli Plank

  • Posts: 21292
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 3:16 am

Some PC cards are limited to UHD in their HEVC decoding capability.
5.3K footage from my drone decodes smoothly on Apple silicon.
No, an iGPU is not enough, and you can't use HEVC 10 bit 4:2:2 in the free version.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU

samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 7:49 am

Uli Plank wrote:Some PC cards are limited to UHD in their HEVC decoding capability.
5.3K footage from my drone decodes smoothly on Apple silicon.


It's definitely not an issue with the GPU itself.

It seems the issue is DaVinci deciding that my GPU isn't capable of it.

That seems to be backed up by a few other comments I've found about the issue.

I'm just not sure what to do about it, as there's no clear indication of what the software wants.

Uli Plank

  • Posts: 21292
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 7:54 am

Well, you could always transcode externally.
No, an iGPU is not enough, and you can't use HEVC 10 bit 4:2:2 in the free version.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU

samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 8:04 am

Uli Plank wrote:Well, you could always transcode externally.


That doesn't seem like a great option for quality or for color correction/grading, never mind the insane amount of time it would take.

I'm more looking to either get the issue fixed or identify the hardware that DaVinci seems hardcoded to require.

CougerJoe

  • Posts: 330
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 9:47 am

samfisher5986 wrote:
I'm more looking to either get the issue fixed or identify the hardware that DaVinci seems hardcoded to require.


So 4K 10-bit HEVC uses the decoder but 5.3K 10-bit HEVC doesn't. I guess it could be a lack of VRAM. What happens if you start a new 4K project and bring in the video with nothing else? Does it play smoothly?
Also check Task Manager before opening Resolve; you want to make sure nothing else is using VRAM.

If it were an 8GB VRAM problem, I wouldn't expect to see it with a single clip; bring in many 5.3K clips and it's more plausible that Resolve would turn off the decoder.
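
If you'd rather see an exact number than eyeball Task Manager, a small NVML program will print it (untested sketch; it assumes the nvml.h header and the management library that ship with the NVIDIA driver/CUDA toolkit):

Code: Select all
/* Print total/used/free VRAM for GPU 0 via NVML.
   Build roughly like: gcc vram_check.c -o vram_check -lnvidia-ml */
#include <stdio.h>
#include <nvml.h>

int main(void) {
    nvmlDevice_t dev;
    nvmlMemory_t mem;

    if (nvmlInit() != NVML_SUCCESS) return 1;
    if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS &&
        nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS) {
        printf("VRAM total: %llu MB, used: %llu MB, free: %llu MB\n",
               mem.total >> 20, mem.used >> 20, mem.free >> 20);
    }
    nvmlShutdown();
    return 0;
}

Run it right before launching Resolve; if something else is already holding a couple of GB, that's VRAM Resolve never gets to use.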

samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

Re: 5.3k GPU decoding support

Posted: Sat Dec 17, 2022 12:05 pm

So I did some tests:

3x 4000x3000 10-bit clips, each on its own layer, stacked on top of each other and made transparent so all clips were visible at once.

VRAM: 7500MB Used (500MB Free)
GPU Usage: 80%
CPU Usage: 7%

This performs very well, and it's well beyond my normal usage of 1-2 clips at once.


4x 4000x3000 10-bit clips:
GPU Usage: 100%
CPU Usage: 100%


This indicates to me that DaVinci correctly decides that I'm unable to render more than 3x 4000x3000 clips on my GPU, which is great: it's working as intended.


This does, however, support my theory that it incorrectly assumes I can't run a single 5312x4648 clip (we already know I can, since Premiere and video players decode it on the GPU).

5.3k = 24,690,176 pixels
3x 4000x3000 = 36,000,000 pixels
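
Just to sanity-check my own numbers, this is the throwaway calculation I ran (the ~3 bytes per pixel figure is my assumption that the decoder hands back 10-bit 4:2:0 as 16-bit P010-style surfaces; Resolve doesn't document any of this):

Code: Select all
/* Compare pixel counts and rough decoded-surface sizes.
   Assumes ~3 bytes/pixel for 10-bit 4:2:0 stored as 16-bit samples (P010-style). */
#include <stdio.h>

static void show(const char *name, long w, long h, int clips) {
    long px = w * h * clips;
    printf("%-16s %10ld px  ~%3ld MB of decoded frames\n",
           name, px, px * 3 / (1024 * 1024));
}

int main(void) {
    show("1x 5312x4648", 5312, 4648, 1);  /* the single 8:7 GoPro clip */
    show("3x 4000x3000", 4000, 3000, 3);  /* the three stacked 4:3 clips */
    return 0;
}

So by both pixel count and rough per-frame size, the single 5.3K clip is the smaller workload, yet it's the one that gets kicked to the CPU.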

CougerJoe wrote:So 4K 10bit HEVC uses the decoder but not 5.3K 10bit HEVC. I guess it could be a lack of VRAM.


There's definitely enough VRAM, and from my testing of freeing up VRAM so that as much of the 8GB as possible is available to Resolve, it seems Resolve won't even try to use it with 5.3K.



CougerJoe wrote: What happens if you start a new 4K project, bring in the video with nothing else, does that play smoothly?


That was my original test anyway; it's a brand new project with nothing but the added clip.


----------

Also, just checking my DaVinci Resolve logs:

NVDEC decodes HEVC, chroma 4:2:0, bitdepth 10, upto 8192 x 8192

So I should be fine, as my content is 4:2:0.

----------

I've done some new testing and got some very strange results:

4:3 - 5312x3984 = GPU Decode
16:9 - 5312x2988 = CPU Decode
8:7 - 5312x4648 = CPU Decode

I used both a 25fps and a 30fps clip, and both only GPU-decoded at 4:3 (5312x3984).

Which is strange, since the 4:3 clip has more pixels than the 16:9 one.

ZRGARDNE

  • Posts: 684
  • Joined: Sun May 16, 2021 12:32 am
  • Real Name: Zeb Gardner

Re: 5.3k GPU decoding support

Posted: Sun Dec 18, 2022 3:17 pm

This topic has come up before.

Blackmagic has coded in limits where Resolve will refuse to do hardware decoding based on resolution and VRAM.

They have not documented where these limits are, or allowed users to change them.

The basis is no doubt VRAM management, but it has a negative impact on some users because it results in much slower performance.

samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

Re: 5.3k GPU decoding support

Posted: Sun Dec 18, 2022 10:55 pm

ZRGARDNE wrote:This topic has come up before.

Blackmagic has coded in limits where Resolve will refuse to do hardware decoding based on resolution and VRAM.

They have not documented where these limits are, or allowed users to change them.

The basis is no doubt VRAM management, but it has a negative impact on some users because it results in much slower performance.


I guess that means I just need to buy a new graphics card. It's a shame, though, as they're so expensive at the moment.

Uli Plank

  • Posts: 21292
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 5.3k GPU decoding support

Posted: Mon Dec 19, 2022 12:22 am

It’s slowly getting better.
No, an iGPU is not enough, and you can't use HEVC 10 bit 4:2:2 in the free version.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU

Alex Silva

  • Posts: 435
  • Joined: Thu Jul 10, 2014 8:12 am

Re: 5.3k GPU decoding support

Posted: Mon Dec 19, 2022 2:38 am

It's also possible that Resolve uses the GPU more than Premiere does, and that puts it near the VRAM limit.

samfisher5986

  • Posts: 8
  • Joined: Tue May 25, 2021 6:26 am
  • Location: Australia
  • Real Name: Sam Fisher

Re: 5.3k GPU decoding support

Posted: Wed Mar 22, 2023 11:28 am

In case it helps anyone else: I've since purchased an Nvidia 4080 16GB and the problem is 99% resolved, with the very rare hiccup, mostly on the first playthrough. From the 11GB of video memory usage, it's clear my 8GB 1080 was not enough.

I still think something is wrong with how DaVinci is handling it, though. For one 5.3K 10-bit video simply to play back, with no effects or layers, it needs:

11GB of video memory on my Nvidia GPU.
9GB of system memory from DaVinci alone (you basically need 24GB+ of system memory).
70-95% usage of my Nvidia 4080.

When you consider I was easily editing 10-bit 4K videos on my old Nvidia 1080... I'm not sure how this makes sense.
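
As a back-of-the-envelope check on why the numbers balloon (the "float RGB working frame" part is purely my guess at how Resolve handles frames internally; it isn't documented anywhere I've seen):

Code: Select all
/* Rough per-frame memory for one 5312x4648 clip.
   Assumptions (mine, not Blackmagic's): decoded surfaces are P010-style
   10-bit 4:2:0 (~3 bytes/px) and processing happens on full-resolution
   32-bit float RGB frames (12 bytes/px). */
#include <stdio.h>

int main(void) {
    const long px = 5312L * 4648L;          /* 24,690,176 pixels */
    const double MB = 1024.0 * 1024.0;

    printf("Decoded 10-bit 4:2:0 surface: ~%.0f MB/frame\n", px * 3 / MB);
    printf("Float RGB working frame:      ~%.0f MB/frame\n", px * 12.0 / MB);
    printf("8 decode surfaces + 8 working frames: ~%.1f GB\n",
           (px * 3.0 + px * 12.0) * 8 / (MB * 1024.0));
    return 0;
}

It doesn't account for everything Resolve allocates, but each buffered 5.3K frame comes out around three times the size of a UHD one in this model, so cached and in-flight frames add up much faster than they did at 4K.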

If GoPro releases an 8K camera, I have a feeling my Nvidia 4080 is not going to be powerful enough.

Uli Plank

  • Posts: 21292
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 5.3k GPU decoding support

Posted: Wed Mar 22, 2023 2:15 pm

Of course not. 8K has four times as many pixels as 4K.
No, an iGPU is not enough, and you can't use HEVC 10 bit 4:2:2 in the free version.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU

ZRGARDNE

  • Posts: 684
  • Joined: Sun May 16, 2021 12:32 am
  • Real Name: Zeb Gardner

Re: 5.3k GPU decoding support

Posted: Thu Mar 23, 2023 4:33 am

samfisher5986 wrote:
If GoPro releases an 8K camera, I have a feeling my Nvidia 4080 is not going to be powerful enough.



You would still edit in a 4K timeline. One of the first steps in the Resolve pipeline is to scale the footage down to match the timeline resolution, so the VRAM usage wouldn't change significantly.

Uli Plank

  • Posts: 21292
  • Joined: Fri Feb 08, 2013 2:48 am
  • Location: Germany and Indonesia

Re: 5.3k GPU decoding support

Posted: Thu Mar 23, 2023 5:05 am

That's correct; I can actually work with 12K BRAW in a UHD timeline on a MacBook Air.
But as 8K TVs arrive, folks might want to output at that resolution. There are already large exhibition presentations where a client might want 8K, and there's 360-degree video. That's how I understood the question.
No, an iGPU is not enough, and you can't use HEVC 10 bit 4:2:2 in the free version.

Studio 18.6.5, MacOS 13.6.5
MacBook M1 Pro, 16 GPU cores, 32 GB RAM and iPhone 15 Pro
Speed Editor, UltraStudio Monitor 3G, iMac 2017, eGPU
