Reason why DR 20 is embracing nVidia and not AMD?

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Reason why DR 20 is embracing nVidia and not AMD?

Posted: Thu Apr 17, 2025 2:54 pm

My primary editing computers are all AMD (2x 9950X3D, 1x 9950X, 1x 7950X), and on the GPU side I run a 9070 XT OC, 7900 XTX, 3090, and 4070 Ti Super. When it comes to AI processing (Topaz Video AI), the 9070 XT OC and 7900 XTX are significantly faster than any of my nVidia GPUs.

My concern is that BM seems to focus more on the nVidia side of the fence rather than making at least an equal investment in AMD.

The latest 5000 series from nVidia are frankly horrible, incredibly unstable, and run dangerously HOT (ignoring the scalper $4000+ prices) … and the Quadros are equally horrible (unstable and inconsistent).

ROCm is a FAR better solution for video editing professionals because it is open source and not restricted to whatever nVidia decides is good for us.

AMD CPUs clean house when it comes to overall performance, and AMD's latest 9070 XT is a diamond in the rough … the only GPU I've used that is faster than the 9070 XT is the nVidia 5090 (when it works). The 5080 (assuming it has all its ROPs) is equal to the 9070 XT but costs twice as much.

I hope BM reconsider the path they've chosen.

qiyuxuan

  • Posts: 7
  • Joined: Thu Apr 17, 2025 3:08 pm
  • Real Name: Yuxuan Qi

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Thu Apr 17, 2025 3:20 pm

Well, this kind of hardware optimization usually requires the GPU company to work with the software developer, and Nvidia has more manpower to do that. AMD GPUs have always suffered on software optimization, in both gaming and productivity, sometimes relying on open-source projects to improve performance. The new video encoder and decoder on the RTX 5090 and 5080 are hardware features that can decode and encode H.264 and H.265 with 4:2:2 sampling. Both Intel and Nvidia GPUs can do it now, but AMD's GPUs still lack this hardware feature.
AMD Ryzen 9 5900X
Nvidia RTX 5080
64g Ram
Resolve Studio 20 Beta

VMFXBV

  • Posts: 801
  • Joined: Wed Aug 24, 2022 8:41 pm
  • Real Name: Andrew I. Veli

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Thu Apr 17, 2025 4:41 pm

Careful now, you might get scolded by some local AMD haters for even mentioning AMD.

As for the real issue, I think it's harder to code for OpenCL or ROCm than it is to simply drop in CUDA libraries.

It's also AMD's fault to a degree. They should work with BM to provide support for ROCm or HIP. They just released it for Windows and it's not in good shape. Maybe in a few versions?

If they truly want a universal approach, they should implement Vulkan.

As for the haters, it's their business in the end. I will enjoy my 7900 XTX, as it does everything I need it to do.
AMD Ryzen 5800X3D
AMD Radeon 7900XTX
Ursa Mini 4.6K
Pocket 4K

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Thu Apr 17, 2025 5:35 pm

I tend to go with the best tool for the job regardless, and currently that's squarely AMD. My AMD CPUs run circles around Intel's best.

Sure, ROCm has less of a history, but it's supported by many more developers and is more flexible. CUDA is limited to nVidia only (not a dependence I want, $$$). AMD has reached out to PyTorch, TensorFlow, MosaicML, and others … ROCm covers much the same toolset for AI and HPC development as CUDA.

Agreed, AMD could have worked harder at getting ROCm into a better position. But ROCm is the better long-term fit for DR. Why trust a company like nVidia, which paper-launches GPUs (on purpose), claims its GPUs can be "pre-ordered" though no one knows how, sets up a VPA program so "normal people" can get GPUs yet no one has actually been notified, denies the missing-ROPs issue and then confirms it, dismisses power-management and burning-cable issues as "user error", and seems unable to release drivers that actually work? This is NOT a company I want to be reliant on for video/audio processing.

I don't see nVidia able or willing to keep producing GPUs for "the rest of us" while they deliver to their AI customers, and I highly doubt "video/audio editors" are even a speck of consideration in their future plans. It just seems like a very short-sighted proposition to dump all of one's eggs into the nVidia basket.

I hope BM rethink the nVidia path and at least put the same effort into the AMD route.

Rob.

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 12:05 am

qiyuxuan wrote:Well, this kind of hardware optimization usually requires the GPU company to work with the software developer, and Nvidia has more manpower to do that. AMD GPUs have always suffered on software optimization, in both gaming and productivity, sometimes relying on open-source projects to improve performance. The new video encoder and decoder on the RTX 5090 and 5080 are hardware features that can decode and encode H.264 and H.265 with 4:2:2 sampling. Both Intel and Nvidia GPUs can do it now, but AMD's GPUs still lack this hardware feature.


That's false; even RDNA 3 can decode H.264/H.265. From AMD's website:

https://www.amd.com/en/products/graphic ... 20content1.

Not only can it, the 9070 XT does it A LOT faster.

bclontz

  • Posts: 61
  • Joined: Sat Jul 13, 2019 4:10 pm
  • Real Name: Bill Clontz

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 1:19 am

The fact is, AMD's ecosystem around AI flat-out sucks, because they haven't put in the effort that Nvidia has. ROCm isn't even officially supported on Windows. Even on Linux, support is limited to specific cards. Their newest RX 9000 cards aren't even supported in ROCm yet, despite being out for over a month!

Just look at any of the open-source tools around AI, such as diffusion image generation and the like … everyone running those types of tools knows not to buy an AMD card, because it's nearly impossible to get working a lot of the time, and when you do jump through the hoops to get it working, it's still slower than an Nvidia card.

I WISH AMD were a viable alternative to Nvidia, but they really dropped the ball.

qiyuxuan

  • Posts: 7
  • Joined: Thu Apr 17, 2025 3:08 pm
  • Real Name: Yuxuan Qi

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 1:38 am

Rob Ainscough wrote:
qiyuxuan wrote:Well, this kind of hardware optimization usually requires the GPU company to work with the software developer, and Nvidia has more manpower to do that. AMD GPUs have always suffered on software optimization, in both gaming and productivity, sometimes relying on open-source projects to improve performance. The new video encoder and decoder on the RTX 5090 and 5080 are hardware features that can decode and encode H.264 and H.265 with 4:2:2 sampling. Both Intel and Nvidia GPUs can do it now, but AMD's GPUs still lack this hardware feature.


That's false; even RDNA 3 can decode H.264/H.265. From AMD's website:

Not only can it, the 9070 XT does it A LOT faster.


It can decode some H.265 or H.264 streams, but not with 4:2:2 chroma sampling. Macs have been able to for a long time, and Intel's 11th-gen iGPU was the first on PC. Nvidia and AMD both supported only 4:2:0 and 4:4:4 until the latest RTX 5000 series. You can do a simple test: export a clip twice using Resolve's H.265 preset, once with the normal H.265 profile and once with the H.265 Main 10 4:2:2 profile, then import both back into your timeline and test playback. The normal H.265 clip should be decoded by your GPU, while the 4:2:2 one is CPU-only. Today most new mirrorless cameras use H.265 4:2:2, so it's kind of a big deal.
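If you'd rather not eyeball load graphs, the difference also shows up in the clip's pixel format tag (e.g. ffprobe reports yuv420p10le vs yuv422p10le). A rough sketch of sorting clips that way; note the "support table" below only encodes the claims made in this thread (Intel 11th-gen+ iGPUs and RTX 5000 handle H.26x 4:2:2 in hardware; earlier Nvidia and current AMD consumer cards don't), not an authoritative matrix, so verify against your own GPU:

```python
# Classify clips by pixel-format string (as reported by e.g. ffprobe) and
# flag likely CPU-fallback cases. Support claims are from this thread only.

def parse_pix_fmt(pix_fmt: str):
    """Split a pix_fmt like 'yuv422p10le' into (subsampling, bit_depth)."""
    sub = {"420": "4:2:0", "422": "4:2:2", "444": "4:4:4"}
    for key, name in sub.items():
        if key in pix_fmt:
            depth = 10 if "10" in pix_fmt.split(key, 1)[1] else 8
            return name, depth
    raise ValueError(f"unrecognized pix_fmt: {pix_fmt}")

def hw_decode_likely(subsampling: str, gpu: str) -> bool:
    """True if the GPU family is claimed (in this thread) to decode it in hardware."""
    if subsampling == "4:2:2":
        return gpu in {"intel_11th_gen_plus", "rtx_5000"}
    return True  # 4:2:0 / 4:4:4 are widely hardware-decoded

print(parse_pix_fmt("yuv422p10le"))            # ('4:2:2', 10)
print(hw_decode_likely("4:2:2", "amd_rdna3"))  # False -> expect CPU decode
```

The hypothetical GPU-family labels ("amd_rdna3" etc.) are just illustrative tags, not anything Resolve or ffprobe reports.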
AMD Ryzen 9 5900X
Nvidia RTX 5080
64g Ram
Resolve Studio 20 Beta

VMFXBV

  • Posts: 801
  • Joined: Wed Aug 24, 2022 8:41 pm
  • Real Name: Andrew I. Veli

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 2:37 am

bclontz wrote:The fact is, AMD's ecosystem around AI flat-out sucks, because they haven't put in the effort that Nvidia has. ROCm isn't even officially supported on Windows. Even on Linux, support is limited to specific cards. Their newest RX 9000 cards aren't even supported in ROCm yet, despite being out for over a month!

Just look at any of the open-source tools around AI, such as diffusion image generation and the like … everyone running those types of tools knows not to buy an AMD card, because it's nearly impossible to get working a lot of the time, and when you do jump through the hoops to get it working, it's still slower than an Nvidia card.

I WISH AMD were a viable alternative to Nvidia, but they really dropped the ball.


But this is Resolve, and Resolve works on FP32 (i.e. not AI) for the majority of things. AMD cards work fine in it (including the AI parts), but that doesn't mean they couldn't receive some love from BMD.

Also, ROCm/HIP works on Windows … and let's not confuse lack of official support with cards not working at all.
AMD Ryzen 5800X3D
AMD Radeon 7900XTX
Ursa Mini 4.6K
Pocket 4K

Jack Takashi

  • Posts: 34
  • Joined: Sat Feb 22, 2025 7:08 pm
  • Real Name: Jack Takashi

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 4:14 am

My guess, going by how closely recent versions of Resolve have been tied to Nvidia driver versions, is that they develop closely with the Nvidia devs. Not a genius answer, I know.

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 6:02 am

According to the link I provided above, BM worked closely with AMD. I would just like to see BM leverage more of AMD in its products.

I honestly doubt nVidia will be around much longer on the GPU scene (way too much power consumption). Their "evolution" process seems to be more of a sledgehammer approach (i.e. more power) than architectural efficiency of design.

The nVidia 50-series cards clearly demonstrate a complete lack of interest in consumer products. Sure, the GPU has tensor units so it can double as an NPU, but the power efficiency is horrible … like, really horrible. A dedicated NPU can do it faster and use a lot less power.

Nick2021

  • Posts: 903
  • Joined: Thu May 13, 2021 3:19 am
  • Real Name: Nick Zentena

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 7:16 am

Rob Ainscough wrote:
qiyuxuan wrote:Well, this kind of hardware optimization usually requires the GPU company to work with the software developer, and Nvidia has more manpower to do that. AMD GPUs have always suffered on software optimization, in both gaming and productivity, sometimes relying on open-source projects to improve performance. The new video encoder and decoder on the RTX 5090 and 5080 are hardware features that can decode and encode H.264 and H.265 with 4:2:2 sampling. Both Intel and Nvidia GPUs can do it now, but AMD's GPUs still lack this hardware feature.


That's false; even RDNA 3 can decode H.264/H.265. From AMD's website:

https://www.amd.com/en/products/graphic ... 20content1.

Not only can it, the 9070 XT does it A LOT faster.


4:2:2 is the issue.

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 6:45 pm

qiyuxuan wrote:It can decode some H.265 or H.264 streams, but not with 4:2:2 chroma sampling.


Not sure where you are getting your information, but my AMD 9070 XT OC can encode/decode H.265 4:2:2. Not sure what mirrorless has to do with anything? You don't have to use a mirrorless camera to do 4:2:2.

Ran your test: all GPU, no CPU … even monitored loads externally to confirm.

kinvermark

  • Posts: 758
  • Joined: Tue Apr 16, 2019 5:04 pm
  • Real Name: Mark Wilson

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 7:52 pm

It is H.264 4:2:2 10-bit that is missing from most GPUs and CPU on-board decoders.

Not sure it really is the "must have" that people claim, but it is a commonly used "high-end" camera format.

I use proxies anyway, so no biggie.
Windows 11 laptop. Intel i7-10750H, 32GB RAM, Nvidia 4070 ti Super eGPU, SSD disks. Resolve Studio (latest)

Nick2021

  • Posts: 903
  • Joined: Thu May 13, 2021 3:19 am
  • Real Name: Nick Zentena

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Fri Apr 18, 2025 8:24 pm

Proxies take time to create. If the decoder handles the format, you don't need that step.

Decoding on the CPU takes up a lot of CPU. Resolve isn't really a CPU hog, but the difference is still enough to force people up at least one step in CPU.

HEVC 4:2:2 is for 10-bit colour. Non-issue for RAW or 8-bit HEVC, but most people don't want to shoot 8-bit.
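To put a number on the extra load: quick back-of-envelope arithmetic (my own, not from this thread) on uncompressed frame sizes shows the decoder has roughly two-thirds more sample data per frame with 10-bit 4:2:2 than with 8-bit 4:2:0:

```python
# Uncompressed frame size for a given chroma subsampling and bit depth.
# Chroma factor = chroma samples (Cb + Cr combined) per luma sample.
CHROMA_PER_LUMA = {"4:2:0": 0.5, "4:2:2": 1.0, "4:4:4": 2.0}

def bytes_per_frame(width, height, subsampling, bit_depth):
    samples = width * height * (1 + CHROMA_PER_LUMA[subsampling])
    return int(samples * bit_depth / 8)

uhd_420_8  = bytes_per_frame(3840, 2160, "4:2:0", 8)   # 12,441,600 bytes
uhd_422_10 = bytes_per_frame(3840, 2160, "4:2:2", 10)  # 20,736,000 bytes
print(uhd_422_10 / uhd_420_8)  # -> 1.666... x more data per decoded frame
```

Compressed bitrates won't scale exactly like this, of course, but it gives a feel for why 4:2:2 10-bit without hardware decode pushes people up a CPU tier.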

Shz3al

  • Posts: 9
  • Joined: Sat Feb 15, 2025 1:10 am
  • Real Name: Daniel Rammelt

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Sat May 10, 2025 12:55 am

I just switched from a 5080 to a 9070 XT due to all the stupid issues with the 5080 driver on Linux. I even considered Windows, it was that bad, but I had similar issues when I installed there too. So I thought I'd give ROCm another go (I tried it on the 7900 XTX before the 5080 and gave up, the performance was so bad).

The 9070 XT is on par for gaming performance on Linux, with zero issues with the desktop, GPU crashes, OOM, etc. But I still had issues with ROCm: on Windows I could play a basic timeline (30fps, edge sharpen + dehaze) of 4K@120fps 4:2:0 footage with no issues, but on Linux, even with proxies, I would get random sub-30fps frame rates (usually 18-24fps) with offset audio.

I had tried Ubuntu, Debian, Arch, and Gentoo, but I always used a newer kernel (6.14) / Mesa 25 to support the 9070, and just installed ROCm without the official amdgpu-dkms drivers. And I always used the GNOME desktop, as that is what I am used to.

I fixed it by RTFM and some experimentation:
- Installed an Ubuntu 24.04 distro without GNOME (KDE Neon; Cosmic A7 was also OK), as GNOME screws with fps in Resolve and other things when using fractional scaling.
- Used the stock 6.11 kernel (the 6.12 mainline kernel is also fine; anything newer breaks amdgpu-dkms, and any version of the xanmod kernel breaks something, as fps is terrible with it even when the dkms module compiles OK).
- Installed the ROCm 6.4 repo using the official guide and finally ran "apt install rocm amdgpu-dkms".
- Manually selected the GPU in DaVinci Resolve (probably only a real problem with Pro cards since they expose the BAR device, but it would still give a warning if something changed).
- Ensured that the device in DaVinci Resolve is "AMD Radeon RX 9070 XT" and not some generic device name like "AMD Radeon Discrete GPU".

Now I can run the timeline at 30fps, with effects, without proxies, the same as, if not smoother than, on Windows! :D
Games run about the same as far as I can tell; mainly tested in SoTR, and it was within 1% of the 6.14/Mesa 25 setup.

I don't use many of the AI features in my projects, and since the stupid 5080 drivers would just crash the DR neural engine, I am not missing out on anything anyway.

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Mon May 12, 2025 6:45 pm

I managed to get a PNY 5090 OC for $1899 US and ordered an Alphacool waterblock for it … but thanks for your detailed response and for listing what worked for you.

I am getting a lot of stutters in DR 20 B3 on my AMD cards (7900 XTX and 9070 XT OC). Let's see if nVidia is a solution or just more false information.

Shz3al

  • Posts: 9
  • Joined: Sat Feb 15, 2025 1:10 am
  • Real Name: Daniel Rammelt

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Mon May 12, 2025 7:46 pm

Rob Ainscough wrote:I managed to get a PNY 5090 OC for $1899 US and ordered an Alphacool waterblock for it … but thanks for your detailed response and for listing what worked for you.

I am getting a lot of stutters in DR 20 B3 on my AMD cards (7900 XTX and 9070 XT OC). Let's see if nVidia is a solution or just more false information.


NVIDIA will fix DR for sure; with AMD you have to use render caching if you add any text or AI features on Linux (not so on Windows; on Windows AMD is perfect). And the 5090 doesn't have the same level of GPU RAM issues as the 5080, I would guess, but I would still restart often just to avoid them if possible; if you check the NVIDIA Linux forums it is a pretty hot topic right now in the driver threads.

It's a trade-off either way: the 5090 is definitely the best consumer card for DR right now, but Nvidia also has the worst driver issues right now (and for the last year or so, and it is only getting worse). And the cost $$$$$$ :o

(But I am very jealous all the same! :mrgreen: )

AMD is definitely the best value-for-money card for DR right now and has next to no major driver issues, except for ROCm being a PITA to install outside of select distros; it struggles with Fusion on Linux but not on Windows.

So if AMD OpenCL works fine on Windows, and Nvidia CUDA works fine on both Windows and Linux … then it must be that DR on Linux is not optimized for AMD.

I have already made a complaint to CS about this; they pretty much just said to use Rocky Linux 8.6 and the official drivers. Rocky 8.6 is so old it does not even support my CPU, and it would not make any difference, since it is not the OS that is at fault, it is the software. In the end they said "I will pass on your comments to the development team," so … :?

For whatever reason it takes like a week for mods to approve posts, so since then I have tried CachyOS, as Arch just got ROCm 6.4.0 in the testing repos, and after checking the Arch Linux PKGBUILD, they do a really vanilla install of ROCm, including the frankenstein AMD LLVM. I installed it and initially had the same poor ~24fps on a 30fps timeline (maybe a caching issue, as I had 6.3.3 installed and tried that first); after a restart it is now a solid 30fps, even with proxies/render caching disabled, on the latest kernel/Mesa. So it seems it comes down more to ROCm 6.4.0 and how it is built than to the kernel/amdgpu-dkms.

Anyway … again, very jealous of the 5090 + waterblock. If your mobo has external sensors, I would hook up a temp sensor on the power input of the card, though. Or maybe just direct some airflow over the PSU/card connections?

EDIT: No, it seems restarting the OS fixes the low-performance issue on Arch/CachyOS Linux, but closing and then starting DR (20 B3) again causes it to drop frames again. A full restart is required to get normal performance back, and after starting then closing DR, the frame drops always come back. It is the same on Ubuntu 24.04 and CachyOS.

EDIT 2: Confirmed this restart performance issue is in 20 B3; 19.1.4 does not have it at all. With 20 B3, the more you restart, the more performance you lose: after one restart, timeline fps drops from 30 to 25-ish; after about five restarts, performance was down to ~17fps max on a 30fps timeline.

CougerJoe

  • Posts: 578
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Tue May 13, 2025 12:16 am

Rob Ainscough wrote:
qiyuxuan wrote:It can decode some H.265 or H.264 streams, but not with 4:2:2 chroma sampling.


Not sure where you are getting your information, but my AMD 9070 XT OC can encode/decode H.265 4:2:2. Not sure what mirrorless has to do with anything? You don't have to use a mirrorless camera to do 4:2:2.

Ran your test: all GPU, no CPU … even monitored loads externally to confirm.


This GPU doesn't have 4:2:2 10-bit decode. How you tested a feature your GPU doesn't have, and proved it does, is a mystery.

Rob Ainscough

  • Posts: 123
  • Joined: Sat Nov 01, 2014 11:22 pm

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Tue May 13, 2025 1:11 am

Even the drivers list it … so not sure what to say?

CougerJoe

  • Posts: 578
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Tue May 13, 2025 1:27 am

Rob Ainscough wrote:Even the drivers list it … so not sure what to say?


It doesn't have 4:2:2 10-bit decode or encode; you're the only person in the world saying this. Are you thinking of 4:2:0 10-bit decode/encode?

qiyuxuan

  • Posts: 7
  • Joined: Thu Apr 17, 2025 3:08 pm
  • Real Name: Yuxuan Qi

Re: Reason why DR 20 is embracing nVidia and not AMD?

Posted: Tue May 20, 2025 1:03 pm

Rob Ainscough wrote:
qiyuxuan wrote:It can decode some H.265 or H.264 streams, but not with 4:2:2 chroma sampling.


Not sure where you are getting your information, but my AMD 9070 XT OC can encode/decode H.265 4:2:2. Not sure what mirrorless has to do with anything? You don't have to use a mirrorless camera to do 4:2:2.

Ran your test: all GPU, no CPU … even monitored loads externally to confirm.


Are you using an Intel CPU? Because Intel iGPUs have been able to decode 4:2:2 since 11th gen. When the 9070 XT came out, people had already tested its media engine, and 4:2:2 decode is not supported; AMD also never mentioned 4:2:2 decoding.
AMD Ryzen 9 5900X
Nvidia RTX 5080
64g Ram
Resolve Studio 20 Beta
