H.264 4:2:2 10-bit support in Resolve 20

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.
Offline

MediaGary

  • Posts: 17
  • Joined: Sun Feb 10, 2019 6:17 pm
  • Real Name: Theodore Gary

H.264 4:2:2 10-bit support in Resolve 20

PostTue Jun 10, 2025 3:48 pm

The 'Codecs' section of the DaVinci Resolve 20 announcement says:

+ GPU accelerated H.265 4:2:2 encodes on supported Nvidia systems.
+ GPU accelerated H.265 4:2:2 decodes on supported Nvidia systems.

I am specifically interested in knowing if Resolve 20 supports GPU accelerated H.264 10-bit 4:2:2 on Nvidia RTX 50-series. This codec is native to my Panasonic Lumix cameras and others like Sony.
M4 Max Studio | 3x 4TB M.2, 2TB M.2 | 16TB U.2 RAID-0 | 8TB U.2
AMD 5950X/64GB | Win11 | ARC A770 16GB | 10GbE | 12.8 TB RAID-0 U.2 | 8TB U.2
Resolve 19 Studio | www.tedlandstudio.com/
Offline

CougerJoe

  • Posts: 609
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: H.264 4:2:2 10-bit support in Resolve 20

PostTue Jun 10, 2025 11:18 pm

MediaGary wrote:
I am specifically interested in knowing if Resolve 20 supports GPU accelerated H.264 10-bit 4:2:2 on Nvidia RTX 50-series. This codec is native to my Panasonic Lumix cameras and others like Sony.


I've asked 50-series owners to try a number of different files; they never do, and only ever complain about bugs affecting 50-series owners. So they're here, but not helpful.

As for whether they do have compatibility, they might, but if it's the same level of compatibility seen with the V19 beta versions that 5090 reviewers were using, they handle some H.264 4:2:2 10-bit files but not others. This may be why V20 doesn't officially list it; alternatively, there may be no H.264 4:2:2 10-bit GPU decoding in V20 at all until Resolve has full compatibility, at which point it will be turned on.
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostThu Jun 12, 2025 2:38 pm

What specific file do you want to test with? Links please :)
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 13, 2025 3:24 am

The answer is yes!

I was sent an H.264 High 4:2:2 L5.1, yuv422p10le, 3840x2160 file and it is decoded in hardware. No CPU increase, and the hardware decoder load increased (verified both in Task Manager and GPU-Z).
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

MediaGary

  • Posts: 17
  • Joined: Sun Feb 10, 2019 6:17 pm
  • Real Name: Theodore Gary

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 13, 2025 3:32 pm

Gratitude and honor to Mads Johansen for verifying the H.264 hardware decode in the RTX 5070 Ti. You have done us all a great favor!
M4 Max Studio | 3x 4TB M.2, 2TB M.2 | 16TB U.2 RAID-0 | 8TB U.2
AMD 5950X/64GB | Win11 | ARC A770 16GB | 10GbE | 12.8 TB RAID-0 U.2 | 8TB U.2
Resolve 19 Studio | www.tedlandstudio.com/
Offline

Annabel_Shanderin

  • Posts: 78
  • Joined: Fri Mar 05, 2021 7:02 pm
  • Real Name: Annabel Shanderin

Re: H.264 4:2:2 10-bit support in Resolve 20

PostMon Jun 16, 2025 6:39 am

That's great news for RTX 50-series owners. Unfortunately, DaVinci Resolve 20 doesn't seem to utilize Intel Arc hardware encoders for H.264 10-bit 4:2:2. At least the Arc 140T iGPU on my Intel 255H CPU isn't crunching it. H.265 10-bit 4:2:2 is, however, decoded on it and running smoothly.

Really hoping Blackmagic will enable hardware decoding for H.264 10-bit 4:2:2 (4K) soon, as this is the format I shoot in the vast majority of the time, and right now it's not editing smoothly.
Offline

MediaGary

  • Posts: 17
  • Joined: Sun Feb 10, 2019 6:17 pm
  • Real Name: Theodore Gary

Re: H.264 4:2:2 10-bit support in Resolve 20

PostWed Jun 18, 2025 1:04 am

Annabel_Shanderin wrote:That's great news for rtx 50-series owners. Unfortunately, Davinci Resolve 20 doesn't seem to utilize Intel ARC hardware encoders for h264 10 bit 422. ...
Really hoping blackmagic will enable hardware decoding for h264 10 bit 422 (4k) soon,....


It seems the encode/decode *hardware* for H.264 10-bit 4:2:2 is not present in the Intel iGPUs or the Arc series of Intel discrete GPUs. As you can see in my signature, I have an Arc A770 16GB, and it will be replaced with an Nvidia RTX 50-series.
M4 Max Studio | 3x 4TB M.2, 2TB M.2 | 16TB U.2 RAID-0 | 8TB U.2
AMD 5950X/64GB | Win11 | ARC A770 16GB | 10GbE | 12.8 TB RAID-0 U.2 | 8TB U.2
Resolve 19 Studio | www.tedlandstudio.com/
Offline

Annabel_Shanderin

  • Posts: 78
  • Joined: Fri Mar 05, 2021 7:02 pm
  • Real Name: Annabel Shanderin

Re: H.264 4:2:2 10-bit support in Resolve 20

PostWed Jun 18, 2025 5:59 pm

MediaGary wrote:
Annabel_Shanderin wrote:That's great news for rtx 50-series owners. Unfortunately, Davinci Resolve 20 doesn't seem to utilize Intel ARC hardware encoders for h264 10 bit 422. ...
Really hoping blackmagic will enable hardware decoding for h264 10 bit 422 (4k) soon,....


It seems the encode/decode *hardware* for H.264 10-bit 4:2:2 is not present in the Intel iGPUs or the Arc series of Intel discrete GPUs. As you can see in my signature, I have an Arc A770 16GB, and it will be replaced with an Nvidia RTX 50-series.


Hi Gary, you're right... Oh dear... Why, Intel, why? I was hoping to be able to do some smooth but light editing on my new Lenovo Yoga Pro 7i Gen 10 with the latest 255H CPU... Seems like I might have to return it and start looking for something with an RTX 5060 instead. Sigh.
Offline

Claire Watson

  • Posts: 157
  • Joined: Sat Aug 26, 2017 2:33 pm

Re: H.264 4:2:2 10-bit support in Resolve 20

PostThu Jun 19, 2025 10:02 am

MediaGary wrote:The 'Codecs' section of the DaVinci Resolve 20 announcement says:

+ GPU accelerated H.265 4:2:2 encodes on supported Nvidia systems.
+ GPU accelerated H.265 4:2:2 decodes on supported Nvidia systems.

I am specifically interested in knowing if Resolve 20 supports GPU accelerated H.264 10-bit 4:2:2 on Nvidia RTX 50-series. This codec is native to my Panasonic Lumix cameras and others like Sony.


My brand new RTX 5070 Ti does NOT accelerate Sony XAVC S-I 4K from A7S3 camera in Resolve 20.

MediaInfo reports the Codec ID "XAVC (XAVC/mp42/iso2)", Format profile "High 4:2:2 Intra@L5.2"

I checked further, here is info from ffprobe

"Stream #0:0[0x1](und): Video: h264 (High 4:2:2 Intra) (avc1 / 0x31637661), yuv422p10le(pc, progressive), 3840x2160 [SAR 1:1 DAR 16:9], 478495 kb/s, 50 fps, 50 tbr, 50k tbn (default)"

I'm still pleased to have upgraded from my old 2080 Ti though, as I can now edit the Sony XAVC HS format (HEVC) that my older CPU struggled with.
Resolve Studio 20 Win10Pro Gigabyte GA-X99 i7-5960X 32GB DDR4 RAM RTX 5070 Ti (576.80 driver), DeckLink 4K Extreme 12G, Flanders Scientific XMP310 HDR monitor
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostThu Jun 19, 2025 10:59 am

Claire Watson wrote:
MediaGary wrote:The 'Codecs' section of the DaVinci Resolve 20 announcement says:

+ GPU accelerated H.265 4:2:2 encodes on supported Nvidia systems.
+ GPU accelerated H.265 4:2:2 decodes on supported Nvidia systems.

I am specifically interested in knowing if Resolve 20 supports GPU accelerated H.264 10-bit 4:2:2 on Nvidia RTX 50-series. This codec is native to my Panasonic Lumix cameras and others like Sony.


My brand new RTX 5070 Ti does NOT accelerate Sony XAVC S-I 4K from A7S3 camera in Resolve 20.

MediaInfo reports the Codec ID "XAVC (XAVC/mp42/iso2)", Format profile "High 4:2:2 Intra@L5.2"

I checked further, here is info from ffprobe

"Stream #0:0[0x1](und): Video: h264 (High 4:2:2 Intra) (avc1 / 0x31637661), yuv422p10le(pc, progressive), 3840x2160 [SAR 1:1 DAR 16:9], 478495 kb/s, 50 fps, 50 tbr, 50k tbn (default)"

I'm still pleased to have updated my old 2080 Ti though as I can now edit Sony XAVC HS format (HEVC) that my older CPU struggled with.

Can you be persuaded to upload a 10 second video somewhere for verification?
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

Claire Watson

  • Posts: 157
  • Joined: Sat Aug 26, 2017 2:33 pm

Re: H.264 4:2:2 10-bit support in Resolve 20

PostThu Jun 19, 2025 2:09 pm

This was recorded on my A7S3 in S-Log3 S-Gamut3 for use in an HDR project.

Download link
https://www.transfernow.net/dl/20250619dRC7y2oG
Resolve Studio 20 Win10Pro Gigabyte GA-X99 i7-5960X 32GB DDR4 RAM RTX 5070 Ti (576.80 driver), DeckLink 4K Extreme 12G, Flanders Scientific XMP310 HDR monitor
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostThu Jun 19, 2025 2:54 pm

Claire Watson wrote:This was recorded on my A7S3 in S-Log3 S-Gamut3 for use in an HDR project.

Download link
https://www.transfernow.net/dl/20250619dRC7y2oG

Thanks. You are indeed right, nothing is used on a 5070 TI with this file :(
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 3:52 am

Mads Johansen wrote:
Claire Watson wrote:This was recorded on my A7S3 in S-Log3 S-Gamut3 for use in an HDR project.

Download link
https://www.transfernow.net/dl/20250619dRC7y2oG

Thanks. You are indeed right, nothing is used on a 5070 TI with this file :(

That is odd, as I was under the impression any Blackwell card should be able to do it. Are you experiencing the same lack of 5070 Ti hardware acceleration with every H.264 10-bit 4:2:2 file, or just with this particular one? If you try playing this particular one with some other software, does it end up using hardware acceleration?
Offline

Nick2021

  • Posts: 910
  • Joined: Thu May 13, 2021 3:19 am
  • Real Name: Nick Zentena

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 5:45 am

Annabel_Shanderin wrote:
Hi Gary, you're right... Oh dear... Why, Intel, why?


H.264 isn't that demanding anymore; CPUs are fast enough for the most part. Besides, what was the last new camera to use H.264 4:2:2?

H.264 was for 4K and lower; with so many cameras supporting 6K and above, it doesn't make much sense for camera makers to keep using it.

If your camera offers other options, it might be simpler to switch recording formats.
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 7:26 am

4EvrYng wrote:
Mads Johansen wrote:
Claire Watson wrote:This was recorded on my A7S3 in S-Log3 S-Gamut3 for use in an HDR project.

Download link
https://www.transfernow.net/dl/20250619dRC7y2oG

Thanks. You are indeed right, nothing is used on a 5070 TI with this file :(

That is odd, as I was under the impression any Blackwell card should be able to do it. Are you experiencing the same lack of 5070 Ti hardware acceleration with every H.264 10-bit 4:2:2 file, or just with this particular one? If you try playing this particular one with some other software, does it end up using hardware acceleration?

I haven't had the time yet to do my entire test suite of what flavors of H264 and H265 the 5070 TI supports ( https://en.wikipedia.org/wiki/Advanced_ ... g#Profiles ), the pixel formats yuv420, yuv422, yuv444, yuv420p10le, yuv422p10le, yuv444p10le and how the nvdec is accessed (d3d11va, d3d12va, vulkan, cuda).
We know the h264 high 4:2:2 L5.1 yuv422p10le is supported while h264 high 4:2:2 intra L5.2 yuv422p10le is not.

With that background out of the way:
I'm sceptical of the need for hardware decoding of h.264 intra, as I can decode the file from Claire at 200 fps.
And no: the files I normally deal with (eg h264 high) are all hardware decoded.
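
(For reference, a rough sketch of such a sweep, assuming ffmpeg is on the path, the build has these hwaccel backends, and a test clip named example.mp4; these are not the exact commands used for the results above:)
Code: Select all
@echo off
rem Try each hardware decode backend on the same file; -benchmark prints timing at the end.
for %%A in (cuda d3d11va d3d12va vulkan) do (
    echo ==== %%A ====
    ffmpeg -hide_banner -benchmark -hwaccel %%A -i example.mp4 -f null NUL
)
rem Software reference run for comparison:
ffmpeg -hide_banner -benchmark -i example.mp4 -f null NUL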
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

Annabel_Shanderin

  • Posts: 78
  • Joined: Fri Mar 05, 2021 7:02 pm
  • Real Name: Annabel Shanderin

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 8:23 am

Nick2021 wrote:
Annabel_Shanderin wrote:
Hi Gary, you're right... Oh dear... Why, Intel, why?


H.264 isn't that demanding anymore; CPUs are fast enough for the most part. Besides, what was the last new camera to use H.264 4:2:2?

H.264 was for 4K and lower; with so many cameras supporting 6K and above, it doesn't make much sense for camera makers to keep using it.

If your camera offers other options, it might be simpler to switch recording formats.


Hi Nick, I shoot on the Sony A6700, and the reason I am still locked into using H.264 is that Sony cameras, for some odd reason, don't offer 25p in H.265 when shooting in PAL (only 50p and 100p). I would be fine using H.265 if it weren't for that limitation, but I still need 25p.

My new Lenovo Yoga Pro 7i Gen 10 with the Intel 255H CPU can edit H.264, but the fans ramp up like crazy as the CPU is utilized 100%, and it is not nearly as smooth as when editing H.265. The noise from the fans and the lack of smooth scrubbing on the timeline kind of takes the fun out of editing fast.
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 6:44 pm

Mads Johansen wrote:I haven't had the time yet to do my entire test suite of what flavors of H264 and H265 the 5070 TI supports ... We know the h264 high 4:2:2 L5.1 yuv422p10le is supported while h264 high 4:2:2 intra L5.2 yuv422p10le is not.

With that background out of the way:
I'm sceptical of the need for hardware decoding of h.264 intra, as I can decode the file from Claire at 200 fps.

Thank you! How exactly do you test decoding speed, please? I would like to test my system.
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 6:53 pm

4EvrYng wrote:
Mads Johansen wrote:I haven't had the time yet to do my entire test suite of what flavors of H264 and H265 the 5070 TI supports ... We know the h264 high 4:2:2 L5.1 yuv422p10le is supported while h264 high 4:2:2 intra L5.2 yuv422p10le is not.

With that background out of the way:
I'm sceptical of the need for hardware decoding of h.264 intra, as I can decode the file from Claire at 200 fps.

Thank you! How exactly do you test decoding speed, please? I would like to test my system.

ffmpeg.exe -i example.h264 -f null NUL

I'm not aware of any "read a file and do nothing with it" functionality in Davinci, sadly.
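
(Side note: adding ffmpeg's -benchmark flag prints real/user time at the end of the run, and the same null decode can be pointed at the GPU for comparison; a minimal sketch, assuming an NVDEC-capable build:)
Code: Select all
ffmpeg -benchmark -i example.h264 -f null NUL
ffmpeg -benchmark -hwaccel cuda -i example.h264 -f null NUL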
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 7:39 pm

Mads Johansen wrote:
4EvrYng wrote:
Mads Johansen wrote:I haven't had the time yet to do my entire test suite of what flavors of H264 and H265 the 5070 TI supports ... We know the h264 high 4:2:2 L5.1 yuv422p10le is supported while h264 high 4:2:2 intra L5.2 yuv422p10le is not.

With that background out of the way:
I'm sceptical of the need for hardware decoding of h.264 intra, as I can decode the file from Claire at 200 fps.

Thank you! How exactly do you test decoding speed, please? I would like to test my system.

ffmpeg.exe -i example.h264 -f null NUL

I'm not aware of any "read a file and do nothing with it" functionality in Davinci, sadly.

Thank you! My 10920X was able to decode Claire's file with ffmpeg at 260 fps, which didn't give me enough time to judge what was going on with the CPU. In Resolve her file played at full speed (50 fps) without any dropped frames, with 30+% overall CPU usage across several cores and at least 2 cores maxed out on the color page (25+% and one core maxed out on the edit page). So maybe the question isn't whether ffmpeg can easily decode it without sweating the CPU, but how much of the CPU usage in Resolve is due to decoding and how much is due to everything else, i.e. how much benefit H.264 GPU acceleration would give.

P.S. Do you know of a way to tell ffmpeg to keep doing the test decode in a loop, so I can get a better idea of how much CPU load it causes vs. Resolve?
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostFri Jun 20, 2025 8:08 pm

4EvrYng wrote:Thank you! My 10920X was able to decode Claire's file with ffmpeg at 260 fps, which didn't give me enough time to judge what was going on with the CPU. In Resolve her file played at full speed (50 fps) without any dropped frames, with 30+% overall CPU usage across several cores and at least 2 cores maxed out on the color page (25+% and one core maxed out on the edit page). So maybe the question isn't whether ffmpeg can easily decode it without sweating the CPU, but how much of the CPU usage in Resolve is due to decoding and how much is due to everything else, i.e. how much benefit H.264 GPU acceleration would give.

P.S. Do you know of a way to tell ffmpeg to keep doing the test decode in a loop, so I can get a better idea of how much CPU load it causes vs. Resolve?

The only thing I can think of is to create a bat file with
Code: Select all
ffmpeg.exe -i example.h264 -f null NUL
ffmpeg.exe -i example.h264 -f null NUL
ffmpeg.exe -i example.h264 -f null NUL
....
ffmpeg.exe -i example.h264 -f null NUL
ffmpeg.exe -i example.h264 -f null NUL
(the same command multiple times)
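
(A rough way to avoid pasting the line many times, assuming a Windows .bat file and an ffmpeg build that has the -stream_loop input option:)
Code: Select all
@echo off
rem Run the null decode 20 times in a row for a sustained load:
for /L %%N in (1,1,20) do ffmpeg -hide_banner -i example.h264 -f null NUL

rem Or let ffmpeg loop the input itself (-1 = loop forever, stop with Ctrl+C):
ffmpeg -hide_banner -stream_loop -1 -i example.h264 -f null NUL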

You're talking about CPU usage over 1-ish second, right? I haven't verified this but it looks exactly like what we need.
I like https://systeminformer.sourceforge.io/ for its detailed information.

I haven't tested if we can subtract Davinci from itself: create a new timeline, add a generator, convert it to a Compound Clip, then play back. Then take the CPU usage from that as Davinci's baseline and subtract it from the other measurements.

The other thing I've been thinking of is to indirectly measure how long it takes to go one frame back in a long GOP file: Record via OBS and go to the end of a known GOP file, then go one frame back. Count the frames between button press and resolve responds = GOP length/response time = decode fps.
But it's tangential to the cpu usage :(
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSat Jun 21, 2025 1:48 am

Mads Johansen wrote:The only thing I can think of is to create a bat file with ...

I ended up creating a batch file that kept calling ffmpeg.exe in a loop. CPU load was 45-75%, spread very evenly across all 24 cores. In other words, ffmpeg efficiently used every resource at its disposal, so it can't be used to judge whether Resolve will benefit from GPU H.264 acceleration; the pattern of CPU usage between the two is obviously very different.

Mads Johansen wrote:I haven't tested if we can subtract Davinci from itself ...

You have a 5070 Ti, correct? IIRC one can tell Resolve whether to use the GPU or the CPU for decoding, which could give an idea of how much H.264 GPU decoding helps in Resolve and how big the impact on the CPU is. If so, what kind of file would you need to test that? Would a 4K S-I file from an FX30 work?

Mads Johansen wrote:The other thing I've been thinking of is to indirectly measure how long it takes to go one frame back in a long GOP file: Record via OBS and go to the end of a known GOP file, then go one frame back. Count the frames between button press and resolve responds = GOP length/response time = decode fps. But it's tangential to the cpu usage :(

That's a very good idea. Using a two-machine setup with a recording card would eliminate any concerns over vampire system load. Unfortunately I don't have a setup I could test that with.
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSat Jun 21, 2025 9:51 am

4EvrYng wrote:
Mads Johansen wrote:The only thing I can think of is to create a bat file with ...

I ended up creating a batch file that kept calling ffmpeg.exe in a loop. CPU load was 45-75%, spread very evenly across all 24 cores. In other words, ffmpeg efficiently used every resource at its disposal, so it can't be used to judge whether Resolve will benefit from GPU H.264 acceleration; the pattern of CPU usage between the two is obviously very different.

Mads Johansen wrote:I haven't tested if we can subtract Davinci from itself ...

You have a 5070 Ti, correct? IIRC one can tell Resolve whether to use the GPU or the CPU for decoding, which could give an idea of how much H.264 GPU decoding helps in Resolve and how big the impact on the CPU is. If so, what kind of file would you need to test that? Would a 4K S-I file from an FX30 work?

Mads Johansen wrote:The other thing I've been thinking of is to indirectly measure how long it takes to go one frame back in a long GOP file: Record via OBS and go to the end of a known GOP file, then go one frame back. Count the frames between button press and resolve responds = GOP length/response time = decode fps. But it's tangential to the cpu usage :(

That's a very good idea. Using a two-machine setup with a recording card would eliminate any concerns over vampire system load. Unfortunately I don't have a setup I could test that with.

We're still talking about H.264 intra, right? My point is that H.264 intra is not worth accelerating, because each frame can be decoded almost instantly on the CPU anyway and other processing will take much longer in almost every case, so I do not think decoding H.264 intra is worth benchmarking.

(Edit later: I was wrong:
ffmpeg -hwaccel cuda -i S_080724_005.MP4 -f null NUL
128 fps
ffmpeg -i S_080724_005.MP4 -f null NUL
453 fps )

In general, for the flavours of H.264, H.265, AV1 and so on that are computationally expensive to decode, yes, those are worth benchmarking.

Since we can't measure anything with Davinci (everything plays at 120 fps which is the maximum frame rate), I used ffmpeg.
I'm going to make a more detailed post with reproduction steps, but as a rough guide for decoding a 4500-frame H.264 Mandelbrot file with a 5070 Ti:
ffmpeg -hwaccel cuda -i mandelbrot-yuv420p-1080p.mov -f null NUL
Code: Select all
CUDA at 1080p:
yuv420p: 645 fps
yuv420p10le: 487 fps
yuv422p: 486 fps
yuv422p10le: 370 fps

CUDA at 2160p:
yuv420p: 191 fps
yuv420p10le: 145 fps
yuv422p: 138 fps
yuv422p10le: 114 fps


Code: Select all
Intel Ultra 7 265k at 1080p:
yuv420p: 726 fps
yuv420p10le: 721 fps
yuv422p: 523 fps
yuv422p10le: 515 fps

Intel Ultra 7 265k at 2160p:
yuv420p: 203 fps
yuv420p10le: 197 fps
yuv422p: 147 fps
yuv422p10le: 141 fps


(Which surprised me a lot. And there's no hardware decoding in the Intel)
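
(The test clips themselves aren't attached; as a rough sketch, clips like these can be generated from ffmpeg's lavfi mandelbrot source with libx264, shown here for the 1080p yuv420p case. The file name and exact settings are my assumption, and the 10-bit variants need a 10-bit-capable libx264 build:)
Code: Select all
ffmpeg -f lavfi -i mandelbrot=size=1920x1080:rate=60 -frames:v 4500 -c:v libx264 -pix_fmt yuv420p mandelbrot-yuv420p-1080p.mov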

I also don't know what a 4K SI file from FX30 is.

Now we have a better supported matrix than before at least :)
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 12:50 am

Mads Johansen wrote:Since we can't measure anything with Davinci (everything plays at 120 fps which is the maximum frame rate), I used ffmpeg.

I believe what ffmpeg can do is irrelevant; what matters is how much benefit there will be to the user when in Resolve. There are two things that can happen when somebody tries to play an H.264 10-bit 4:2:2 S-I (Intra) file from a Sony camera (https://helpguide.sony.net/ilc/2220/v1/ ... 68279.html):

1. Resolve will be dropping frames and thus GPU acceleration would very likely eliminate that.

2. Resolve won't be dropping frames but will be consuming CPU. The question is how much of a CPU penalty the lack of GPU acceleration causes. I can't assume there won't be much of a penalty just because S-I has a reputation for being very light on the CPU, because playing some of my 60fps S-I files in Resolve ends up consuming as much as 60% of the CPU, hammering all of the cores. Thus I believe the only way to answer that is by comparing the CPU load Resolve causes when it uses the CPU vs. the GPU for decoding, and doing that for a number of 30/60/120 fps files out of the camera.
Offline

CougerJoe

  • Posts: 609
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 5:58 am

4EvrYng wrote:
Mads Johansen wrote:Since we can't measure anything with Davinci (everything plays at 120 fps which is the maximum frame rate), I used ffmpeg.

I believe what ffmpeg can do is irrelevant; what matters is how much benefit there will be to the user when in Resolve.

I can't assume there won't be much of a penalty just because S-I has a reputation for being very light on the CPU, because playing some of my 60fps S-I files in Resolve ends up consuming as much as 60% of the CPU, hammering all of the cores.


The only interesting thing about ffmpeg would be its compatibility with 50-series GPUs; it has had support since the first 5090s went on sale. If it has GPU decoding compatibility for 4:2:2 formats that Resolve doesn't, then that's a Resolve problem, but if they have the same compatibility (or lack of it), that's either an Nvidia driver problem or, worst case, a hardware or legal/patent problem.

My CPU also uses around 60% processing on certain 6K H.264 4:2:2 Intra.
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 1:50 pm

One of my major stumbling blocks has been the inability to create H.264 10-bit 4:2:2 Intra files long enough to see CPU/GPU usage that stands out from the background noise.

It turns out that our dear friends ffmpeg and libx264 actually do support 10-bit 4:2:2 Intra:
Code: Select all
ffmpeg -f lavfi -i testsrc2=size=3840x2160 -pix_fmt yuv422p10le -profile:v high422 -crf 0 -t 30 -g 1 testscr2.mp4
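
(To double-check what came out, an ffprobe call along these lines should report the profile and pixel format; file name as in the command above:)
Code: Select all
ffprobe -v error -select_streams v:0 -show_entries stream=codec_name,profile,level,pix_fmt,width,height -of default=noprint_wrappers=1 testscr2.mp4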


H.264 10 bit 4:2:2 Intra 2160p from ffmpeg (non-accelerated by Davinci)
At timeline 25 fps: 6% CPU
At timeline 120 fps: 33% CPU.

H.264 10 bit 4:2:2 Intra 2160p from Claire (non-accelerated by Davinci)
At timeline 25 fps: 6% CPU
At timeline 50 fps: 11% CPU
At timeline 120 fps: 23% CPU.

H.264 10 bit 4:2:2 Intra 3456p (non-accelerated by Davinci) (I love that resolution btw)
At timeline 25 fps: 17% CPU
At timeline 120 fps: 45% CPU.


4EvrYng wrote:... There are two things that can happen when somebody tries to play an H.264 10-bit 4:2:2 S-I (Intra) file from a Sony camera (https://helpguide.sony.net/ilc/2220/v1/ ... 68279.html):

1. Resolve will be dropping frames and thus GPU acceleration would very likely eliminate that.

2. Resolve won't be dropping frames but will be consuming CPU. The question is how much of a CPU penalty the lack of GPU acceleration causes. I can't assume there won't be much of a penalty just because S-I has a reputation for being very light on the CPU, because playing some of my 60fps S-I files in Resolve ends up consuming as much as 60% of the CPU, hammering all of the cores. Thus I believe the only way to answer that is by comparing the CPU load Resolve causes when it uses the CPU vs. the GPU for decoding, and doing that for a number of 30/60/120 fps files out of the camera.

1) In principle I agree with you. But since Davinci does not accelerate H.264 10 bit 4:2:2 Intra, it's a moot point.

2) As everyone can see, the CPU usage on a new CPU is negligible. I'll report on tuesday how a 12400F fares (don't have access to a slow system with Davinci Studio before then).
If Alexander Dali would kindly post some specs to compare with, that would be lovely.
My old editing system, a 13400F, doesn't have Studio and in the free version, H.264 10 bit 4:2:2 Intra isn't supported.
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

joema4

  • Posts: 443
  • Joined: Wed Feb 03, 2021 3:26 pm
  • Real Name: Joe Marler

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 3:51 pm

I don't have a Windows machine, but years ago I worked in Windows application development. Below are some suggestions. I cannot test or verify these, so this is just FWIW:

At build-time, the Resolve app must be linked with the NVIDIA ver. 13.x SDK. That cannot be examined, but we can assume that it was done.

Resolve must make the correct NVIDIA SDK calls to enable and use NVDEC decode acceleration for the codecs. We can probably assume that was done in the shipping (non-Beta) Resolve Studio product.

The NVIDIA Blackwell video drivers R570 or later must be present. It should be possible to interrogate the version with this command from CMD.exe: nvidia-smi -q

Use MediaInfo to verify the input clip really is H.264 High 4:2:2 10-bit. Sony XAVC-S 10-bit 4:2:2 H.264 should qualify.

Check Resolve Prefs and GPU selection: Resolve > Preferences > Decode Options (Native / NVIDIA / Quick Sync). Select NVIDIA and Native (leave Quick Sync unchecked to avoid precedence issues).

Watch the GPU in Task Manager during playback to see whether NVDEC engines are engaged. Win-11 Task Manager > GPU > “Video Decode” graph (I'm not sure of the syntax).

See if FFmpeg will use the hardware accelerator for the 10-bit 4:2:2 H.264 file. Run this command while watching the Task Manager video acceleration graph. Instead of test.mp4 use the pathname to the Sony file. Below is an informed guess; I cannot test this.

ffmpeg -vsync 0 -hwaccel cuda -c:v h264_cuvid -i test.mp4 -f null NUL

However, for the above test to have meaning, ffmpeg itself must be a version that works with the latest NVIDIA video hardware. To check this, use this command: ffmpeg -buildconf

The returned data must contain --enable-cuda-nvcc and be dated 2025-04-21 or later. If not, you'd need to update your version of ffmpeg: https://ffmpeg.org/download.html

If ffmpeg is the right version and it shows video acceleration but Resolve does not, generate Resolve diagnostic logs and look for several lines surrounding strings like this: (these are just informed guesses):

NvDecCaps
cuvidCreateDecoder
NVDEC
CUDA_ERROR

This could tell you whether the driver reported the new 4:2:2 10-bit mode as unsupported, or the decoder creation failed for another reason.

If the logs contain messages something like this: CUDA_ERROR_NO_DEVICE / CUDA_ERROR_INVALID_VALUE, that could mean Resolve is not up to date or is not interacting properly with the NVIDIA SDK.
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:01 pm

CougerJoe wrote:My CPU also uses around 60% processing on certain 6K H.264 4:2:2 Intra.

If me seeing 60% CPU on some of 4K H.264 4:2:2 Intra files is not an exception then I can see how we definitely might benefit from its GPU acceleration. Either that or I have to find a workaround.
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:14 pm

joema4 wrote:I don't have a Windows machine, but years ago I worked in Windows application development. Below are some suggestions. I cannot test or verify these, so this is just FWIW:
....
If ffmpeg is the right version and it shows video acceleration but Resolve does not, generate Resolve diagnostic logs and look for several lines surrounding strings like this: (these are just informed guesses):

NvDecCaps
cuvidCreateDecoder
NVDEC
CUDA_ERROR
...

Code: Select all
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,278 | NVDEC decodes H264, chroma 4:2:0, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,280 | NVDEC decodes H264, chroma 4:2:0, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,282 | NVDEC decodes H264, chroma 4:2:2, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,283 | NVDEC decodes H264, chroma 4:2:2, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,284 | NVDEC decodes HEVC, chroma 4:2:0, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,286 | NVDEC decodes HEVC, chroma 4:2:0, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,287 | NVDEC decodes HEVC, chroma 4:2:0, bitdepth 12, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,288 | NVDEC decodes HEVC, chroma 4:2:2, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,298 | NVDEC decodes HEVC, chroma 4:2:2, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,300 | NVDEC decodes HEVC, chroma 4:2:2, bitdepth 12, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,301 | NVDEC decodes HEVC, chroma 4:4:4, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,303 | NVDEC decodes HEVC, chroma 4:4:4, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,307 | NVDEC decodes HEVC, chroma 4:4:4, bitdepth 12, upto 8192 x 8192
...
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,318 | NVDEC decodes VP9, chroma 4:2:0, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,320 | NVDEC decodes VP9, chroma 4:2:0, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,321 | NVDEC decodes AV1, chroma 4:2:0, bitdepth 8, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,324 | NVDEC decodes AV1, chroma 4:2:0, bitdepth 10, upto 8192 x 8192
[0x00000508] | IO                   | INFO  | 2025-06-22 18:34:57,324 | Nvidia : driverVer(208), SDK(13:0)

From davinci_resolve.log and ResolveDebug.txt.
resolve_graphics_log.txt just says NVIDIA Driver Version 576.80
So yes, they are using SDK 13 :)

And there's no note of not decoding H.264 10 bit 4:2:2 Intra in any of the log files.
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:15 pm

4EvrYng wrote:
CougerJoe wrote:My CPU also uses around 60% processing on certain 6K H.264 4:2:2 Intra.

If me seeing 60% CPU on some of 4K H.264 4:2:2 Intra files is not an exception then I can see how we definitely might benefit from its GPU acceleration. Either that or I have to find a workaround.

*gentle cough*
Mads Johansen wrote:If Alexander Dali would kindly post some specs to compare with, that would be lovely.
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:27 pm

Mads Johansen wrote:One of my major stumbling blocks has been the inability to create H.264 10 bit 4:2:2 Intra files long enough, to see CPU/GPU usage that is visible from the background noise.

If you tell me what kind of file you would need, I would be happy to shoot one with the FX30 as soon as I have a little time.

Mads Johansen wrote:It turns out that our dear friend ffmpeg and libx264 actually does support 10 bit 4:2:2 Intra
...
H.264 10 bit 4:2:2 Intra 2160p from Claire (non-accelerated by Davinci)
At timeline 25 fps: 6% CPU
At timeline 50 fps: 11% CPU
At timeline 120 fps: 23% CPU.
...
As everyone can see, the CPU usage on a new CPU is negligible.
...
If Alexander Dali would kindly post some specs to compare with, that would be lovely.

Am I wrong, or is it odd that there is such a big difference in CPU utilization between your experience and what CougerJoe and I see? Yes, Claire's file plays on my system at 33% on a 120 fps timeline, _BUT_ as both Joe and I have experienced, we get as much as 60% with some files. My CPU is not the latest and greatest (it is a 10920X @ 4.3GHz) but it is still not a complete slouch. If you tell me which specs you would need for comparison, I would be happy to oblige.

Mads Johansen wrote:1) In principle I agree with you. But since Davinci does not accelerate H.264 10 bit 4:2:2 Intra, it's a moot point.

It does not? That is my bad; for some reason I thought it did. I probably have trouble interpreting google-fu results on what exactly is accelerated by each DR version. Is there an unambiguous list somewhere that can't be misinterpreted by someone like me who isn't savvy about DR 18/19/20's codec support?


If only Sony in its "infinite wisdom" had decided to provide 30fps recording in the H.265 codec, I wouldn't care.
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:32 pm

Mads Johansen wrote:
4EvrYng wrote:
CougerJoe wrote:My CPU also uses around 60% processing on certain 6K H.264 4:2:2 Intra.

If me seeing 60% CPU on some of 4K H.264 4:2:2 Intra files is not an exception then I can see how we definitely might benefit from its GPU acceleration. Either that or I have to find a workaround.

*gentle cough*
Mads Johansen wrote:If Alexander Dali would kindly post some specs to compare with, that would be lovely.

4EvrYng wrote: My 10920X was able to decode Claire's file ...

Ummm ... it seems like you missed it ... :) I will be happy to provide all details needed if you tell me what you are looking for.
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:39 pm

joema4 wrote:Use MediaInfo to verify the input clip really is H.264 High 4:2:2 10-bit. Sony XAVC-S 10-bit 4:2:2 H.264 should qualify.

Should Sony S-I qualify too? According to MediaInfo it is High 4:2:2 Intra @ L5.2.
Offline

joema4

  • Posts: 443
  • Joined: Wed Feb 03, 2021 3:26 pm
  • Real Name: Joe Marler

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 6:48 pm

4EvrYng wrote:
joema4 wrote:Use MediaInfo to verify the input clip really is H.264 High 4:2:2 10-bit. Sony XAVC-S 10-bit 4:2:2 H.264 should qualify.

Should Sony S-I qualify too? According to MediaInfo it is High 4:2:2 Intra @ L5.2.


Blackwell NVDEC should accelerate 10-bit 4:2:2 XAVC S-I, which is All-Intra, but that's not a very good test. You could turn off decode acceleration, and S-I should still be fast and smooth, albeit consuming a bit more CPU to decode.

The Long GOP formats are the most revealing test, because those are difficult to handle smoothly using CPU (ie software-only) methods. When you turn on/off decode acceleration when using Long GOP, the difference is so great that there's no doubt whether decode acceleration is being used.
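
(For anyone who wants a matching pair to see that difference, one rough option is to reuse Mads' testsrc2 recipe from earlier in the thread and render the same source once with -g 1 and once with a normal GOP; the file name, GOP length and CRF value here are just placeholders:)
Code: Select all
ffmpeg -f lavfi -i testsrc2=size=3840x2160 -pix_fmt yuv422p10le -profile:v high422 -crf 18 -t 30 -g 250 longgop-422p10.mp4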
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 7:02 pm

joema4 wrote:
4EvrYng wrote:
joema4 wrote:Use MediaInfo to verify the input clip really is H.264 High 4:2:2 10-bit. Sony XAVC-S 10-bit 4:2:2 H.264 should qualify.

Should Sony S-I qualify too? According to MediaInfo it is High 4:2:2 Intra @ L5.2.


Blackwell NVDEC should accelerate 10-bit 4:2:2 XAVC S-I, which is All-Intra, but that's not a very good test. You could turn off decode acceleration, and S-I should still be fast and smooth, albeit consume a bit more CPU to decode.

The Long GOP formats are the most revealing test, because those are difficult to handle smoothly using CPU (ie software-only) methods. When you turn on/off decode acceleration when using Long GOP, the difference is so great that there's no doubt whether decode acceleration is being used.

I don't know how to make it simpler:
Blackwell does decode H.264 High 4:2:2 Intra. *
Currently Davinci does not use the Blackwell decoder for H.264 High 4:2:2 Intra.

* Currently, on my machine, ffmpeg hwaccel vulkan, d3d11va and d3d12va do not decode H.264 High 4:2:2 Intra; only CUDA does.
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 8:09 pm

joema4 wrote:Blackwell NVDEC should accelerate 10-bit 4:2:2 XAVC S-I, which is All-Intra, but that's not a very good test. You could turn off decode acceleration, and S-I should still be fast and smooth, albeit consume a bit more CPU to decode.

That is where I (maybe incorrectly) differ in belief. I feel that if my system were consuming only a "bit more CPU" than it would with GPU-accelerated S-I decoding in Resolve, it wouldn't be consuming 60-ish % of 24 cores. 60% just for playback seems like a lot and doesn't leave much for the rest.
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 8:13 pm

Mads Johansen wrote:I don't know how to make it simpler:
Blackwell does decode H.264 High 4:2:2 Intra. *
Currently Davinci does not use the Blackwell decoder for H.264 High 4:2:2 Intra.

I believe I understood what you said, and I feel I am responsible for the confusion because I assumed that Resolve does use it. I think before I go down the rabbit hole of DR's H.264 GPU acceleration any further, I should shoot a number of test files so one can compare apples to apples.
Offline

Claire Watson

  • Posts: 157
  • Joined: Sat Aug 26, 2017 2:33 pm

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 8:57 pm

4EvrYng wrote:Thus I believe the only way to answer that is by comparing the CPU load Resolve causes when it uses the CPU vs. the GPU for decoding, and doing that for a number of 30/60/120 fps files out of the camera.


Just wanted to confirm... I played the clip that I posted in its project. With and without Hardware Acceleration, CPU usage remains the same at 28% in each case. GPU usage says 25%, but that's for 3D, while Video Decode is zero. So for me I see no improvement using this Sony H.264 High 4:2:2 L5.2 codec in Resolve 20 with the latest RTX 5070 Ti GPU.
Resolve Studio 20 Win10Pro Gigabyte GA-X99 i7-5960X 32GB DDR4 RAM RTX 5070 Ti (576.80 driver), DeckLink 4K Extreme 12G, Flanders Scientific XMP310 HDR monitor
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 9:37 pm

Claire Watson wrote:
4EvrYng wrote:Thus I believe the only way to answer that is by comparing the CPU load Resolve causes when it uses the CPU vs. the GPU for decoding, and doing that for a number of 30/60/120 fps files out of the camera.


Just wanted to confirm... I played the clip that I posted in its project. With and without Hardware Acceleration, CPU usage remains the same at 28% in each case. GPU usage says 25%, but that's for 3D, while Video Decode is zero. So for me I see no improvement using this Sony H.264 High 4:2:2 L5.2 codec in Resolve 20 with the latest RTX 5070 Ti GPU.

Thank you for confirming; that is exactly what Mads was saying (that as of now DR doesn't utilize the H.264 10-bit 4:2:2 GPU acceleration available in the 5070 Ti). Out of curiosity, and to give me an idea of what I could expect: if you take a 4K 4:2:2 10-bit file in the HS codec at 30 and 60 fps, at the highest bit rate available for each, what CPU figures do you see during playback with / without GPU acceleration, please?
Offline

joema4

  • Posts: 443
  • Joined: Wed Feb 03, 2021 3:26 pm
  • Real Name: Joe Marler

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 10:32 pm

4EvrYng wrote:...that is exactly what Mads was saying (that as of now DR doesn't utilize H264 10-bit 4:2:2 GPU acceleration available in 5070 Ti). Out of curiosity and to give me an idea what I could expect, if you take 4K 4:2:2 10-bit file with HS codec at 30 and 60 fps with highest bit rate available for them what are the CPU figures you see during playback with / without GPU acceleration, please?


There are two different cases being discussed. Up until Claire's first post it was 10-bit 4:2:2 H.264/H.265 Long GOP. Her case was 10-bit 4:2:2 H.264 All-Intra.

It appears the NVIDIA RTX 50-series on Resolve Studio 20 on Intel/Windows can do hardware-accelerated decoding of 10-bit 4:2:2 H.264/H.265 Long GOP. It cannot currently do hardware-accelerated decoding of 10-bit 4:2:2 H.264 All-Intra.

Further research indicates the Blackwell hardware supports 4k 10-bit 4:2:2 H.264 All-Intra (profile ID 118), but this is not exposed in the current driver. So there's nothing the application layer can do about it. It's conceivable that might be supported in a future driver update. That is out of Blackmagic's hands.

However, for All-Intra it doesn't make a huge difference in performance or smoothness on the timeline. My M1 Ultra Mac supports All-Intra, and with Resolve hardware decode acceleration on vs off, there is little difference in responsiveness. The CPU graphs are a bit different but the editing feel is the same. I suspect it would be similar on Windows/Intel, even if the NVIDIA driver supported 10-bit 4:2:2 H.264 All-Intra on Blackwell (RTX 50 series).

For Long GOP (whether H.264 or H.265/HEVC), hardware vs software decoding makes a much bigger difference. Resolve Studio 20 on Intel/Windows and RTX 50-series with the correct driver should support hardware decode acceleration of 10-bit 4:2:2 H.264 and H.265(HEVC) Long GOP.
Offline

Claire Watson

  • Posts: 157
  • Joined: Sat Aug 26, 2017 2:33 pm

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 10:34 pm

With the HS codec at 50fps and Hardware Acceleration on I see CPU usage at 7% in the Media page and GPU usage 25%. In the Color page CPU is 14% and GPU usage 21%. Without Hardware Acceleration CPU is 64% in the Media page and 70% in the Color page while Video Decode is zero.
Resolve Studio 20 Win10Pro Gigabyte GA-X99 i7-5960X 32GB DDR4 RAM RTX 5070 Ti (576.80 driver), DeckLink 4K Extreme 12G, Flanders Scientific XMP310 HDR monitor
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostSun Jun 22, 2025 10:49 pm

Claire Watson wrote:With the HS codec at 50fps and Hardware Acceleration on I see CPU usage at 7% in the Media page and GPU usage 25%. In the Color page CPU is 14% and GPU usage 21%. Without Hardware Acceleration CPU is 64% in the Media page and 70% in the Color page while Video Decode is zero.

Thank you! I suspect that if only DR supported it :( we would be seeing similar benefits from H.264 GPU-accelerated decoding too. Truth is, I wouldn't care about it at all if only Sony had an option to record 30fps in H.265.
Offline

CougerJoe

  • Posts: 609
  • Joined: Wed Sep 18, 2019 5:15 am
  • Real Name: bob brady

Re: H.264 4:2:2 10-bit support in Resolve 20

PostMon Jun 23, 2025 12:36 am

4EvrYng wrote:
CougerJoe wrote:My CPU also uses around 60% processing on certain 6K H.264 4:2:2 Intra.

If me seeing 60% CPU on some of 4K H.264 4:2:2 Intra files is not an exception then I can see how we definitely might benefit from its GPU acceleration. Either that or I have to find a workaround.


It depends on the H.264 4:2:2 10-bit intra codec. The version DJI uses as a replacement for ProRes does not decode well on the CPU at 6K and needs GPU decoding. DJI are using it as their highest-quality codec: smaller file size than ProRes, but not easy to play back on the CPU, though obviously it plays back more easily than an H.265 version would.

Whereas previously intra was perhaps about easy playback for editors, the DJI version is more about quality.
Offline

4EvrYng

  • Posts: 783
  • Joined: Sat Feb 19, 2022 12:45 am
  • Real Name: Alexander Dali

Re: H.264 4:2:2 10-bit support in Resolve 20

PostMon Jun 23, 2025 1:05 am

CougerJoe wrote:
4EvrYng wrote:
CougerJoe wrote:My CPU also uses around 60% processing on certain 6K H.264 4:2:2 Intra.

If me seeing 60% CPU on some of 4K H.264 4:2:2 Intra files is not an exception then I can see how we definitely might benefit from its GPU acceleration. Either that or I have to find a workaround.


It depends on the H.264 4:2:2 10 intra codec ...

I suspected there was a possibility of that, so I used the words "_might_ benefit".
Offline

Mads Johansen

  • Posts: 1425
  • Joined: Mon Dec 19, 2016 10:51 am

Re: H.264 4:2:2 10-bit support in Resolve 20

PostTue Jun 24, 2025 10:43 am

Mads Johansen wrote:It turns out that our dear friend ffmpeg and libx264 actually does support 10 bit 4:2:2 Intra
Code: Select all
ffmpeg -f lavfi -i testsrc2=size=3840x2160 -pix_fmt yuv422p10le -profile:v high422 -crf 0 -t 30 -g 1 testscr2.mp4


As everyone can see, the CPU usage on a new CPU is negligible. I'll report on tuesday how a 12400F fares (don't have access to a slow system with Davinci Studio before then).

Just tested with the 12400F:
H.264 10 bit 4:2:2 Intra 2160p from ffmpeg (non-accelerated by Davinci)
At timeline 25 fps: 30% CPU.
Interestingly enough, on a 1080p timeline with 2160p media, the CPU usage was the same, but with more GPU usage.

So, it's a hell of a lot more demanding on a CPU from early 2022 than on one from late 2024.
Davinci Resolve Studio 20 build 49, Windows 11, Ultra 7 265k, Nvidia 5070 TI, 576.80 Studio
