10-bit HEVC Render GPU utilization

Get answers to your questions about color grading, editing and finishing with DaVinci Resolve.

Christopher Dobey

  • Posts: 234
  • Joined: Fri Jul 18, 2014 5:58 pm
  • Location: San Francisco Bay Area, California

10-bit HEVC Render GPU utilization

Posted: Fri Aug 17, 2018 5:35 pm

Rendering 10-bit HEVC causes a continuous utilization handoff between the GPU and CPU, as seen in Activity Monitor.app.
Rendering 8-bit HEVC and ProRes uses the dedicated AMD GPU exclusively.
Same results on the latest version of High Sierra and the Mojave beta.

Is this expected behavior, or is 10-bit HEVC encoding not fully hardware accelerated?
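One way to narrow this down is to try the hardware encoder outside Resolve. The sketch below is my own suggestion, not something from Resolve itself: it checks whether a local ffmpeg build exposes `hevc_videotoolbox`, ffmpeg's wrapper around Apple's VideoToolbox hardware HEVC encoder. Whether a 10-bit (`-profile:v main10`) encode actually runs in hardware still depends on the machine.

```python
# Hedged sketch: probe for ffmpeg's VideoToolbox HEVC encoder so a 10-bit
# encode can be tested outside Resolve. Assumes ffmpeg may or may not be
# installed; returns False rather than failing when it is absent.
import shutil
import subprocess

def has_videotoolbox_hevc() -> bool:
    """True if the local ffmpeg build lists Apple's hardware HEVC encoder."""
    ffmpeg = shutil.which("ffmpeg")
    if ffmpeg is None:
        return False
    result = subprocess.run(
        [ffmpeg, "-hide_banner", "-encoders"],
        capture_output=True, text=True,
    )
    return "hevc_videotoolbox" in result.stdout

if __name__ == "__main__":
    if has_videotoolbox_hevc():
        # A 10-bit hardware encode could then be tried with, e.g.:
        #   ffmpeg -i in.mov -c:v hevc_videotoolbox -profile:v main10 \
        #          -b:v 20M -tag:v hvc1 out.mp4
        # while watching the GPU in Activity Monitor.
        print("hevc_videotoolbox available")
    else:
        print("no VideoToolbox HEVC encoder found")
```

If a main10 encode outside Resolve keeps the dedicated GPU busy while Resolve's render does not, the CPU/GPU handoff is likely happening inside Resolve rather than in the encoder itself.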


Setup
Resolve 15 Public Release
macOS 10.13.6 & 10.14 Beta (18A365a)
MacBook Pro 15-inch 2018
Intel Core-i9 8950HK 6-Core
AMD Radeon Pro 560X 4GB / Intel UHD 630
32GB DDR4 2400MHz
Attachments
8 bit render.jpg
8 bit render.jpg (622.6 KiB) Viewed 2024 times
10 bit render.jpg
10 bit render.jpg (989.99 KiB) Viewed 2024 times
Nikon Z6 | Ninja V ProRes RAW
Resolve Studio 18 Beta | macOS 12.3.1
Apple M1 Max MacBook Pro, 10 core CPU, 32-core GPU, 64GB memory
Pro Display XDR | Micro Panel

Andrew Kolakowski

  • Posts: 9211
  • Joined: Tue Sep 11, 2012 10:20 am
  • Location: Poland

Re: 10-bit HEVC Render GPU utilization

Posted: Fri Aug 17, 2018 6:02 pm

ProRes is definitely not GPU encoded. The CPU is probably waiting for the GPU to do the work, which is why CPU load is low for ProRes.
The others should be GPU encoded, I think, so the very high CPU load is somewhat strange.

rick.lang

  • Posts: 17262
  • Joined: Wed Aug 22, 2012 5:41 pm
  • Location: Victoria BC Canada

10-bit HEVC Render GPU utilization

Posted: Fri Aug 17, 2018 9:44 pm

I am currently rendering HEVC 10-bit from ProRes 444 XQ. My CPU usage is quite steady at around 45–50%, and the GPU is steady and not heavily used. So it doesn't look like I'm going back and forth between the CPU and GPU.

For my computer, a late-2015 27" iMac, I know that h.264 8-bit is done in the CPU's Quick Sync hardware and would be much faster, but I have switched to HEVC 10-bit for everything.

Sent from my iPhone using Tapatalk
Last edited by rick.lang on Sun Aug 19, 2018 6:00 pm, edited 1 time in total.
Rick Lang

Christopher Dobey

  • Posts: 234
  • Joined: Fri Jul 18, 2014 5:58 pm
  • Location: San Francisco Bay Area, California

Re: 10-bit HEVC Render GPU utilization

Posted: Sun Aug 19, 2018 5:21 pm

Hey Rick, which GPU did you equip your iMac with? The Radeon R9 M380, M390, M395, or M395X?

Also, the timeline rendered is all 4000x2160 URSA Mini 4K CinemaDNG RAW 3:1 footage.
Nikon Z6 | Ninja V ProRes RAW
Resolve Studio 18 Beta | macOS 12.3.1
Apple M1 Max MacBook Pro, 10 core CPU, 32-core GPU, 64GB memory
Pro Display XDR | Micro Panel

Cary Knoop

  • Posts: 1452
  • Joined: Sun Mar 12, 2017 6:35 pm
  • Location: Newark, CA USA

Re: 10-bit HEVC Render GPU utilization

Posted: Sun Aug 19, 2018 5:29 pm

Encoding a quality HEVC video takes much more CPU time than ProRes; you are almost comparing two extremes in that respect. So in the case of ProRes, the bottleneck shifts from the CPU to the GPU.

In my opinion GPU based HEVC is simply not good enough.

Jean Claude

  • Posts: 2973
  • Joined: Sun Jun 28, 2015 4:41 pm
  • Location: France

Re: 10-bit HEVC Render GPU utilization

Posted: Sun Aug 19, 2018 5:52 pm

I'll do a CPU/GPU comparison tomorrow: I have an HEVC @ 59.94 fps UHD HDR10+ clip, and my CUDA cores feast on reading it in real time...

With some lossless codecs, I think this clip has an HDD throughput problem. :?
"Saying it is good, but doing it is better! "
Win10-1809 | Resolve Studio V16.1 | Fusion Studio V16.1 | Decklink 4K Extreme 6G | RTX 2080Ti 431.86 NSD driver! |

rick.lang

  • Posts: 17262
  • Joined: Wed Aug 22, 2012 5:41 pm
  • Location: Victoria BC Canada

Re: 10-bit HEVC Render GPU utilization

Posted: Sun Aug 19, 2018 6:04 pm

Christopher Dobey wrote: Hey Rick which GPU did you equip your iMac with? Radeon R9 M380, M390, M395, M395X?

I'm using the 4GB AMD R9 M395X.

Christopher Dobey wrote: Also, the timeline rendered is all 4000x2160 URSA Mini 4K CinemaDNG RAW 3:1 footage.

Ah, that's quite a different workload than mine going from ProRes to HEVC.


Sent from my iPhone using Tapatalk
Rick Lang

rick.lang

  • Posts: 17262
  • Joined: Wed Aug 22, 2012 5:41 pm
  • Location: Victoria BC Canada

10-bit HEVC Render GPU utilization

Posted: Sun Aug 19, 2018 6:13 pm

Cary Knoop wrote: ... In my opinion GPU based HEVC is simply not good enough.


Cary, you may be right. My initial tests comparing h.264 8-bit and h.265 8-bit clearly showed that h.264 gave a better image, so I stayed with h.264.

But for my recent deliverable, over 63 minutes at about 8.5 GB, storage size was a consideration for the client, so I went with h.265 10-bit. I think it looks better than the 8-bit would have, and with significantly less data. Especially since I did both a colour and a monochrome version.
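For reference, those figures (about 8.5 GB over 63 minutes) work out to an average bitrate of roughly 18 Mbps; a quick sanity check, assuming decimal gigabytes:

```python
# Back-of-envelope average bitrate for an ~8.5 GB, ~63-minute deliverable.
# Figures are from the post above; decimal gigabytes (10^9 bytes) assumed.
size_bytes = 8.5e9       # ~8.5 GB file
duration_s = 63 * 60     # ~63 minutes

bitrate_mbps = size_bytes * 8 / duration_s / 1e6
print(f"average bitrate ≈ {bitrate_mbps:.1f} Mbps")  # ≈ 18.0 Mbps
```

That is a fairly generous rate for UHD h.265 10-bit, which helps explain why it can look better than an 8-bit h.264 file of similar size.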


Sent from my iPhone using Tapatalk
Rick Lang

Christopher Dobey

  • Posts: 234
  • Joined: Fri Jul 18, 2014 5:58 pm
  • Location: San Francisco Bay Area, California

Re: 10-bit HEVC Render GPU utilization

Posted: Sun Aug 19, 2018 8:27 pm

GPU vs CPU encoding perceptual visual quality is a great topic, but I'd like to answer the original question of why 10-bit HEVC is the only codec that shares its processing between the CPU and GPU.

Tested AVC/H.264: full GPU utilization, leaving the CPU nearly untouched as well.
Attachments
AVC.jpg
AVC.jpg (346.98 KiB) Viewed 1762 times
Nikon Z6 | Ninja V ProRes RAW
Resolve Studio 18 Beta | macOS 12.3.1
Apple M1 Max MacBook Pro, 10 core CPU, 32-core GPU, 64GB memory
Pro Display XDR | Micro Panel
