
10-bit HEVC Render GPU utilization

PostPosted: Fri Aug 17, 2018 5:35 pm
by Christopher Dobey
Rendering 10-bit HEVC causes a continuous utilization handoff between the GPU and CPU, as seen in Activity Monitor.app.
Rendering 8-bit HEVC or ProRes uses the dedicated AMD GPU exclusively.
Same results on the latest versions of High Sierra and the Mojave beta.

Is this expected behaviour, or is 10-bit HEVC encoding not fully hardware-accelerated?


Setup
Resolve 15 Public Release
macOS 10.13.6 & 10.14 Beta (18A365a)
MacBook Pro 15-inch 2018
Intel Core i9-8950HK 6-Core
AMD Radeon Pro 560X 4GB / Intel UHD 630
32GB DDR4 2400MHz
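
In case it helps narrow this down, here's a minimal VideoToolbox probe I'd try (my own sketch; it assumes Resolve goes through VideoToolbox for HEVC export, which Blackmagic hasn't confirmed). It just asks macOS whether a hardware-only HEVC encoder is offered at the timeline's 4000x2160 frame size:

import CoreMedia
import VideoToolbox

// Ask VideoToolbox for a hardware-only HEVC encoder at the timeline frame size.
// A non-zero status (e.g. kVTCouldNotFindVideoEncoderErr) would mean the encode
// has to fall back to a software (CPU) encoder.
let spec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]

var encoderID: CFString?
var supportedProps: CFDictionary?
let status = VTCopySupportedPropertyDictionaryForEncoder(
    width: 4000,
    height: 2160,
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: spec as CFDictionary,
    encoderIDOut: &encoderID,
    supportedPropertiesOut: &supportedProps
)

if status == noErr, let id = encoderID {
    print("Hardware HEVC encoder available: \(id as String)")
} else {
    print("No hardware-only HEVC encoder found (status \(status))")
}

This only tells you whether a hardware HEVC encoder exists at all; it doesn't distinguish the 8-bit and 10-bit paths, which is the part I'm really asking about.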

Re: 10-bit HEVC Render GPU utilization

PostPosted: Fri Aug 17, 2018 6:02 pm
by Andrew Kolakowski
ProRes is definitely not GPU-encoded. The CPU is probably waiting for the GPU to do the work, which is why CPU load is low for ProRes.
The others should be GPU-encoded, I think, so the very high CPU load is somewhat strange.

Re: 10-bit HEVC Render GPU utilization

PostPosted: Fri Aug 17, 2018 9:44 pm
by rick.lang
I am currently rendering 10-bit HEVC from ProRes 444 XQ. My CPU usage is quite steady at around 45-50%, and GPU usage is steady and not heavy. So it doesn't look like I'm going back and forth between the CPU and GPU.

On my computer, a late-2015 27" iMac, I know that 8-bit h.264 is done in the CPU's Quick Sync hardware and would be much faster, but I have switched to 10-bit HEVC for everything.


Re: 10-bit HEVC Render GPU utilization

PostPosted: Sun Aug 19, 2018 5:21 pm
by Christopher Dobey
Hey Rick, which GPU did you equip your iMac with? Radeon R9 M380, M390, M395, or M395X?

Also, the timeline rendered is all 4000x2160 URSA Mini 4K CinemaDNG RAW 3:1 footage.

Re: 10-bit HEVC Render GPU utilization

PostPosted: Sun Aug 19, 2018 5:29 pm
by Cary Knoop
Encoding a quality HEVC video takes much more CPU time than ProRes; you are almost comparing two extremes in that respect. So in the case of ProRes, the bottleneck shifts from the CPU to the GPU.

In my opinion, GPU-based HEVC encoding is simply not good enough.

Re: 10-bit HEVC Render GPU utilization

PostPosted: Sun Aug 19, 2018 5:52 pm
by Jean Claude
I'll do a CPU / GPU comparison tomorrow: I have an HEVC clip @ 59.94 fps, UHD, HDR10+, and my CUDA cores feast on reading it in real time ...

I think that with some lossless codecs it's an HDD throughput problem with this clip. :?

Re: 10-bit HEVC Render GPU utilization

PostPosted: Sun Aug 19, 2018 6:04 pm
by rick.lang
Christopher Dobey wrote: Hey Rick, which GPU did you equip your iMac with? Radeon R9 M380, M390, M395, or M395X?


I’m using the 4GB AMD R9 M395X.

Christopher Dobey wrote: Also, the timeline rendered is all 4000x2160 URSA Mini 4K CinemaDNG RAW 3:1 footage.


Ah, that's quite a different workload from mine, going from ProRes to HEVC.



Re: 10-bit HEVC Render GPU utilization

PostPosted: Sun Aug 19, 2018 6:13 pm
by rick.lang
Cary Knoop wrote: ... In my opinion, GPU-based HEVC encoding is simply not good enough.


Cary, you may be right. My initial tests comparing 8-bit h.264 and 8-bit h.265 clearly showed that h.264 gave a better image, so I stayed with h.264.

But for my recent deliverable, over 63 minutes at about 8.5GB, storage size was a consideration for the client, so I went with 10-bit h.265. I think it looks better than the 8-bit would have, and with significantly less data, especially since I did both a colour and a monochrome version.
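
(Rough sanity check on those numbers: 8.5 GB over 63 minutes is about 68,000 Mb / 3,780 s, so roughly an 18 Mb/s average bitrate.)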



Re: 10-bit HEVC Render GPU utilization

PostPosted: Sun Aug 19, 2018 8:27 pm
by Christopher Dobey
GPU vs CPU perceptual encoding quality is a great topic, but I'd like to answer the original question of why 10-bit HEVC is the only codec that shares its processing between the CPU and GPU.

I also tested AVC/H.264: full GPU utilization, leaving the CPU nearly untouched.
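
To poke at that from the VideoToolbox side, here's a second sketch I'd try (again my own, and again assuming Resolve's HEVC export goes through VideoToolbox): open a hardware-only HEVC compression session and then ask it to accept the Main10 (10-bit) profile. If the hardware session rejects Main10, that would line up with the 10-bit encode being shared with the CPU.

import CoreMedia
import VideoToolbox

// Open a hardware-only HEVC compression session at the timeline frame size,
// then request the Main10 (10-bit) profile and see whether the session accepts it.
var session: VTCompressionSession?
let spec: [CFString: Any] = [
    kVTVideoEncoderSpecification_RequireHardwareAcceleratedVideoEncoder: true
]

let createStatus = VTCompressionSessionCreate(
    allocator: nil,
    width: 4000,
    height: 2160,
    codecType: kCMVideoCodecType_HEVC,
    encoderSpecification: spec as CFDictionary,
    imageBufferAttributes: nil,
    compressedDataAllocator: nil,
    outputCallback: nil,
    refcon: nil,
    compressionSessionOut: &session
)

if createStatus == noErr, let session = session {
    let profileStatus = VTSessionSetProperty(
        session,
        key: kVTCompressionPropertyKey_ProfileLevel,
        value: kVTProfileLevel_HEVC_Main10_AutoLevel
    )
    print(profileStatus == noErr
        ? "Hardware session accepted HEVC Main10"
        : "Hardware session rejected Main10 (status \(profileStatus))")
    VTCompressionSessionInvalidate(session)
} else {
    print("Could not create a hardware-only HEVC session (status \(createStatus))")
}

If anyone builds this as a small command-line tool with swiftc and runs it on a 560X machine versus another Mac, I'd be curious whether the results differ.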