So I did some tests:
3x 4000x3000 10-bit clips. All clips were on separate layers stacked on top of each other, with transparency so you could see all clips at once.
VRAM: 7500MB Used (500MB Free)
GPU Usage: 80%
CPU Usage: 7%
This performs very well and well above my normal usage of 1-2 clips at once.
4x 4000x3000 10-bit clips:
GPU Usage: 100%
CPU Usage: 100%
This indicates to me that DaVinci correctly decides I'm unable to render more than 3x 4000x3000 clips on my GPU, which is great, as it's working as intended.
This, however, also supports my theory that it incorrectly assumes I can't run a single 5312x4648 clip (we already know we can, as Premiere and video players can decode it on the GPU).
5.3K (5312x4648) = 24,690,176 pixels
3x 4000x3000 = 36,000,000 pixels
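For anyone who wants to double-check the maths, here's the comparison in plain Python (nothing assumed beyond the resolutions already mentioned):

Code:
single_53k = 5312 * 4648        # one 5.3K clip
triple_4k = 3 * (4000 * 3000)   # three 4K clips stacked

print(f"5.3K single clip: {single_53k:,} pixels")  # 24,690,176
print(f"3x 4000x3000:     {triple_4k:,} pixels")   # 36,000,000
print(f"Ratio: {single_53k / triple_4k:.2f}")      # ~0.69

So the single 5.3K clip is roughly a third lighter, pixel-wise, than the three-clip stack that plays fine.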
CougerJoe wrote: So 4K 10bit HEVC uses the decoder but not 5.3K 10bit HEVC. I guess it could be a lack of VRAM.
There's definitely enough VRAM. From my testing of reducing other VRAM usage to give Resolve as much of the 8GB as possible, it seems Resolve won't even try to use the decoder with 5.3K.
CougerJoe wrote: What happens if you start a new 4K project, bring in the video with nothing else, does that play smoothly?
That was my original test anyway; it's a brand-new project with nothing but the added clip.
----------
Also, just checking my DaVinci logs:
NVDEC decodes HEVC, chroma 4:2:0, bitdepth 10, upto 8192 x 8192
So I should be fine, as my content is 4:2:0 and well under the 8192x8192 limit.
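As a side note, a quick way to confirm the hardware decoder itself handles 5.3K outside of Resolve is to force NVDEC through ffmpeg. This is just a sketch: it assumes an ffmpeg build with NVDEC support, and the filename is a placeholder for the actual test clip.

Code:
import subprocess

# Decode-only test: force the NVDEC HEVC decoder and discard the output.
# "clip_5312x4648.mp4" is a placeholder filename.
result = subprocess.run(
    [
        "ffmpeg",
        "-c:v", "hevc_cuvid",        # NVDEC HEVC decoder
        "-i", "clip_5312x4648.mp4",
        "-f", "null", "-",           # no encode, just decode and drop frames
    ],
    capture_output=True,
    text=True,
)
# ffmpeg reports decode speed on stderr; speed >= 1x means realtime decode
print(result.stderr.splitlines()[-1] if result.stderr else result.returncode)

If that reports better than realtime speed, the decoder is fine with the resolution and the limit is somewhere in Resolve's own logic.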
----------
I've done some new testing and have very strange results:
4:3 - 5312x3984 = GPU Decode
16:9 - 5312x2988 = CPU Decode
8:7 - 5312x4648 = CPU Decode
I used both a 25fps and a 30fps clip, and both only GPU-decoded at 4:3 (5312x3984).
Note that the 4:3 frame actually has more pixels than the 16:9 one, so pixel count alone can't explain why the 16:9 clip falls back to CPU decode; the numbers are below.
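Putting the three tested resolutions side by side (plain arithmetic, nothing assumed beyond the results above):

Code:
# Pixel counts for the three test resolutions. The 16:9 clip has the
# fewest pixels yet still falls back to CPU decode, while the larger
# 4:3 clip gets GPU decode.
tests = {
    "4:3  5312x3984 (GPU decode)": 5312 * 3984,  # 21,163,008
    "16:9 5312x2988 (CPU decode)": 5312 * 2988,  # 15,872,256
    "8:7  5312x4648 (CPU decode)": 5312 * 4648,  # 24,690,176
}
for label, pixels in tests.items():
    print(f"{label}: {pixels:,} pixels")

Whatever the trigger is, it looks tied to frame dimensions or aspect ratio rather than total pixel count.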