Wed Feb 05, 2014 1:17 am
Bearing in mind the number of thorough replies on this post, explaining to me in plain English what to do and how to do it, I feel obliged to post about the results. I ended up getting the EVGA GTX 680 Mac Edition (so 2 GB only), which I thought would be more appropriate for a first-timer like me. Right now the GT 120 is used for the GUI and the 680 for image processing. Of course, I'm really happy with the results, even though the only thing I can compare it to is the Radeon 5770 handling both GUI and image processing.
However, I've noticed something weird and can't miss the opportunity to pick your brains about it.
I currently have a project with both 2K and 4K Sony F55 raw footage in it. When rendering to ProRes 4444 on the desktop with identical settings:
2K:
- renders at 30-32 fps
- CPU up to 99% (according to Activity Monitor)
- GPU around 30% (according to atMonitor)
- VRAM at 50%
- network usage around 40 MB/s (the raws are on a server, so this is the average read speed during the render)

4K:
- renders at 15-15.5 fps
- CPU up to 60%
- GPU around 30%
- VRAM at 50%
- network usage around 80 MB/s
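
For anyone who wants to reproduce these numbers, here is a minimal sketch of how the CPU and network figures could be logged during a render. It assumes Python with the psutil package installed; it only covers CPU and network, since the GPU and VRAM numbers have to come from something like atMonitor:

```python
# Minimal utilization logger -- a sketch, assuming the psutil
# package is installed (pip install psutil). Covers CPU and
# network only; GPU/VRAM figures came from atMonitor.
import time
import psutil

INTERVAL = 1.0                      # seconds between samples
psutil.cpu_percent(interval=None)   # prime the counter
last = psutil.net_io_counters()

while True:
    time.sleep(INTERVAL)
    cpu = psutil.cpu_percent(interval=None)   # % since previous call
    now = psutil.net_io_counters()
    mb_in = (now.bytes_recv - last.bytes_recv) / (1024 * 1024) / INTERVAL
    last = now
    print(f"CPU {cpu:5.1f}%   network in {mb_in:6.1f} MB/s")
```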
The last time I tested, the network was capable of a 300 MB/s read speed.
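
In case anyone wants to sanity-check a figure like that, one way is to simply time a large sequential read off the share. A rough sketch (the mount point and file name are hypothetical placeholders, not my actual paths):

```python
# Rough sequential-read benchmark for the server share -- a sketch.
# The path below is a placeholder; point it at a large file on the
# mount. Re-run with a different file to avoid the OS page cache
# serving a cached copy and inflating the result.
import time

PATH = "/Volumes/RAID/some_large_clip.mxf"  # hypothetical path
CHUNK = 8 * 1024 * 1024                     # 8 MB reads

start = time.time()
total = 0
with open(PATH, "rb") as f:
    while True:
        data = f.read(CHUNK)
        if not data:
            break
        total += len(data)
elapsed = time.time() - start
print(f"read {total / 1e6:.0f} MB in {elapsed:.1f} s "
      f"-> {total / 1e6 / elapsed:.1f} MB/s")
```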
QUESTION: If the GPU, CPU, VRAM, and network are not maxed out, then why is the render speed slower at 4K?
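
For scale, a quick back-of-the-envelope on pixel throughput (assuming the F55's 2K frame is 2048x1080 and its 4K frame is 4096x2160, i.e. exactly four times the pixels):

```python
# Back-of-the-envelope pixel throughput -- assumes F55 2K is
# 2048x1080 and 4K is 4096x2160 (4x the pixels per frame).
px_2k = 2048 * 1080
px_4k = 4096 * 2160

mpx_per_s_2k = px_2k * 31 / 1e6    # ~31 fps observed at 2K
mpx_per_s_4k = px_4k * 15 / 1e6    # ~15 fps observed at 4K

print(f"2K: {mpx_per_s_2k:.0f} Mpx/s")   # ~69 Mpx/s
print(f"4K: {mpx_per_s_4k:.0f} Mpx/s")   # ~133 Mpx/s
```

By that count the 4K render is actually pushing roughly twice the pixels per second of the 2K one, which makes the half-idle CPU and GPU all the more puzzling to me.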
Thanks for your input
Stepan