Michael_Andreas wrote:Responding to some specific objections:
--No cameras encode to h.264 or h.265. Maybe yours don't, but lots of DSLRs, action cameras, and drones do.
Don't be a politician.
"8K H.265 100 Mbps: this is a delivery codec; nobody shoots this with a camera or edits it."
"4K H.264 150 Mbps 8-bit: C200 AVC soap opera interview codec, possible."
Michael_Andreas wrote:--Most filming is done at less than 59.94/60 Hz. Well yes, probably. Why does that make the benchmark worthless? If Codex X recorded at 60 FPS is processed at framerate Y, do we expect Codex X recorded at 30 FPS to be processed at a framerate significantly different than Y?
59.94/60 fps 8k.R3D at 22:1 runs on a laptop, while 23.976/24 fps 8k.R3D at 5:1 cripples a $10k workstation with an i9-9980XE and a Quadro RTX 8000. The frame rate is not the variable that matters; the compression ratio is.
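A back-of-the-envelope calculation shows why. The numbers below are my own illustrative assumptions (8192x4320 photosites, 16 bits each before REDCODE compression), not RED's published data rates:

```python
# Back-of-the-envelope: why compression ratio dominates frame rate.
# Assumptions (illustrative): 8K sensor at 8192x4320, 16 bits per
# photosite of raw data before REDCODE wavelet compression.

def r3d_data_rate_mb_s(width, height, bits_per_px, fps, ratio):
    """Approximate compressed data rate in MB/s at a given REDCODE ratio."""
    raw_bytes_per_frame = width * height * bits_per_px / 8
    return raw_bytes_per_frame * fps / ratio / 1e6

fast = r3d_data_rate_mb_s(8192, 4320, 16, 60, 22)   # 60 fps at 22:1
heavy = r3d_data_rate_mb_s(8192, 4320, 16, 24, 5)   # 24 fps at 5:1

print(f"60 fps at 22:1: {fast:.0f} MB/s, {fast / 60:.1f} MB/frame")
print(f"24 fps at 5:1:  {heavy:.0f} MB/s, {heavy / 24:.1f} MB/frame")
```

Even at less than half the frame rate, the 5:1 stream moves roughly 1.75x the data per second, and each frame carries 4.4x (22/5) as much wavelet data to decode.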
Michael_Andreas wrote:--The benchmarks don't tell you which part of your system is being stressed. Absolutely agree, which is why I ran the Open Hardware Monitor and published the results. Can you point to a better Benchmark software that gives the kind of insight you think it should without using another tool?
When the only way to get from the US to Europe is swimming, that doesn't make swimming a good solution.
Michael_Andreas wrote:For an example of how to use the benchmark, I could compare my usage case to one of the codecs and a similar amount of correction processing and evaluate whether it would be worthwhile upgrading to a 2080Ti given that their benchmark runs give about a 2x increase in performance. I could put the benchmark files on a spinning hard drive to see how that affected it. YMMV.
For a GPU, three things are important: fp32 performance, the amount of memory, and the memory bandwidth.
An RTX 2080 Ti is fast enough to debayer 8k.R3D 5:1 at 24 fps in real time, but it lacks the memory to do (heavy) TNR, and its memory bandwidth is too low to make full use of its fp32 performance.
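To put rough numbers on the memory claim, here is a sketch under my own assumptions (fp32 RGBA frames after debayer, a window of neighbouring frames plus a few scratch buffers held in VRAM; this is not Resolve's actual allocator):

```python
# Rough VRAM arithmetic for temporal noise reduction (TNR) on 8K frames.
# Assumptions (illustrative): frames sit in VRAM as fp32 RGBA after
# debayer, and TNR keeps a window of neighbouring frames plus scratch
# buffers for motion estimation and blending resident on the GPU.

def frame_gb(width, height, channels=4, bytes_per_channel=4):
    """Size of one uncompressed fp32 RGBA frame in GB."""
    return width * height * channels * bytes_per_channel / 1e9

def tnr_vram_gb(width, height, window_frames, scratch_buffers=3):
    """Approximate VRAM held by TNR alone, ignoring the rest of the node graph."""
    return frame_gb(width, height) * (window_frames + scratch_buffers)

print(f"one 8K fp32 frame: {frame_gb(8192, 4320):.2f} GB")
print(f"TNR with an 11-frame window: {tnr_vram_gb(8192, 4320, 11):.1f} GB")
```

Each debayered 8K fp32 frame is over half a gigabyte, so a TNR window alone can claim most of a 2080 Ti's 11 GB before the rest of the grade gets a buffer; the Quadro RTX 8000's 48 GB is why it copes.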