10-bit color and selecting a GPU
Earlier this year I built a baseline (Windows) computer system, designed with the idea that I would be acquiring a BMCC and DaVinci Resolve at a later time. I made sure to get a monitor capable of displaying 10-bit color depth, but I held off on purchasing a GPU since I wasn't going to need one until I actually started working with footage that required some muscle. Well, the time is fast approaching when I will be getting my hands on my camera, and I am looking at GPUs up to the task.
The last several generations of Nvidia GPUs have supported 10-bit color output on Windows systems, but I recently found out that professional programs like Adobe After Effects aren't compatible with the way it was implemented on those cards; instead, they require the Quadro series to output 10-bit color. Is this true of Resolve as well?
I understand that there is a significant difference between gaming and workstation cards, but the consumer 3D accelerators have roughly ten times as many CUDA cores as similarly priced workstation cards.
I am a budding independent filmmaker working on my own time and my own budget, so I do not need the best and fastest card possible. I can't even dream of owning something like a K6000, so that isn't the type of answer I am looking for. What I am wondering is whether 10-bit color output is possible using a card in the GTX family, or whether I am better off spending a little extra money on something like a K2000. I don't ever intend to do 3D rendering for VFX or anything along those lines; I am just looking for something that will let me get the most out of Resolve within a reasonable price range. Thanks in advance for any help related to this.