I believe you only need CUDA for noise reduction in Resolve.
Also, where are the results showing it's blowing NVIDIA out of the water? They're comparing against the 680MX (the mobile version found in the new iMacs, etc.) and the Quadro 4000 (weak compared to a 580/680).
My friend ran the Resolve candle test on the 680MX in his new iMac:
V01 04 blur nodes 24 fps
V02 08 blur nodes 20 fps
V03 16 blur nodes 10 fps
V04 32 blur nodes 5 fps
V05 64 blur nodes 3.5 fps
V06 01 NR 2 nodes 7.5 fps
V07 03 NR 2 nodes 3 fps
V08 06 NR 2 nodes 1.5 fps
This is from my GTX580:
V01 04 blur nodes 24 fps
V02 08 blur nodes 24 fps
V03 16 blur nodes 13 fps
V04 32 blur nodes 7 fps
V05 64 blur nodes 3.5 fps
V06 01 NR 2 nodes 13 fps
V07 03 NR 2 nodes 5 fps
V08 06 NR 2 nodes 2.5 fps
And this is from our GTX670:
V01 04 blur nodes 24 fps
V02 08 blur nodes 24 fps
V03 16 blur nodes 13 fps
V04 32 blur nodes 7 fps
V05 64 blur nodes 3.5 fps
V06 01 NR 2 nodes 10 fps
V07 03 NR 2 nodes 3.5 fps
V08 06 NR 2 nodes 2 fps
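To put the noise-reduction results above in perspective, here is a quick sketch (plain Python, using only the fps numbers quoted above) that computes each desktop card's NR speedup over the 680MX:

```python
# Candle-test NR results quoted above (fps) for V06-V08 (1, 3, 6 NR nodes).
nr_fps = {
    "680MX":  [7.5, 3.0, 1.5],
    "GTX580": [13.0, 5.0, 2.5],
    "GTX670": [10.0, 3.5, 2.0],
}

# Speedup of each desktop card relative to the iMac's 680MX, per test.
for card in ("GTX580", "GTX670"):
    speedups = [desk / imac for desk, imac in zip(nr_fps[card], nr_fps["680MX"])]
    print(card, ["%.2fx" % s for s in speedups])
```

Both desktop cards come out well ahead of the 680MX on every NR test, which matches the point below about mobile vs desktop parts.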
The iMac's 680MX is good, but still not on par with desktop graphics cards (as is to be expected).
Have you seen the compute benchmarks of the new GTX Titan against the 7970 (which should be faster than the 7950 in the above link)?
http://www.anandtech.com/show/6774/nvid ... unveiled/3
I don't think Radeon is blowing NVIDIA out of the water.