OpenGL guy said: So when/where was it stated that the GeForce FX could do 1 32-bit FLOP per cycle and 2 16-bit FLOPs per cycle?
I can't guarantee that's the case. It could be the opposite: 1 16-bit FLOP per cycle, or 1 32-bit FLOP every two cycles. But that seems unlikely.
But what I can guarantee is that the GFFX is twice as fast (memory bandwidth aside, of course) when using FP16 as when using FP32.
And I can also guarantee that the GFFX benchmarks we've received so far were run in 32-bit.
Here's a quote from Brian Burke, of nVidia PR:
The GeForce FX is 128-bit throughout the pipeline. The GeForce FX runs 128-bit, 64-bit and 32-bit natively. Going from 128 to 64, for example, will result in a performance doubling. This will not be the case for a GPU that does not run these modes natively. To run 64-bit on one of those GPUs, they will still incur the performance hit associated with 128-bit.
We selected the 32-bit benchmarks because that would give people the best frame of reference, as that has been the standard for some time.
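For anyone wondering where those "128-bit / 64-bit / 32-bit" figures come from: they're per-pixel widths across four color channels (RGBA), i.e. 4 x FP32, 4 x FP16, and 4 x 8-bit integer respectively. A quick sketch of the arithmetic (the mode names are just labels, not anything from nVidia's docs):

```python
# Per-pixel pipeline width = 4 channels (RGBA) x bits per channel.
# 128-bit mode = 4 x FP32, 64-bit = 4 x FP16, 32-bit = 4 x int8.
BITS_PER_CHANNEL = {
    "128-bit (FP32)": 32,
    "64-bit (FP16)": 16,
    "32-bit (int8)": 8,
}

for mode, channel_bits in BITS_PER_CHANNEL.items():
    pixel_bits = 4 * channel_bits  # RGBA
    print(f"{mode}: {pixel_bits} bits/pixel")
```

So halving the per-channel precision (FP32 -> FP16) halves the data moved per pixel, which is consistent with the claimed performance doubling on hardware that runs these modes natively.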
Uttar