Just read that IBM added a benchmark for some new PowerPC processors that works like "benchmark result per Watt" or something along those lines. I guess that's a reaction to Prescott... What do you think? Would that be a good idea for graphics benchmarks, too? I mean, NVidia and ATI can't keep increasing power consumption the way they have over the last two generations. So in the end (a few graphics card generations down the road) the winner might be whoever delivers the highest computational power for a given power consumption / heat dissipation.
In light of this, I think a good graphics card review these days should at least include a quick check of a card's power consumption, both at idle and under full load.
What do you guys think?