NVIDIA GF100 & Friends speculation

Strange, only the first page opens for me. Have the other pages been taken down?
How much power does it consume?

The pages open fine for me: http://en.expreview.com/2010/08/09/world-exclusive-review-512sp-geforce-gtx-480/9070.html/6

Anyway, the system based on the 512-SP GTX 480 draws 204W more than the system based on the standard GTX 480.

So the card itself probably draws close to ~300 + 200 = 500W, which is so huge that I wonder whether it's even possible.
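A quick back-of-envelope sketch of that estimate, in case anyone wants to check the arithmetic (the 300W baseline and the 85% PSU efficiency are my guesses, not figures from the review):

# Back-of-envelope check of the ~500W figure (assumed numbers, not from the review)
baseline_card_w = 300.0          # guess: reference GTX 480 draw under FurMark
psu_efficiency = 0.85            # guess: PSU efficiency at this load
delta_at_wall_w = 204.0          # measured difference in total system draw

delta_dc_w = delta_at_wall_w * psu_efficiency      # extra power actually delivered inside the system
estimated_card_w = baseline_card_w + delta_dc_w

print(f"Extra DC-side power: ~{delta_dc_w:.0f}W")            # ~173W
print(f"Implied 512SP card draw: ~{estimated_card_w:.0f}W")  # ~473W

Accounting for PSU efficiency trims the delta a little, but the implied draw is still in the same absurd ballpark.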
 
Their text on that page is below.

As we had expected, the stand-by power consumption of the 512SP GTX 480 was 17W higher.

Under full load the GPU voltage of the reference GTX 480 was 1.0V, while the 512SP edition was at 1.056V. Surprisingly, the full-spec GTX 480 drew 644W, which was 204W higher than the 480SP GTX 480!
 
That's quite a lot, but it seems a bit fishy to me, almost as if they'd just taken a 480SP part and found a way to unlock the remaining SPs. Otherwise the additional SPs, which amount to only a 6.7% increase in SP count, and thus to less than a 6.7% increase in active die area, should not lead to more than about a 7% increase in total power (at least not without a correspondingly significant increase in clock speed), unless those additional SPs had major process problems that dramatically increased their power draw (and thus should have been disabled in the first place).

Bear in mind, though, that a couple of things could exacerbate this. First, if their load scenario used a GPU-limited benchmark, higher performance on the GPU side would mean the rest of the system has to work harder, which may lead to higher power draw from other components. Second, this is total system power draw, which is inflated further by the inefficiency of the power supply. I don't think these things can fully explain the large discrepancy, though.
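To put a rough number on that "7% increase" argument (the 300W card draw, the share of power attributable to the SPs, and the PSU efficiency below are all assumptions of mine, not from the review):

# Rough upper bound on what 32 extra SPs should add, all else equal (assumed figures)
card_load_w = 300.0            # guess: reference GTX 480 draw under FurMark
sp_power_share = 0.8           # guess: fraction of card power attributable to the SPs
extra_sp_fraction = 32 / 480   # 6.7% more SPs at the same clocks and voltage
psu_efficiency = 0.85          # guess: PSU efficiency at this load

extra_card_w = card_load_w * sp_power_share * extra_sp_fraction
extra_wall_w = extra_card_w / psu_efficiency    # what a wall-socket meter would see

print(f"Expected extra card power: ~{extra_card_w:.0f}W")   # ~16W
print(f"Expected extra at the wall: ~{extra_wall_w:.0f}W")  # ~19W, nowhere near 204W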
 
That's quite a lot, but it seems a bit fishy to me, almost as if they'd just taken a 480SP part and found a way to unlock the remaining SPs. Otherwise the additional SPs, which amount to only a 6.7% increase in SP count, and thus to less than a 6.7% increase in active die area, should not lead to more than about a 7% increase in total power (at least not without a correspondingly significant increase in clock speed).
Don't forget the higher voltage. That said, their conclusion doesn't make much sense, since the higher voltage was almost certainly needed for the higher clocks the card originally had (before it was downclocked for the review), not for the additional SM. The voltage bump alone could easily cause another 10-20% increase in power draw. Still, all of it together (voltage, the additional SM, PSU efficiency) could only explain about half of the 200W difference, and the CPU having to work harder should only make a tiny difference.
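As a rough check on that 10-20% figure: dynamic power scales roughly with V squared at a fixed clock, and leakage tends to rise even faster with voltage, so this is if anything a low estimate:

# Effect of the voltage bump alone, assuming dynamic power scales with V^2
v_reference = 1.000     # reference GTX 480 load voltage from the review
v_512sp = 1.056         # 512SP card load voltage from the review

scaling = (v_512sp / v_reference) ** 2
print(f"Dynamic power increase from voltage alone: ~{(scaling - 1) * 100:.1f}%")  # ~11.5%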
 
It is total system power draw they measured. Despite their system being quite beefy, I can't imagine their load test putting much strain on the CPU at all, since it's FurMark. Even allowing for all of those issues, though, a delta of 204 watts is insane. The voltage difference was only 5.6% (1.056V vs 1.000V). I figure there are definite leakage and power issues in the core. It's definitely not a 'fixed' product.

The test bed includes an Intel Core i5-750 (overclocked to 4.0GHz), 4GB of dual-channel DDR3 memory (overclocked to 1600MHz), Windows 7 Ultimate 64-bit, and ForceWare 258.96 WHQL drivers.

The maximum value recorded 5 minutes after the PC entered the Windows 7 desktop was taken as the stand-by power consumption; the GPU was then loaded with FurMark V1.8.2 in multi-GPU mode, and the maximum value recorded was taken as the loaded power consumption.
 
Don't forget the higher voltage. That said, their conclusion doesn't make much sense, since the higher voltage was almost certainly needed for the higher clocks the card originally had (before it was downclocked for the review), not for the additional SM. The voltage bump alone could easily cause another 10-20% increase in power draw. Still, all of it together (voltage, the additional SM, PSU efficiency) could only explain about half of the 200W difference, and the CPU having to work harder should only make a tiny difference.

I was wondering if the 512 version was intended for a much lower clock rate.
 
That's quite a lot, but it seems a bit fishy to me, almost as if they'd just taken a 480SP part and found a way to unlock the remaining SPs. Otherwise the additional SPs, which amount to only a 6.7% increase in SP count, and thus to less than a 6.7% increase in active die area, should not lead to more than about a 7% increase in total power (at least not without a correspondingly significant increase in clock speed), unless those additional SPs had major process problems that dramatically increased their power draw (and thus should have been disabled in the first place).

Bear in mind, though, that a couple of things could exacerbate this. First, if their load scenario used a GPU-limited benchmark, higher performance on the GPU side would mean the rest of the system has to work harder, which may lead to higher power draw from other components. Second, this is total system power draw, which is inflated further by the inefficiency of the power supply. I don't think these things can fully explain the large discrepancy, though.

There could be many reasons, but most of the time the most obvious one (the results are simply not correct) is the right one. Apparently the reviewer wasn't puzzled enough by these strange results to dig a bit deeper.

With one of my test systems, while trying to work out why some results looked strange, I noticed that with the power supply I was using (a high-end CoolerMaster) the measured total system power draw was basically jumping between 135 and 155 watts.
 
There could be many reasons, but most of the time the most obvious one (the results are simply not correct) is the right one. Apparently the reviewer wasn't puzzled enough by these strange results to dig a bit deeper.
True, that's a good point. It would be fantastically difficult for a video card to actually draw that much power, and especially to do so without failing almost immediately. More likely something is wrong with the test.
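For reference, a reference-design GTX 480 has one 6-pin and one 8-pin PCIe connector, so the in-spec delivery limit works out as follows (assuming, on my part, that the 512SP board keeps the same layout):

# In-spec power delivery for a 6-pin + 8-pin board layout
pcie_slot_w = 75     # PCIe x16 slot limit
six_pin_w = 75       # 6-pin PCIe connector limit
eight_pin_w = 150    # 8-pin PCIe connector limit

spec_limit_w = pcie_slot_w + six_pin_w + eight_pin_w
implied_card_w = 500   # rough figure implied by the ~204W system delta above

print(f"In-spec limit: {spec_limit_w}W")                                    # 300W
print(f"Implied draw would exceed spec by ~{implied_card_w - spec_limit_w}W")  # ~200W over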
 
 