NVIDIA GF100 & Friends speculation

I'd like to see that same test repeated using a game; Metro 2033 or HAWX 2 would be fine. Just to see if it changes anything.
Anandtech tested with an alternative to Furmark (they don't specify the program, but claim it has similar power-draw characteristics), as well as with Crysis:
http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/17

Tomshardware has a graph comparing three of the cards on Metro 2033:
http://www.tomshardware.com/reviews/geforce-gtx-580-gf110-geforce-gtx-480,2781-15.html
 
Tomshardware has a graph comparing three of the cards on Metro 2033:
http://www.tomshardware.com/reviews/geforce-gtx-580-gf110-geforce-gtx-480,2781-15.html

Did they not add the 580's results on purpose, or did they remove them?

[Attached graphs: Power Draw, GPU Temp]
 
Did they not add the 580's results on purpose, or did they remove them?
They purposefully removed the results from their Furmark tests. There is another graph on that page that represents their Metro 2033 results. You can see that the 580 is very slightly but consistently lower in power consumption than the 480.
 
They purposefully removed the results from their Furmark tests. There is another graph on that page that represents their Metro 2033 results. You can see that the 580 is very slightly but consistently lower in power consumption than the 480.

It would be interesting to look at the regulation...

The way I see it, nVidia improved the board but not really the GPU; that would explain the 350-watt unlocked power draw (10% more frequency and one more SM leading to a little more than 10% more power).

Combine the lower temperature with better regulation (including thicker power traces), and it's quite possible GF110 is actually an unimproved GF100.

The RAM doesn't overclock that well, so there has probably been no improvement there; the same goes for power draw, and perhaps even the full-speed FP16 texture filtering ability.

All in all, all the praise at the GTX 580 launch would be worth nothing; in fact, it's just what nVidia should have released 8 months ago.
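
As a rough back-of-envelope for that "10% more frequency and one more SM" estimate, here's a quick Python sketch. The reference clocks (700 vs 772 MHz) and SM counts (15 vs 16) are the stock GTX 480/580 specs; the assumption that power scales linearly with clock and SM count is mine and only gives a crude upper bound.

```python
# Crude scaling sketch: GTX 480 (GF100, 15 SMs @ 700 MHz) vs an unthrottled
# GTX 580 (GF110, 16 SMs @ 772 MHz). Assumes dynamic power scales linearly
# with clock and with enabled SM count, which is a simplification.
gtx480 = {"sms": 15, "clock_mhz": 700}
gtx580 = {"sms": 16, "clock_mhz": 772}

clock_ratio = gtx580["clock_mhz"] / gtx480["clock_mhz"]  # ~1.10 (+10.3%)
sm_ratio = gtx580["sms"] / gtx480["sms"]                  # ~1.07 (+6.7%)

print(f"clock: +{(clock_ratio - 1) * 100:.1f}%, SMs: +{(sm_ratio - 1) * 100:.1f}%")
# If only the SM share of the chip grows while everything else scales with
# clock, the combined power increase lands between +10% and this upper bound:
print(f"upper bound if everything scaled: +{(clock_ratio * sm_ratio - 1) * 100:.1f}%")
```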
 
You're coming to all those conclusions because of Furmark? What about actual game power consumption? There is no 10% increase in power draw there.
 
You're coming to all those conclusions because of Furmark? What about actual game power consumption? There is no 10% increase in power draw there.
That could be entirely due to the tweaked regulation/power delivery, considering the peak climbs by some 40 watts (10+%).

Static leakage went down too, but that could be due to process improvement or even binning.

I really thought they re-engineered the GPU "just a little", but this GPU-Z bit seems to confirm it's not the case at all, considering there's nothing really new (full-speed FP16 isn't new and that's the only "functional" difference between GF100 and GF110).
 
That is an interesting link, Chanloth. Thanks. It seems Furmark was pretty useless before (on the 5870 as well); I guess the whole testing-with-games idea isn't too bad after all.
 
You're coming to all those conclusions because of Furmark? What about actual game power consumption? There is no 10% increase in power draw there.
Other sites measuring power draw didn't see a 10% increase there either. Some saw very slightly more, some very slightly less (for instance here: http://ht4u.net/reviews/2010/nvidia_geforce_gtx_580_test/index13.php - slightly more in Furmark, very slightly less in some games). IMHO this is all luck of the draw: depending on whether you get a more or less leaky part, power consumption of the GTX 580 and GTX 480 seems to be pretty much the same (except at idle). Or it could depend on the test environment - note the GTX 580's cooler is much, much better: not only is it less noisy, it also keeps the chip cooler, which helps power draw.
I think all of that is reflected in the TDP comparison between the GTX 480 and GTX 580 - granted, it's lower for the GTX 580, but the difference is really tiny, way below sample variance. (It's still not a really honest number, at least not compared to other cards which don't exceed their TDP, but nothing has changed there for the GTX 580.)
 
That could be entirely due to the tweaked regulation/power delivery, considering the peak climbs by some 40 watts (10+%).

Not following your logic. So the tweaked power regulation works great in regular apps but keels over in Furmark? That makes no sense. What would make sense is if their fiddling with transistor types reduced average power consumption under typical use but did nothing for the highly focused, high-utilization workload of Furmark. Another point: if it were so easy, they would have used better power regulation circuitry back at the 480's release. They had a looong time to figure that out.
 
What about performance/consumption when the 580 cooler is placed on the 480 and vice versa?
That would account for 20~40 watts... (assuming the GTX 580's cooler is 10-20°C better).

I don't remember who tested this on the GTX480, but I remember numbers quite similar to what Dave gave us for Cypress, which in itself is already quite an achievement.
 
What about performance/consumption when the 580 cooler is placed on the 480 and vice versa?

This. I actually blogged about it yesterday, and had the same idea. Here's the relevant excerpt:

Then again, in actual games (or 3DMark) the GTX 580 tends to draw a bit less power than the 480. Why? One possible reason is that games aren't quite that demanding, and under such circumstances, the GTX 580's cooler is able to cope very well with the card's heat output.
And as we know, power increases with temperature. We even know (thanks to Dave Baumann) that around its standard operating temperature, Cypress (HD 5800) draws about one additional watt per additional degree Celsius.
If we assume the additional heat-related power draw to be proportional to TDP, then GF100/110 draws about 1.6W more per additional °C around its standard operating temperature. Since the GTX 580 typically operates 15 to 20°C lower than the 480, we can expect it to draw approximately 24 to 32W less, all other things being equal. Of course, all other things are not equal: the 580 has more enabled units, higher clocks, and is based on a newer revision.

Nevertheless, the 580's lower temperatures are probably an important factor in games, but one that doesn't really help in Furmark, because with such a heavy load, its cooler has trouble keeping things… well, cool.
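
For reference, the arithmetic in that excerpt works out as follows (a minimal sketch using only the figures quoted above: ~1.6 W per extra °C for GF100/GF110 and a 15-20°C temperature gap):

```python
# Heat-related power estimate from the excerpt: ~1.6 W per additional degree C,
# applied to the 15-20 degree C gap between a GTX 480 and a GTX 580.
watts_per_degree_c = 1.6
for delta_t in (15, 20):
    print(f"{delta_t} C cooler -> roughly {watts_per_degree_c * delta_t:.0f} W less")
# -> roughly 24 W and 32 W, i.e. the 24-32 W range quoted above.
```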
 
http://www.techpowerup.com/134460/Disable-GeForce-GTX-580-Power-Throttling-using-GPU-Z.html

Wow, that's pretty heavy power consumption, even beating the GTX 480 at Furmark :oops:
Sorry, my English is not good.

I have read and followed that article, and here are the test results of my ASUS GTX 580.

ASUS GTX 580 @ 810/1013, vCORE=1.075v, FAN SET 85%, ROOM TEMP 30°C

Maximum Temp with Furmark - 70°C
[Screenshot attached]


Maximum Temp with Crysis Warhead (I played the Train and Airfield maps of Crysis Warhead for 30 minutes), FAN SET 75% - 81°C
[Screenshot attached]


Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 480 when running Furmark
[Screenshot attached]


Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 580 when running Furmark
[Screenshot attached]


Maximum Power Consumption of Core i7 965 @ 3.6GHz + ASUS GTX 580 when playing Crysis Warhead
[Screenshot attached]
 
Thanks bakalu for your trouble. :)

===============

Now, what's the word/speculation on the 570? Will it basically be a GTX 580 minus one processing cluster? What about the bus? Will it be 384-bit/1.5GB or 320-bit/1280MB? What about the clocks? Will they be close(r) to the GTX 580 or the 480?

Would Nvidia want to position the 570 above the 480 or below?
 
Now, what's the word/speculation on the 570? Will it basically be a GTX 580 minus one processing cluster? What about the bus? Will it be 384-bit/1.5GB or 320-bit/1280MB? What about the clocks? Will they be close(r) to the GTX 580 or the 480?
The memory rumors going around say 1.5 GB, which would mean a 384-bit bus.
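
The capacity follows directly from the bus width if you assume one 32-bit, 1 Gbit (128 MB) GDDR5 chip per channel, as on current GF100 boards; a quick sketch:

```python
# Memory capacity implied by bus width, assuming one 32-bit, 1 Gbit (128 MB)
# GDDR5 chip per 32-bit channel (the configuration used on GF100 boards).
def memory_config(bus_width_bits, chip_mb=128):
    chips = bus_width_bits // 32      # each GDDR5 chip has a 32-bit interface
    return chips, chips * chip_mb     # chip count, total capacity in MB

for bus in (384, 320):
    chips, mb = memory_config(bus)
    print(f"{bus}-bit bus -> {chips} chips -> {mb} MB")
# 384-bit -> 12 chips -> 1536 MB (1.5 GB); 320-bit -> 10 chips -> 1280 MB
```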
 
The memory rumors going around say 1.5 GB, which would mean a 384-bit bus.

Hmm... 384-bit/1.5GB at 480 clocks minus one cluster would bring it closer to 480 levels, but that's ok. Nothing some OC won't be able to fix! :devilish:

At the same clocks with the same bus, the 580 could still be 6-7% faster than the 570, but at $100 less, the 570 would be a nice offer. Not bad, not bad at all. Especially if you consider dual-GPU setups, where the difference would be even smaller due to CPU limits coming into play with such beasts anyway.
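
That 6-7% figure is essentially just the cluster-count ratio at equal clocks; a quick sketch (assuming the speculated 570 is a 580 with one of its 16 SMs disabled):

```python
# Theoretical throughput gap between a full GF110 (16 SMs) and the same chip
# with one SM disabled, at identical clocks and bus width.
full_sms, cut_sms = 16, 15
gap = full_sms / cut_sms - 1
print(f"shader/texture throughput advantage: +{gap * 100:.1f}%")
# -> about +6.7%; real-world gaps would typically be smaller, since memory
# bandwidth and CPU limits are unchanged.
```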
 
Don't forget 14 clusters @675MHz with full-width bus.

That would perform between the GTX 470 (14 clusters @ 607 MHz) and the GTX 480 (15 clusters @ 700 MHz), allow for decent thermals and, more importantly, good overall yields, all at the expense of two extra GDDR5 chips per board, with no need to redesign the board.

Alternatively, there could be no GTX 570 at all, since the GTX 470 is barely OK and GF110 could simply replace GF100; but I think it'll show up in a few weeks, because of the ridiculous margins on this specific reference design.
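
As a rough positioning check, here's a throughput proxy (cluster count × core clock) for that speculated configuration next to the GTX 470 and GTX 480 (a sketch that ignores bandwidth, ROPs, and clock-for-clock differences):

```python
# Cluster count x core clock as a crude throughput proxy, normalised to GTX 470.
cards = {
    "GTX 470": (14, 607),
    "speculated GTX 570": (14, 675),
    "GTX 480": (15, 700),
}
baseline = cards["GTX 470"][0] * cards["GTX 470"][1]
for name, (clusters, mhz) in cards.items():
    print(f"{name}: {clusters * mhz / baseline:.2f}x GTX 470")
# -> 1.00x, ~1.11x, ~1.24x: squarely between GTX 470 and GTX 480, as argued above.
```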
 