This guy did some power figures recently:
http://ht4u.net/reviews/2009/leistungsaufnahme_graka/index14.php
It's in German; some of his results seem, er, interesting. Anyway, it's a convenient list of TDPs if nothing else.
Meanwhile we have translated the article into English, too. I think this could be interesting for you guys here:
Power Consumption of current graphics cards (English Version)
To see how we tested, go here:
http://ht4u.net/reviews/2009/power_consumption_graphics/index5.php
I don't know exactly what they did, but they screwed something up somewhere: if the GTX 295 and HD 4870 X2, for example, really pulled the figures they're claiming, they would never have passed PCI Express validation, nor could they be sold as PCI Express cards, not to mention they would go way over the power you're allowed to put through the 6-pin and 8-pin plugs.
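For reference, the PCI Express spec limits are 75 W from the x16 slot, 75 W per 6-pin plug, and 150 W per 8-pin plug. A minimal sketch of the budget argument above (the connector configurations are the cards' actual ones; the helper function is just for illustration):

```python
# PCI Express power limits per source (all values in watts).
CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_spec_power(aux_connectors):
    """Maximum board power the spec allows for a card with the
    given auxiliary connectors (the x16 slot always contributes)."""
    return CONNECTOR_LIMITS_W["slot"] + sum(
        CONNECTOR_LIMITS_W[c] for c in aux_connectors
    )

# Both the GTX 295 and the HD 4870 X2 use one 6-pin plus one 8-pin plug,
# so the spec ceiling for either card is:
print(max_spec_power(["6-pin", "8-pin"]))  # 300
```

Any sustained measurement above that 300 W ceiling would mean the card is out of spec, which is the crux of the objection.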
XBit Labs is the way to go for card-only power draw.
AnarchX and Carsten already explained that FurMark is the reason our results are that much higher than those from Xbit Labs.
Here is a brief comparison of some of the tools we tested. For determining maximum power, 3DMark06 isn't the best choice in my humble opinion, because there are enough games out there that cause a higher power consumption. Nevertheless, FurMark has to be treated as a worst-case scenario.
AMD tells me their TDP describes only the TDP of the graphics chip, not the complete graphics card.
Perhaps that is an excuse; I don't know.
Hehe, I think the answer to that question differs depending on who at AMD you ask. We haven't gotten really reliable feedback on this topic yet, and I assume we never will. We recently found out that the FireGL/FireStream products are rated with a higher TDP; have a look at the FireStream 9270. That's an HD 4870 with 2 GB of VRAM, and it's rated at 160 W typical and 220 W peak. We measured about ~190 W for the HD 4870 1GB. Everybody can draw his own conclusion from that.
http://ati.amd.com/technology/streamcomputing/product_firestream_9270.html
I have to say there really should be a bigger stink about FurMark, as it gives every indication that HD48xx cards are not exactly fit for purpose if there's a graphics load that would melt them.
Just saying "no real graphics app currently presents such a workload" is no excuse. How do we know that some games aren't being throttled for the same reason? And why shouldn't a game come along and make the card melt?
Jawed
I see it the same way. Who says that a developer theoretically can't just use the same algorithm in a game? What about the Fur-Ring of Middle-earth in close-up view?
I guess people aren't familiar with the term "Power Virus".
Hmm, that's your opinion, but just have a look at your signature. What about all those GPGPU things? Power viruses, too?
Greetings,
Leander - HT4U