AMD RV770 refresh -> RV790

Single-slot HD4870, so at least 20-30W less power? Not sure which ATI single-slot card had the highest power consumption, or what that was, though. 130W for single slot, is that do-able?
This guy did some power figures recently:
http://ht4u.net/reviews/2009/leistungsaufnahme_graka/index14.php
It's in German, and some of his results seem, er, interesting. Anyway, it's a convenient list of TDPs if nothing else.

Problem is, if they get much more than the 4850 does now, it's likely to need a dual-slot cooler. The market seems to want more performance for the same price, or the same performance for less, so a dual-slot cooler and/or an additional power plug (and associated circuitry) would likely be off limits for that SKU.

"HD4750" at 600MHz launched soon, with 750MHz "HD4770" 3 months later would seem like a reasonable plan.

Yeah, maybe; it depends on how much of a gap the RV790 creates, and also on the remaining stock of RV770. 40nm is a bit annoying in that you get quite reasonable power savings at really low clocks, but you can't just add additional units to compensate for the performance, as variability quickly comes into the picture on bigger dies.
 
Problem is, if they get much more than the 4850 does now, it's likely to need a dual-slot cooler. The market seems to want more performance for the same price, or the same performance for less, so a dual-slot cooler and/or an additional power plug (and associated circuitry) would likely be off limits for that SKU.
9800GTX+ says hi. ;)
 
This guy did some power figures recently:
http://ht4u.net/reviews/2009/leistungsaufnahme_graka/index14.php
It's in German, and some of his results seem, er, interesting. Anyway, it's a convenient list of TDPs if nothing else.

Problem is, if they get much more than the 4850 does now, it's likely to need a dual-slot cooler. The market seems to want more performance for the same price, or the same performance for less, so a dual-slot cooler and/or an additional power plug (and associated circuitry) would likely be off limits for that SKU.
I don't know what exactly they did, but they screwed up something, since if the GTX 295 & HD 4870 X2, for example, really pulled the figures they're claiming, they would never have been validated for PCI Express, nor could they be sold as PCI Express cards. Not to mention they would go way over the specified power you're allowed to put through the 6-pin & 8-pin plugs.
XBit Labs is the way to go for card-only power draw.
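
For reference, a back-of-envelope sketch of the ceilings being argued about (the 75/75/150 W figures are the commonly quoted PCIe CEM allowances for the x16 slot, a 6-pin and an 8-pin plug; the function is just for illustration):

    # Commonly quoted PCIe CEM power allowances (W).
    SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

    def power_budget(six_pins, eight_pins):
        """Maximum board power a card may draw while staying within spec."""
        return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

    # GTX 295 / HD 4870 X2 style layout: one 6-pin + one 8-pin.
    print(power_budget(1, 1))   # 300 W ceiling
    # Single 6-pin layout (e.g. HD 4850):
    print(power_budget(1, 0))   # 150 W ceiling

So the complaint above is essentially that a sustained reading above the 300 W ceiling would be out of spec for those dual-GPU cards.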
 
They used FurMark for their load values. ATi doesn't throttle this app by default through a driver profile just for fun. ;)

Of course the load is much higher than in any other app around:
http://ht4u.net/reviews/2009/leistungsaufnahme_graka/index7.php
... and it also holds this high utilization the whole time -> higher temperature -> higher consumption.

I would think the PCIe specs or the IHVs' TDP values are based more on typical utilization.
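
To make that chain of reasoning concrete: sustained activity drives the switching component of power up, the extra heat drives leakage up, and the two add. A minimal illustrative sketch, with invented coefficients (none of these numbers are measurements):

    # Illustrative only - all coefficients below are assumptions, not measurements.
    def board_power(activity, temp_c,
                    dyn_at_full_load=120.0,  # assumed W of switching power at 100% activity
                    leak_at_25c=20.0,        # assumed W of leakage at 25 degC
                    leak_per_10c=1.3):       # assumed leakage growth per +10 degC
        dynamic = activity * dyn_at_full_load   # P_dyn ~ alpha * C * V^2 * f
        leakage = leak_at_25c * leak_per_10c ** ((temp_c - 25) / 10.0)
        return dynamic + leakage

    print(round(board_power(0.7, 70)))   # "game/3DMark-like": ~149 W with these assumptions
    print(round(board_power(1.0, 90)))   # "FurMark-like": ~230 W once the card has heated up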
 
XBit Labs is the way to go for card-only power draw.
As much as I value Xbit Labs' measurements, they're only useful for evaluating typical power draw (3DMark). We've* been pointing this out for a long time now, but were lacking the tools to actually measure it. HT4U has now done it properly - if you take the time to read the article, they also show the difference between FurMark and 3DMark in power draw.


*We = PC Games Hardware, not HT4U. Just to avoid confusion.

I don't know what exactly they did, but they screwed up something, […]
I'd rather say somebody else screwed something up. And that somebody else is sitting overseas (from a European POV).
 
I have to say there really should be a bigger stink about FurMark, as it gives every indication that HD48xx cards are not exactly fit for purpose if there's a graphics load that would melt them.

Just saying "no real graphics app currently presents such a workload" is no excuse. How do we know that some games aren't being throttled for the same reason? And why shouldn't a game come along and make the card melt?

Jawed
 
Workloads that have utilizations sufficient to exceed TDP or nominal power draw exist for CPUs.

CPUs are typically very effective at throttling when they exceed TDP, but I'm not sure about the load on the power supply.

Intel's Montecito had measures to keep track of actual power draw and amps, but that was switched off, unfortunately. Nehalem has taken x86 to a level of complexity that nearly matches that.

That there are workloads on which logic-heavy GPUs might spike above their limits doesn't surprise me. That they seem to consistently miss the fact that they are over the limit might be a problem, though I don't know the methodology, or German.

It would be another data point that indicates that utilization may not be the best metric for evaluating future architectures.
It's been a long time since any performance chip has been able to sustain good utilization without slamming into a power barrier.

The question becomes what actual work gets done for a given amount of utilization.
Idling hardware on a chip that is chronically TDP or wattage limited is not a sin when transistor budgets are so bloated.
 
I have to say there really should be a bigger stink about FurMark, as it gives every indication that HD48xx cards are not exactly fit for purpose if there's a graphics load that would melt them.

Weren't there reports of Nvidia's stuff getting quite hot as well? I know I was scared to let the new version run unattended. It started racing towards 100°C within seconds of starting on my GTX 285.
 
This guy did some power figures recently:
http://ht4u.net/reviews/2009/leistungsaufnahme_graka/index14.php
It's in German, and some of his results seem, er, interesting. Anyway, it's a convenient list of TDPs if nothing else.

Meanwhile we have translated the article into English, too. I think this could be interesting for you guys here:

Power Consumption of current graphics cards (English Version)

To see how we tested, go here:
http://ht4u.net/reviews/2009/power_consumption_graphics/index5.php

I don't know what exactly they did, but they screwed up something, since if the GTX 295 & HD 4870 X2, for example, really pulled the figures they're claiming, they would never have been validated for PCI Express, nor could they be sold as PCI Express cards. Not to mention they would go way over the specified power you're allowed to put through the 6-pin & 8-pin plugs.
XBit Labs is the way to go for card-only power draw.

AnarchX and Carsten have already explained that FurMark is the reason our results are that much higher than those from Xbit Labs. Here is a brief comparison of some of the tools we tested. For determining maximum power, 3DMark06 isn't the best choice in my humble opinion, because there are enough games out there that cause higher power consumption. Nevertheless, FurMark has to be treated as a worst-case scenario.

AMD tells me their TDP describes only the TDP of the graphics chip, not the complete graphics card.
Perhaps it is an excuse. I don't know. ;)

Hehe, I think the answer to that question differs depending on whom you ask at AMD ;). We didn't get really reliable feedback on this topic yet, and I assume we never will ;). We recently found out that the FireStream products are rated with a higher TDP; have a look at the FireStream 9270. That's an HD 4870 with 2 GB of VRAM, and it's rated at 160 watts typical and 220 watts peak. We measured about ~190 watts for the HD 4870 1GB. Everybody can draw their own conclusion from that ;).
http://ati.amd.com/technology/streamcomputing/product_firestream_9270.html
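
Just to illustrate the chip-vs-card distinction being made here, a rough sketch of how a chip-only figure turns into a larger board figure (every number below is an assumption for illustration, not a measurement):

    # All values are illustrative assumptions, not measured or official figures.
    gpu_tdp        = 160.0   # W, if the quoted TDP really covers only the GPU
    memory_power   = 25.0    # W, assumed for the GDDR5 and the rest of the board
    vrm_efficiency = 0.85    # assumed voltage-regulator efficiency

    card_input_power = (gpu_tdp + memory_power) / vrm_efficiency
    print(round(card_input_power))   # ~218 W at the card's connectors

So a ~160 W chip rating and a ~190-220 W card-level measurement aren't necessarily in contradiction.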

I have to say there really should be a bigger stink about FurMark, as it gives every indication that HD48xx cards are not exactly fit for purpose if there's a graphics load that would melt them.

Just saying "no real graphics app currently presents such a workload" is no excuse. How do we know that some games aren't being throttled for the same reason? And why shouldn't a game come along and make the card melt?

Jawed

I see it the same way. Who says a developer theoretically can't use the same algorithm in a game? What about the Fur-Ring of Middle-earth in a close-up view? ;)

I guess people aren't familiar with the term "Power Virus".
Hmm. That's your opinion, but just have a look at your signature. Are all those GPGPU things power viruses, too?

Greetings,
Leander - HT4U
 
I think there are other examples of non-power-virus applications occasionally forcing processors to exceed TDP.

The Pentium 4 was a freakish example where it happened all too often, but no processor since is really immune, particularly as the top speed grades approach the limits of a particular TDP bracket.

I know more recent CPUs have at least some capability to fall back to safe clocks if they read too high a temp.
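
A rough sketch of the kind of fallback loop being described, purely illustrative (read_temp and set_clock are hypothetical callbacks, and the limit and clock steps are made up, not how any particular CPU implements this):

    import time

    T_LIMIT_C   = 95.0                       # assumed junction temperature limit
    SAFE_CLOCKS = [3000, 2400, 1800, 1200]   # hypothetical MHz steps

    def throttle_loop(read_temp, set_clock):
        step = 0
        while True:
            t = read_temp()
            if t > T_LIMIT_C and step < len(SAFE_CLOCKS) - 1:
                step += 1                     # too hot: drop to the next safe clock
            elif t < T_LIMIT_C - 10 and step > 0:
                step -= 1                     # cooled off: climb back up
            set_clock(SAFE_CLOCKS[step])
            time.sleep(0.01)                  # poll the sensor again shortly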
 
I see it the same way. Who says a developer theoretically can't use the same algorithm in a game? What about the Fur-Ring of Middle-earth in a close-up view?
It's unlikely to remain constant, with no other activity, for very long.
 