Power consumption of the latest Gfx cards

iwod

Newcomer
I posted this elsewhere and no one seemed concerned about it: these latest powerful Gfx cards draw much more power even when they are idle. What happens if I am a casual gamer who only plays games on weekends but owns a GeForce 8800 GTX? So 90% of the time I'm using Windows, I would be paying extra on the electricity bill just to own a powerful Gfx card? It is not so much the money I am concerned about, but the environment too.

I know first-gen NVIDIA products tend to be power hungry and less optimised at the hardware level. I hope with G81 they will take this into account.
 
I posted this elsewhere and no one seemed concerned about it: these latest powerful Gfx cards draw much more power even when they are idle.

It's almost unavoidable: high-end hardware uses high-speed processes with high static leakage. So even when the transistors are not switching, they consume a hefty amount of power that's directly correlated with the area of the die.
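As a rough back-of-the-envelope sketch of that correlation (the leakage density and die areas below are illustrative assumptions, not measured figures for any real GPU):

```python
# Static leakage scales roughly with die area.
# All numbers here are hypothetical, for illustration only.
def static_power_watts(die_area_mm2, leakage_w_per_mm2):
    """Static power ~ die area x per-area leakage density."""
    return die_area_mm2 * leakage_w_per_mm2

# Assume a hypothetical 0.06 W/mm^2 leakage density at idle voltage.
small_die = static_power_watts(200, 0.06)  # mid-range die
large_die = static_power_watts(480, 0.06)  # large high-end die

print(f"~{small_die:.0f} W vs ~{large_die:.0f} W of leakage at idle")
```

The point is simply that a die more than twice the area leaks more than twice the power at the same process and voltage, whether or not it is doing any work.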

It's possible to put parts of the die on different power islands and shut them off completely, but that technique is usually reserved for extreme low-power handheld devices. It also makes the power supply more costly to design.

Don't expect this to improve for future GPUs: they will migrate to even smaller processes with (initially) less area, but the leakage will be even higher.

The best solution would be a computer with an integrated GPU on the motherboard for everyday use and an external GPU for heavy-duty gaming. In that case, you could shut off the power to the external GPU completely. Probably not so easy to do on a desktop (how would you switch cables from one mode to the other?), but maybe this is already done for laptops? If not, I'm pretty sure it won't be long before somebody puts it in a high-end laptop.

It is not so much the money I am concerned about, but the environment too.

Cancel one long car camping trip per year and you'll probably pollute far less than the combined pollution of one year of a high-end GPU. You could spend that long weekend behind your computer, for example. ;)
 
Laptops with two GPUs already exist (with the integrated crap like Intel or S3 Savage, plus a good NVIDIA or ATI GPU for gaming).
 
I think a breakdown of the power consumption of a graphics board would be interesting.

The idle power draw of a card is pretty high, but how much of that is due to the GPU?
The RAM on board would consume a good amount of power. Lower-clocked main-memory DIMMs consume something like 18 watts per DIMM. This might be lower for DDR2, but none are clocked as high as the memory on a graphics board.

How much is lost by the power regulation circuitry on the board?

How much for the IO, DAC, and other components?

How much can a GPU reduce its draw at idle? Can the hardware reliably determine when it can idle a few quads?
Can the memory controller find a way to stop refreshing unused DRAM banks?
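To make the questions above concrete, here is a hypothetical idle-power breakdown in the spirit the post asks for. Every figure is an illustrative guess, not a measurement of any actual board:

```python
# Hypothetical idle power breakdown for a high-end graphics board.
# All wattages are illustrative assumptions, not measured values.
idle_draw_watts = {
    "GPU core (leakage + idle clocks)": 30.0,
    "GDDR memory chips":                15.0,
    "Voltage regulator losses":          8.0,
    "I/O, DAC, misc components":         5.0,
}

total = sum(idle_draw_watts.values())
for part, watts in idle_draw_watts.items():
    print(f"{part:34s} {watts:5.1f} W  ({watts / total:5.1%})")
print(f"{'Total idle draw':34s} {total:5.1f} W")
```

Even with made-up numbers, a table like this shows why regulator efficiency and memory refresh matter: the GPU core is only part of the idle total.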
 
If the I/O chip on the 8800 is complex enough, you could in theory shut down the GPU and memory and stream video out of the plug-in card's output. No need to swap cables.
 
Dunno, but I thought the SLI connection(s) went through the I/O chip...

ed: all I/O *except* PCIe, apparently: http://www.beyond3d.com/reviews/nvidia/g80-arch/index.php?p=03
NVIO marshalls and is responsible for all data that enters the GPU over an interface that isn't PCI Express, and everything that outputs the chip that isn't going back to the host. In short, it's not only responsible for SLI, but the dual-link DVI outputs (HDCP-protected), all analogue output (component HDTV, VGA, etc) and input from external video sources.
 
Yes, this topic does come up often, and here are my thoughts.

A large percentage of people buying the high-end cards are kids whose mommy and daddy likely bought the card and also own the house and pay the electric bill, so it doesn't really matter to them, does it?

The other large percentage are the older crowd of enthusiasts with disposable income who probably couldn't care less that their $300/mo power bill just went to $305/mo. I mean really, how much extra power are we actually talking about here?

No matter what is in your system, you can't draw any more current than your power supply is capable of pulling from the outlet. I'm thinking it's extremely small; you probably spend more on trivial, meaningless things that get tossed out.
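The "how much are we actually talking" question is easy to bound with simple arithmetic. Assuming, purely for illustration, 50 W of extra idle draw, 8 hours of desktop use per day, and $0.12 per kWh:

```python
# Monthly cost of extra idle draw (all inputs are illustrative
# assumptions, not measurements of any particular card or tariff).
extra_watts = 50       # extra idle draw vs. a modest card
hours_per_day = 8      # typical desktop use
price_per_kwh = 0.12   # assumed electricity price in $/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * price_per_kwh
print(f"{kwh_per_month:.0f} kWh/month -> ${cost_per_month:.2f}/month")
```

Under these assumptions it works out to roughly a dollar or two a month, which is the scale the post is gesturing at; the environmental argument is a separate question from the bill.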
 
I wouldn't mind lower idle power consumption, irrespective of cost.

That would allow the cooler to spin down even further in idle mode, so the card would be even quieter when I'm not in a game full of explosions.
The reduced need for air-flow would also cut down on how fast dust builds up in the cooler, since less gets pulled in from the outside.
 