The LAST R600 Rumours & Speculation Thread

Any of those power consumption numbers you guys are mentioning simply cannot be for the GPU alone; those sound more like total system consumption numbers. Or to be more precise, I cannot believe that a GPU alone would consume 200 or even 300W.

http://www.techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16

Wild speculation on my behalf, but if a high-end system with an R600 installed should consume 200W at idle and 300W under load, that might not be low per se compared to the direct competitor, yet it wouldn't be the end of the world either.
 

We know from both Nvidia's specs and that Tech Report article that the 8800 GTX draws ~170W under load by itself. A 7600 GT draws 65-70W under load, so the numbers come out correctly. It's not absurd to think R600 may draw 50W or more on top of that if the core voltage is higher, even with GDDR4 installed, as apparently it will have more memory (1GB vs 768MB). Not saying it will end up using that much power by itself, but it's possible and perhaps even likely. I was just trying to figure out what the CES model drew from the PCIe slot as well as the 6-pin and 8-pin connectors. Apparently the answer is a maximum of 300W. I've got to believe that's coming down, but I doubt substantially. Odds are, though, it will fit into one of the power envelopes developed for PCIe 2.0 (i.e. the 8-pin connector), which I thought were 225W (PCIe slot + 8-pin) and 300W (PCIe slot, 8-pin, and 6-pin as seen on the CES mock-up), as Razor mentioned per this slide:

[Slide: PCI Express power-envelope breakdown (300wgn4.jpg)]


It seems most rumors I've read (Anandtech, The Inq, Hexus) point at the 225W spec (215W-225W, the top end of the spec), which is different from what the mock-up shows (and may be what is being shown at CES), hence my reasoning. If you were AMD/ATi, would you want to show off a graphics card to the world that needs 300W of connectors if your plan was to reduce it to 225W?
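For a quick sanity check, the connector arithmetic behind those two envelopes can be sketched in a few lines. The per-connector figures below are the commonly cited limits, not quotes from the spec text itself:

```python
# Back-of-the-envelope check of the PCIe power envelopes discussed above.
# Assumed per-connector limits (commonly cited figures, not official spec text):
PCIE_SLOT = 75  # W delivered through the PCIe x16 slot
PIN6 = 75       # W from a 6-pin PEG connector
PIN8 = 150      # W from the 8-pin PCIe 2.0 connector

envelope_225 = PCIE_SLOT + PIN8         # slot + 8-pin
envelope_300 = PCIE_SLOT + PIN8 + PIN6  # slot + 8-pin + 6-pin (the CES mock-up)

print(envelope_225, envelope_300)  # 225 300
```

The 225W and 300W envelopes in the slide are simply these two sums.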
 
Wouldn't the fact that, in the past, certain parts of the GPU were idle (so less power was used), whereas unification aims to keep as much of the GPU working as possible, naturally lead to more power usage? Would that not have any bearing on it?
 
Presumably, AMD is anticipating the PCI Express 2 standard, which requires the 8-pin connector on the board. The power supply companies will start to roll out 8-pin PEG2-equipped PSUs this spring (CeBIT?), so it would be nice if R600 could use either 2x PEG1 connectors or a PEG2 connector.

I'm going to guess that a PEG1 plug on the power supply is capable of plugging into a PEG2 socket on the GPU, with an adaptor. So both sockets combined deliver the required 150W.

Alternatively if your power supply has a PEG2 plug, then it supplies 150W all alone, when plugged into the PEG2 socket. Therefore there's no need to use both the PEG1 and PEG2 sockets.

Jawed
 
Wouldn't the fact that, in the past, certain parts of the GPU were idle (so less power was used), whereas unification aims to keep as much of the GPU working as possible, naturally lead to more power usage? Would that not have any bearing on it?
Yes.

In return, we expect utterly astounding performance :D

Jawed
 
The converse would be true as well, though, right? Think it would be possible for the driver to detect high temperatures on the card and dynamically switch off some of the pipelines to lower the temp?
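As a thought experiment, that driver-side scheme might look something like the minimal sketch below. The function name, cluster counts, and temperature thresholds are all invented for illustration; a real driver would work at a much lower level:

```python
# Hypothetical sketch of driver-side thermal throttling: shed a shader
# cluster when the core runs hot, restore one once it cools back down.
# All names and thresholds here are invented for illustration.
def throttle_step(temp_c, active_clusters, total_clusters=4, hot=100, cool=85):
    """One polling step: return the new number of active shader clusters."""
    if temp_c >= hot and active_clusters > 1:
        return active_clusters - 1   # too hot: disable one cluster
    if temp_c <= cool and active_clusters < total_clusters:
        return active_clusters + 1   # cooled down: re-enable one cluster
    return active_clusters           # in the hysteresis band: hold steady

clusters = 4
for temp in (95, 102, 104, 90, 80, 78):
    clusters = throttle_step(temp, clusters)
    print(temp, clusters)  # clusters step down to 2, then recover to 4
```

The gap between the `hot` and `cool` thresholds gives hysteresis, so the card doesn't oscillate between states every polling interval.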
 
We know from both Nvidia's specs and that Tech Report article that the 8800 GTX draws ~170W under load by itself.

Then I urge you to read carefully the headline of the Techreport chart:

[Chart: Tech Report system power consumption at idle (power-idle.gif)]


...and a little bit of article text which might prove a tad more helpful:

Now for the moment of truth. We measured total system power consumption at the wall socket using an Extech power analyzer model 380803. The monitor was plugged into a separate outlet, so its power draw was not part of our measurement. Remember, out of necessity, we're using different motherboards for the CrossFire systems. Otherwise, the system components other than the video cards were kept the same.

http://www.techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16


A 7600 GT draws 65-70W under load, so the numbers come out correctly. It's not absurd to think R600 may draw 50W or more on top of that if the core voltage is higher, even with GDDR4 installed, as apparently it will have more memory (1GB vs 768MB). Not saying it will end up using that much power by itself, but it's possible and perhaps even likely. I was just trying to figure out what the CES model drew from the PCIe slot as well as the 6-pin and 8-pin connectors. Apparently the answer is a maximum of 300W. I've got to believe that's coming down, but I doubt substantially. Odds are, though, it will fit into one of the power envelopes developed for PCIe 2.0 (i.e. the 8-pin connector), which I thought were 225W (PCIe slot + 8-pin) and 300W (PCIe slot, 8-pin, and 6-pin as seen on the CES mock-up), as Razor mentioned per this slide:

[Slide: PCI Express power-envelope breakdown (300wgn4.jpg)]


It seems most rumors I've read (Anandtech, The Inq, Hexus) point at the 225W spec (215W-225W, the top end of the spec), which is different from what the mock-up shows (and may be what is being shown at CES), hence my reasoning. If you were AMD/ATi, would you want to show off a graphics card to the world that needs 300W of connectors if your plan was to reduce it to 225W?

See above.
 
Then I urge you to read carefully the headline of the Techreport chart:

[Chart: Tech Report system power consumption at idle (power-idle.gif)]


...and a little bit of article text which might prove a tad more helpful:



http://www.techreport.com/reviews/2006q4/geforce-8800/index.x?pg=16

You are right, but that is the idle power chart; everyone should look at the "load" chart instead, with total power consumption under load.
Anyway, the 8800 seems to have (more or less) the same power requirements as the X1950 XTX. That should mean something in the 100-120W range. IMHO R600 could come in around the 150W range, but 200+ watts is very unlikely: the card would require a liquid cooling system that way.
And if the 700M transistor and 80nm figures are right, there is nothing hinting at 100% more power dissipation than the competition, given roughly the same die size, even with frequency and voltage being different. (Frequency is rumored to be higher, but IMHO it's difficult to get it into the GHz range, and voltage on a smaller process is likely on par with or lower than at 90nm.)
 
Comparisons with 8800GTX power consumption are odious because NVidia has only activated 1/2 the core as yet :p The other half gets turned on when R600 is released :devilish: ...

Jawed
 
You are right, but that is the idle power chart; everyone should look at the "load" chart instead, with total power consumption under load.

187W idle vs. 287W load for the entire system; I'm still wondering how any of you manage to calculate the pure GPU power consumption out of that.

Anyway, the 8800 seems to have (more or less) the same power requirements as the X1950 XTX. That should mean something in the 100-120W range. IMHO R600 could come in around the 150W range, but 200+ watts is very unlikely: the card would require a liquid cooling system that way.

Exactly.

And if the 700M transistor and 80nm figures are right, there is nothing hinting at 100% more power dissipation than the competition, given roughly the same die size, even with frequency and voltage being different. (Frequency is rumored to be higher, but IMHO it's difficult to get it into the GHz range, and voltage on a smaller process is likely on par with or lower than at 90nm.)

As I said, if the hypothetical 200 or 300W figures are for system power consumption with an R600 installed, it isn't really a tragedy IMO.
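A toy model shows why the wall-socket numbers can't isolate the GPU: the meter sees every component lumped together behind the PSU's efficiency, and under a 3D load the CPU rises along with the graphics card. All the DC budgets below are invented, chosen only so the totals land near the 187W/287W readings quoted above:

```python
# Toy model: AC watts at the wall = sum of DC component draws / PSU efficiency.
# A fixed 80% efficiency is assumed here; real PSUs vary with load.
def wall_draw(component_watts, psu_efficiency=0.80):
    """AC watts measured at the wall for a list of DC component draws."""
    return sum(component_watts) / psu_efficiency

#                 CPU  GPU  drives/fans  board+RAM   (all invented guesses)
idle = wall_draw([35,  25,  20,          70])   # ~187W at the wall
load = wall_draw([60,  80,  20,          70])   # ~287W at the wall

# The 100W wall delta is NOT the GPU's draw: in this model the GPU rose
# only 55W DC, the CPU rose 25W, and the PSU loss scales with both.
print(idle, load, load - idle)  # 187.5 287.5 100.0
```

Without swapping cards on an otherwise identical system (or measuring the slot and PEG rails directly), the GPU's share simply can't be separated out.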
 
Comparisons with 8800GTX power consumption are odious because NVidia has only activated 1/2 the core as yet :p The other half gets turned on when R600 is released :devilish: ...

Jawed

Obvious jokes aside, compilers don't consume more power last time I checked.
 
That's NV's claimed power consumption, not measured by Xbit Labs. Ironically, NV also claims 143W for the GX2, yet according to Xbit Labs' measurements it's (supposedly) only 110W. What is what exactly?

Even so, 145.5W is quite a difference from the 200, 225, or even 300W figures I'm reading here. If the GPU alone will consume 200W, then a high-end system won't get away with less than ~350W under load (total system power consumption), with Crossfire ending up at an estimated whopping ~450W.
 
187W idle vs. 287W load for the entire system; I'm still wondering how any of you manage to calculate the pure GPU power consumption out of that.

Simply, I don't :smile:
I only said that we must compare the "load" figures for the whole system, not the "idle" ones.
 
300 watts for just a GPU is insane... it would have to be three times faster than G80 for me to be able to swallow those power requirements.
 