The LAST R600 Rumours & Speculation Thread

You should also remember that power supplies do not have 100% efficiency. The figures in the Techreport charts represent the power taken from the wall outlet, and the system uses about 85% of that power; the rest turns into heat in the power supply. Moreover, power supplies are rated according to their output power.
 
It's still an accurate reflection of the amount of heat you have to deal with when cooling your PC, which a minority of us actually care about.
 
That's true. On the other hand, it's a very inaccurate reflection of what wattage you need from your power supply. It would be useful to have both numbers available so as not to mislead readers. If testing the power supply's efficiency at each power level proves too much, even a typical figure would be good to have.

As for R600, it'll probably be within the 225W spec. G80 already requires this spec to cover the worst-case scenario, i.e. best utilization. I wouldn't expect typical consumption of R600 to far exceed that of G80, though (<150W).
 
At these power draws (~150-300W DC), ~80% is a good rule of thumb for a decent-quality power supply, and even 5% either way isn't going to make a large difference. SPCR measure efficiency as a function of load in their PSU tests.
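
To make the conversion concrete, here's a minimal sketch assuming the flat ~80% figure above (the function names and sample wattages are purely illustrative, not from any review):

Code:
# Toy conversion between a wall-socket ("AC") reading and the DC load
# the PSU actually delivers, assuming a flat ~80% efficiency as above.

def dc_load(wall_watts, efficiency=0.80):
    """DC power delivered to the system for a given wall reading."""
    return wall_watts * efficiency

def psu_heat(wall_watts, efficiency=0.80):
    """Power dissipated as heat inside the PSU itself."""
    return wall_watts * (1.0 - efficiency)

for wall in (150, 225, 300):
    print(f"{wall}W at the wall -> ~{dc_load(wall):.0f}W DC, "
          f"~{psu_heat(wall):.0f}W lost in the PSU")

This also matches the earlier point that PSUs are rated by output: a 450W unit is 450W on the DC side, not at the wall.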
 
I really can't believe all the fuss.

The 8800GTX has two power connectors, yet at full load it draws only a little more power than the X1900XTX, which has just one.

The 8800GTX would be marginal if powered from one connector (if NVidia's specified power draw of ~145W is to be believed), but the two connectors it has provide far more than it'll ever draw.

The heatsink on the VR-Zone-provided render of R600's cooler actually looks to be roughly the same volume as on the 8800GTX, just comparing the area of the fins.

Jawed
 
I remember reading on some site that, according to nVidia, the peak max load of the GF8800 GTX is ~180W. Under what circumstances it would draw that much, I don't know, but maybe we'll find out some day.
 
The XBit Labs test relates that NVidia's spec is ~145W, as compared with the 150W that a single PEG connector + mobo can supply, which is why I describe it as marginal.

Jawed
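
For reference, the budget arithmetic behind the "marginal" call, using the commonly cited PCIe limits of 75W from the slot and 75W per 6-pin PEG connector (the snippet below just works through the thread's own numbers):

Code:
# Rough PCIe power-budget arithmetic: 75W from the x16 slot plus 75W
# per 6-pin PEG connector (commonly cited limits).

SLOT_W = 75
SIX_PIN_W = 75

def budget(six_pin_connectors):
    return SLOT_W + six_pin_connectors * SIX_PIN_W

print(budget(1))  # 150W -- only ~5W of headroom against a ~145W spec
print(budget(2))  # 225W -- the spec G80 ships against, per the thread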
 
In this case I would opt for two PEG sockets rather than one, if the spec is ~145W. The reason is the safety factor that needs to be applied to cover all types of PSU, from any source and in any situation!! If I were the card's design engineer, I would not want to suggest running this extreme maximum-spec case from a single PEG outlet.
 
The peak for the 8800 GTX is higher than one connector + slot can provide; it's as simple as that (i.e. 145W is wrong). As for R600, you can apply the same logic, just that it's likely further along the curve (in terms of needing more than slot + 6-pin).
 
Why are the fins so small? Isn't that supposed to be very ineffective?

Assuming you are talking about the cooling-system fins in the VR-Zone "image" of R600: with smaller (thinner) fins you can fit more of them in the same space, so the total cooling surface increases. It is not sheer mass that provides better cooling (aside from effects on thermal transient impedance), but the total surface of the cooler; thermal resistance is inversely proportional to surface area.
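
A back-of-envelope sketch of that inverse relationship (all dimensions below are invented for illustration; real fin design also has to trade off airflow restriction):

Code:
# For a fixed cooler footprint, thinner fins mean more fins and more
# total surface area; in a crude model, thermal resistance ~ 1/area.

def fin_area_mm2(n_fins, height_mm=30.0, depth_mm=80.0):
    """Total surface area of n flat fins (both faces, edges ignored)."""
    return n_fins * 2 * height_mm * depth_mm

def relative_resistance(n_fins):
    """Thermal resistance, up to a constant, as 1/area."""
    return 1.0 / fin_area_mm2(n_fins)

coarse = relative_resistance(20)  # few, thick fins
fine = relative_resistance(50)    # more, thinner fins in the same width
print(f"fine-fin stack: ~{coarse / fine:.1f}x lower thermal resistance")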
 
We could possibly see higher power consumption once we get DirectX 10 games. At the moment there must be parts of the chip not being fully utilized, e.g. the geometry shader.
 
If it's a unified architecture, those geometry units should still not be idling. There's plenty of pixel/vertex data to keep them busy.
 
Since it's a no-brainer that R600 will also be a unified shader core, I'd love to know where either it or G80 has dedicated geometry shader units sitting around idle. On both, each ALU can process pixel, vertex, or geometry shaders.
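
Purely to illustrate the scheduling point (the names and batch counts here are made up, not anything from G80/R600 documentation), a unified pool simply drains whatever work is queued:

Code:
# A single pool of ALUs takes batches regardless of shader type, so no
# unit idles merely because a workload contains no geometry shaders.
from collections import deque

work = deque(["pixel"] * 6 + ["vertex"] * 3)  # DX9-style frame: no geometry
NUM_ALUS = 4

cycles = 0
while work:
    for _ in range(min(NUM_ALUS, len(work))):  # every free ALU grabs a batch
        work.popleft()
    cycles += 1

print(f"queue drained in {cycles} cycles; no ALUs reserved for absent work")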
 
Simply, I don't :smile:
I only said that we must compare the "load" figures for the whole system and not the "idle" ones.

There are two graphs in the link, but that's beside the point.

A very simple example: my UPS (1500VA) has a small front-panel LCD screen that shows, apart from battery capacity, the power load the system draws from it. Imagine five fields to fill: with the former 7800GTX@490/685MHz it would fill one field in 2D and two fields at full load. Now with the 8800GTX@default it's constantly stuck at two fields, without any fluctuations up or down.

It might sound a tad naive and simplistic overall, but it's an alternative to the 10W-up, 20W-down hairsplitting. As long as the thing is as silent as the G70 (in an already as-silent-as-possible system) I hardly have any reason to bother much about it. If someone interested in either R600 or G80 wants to sit all day measuring exactly how many kilowatts what burns, they can go in the anal direction.
 
Power is not quite as simplistic as that. I/O is actually one of the biggest causes of power consumption (at both the board level and the chip level): even with full utilisation of the shader core, power consumption can vary wildly depending on the amount of I/O going on to and from the chip. Massive shaders that keep the core nice and busy could cut down I/O, and so could actually reduce power draw relative to other scenarios.
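
A toy model of that point (entirely invented coefficients, not vendor figures): give board power a core term and a separate bandwidth term, and an ALU-bound "massive shader" can draw less than a bandwidth-hungry one even at full core utilisation.

Code:
# Toy board-power model: idle + core term scaled by ALU utilisation
# + an I/O term scaled by memory bandwidth. Coefficients are invented.

def board_power(alu_util, mem_bw_gbs,
                idle_w=40.0, core_w=80.0, w_per_gbs=1.0):
    return idle_w + alu_util * core_w + mem_bw_gbs * w_per_gbs

print(board_power(1.0, 10))  # long ALU-bound shader, little traffic: 130.0
print(board_power(1.0, 60))  # bandwidth-heavy workload:              180.0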
 
Is that (the bit in red bold) a hint of things to come? :cool:

I know it sounds general, but I cannot help thinking about the words "massive shaders" :p.
 
My interpretation is something like the Folding@Home GPU client, which is an example of a massive-shader application and demonstrates significantly lower power draw than the majority of game applications.
 