NV41 - GeForce 6700

Fodder said:
http://shell.world-net.co.nz/~ntan/a001_resize.jpg

Apparently 12-pipe, SLI capable, 256-bit bus, 2.8ns 128MB/256MB RAM. Does the lack of a Molex power connector reveal anything about the process used, or is it conceivable that a 12-pipe 130nm DDR1 part could fit within PCIe's power limits?

I didn't think the 6800NU would consume more than 75 watts of power, and I don't think any changes to the process would have been made. Does anyone remember the site that did the power consumption comparison of the high-end ATI/Nvidia lines?
 
ChrisRay said:
Fodder said:
http://shell.world-net.co.nz/~ntan/a001_resize.jpg

Apparently 12-pipe, SLI capable, 256-bit bus, 2.8ns 128MB/256MB RAM. Does the lack of a Molex power connector reveal anything about the process used, or is it conceivable that a 12-pipe 130nm DDR1 part could fit within PCIe's power limits?

I didn't think the 6800NU would consume more than 75 watts of power, and I don't think any changes to the process would have been made. Does anyone remember the site that did the power consumption comparison of the high-end ATI/Nvidia lines?

Xbit did an article: Part II: NVIDIA vs. ATI
And here are their results for the 6600GT.

The guy who does the power consumption tests hasn't done anything with the X700s yet.
 
xbit, too.

Edit: Too slow. :) But I think xbit did mention X700XT power consumption in their review (text, not a graph), and it used slightly less than the 6600GT (like 1W less idle, and maybe 5-10W load), IIRC.
 
Lezmaka said:
ChrisRay said:
Fodder said:
http://shell.world-net.co.nz/~ntan/a001_resize.jpg

Apparently 12-pipe, SLI capable, 256-bit bus, 2.8ns 128MB/256MB RAM. Does the lack of a Molex power connector reveal anything about the process used, or is it conceivable that a 12-pipe 130nm DDR1 part could fit within PCIe's power limits?

I didn't think the 6800NU would consume more than 75 watts of power, and I don't think any changes to the process would have been made. Does anyone remember the site that did the power consumption comparison of the high-end ATI/Nvidia lines?

Xbit did an article: Part II: NVIDIA vs. ATI
And here are their results for the 6600GT.

The guy who does the power consumption tests hasn't done anything with the X700s yet.

Thanks. Well, according to them the 6800 was only consuming like 45W.
 
Pete said:
Edit: Too slow. :) But I think xbit did mention X700XT power consumption in their review (text, not a graph), and it used slightly less than the 6600GT (like 1W less idle, and maybe 5-10W load), IIRC.
Well, I think they did. Saw this in their summary:

* The graphics card and the RADEON X700 XT core of the current revisions are rather hot at work, and this results in a noisy cooling system;
 
Pete said:
Edit: Too slow. :) But I think xbit did mention X700XT power consumption in their review (text, not a graph), and it used slightly less than the 6600GT (like 1W less idle, and maybe 5-10W load), IIRC.

I think it was the opposite; the X700XT used a bit more.
 
At the very least, the cooling solution was louder, so one may surmise that if it required louder cooling, it may also have drawn more power. But even after searching the web some more via Google, I found nothing direct on the X700's power consumption.
 
ChrisRay said:
Lezmaka said:
ChrisRay said:
Fodder said:
http://shell.world-net.co.nz/~ntan/a001_resize.jpg

Apparently 12-pipe, SLI capable, 256-bit bus, 2.8ns 128MB/256MB RAM. Does the lack of a Molex power connector reveal anything about the process used, or is it conceivable that a 12-pipe 130nm DDR1 part could fit within PCIe's power limits?

I didn't think the 6800NU would consume more than 75 watts of power, and I don't think any changes to the process would have been made. Does anyone remember the site that did the power consumption comparison of the high-end ATI/Nvidia lines?

Xbit did an article: Part II: NVIDIA vs. ATI
And here are their results for the 6600GT.

The guy who does the power consumption tests hasn't done anything with the X700s yet.

Thanks. Well, according to them the 6800 was only consuming like 45W.

~35-37W, unless it's a typo and you really meant 35.

The nominal clock rates of the Galaxy Glacier GeForce 6800 are 350/700MHz – as you see, the core frequency is originally 25MHz higher on this card than the recommended 325MHz. The graphics memory overclocked well, while the GPU – less successfully. The maximum stable frequencies were 375/900MHz.

350/700 = ~39W
375/900 = ~40+W

http://www.xbitlabs.com/articles/video/display/ati-vs-nv-power_4.html
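Those stock-vs-overclock numbers can be sanity-checked with a back-of-the-envelope estimate. This is purely illustrative and not xbit's methodology: the ~37 W baseline at the recommended 325 MHz is taken from the figure quoted above, and the constant-voltage, power-proportional-to-frequency assumption is mine.

```python
def scaled_power(base_watts, base_mhz, new_mhz):
    """Estimate GPU power at new_mhz, assuming dynamic power scales
    roughly linearly with core clock at a fixed voltage (P ∝ f)."""
    return base_watts * (new_mhz / base_mhz)

# Assumed baseline: ~37 W for a stock GeForce 6800 at 325 MHz.
print(round(scaled_power(37, 325, 350), 1))  # ~39.8 W at 350 MHz
print(round(scaled_power(37, 325, 375), 1))  # ~42.7 W at 375 MHz
```

The linear estimate lands close to the ~39 W and ~40+ W readings above; the gap at 375 MHz is plausible since the memory clock was also raised, which the simple core-clock model ignores.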
 
Bjorn said:
Pete said:
Edit: Too slow. :) But I think xbit did mention X700XT power consumption in in their review (text, not a graph), and it used slightly less than the 6600GT (like 1W less idle, and maybe 5-10W load), IIRC.

I think it was the opposite, the X700XT used a bit more.
You're right, most places say more. I was thinking of the wrong site, though. It was Hardware.fr that said their X700XT used less power than their 6600GT:


327 W vs. 344 W total system draw under load, in the XT's favor.
 
Those figures were measured at the wall, outside the power supply, which is typically not a very accurate way of measuring the power draw of a specific component, because other components may end up being stressed differently. In other words, I really want to see Xbit expand their power consumption tests to include the X700.
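A rough illustration of why a wall-socket delta overstates the card-level delta: the PSU's conversion losses are included in the reading. The 75% efficiency figure below is an assumed typical value for PSUs of that era, not a number from either review.

```python
def component_delta(wall_a_watts, wall_b_watts, psu_efficiency=0.75):
    """Convert a measured wall-power difference into an estimated
    DC-side (component-level) difference, assuming the PSU runs at a
    constant efficiency and all other components draw the same power."""
    return (wall_a_watts - wall_b_watts) * psu_efficiency

# Hardware.fr's totals under load: 344 W (6600GT system) vs 327 W (X700XT system).
print(component_delta(344, 327))  # 12.75 -> the 17 W wall delta shrinks to ~13 W
```

Even this correction assumes the rest of the system was stressed identically in both runs, which is exactly the uncertainty noted above.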
 