Power Consumption: ATi vs NVidia

This can't be correct. The test shows that many people with lower-wattage PSUs could hook up an NV40 to replace their NV3x/R3x, as they draw roughly equal power :oops:
 
overclocked said:
This can't be correct. The test shows that many people with lower-wattage PSUs could hook up an NV40 to replace their NV3x/R3x, as they draw roughly equal power :oops:

It also shows that the 6800 Ultra and GT draw a higher current from the 12V rail.
 
Come to think of it - these power numbers show the GT drawing 20W below the PCI Express power supply levels - but the PCIe GT doesn't operate from the PCI Express bus alone, it needs the new auxiliary power connector.

:?:
 
Would it be possible to ask NVIDIA and ATI for a basic wiring diagram for their cards? Nothing really detailed, just how the card draws power and from where exactly. If the AGP voltages are just tied to the molex voltages on a common bus, you could possibly disable the traces on the mobo that power it and force it to use only the molex, which you could easily get a meter on.

Of course, this is all theory and who knows if it would work, but it could be possible.
 
Anarchist4000 said:
Would it be possible to ask NVIDIA and ATI for a basic wiring diagram for their cards? Nothing really detailed, just how the card draws power and from where exactly. If the AGP voltages are just tied to the molex voltages on a common bus, you could possibly disable the traces on the mobo that power it and force it to use only the molex, which you could easily get a meter on.

Of course, this is all theory and who knows if it would work, but it could be possible.
I doubt it would work because you can only draw so much power through the molex connector. Also, the board may have been designed to balance power draw from the AGP and molex connectors. Disabling one source would cause a large draw on the other (assuming it could work at all), which could damage components not designed to handle the full power drain.

-FUDie
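
For a rough sense of why, here's a back-of-the-envelope sketch; the ~5A-per-pin rating is my own conservative guess for a standard 4-pin molex, not a figure from any spec quoted in this thread:

```python
# Rough ceiling on power available through one 4-pin molex connector,
# assuming ~5 A per pin (my conservative guess for the crimp contacts).
AMPS_PER_PIN = 5.0

rails = {"+12V": 12.0, "+5V": 5.0}  # one power pin each

total_watts = sum(volts * AMPS_PER_PIN for volts in rails.values())
print(f"Approx. molex ceiling: {total_watts:.0f} W")
# -> 85 W, so a card can't simply shift its whole draw onto the
#    molex if the AGP slot's share is cut off.
```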
 
DaveBaumann said:
Come to think of it - these power numbers show the GT drawing 20W below the PCI Express power supply levels - but the PCIe GT doesn't operate from the PCI Express bus alone, it needs the new auxiliary power connector.

:?:

I'm sure someone will be along to feed you the "it's just for overclocking purposes" line, but I don't see them putting on the molex if it wasn't at least close to being necessary.

I am skeptical of those xbit numbers. Didn't NVIDIA, in their own documentation, rate the 6800U at 90 or 100 watts or something? I know they often give a margin, but 25%+?
 
AlphaWolf said:
I'm sure someone will be along to feed you the "it's just for overclocking purposes" line, but I don't see them putting on the molex if it wasn't at least close to being necessary.

Well, I did the PCIe X800 XT review a while back without the connector, as I didn't have the cable. I was sent the 6800 GT afterwards and that didn't come with a cable either - I asked NVIDIA if it was OK to run it without one, as it was with the XT, and they said the GT needs it, so they posted me a cable.
 
I'm getting a 6800U OC and don't know whether I should stick with my Antec TruePower 380W. I did a test and my system draws 211W. I added the 6800U and it came up to 293W.
 
Remember that power supplies are inefficient. The board itself doesn't draw that much more power from the power supply, even though the power supply pulls that much more from the wall. This is why power supplies heat up (and why you want the air from the power supply to blow out of the case).

Also remember that you purchased an overclocked version. That will definitely draw more power than even a normal GeForce 6800 Ultra, as you can read earlier in the thread.
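
To put rough numbers on it - assuming ~75% efficiency, which is just a typical ballpark, not a measured figure for that Antec:

```python
# Wall readings from the post above.
before_watts = 211.0  # system without the 6800U, at the wall
after_watts = 293.0   # system with the 6800U, at the wall
efficiency = 0.75     # assumed typical PSU efficiency (a guess)

wall_delta = after_watts - before_watts  # 82 W
dc_delta = wall_delta * efficiency       # ~62 W

print(f"Extra draw at the wall: {wall_delta:.0f} W")
print(f"Approx. extra DC draw:  {dc_delta:.0f} W")
# The missing ~20 W is dissipated as heat inside the power supply.
```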
 
madshi said:
Thanks! That's very interesting to me, especially the low power consumption of the plain 6800. I planned to buy an X800 Pro, but the 6800 looks mighty tempting after seeing this chart. If only it had gamma-corrected 6xAA... :cry:
Are you thinking in terms of heat in your case, or just power draw? If the latter, consider that if you spend 5x as much time in 2D mode as in 3D mode, the X800P turns out to be just as power hungry as the 6800. Plus, the X800P packs an extra 128MB, and you can find it for ~$330 fairly often--not too bad compared to a $285 6800 ($250 if you account for Far Cry).
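
Something like this, with made-up idle/load wattages just to illustrate the weighting (substitute the real chart values):

```python
# Time-weighted average power, with 5x as much time in 2D as 3D.
# Idle/load numbers are illustrative placeholders, NOT xbit's data.
cards = {
    "X800 Pro": {"idle": 15.0, "load": 49.0},
    "6800":     {"idle": 16.0, "load": 39.0},
}
ratio_2d = 5.0  # hours in 2D per hour in 3D

for name, w in cards.items():
    avg = (ratio_2d * w["idle"] + w["load"]) / (ratio_2d + 1)
    print(f"{name}: {avg:.1f} W time-weighted average")
# With these placeholder figures the two come out within a watt
# of each other once 2D time dominates.
```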
 
I doubt it would work because you can only draw so much power through the molex connector. Also, the board may have been designed to balance power draw from the AGP and molex connectors. Disabling one source would cause a large draw on the other (assuming it could work at all), which could damage components not designed to handle the full power drain.

-FUDie

I'm working off the assumption that the power problem is related to how much the traces on the mobo heat up when you get so much current going through them. The whole point would be to provide x volts to the card; the problem is just how to get enough current flowing without seriously heating up the board. If the power trace and the molex are tied together just to increase the amount of current the card can draw, cutting that trace could work fairly easily. This of course assumes that the trace and molex are directly tied to each other on the card somewhere.
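
Rough idea of the heating, assuming a made-up 10 milliohm trace resistance:

```python
# Resistive heating in a board trace: P = I^2 * R.
# 10 milliohms is a made-up ballpark, not a measured value.
trace_resistance = 0.010  # ohms (assumed)

for current in (2.0, 5.0, 10.0):
    heat = current**2 * trace_resistance
    print(f"{current:4.1f} A -> {heat:.2f} W dissipated in the trace")
# Heating grows with the square of the current, which is why
# splitting the draw between slot and molex keeps traces cooler.
```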
 
Pete said:
Are you thinking in terms of heat in your case, or just power draw? If the latter, consider that if you spend 5x as much time in 2D mode as in 3D mode, the X800P turns out to be just as power hungry as the 6800. Plus, the X800P packs an extra 128MB, and you can find it for ~$330 fairly often--not too bad compared to a $285 6800 ($250 if you account for Far Cry).
I'm thinking about heat. I know, the X800 Pro is great at idle/2D. I just need to make sure that my PC doesn't get too hot while playing demanding 3D games. My PC is built to be silent from the ground up, and the way I built it only works if the graphics card doesn't produce too much heat.

About the prices: the lowest price here in Germany for an X800 Pro is 400 Euro (if you choose a shop which actually has the card in stock), while the 6800 "only" costs 300 Euro. But anyway, yesterday I made up my mind and ordered an X800 Pro for two reasons: (1) I'm a bit skeptical of those xbitlabs numbers; I suspect all the 6800 numbers are too low. Somehow I trust the techreport numbers more, which show the R420 in a better light compared to the NV40. (2) My monitor doesn't handle 1600x1200 very well, so I'll use lower resolutions and want to have gamma-corrected 6xAA (plus TAA).
 
Anarchist4000 said:
Would it be possible to ask NVIDIA and ATI for a basic wiring diagram for their cards? Nothing really detailed, just how the card draws power and from where exactly. If the AGP voltages are just tied to the molex voltages on a common bus, you could possibly disable the traces on the mobo that power it and force it to use only the molex, which you could easily get a meter on.

Of course, this is all theory and who knows if it would work, but it could be possible.
The AGP spec was made by Intel and is freely available from their website. Additionally, if you look at the first page of the thread, you can see that xbitlabs took the above idea one step further and actually supplied power via the appropriate pins on the AGP card with their own circuits, so that they could measure all of the power going to the card.
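
The measurement itself is then just summing volts times amps on each rail - a sketch with made-up readings, not xbit's actual data:

```python
# Total card power from per-rail measurements: P = sum(V_i * I_i).
# The current values below are made-up examples, not xbit's data.
readings = [
    ("+3.3V", 3.3, 2.0),  # (rail, volts, measured amps)
    ("+5V",   5.0, 1.5),
    ("+12V", 12.0, 3.5),
]

total = 0.0
for rail, volts, amps in readings:
    watts = volts * amps
    total += watts
    print(f"{rail}: {watts:.1f} W")
print(f"Total card draw: {total:.1f} W")
```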
 
AlphaWolf said:
DaveBaumann said:
Come to think of it - these power numbers show the GT drawing 20W below the PCI Express power supply levels - but the PCIe GT doesn't operate from the PCI Express bus alone, it needs the new auxiliary power connector.

:?:
I am skeptical of those xbit numbers. Didn't NVIDIA, in their own documentation, rate the 6800U at 90 or 100 watts or something? I know they often give a margin, but 25%+?
And what's the efficiency of a typical power supply? (Hint: factor it in and you'll get almost exactly 90-100W.)

I don't know how much power the PCIe slot can transfer and on what voltages - could it be that it can't transfer that much power on 12V?
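
For what it's worth, as I recall the PCIe graphics (CEM) spec - and this is from memory, so verify it - an x16 graphics slot tops out around 75W, most of it on 12V:

```python
# PCIe x16 graphics slot budget, per the CEM spec as I recall it
# (figures from memory -- verify against the actual spec).
slot_limits = {
    "+12V": (12.0, 5.5),   # volts, max amps -> 66 W
    "+3.3V": (3.3, 3.0),   # volts, max amps -> ~9.9 W
}

total = sum(v * a for v, a in slot_limits.values())
print(f"Slot ceiling: ~{total:.0f} W (capped at 75 W overall)")
# A GT near that ceiling would explain why the auxiliary
# connector is mandatory rather than optional.
```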
 
Xbit still hasn't done the NVIDIA cards, have they? I have been waiting quite a while for that set to come out, but it hasn't shown up yet.

Perhaps the 6800 GT for some reason has quite a large oscillation in its draw, so the average would be low but it can sometimes swing high enough to need the extra current.
 
madshi said:
I'm thinking about heat.
Thought so. I just wanted to get my point across about the X800P probably consuming as much energy overall. :)

I just need to make sure that my PC doesn't get too hot while playing demanding 3D games. My PC is built to be silent from the ground up, and the way I built it only works if the graphics card doesn't produce too much heat.
Yeah, I rank silence right up there. Are you working out of an SFF, or will you be slapping a VGA Silencer on there at some point?

It's amazing how expensive the new Silencers are, BTW. NewEgg has 'em for $30+ without shipping, compared to the $15 I remember the original going for. I think the original still fits the X800P, though, which is nice.
 
Pete said:
Are you working out of a SFF, or will you be slapping a VGA Silencer on there at some point?
I've already bought a passive Aerocool VM-101 heatpipe, which according to some tests I've read gives you better temps than the active VGA Silencer on low speed! :p The heatpipe only works that well if you have good airflow in your case, though (which I should have).
 
Martillo1 said:
What about using the Peltier effect?
Peltiers increase overall power draw, since they don't operate at anywhere near 100% efficiency. They can be useful to alleviate problems with heat locality, but they invariably increase the total amount of heat that needs to be dissipated.
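
A quick sketch of the bookkeeping, assuming a COP of 0.5, which is my ballpark for a typical TEC, not a quoted figure:

```python
# Heat balance for a Peltier (TEC) cooler.
# COP = heat pumped / electrical input; 0.5 is an assumed ballpark.
gpu_heat = 60.0  # watts of heat to pump off the GPU (example value)
cop = 0.5        # assumed coefficient of performance

electrical_input = gpu_heat / cop            # 120 W
hot_side_heat = gpu_heat + electrical_input  # 180 W

print(f"Electrical input: {electrical_input:.0f} W")
print(f"Heat at hot side: {hot_side_heat:.0f} W")
# The GPU runs cooler, but the case now has 180 W to get rid of
# instead of 60 W.
```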
 