Nvidia lowering power supply requirements?

bdmosky

Newcomer
This interview indicates that perhaps the power supply requirements are officially being lowered a bit.

Perhaps Jen-Hsun Huang's comments weren't as off base as people were thinking?
 
Or perhaps it's just an extension of the same spin control JHH was performing. A good 350 watt power supply may actually be better than a cheap 480 watt PSU in any event. It will be interesting to see how OEM adoption of the 6800 cards shapes up.
 
AlphaWolf said:
Or perhaps it's just an extension of the same spin control JHH was performing. A good 350 watt power supply may actually be better than a cheap 480 watt PSU in any event. It will be interesting to see how OEM adoption of the 6800 cards shapes up.

So the question is what it will say on the box as required, because if it says 480 watts and you burn the bastard out on a 350 because you read GameSpot, guess what, you're shit out of luck. The real question is what's required for warranty issues, baby.
 
The interview states that you will need a "good" or "robust" 350W PSU. Sounds a bit dodgy to me.

Either nVidia is just going to be a bit more risky with the official spec they allow on retail boxes... or, what I suspect they're doing, is telling OEM system integrators the specific 12V rail specs that are needed... letting OEMs choose "any 480W power supply" or a "350W power supply that meets the following minimum 12V rail spec."

I was hoping to find out that a new chip revision actually lowered power draw, but that's not the case.
 
Well, it's always the case. A shitty power supply won't ever be able to do what it's supposed to do, unfortunately :(
 
nVidia's simply had time to feel out the reactions to both their new parts and those of ATI, and they know this is the single greatest negative being associated with their lineup, both with OEMs and the online gaming community. What I find strange is why their engineers would tell the company's PR/marketing to specify 480W PSUs in their press releases if the boards would be fine with 'quality' 350W PSUs? Things that make you go hmmm. . . .
 
John Reynolds said:
What I find strange is why their engineers would tell the company's PR/marketing to specify 480W PSUs in their press releases if the boards would be fine with 'quality' 350W PSUs?
Because 9/10 PSUs sold are most definitely not 'quality'? I have severe doubts my Omni or Codegen 400 watters could handle a 6800 Ultra; I had a "300W" Omni that fell over when moving from a TNT2 to a GF2MX.
 
Evildeus said:
Well, it's always the case. A shitty power supply won't ever be able to do what it's supposed to do, unfortunately :(

Thing is, ED, a "shitty power supply" = cheap power supply, and a cheap one has a higher likelihood of working with the X800 than the 6800. That's why ATI never had to "play it safe" and recommend 480W power supplies in the first place. ATI can "play it safe" by recommending 325W (IIRC?) power supplies.

A 480W requirement cuts out a LOT of options for OEMs. nVidia is going to make it a bit easier for them by going through some process of defining quality power supplies at lower wattages. Likely they will give OEMs some spec for the power on the 12V line and/or voltage tolerances that should be adhered to.
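
To put rough numbers on that 12V point, here's a minimal sketch; the amperage figures are invented for illustration, not specs from any real unit:

def usable_12v_watts(rail_amps):
    """Power deliverable on the combined 12V rail(s)."""
    return 12.0 * rail_amps

# Assumed figures, purely hypothetical: a quality 350W unit with a solid
# 26A combined 12V rail vs. a cheap 480W unit with a weak 15A rail.
quality_350w = usable_12v_watts(26.0)  # 312W available on 12V
cheap_480w = usable_12v_watts(15.0)    # 180W available on 12V

print(f"quality 350W: {quality_350w:.0f}W on 12V")
print(f"cheap 480W:   {cheap_480w:.0f}W on 12V")

The sticker wattage is the total across all the rails; a card hanging off the Molex connectors mostly cares about what the 12V rail can actually deliver.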
 
Well, I think that for the R360 it's 350W, and I didn't see the requirement for the R420. Anyone got the information?

You are right, but when I look at Fodder's example, I'm a bit sceptical that the R420 shouldn't also require a 480W PSU. As we don't know the other specifics when a firm says 350W (voltage tolerance?), we can't say what the real requirement is for either.
 
Fodder said:
I had a "300W" Omni that fell over when moving from a TNT2 to a GF2MX.

Wow, that must have been one really crappy PSU... a GF2MX eats up something like 4-5 watts...! :oops:
 
Look at this chart: if anything, the requirement for the X800 XT should be a little higher than for a 9800XT, and the 6800U draws almost the same as a 5950U :?
[chart: power consumption comparison across video cards, from hardware.fr]

http://www.hardware.fr/articles/494/page7.html
 
I would hazard a guess that it's due to the TSMC version of the chip looking good, if the July availability rumors are correct. On the other hand, it may suggest that NV had planned to clock the NV40 a lot higher, resulting in high projected power requirements. If that's the case, maybe yield issues had them scale the clock back, making the extra Molex connector redundant.
 
The problem with running Prime too is that you are not constantly loading the GPU.

It doesn't show the relative power consumption of the GPUs in GPU-bound situations, which is what would give a truer comparison of video card power draw.

Now that Ruby can be run on both cards... perhaps that demo looping at high res would be a decent test...
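
For reference, the at-the-wall subtraction method in question boils down to something like this; the readings and the fixed efficiency figure are assumptions for illustration:

PSU_EFFICIENCY = 0.70  # assumed constant across idle and load, which it isn't quite

def estimated_card_delta(wall_watts_loaded, wall_watts_idle):
    """Extra DC power attributed to the video card, inferred from
    two AC wall-socket readings taken on the same system."""
    return (wall_watts_loaded - wall_watts_idle) * PSU_EFFICIENCY

# Hypothetical readings: looping a GPU-bound demo vs. idling at the desktop.
print(f"{estimated_card_delta(285.0, 210.0):.0f}W")  # ~52W

Everything the CPU, disks, and fans do differently between the two readings lands in that delta, which is exactly the objection here.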
 
I guess we'll have to wait for actual reviews to see if the new chips/cards actually need less power or if this is just ANTI-FUD.

I'm also wondering if this will affect the clock speeds and the benchmark results we've already seen.
 
Evildeus said:
You are right, but when I look at Fodder's example, I'm a bit sceptical that the R420 shouldn't also require a 480W PSU.

Um....Why? R420 consumes about the same or less than the 9800XT. Did the 9800XT require a 480W power supply?
 
trinibwoy said:
What's wrong with running Prime too?

What Joe said + it's just bad practice to measure large values and subtract them... you're supposed to minimize the number of factors rather than make assumptions about them being constant, even IF your assumptions are justified.
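
To put a number on that, a quick sketch with an assumed +/-5W meter tolerance and hypothetical readings:

meter_error = 5.0             # +/- watts per reading (assumed tolerance)
loaded, idle = 285.0, 210.0   # hypothetical wall-socket readings

delta = loaded - idle          # 75W difference
worst_case = 2 * meter_error   # absolute errors add on subtraction: +/-10W

print(f"delta = {delta:.0f}W +/- {worst_case:.0f}W "
      f"({100 * worst_case / delta:.0f}% relative error)")
# Roughly 2% error on each reading becomes ~13% on the difference.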
 
anaqer said:
trinibwoy said:
What's wrong with running Prime too?

What Joe said + it's just bad practice to measure large values and subtract them... you're supposed to minimize the number of factors rather than make assumptions about them being constant, even IF your assumptions are justified.

Agreed :)

On another note, I'm getting tired of the repetitive arguments. I probably won't buy any of the current products anyway, and the driver/optimization/performance landscape may be quite different come fall when I upgrade. A few things that might stimulate my interest before the refresh, if they ever come to fruition, are:

1) Shadermark 2.1 is released along with DX9.0c
2) The Far Cry path/driver issues are resolved
3) Nvidia exposes more AA modes than currently available as suggested by 3DCenter's article
4) Nvidia exposes their old AF implementation (sure to cause some IQ flame wars :LOL: )
5) HL2/DOOM3 :?: :?: :?:
6) Getting the Cats on my 9800PRO so I can finally use some game profiles... doing it manually is getting to be a pain in the ass.
 