NVIDIA GF100 & Friends speculation

Let me just pose this question then: if one were to take a GTX 480 and run the fan @ 100%, perhaps bringing the loaded temperature down from 95 degrees C to say 75 degrees C, is it possible this would cause a measurable reduction in power usage, aside from the obvious increase in power draw by the fan itself?
 
> Let me just pose this question then: if one were to take a GTX 480 and run the fan @ 100%, perhaps bringing the loaded temperature down from 95 degrees C to say 75 degrees C, is it possible this would cause a measurable reduction in power usage, aside from the obvious increase in power draw by the fan itself?

Yes. Most likely you would see a measurable difference in power from a ~20 °C operating differential.
 
To put that into context - around operating temps, IIRC, we saw roughly a correlation of ~1 W per 1 °C on Cypress.
 
> Er, well, while it is true that increases in temperature increase the number of mobile electrons/holes in semiconductors, processors rely on doping for the majority of their mobile electrons/holes, so that effect is small, and is generally offset by other deleterious effects, as far as I am aware.
I was just teasing the guy for his statement.

Joking aside, higher temperatures should affect leakage. Undoped Si between transistors, p-n junctions, and even the gate insulator all increase in conductivity, and higher temperature increases tunnelling current, too. I don't think the resistance of the doped areas makes much difference to power consumption.
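The temperature dependence of that leakage can be sketched with the standard subthreshold-current model, which grows roughly exponentially as the threshold voltage shrinks relative to the thermal voltage. All the device parameters below (nominal Vth, its temperature coefficient, the ideality factor) are made-up illustrative values, not anything measured on Cypress or GF100:

```python
import math

K_B_OVER_Q = 8.617e-5  # Boltzmann constant over electron charge, V/K


def subthreshold_leakage(temp_c, i0=1.0, vth_25c=0.30, dvth_dt=-1.5e-3, n=1.5):
    """Relative subthreshold leakage vs. temperature (illustrative only).

    i0, vth_25c, dvth_dt and n are assumed values, not real GPU parameters.
    """
    t_k = temp_c + 273.15
    v_t = K_B_OVER_Q * t_k                      # thermal voltage kT/q, rises with T
    vth = vth_25c + dvth_dt * (temp_c - 25.0)   # Vth drops as the die heats up
    # Textbook off-state subthreshold current: I ~ (kT/q)^2 * exp(-Vth / (n*kT/q))
    return i0 * v_t**2 * math.exp(-vth / (n * v_t))


ratio = subthreshold_leakage(95.0) / subthreshold_leakage(75.0)
print(f"leakage at 95 C is ~{ratio:.1f}x leakage at 75 C")
```

With these assumed parameters a 95 C -> 75 C drop cuts subthreshold leakage by a factor of roughly 2-3, which is the qualitative reason a colder chip draws measurably less power at the same clocks.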

Thanks, Dave, for the quantitative data.
 
But that's power draw measured at the AC source, so the losses/efficiency/thermal variance of the rest of the system have to be factored into that reading.
 
> Joking aside, higher temperatures should affect leakage. Undoped Si between transistors, p-n junctions, and even the gate insulator all increase in conductivity, and higher temperature increases tunnelling current, too. I don't think the resistance of the doped areas makes much difference to power consumption.
Right, I was thinking that might be the case, but I wasn't sure.
 
> But that's power draw measured at the AC source, so the losses/efficiency/thermal variance of the rest of the system have to be factored into that reading.

Plus the increase in fan speed, which adds to that as well.

But I did several tests on my Cypress when I first got my card, and I can confirm that what Dave said is correct: +1 °C = +1 W at full load for a stock HD 5870, between 70 °C and 85 °C at least.
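Taking that ~1 W per 1 °C figure at face value, the fan question from earlier in the thread reduces to simple arithmetic; the extra fan draw below is an assumed number for illustration, not a measurement:

```python
W_PER_DEGREE = 1.0   # rough Cypress correlation from the thread: ~1 W per 1 C
TEMP_DROP_C = 20.0   # e.g. 95 C -> 75 C from running the fan at 100%
FAN_EXTRA_W = 6.0    # assumed extra draw of the fan at full speed (hypothetical)

chip_saving = W_PER_DEGREE * TEMP_DROP_C  # ~20 W less dissipated by the chip
net_saving = chip_saving - FAN_EXTRA_W    # the fan claws some of that back
print(f"net saving: ~{net_saving:.0f} W")
```

Note that a reading at the AC source would also fold in PSU efficiency, as pointed out above, so the measured delta at the wall would differ from this chip-level estimate.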
 
Also apparently ATI didn't get access to the final game code either.

http://hardforum.com/showpost.php?p=1035518165&postcount=27

> I just can't fathom why they would purposely not give ATI access to the final code to tweak their drivers, if TWIMTBP is just supposed to be Nvidia offering support like they claim.
As far as I know, ATI doesn't have a comparable developer relations program, so they would hardly have reason to show ATI their code. ATI might just say, "That's nice. Why are you showing this to us?"
 
> As far as I know, ATI doesn't have a comparable developer relations program, so they would hardly have reason to show ATI their code. ATI might just say, "That's nice. Why are you showing this to us?"
Agreed on the former, but I think it's improving, with great pains; at least, that is my assumption, based on no concrete evidence.

But on the latter, 4A had nothing* to lose by offering to show the code. Risking some hurt feelings is a far smaller trade-off than making your game more efficient across the platform, which in turn might get them at the very least fractionally more sales.

*Unless the trade-off had something else in play.
 
> The products may be binned differently, though. At 225 W, the lower-leakage range may be reserved for the 470, while with a higher TDP on the 480, the higher-leakage parts may be reserved for that.
Good point! That'd also correspond with the OC results we were getting.
 
> But on the latter, 4A had nothing* to lose by offering to show the code. Risking some hurt feelings is a far smaller trade-off than making your game more efficient across the platform, which in turn might get them at the very least fractionally more sales.
The difference is that nVidia is actively reaching out to developers, in effect saying, "Let us help you with that. We can make your game run faster!" Also bear in mind that nVidia's developer relations program is highly unlikely to exclude optimizations that would also help ATI hardware; thus the TWIMTBP program likely improves game performance for both IHVs, though obviously the program's main focus is going to be nVidia hardware, not least because nVidia's support staff surely know much more about nVidia hardware than ATI hardware.
 
> To put that into context - around operating temps, IIRC, we saw roughly a correlation of ~1 W per 1 °C on Cypress.

Wow, I was wondering why my entire PC draws south of 250 W under load, given the TDPs of both Cypress and Yorkfield (both OC'ed)! So water cooling is power-efficient as well ;)!
 