Xmas said:
In mobile devices clock-throttling obviously saves a significant amount of power. How do you know the cost of adding all this? Especially in the light of NVidia having to design such circuitry and software for their mobile parts anyway?
The first answer is that this is not a mobile part. (Was the nv30 in the 5800U a mobile part? People are grossly confused about this.) This product is *not* designed for the mobile market--what's unclear about that? It consumes far more power at 400MHz when doing 3D operations than it would at 400MHz doing 2D operations, or at 235MHz doing 2D operations.
As far as cost goes, if you can't figure out why it would cost more to include clock shifting and throttling in a 3D card than it would to exclude it, I can't help you.
Again, if the object here was power saving, then the card would clock at 235MHz for both 2D and 3D operations. That would provide the maximum power savings over any of the modes discussed.
Additionally, the trigger for the clock-throttle and downclock is GPU heat--not power consumption or battery life! There's a tab in the driver config that plainly allows the end user to set the limit for GPU heat which, when reached, will throttle back the MHz speed of the chip while it is doing 3D processing. (It's never going to overheat in 2D, whether running at 235MHz or 400MHz.)
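Just to pin down the behavior being described, here's a minimal sketch of that clock-selection logic. This is purely illustrative; the throttled clock value and the default temperature limit are invented numbers, not anything NVIDIA has published:

```python
# Hypothetical sketch of the heat-triggered throttle described above.
# The 400/235 clocks come from the discussion; the throttled clock and
# the default limit are assumptions for illustration only.

CLOCK_3D_MHZ = 400         # full 3D clock
CLOCK_2D_MHZ = 235         # fixed 2D clock
CLOCK_THROTTLED_MHZ = 300  # assumed fallback clock when too hot

def select_clock(mode: str, gpu_temp_c: float, user_limit_c: float = 90.0) -> int:
    """Pick a core clock: 2D always runs at the low clock; 3D throttles
    back only when the user-set temperature limit is reached."""
    if mode == "2D":
        return CLOCK_2D_MHZ
    # 3D mode: the trigger is heat, not power draw or workload
    if gpu_temp_c >= user_limit_c:
        return CLOCK_THROTTLED_MHZ
    return CLOCK_3D_MHZ
```

Note that power consumption never appears as an input--which is the whole point of the argument.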
The only way this clock-throttling could be a bad thing is if it would be triggered already at rather lowish temperatures. But if it only happens in extreme cases like fan failure, then it's entirely a good thing.
I think you need to add in another condition as well--if it throttles back while running a 3D game, that would be a bad thing. I trust you'll agree with that idea, as that's the point I've made from the beginning.
If you think this has anything at all to do with "power saving," you are deluded. I've not seen a single instance of nVidia PR where they have advertised this as a "power saving" function--rather, this is an imaginative spin put on the issue by people who want to apologize for it.
Look, what happens in a mobile power-saving scenario? First of all, it *does not* depend on cpu temperature as a trigger--because the goal is not clock-throttling but power saving (the two are quite separate categories, not related to each other). Whereas in a mobile power-saving scenario the goal is to reduce the MHz speed when computational demand is light, without regard to heat, for the express purpose of conserving battery life, in the 5600 the goal is to reduce MHz to manage heat. Any power-saving spin you want to put on that is purely coincidental to the goal of heat management.
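To make the distinction concrete, here are the two policies side by side as a sketch. Again, all clock values and thresholds are invented for illustration:

```python
# Hypothetical contrast between the two policies discussed above.
# Clocks and thresholds are illustrative, not real product values.

def mobile_powersave_clock(load_pct: float) -> int:
    """Mobile-style power saving: the clock follows computational demand;
    temperature plays no part in the decision."""
    return 400 if load_pct > 50.0 else 235

def heat_management_clock(gpu_temp_c: float, limit_c: float = 90.0) -> int:
    """5600-style throttling as described: the clock follows temperature;
    demand plays no part in the decision."""
    return 235 if gpu_temp_c >= limit_c else 400
```

Same output range, entirely different trigger--that's why the mobile analogy fails.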
So why is the heat management necessary to begin with? Simple: because the chip was not designed to run indefinitely at 400MHz+, and won't, and so it is essential to the viability of the chip that it *not* be continuously run at 400MHz. Therefore nVidia has designed the reference card to *always* throttle back to 235MHz in 2D, and to *always* throttle back to a lower MHz when temperature thresholds are exceeded *during* 3D operation. This leads to an inevitable implication: at 400MHz+ the chip is being *overvolted* in order to be overclocked so as to process 3D at those MHz speeds. This requires a somewhat sophisticated approach to manage the heat that results from the increased power consumed at 400MHz+ while doing 3D processing.
I hope you'll think about it and see there's a big difference between deliberately managing a clock to conserve power, and clock-throttling which is done specifically to manage heat.
I would really like to see more dynamically-clocked GPUs and CPUs (clocked high when performance is required, clocked low when doing simple tasks) in the desktop market. I'd like to have a PC that is totally silent when browsing the net or doing office work, but fast enough for highest quality 3D graphics.
Look, if you want "dynamically clocked" gpus you'll have to wait, because the 5600 simply doesn't fit your definition. If it were "dynamically clocked" for the purpose of conserving power and reducing noise, then why doesn't the card allow you the option of running all of your 3D at 235MHz? *That* is something I could agree belongs in the "power-saving, noise-management" category (well, the latter only if the fan changes speeds or shuts down).
But a *heat trigger* designed to throttle back 3D processing at 400MHz to a lower MHz speed? That has nothing to do with either power or noise management, and everything to do with controlling destructive temperatures. Again, the two are unrelated. Very simply, if the 5600 could run indefinitely at 400MHz+ with nominal heat and voltage signatures, the card would have been designed to run that way from the start.
That's what's been wrong with the entire idea of using mobile power-saving tech as an analogy here. BTW, my 9800P makes exactly as much "noise" at 445MHz as it does at 380MHz...
(Which is not much...)
Edit: I guess, still, nobody knows whether the 5900U clockshifts and throttles...?