So, do we know anything about RV670 yet?

I read it as a means to downclock when not in "3D". For example, under Windows Classic you'd get "2D" clocks, under Aero "2.5D" clocks, and in gaming "3D" clocks. Heavy scenes vs. cutscenes is just marketing IMHO.

Nothing new if that's the case.

Chris
 
Since when are multiple clock settings such a new thing? RivaTuner has supported various power themes and multiple clock settings for a long, long time on Nvidia hardware.

I really don't see how core throttling and power adjustments based on workload are such a new thing to some people. Most power-sensitive people have been doing this for a while. It looks like typical mobile technology at work to me.
 
But surely RivaTuner already had support for existing ATI power themes etc.? This should be "something new" by the looks of it, since it needed added support on RivaTuner's part.
 
So what is it going to do?

If I set my core using Overdrive to, let's say, 800, and the scene is not intense or is a cutscene, will it reduce to lower levels to save on power, heat, etc.?

Then when the scene gets heavy it is raised back to 800 if needed? Or even higher if it can?

Do the drivers do this automatically without enabling Overdrive?

Is it another setting in OverDrive to be more flexible?

How much power can one actually save?

Sounds like PowerPlay mobile technology added to 3D gaming, or something like that.
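
Just to make the question concrete, here's a toy sketch (Python, with made-up clocks and voltages, not anything from AMD) of what a purely profile-based scheme would look like, where the driver picks a clock set from the workload *type* rather than from measured load:

```python
# Toy sketch of profile-based clock switching (made-up clocks/voltages, not AMD's tables).
PROFILES = {
    "2D":      {"core_mhz": 300, "mem_mhz": 400, "vcore": 0.90},  # desktop / Windows Classic
    "light3D": {"core_mhz": 500, "mem_mhz": 700, "vcore": 1.05},  # Aero / video playback
    "full3D":  {"core_mhz": 800, "mem_mhz": 900, "vcore": 1.20},  # games (the "800" Overdrive case)
}

def pick_profile(fullscreen_3d: bool, compositor_active: bool) -> str:
    """Pick a clock profile from the workload type, not from measured load."""
    if fullscreen_3d:
        return "full3D"
    if compositor_active:
        return "light3D"
    return "2D"

if __name__ == "__main__":
    for scene in [(False, False), (False, True), (True, True)]:
        name = pick_profile(*scene)
        print(name, PROFILES[name])
```

If the new feature really is load-driven during gameplay rather than just profile-driven like this, that would be the interesting part.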
 
But surely RivaTuner already had support for existing ATI power themes etc.? This should be "something new" by the looks of it, since it needed added support on RivaTuner's part.


Possibly. I don't have any idea. All I was doing was replying to the possibility illustrated above about various clock settings being available in drivers. That has been supported by Nvidia drivers since the GeForce Go 6 series. To me that's nothing particularly new. Of course this could be something entirely different.

These settings can also be controlled at the bios level for Nvidia hardware.




But like I said, there's no way to know if this is actually what is being discussed here. I'm just commenting on the post speculating about the issue.
 
Well, I don't know for the mobile parts (except that they've had several powersaving methods for ages), but at least for desktop parts ATI has had separate 2D & 3D clocks (&volts) for a while already
 
Well, I don't know for the mobile parts (except that they've had several powersaving methods for ages), but at least for desktop parts ATI has had separate 2D & 3D clocks (&volts) for a while already

Yeah, basically the description just didn't seem that new to me. In Vista you can set the 2D, low-power 3D, and full 3D clock settings and actually adjust them with the power modes.

But most enthusiasts who "really" care about this sort of thing have been doing this for a while now.
 
I don't know if I "really" care because I don't know what the positives may or may not be.

Could these power-saving technologies help with multi-GPU setups as well? We complain about the high cost of power supplies and GPUs going crazy drawing high power -- anything that is new and may help is welcome from Nvidia or ATI.
 
In 2D, yes. If you want to lower your heat/power consumption while doing basic desktop functions such as web browsing or notepad work, there's zero reason to have your 2D clocks running at full capacity. I don't.
 
Uhrm... OK... then please explain to me... why did RivaTuner add support for something "new" if it wasn't something "new"? ;)
Don't get me wrong, but I've got the impression that truly dynamic clock adjustment incurs latency. For example in games: if you are slowly crawling, the load is low and clocks are lowered. Suddenly 10 barrels blow up due to an airstrike. The GPU doesn't know in advance that a big load is coming so it can rev up the clocks, so FPS dips until the board comes back to full speed (half a second? 3 seconds? I don't know).
If AMD has solved the problem - great! :cool:
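
For what it's worth, here's a little toy simulation (Python, all numbers invented) of that concern: a governor that can only step the clock up by a fixed amount per frame lags a sudden load spike for several frames, which is exactly where an FPS dip would show up. How fast the real hardware actually ramps, I have no idea.

```python
# Toy DVFS governor: clocks follow load with a limited step per frame (all numbers invented).
MIN_MHZ, MAX_MHZ = 300, 800
STEP_UP_MHZ = 100      # how much the governor may raise the clock each frame
STEP_DOWN_MHZ = 50     # how much it may lower it

def target_clock(load: float) -> float:
    """Map GPU load (0..1) to a desired clock."""
    return MIN_MHZ + load * (MAX_MHZ - MIN_MHZ)

def simulate(loads):
    clock = MIN_MHZ
    for frame, load in enumerate(loads):
        want = target_clock(load)
        if want > clock:
            clock = min(want, clock + STEP_UP_MHZ)
        else:
            clock = max(want, clock - STEP_DOWN_MHZ)
        print(f"frame {frame}: load={load:.1f} clock={clock:.0f} MHz")

if __name__ == "__main__":
    # Crawling quietly, then ten barrels blow up at frame 5.
    simulate([0.1] * 5 + [1.0] * 5)
```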
 
I would expect about the same as for CPUs, which is only a few tens or hundreds of clocks (and even on a GPU there are hundreds of millions of clock cycles per second to play with).

If we're talking Cool'n'Quiet/SpeedStep for GPUs, I think it's new & a damn good idea.
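
Back-of-the-envelope, assuming the CPU-like figure above really applies to a GPU too:

```python
# Rough transition cost if a clock switch takes a few hundred core cycles
# (the assumption above; real switches may also involve PLL/voltage settling).
core_hz = 775e6   # ~775 MHz core clock
cycles = 300
print(f"{cycles / core_hz * 1e6:.2f} microseconds per switch")  # well under a 16 ms frame
```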
 
Could these power-saving technologies help with multi-GPU setups as well? We complain about the high cost of power supplies and GPUs going crazy drawing high power -- anything that is new and may help is welcome from Nvidia or ATI.
Given that AMD now has complete platform control, perhaps it revolves around a more integrated solution than Nvidia's ESA. I wonder if it ties in with Marvell's DSP-based PSU microcontroller.



http://www.nvidia.com/object/nvidia_esa.html
 
Nothing new if that's the case.
Chris

That's not the case, Chris. It is not just a software 2D/3D clock-switching technology, which has been available for years in NVIDIA products since the GeForce FX series. But it is really not as new as the Inq made it look either: DPM is not a new feature of RV670, and it was already used in the previous RV6xx products. The whole RV6xx family supports hardware clock-throttling techniques, which reduce clocks and voltages at the hardware level depending on GPU usage. These are way more effective than the currently existing NVIDIA software power management technologies.
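
To make the distinction concrete (software profile switching vs. load-driven throttling), here's a minimal sketch of the latter, with a hypothetical frequency/voltage state table indexed by measured GPU busy time; the states and thresholds are invented for illustration, not the real RV6xx DPM tables.

```python
# Minimal load-driven DVFS sketch (hypothetical states, not the real RV6xx DPM tables).
# Each state is (core MHz, voltage); higher states cost more power but give more throughput.
STATES = [(300, 0.90), (500, 1.05), (650, 1.10), (775, 1.25)]

def select_state(busy_percent: float, current: int) -> int:
    """Move one state up when the GPU is nearly saturated, one down when mostly idle."""
    if busy_percent > 90 and current < len(STATES) - 1:
        return current + 1
    if busy_percent < 50 and current > 0:
        return current - 1
    return current

if __name__ == "__main__":
    state = 0
    for busy in [10, 95, 95, 95, 40, 20]:
        state = select_state(busy, state)
        mhz, volts = STATES[state]
        print(f"busy={busy}% -> {mhz} MHz @ {volts} V")
```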
 
Frequency throttling for a device where power consumption is critical (such as a laptop in battery mode) has been a must-have capability for all sorts of processors for years now. However, I don't think anyone is naive enough to fire up a demanding 3D game on a laptop when it's in battery mode, heh.

Anyway, I haven't the slightest clue what AMD's power-saving features actually include, but I'm highly sceptical that any throttling is going on while under full load. Probably not directly connected, but I have AMD's Cool'n'Quiet stuff on my PC disabled. That's what proper device and case cooling solutions are for; while those add slightly to the cost, the processor won't throttle whenever it feels like it (relatively speaking), and yes, there are massive heatsinks available for both GPUs and CPUs with 0 dB noise.

Before someone says that such solutions aren't ideal for overclocking, software-driven frequency throttling isn't exactly ideal for it either.

In summary, since power consumption is way more critical in the notebook/laptop market, we'll have to see which of the two IHVs has come up with more effective power saving and lower power consumption for RV670- or G92-based mobile GPUs.
 
I wonder if it's more along the lines of just turning off quads/parts of the chip that aren't needed and/or in use?

I'd imagine running even Vista's desktop doesn't require all the ALUs running all the time.

And maybe if, say, the shader workload in a given game wasn't being pushed, i.e. sitting idle much of the time, it could "turn off" shader blocks.
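
If it did work like that, a crude first-order model of the potential saving might look like this (assuming dynamic power scales with the fraction of blocks left on; all figures invented for illustration, not measured R600/RV670 data):

```python
# Crude first-order model: dynamic power scales with the fraction of shader
# blocks left powered on (illustrative numbers only, not measured data).
TOTAL_BLOCKS = 4          # e.g. four SIMD/quad blocks
IDLE_WATTS = 30.0         # leakage plus everything that can't be gated (invented)
DYNAMIC_WATTS = 70.0      # extra power with every block busy (invented)

def estimated_power(active_blocks: int) -> float:
    return IDLE_WATTS + DYNAMIC_WATTS * (active_blocks / TOTAL_BLOCKS)

if __name__ == "__main__":
    for n in range(TOTAL_BLOCKS + 1):
        print(f"{n}/{TOTAL_BLOCKS} blocks active: ~{estimated_power(n):.0f} W")
```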

From a power and heat standpoint that would be interesting, as I still run some games that don't need all the power of a 9700 Pro, much less a G80 or R600.

Too bad there isn't more info on what they are actually doing; hopefully we'll find out more when the card launches.

Regards,
SB
 
How come? Isn't this pretty much de facto on at least mobile CPUs?

It tells me that the part will be overclocked outside the thermal spec. Like, being pushed to the max where possible but having to roll back when it gets too hot.

I may be wrong though :)
 
I have AMD's Cool'n'Quiet stuff on my PC disabled. That's what proper device and case cooling solutions are for
I see Cool'n'Quiet as being about not wasting electricity by having your system blasting away at full tilt all the time, rather than about cooling/fan noise, so I keep such things on.
Intel had the version which throttled down to keep the CPU from overheating.
The Core 2 version (and I think the one on the later P4s) works more like Cool'n'Quiet in that it throttles down only when the load is light and comes back to full speed in only a small number of CPU cycles when needed.
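
For anyone curious what their own CPU does, on a Linux box with cpufreq you can watch the reported frequency drop at idle and snap back under load via the standard sysfs interface (path may vary by distro/kernel):

```python
# Poll the current CPU frequency via the standard Linux cpufreq sysfs interface.
# Watch it fall at idle and jump back up when you start a heavy task.
import time

PATH = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"  # value is in kHz

for _ in range(10):
    with open(PATH) as f:
        khz = int(f.read().strip())
    print(f"{khz / 1000:.0f} MHz")
    time.sleep(1)
```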
 