Query: Does the 5900U shift its 2D-3D clock speed?

But design could even be cheaper. Why? NVidia advertises NV3x as a "scaleable architecture". They use similar building blocks for the whole family. Since they have to put in clock throttling for their mobile parts, why not design all the modules with it in mind, instead of designing two versions of some blocks, one with and one without clock throttling? That would save design and verification work.
:rolleyes:

If you are saying that any of nVidia's current cards could be put in a mobile GPU with anywhere near the same level of performance as the desktop parts, then you seriously need help.
Any of these "new" GPUs, except maybe the 5200, would probably have a thermal meltdown inside one of those small notebook cases. Not to mention what the power draw would do to your battery life.

I really wish nVidia would right the ship; however, they seem to be doing everything they can to shoot themselves in the foot right now.

But to answer Walt C's question, I don't think anyone knows whether the 5900 clock throttles or not. :LOL:
 
Socos said:
If you are saying that any of nVidia's current cards could be put in a mobile GPU with anywhere near the same level of performance as the desktop parts, then you seriously need help.
I'm not saying this. I said that NVidia probably reuses parts of the design of their mobile chips in desktop chips, and that this might save some work.
 
Well, if it does use clock throttling, then that would be a definite indication that the design is not suited for continuous operation at 400MHz.
 
YeuEmMaiMai said:
Well, if it does use clock throttling, then that would be a definite indication that the design is not suited for continuous operation at 400MHz.
Try to apply the same logic to Pentium 4 overheat protection.
 
kyleb said:
but the P4 never uses that in normal operation; you have to take the heatsink off or something first.
Any proof that this is different with NV3x? Except when running 3D screensavers, that is...
 
Walt C wrote:
I guess nobody knows whether the 5900U does clock shifting and throttling like the 5800U

It does. Also, all the FX GPUs support temperature-controlled fan throttling, even the 5200 Ultra. The 5200 non-Ultra is usually passively cooled anyway, though. I don't know if all 5200 Ultra boards implement it, but Gainward does, see GamePC
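
To make "temperature-controlled fan throttling" concrete, here is a minimal sketch of how such a scheme might map core temperature to fan speed. This is purely illustrative on my part, not the actual BIOS/driver logic; the 45°C/70°C thresholds are made-up numbers.

```python
def fan_duty_cycle(gpu_temp_c: float) -> float:
    """Map GPU core temperature to a fan duty cycle (0.0 = off, 1.0 = full speed).

    Hypothetical thresholds for illustration only; a real board stores its own
    curve in the video BIOS, and board makers can tune or override it.
    """
    if gpu_temp_c < 45.0:
        return 0.0  # cool enough (e.g. idle 2D): the fan can stop entirely
    if gpu_temp_c < 70.0:
        # ramp the fan linearly between the "silent" and "full speed" points
        return (gpu_temp_c - 45.0) / (70.0 - 45.0)
    return 1.0      # hot (sustained 3D load): run the fan flat out


if __name__ == "__main__":
    for temp in (40, 55, 75):
        print(f"{temp}°C -> {fan_duty_cycle(temp):.0%} fan speed")
```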

People consider silent 2D operation to be a desirable feature, which is something you don't seem to understand. I wish my 9700 had come with such a feature. Please don't tell me that it is because the 5200 Ultra can't run its puny 45 million transistors at 325MHz without melting.

If you think this has anything at all to do with "power saving," you are deluded. I've not seen a single instance of nVidia PR where they have advertised this as a "power saving" function.

It is about power saving, as the only way to run silently is to reduce heat, and the only way to reduce heat is to reduce power consumption. Power savings is a means, though, not the end goal, which is silence. Maybe that is what is confusing you.

BTW, my 9800P makes exactly as much "noise" at 445MHz as it does at 380MHz... (Which is not much...)
See, this is subjective. As a silent computing enthusiast, I do not consider it acceptable that my video card, of all things, be making noise when web browsing or using Word. So I had to buy Zalman's $40 heatsink for my 9700. If ATI had included fan throttling tech, it might have saved me that trouble, since the noise probably wouldn't bother me when playing UT2k3.

But a *heat trigger* designed to throttle 3D processing at 400MHz back to a lower clock speed? That has nothing to do with either power or noise management, and everything to do with controlling destructive temperatures.

This is correct, but the fan throttling in 2D operation is clearly aimed at reducing noise, which is why even the 5200 Ultra includes it.

The point here is that the power savings between 2D operation at 400MHz and 2D operation at 235MHz is *scant and minimal*... Hence the downclock to 235MHz in 2D is completely unnecessary from the standpoint of "power savings" and "heat", since both are largely reduced when the chip goes into 2D mode.

No, it is about a 40% power savings, in fact. The point is to get the heat down to the point where you can turn the fan off and enjoy silence. What proof do you have that this is possible if the clock remains at 400MHz?
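
As a rough back-of-envelope check (my own assumption, not an nVidia figure): if you assume dynamic power scales roughly linearly with clock at a fixed core voltage, the 400MHz-to-235MHz drop alone works out to about 40%.

```python
# Rough estimate only: dynamic switching power is roughly proportional to
# clock frequency at a fixed core voltage (P ~ C * V^2 * f).
clock_3d_mhz = 400   # 3D clock under discussion in this thread
clock_2d_mhz = 235   # reduced 2D clock under discussion in this thread
savings = 1 - clock_2d_mhz / clock_3d_mhz
print(f"~{savings:.0%} less clock-related dynamic power")  # prints ~41%
```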

Xmas wrote:
Uhm, no, I don't want it to run silent and power-saving in 3D operation.
I want it to run totally silent (power saving as a side effect) when I want it to be silent, and to run fast when I want it to be fast. Exactly like mobile parts do (in fact this is more about CPU than GPU)
This is exactly correct, as is the rest of your post.
 
xmas said:
The goal of temperature-triggered clock-throttling is to ensure stable operation in extreme cases. It's a security feature.

It is entirely possible to head for both goals at the same time.

OK, then point me at the nVidia literature which discusses this "power saving" aspect and relates how much "power saving" actually occurs between 2D operation at 400MHz and 2D operation at 235MHz. I haven't been able to find any info from nVidia on this subject--so if you can't either, let's agree to stop calling it "power saving." I just don't see any evidence that this is what it is.

And actually, I rather think the goal of clock throttling is to clock down the chip when it overheats. The definition of "extreme" will obviously vary from system to system. And chip to chip. nVidia obviously thinks it has reason to clock down the gpu, doesn't it, since it *always* clocks it down in 2D and *always* clocks it down when it overheats (why should it overheat with adequate cooling?)

I mean if you're implying that there's something lesser in quality about a 3D chip reference design which excludes clock throttling, I'd have to tell you that's never been my experience. The great majority of 3D reference designs sold since the V1 have been completely successful with no need for clock throttling.


Regarding the cost:
The amount of additional software needed is close to zero, because they need it for their mobile parts anyway. Manufacturing is a bit more expensive because of the increased transistor count.

But design could even be cheaper. Why? NVidia advertises NV3x as a "scaleable architecture". They use similar building blocks for the whole family. Since they have to put in clock throttling for their mobile parts, why not design all the modules with it in mind, instead of designing two versions of some blocks, one with and one without clock throttling? That would save design and verification work.

This is really reaching...;) Of course it costs more. Please quit confusing this chip with something relating to the "mobile market" and your idea of "power saving." This is not a mobile chip, nor is it designed for the mobile market. First of all I want to see some numbers from nVidia on "power saving"--like how much power I'm saving when I use a 5600. A hallmark of "power-saving" technology is that companies which employ it can express the power being saved in concrete numbers. Where "power-saving" techniques are employed in the mobile market, the goal is to preserve battery power, and the various companies employing such power-saving schemes have numbers to back up the premise they push. I've seen nVidia pushing no such premise about these products. Could it be that's because these products aren't meant for the "mobile market" and as such aren't designed with power-saving in mind?


Of course they knew they had to clock their high-end part as high as possible, and they knew they were going to use an extreme cooling solution. So clock throttling provides the benefit that they can safely clock it higher and still have enough "security margin" to guarantee operation in hot environments. And that the loud fan only has to run when absolutely necessary.

Right--they knew that overclocking it was pushing the heat envelope, and so they designed in a clock throttle to knock down the MHz to cool the chip, as they *expected* it to overheat from time to time.

Yes, of course. Isn't it a fine thing that it prevents your GPU from turning into smoke?
IIRC the default limit is 140°C. Kinda high, isn't it? Do you really think it will reach that limit in any but extreme circumstances like covering the exhaust hole or fan failure?

Right--as opposed to "power saving" and other assorted nonsense...;) Good description of a heat-triggered clock throttle mechanism.

If it clocked down in "normal" operating circumstances, I'd consider the card overclocked, which is a bad thing.
(Well, there might be reasons for such behavior, not specific to GPUs, but then it would have to be advertised as such.)

Yes, the 5800U, before being abandoned for mass production, was reported on more than one site to arbitrarily clock itself down in the middle of a 3D game (in some cases, screen savers.) This would indicate to me that the clock throttle was doing its job. However, that's far from saying that is a desirable outcome...;) (When contrasted with chips that run all day long at their advertised MHz speeds without a need for clock throttling "protection," chips that do so for years without complaint.)

Like I wrote above, it's possible to target both issues with one design. And you can't really say which one is the intended effect and which one is only a side effect. I'm pretty sure NVidia had both in mind.

However, there is ample proof that the 5600 design employs a heat-triggered clock throttle, while there is no proof that any kind of mobile-market power saving is going on at all. How much power am I saving? What's the power savings nVidia advertises?

Do you have any proof that it can't run indefinitely at 400MHz?

No. Do you have any proof that it can?...;)

I know these different goals. I surely never said it was the same thing. But it can be accomplished with almost the same means.

I disagree. Power-saving technology as found in the mobile market clocks down *only* to save power. In every mobile version I've seen there are multiple steps of power saving employed--multiple levels of power saving. Heat is not a consideration--it doesn't enter the picture.

What we see in the 5600 is decidedly not that. We see a chip clocked to 400MHz to run 3D, with cooling adequate enough to presumably allow indefinite operation of the chip running 3D at 400MHz. Switching from 3D to 2D operation at 400MHz automatically cools the chip and consumes less power than when the chip is running 3D at 400MHz. Further, although I could be mistaken about this, I have not read that the fan in the 5600 shuts down in 2D operation. What I've read is that the fan runs all the time but its noise level is not obtrusive (like it was with the 5800U.)

Go back to the 5800U. Why did nVidia end up shutting the fan off when it clocked down to 2D operation? It had *nothing to do with saving power*, it was noise reduction, plain and simple. The noise of the fan was *so bad* in 3D operation that nVidia clocked down the gpu and turned off the fan to give people's ears a breather...;) No power-saving there...

So, if the fan noise in the 5600 is not objectionable when running 3D at 400MHz, how could it possibly be objectionable when running 2D at 400MHz?

The simple truth is that if we eliminate power saving from the equation (which I think is entirely justified), then there's no reason other than heat for nVidia to clock down the 5600 to 235MHz in *2D* operation. I would stipulate that noise pollution was a worthwhile reason if nVidia turned off the fan at 235MHz--but certainly not power saving. But noise pollution seems questionable in itself if it is true that at 400MHz 3D the fan noise is no greater than the fan on a GF4 or a Radeon.

If there was any kind of "mobile-market" power saving going on I'd expect to see many user-selectable, or automatic, MHz levels--that's what you see in that kind of power-saving scenario. And, as I said before, I'd expect to see some numbers from nVidia to back up any claim of power saving.

Uhm, no, I don't want it to run silent and power-saving in 3D operation. I want it to run totally silent (power saving as a side effect) when I want it to be silent, and to run fast when I want it to be fast. Exactly like mobile parts do (in fact this is more about CPU than GPU)

Like I said, then you'll have some waiting to do for a product like that...;)

I think we've been talking about three distinct issues here that have become confused:

1) Power saving as we see it in the mobile market
2) Clock throttling for thermal reasons in nVidia's NV3x line of GPUs
3) Noise pollution as in the 5800U

*chuckle* From what I've read number 1 isn't applicable to the 5600, #2 definitely is, and #3 I simply can't answer...;) It's fine if you want all three--but it doesn't appear to me they are evident in the 5600.

Totally agree. Intel does that too, and I think it is a good idea to add that kind of security.

That's fine if you like it or see it as a desirable feature. But that doesn't change the fact that there's nothing wrong with a chip that doesn't need thermal clock throttling for protection while in a normal operating environment.

WaltC said:
Again, the two are unrelated. Very simply, if the 5600 could run indefinitely at 400MHz + with nominal heat and voltage signatures--the card would have been designed to run that way from the start.
Here I don't agree.

Well, understanding that nVidia first incorporated this type of thermal clock-throttling with its nv3x chips--and understanding what kind of heat the nv30 puts out--and understanding nVidia didn't see a need for this with the GF4--and understanding that none of ATi's current .15 micron chips appears to need it--I think it's a fair bet that nVidia's slapped it into its reference designs because it thinks thermal clock-throttling is needed there specifically to alleviate overheating problems.

As such, I don't have a quarrel with the thermal clock throttling because it seems to be needed on an active basis in these products. But I do have a quarrel with "power saving" definitions--as I can't see any justification for them. Possible noise pollution, thermal profile considerations make sense--"power-saving" of the type found in the mobile market simply does not appear to be present.
 
ghandi said:
Walt C wrote:

It does. Also, all the FX GPUs support temperature-controlled fan throttling, even the 5200 Ultra. The 5200 non-Ultra is usually passively cooled anyway, though. I don't know if all 5200 Ultra boards implement it, but Gainward does, see...

At last! An answer to my original question...Thank you very much! Much appreciated.

Obviously, the fan can throttle down *because* the chip is downclocked in 2D to 235MHz.

People consider silent 2D operation to be a desirable feature, which is something you don't seem to understand. I wish my 9700 had come with such a feature. Please don't tell me that it is because the 5200 Ultra can't run its puny 45 million transistors at 325MHz without melting.

So the fan doesn't actually throttle down, then? Are you saying it shuts off in 2D at 235MHz? If so, then it is an issue of noise pollution versus power saving--thanks for the info (I wish you had posted much earlier...;))

It is about power saving, as the only way to run silently is to reduce heat, and the only way to reduce heat is to reduce power consumption. Power savings is a means, though, not the end goal, which is silence. Maybe that is what is confusing you.

No, I think I hit that right on the button. Power saving as it is done in the mobile market is a direct goal--as you say, here it is not. Not surprising, as the power differential between 2D at 400MHz and 2D at 235MHz would be slim. The real power-glutton mode of the chip would be its 3D 400MHz operation. I have no trouble with the idea that the chip is downclocked to 235MHz in 2D in order to shut the fan down.

However, I have read that the fan noise of the 5600 is not comparable to the rabid whine of the 5800U when the chip is running at 400MHz, so I wonder whether this is needed. Be that as it may, however, throttling the chip down to 235MHz in 2D in order to turn off the fan is certainly a valid reason for throttling it back (even if it is not a bona-fide power-saving reason.)

See, this is subjective. As a silent computing enthusiast, I do not consider it acceptable that my video card, of all things, be making noise when web browsing or using Word. So I had to buy Zalman's $40 heatsink for my 9700. If ATI had included fan throttling tech, it might have saved me that trouble, since the noise probably wouldn't bother me when playing UT2k3.

Heh--I can't even hear my 9800P fan over my case fans...;) But if you can then you must stagger when you run a 3D game and the fan kicks up. For me, there's no difference. You're right, though--it's very much subjective.

This is correct, but the fan throttling in 2D operation is clearly aimed at reducing noise, which is why even the 5200 Ultra includes it.

Really, I wasn't aware the FX fans were that loud after the 5800U....Hmmmm....

No, it is about a 40% power savings, in fact. The point is to get the heat down to the point where you can turn the fan off and enjoy silence. What proof do you have that this is possible if the clock remains at 400MHz?

40% of what? Running the chip in 3D mode with the fan on? Or 40% of the power the chip would be consuming in 2D mode at 400MHz? Some numbers from nVidia would be interesting to look at...if you have a link...

Heh--again, if my 3D-card fan was silent I'd never know it...;) My case/cpu fans sure aren't. But I could understand the attraction if you're running a water pump and no case fans, certainly.

Edit: One flaw in your reasoning, ghandi, that I can see is that the only reason nVidia clocked down the 5800U in the beginning was so they could run that gosh-awful fan at a slower rpm while people were working in 2D. (Of course the fan noise wasn't the only reason the nVidia CEO declared nv30 "a failure", but that's another topic.) Later, when they became aware of how much people really disliked the awful racket, they decided to turn off the fan in 2D completely.

This was not done simply to offer "silent" operation as you allude; it was only done to ameliorate to some degree the truly awful racket the fan made when running 3D games--it was a counter of sorts. Had the 5800U sported a fan no louder than a GF4 or Radeon in the beginning, the "silent treatment" would never have materialized. I just think it's a bit misleading to imply that nVidia was after "silent" computing all along when the fact is the 5800U fan is one of the loudest, most obnoxious fans ever put on a 3D card. Slowing the fan down, and then turning it off, came out of a desire to muzzle that fan noise to some degree--not as a bona fide silent-computing initiative. I thought, from the accounts I've read, that the newer FX fans were much quieter when running.
 
So the fan doesn't actually throttle down, then? Are you saying it shuts off in 2D at 235MHz?

Whether they slow down or shut off is left to the board maker. Gainward shuts theirs off; others might just slow them down, I'm not sure.

Really, I wasn't aware the FX fans were that loud after the 5800U....Hmmmm....

They are much quieter than the 5800, but silence is always better.

the only reason nVidia clocked down the 5800U in the beginning...was not done simply to offer "silent" operation as you allude, it was only done to ameliorate to some degree the truly awful racket the fan made when running 3D games--it was a counter of sorts. Had the 5800U sported a fan no louder than a GF4 or Radeon in the beginning, the "silent treatment" would never have materialized.

I agree they must have come up with the technology to quiet the godawful 5800 Ultra. But including it in other cards just makes sense, since people, including OEMs, consider less noise to be desirable.

As far as the 5200 Ultra is concerned, I don't think there are any stability-related issues. I doubt it draws more than 35 watts. ATI's 9000 Pro draws about 30 watts, with about 20% fewer transistors (36M vs. the 5200's 45M).

Heh--I can't even hear my 9800P fan over my case fans... But if you can then you must stagger when you run a 3D game and the fan kicks up. For me, there's no difference. You're right, though--it's very much subjective.

The loudest noise from my computer is the buzz from my CRT. :D
 
WaltC said:
OK, then point me at the nVidia literature which discusses this "power saving" aspect and relates how much "power saving" actually occurs between 2D operation at 400MHz and 2D operation at 235MHz. I haven't been able to find any info from nVidia on this subject--so if you can't either, let's agree to stop calling it "power saving." I just don't see any evidence that this is what it is.

This is really reaching...;) Of course it costs more. Please quit confusing this chip with something relating to the "mobile market" and your idea of "power saving." This is not a mobile chip, nor is it designed for the mobile market. First of all I want to see some numbers from nVidia on "power saving"--like how much power I'm saving when I use a 5600. A hallmark of "power-saving" technology is that companies which employ it can express the power being saved in concrete numbers. Where "power-saving" techniques are employed in the mobile market, the goal is to preserve battery power, and the various companies employing such power-saving schemes have numbers to back up the premise they push. I've seen nVidia pushing no such premise about these products. Could it be that's because these products aren't meant for the "mobile market" and as such aren't designed with power-saving in mind?

Right--as opposed to "power saving" and other assorted nonsense...;)

The simple truth is that if we eliminate power saving from the equation (which I think is entirely justified), then there's no reason other than heat for nVidia to clock down the 5600 to 235MHz in *2D* operation. I would stipulate that noise pollution was a worthwhile reason if nVidia turned off the fan at 235MHz--but certainly not power saving. But noise pollution seems questionable in itself if it is true that at 400MHz 3D the fan noise is no greater than the fan on a GF4 or a Radeon.

If there was any kind of "mobile-market" power saving going on I'd expect to see many user-selectable, or automatic, MHz levels--that's what you see in that kind of power-saving scenario. And, as I said before, I'd expect to see some numbers from nVidia to back up any claim of power saving.

(Sorry for reordering parts of your post)

Walt, please read my post again. You may notice that I mentioned "power saving" less often than you did here - far less.

I didn't say clock throttling is used in NV31 to save power, like it is in NV31Go. I said it might have made it into the chip because it was already there, design finished and tested, and because it might add one or two possibly marketable features, namely silence and security. I really think that justifies the cost of adding it (which is IMO not very high - it might even have saved some cost).
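
To illustrate how one dynamic-clocking mechanism can cover both silence and security at once, here is a purely hypothetical sketch (my own illustration, not nVidia's driver logic): the 400MHz/235MHz clocks and the 140°C limit are just the figures mentioned in this thread, and the 150MHz "safe" clock is invented.

```python
from dataclasses import dataclass

CLOCK_3D_MHZ = 400    # full-speed 3D clock discussed in this thread
CLOCK_2D_MHZ = 235    # reduced 2D clock discussed in this thread
CLOCK_SAFE_MHZ = 150  # hypothetical emergency clock if the chip overheats
TEMP_LIMIT_C = 140    # overheat threshold mentioned earlier in the thread

@dataclass
class GpuState:
    temp_c: float
    running_3d: bool

def choose_clock(state: GpuState) -> int:
    """Pick a core clock: security first, then silence, then speed."""
    if state.temp_c >= TEMP_LIMIT_C:
        # Security: fan failure, blocked exhaust, reckless overclock, etc.
        return CLOCK_SAFE_MHZ
    if not state.running_3d:
        # Silence: 2D desktop work doesn't need the full clock, so drop it
        # and let the (temperature-controlled) fan slow down or stop.
        return CLOCK_2D_MHZ
    # Speed: normal 3D operation runs at the advertised clock.
    return CLOCK_3D_MHZ

if __name__ == "__main__":
    print(choose_clock(GpuState(temp_c=60, running_3d=False)))   # 235
    print(choose_clock(GpuState(temp_c=80, running_3d=True)))    # 400
    print(choose_clock(GpuState(temp_c=145, running_3d=True)))   # 150
```

The priority order is the whole point: the overheat branch should never trigger in a normal operating environment, while the 2D branch triggers all the time.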


And actually, I rather think the goal of clock throttling is to clock down the chip when it overheats. The definition of "extreme" will obviously vary from system to system. And chip to chip. nVidia obviously thinks it has reason to clock down the gpu, doesn't it, since it *always* clocks it down in 2D and *always* clocks it down when it overheats (why should it overheat with adequate cooling?)
Why should it *not* clock down in 2D? My GF3 Ti200 at 175MHz isn't exactly slow in 2D, so why would I need the chip to run at 400+MHz when displaying web pages?
And why should it *not* clock down when it overheats?? Heck, what else? Make fire alarm sounds?

Fan failure does happen. Inexperienced overclockers are going to bring the chip beyond its limits. Isn't it good to know that your graphics card won't get damaged if the fan fails?


I mean if you're implying that there's something lesser in quality about a 3D chip reference design which excludes clock throttling, I'd have to tell you that's never been my experience. The great majority of 3D reference designs sold since the V1 have been completely successful with no need for clock throttling.
There is no particular need for clock throttling in desktop chips, but it's nonetheless a good feature to have.
I think overheat protection and workload-dependent downclocking will be standard features for GPUs in less than three years.

Right--they knew that overclocking it was pushing the heat envelope, and so they designed in a clock throttle to knock down the MHz to cool the chip, as they *expected* it to overheat from time to time.
That is purely speculation.

However, there is ample proof that the 5600 design employs a heat-triggered clock throttle, while there is no proof that any kind of mobile-market power saving is going on at all. How much power am I saving? What's the power savings nVidia advertises?
NVidia doesn't advertise power savings, because no one is interested in it for a desktop product. But you can use the same mechanism (dynamic clocking) for different goals.



Uhm, no, I don't want it to run silent and power-saving in 3D operation. I want it to run totally silent (power saving as a side effect) when I want it to be silent, and to run fast when I want it to be fast. Exactly like mobile parts do (in fact this is more about CPU than GPU)
Like I said, then you'll have some waiting to do for a product like that...;)
NVidia's current offerings do almost what I want, but there are of course other, more important factors like performance and image quality...

Totally agree. Intel does that too, and I think it is a good idea to add that kind of security.

That's fine if you like it or see it as a desirable feature. But that doesn't change the fact that there's nothing wrong with a chip that doesn't need thermal clock throttling for protection while in a normal operating environment.
Of course there is nothing wrong with such a chip. It certainly shouldn't need clock throttling in a normal operating environment. But having such protection is a good thing.

WaltC said:
Well, understanding that nVidia first incorporated this type of thermal clock-throttling with its nv3x chips--and understanding what kind of heat the nv30 puts out--and understanding nVidia didn't see a need for this with the GF4--and understanding that none of ATi's current .15 micron chips appears to need it--I think it's a fair bet that nVidia's slapped it into its reference designs because it thinks thermal clock-throttling is needed there specifically to alleviate overheating problems.

As such, I don't have a quarrel with the thermal clock throttling because it seems to be needed on an active basis in these products.
I honestly don't think NV3x chips running at their nominal clock speed need thermal clock throttling in a normal operating environment.
 
Xmas said:
kyleb said:
but the P4 never uses that in normal operation; you have to take the heatsink off or something first.
Any proof that this is different with NV3x? Except when running 3D screensavers, that is...

well i suppose that depends on what you want to call proof. but if you are willing to take nvidia's word on it then yes it is different with the nv3x. :LOL:
 
kyleb said:
well i suppose that depends on what you want to call proof. but if you are willing to take nvidia's word on it then yes it is different with the nv3x. :LOL:
So where does NVidia state that thermal clock throttling is used during normal operation?
 
My opinion

AFAIK from current information, the only NV3x board besides the NV30 that might plausibly use clock throttling due to a real negative is the NV35, that negative being that it might be necessary to reach MTBF goals.

Notice the "might"s...the existence of clock throttling doesn't prove it, but that with other characteristics indicate that it might be a factor. I think your fallacy, WaltC, is simply taking that for proof. But clock throttling based on thermal characteristics and/or 2D/3D separate clock speed operation has completely valid positive aspects.

IMO, it is fan throttling that is based on countering a negative, such that its existence indicates something undesirable about a design and introduces an opportunity for "danger" in its triggering mechanism. I think laptops are worth mentioning again here, because I think there are examples of fan throttling there that work without undue "danger" manifesting, and they illustrate that it might be only the negative (noise) that matters in this regard (if they fixed the 3D window app/screensaver/driver triggering problems, or divorced that from fan throttling completely).

Now, as for the NV35 in particular, to my knowledge: we don't have the clear indications, as we did for the NV30, of significant problems, though we do have some initial indication of (significantly lesser, but possibly still significant) fan noise; we don't have clear indication of the fan throttling situation being a negative in the final product, or even reason (like a massive assembly) to preclude some significant leeway in quieter cooling; we have indication that many products will be 2 slot, which ranges from insignificant to important for different people; and also we have indication that DDR 1 RAM usage and the core itself leave some overclock headroom for both.

To me, this indicates that: the NV35 clock throttling mechanism is a pure plus for overclockers (MTBF issues don't really matter when you are already ignoring them); that the fan situation might be a negative, or might not be at all, and I'm not even sure that was a focus of discussion so far; there might be some negatives with triggering for throttling features, but the ones we know of shouldn't be insurmountable; that if the 2-slot issue matters to you, we have positive indication that the NV35 has a negative in this regard; and that OEMs might not like having to jump through additional hoops to use the products due to possible MTBF target challenges.

All the italic words have an as yet indeterminate chance of being true (i.e., there is no reason to assume they are significantly more supported than their negation, if at all), and cannot, to my knowledge, be used to validly support a stance against the NV31+ products' cooling/clocking features. The OEM ones are in italics because for the NV35, the overclocking plus and high-end nature make the negatives under discussion quite possibly irrelevant.

I don't get how this paints a clearly negative picture of the NV35 from the standpoint of its thermal characteristics at all, WaltC, and since it seems the potential negatives I associate with the fan are likely to be surmountable, or maybe not even evident, I don't think it is valid to presume them now...the weight of evidence seems to me to be on the NV35's side, in the positive, so far.

I also am not aware of how there is anything clearly negative at all about NV31/NV34 having similar mechanisms for cooling/clock speed, if they do, since I don't see any need for the negatives of fan throttling to manifest (i.e., no chance of a too loud fan AFAIK). It is possible that there are issues with these products in this regard, but I'm simply not aware of them and don't have reason to think any are necessitated.
 
I'd personally love it if my whole system would shut its fans off when I was only websurfing.

If DELL or Compaq/HP added something like that to their boxes and got it right, I might actually consider paying them something and not building my own box--since they would offer some value-add for me that I presume I could not necessarily get from simply putting the components together myself.
 
Xmas said:
kyleb said:
well i suppose that depends on what you want to call proof. but if you are willing to take nvidia's word on it then yes it is different with the nv3x. :LOL:
So where does NVidia state that thermal clock throttling is used during normal operation?

all over the place, at least last i checked. that is what this thread is about, ya know? :?
 
kyleb said:
Xmas said:
kyleb said:
well i suppose that depends on what you want to call proof. but if you are willing to take nvidia's word on it then yes it is different with the nv3x. :LOL:
So where does NVidia state that thermal clock throttling is used during normal operation?

all over the place, at least last i checked. that is what this thread is about, ya know? :?
Huh? Did you follow this thread?
I don't think this thread showed evidence that the new GFFX 5600U revision (or any other NV3x card) uses thermal clock throttling while running fully featured 3D action at 400MHz when the fan is running properly and the ambient temperature isn't too extreme. No proof that it overheats in normal operation and has to clock down because of it.
 
Re: My opinion

demalion said:
....I don't get how this paints a clearly negative picture of the NV35 from the standpoint of its thermal characteristics at all, WaltC, and since it seems the potential negatives I associate with the fan are likely to be surmountable, or maybe not even evident, I don't think it is valid to presume them now...the weight of evidence seems to me to be on the NV35's side, in the positive, so far.

I also am not aware of how there is anything clearly negative at all about NV31/NV34 having similar mechanisms for cooling/clock speed, if they do, since I don't see any need for the negatives of fan throttling to manifest (i.e., no chance of a too loud fan AFAIK). It is possible that there are issues with these products in this regard, but I'm simply not aware of them and don't have reason to think any are necessitated.

As long as we can dispense with the notion that the 2D-3D clock shifting is being done for bona fide "power-saving" reasons, I'm satisfied. Looking at it as a legacy feature relative to nv30 for the purposes of noise pollution control is probably the best way to look at it. I had not read that current 5600 reference design fan noise in 3D operation was objectionable, and was surprised to learn from ghandi that the clock is throttled back to 235MHz (edit: actually ghandi answered my questions about the 5900U reference design) so that the fan can be turned off (which seems to be the sole reason for the 2D-3D clock shifting.)

However, my own personal view of the thermally triggered clock throttle is simply that I mistrust its presence. Is it present simply as an added "security" feature that will likely never be invoked during normal operation, or is it present because nVidia expects that the chip will overheat from time to time during normal 400MHz 3D processing (or 450MHz operation for nv35 reference designs)? We know for a fact that at times the clock throttle is invoked during normal 3D operation with some tested nv30 reference designs, hence the reason for its presence within nv30 reference designs is obvious. My own *opinion* is that yield issues relative to nv3x have still not been completely solved, and so nVidia deems this nv30 carryover technology still required. (Especially since 3D processing speeds have been bumped up from 350 to 400MHz in the 5600, but bumped down from 500MHz to 450MHz in the 5900U.) This is related as a supposition on my part, and I recognize that one might reach different conclusions from the same body of evidence...;) What makes me skeptical of it, and causes me to dislike it, is its 5800U nv30 reference-design heritage.
 
Xmas said:
kyleb said:
Xmas said:
kyleb said:
well i suppose that depends on what you want to call proof. but if you are willing to take nvidia's word on it then yes it is different with the nv3x. :LOL:
So where does NVidia state that thermal clock throttling is used during normal operation?

all over the place, at least last i checked. that is what this thread is about, ya know? :?
Huh? Did you follow this thread?
I don't think this thread showed evidence that the new GFFX 5600U revision (or any other NV3x card) uses thermal clock throttling while running fully featured 3D action at 400MHz when the fan is running properly and the ambient temperature isn't too extreme. No proof that it overheats in normal operation and has to clock down because of it.

well sure, if we want to go off into fantasy world and exclude 2D mode from the definition of "normal operation", then you are correct. do you want a cookie, Xmas?
 