Query: Does the 5900U shift its 2D-3D clock speed?

Discussion in 'Architecture and Products' started by WaltC, May 28, 2003.

  1. Socos

    Newcomer

    Joined:
    Feb 23, 2003
    Messages:
    48
    Likes Received:
    0
     
  2. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    I'm not saying this. I said that NVidia probably reuses parts of the design of their mobile chips in desktop chips, and that this might save some work.
     
  3. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
well if it does use clock throttling then that would be a definite indication that the design is not suited for continuous operation at 400MHz.
     
  4. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    Try to apply the same logic to Pentium 4 overheat protection.
     
  5. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
    but the p4 never uses that in normal operation, you have to take the heatsink off or something first.
     
  6. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    Any proof that this is different with NV3x? Except when running 3D screensavers, that is...
     
  7. ghandi

    Newcomer

    Joined:
    Dec 1, 2002
    Messages:
    3
    Likes Received:
    0
    Walt C wrote
It does. Also, all the FX GPUs support temperature-controlled fan throttling, even the 5200 Ultra. The 5200 non-ultra is usually passively cooled anyway, though. I don't know if all 5200 Ultra boards implement it, but Gainward does; see GamePC

People consider silent 2D operation to be a desirable feature, which is something you don't seem to understand. I wish my 9700 had come with such a feature. Please don't tell me that it is because the 5200 Ultra can't run its puny 45 million transistors at 325MHz without melting.

It is about power saving, as the only way to run silently is to reduce heat, and the only way to reduce heat is to reduce power consumption. Power saving is a means, though, not the end goal, which is silence. Maybe that is what is confusing you.

See, this is subjective. As a silent computing enthusiast, I do not consider it acceptable that my video card, of all things, be making noise when web browsing or using Word. So I had to buy Zalman's $40 heatsink for my 9700. If ATI had included fan throttling tech, it might have saved me that trouble, since the noise probably wouldn't bother me when playing UT2k3.

    This is correct, but the fan throttling in 2D operation is clearly aimed at reducing noise, which is why even the 5200 Ultra includes it.

No, it is about a 40% power savings, in fact. The point is to get the heat down to the point where you can turn the fan off and enjoy silence. What proof do you have that this is possible if the clock remains at 400MHz?
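For what it's worth, that 40% figure is roughly what frequency scaling alone predicts: CMOS dynamic power goes as P ≈ C·V²·f, so dropping the core from 400MHz to 235MHz at constant voltage cuts switching power by about 41%. A back-of-envelope sketch (the clocks are from this thread; the constant-voltage assumption and the helper function are mine):

```python
# Back-of-envelope dynamic power estimate: P ~ C * V^2 * f (CMOS switching power).
# The 400MHz/235MHz clocks are from this thread; holding voltage constant is an
# assumption made purely for illustration.

def relative_power(freq_mhz: float, base_freq_mhz: float, vscale: float = 1.0) -> float:
    """Power at freq_mhz relative to base_freq_mhz, with P proportional to V^2 * f."""
    return (freq_mhz / base_freq_mhz) * vscale ** 2

savings = 1.0 - relative_power(235, 400)
print(f"{savings:.0%}")  # ~41%, in the ballpark of the 40% figure above
```

If the 2D mode also drops core voltage (common in mobile parts), the V² term helps further: e.g. `relative_power(235, 400, vscale=0.85)` comes out around 0.42 of full power, roughly a 58% saving.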

    Xmas wrote:
    This is exactly correct, as is the rest of your post.
     
  8. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    OK, then point me at the nVidia literature which discusses this "power saving" aspect and relates how much "power saving" actually occurs between 2D operation at 400MHz and 2D operation at 235MHz. I haven't been able to find any info from nVidia on this subject--so if you can't either, let's agree to stop calling it "power saving." I just don't see any evidence that this is what it is.

    And actually, I rather think the goal of clock throttling is to clock down the chip when it overheats. The definition of "extreme" will obviously vary from system to system. And chip to chip. nVidia obviously thinks it has reason to clock down the gpu, doesn't it, since it *always* clocks it down in 2D and *always* clocks it down when it overheats (why should it overheat with adequate cooling?)

    I mean if you're implying that there's something lesser in quality about a 3D chip reference design which excludes clock throttling, I'd have to tell you that's never been my experience. The great majority of 3D reference designs sold since the V1 have been completely successful with no need for clock throttling.


    This is really reaching...;) Of course it costs more. Please quit confusing this chip with something relating to the "mobile market" and your idea of "power saving." This is not a mobile chip, nor is it designed for the mobile market. First of all I want to see some numbers from nVidia on "power saving"--like how much power I'm saving when I use a 5600. A hallmark of "power-saving" technology is that companies which employ it can express the power being saved in concrete numbers. Where "power-saving" techniques are employed in the mobile market, the goal is to preserve battery power, and the various companies employing such power-saving schemes have numbers to back up the premise they push. I've seen nVidia pushing no such premise about these products. Could it be that's because these products aren't meant for the "mobile market" and as such aren't designed with power-saving in mind?


Right--they knew that overclocking it was pushing the heat envelope and so they designed in a clock-throttle to knock down the MHz to cool the chip as they *expected* it to overheat from time to time.

    Right--as opposed to "power saving" and other assorted nonsense...;) Good description of a heat-triggered clock throttle mechanism.

    Yes, the 5800U, before being abandoned for mass production, was reported on more than one site to arbitrarily clock itself down in the middle of a 3D game (in some cases, screen savers.) This would indicate to me that the clock throttle was doing its job. However, that's far from saying that is a desirable outcome...;) (When contrasted with chips that run all day long at their advertised MHz speeds without a need for clock throttling "protection," chips that do so for years without complaint.)

    However, there is ample proof that the 5600 design employs a heat-triggered clock throttle, while there is no proof that any kind of mobile-market power saving is going on at all. How much power am I saving? What's the power savings nVidia advertises?

    No. Do you have any proof that it can?...;)

    I disagree. Power-saving technology as found in the mobile market clocks down *only* to save power. In every mobile version I've seen there are multiple steps of power-saving employed--multiple levels of power saving. Heat is not a consideration--doesn't enter the picture.

What we see in the 5600 is decidedly not that. We see a chip clocked to 400MHz to run 3D, with cooling presumably adequate to allow indefinite operation of the chip running 3D at 400MHz. Switching from 3D to 2D operation at 400MHz automatically cools the chip and consumes less power than when the chip is running 3D at 400MHz. Further, although I could be mistaken about this, I have not read that the fan in the 5600 shuts down in 2D operation. What I've read is that the fan runs all the time but its noise level is not obtrusive (like it was with the 5800U.)
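The 2D/3D split being debated here boils down to a small mode table. A purely illustrative sketch (the clock values are the thread's 5600 figures; the mode and policy names are invented, and whether a given board slows or stops its fan in 2D is deliberately left as a parameter):

```python
# Purely illustrative sketch of the 2D/3D clock split discussed here.
# Clock values are the thread's 5600 figures; the mode/policy names are
# invented, and fan behavior in 2D is left to a board-level policy.

from enum import Enum

class Mode(Enum):
    DESKTOP_2D = "2D"
    GAME_3D = "3D"

class FanPolicy(Enum):
    ALWAYS_ON = "always on"    # fan runs in both modes
    SLOW_IN_2D = "slow in 2D"  # only the rpm drops in 2D
    OFF_IN_2D = "off in 2D"    # fan stops entirely in 2D

CLOCKS_MHZ = {Mode.DESKTOP_2D: 235, Mode.GAME_3D: 400}

def fan_running(mode: Mode, policy: FanPolicy) -> bool:
    """The fan always runs in 3D; in 2D it depends on the board's policy."""
    return mode is Mode.GAME_3D or policy is not FanPolicy.OFF_IN_2D
```

Under `OFF_IN_2D`, `fan_running(Mode.DESKTOP_2D, FanPolicy.OFF_IN_2D)` is False, which is the silent-2D scenario the thread keeps circling around.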

    Go back to the 5800U. Why did nVidia end up shutting the fan off when it clocked down to 2D operation? It had *nothing to do with saving power*, it was noise reduction, plain and simple. The noise of the fan was *so bad* in 3D operation that nVidia clocked down the gpu and turned off the fan to give people's ears a breather...;) No power-saving there...

    So, if the fan noise in the 5600 is not objectionable when running 3D at 400MHz, how could it possibly be objectionable when running 2D at 400MHz?

    The simple truth is that if we eliminate power saving from the equation (which I think is entirely justified), then there's no reason other than heat for nVidia to clock down the 5600 to 235MHz in *2D* operation. I would stipulate that noise pollution was a worthwhile reason if nVidia turned off the fan at 235MHz--but certainly not power saving. But noise pollution seems questionable in itself if it is true that at 400MHz 3D the fan noise is no greater than the fan on a GF4 or a Radeon.

If there was any kind of "mobile-market" power saving going on I'd expect to see many user-selectable, or automatic, levels of MHz function--that's what you see in that kind of power-saving scenario. And, as I said before, I'd expect to see some numbers from nVidia to back up any claim of power saving.

Like I said, then you'll have some waiting to do on a product like that...;)

    I think we've been talking about three distinct issues here that have become confused:

    1) Power saving as we see it in the mobile market
2) Clock throttling for thermal reasons in nVidia's nv3x line of gpus
    3) noise pollution as in the 5800U

    *chuckle* From what I've read number 1 isn't applicable to the 5600, #2 definitely is, and #3 I simply can't answer...;) It's fine if you want all three--but it doesn't appear to me they are evident in the 5600.

That's fine if you like it or see it as a desirable feature. But that doesn't change the fact that there's nothing wrong with a chip that doesn't need thermal clock throttling for protection while in a normal operating environment.

    Well, understanding that nVidia first incorporated this type of thermal clock-throttling with its nv3x chips--and understanding what kind of heat the nv30 puts out--and understanding nVidia didn't see a need for this with the GF4--and understanding that none of ATi's current .15 micron chips appears to need it--I think it's a fair bet that nVidia's slapped it into its reference designs because it thinks thermal clock-throttling is needed there specifically to alleviate overheating problems.

    As such, I don't have a quarrel with the thermal clock throttling because it seems to be needed on an active basis in these products. But I do have a quarrel with "power saving" definitions--as I can't see any justification for them. Possible noise pollution, thermal profile considerations make sense--"power-saving" of the type found in the mobile market simply does not appear to be present.
     
  9. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    At last! An answer to my original question...Thank you very much! Much appreciated.

    Obviously, the fan can throttle down *because* the chip is down clocked in 2D to 235MHz.

    So the fan doesn't actually throttle down, then? Are you saying it shuts off in 2D at 235MHz? If so, then it is an issue of noise pollution versus power saving--thanks for the info (I wish you had posted much earlier...;))

No, I think I hit that right on the button. Power saving as it is done in the mobile market is a direct goal--as you say, here it is not. Not surprising, as the power differential between 2D at 400MHz and 2D at 235MHz would be slim. The real power-glutton mode of the chip would be its 3D 400MHz operation. I have no trouble with the idea that the chip is downclocked to 235MHz in 2D in order to shut the fan down.

    However, I have read that the fan noise of the 5600 is not comparable to the rabid whine of the 5800U when the chip is running at 400MHz, so I wonder whether this is needed. Be that as it may, however, throttling the chip down to 235MHz in 2D in order to turn off the fan is certainly a valid reason for throttling it back (even if it is not a bona-fide power-saving reason.)

    Heh--I can't even hear my 9800P fan over my case fans...;) But if you can then you must stagger when you run a 3D game and the fan kicks up. For me, there's no difference. You're right, though--it's very much subjective.

    Really, I wasn't aware the FX fans were that loud after the 5800U....Hmmmm....

    40% of what? Running the chip in 3D mode with the fan on? Or 40% of the power the chip would be consuming in 2D mode at 400MHz? Some numbers from nVidia would be interesting to look at...if you have a link...

    Heh--again, if my 3D-card fan was silent I'd never know it...;) My case/cpu fans sure aren't. But I could understand the attraction if you're running a water pump and no case fans, certainly.

    Edit: One flaw in your reasoning, ghandi, that I can see is that the only reason nVidia clocked down the 5800U in the beginning was so they could run that gosh-awful fan at a slower rpm while people were working in 2D. (Of course the fan noise wasn't the only reason the nVidia CEO declared nv30 "a failure", but that's another topic.) Later, when they became aware of how much people really disliked the awful racket, they decided to turn off the fan in 2D completely.

This was not done simply to offer "silent" operation as you allude, it was only done to ameliorate to some degree the truly awful racket the fan made when running 3D games--it was a counter of sorts. Had the 5800U sported a fan no louder than a GF4 or Radeon in the beginning, the "silent treatment" would never have materialized. I just think it's a bit misleading to imply that nVidia was after "silent" computing all along when the fact is the 5800U fan is one of the loudest, most obnoxious fans ever put on a 3D card. Slowing the fan down, and then turning it off, came out of a desire to muzzle that fan noise to some degree--not as a bona fide silent-computing initiative. I thought, from the accounts I've read, that the newer FX fans were much quieter when running.
     
  10. ghandi

    Newcomer

    Joined:
    Dec 1, 2002
    Messages:
    3
    Likes Received:
    0
    Whether they slow down or shut off is left to the board maker. Gainward shuts theirs off, others might just slow them down, I'm not sure.

    They are much quieter than the 5800, but silence is always better.

    I agree they must have come up with the technology to quiet the godawful 5800 Ultra. But including it in other cards just makes sense, since people, including OEMs, consider less noise to be desirable.

    As far as the 5200 Ultra is concerned, I don't think there are any stability related issues. I doubt it draws more than 35 watts. ATI's 9000 pro draws about 30 watts, with about 20% fewer transistors (45m vs 36m)
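A quick sanity check on those numbers (both wattages are rough estimates from this post, not measurements):

```python
# Sanity-check the comparison above. Both wattages are rough estimates
# from this post, not measurements; transistor counts are as quoted.

nv34_transistors, nv34_watts = 45e6, 35.0    # GeForce FX 5200 Ultra (estimate)
rv250_transistors, rv250_watts = 36e6, 30.0  # Radeon 9000 Pro (estimate)

fewer = 1.0 - rv250_transistors / nv34_transistors
print(f"{fewer:.0%} fewer transistors")  # 20% fewer transistors
```

So the "about 20% fewer" figure checks out, and naive watts-per-million-transistors (roughly 0.78 vs 0.83) lands in the same ballpark for both chips.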

    The loudest noise from my computer is the buzz from my CRT. :D
     
  11. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    (Sorry for reordering parts of your post)

    Walt, please read my post again. You may notice that I mentioned "power saving" less often than you did here - far less.

I didn't say clock throttling is used in NV31 to save power, like it is in NV31Go. I said it might have made it into the chip because it was already there, design finished and tested, and because it might add one or two possibly marketable features, being silence and security. I really think that justifies the cost of adding it (which is IMO not very high--it might even have saved some cost).


Why should it *not* clock down in 2D? My Gf3Ti200 at 175MHz isn't exactly slow in 2D, so why would I need the chip to run at 400+MHz when displaying web pages?
And why should it *not* clock down when it overheats? Heck, what else? Make fire alarm sounds?

    Fan failure does happen. Inexperienced overclockers are going to bring the chip beyond its limits. Isn't it good to know that your graphics card won't get damaged if the fan fails?
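The protection being described is just a feedback rule: if the die gets too hot, drop to the safe clock until it cools. A minimal sketch, with invented thresholds, hysteresis, and clock steps; nothing here reflects NVIDIA's actual firmware:

```python
# Minimal sketch of thermal clock throttling as a safety net.
# The thresholds, hysteresis band, and clock steps are invented for
# illustration; the real mechanism lives in firmware/drivers.

THROTTLE_TEMP_C = 95   # throttle at or above this (assumed)
RESUME_TEMP_C = 85     # restore full clock at or below this (assumed)
FULL_CLOCK_MHZ = 400
SAFE_CLOCK_MHZ = 235

def next_clock(current_mhz: int, die_temp_c: float) -> int:
    """Pick the next core clock from die temperature, with hysteresis."""
    if die_temp_c >= THROTTLE_TEMP_C:
        return SAFE_CLOCK_MHZ   # e.g. fan failure or blocked airflow
    if die_temp_c <= RESUME_TEMP_C:
        return FULL_CLOCK_MHZ   # cooled off, back to full speed
    return current_mhz          # inside the hysteresis band: hold

# Fan fails mid-game: the clock drops and the card survives.
assert next_clock(400, 98) == 235
assert next_clock(235, 90) == 235  # stays throttled until it actually cools
assert next_clock(235, 80) == 400
```

The hysteresis band is the important design choice: without it, a card sitting right at the threshold would oscillate between clocks every few frames.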


    There is no particular need for clock throttling in desktop chips, but it's nonetheless a good feature to have.
    I think overheat protection and workload-dependent downclocking will be standard features for GPUs in less than three years.

    That is purely speculation.

    NVidia doesn't advertise power savings, because no one is interested in it for a desktop product. But you can use the same mechanism (dynamic clocking) for different goals.



    NVidia's current offerings do almost what I want, but there are of course other, more important factors like performance and image quality...

    Of course there is nothing wrong with such a chip. It certainly shouldn't need clock throttling in a normal operating environment. But having such a protection is a good thing.

    I honestly don't think NV3x chips running at their nominal clock speed need thermal clock throttling in a normal operating environment.
     
  12. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
well i suppose that depends on what you want to call proof. but if you are willing to take nvidia's word on it then yes it is different with the nv3x. :lol:
     
  13. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    So where does NVidia state that thermal clock throttling is used during normal operation?
     
  14. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    My opinion

AFAIK from current information, the only NV3X board besides the NV30 that seems likely to use clock throttling due to a real negative is the NV35. That negative being that it might be necessary to reach MTBF goals.

Notice the "might"s...the existence of clock throttling doesn't prove it, but that together with other characteristics indicates that it might be a factor. I think your fallacy, WaltC, is simply taking that for proof. But clock throttling based on thermal characteristics and/or 2D/3D separate clock speed operation has completely valid positive aspects.

    IMO, it is fan throttling that is based on countering a negative such that its existence indicates something undesirable about a design and introduces an opportunity for "danger" in its triggering mechanism. I think laptops are important again to mention here, because I think there are examples of fan throttling there that work without undue "danger" being manifested and illustrate that it is possible that it might be only the negative (noise) that might matter in this regard (if they fixed the 3d window app/screensaver/driver triggering problems, or completely divorced that from fan throttling).

    Now, as for the NV35 in particular, to my knowledge: we don't have the clear indications, as we did for the NV30, of significant problems, though we do have some initial indication of (significantly lesser, but possibly still significant) fan noise; we don't have clear indication of the fan throttling situation being a negative in the final product, or even reason (like a massive assembly) to preclude some significant leeway in quieter cooling; we have indication that many products will be 2 slot, which ranges from insignificant to important for different people; and also we have indication that DDR 1 RAM usage and the core itself leave some overclock headroom for both.

To me, this indicates that: the NV35 clock throttling mechanism is a pure plus for overclockers (MTBF issues don't really matter when you are already ignoring them); that the fan situation might be a negative, or might not be at all, and I'm not even sure that was a focus of discussion so far; there might be some negatives with triggering for throttling features, but the ones we know of shouldn't be insurmountable; that if the 2 slot issue matters to you, we have positive indication that the NV35 has a negative in this regard; and that OEMs might not like having to jump through additional hoops to use the products due to possible MTBF target challenges.

    All the italic words have an as of yet indeterminate chance of being true (i.e., there is no reason to assume they are significantly more supported than their negative, if at all), and cannot, to my knowledge, be used to validly support a stance against the NV31+ products cooling/clocking features. The OEM ones are in italics because for the NV35, the overclocking plus and high-end nature make the negatives under discussion quite possibly irrelevant.

I don't get how this paints a clearly negative picture of the NV35 from the standpoint of its thermal characteristics at all, WaltC, and since the potential negatives I consider associated with the fan seem likely to be surmountable, or maybe not even evident, I don't think it is valid to presume them now...the weight of evidence seems to me to be on the NV35's side, in the positive, so far.

    I also am not aware of how there is anything clearly negative at all about NV31/NV34 having similar mechanisms for cooling/clock speed, if they do, since I don't see any need for the negatives of fan throttling to manifest (i.e., no chance of a too loud fan AFAIK). It is possible that there are issues with these products in this regard, but I'm simply not aware of them and don't have reason to think any are necessitated.
     
  15. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    I'd personally love it if my whole system would shut its fans off when I was only websurfing.

If DELL or Compaq/HP added something like that to their boxes and got it right, I might actually consider paying them something and not building my own box--since they would offer some value add for me that I presume I could not necessarily get from simply putting the components together myself.
     
  16. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
    all over the place, at least last i checked. that is what this thread is about, ya know? :?
     
  17. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    Huh? Did you follow this thread?
    I don't think this thread showed evidence that the GFFX5600U new revision (or another NV3x card) uses thermal clock throttling while running fully featured 3D action at 400MHz while the fan is running properly and ambient temperature isn't too extreme. No proof that it overheats in normal operation and has to clock down because of it.
     
  18. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Re: My opinion

As long as we can dispense with the notion that the 2D-3D clock shifting is being done for bona fide "power-saving" reasons, I'm satisfied. Looking at it as a legacy feature relative to nv30 for the purposes of noise pollution control is probably the best way to look at it. I had not read that current 5600 reference design fan noise in 3D operation was objectionable, and was surprised to learn from ghandi that the clock is throttled back to 235MHz (edit: actually ghandi answered my questions about the 5900U reference design) so that the fan can be turned off (and that seems to be the sole reason for the 2D-3D clock shifting.)

    However, my own personal view of the thermally triggered clock throttle is simply that I mistrust its presence. Is it present simply as an added "security" feature that will likely never be invoked during normal operation, or is it present because nVidia expects that the chip will overheat from time to time during normal 400MHz 3D processing (or 450MHz operation for nv35 reference designs)? We know for a fact that at times the clock throttle is invoked during normal 3D operation with some tested nv30 reference designs, hence the reason for its presence within nv30 reference designs is obvious. My own *opinion* is that yield issues relative to nv3x have still not been completely solved, and so nVidia deems this nv30 carryover technology is still required. (Especially since 3D processing speeds have been bumped up from 350 to 400MHz in 5600, but bumped down to 450MHz from 500MHz in the 5900U.) This is related as a supposition on my part, and I recognize that one might reach different conclusions from the same body of evidence...;) What makes me skeptical of it, and causes me to dislike it, is its 5800U nv30 reference-design heritage.
     
  19. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
    well sure if we want to go off into fantasy world and exclude 2d mode from the definition of "normal operation"; then you are correct. do you want a cookie Xmas?
     
  20. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Kyleb? Are you suggesting that the 5600U overheats in 2d mode and therefore needs to slow down?
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.