NV30- the fan will last how long, we must dust it how often?

Status
Not open for further replies.
Maybe Tom's Hardware will do another video: the ATI heat sink falling off, compared to the Nvidia GeForce FX with the smoldering NV30 chip afterwards.

What if a piece of paper gets sucked into the GF FX? Maybe unlikely, but out of a few million owners, it's gotta occur a few times.
 
I'd assume that no matter how hot the air from the exhaust is, the little that's getting recycled would only cause a small increase in the average intake temperature - compared to taking in and reusing the already above-ambient air from inside the case like fans normally do.
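For what it's worth, a quick toy mixing model (all numbers invented, perfect mixing assumed) shows why a little recirculation only nudges the intake temperature:

```python
# Toy intake-temperature model: if a fraction r of the exhaust gets
# recirculated back into the intake, the effective intake temperature is
# a weighted average of room air and exhaust air. All numbers are made up.
def intake_temp(t_room_c, t_exhaust_c, recirc_fraction):
    """Effective intake temperature under a simple perfect-mixing assumption."""
    return (1 - recirc_fraction) * t_room_c + recirc_fraction * t_exhaust_c

room, exhaust = 25.0, 60.0               # degrees C, hypothetical
print(intake_temp(room, exhaust, 0.10))  # 10% recirculation -> 28.5 C
```

Even recycling a tenth of a 60 C exhaust stream only lifts a 25 C intake by a few degrees, which is roughly what drawing pre-warmed case air does anyway.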

wonder how it'd perform differently if they kept the intake outside, and had the exhaust vent inside... or would that wind up melting a hole through the nearest PCI card?

Should be interesting to see what the board manufacturers do. They seem to very rarely incorporate the same reference-design heatsink, wanting to use their own flashy design/look... I wonder what they're going to have to do to top the reference now and make themselves stand out from each other?

"Did NV mention anywhere that they've integrated any kind of thermal protection circuitry into this? That is, should the fan go, will the board lower its clock speed or shut down the PC to protect your expensive board from doing an impression of an Athlon without a heatsink?"

Well, they did speak of how they can monitor the chip's activity and turn the fan on accordingly. I'm sure there must be some sort of reverse of that, which protects the card if the fan dies or the chip gets too hot, etc..
 
noko said:
Maybe Tom's Hardware will do another video: the ATI heat sink falling off, compared to the Nvidia GeForce FX with the smoldering NV30 chip afterwards.
I never heard about a heat sink falling off an R300. Maybe you are referring to the heat spreader on the back of the board? It wasn't a part of the original design, so losing it wouldn't be a loss. One of my original boards lacked the heat spreader and I never noticed any issues after months of use.
 
nv30-hoover.jpg

I'm sorry, i HAD to..... :eek:
 
Also don't forget that the air behind your PC is already going to be above room temp, because that's where all the exhaust from your PSU and the rest of your case goes, so the NV30 will be drawing in warmer air to begin with heheh
 
I don't think recirculating hot air is this thing's biggest concern. Like others have said, the tiny fans on standard video cards recycle way more hot air (that's pre-heated inside the case to boot) than the NV30 hoover-thingy will. I would worry more about dust contamination in the fan or air channels of that copper cooler. Putting an intake filter on is not feasible, as it would most likely either cut off most of the airflow (a small fan can't offer much air pressure) or simply not do much good (filter too coarse). You'd have to clean that too, of course...

*G*
 
Grall said:
I don't think recirculating hot air is this thing's biggest concern. Like others have said, the tiny fans on standard video cards recycle way more hot air (that's pre-heated inside the case to boot) than the NV30 hoover-thingy will. I would worry more about dust contamination in the fan or air channels of that copper cooler. Putting an intake filter on is not feasible, as it would most likely either cut off most of the airflow (a small fan can't offer much air pressure) or simply not do much good (filter too coarse). You'd have to clean that too, of course...

*G*

Regardless of the theoretical "closed" air ducting of the fan system, and its theoretical benefits, there is a significant portion of the heatsink which covers the ram and the back of the card such that a significant amount of heat will radiate directly into the case--even if the closed-loop fan does its job.

Apparently, there is an intake filter (some of these posts have said.) I agree with you that a small filter like that will become clogged up quickly and soon become an impediment to airflow through the fan--theoretically cutting the heatsink's efficiency.

nVidia's done a lot of touting of its "silent running" circuitry (Heh-Heh--clever of them to face it head-on like that and push it as a "feature"), but obviously the chance of chip failure through burn-out must be pretty good, and so they've added in the throttling feature, which will throttle it all down in the event the fan becomes too inefficient. This is the other side of the "silent running" feature of the fan.

Interesting to note, but on-the-scene observers have said the fan isn't "that" loud. Don't know how reliable that is--none of them listened to it in a quiet room. But assuming it is reliable information and the fan is not excessively loud when running at full bore, then the actual purpose of the throttling mechanism might primarily be to prevent chip burn out with the "silent running" feature only a secondary after effect. Nonetheless, the "silent running" idea is a great marketing stroke designed to nullify the type of sentiment that might occur if it was called a "clock-throttling feature," instead...;) But I believe a clock throttle is exactly what it is. *chuckle* Wouldn't it be great to buy a 500MHz nv30 and get 500MHz out of it only 50% of the time?...;)
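Nobody outside nVidia knows what the throttle logic actually looks like, but as a purely hypothetical sketch (the thresholds, clocks, and hysteresis band below are all invented), it could be as simple as:

```python
# Hypothetical clock-throttle logic: drop the core clock when the die gets
# too hot, restore it once it has cooled back down. All numbers invented.
FULL_CLOCK_MHZ = 500
THROTTLED_CLOCK_MHZ = 250
THROTTLE_AT_C = 95   # start throttling at or above this die temperature
RESTORE_AT_C = 80    # restore full clock at or below this (hysteresis gap)

def next_clock(die_temp_c, current_clock_mhz):
    """Return the clock the card should run at, given the die temperature."""
    if die_temp_c >= THROTTLE_AT_C:
        return THROTTLED_CLOCK_MHZ
    if die_temp_c <= RESTORE_AT_C:
        return FULL_CLOCK_MHZ
    return current_clock_mhz  # inside the hysteresis band: hold steady

print(next_clock(100, FULL_CLOCK_MHZ))       # hot: drops to 250
print(next_clock(85, THROTTLED_CLOCK_MHZ))   # still warm: stays at 250
print(next_clock(70, THROTTLED_CLOCK_MHZ))   # cooled off: back to 500
```

The hysteresis gap is the interesting design point: without it, a chip hovering near the limit would flap between clocks (and fan speeds) constantly - which would be anything but "silent running."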

In any event, IMO, the 500MHz model requiring this fan indicates the nv30 is being severely overclocked at 500MHz--I'm still wondering if their slower-clocked version will be offered sans the huge fan--bet it will be--and bet it will be clocked @ 350-400MHz. It would also be nice to know how much heat the ram will be radiating into the case.
 
Boy, you people get really desperate when you have to attack the fan on a card. The same f*nb*i's who worship water- and nitrogen-cooled hacks and have no problem with buying tremendously huge heatsinks and fans for their CPUs, or who in the past have complimented OEMs who deviated from the reference design to add massive heatsinks on the RAM, are now criticizing NVidia for adding a rather futuristic cooling solution into the reference design.

Even more absurd is people who are not thermal engineers, or who haven't run simulations or done the necessary math, talking about how the design won't work or will break down. Or speculating that the card will melt down.

We have come full circle, folks. This is Voodoo Volts all over again. In the total absence of any facts and with no ability to criticize performance, the criticism falls to non-relevant aspects like the size of the card, the power connector, or the fan used!! Next time round, we'll be criticizing the color of the PCB used.

If ATI shipped a card that had a Peltier cooler on the second PCI slot, we'd be seeing high praise from the same folks. We already have two separate inventions of the "vent exhaust out the back of the card" idea. We might see more vendors in the future switch to this, so I wonder what the people who are criticizing NVidia now will say if an R300 or R350 ships from someone with a similar setup.
 
I think you're off base DC, at least for a large number of the folks posting in this thread. I think the majority of these posts about the GFFX's cooling fall into two categories (aside from pure fbism):

1. Curiosity as to how and how well it works
2. Disappointment that the card won't have much OC potential as it is probably pushed pretty hard already

I fall into both categories.

Mize
 
Boy, you people get really desperate when you have to attack the fan on a card.

Almost...but not quite as desperate as defending the fan on a card. ;)

We have come full circle, folks. This is Voodoo Volts all over again.

Yes...complete role reversal. Those that were critical of the Volts solution, are now singing the praises of "advanced cooling" techniques of GFFX.

In the total absence of any facts and with no ability to criticize performance, the criticism falls to non-relevant aspects like the size of the card, the power connector, or the fan used!!

And who is to blame for this situation? That would be nVidia....by having their "paper launch." There wouldn't be an "absence of facts" had nVidia released pricing, spec, and performance information verifiable by 3rd parties.

In short....all this talk about "non-relevant" aspects may have been prevented, if nVidia was closer to production than they are.

If ATI shipped a card that had a Peltier cooler on the second PCI slot, we'd be seeing high praise from the same folks.

That would all depend on the performance of this Peltier cooler card. On par with the competition? Or clear and away performance leader, and at what price?

so I wonder what the people who are criticizing NVidia now will say if an R300 or R350 ships from someone with a similar setup.

For the record, I'll repeat what I said earlier in this thread about these kinds of things:

Having said that, I'm all for anyone providing us more choice by whatever means. If nVidia or ATI puts on a cooling system that requires 3 slots and a flux capacitor for power...but that enables them to get the clock rate up to 1 GHz...I'm all for it.

Of course, we would all prefer less obtrusive arrangements, but at least we have the choice to decide if the trade-offs are worth it or not.

Point being: these "non-relevant" things, as you call them, ARE relevant. They have their own set of negatives...be it cost, noise, or limiting of viable systems. To call them "non-relevant" is just as f*nboish as calling them engineering atrocities.
 
Username said:
:LOL:

Now I have to admit I didn't see this one coming!

I guess if you are going to find things to complain about, try something a little less banal.

Has anyone seen a fanless power supply on mainstream PCs lately? I wonder how often I'll have to clean that, or if it will jam, making my motherboard a lightbulb.

It will be really interesting to read the spin next month when the reviews come out, I doubt I'll read any "The NV30 is 2.5 times as fast as the R300 @1600x1200 with AA!" spiel then.

Here's the thing: there are a few small differences between a PSU's fan and the NV30's.

First off, the PSU probably doesn't absolutely 100% require the fan or it'll die a horrible death. My PSU, 300W ATX2.03, never gets hot at all, including its fan.

Second, a PSU's fan only spins at a moderate rate... whereas the NV30's will more than likely be VERY VERY VERY fast, drawing in a lot more air. And I do mean a LOT. Also consider that a PSU's fan has the fan's whole area for intake and output, whereas the NV30's fan has one small slot in the back - right next to the scorching hot exhaust, no less.

There is NO way I'm getting an NV30 if it has this cooler... partly because my PC sits in a wooden alcove... :LOL:
 
[Nvidiot]
That fan is awesome, Nvidia is finally giving us hardcore gamers what we want... that baby should overclock like crazy!
[/Nvidiot]

[FanATIc]
Oh god, did you see the size of that thing? That fan is huge.... I thought .13 was supposed to be smaller and cooler :rolleyes: That board is bigger than a Ti4600... and you can forget about overclocking, if that thing needs that sort of cooling just to run at normal speeds.
[/FanATIc]

Umm... not sure what my point is... :-?
 
DemoCoder said:
Boy, you people get really desperate when you have to attack the fan on a card. The same f*nb*i's who worship water- and nitrogen-cooled hacks and have no problem with buying tremendously huge heatsinks and fans for their CPUs, or who in the past have complimented OEMs who deviated from the reference design to add massive heatsinks on the RAM, are now criticizing NVidia for adding a rather futuristic cooling solution into the reference design.

Can't speak for anyone else, of course, but I don't do a lot of overclocking. And in the past I have criticized OEMs for placing ram heatsinks because I believed them to be largely cosmetic (and at that time I think this was true--however, I doubt these ram heatsinks for nv30 are "cosmetic" at all--probably a necessity.) So I guess I don't quite fit the above grouping...;)

Even more absurd is people who are not thermal engineers, or who haven't run simulations or done the necessary math, talking about how the design won't work or will break down. Or speculating that the card will melt down.

While I might ask precisely what a "thermal engineer" is...(hot under the collar, maybe?...;)), I don't think you have to be an engineer to understand the basic principles behind heatsinks, and certainly you don't have to be such to know whether you happen to like a particular arrangement or not, right?

We have come full circle, folks. This is Voodoo Volts all over again. In the total absence of any facts and with no ability to criticize performance, the criticism falls to non-relevant aspects like the size of the card, the power connector, or the fan used!! Next time round, we'll be criticizing the color of the PCB used.

It seems to me that we do have some facts nVidia has asked us to look at:

(1) The fan and heatsink, as nVidia has shown them for the 500MHz nv30 *reference design* (not some hopped-up overclocker's wet dream...;)), are huge--easily the largest *I've* ever seen for a stock 3D card's base reference design.

(2) nVidia has already revealed that the "silent running" feature is part of a clock throttle, which is based on the monitoring of a number of factors taking place on the board. This was done either at the presentation or in subsequent interviews in the last couple of days. If the fan is not overtly loud when running at full tilt, then "silent running" would seem just a better name for "Intelligent Clock-Throttling Technology," at least from a marketing point of view. (If you want to split hairs, you might say that slowing down the GPU is separate from slowing down the fan, and you'd be correct. However, nVidia has already described its clock throttle in at least one piece I've read in the last few days, and so my idea that "silent running" is a part of the greater clock-throttling mechanism seems pretty straightforward. Some people like clock throttling--which means my description might be as much positive as negative.)

(3) It's not yet been established whether any of the samples demoed at the presentation are actually running at 500MHz....yet. At least not to my satisfaction (which further erodes any confidence whatsoever in the canned benchmark numbers nVidia has released--which of course no one else is able to duplicate, due to the lack of functioning nv30-based products in the review circuit.)


Pretty much it looks to me as if there are plenty of facts in evidence to analyze. If you aren't suggesting that one need be a "thermal engineer" to contemplate buying the card, then I can't see how not being a "thermal engineer" invalidates logical criticism of the facts in evidence (which admittedly are scarce.) However, since nVidia is throwing around benchmark numbers prior to shipping cards to anybody, I think any criticisms of the card prior to its shipping are equally as fair. Especially when you consider that they are based on facts nVidia itself has put into evidence.

If ATI shipped a card that had a Peltier cooler on the second PCI slot, we'd be seeing high praise from the same folks. We already have two separate inventions of the "vent exhaust out the back of the card" idea. We might see more vendors in the future switch to this, so I wonder what the people who are criticizing NVidia now will say if an R300 or R350 ships from someone with a similar setup.


If the 9700 Pro had a Peltier cooler from which it derived its horsepower, I absolutely would not have bought it--nor would I have entertained buying it. I can't speak for anyone else, but I can assure you I have been totally consistent. It would have fallen into exactly the same category I now place nv30 in--a cheap shot--a huckster's way to performance. Actually, I would have had far more respect--and virtually no criticism--for nv30 had it shipped at 400MHz without the gargantuan fan and with a bit less in the heatsink department--although the fan is my biggest objection.

It's kind of ironic that you mention this, really...;) It's actually after thinking a bit about the 9700 Pro--looking at its rather sloppy, slipshod, standard cooling arrangement, which has been completely adequate for me so far, and thinking about its 110,000,000 transistors on a .15 micron die--that the nature of what nVidia was doing with nv30 became a bit clearer. With its 125,000,000 transistors on a .13 micron die (a much smaller die), it should require less power and run cooler than the same chip at .15 microns, and so at .13 microns it should clock appreciably higher than the same chip at .15 microns--while having about the same cooling characteristics. (All of this is generalized, of course.)
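The die-shrink argument can be put in back-of-envelope numbers using the usual dynamic-power relation, P proportional to C * V^2 * f. The capacitance scaling, voltages, and clocks below are all hypothetical, purely to illustrate the point:

```python
# Back-of-envelope dynamic power: P is proportional to C * V^2 * f.
# All numbers below are hypothetical, chosen only to illustrate the
# scaling argument, not actual NV30/R300 figures.
def dynamic_power(cap_rel, volts, freq_mhz):
    """Relative dynamic power, proportional to C * V^2 * f."""
    return cap_rel * volts ** 2 * freq_mhz

p_015 = dynamic_power(1.00, 1.5, 325)  # hypothetical .15 micron part
p_013 = dynamic_power(0.87, 1.3, 500)  # hypothetical .13 micron part at 500MHz
print(p_013 / p_015)                   # ratio of the two power figures
```

Under these made-up numbers the ratio comes out near 1.0: the shrink's per-clock savings are almost exactly consumed by the jump from 325MHz to 500MHz. Which is the whole point--pushing the clock that hard hands the process gain straight back to the heatsink.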

The copious copper heatsinks practically covering the card front and back, along with the VLF (very large fan) indicate to me that in order to get this chip to 500MHz nVidia is having to pour on the juice and raise the clock substantially. Further, the fact that nVidia has officially included this HSF arrangement in the reference design proves to me that nVidia thinks it is an absolute necessity--and not 75% cosmetic *chuckle* (as if...)

So, if such a HSF arrangement is indeed absolutely necessary, then it's needed because of copious amounts of heat produced by the card, and that heat is produced by voltage & clock rate (for GPU and ram). Hence the only "circle" I see here is one that keeps coming back to this HSF combination as being not only rare among OEM designs--but absolutely unique...;) What OEM would voluntarily choose to raise the base cost of his reference design so dramatically with these kinds of heatsinks and, especially, this kind of fan, if they were not necessary to the proper function of the reference design? I can't think of a single one, including nVidia (possibly saying "especially nVidia" would be apt.)

No, an OEM would put on the reference design only that which was needed to ensure proper operation of the reference design and the reference clock rate. Period. The OEM would then allow whatever card OEMs wanted to purchase his product to slap on whatever fan and heatsinks they wanted, just so long as the basic thermal needs of the reference design were observed.

So if it's necessary--it's because at 500MHz the nv30 is overclocked. I suppose we will find out at what clock rate the chip will run with much less cooling by virtue of the nv30 reference designs for slower-clocked cards without so much heatsink and a smaller, "standard" fan.

I don't see much here that seems to me the product of faulty reasoning, but dissenting opinions are always welcome...;) *chuckle* (Like this is the nv30 GPU Heatsink & Fan Think Tank).... :LOL:
 
Maybe this will be the wave of the future--

The way the Rolling Stones now scalp their own tickets (prime seats available at much higher prices), perhaps we'll see card makers overclock their own card. Maybe if ATI had done the same thing with an Ultra version of their 9700 and offered it for $100 more it would have appealed to some (not me, I'm fairly certain my PC's power supply couldn't handle it).
 
WaltC said:
While I might ask precisely what a "thermal engineer" is...(hot under the collar, maybe?...;)), I don't think you have to be an engineer to understand the basic principles behind heatsinks, and certainly you don't have to be such to know whether you happen to like a particular arrangement or not, right?


It is precisely because you don't know what a thermal engineer is that you are disqualified from making pronouncements about the operating lifetime and temperature properties of NVidia's design. From an aesthetic point of view, you are certainly capable of having an opinion as to how such a setup "looks" or how much it might hit your pocketbook.

However, once you start talking about dust problems, or recycling hot air, or the temperature of the exhaust, you enter into the realm of nonsense.


Do you really think NVidia, ATI, Intel, AMD, et al, just "eyeball" the type of heatsink needed from "basic principles behind heatsinks"? The reality is far from that. Thermal engineers look at a design and use thermal and radiation software to do design and analysis of the heat properties of systems. This means computing heat transfer coefficients, heat/air flow, and temperature in both steady state and transient scenarios. Everything is analyzed from chip packaging to case air flow. Just look at software like SINDA for a taste of what this involves. Dust and particulate matter *IS* taken into account in the design.


When you're designing a card, it isn't as simple as just looking through a catalog of large heatsinks and bolting on the one you need. I'm sorry if that annoys the armchair thermal engineers here making comments on NVidia's thermal design in the absence of any facts or knowledge.

That's why this idea that NVidia added this external venting system and fan at the "last minute to clock up the NV30 to beat the R300" is nonsense. This thermal system was designed months ago and had to be validated on thermal simulation software just like other aspects of the design. No way did they bolt this thing on at the last minute to beat ATI. Most likely, Nvidia's thermal engineers had this design in mind over a year ago and wanted to work it into whatever their next-generation board was.
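For the curious, the very first cut a thermal engineer takes is usually far simpler than a full SINDA run: a lumped steady-state model where junction temperature equals ambient plus power times total thermal resistance. A minimal sketch with invented numbers:

```python
# First-cut steady-state junction temperature: T_j = T_ambient + P * theta,
# where theta is the total thermal resistance (junction-to-case plus
# case-to-air), in degrees C per watt. All values here are hypothetical.
def junction_temp(t_ambient_c, power_w, theta_c_per_w):
    """Steady-state die temperature under a lumped thermal-resistance model."""
    return t_ambient_c + power_w * theta_c_per_w

# Hypothetical numbers: a 40 W chip, 35 C case air, 1.5 C/W total resistance.
print(junction_temp(35.0, 40.0, 1.5))  # 95.0 C
```

The real analysis then refines theta with heat-transfer coefficients, airflow, and transient behavior - but even this crude model shows why a hot chip in warm case air needs a low-resistance (i.e. big, well-fed) cooler.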
 
Re: NV30- the fan will last how long, we must dust it how often?

g__day said:
Looking at the piccys of NV30 again I really wonder how long that fan will last, before it gets jammed, a bearing goes, it sucks in some dust and then poof your expensive toy is a flashbulb in about 30 seconds max.

Most computer fans have an MTBF (mean time between failures) of something like 60,000 hours. So you could leave your NV30 running nonstop for years (8,760 hours per year) before the fan would likely fail on you. That is a mean time, so you could get unlucky, but the odds are against it. My computers have a lot of fans in them and not one of them has stopped working yet after 3+ years. Granted, I don't run 24/7, but I run enough... Don't worry, it won't be a problem, and cleaning won't be necessary unless your comp is wide open and sucking in dust (even so, it shouldn't be necessary unless conditions are really unsanitary).
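The arithmetic behind that claim, for anyone who wants to check it:

```python
# 60,000 hours of continuous operation divided by the hours in a year.
MTBF_HOURS = 60_000
HOURS_PER_YEAR = 24 * 365  # 8760

years_continuous = MTBF_HOURS / HOURS_PER_YEAR
print(round(years_continuous, 1))  # about 6.8 years of 24/7 use
```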

Sharkfood said:
Like I stated, on my case I've replaced the front case INTAKE fans a number of times, as they do indeed get packed up and need cleaning fairly often. Floor-mounted tower cases on carpeted surfaces are going to inhale lint and debris, no matter how much vacuuming you do around the area. As case fans are only a couple of bucks, I don't bother and just replace them.

You can buy fan filters for something like $1.50, so if you're replacing your intake fans all the time you're wasting a lot of money. The filters can be easily vacuumed out and they catch most dust/hair/lint before it goes into the fan. I assume that a filter could also be butchered to work on the GF FX intake as well. Actually if this turns out to be a serious problem (which I rather doubt), you'll be able to buy filters made for the GF FX fan, or it will come with one installed.
 
DemoCoder said:
However, once you start talking about dust problems, or recycling hot air, or the temperature of the exhaust, you enter into the realm of nonsense.

I hate to disagree, but I don't have to be a 'thermal engineer' to know that dust and other particulate matter (our amazing 'shedding dog', for instance) plays havoc with the fans in both mine and my wife's computers, to the point of constant maintenance of both. I almost lost a video card due to dust bunnies killing the fan. Fortunately it was only a GF2 GTS, and the only adverse effect of the fan packing in was a few crashes. I'd be amazed if the same would be true of the GFFX without some kind of thermal protection to aid what appears to be a very, very hot chip.
 
DemoCoder said:
If ATI shipped a card that had a Peltier cooler on the second PCI slot, we'd be seeing high praise from the same folks. We already have two separate inventions of the "vent exhaust out the back of the card" idea. We might see more vendors in the future switch to this, so I wonder what the people who are criticizing NVidia now will say if an R300 or R350 ships from someone with a similar setup.
Oh good lord, WHATEVER.
That is total BS.
IF ATI shipped a card with a Peltier cooler, you can bet I'd be laughing about it.
Would you get off of your f@nboi high horse for a moment and stop the witch-hunt? Stop looking for f@nboi's and you might stop seeing them. If you can't see how this HSF is "above and beyond" standard cooling, then you are blind. I guess the next thing is, you'll be telling us that YOU are a thermal engineer. Oh wait... you aren't. So I guess you wouldn't know whether or not one would be needed to discuss this thing.
And I'll tell you this - I don't need a "pizza engineer" to tell me if a pizza tastes like shit or not. You figure out the analogy.
 
Er, don't make this a witch hunt - I asked a few questions, given that the design looks fragile to dust build-up. I've had 6 PCs for years now in a room which is cleaned often and has wooden floors. You still get a lot of dust build-up.

As someone who knows statistics, don't talk to me about mean time to failure - talk standard deviations in a distribution pattern, given an environment with dust in operating conditions, not a clean-air lab. How many parts in a million fail within a year, I wonder? Are they at six sigma? I have had 3 fans fail in 3 years (of about 36 fans in the 6 cases: 1 CPU fan, 1 GPU fan - a GF2 GTS - and 1 front case fan). Two were dust-related failures; the other one (the CPU) I have no idea about - it was a quality high-speed fan about 3 months old, and it failed when I was taking the kids to school. The CPU probably just survived because I had the air conditioner on full at the time.
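Fair point about distributions: MTBF alone is only a mean. Under the common (admittedly simplistic) exponential-lifetime assumption, the failure probability over a given interval works out like this - noting that the 60,000-hour figure is a clean-lab spec, so a dusty room would do worse:

```python
import math

# Under an exponential-lifetime model, the chance a fan fails within
# time t is 1 - exp(-t / MTBF). This ignores dust, which would shorten
# the effective MTBF; the 60,000-hour figure is the clean-lab spec.
def p_fail_within(hours, mtbf_hours):
    """Probability of failure within `hours`, exponential lifetime model."""
    return 1.0 - math.exp(-hours / mtbf_hours)

ONE_YEAR = 24 * 365  # 8760 hours
print(p_fail_within(ONE_YEAR, 60_000))          # ~13.6% per fan-year of 24/7 use
print(36 * p_fail_within(3 * ONE_YEAR, 60_000))  # expected failures: 36 fans, 3 years, 24/7
```

Under those spec numbers, a 36-fan fleet run 24/7 for 3 years would expect around a dozen failures - so 3 actual failures with part-time use isn't obviously out of line with the spec, but it says nothing about the tails, which is exactly the point.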

It would be sensible for the card to have an emergency shutdown procedure if a fan fails or it dramatically overheats. In fact, with this design it is probably crucial.

I wonder what software simulations they did on fan cooling effectiveness given dust build-up scenarios? That would be extremely challenging to model accurately. Even simple turbulence in an airflow stream requires supercomputers to simulate. How would you possibly accurately model the adhesion of dust or fibres of random sizes, length, flexibility, adhesion, and weight moving into your high-speed parts and heat sink fins? Then you'd have to change turbulence models for feedback effects - it'd be a nightmare. Such a model would be ill-conditioned at best - trying to evaluate chaos patterns affected by multiple random events.

But a common-sense gut feeling just makes me go uh-oh when I look at this design. It looks impressive, well marketed, tuned, and most of all fragile. You can't filter the intake air, as was said before, without lowering air pressure - and you really need high-pressure air to cool effectively. The heat fins are an excellent target for trapping dust, and any build-up will seriously impede airflow.

Why speculate on whether this was planned, or whether it was a "we need to compete with ATi and we didn't see their broadside coming" over-reaction? You can place a bet either way here - it's just as speculative.

I'd love NVidia and/or the benchmarkers to cover this little potential Achilles' heel well, please. A simple comment - "No, you're safe - we considered this and..." - would suffice. If instead they hedge their warranties or sell cleaning kits with this model, I'd be worried :)
 