NV30- the fan will last how long, we must dust it how often?

Status
Not open for further replies.
K.I.L.E.R said:
GFFX would scare away my GF, will it not?
No. Didn't you know women are attracted to loud fans and super-hot video cards?
Just make sure she doesn't end up leaving you for an R350.

(oh yeah! BLING BLING - It's page 14!)
 
How this thread has merrily wandered and grown since conception - like a wayward child. I am surprised that such a simple observation has grown to 250 or so posts.

To add some more relevance: I read a few days ago that the whole GF FX line will use the same heat-management solution. I read an article (Digit-Life or Hardware Extreme, I believe - I will try to find the link) that explains that if the card gets too hot, it throttles down - regardless of the cooling solution the manufacturer applies. So it doesn't matter if it's got a small fan or is liquid-nitrogen cooled - if it gets too hot, the handbrake comes on.

This will make benchmarking fun!!! Here are the 3DMark scores at 45, 50, 55, 60 and 70 degrees Celsius - look at the drop in performance. The longer we play a game, the hotter the video card becomes, hence if the HSF isn't super 'adequate', the slower our card will become.

Very, very interesting...
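For illustration, the "handbrake" behaviour described in that post could be sketched as a single temperature threshold. This is a minimal sketch - the thresholds and clock speeds below are invented for illustration, not actual NV30 figures:

```python
# Hypothetical model of a temperature-triggered "handbrake".
# All numbers are invented for illustration, not NV30 specs.

def throttled_clock_mhz(temp_c: float) -> int:
    """Core clock a single-threshold throttle would allow at temp_c."""
    FULL_CLOCK = 500      # advertised clock (assumed)
    SAFE_CLOCK = 300      # fallback clock once the handbrake comes on
    THROTTLE_TEMP = 60.0  # throttle threshold in degrees Celsius

    return FULL_CLOCK if temp_c < THROTTLE_TEMP else SAFE_CLOCK

# The hypothetical "3DMark at 45, 50, 55, 60 and 70 degrees" run:
for temp in (45, 50, 55, 60, 70):
    print(f"{temp} C -> {throttled_clock_mhz(temp)} MHz")
```

Under this model the benchmark score would hold steady, then fall off a cliff the moment the threshold is crossed.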
 
That is interesting - so if Quake III goes down to 10 FPS, you know that the dustbuster filter is clogged :p.
 
People were bitching and moaning about the clock-throttling on the P4 when it was announced. Nowadays you don't hear anything about it. I don't see why this will be any different - unless, that is, the hardware that does the throttling is majorly faulty.
 
Or the cooling system gets clogged up really easily. It just looks like a design that was slapped together at the last moment. Now, if Nvidia did a chip with cooling passages and used a pressurized volatile fluid like Freon with a passive or active heat exchanger, then they would be pushing the technology and could probably really get the clock speed up.
 
That is interesting - so if Quake III goes down to 10 FPS, you know that the dustbuster filter is clogged.

Incorrect - it is when the flames start licking your legs that you realise your heatsink needs cleaning.
 
I can see it now: message boards filled with newbies complaining that their 500-dollar video card is performing worse than their GeForce 4 MX ;)
 
But how cool is that?
If you want to take off the beastly HSF, you can. It will just run at 10 Hz in response.
 
I don't know. Going from my R8500 to my R9700, my case temps increased by 2 degrees. My brother-in-law's (yeah, I sell him all my old parts for cheap) GeForce 4 4600 raised my temps by 3 degrees. I wonder what will happen if I put that GeForce FX into my case. My chips might become unstable.
 
People were bitching and moaning about the clock-throttling on the P4 when it was announced.

That depends on whether the GF FX clock throttling is a smoother curve, or a single-threshold reduction like the P4's to prevent breakdown from overheating.

The P4's clock throttling only comes into effect when the CPU is about to fry, or at a temperature that would cause permanent damage to the CPU. As this condition is never hit under normal operation (including long benchmarking cycles), you don't hear people complaining about it - you can "burn-in" benchmark a P4 for 48+ hours and the clock speed remains constant, with performance unhindered.

If the NV30's "throttling" is similar, then it's all good. It's better to have the GPU downclock rather than fry, but the main difference will be how many degrees of throttling it has, and how often it throttles for the average user. If the GF FX has a tendency to throttle down a notch or two during your average UT2003 game, that will be a very bad thing: it would suggest the card runs too hot at peak speeds to consider peak speed the card's effective output.

A product touted and marketed with benchmarks at 500 MHz, combined with monster cooling, had better function at 500 MHz at least 98% of the time. If such a card falls back to 400 MHz and substantially lesser performance in "average" gaming, then it's false advertising. What good is buying a card that can run a single benchmark at 98 fps, but in-game falls back to 58 fps after the first 8 minutes?

I doubt this will be the case, but I'd hope that, if there is any question in the matter, websites will run benchmarks in a loop and mark any degradation in performance over time. I know I don't play games for just 4-8 minutes; I regularly enjoy online game "rallies" for hours at a time.
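The looped-benchmark idea could look something like this. A sketch only: the FPS samples are made up, and in practice they would come from whatever benchmark tool a site actually runs:

```python
# Sketch of detecting performance degradation across repeated
# benchmark runs, e.g. from thermal throttling kicking in.

def degraded_runs(fps_samples, tolerance=0.05):
    """Indices of runs more than `tolerance` below the first run."""
    baseline = fps_samples[0]
    return [i for i, fps in enumerate(fps_samples)
            if fps < baseline * (1 - tolerance)]

# Made-up samples: the card holds ~98 fps, then falls off once it
# heats up - the 98-fps-then-58-fps scenario described above.
samples = [98, 97, 98, 96, 72, 58, 58]
print(degraded_runs(samples))  # -> [4, 5, 6]
```

Any non-empty result from a long loop would be exactly the kind of over-time degradation a review ought to flag.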
 
I agree. If I'm playing Unreal and I'm in the middle of a match and my frames go from 100 fps down to 20 fps, I'd be pissed. Hopefully we will see some great site *cough* Beyond3D *cough* test something like maybe 3DMark in a constant loop at above-normal temps. I'm not talking about a 120°F room; I'm talking about hot-summer-day temps.
 
jvd:
Since FxFlow blows the hot air outside the case, why do you think it would make the temperature in the case higher than other cards?

And until we see some tests on the GeFX, I take most "concerns" in this thread for what they are: FUD.
 
Sharkfood:
That depends if the GF FX clock throttling is a smoother curve, or a single threshold reduction like the P4 to prevent breakdown from overheating
If I had to make a guess, I'd think it would be a multi-step solution, probably borrowing technology from their mobile parts (isn't it only user-selectable there? Or can it change automatically?). I'd imagine steps like "total shutdown", "minimal 2D", "max 2D/minimal 3D", "full blast" or similar. Having just one step wouldn't fit with their "less noise when running Outlook/Word" statement, so there has to be some logic controlling it depending on use. I wonder if it will be totally hardware-based (how would that work?), or hardware with hints from the driver like "power up the 3D pipeline to 50%". If it's driven by software, it should be possible to hack it and do some testing on what it's like in the different modes, etc.
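That multi-step guess could be sketched as a simple temperature-to-state lookup. Everything here - the state names, the thresholds - is speculation for illustration, not anything published about the chip:

```python
# Hypothetical multi-step throttle table, highest threshold first.
# States mirror the guesses above; temperatures are invented.
THROTTLE_STEPS = [
    (95.0, "total shutdown"),
    (85.0, "minimal 2D"),
    (75.0, "max 2D/minimal 3D"),
]

def power_state(temp_c: float) -> str:
    """Pick the first state whose threshold the temperature meets."""
    for threshold, state in THROTTLE_STEPS:
        if temp_c >= threshold:
            return state
    return "full blast"

print(power_state(50.0))   # -> full blast
print(power_state(80.0))   # -> max 2D/minimal 3D
print(power_state(100.0))  # -> total shutdown
```

A driver hint like "power up the 3D pipeline to 50%" would amount to software choosing a row of such a table instead of the temperature sensor doing it alone.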

If such a [500 MHz] card falls back to 400 MHz and substantially lesser performance in "average" gaming, then it's false advertising.
And they would (hopefully) never get away with it, and would lose all their die-hard supporters in a flash. From the impression I've got, some reviewers would probably test this issue first and do the "real" benchmarking afterwards. :)

I doubt this will be the case, but I'd hope that, if there is any question in the matter, websites will run benchmarks in a loop and mark any degradation in performance over time.
Yeah, like there was in the beginning with the P4. Some were claiming the CPU was throttling down, but further inquiry revealed that not to be the case (IIRC). Then again, I still have to kill that "fact" once every few months at work. (I'm the "tech guy" there.)
 
Basic said:
jvd:
Since FxFlow blows the hot air outside the case, why do you think it would make the temperature in the case higher than other cards?

And until we see some tests on the GeFX, I take most "concerns" in this thread for what they are: FUD.

Why? Well, I'm sure the FxFlow is not 100% effective. Below all my graphics cards I have a PCI-slot sucker, so it takes most of the heat blown off by the fan right out of the case. I expect the GeForce FX to be better at removing heat than my current setup, but will it be enough to counter the amount of heat produced by the chip?
 
antlers4 said:
If it needs a filter and you need to clean that filter, you're not going to need to open up or shut down your machine; the filter will be right there in the air intake in the extra slot. Seems like kind of a silly thing to worry about...

Actually, that depends on how the filter is attached and how easy it is to reach. Case in point: the mounting bracket is inserted so that it fits inside the case. If the filter is mounted on the bracket itself, and part of the mounting extends beyond the width of the opening at the back of the slot, then it wouldn't be removable without taking out the card. If, however, the air inlet sticks out beyond the back of the case and slides on and off, then it would be possible to remove it without taking the card out.

Time will tell, but I wouldn't guarantee matter-of-factly that something would be inserted in the most convenient way, from a mechanical standpoint. All one has to do to see this doesn't always happen is look at the design of the Slot 1 cartridge that Intel used to use. Sliding the processor into the motherboard wasn't the tough part; removing it was, unless one had extra-long fingers and a very large span between the thumb and the first finger... I could never do that with one hand - and so much for keeping one hand grounded while only working with the other. The problem is that the cartridge was too wide to keep both tabs pressed in with the thumb and first finger, and my fingers just couldn't reach that far.

But even if they could, outstretched like that, there wouldn't be a good grip to begin lifting the Pentium II or Pentium III out of the slot. AMD had a better idea: the locking tabs pulled out and then locked into the unlocked position, so one could grab the CPU one-handed much more easily. This is but one example of a product that isn't the best design, from a functional or accessibility standpoint.

I'd be worried about whether my power supply would let me run the card stably at full power...

I don't know if you were replying to me or not, but it isn't so much worrying as forethought, and a direct response to a what-if question posed by another. What if this turned out to be the case? Well, I'd mind. If it isn't the case, then my answer was only to the hypothetical situation mentioned.

As to the PSU, some of us do have 400W PSUs, and if that isn't enough, a 533W PSU isn't that expensive, even from some known name brands. But replacing a PSU is a one-time thing...
 
Son Goku said:
All one has to do to see this doesn't always happen is look at the design of the Slot 1 cartridge that Intel used to use. Sliding the processor into the motherboard wasn't the tough part; removing it was, unless one had extra-long fingers and a very large span between the thumb and the first finger... I could never do that with one hand - and so much for keeping one hand grounded while only working with the other. The problem is that the cartridge was too wide to keep both tabs pressed in with the thumb and first finger, and my fingers just couldn't reach that far.

All the Slot 1 procs I have worked with, those "tabs" can LOCK in place IN... as well as out.
So to pull 'em out, you just lock 'em in, then pull up.
 
Althornin said:
Son Goku said:
All one has to do to see this doesn't always happen is look at the design of the Slot 1 cartridge that Intel used to use. Sliding the processor into the motherboard wasn't the tough part; removing it was, unless one had extra-long fingers and a very large span between the thumb and the first finger... I could never do that with one hand - and so much for keeping one hand grounded while only working with the other. The problem is that the cartridge was too wide to keep both tabs pressed in with the thumb and first finger, and my fingers just couldn't reach that far.

All the Slot 1 procs I have worked with, those "tabs" can LOCK in place IN... as well as out.
So to pull 'em out, you just lock 'em in, then pull up.

I've worked with more than a couple, both boxed and OEM, and they most definitely did not lock in. I am not the only person who found this a pain, as I've had more than a few OEMs recount how much of a pain it was for them too. I've got an old PII 400 sitting here in my apartment, so I could check... yup, confirmed. On more than a few it didn't work that way. Pull-out-to-remove was a much better design, IMO...

But in addition to this, one can mention all kinds of other things: the mounting screws; the placement of the IDE sockets (i.e., at the bottom of the board, where some might have full ATX or server cases with the drives up top) and the power connector on various mobos; the fun of connecting a cable to the back of a CD-ROM drive when there isn't much room between the drive and the PSU and yet the cable isn't long enough to attach with the drive completely out of the case; etc...

Arguably the 25-pin SCSI cable wasn't the best idea either, as it looks like - and can be plugged into - a parallel port. In fact, a few people have done just that while reaching back there to plug in their SCSI scanner, and then gone to seek help when the SCSI device left their mobos in sorry shape. Of course, a parallel port wasn't exactly made to receive the termination voltage used in SCSI... I didn't want an oops, so I had to be extra careful with the SCSI scanner I used to own, until I moved to an all-SCSI disk system (in a newer comp) and used a 25-pin-to-50-pin standard SCSI cable to plug it into the back of the Adaptec card. Ironically, it had a 50-pin terminator on it with a 25-pin SCSI cable...
 
On the subject of GFFX fan noise:

http://forums.tweaktown.com/showthread.php?s=&threadid=6811

While we couldn’t see it, the fan cooling the heat pipes was very loud – we are talking almost Delta-like volume levels. Possibly, as we get closer to seeing these cards in retail, nVidia may tweak the cooling systems to a more noise tolerable level – at least I hope so.

When quizzed by a gamer about the sound levels coming from the back of the card, an nVidia rep was quick to suggest that it wouldn't matter much because gamers would be using headphones during their gaming. Unless the cooling technology has thermal throttling (which it very well may, mind you), I would have to disagree with this notion.

Say you are listening to music or fragging away with your desktop speakers - the hum of the cooling fan will still be audible, since we do not all use headphones.
 