NVIDIA GF100 & Friends speculation

hmm, this discussion is going a bit off topic methinks, but I can't resist :).
I think there are both pros and cons for graphics card coolers that "mostly" or "mostly don't" exhaust the heat directly. Mostly exhausting the heat seems obviously better at first glance. But OTOH it seems difficult to do that silently - it's hard to beat a big case fan with a comparatively small fan and a tiny exhaust area as far as total heat transfer goes. Of course, if you want everything to stay as cool as possible (for instance for overclocking), or there's already a lot of heat in a relatively crowded space (as with SLI), coolers that exhaust more hot air are definitely the better choice. So as far as the reference GTS 450 heatsink is concerned, I don't see much of a problem, unless you use SLI (why with this card???), overclock the heck out of all your components, or you don't have reasonable airflow inside your case.
 
I would think so, seeing as the reference cooler designs have a shroud and an exhaust.



Well obviously such a cooler would still serve a useful purpose in keeping your videocard working. But seeing as these reference cards do have exhausts, the question does not apply to them in the first place.

No, 2 stock 460s would not exhaust anywhere near 320W inside your case.

The shroud has holes all around and the fan blows a vertical stream of air to the PCB. Do you really need to do a thermodynamics experiment to measure how much hot air remains inside the case?

I am giving you the benefit of the doubt by arbitrarily accepting that the four sides are four equal quarters, which they are not, since the side of the card with the exhaust out of the case is one of the smaller sides, hence less air goes out of it.

But still, there are 3/4 remaining that exhaust inside the case. Is that so hard to understand? It may not be 320W with 460 SLI, but it sure as hell is 240W (75%). Again, not a good thing. If you start overclocking, you'd better have a Katrina blowing inside the case.
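The arithmetic behind that 240W figure can be sketched in a few lines. Note that the 25% exhaust fraction and the 320W SLI figure are this poster's assumptions from the thread, not measured values:

```python
# Back-of-the-envelope sketch of the heat-budget argument above.
# The 25% exhaust fraction and 320 W total are assumed, not measured.
def heat_into_case(board_power_w, exhaust_fraction):
    """Watts of heat left inside the case, given the fraction blown out the slot."""
    return board_power_w * (1.0 - exhaust_fraction)

print(heat_into_case(320, 0.25))  # 75% of 320 W stays in the case -> 240.0
```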

Anyway, I never said that the 460's or 450's coolers don't do their jobs. I only said that they do their jobs by putting quite some heat burden on the rest of the system. All such coolers do, whether they are ATI's or Nvidia's.
 
It's been done. That's why blower solutions were invented.
Well, I'm sure it's been done internally. But really, it's worth actually having an idea of the specifics before asserting that one sort of design is substantially better or worse than another.
 
Not only are you removing the vast majority of the GPU's heat, blowers usually have metal trays that accommodate the RAM, voltage regulators and other components; a significant proportion of the heat from these components is also expelled via the blower's airflow. Unless it is a specifically designed solution, axial fans usually do not account for any of the heat from the other components, limiting their performance (and of course the heat output from these components ends up in the case, irrespective of whether the fan sink actually accounts for them or not).

Of course, with blower solutions the heat output of these components is also reflected somewhat in the reported GPU temps, which is why it is important to look at the TDP rather than the GPU temp sensors to understand heat output.
 
Look, I'm not doubting for an instant that blowers help. This is purely a question of how much, which is critical when attempting to make strong statements supporting one blower design vs. another, or supporting a blower design as opposed to a non-blower design for lower-power products.

As for me, I'd much rather look at the bottom line: how do temperatures and noise compare between products? What about case temperature? I'd much rather have these numbers than people talking about how one design is better than another because it blows more or less air back into the case.
 
The shroud has holes all around and the fan blows a vertical stream of air to the PCB. Do you really need to do a thermodynamics experiment to measure how much hot air remains inside the case?

I am giving you the benefit of the doubt by arbitrarily accepting that the four sides are four equal quarters, which they are not, since the side of the card with the exhaust out of the case is one of the smaller sides, hence less air goes out of it.
You probably need to run a thermodynamic simulation in order to get accurate results. True, the exhaust side is one of the smaller sides, but it is also one of the two sides where a large part of the air is not directly blocked and deflected by the shroud, so that a kind of stream can build up.

But still, there are 3/4 remaining that exhaust inside the case. Is that so hard to understand? It may not be 320W with 460 SLI, but it sure as hell is 240W (75%). Again, not a good thing. If you start overclocking, you'd better have a Katrina blowing inside the case.
The 450s that have been measured so far show a gaming load (i.e. sans "power virus") of around 90 watts. I won't cite my own measurements, but two colleagues whose numbers I trust just as much:
http://www.heise.de/newsticker/meldung/GeForce-GTS-450-Fermi-fuer-die-Mittelklasse-1077428.html
http://ht4u.net/reviews/2010/nvidia_geforce_gts_450_test_msi_cyclone_gigabyte_oc/index18.php
Thus, the power consumption is on the general level of HD 5770 - comparing stock non-OC cards to their likes.

So, apart from generally questioning multi-GPU configurations, especially in this performance segment, the situation is not as grave as it may seem. I haven't seen any serious heat catastrophes resulting from AMD's move to redesign the HD 5770's reference cooler to an axial design either:
http://www.pcgameshardware.de/aid,7...armodell-leiser-und-kuehler/Grafikkarte/News/ (sorry for citing myself this time).
 
I am giving you the benefit of the doubt by arbitrarily accepting that the four sides are four equal quarters, which they are not, since the side of the card with the exhaust out of the case is one of the smaller sides, hence less air goes out of it.

Well you're right about one thing, they are not equals. The point of having a shroud in the first place is clearly to direct air in a certain direction.

[attached image: Power.jpg]


This shroud is not completely closed, but it's not wide open either. The slits along the sides are thin, and clearly the plastic at the back is designed with a downwards curve which blocks off more than half of that side. It seems obvious to me that the point of this design is to blow a significant portion out the back of the PC case. And yes, another portion is certainly meant to blow into the case, and that air is not specifically directed across the VRM section just by accident.

I'm not sure how you can look at that design and claim that all of the hot air will go into your case. That's just silly.
 
A cooling solution which warms up some of the most heat-sensitive components can hardly be "so much better" than a solution which exhausts all the heat outside (or, for some models, the majority of the heat goes outside while the rest is directed at the side plate).

A solution which exhausts all, or as much as possible, of the heat outside is a great solution in many instances (the noise concern mentioned earlier is relevant, however). No contest there.

But there are still a lot of non-shroud coolers around for various reasons. Take the 5750 reference cooler for instance. As well as many, many non-reference OEM coolers for a lot of different GPUs.

The GTS 450 cooler is more of a compromise between the two types, which in no way merits your original assessment that:
What's so good on a cooler, which blows all the hot air on your hard-drive and other components?
 
Just measured the first GTS 450 to pass 130 watts in Furmark (110-ish in games), with an 18.7 percent OC and raised core voltage.
 
You probably need to run a thermodynamic simulation in order to get accurate results. True, the exhaust side is one of the smaller sides, but it is also one of the two sides where a large part of the air is not directly blocked and deflected by the shroud, so that a kind of stream can build up.


The 450s that have been measured so far show a gaming load (i.e. sans "power virus") of around 90 watts. I won't cite my own measurements, but two colleagues whose numbers I trust just as much:
http://www.heise.de/newsticker/meldung/GeForce-GTS-450-Fermi-fuer-die-Mittelklasse-1077428.html
http://ht4u.net/reviews/2010/nvidia_geforce_gts_450_test_msi_cyclone_gigabyte_oc/index18.php
Thus, the power consumption is on the general level of HD 5770 - comparing stock non-OC cards to their likes.

So, apart from generally questioning multi-GPU configurations, especially in this performance segment, the situation is not as grave as it may seem. I haven't seen any serious heat catastrophes resulting from AMD's move to redesign the HD 5770's reference cooler to an axial design either:
http://www.pcgameshardware.de/aid,7...armodell-leiser-und-kuehler/Grafikkarte/News/ (sorry for citing myself this time).

You can cite yourself all you want, sir. PCGH has turned out to be one of my primary sources of education and benchmark reviews, although I've been quite harsh towards it before. ;)

I think a multi-GPU GTS 450 solution is actually quite interesting, since it performs quite close to the much more expensive 5850/5870. For some people, it could still be a viable solution.

Indeed, the 5770's rev 2 cooler is generally considered the better one. Still, the problem with most review sites is that, being exactly that, review sites, they have their hardware out in the open most of the time, so they don't measure the heat build-up inside an average user's PC. I am pretty sure that a PC case with the Phoenix cooler would run cooler than one with the rev 2.0. Again, especially in multi-GPU.
 
Well you're right about one thing, they are not equals. The point of having a shroud in the first place is clearly to direct air in a certain direction.

[attached image: Power.jpg]


This shroud is not completely closed, but it's not wide open either. The slits along the sides are thin, and clearly the plastic at the back is designed with a downwards curve which blocks off more than half of that side. It seems obvious to me that the point of this design is to blow a significant portion out the back of the PC case. And yes, another portion is certainly meant to blow into the case, and that air is not specifically directed across the VRM section just by accident.

I'm not sure how you can look at that design and claim that all of the hot air will go into your case. That's just silly.

I changed my initial statement of "all air" to an arbitrary estimate of about 75%. I explained my reasoning earlier.

As you correctly say, some portion of the hot air goes out and some of it stays in. On that much we agree. Now, in order to do an exact measurement, it seems that a thermodynamics experiment is needed after all.

The bottom line is that this type of cooling, although efficient, needs additional care in the case.

We could keep arguing for ages about this. Cheers.
 
Florin: I think I already explained it more comprehensibly, but it seems I failed. So, for the last time: my point was that all the hot air is directed at the HDD. Not that all the heat the board generates goes inside - I never said that.
 
I think a multi-GPU GTS 450 solution is actually quite interesting, since it performs quite close to the much more expensive 5850/5870. For some people, it could still be a viable solution.
A HD 5850 costs pretty much the same as two GTS 450s. So, if you look at the raw fps numbers, it appears GTS 450 SLI is indeed faster on average (by quite a bit sometimes). But still, there's no question what I'd prefer. It's not enough faster to be worth the trouble, imho. And, of course, power consumption in such a scenario is no contest (for that alone I wouldn't get GTS 450 SLI).
 
A HD 5850 costs pretty much the same as two GTS 450s. So, if you look at the raw fps numbers, it appears GTS 450 SLI is indeed faster on average (by quite a bit sometimes). But still, there's no question what I'd prefer. It's not enough faster to be worth the trouble, imho. And, of course, power consumption in such a scenario is no contest (for that alone I wouldn't get GTS 450 SLI).

Indeed.

Still, that's why I said 5850/5870 and not just 5850. I am under the impression that overall, the 450 SLI setup is a worthy opponent for these cards and falls maybe somewhere in the middle of them, performance-wise.

As for the power draw, as our friend CarstenS said, he measured 90W on each, so two of them should be just shy of a 5870's power draw with pretty close performance. It is also interesting to note that SLI-ing these cards should be an indication of how a full GF104 would perform.

The good part of multi-GPU is that you get to disable one card once its extra power is not really needed. For example, a single 450 should be more than enough to cope with Singularity or Prince of Persia. In that case, if you had a 5870 instead, you would burn power you wouldn't need to, while in the 450 SLI example you could just turn off one of them and be burdened only by its idle power draw, which is minimal anyway.

To look at the other side of things, the opposite would happen if a brand-new game came out that had no SLI support. In that case, the second GTS 450 would be useless and the 5870/5850 would be much better off. Still, in games that really require that kind of power, Nvidia does not sit idle and immediately launches a beta driver. Unlike *you know who*, which has still to provide anything working in multi-GPU for Lost Planet 2.

Moreover, if I am not mistaken, Nvidia's driver allows for custom SLI profiles, although I've never used it and don't know what success rate it has.
 
The good part of multi-GPU is that you get to disable one card once its extra power is not really needed. For example, a single 450 should be more than enough to cope with Singularity or Prince of Persia. In that case, if you had a 5870 instead, you would burn power you wouldn't need to, while in the 450 SLI example you could just turn off one of them and be burdened only by its idle power draw, which is minimal anyway.
Actually, it looks like in the particular implementation, nVidia has some problems here in that the second card cannot be turned off (or placed in any sort of ultra low-power state), so the idle power consumption of nVidia's SLI solutions pale in comparison to ATI's Crossfire solutions. The difference may not be so noticeable once a game has been loaded that doesn't use SLI/Crossfire, but I'm pretty sure it will still be there.

Moreover, if I am not mistaken, Nvidia's driver allows for custom SLI profiles, although I've never used it and don't know what success rate it has.
Well, it's been a while since I've used it, but nVidia's SLI was, as of a few years ago, quite comprehensive. Setting up a custom profile sometimes helped, sometimes didn't. It all depends upon how the game renders the scene. But in any case, most of the games that don't have profiles will be older games that don't need them anyway.
 
A HD 5850 costs pretty much the same as two GTS 450s. So, if you look at the raw fps numbers, it appears GTS 450 SLI is indeed faster on average (by quite a bit sometimes). But still, there's no question what I'd prefer. It's not enough faster to be worth the trouble, imho. And, of course, power consumption in such a scenario is no contest (for that alone I wouldn't get GTS 450 SLI).

You'd probably be better off with a pair of HD 5770s than GTS 450s, anyway.
 
As for the power draw, as our friend CarstenS said, he measured 90W on each, so two of them should be just shy of a 5870's power draw with pretty close performance. It is also interesting to note that SLI-ing these cards should be an indication of how a full GF104 would perform.

you're comparing Power Draw versus TDP here!

The 450 draws 88W in GRID; the 5870 Vapor-X draws 116W.
If I combine that with the average performance from TPU's SLI benchmark (where the power usage is even worse for the 450), you're talking about 135% of the power draw for 90% of the performance.
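Those two ratios boil down to a simple performance-per-watt figure. Both numbers below are taken from this post (the TPU-derived 135% power draw and 90% performance), so this is just the implied arithmetic:

```python
# Perf-per-watt implied by the figures above: GTS 450 SLI reportedly draws
# ~135% of the HD 5870's power for ~90% of its performance.
relative_power = 1.35   # GTS 450 SLI system draw vs HD 5870
relative_perf = 0.90    # average performance vs HD 5870
efficiency = relative_perf / relative_power  # performance per watt, relative
print(round(efficiency, 2))  # -> 0.67, i.e. roughly two-thirds the efficiency
```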
 
Actually, it looks like in the particular implementation, nVidia has some problems here in that the second card cannot be turned off (or placed in any sort of ultra low-power state), so the idle power consumption of nVidia's SLI solutions pale in comparison to ATI's Crossfire solutions. The difference may not be so noticeable once a game has been loaded that doesn't use SLI/Crossfire, but I'm pretty sure it will still be there.


Well, it's been a while since I've used it, but nVidia's SLI was, as of a few years ago, quite comprehensive. Setting up a custom profile sometimes helped, sometimes didn't. It all depends upon how the game renders the scene. But in any case, most of the games that don't have profiles will be older games that don't need them anyway.

Hmmm, that's interesting. I wasn't aware of this SLI issue. Still, more clarification is needed, if you please. Does the second card stay out of the ultra-low-power state when SLI is enabled but idle in 2D, or when SLI is disabled and in 2D?

If it is the first, then ATI has the same problem as well. My second 5850 stays at 400/900 if I enable Crossfire and just stay on the desktop. If I disable it, both cards operate at 157/300. In all honesty, I don't know if this is one of the "features" of Cat 10.8. I didn't notice it before, because I always keep Crossfire disabled anyway and only enable it before running some bitchy game that needs dual-GPU power.
 
Fire you say?

Ah yes, I've got some fuel here:

http://www.hardocp.com/article/2010/09/14/nvidia_geforce_gts_450_sli_video_card_review/9

If you look at our power testing of total system Wattage, GeForce GTS 450 drew 456 Watts at full load. ATI Radeon HD 5770 CrossFireX drew 443 Watts at full load, and ATI Radeon HD 5750 CrossFireX drew a much lower 370 Watts at full load.

It doesn’t seem like performance is scaling with power very well with GeForce GTS 450 SLI. It seems it is a power hog compared to the competition.

Our GeForce GTS 450 SLI versus Radeon HD 5770 CrossFireX and Radeon HD 5750 CrossFireX comparisons today have revealed that GeForce GTS 450 SLI is on the bottom end of performance in this price range. Two standard GeForce GTS 450 video cards will cost you $260. On the other hand, two Radeon HD 5770 video cards will cost you $10 less at $250 and provide up to 30% better performance in games. Radeon HD 5750 CrossFireX will cost you as little as $210 saving you $50 and providing the same or better performance than GTS 450 SLI.

NVIDIA should be worried, they’ve given us a GPU that is either underpowered in this price range or overpriced, and it seems to take a lot of power to produce substandard performance. The fact is that Radeon HD 5770 cards used to be priced higher, but prices have fallen, a lot. Even the Radeon HD 5750, when it was launched, cost $139.99, right at the price of the GTS 450, and in our tests it is giving us better performance. It took NVIDIA eleven months to produce a video card that doesn’t exactly offer anything new over the Radeon HD 5750 which has been out for almost a year now. NVIDIA is behind this generation folks, there is no denying that. In the 20"-22" display size market, the market that NVIDIA explains the GTS 450 is targeting, there has never been a clearer "winner," and that is the ATI Radeon HD 5770.
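The quoted price and performance figures can be turned into a rough price/performance comparison. The numbers are taken from the HardOCP quote above, with "up to 30% better performance" treated as the best case:

```python
# Price/performance from HardOCP's quoted numbers: $260 for GTS 450 SLI
# (baseline performance) vs $250 for HD 5770 CrossFireX at up to 1.30x
# the performance (best case).
def perf_per_dollar(perf, price):
    return perf / price

gts450_sli = perf_per_dollar(1.00, 260)
hd5770_cf = perf_per_dollar(1.30, 250)
print(round(hd5770_cf / gts450_sli, 2))  # -> 1.35, i.e. ~35% more fps per dollar
```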
 