nVidia - expect 120 Watt GPUs soon!

Shit!

120 watts! That's a lot. Watercooling would be needed, unless you want a louder and, above all, much bigger copper heatsink. It will cost a lot, and who is going to buy something that's even worse than the GFFX in that respect (unless watercooling is used)?
 
I would not be surprised if the next-gen PC form factor includes specs for venting GPU heat. That is really the only sensible way I can see things progressing, if it is determined that such high-power GPUs are going to become more of a standard going forward.

The current form factors simply are not designed to handle heat removal from a slot-mounted board at such high wattages.
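
To put a rough number on it, here is a minimal back-of-the-envelope sketch in Python (all figures are assumptions for illustration, not quoted specs) of the thermal-resistance budget a slot cooler would have to meet at 120W:

# Thermal-budget sketch: all numbers are illustrative assumptions, not vendor specs.
gpu_power_w = 120.0    # assumed ASIC power dissipation
ambient_c = 40.0       # assumed in-case air temperature
max_die_c = 90.0       # assumed maximum allowable die temperature

allowed_rise_c = max_die_c - ambient_c          # 50 C of headroom
required_r_th = allowed_rise_c / gpu_power_w    # junction-to-ambient, C per W

print("Required thermal resistance: %.2f C/W" % required_r_th)  # ~0.42 C/W

A compact single-slot air cooler generally struggles to reach a figure like that quietly, which is why watercooling and oversized heatsinks keep coming up.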

I just don't see water-cooling as a viable option at all for anything but a niche market.
 
I bet the 256MB NV35 will draw over 100W in total.

As for 120W for the ASIC alone... :oops:

MuFu.
 
Hmm.
I'm really into quiet computers, particularly at home. I read this too, so I started to wonder if I just misread current trends due to personal bias. So I asked my local dealer about what customers ask for.

Outside of the vanilla requests, very few people ask for high-performance "overclocking" fans. No one, basically. The few people who seem to be into overclocking go for water cooling instead. What is requested is quiet or small computers, in that order. The people who want small want them to be quiet as well.

Of course, this is just one dealer, but it jibes with what I've seen in the marketplace and on boards. Even hardware reviewers object to noise, and you'd expect them to be tolerant - the GeForce FX backlash was huge. Another major trend in computing is away from desktop systems entirely, towards mobile computing.

I have had this feeling for some time that nVidia has drifted a bit out of sync with gfx-market reality, and 120 Watt GPUs fit into that pattern. Bulky and noisy systems aren't "badges of honour", sorry.

Entropy
 
He doesn't mean that.

He means the 256MB of RAM will probably draw an additional 100W of power.

Correct me if I am wrong, MuFu. Personally, I think this is going to be the greatest limitation on the power of 3D now... not bandwidth limits, but damn power limits <looks for sources of mass power>
 
Nebuchadnezzar said:
MuFu said:
I bet the 256MB NV35 will draw over 100W in total.

As for 120W for the ASIC alone... :oops:

MuFu.

That doesn't make sense... :? :LOL:

The ASIC draws more power than the total? :p

No, he means that the NV35 is not the 120W ASIC, although the NV35 board with 256 MB could be drawing 100W.

NVidia must be referring to NV40 or beyond.
 
Would it be possible to put gfx cards into a small external case?
I.e. you have a small AGP/PCI-X card that plugs into the system as usual and serves as a bridge between the gfx card and the system (signals could, for example, be transmitted optically). Then you can pretty much use whatever cooling solution/form factor/power supply you want for the graphics.

For the renderfarm market, you could then rack-mount gfx "systems", using a single CPU to control a whole GPU cluster...

Is this a completely hare-brained idea?
Serge
 
Seriously, how do you people react to this?
Readers of the B3D forums are probably among the most interested people you are likely to find as far as high-performance PC 3D hardware goes.

Personally, I and the people I associate with who care about computers at all are moving in the other direction entirely - we want technology that fits in rather than being intrusive. But maybe that's just my age bracket.

Dropping the endless ATI vs nVidia vs AMD vs Intel vs Iraq debate for a moment, is this the direction you want your computers to develop in?

Entropy
 
Entropy said:
is this the direction you want your computers to develop?

I'm less interested in a particular direction, than I am interested in there being choice, so I can choose my "own" direction. :)

In other words, I really don't care about the specifics of power consumption or heat. I don't care if some cards are offered at 500W, if that's what it takes to show a marked increase in 3D performance.

Introduce such products if the IHVs feel they must, and let the market decide if they are acceptable and viable products.

Personally, the particular aspect of "power" that I am concerned with is noise. I won't buy a computer product with FXFlow-type noise levels. I also "prefer" smaller, and "prefer" a part that is less obtrusive or one that doesn't force me to buy a new case. However, if the other features of the graphics solution are great enough, I will choose it.

Again, it's all about having a choice.
 
120 Watts! We're going to need separate power supplies for that! (Can anyone say Voodoo5 6000, with the power brick? :D )
 
If it's too loud, I'll watercool it.
It was the only way to make my OC'd Athlon system reasonably quiet.
All I need is a GPU block, and I'm set.
 
Intel has proclaimed that 100W of heat is the maximum for processors. Maybe they're being conservative, but honestly, you can't go much higher than that. I guess graphics cards have a larger die, so there's more surface to cool, but even so, a 120W VPU is just insane... Nvidia chips have always run a little on the hot side, but now it seems like they're getting carried away. I really don't see this working out so well...
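
As a rough illustration of the die-area point (the die sizes below are assumed round numbers, not official figures), spreading more power over a bigger die does lower the heat flux somewhat, but not dramatically:

# Heat-flux comparison sketch: die areas are assumptions for illustration only.
def heat_flux(power_w, die_area_mm2):
    # power density in W/cm^2
    return power_w / (die_area_mm2 / 100.0)

cpu_flux = heat_flux(100.0, 130.0)   # assumed ~130 mm^2 CPU die at the 100W cap
gpu_flux = heat_flux(120.0, 200.0)   # assumed ~200 mm^2 GPU die at 120W

print("CPU: ~%.0f W/cm^2, GPU: ~%.0f W/cm^2" % (cpu_flux, gpu_flux))  # ~77 vs ~60

The larger die helps a little, but both densities are high enough that the real limit becomes how much heatsink and airflow you can fit around the chip.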
 