NVIDIA GF100 & Friends speculation

Not sure what the current rumours are, but I just heard one for GF114 releasing in late Jan: 384 SP, 775MHz core clock, 256-bit bus, 200W load power consumption (gaming, not Furmark), 150W TDP.

Does that match up with other rumours, or is someone pulling my leg? January seems awfully soon for GF114, and the core clock looks very high for a reference board; surely that would be bad for yields at Nvidia, better to bin for a lower-clocked chip.

What kind of performance would we get from something like that? It looks pretty decent, and makes this seem very much like a GF104b rather than a whole new design.
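Back-of-the-envelope, if those specs hold and GF114 keeps the Fermi-style hot-clock arrangement (shaders at 2x the core clock, one FMA per SP per shader clock; both of those are assumptions on my part, not part of the rumour):

```python
# Rough peak-FP32 estimate for the rumoured GF114 part, assuming
# Fermi-style hot-clock shaders (2x core clock) and one FMA
# (2 FLOPs) per SP per shader clock.
def peak_gflops(sps, core_mhz, hot_clock_mult=2, flops_per_clock=2):
    shader_mhz = core_mhz * hot_clock_mult
    return sps * flops_per_clock * shader_mhz / 1000.0

gf114_rumour = peak_gflops(384, 775)   # rumoured specs from this thread
gtx460 = peak_gflops(336, 675)         # GTX 460 reference clocks
print(f"GF114 rumour: ~{gf114_rumour:.0f} GFLOPS")
print(f"GTX 460:      ~{gtx460:.0f} GFLOPS ({gf114_rumour / gtx460 - 1:+.0%})")
```

That puts the rumoured part a bit over 30% ahead of a stock GTX 460 on raw shader throughput, before memory bandwidth or anything else is considered.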
 
That seems wrong. While Nvidia's TDPs haven't been great, they have been pretty close to gaming power consumption, even though Furmark blows past them.
 
Yes, but does the surface matter anyway?
Of course, it does.

I might be wrong here, but the silicon-to-cooler surface area is not the real bottleneck, is it?
Even if it is, it's a bottleneck per unit of surface area.

You just want a good, cold thermal conductor as close as possible to the heat source. Less silicon means less of a heat buffer.
That doesn't make any sense.

Not the part about less silicon, and not the part about your thermal conductor being cold. With optimal cooling, your cooling element will have the same temperature as the heat source and as the ambient air. Any temperature delta indicates an unwanted thermal resistance.

You calculate thermal resistances the way you calculate electrical resistances: put a lot of them in parallel (a larger surface) and you lower the overall resistance; put them in series and you increase it.
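The electrical analogy is easy to sanity-check in code; the resistance values below are toy numbers for illustration, not measurements of any real cooler:

```python
# Thermal resistances combine like electrical ones:
#   series:   R = R1 + R2 + ...
#   parallel: 1/R = 1/R1 + 1/R2 + ...  (more surface = more parallel paths)
def series(*rs):
    return sum(rs)

def parallel(*rs):
    return 1.0 / sum(1.0 / r for r in rs)

r_path = 2.0  # K/W for one die-to-cooler path (toy number)
print(parallel(r_path, r_path))  # doubling the contact area halves R -> 1.0
print(series(r_path, r_path))    # stacking layers doubles R -> 4.0

# With heat flowing, the temperature rise is delta-T = P * R:
print(100 * parallel(r_path, r_path))  # 100 W through 1.0 K/W -> 100 K rise
```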
 
I'm guessing Nvidia are going for the optimistic route? It's not unheard of; the GTX 480 had a TDP of 250W, but load consumption was much higher.

I can ask the guy for clarification; maybe I misunderstood him and he meant 200W was Furmark as opposed to gaming, not the other way around.
 

That would make a lot more sense, because 200W in games would imply something like ~240W under Furmark, which would make a 150W TDP absurd.
 
A nominal 150W TDP was already in place for the GTX 460, even with only 768 MB. But then, Furmark didn't exceed it by much.
 

Maybe I did not formulate it clearly (lousy English). My worry is that the silicon may sit in series in the "thermal circuit". A good thermal conductor that leads heat out to an effectively infinite reservoir, placed as close as possible to the heat source, is probably better than adding silicon around it for extra surface area, even if that lowers the thermal resistance per unit of surface.

This is of course only hypothetical, since added chip size means a higher ratio of heat to electrical resistance from the internal leads. And I realize now, with the question stated (300W, no questions asked), that I'm wrong in one sense: I can't count the added volume of silicon as being in series if I don't also count the gain from shorter circuit leads.

I know I didn't express this very well either. Oh well.
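To put the silicon-in-series worry in numbers: the conduction resistance of a slab is thickness / (k x area). The die thickness and size below are illustrative guesses, not specs of any actual chip; they mostly suggest the bulk silicon's series contribution is small compared with the interface material and heatsink.

```python
# 1-D conduction resistance of a slab: R = thickness / (k * area).
def slab_resistance(thickness_m, k, area_m2):
    return thickness_m / (k * area_m2)

k_si = 150.0   # W/(m*K), approximate bulk silicon conductivity
area = 360e-6  # a 360 mm^2 die expressed in m^2 (illustrative)
for t_um in (300, 600):
    r = slab_resistance(t_um * 1e-6, k_si, area)
    print(f"{t_um} um die: R = {r:.4f} K/W -> {100 * r:.2f} K rise at 100 W")
```

So doubling the silicon thickness doubles its series resistance, and doubling the area halves it, but either way the result stays well under 0.02 K/W here, a small term next to a typical die-to-ambient path.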
 

In Asia, rumors are spreading of an 875MHz, 384 SP, 256-bit, 185W TDP GTX 560:
http://translate.google.de/translat...ad-38537-1-1.html&sl=zh-CN&tl=en&hl=&ie=UTF-8
http://we.pcinlife.com/thread-1572757-1-1.html

The clock seems a bit high, but if they can achieve it, this should be an HD 6950 competitor.
 
Yeah, that makes sense. I'll give the guy a call tomorrow...

So I called the guy, and he said 200W was for Furmark and 150W was for gaming power consumption. I misunderstood what he said.

He gave me the specs again to make sure I had it right:

384SP
775-820MHz
150W TDP, 200W Furmark
350-370mm^2 die
256bit, 1GB.

30-40% better than the GTX 460 1GB according to Nvidia, but he says 20-30% is more realistic.
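For what it's worth, naive SP-times-clock scaling against a reference GTX 460 1GB (336 SP at 675MHz, ignoring bandwidth and everything else) lands right in Nvidia's claimed range:

```python
# Naive throughput scaling: SP count x core clock, relative to a
# GTX 460 1GB (336 SP @ 675 MHz). Ignores memory bandwidth, ROPs and
# any architectural tweaks, so treat it as a ceiling, not a prediction.
base = 336 * 675
for mhz in (775, 820):
    gain = (384 * mhz) / base - 1
    print(f"{mhz} MHz: {gain:+.0%} vs GTX 460")  # prints +31%, then +39%
```

Which is consistent with the source's 20-30% "realistic" figure once real-world bandwidth limits eat into it.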
 
The problem is that TDP used to mean (at least something close to) those Furmark numbers, with gaming notably lower than the TDP. That was until Fermi, anyway.
 
Yeah, maybe that's because they shipped one batch and then ran out of cards? But then, what do I know...
Or maybe gamers got tired of waiting for AMD's now-known-to-be-lower-performance 6970 and finally bought the GTX 580, thus depleting stocks.

Looks like nVidia has been minting money on the GTX 580 since early November.
 