What do you mean, "to start"? It will launch after Antilles.
The way Nvidia have been going, I expect them to launch before Antilles.
What kind of performance would we get from something like that? Looks pretty decent, and makes it seem very much like a GF104b rather than a whole new design.
It will make the GTX470 obsolete, like the GTX465 was made obsolete by the GTX460 boards.
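Rough back-of-the-envelope on that, strictly as a sketch: it assumes the rumoured 384 SP at a 775MHz core clock, the usual GF104-style 2x shader hot clock, and GTX 460/470 reference shader clocks quoted from memory.

```python
# Theoretical FP32 throughput: CUDA cores * 2 ops per clock (FMA) * shader clock.
# The GF114 line uses the rumoured spec from this thread plus an assumed 2x hot clock;
# the GTX 460/470 figures are reference clocks quoted from memory.

def fp32_gflops(cuda_cores, shader_clock_mhz):
    return cuda_cores * 2 * shader_clock_mhz / 1000.0

cards = {
    "GTX 460 1GB (336 SP @ 1350 MHz shader)": fp32_gflops(336, 1350),
    "GTX 470 (448 SP @ 1215 MHz shader)": fp32_gflops(448, 1215),
    "Rumoured GF114 (384 SP @ 2 x 775 MHz)": fp32_gflops(384, 2 * 775),
}

for name, gflops in cards.items():
    print(f"{name}: ~{gflops:.0f} GFLOPS")
```

On raw shader throughput that lands a shade above a stock GTX 470, which is roughly what the "GTX470 obsolete" argument needs; real game performance would also hinge on whatever memory clock ends up feeding the 256bit bus.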
Yes, but does the surface matter anyway?
Of course it does.
I might be wrong here, but the surface area Si/cooler is not the real bottleneck?
Even if it is, then it's a bottleneck per surface area.
You just want a good cold thermal conductor being as close as possible to the heat source. Less silicon, less heat buffer.
That doesn't make any sense.
200W load power consumption (gaming not Furmark) 150W TDP.
I'm guessing Nvidia are going for the optimistic route? It's not unheard of; the GTX480 had a TDP of 250W, but load consumption was much higher.
I can ask the guy for clarification; maybe I misunderstood him and he meant the 200W was Furmark load as opposed to gaming load, not the other way around.
That doesn't make any sense.
Neither the part about less silicon, nor the part about your thermal conductor being cold. In the case of optimal cooling, your cooling element will have the same temperature as the heat source and as the ambient air; any delta in temperature is indicative of an unwanted thermal resistance.
You calculate thermal resistances the way you calculate electrical resistance: if you put a lot of them in parallel (= larger surface), you lower the overall resistance. Put them in series, you increase it.
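A minimal numerical sketch of that series/parallel picture, assuming a plain 1-D resistance stack; the layer thicknesses, conductivities, die areas and the 200W figure are illustrative assumptions, not measurements of any real card.

```python
# 1-D thermal resistance sketch (illustrative numbers only).
# Each flat layer behaves like R_th = thickness / (conductivity * area),
# the thermal analogue of R = rho * L / A. Layers in series add; a larger
# contact area is the parallel-paths case, so every term shrinks with 1/area.

def layer_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Thermal resistance of one flat layer, in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

def delta_t_across_stack(power_w, die_area_mm2):
    area = die_area_mm2 * 1e-6  # mm^2 to m^2
    stack = [
        layer_resistance(0.5e-3, 150.0, area),   # silicon die, ~150 W/mK
        layer_resistance(0.05e-3, 5.0, area),    # TIM layer, ~5 W/mK
        layer_resistance(2e-3, 400.0, area),     # copper heatsink base, ~400 W/mK
    ]
    return power_w * sum(stack)                  # series: resistances add; dT = P * R_th

# Same load through a smaller and a larger contact surface:
for area_mm2 in (300, 600):
    print(f"{area_mm2} mm^2: dT across the stack ~ {delta_t_across_stack(200, area_mm2):.1f} K")
```

Doubling the contact surface halves every term in the series stack, which is exactly the parallel-resistance point: more surface means lower overall thermal resistance and a smaller temperature delta for the same power.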
Or (to clutter up this thread some more), furmark.exe triggers a power hold.
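For what a "power hold" could look like in principle, here is a toy sketch of a limiter keyed on known stress-test executables; it is purely illustrative, not NVIDIA's actual mechanism, and the clock ladder, cap, app list and linear power model are all made-up assumptions.

```python
# Toy sketch of an executable-keyed power hold: apps on a watch list get a board-power
# cap enforced by stepping the core clock down, everything else runs at full clock.
# Numbers and the linear power-vs-clock model are invented for illustration.

POWER_CAP_W = 150.0                               # hypothetical enforced cap
CLOCKS_MHZ = [775, 725, 675, 625, 575, 525, 475, 425]
POWER_VIRUS_APPS = {"furmark.exe", "occt.exe"}    # hypothetical detection list

def board_power(clock_mhz, load_intensity):
    """Toy model: draw scales linearly with clock and with how hard the load hits the chip."""
    return load_intensity * 260.0 * (clock_mhz / 775.0)

def settled_clock(app_name, load_intensity):
    """Full clock for normal apps; walk down the ladder until under the cap for listed apps."""
    if app_name not in POWER_VIRUS_APPS:
        return CLOCKS_MHZ[0]
    for clock in CLOCKS_MHZ:
        if board_power(clock, load_intensity) <= POWER_CAP_W:
            return clock
    return CLOCKS_MHZ[-1]

for app, intensity in [("game.exe", 0.77), ("furmark.exe", 1.0)]:
    clock = settled_clock(app, intensity)
    print(f"{app}: settles at {clock} MHz, ~{board_power(clock, intensity):.0f} W")
```

With a hold like that, a 200W gaming figure and a 150W cap would not have to describe the same clock state: the cap only ever bites on the synthetic worst case.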
Not sure what the current rumours are, but I just heard one for GF114 releasing in late Jan, 384 SP, 775MHz core clock, 256bit bus, 200W load power consumption (gaming not Furmark) 150W TDP.
Yeah, that makes sense. I'll give the guy a call tomorrow...
All GTX 580s @ newegg are OUT OF STOCK... Hmmmm.
Yeah, maybe that's because they shipped one batch and then ran out of cards? But then, what do I know.
Or maybe gamers got tired of waiting for AMD's "now known lower performance" 6970 and finally bought the GTX580, thus depleting stocks.
Looks like nVidia has been minting money on the GTX580 since early November.