NVIDIA GT200 Rumours & Speculation Thread

I just wonder how long NV will stick with GDDR3 for their SKUs.
If this GT200 line is likely to repeat the G80 path, I'd expect the next die shrink (55/40 nm?) replacement of the bulky GTX 280 to cut the memory bus in half and wire up some uber-megahertz GDDR5 chippery.
They will use it as long as it works for them, that's the way I see it. The only reason they would switch to GDDR5 is if ATI took the performance crown from them, which I'm hoping happens; after all, competition inspires innovation. As for the die shrink, I have no idea why they stuck with 65nm.
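
Just to put rough numbers on the cut-the-bus-in-half idea, here's a quick back-of-envelope sketch; the bus widths and per-pin data rates are my own illustrative assumptions, not confirmed specs.

```python
# Rough memory-bandwidth comparison: wide bus + GDDR3 vs narrow bus + GDDR5.
# All figures below are illustrative assumptions, not confirmed GT200 specs.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak bandwidth in GB/s = (bus width / 8) * per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

# Assumed 512-bit GDDR3 at ~2.2 Gbps per pin (roughly GTX 280 class).
gddr3 = bandwidth_gbs(512, 2.2)

# Assumed 256-bit bus with ~4.0 Gbps GDDR5 (the "uber-megahertz" scenario).
gddr5 = bandwidth_gbs(256, 4.0)

print(f"512-bit GDDR3: ~{gddr3:.0f} GB/s")  # ~141 GB/s
print(f"256-bit GDDR5: ~{gddr5:.0f} GB/s")  # ~128 GB/s
```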

Now, if GT200 turns out to be a long-lived part for the Tesla architecture - two years or better - then the margins will be very good. If it turns out to be a short-lived turkey and we see GT300 in six months, then the margins are probably very bad. These are the extreme scenarios.
Unified shaders were the big thing with G80; what new innovation will they pull out of their hats for this one that's going to separate it from the rest and make it a base for future revisions?
I am guessing at $650 for Nvidia's GT200 top performer. AMD will counter with price [and X2/X3] imo.
- What did that link I quoted say, about $110 for a GT200 chip? That IS a decent margin!
I really don't think ATI will pull out an X3; the power requirements would be insane unless they did some major die shrinking. Even then, the X2 variant sounds like a true contender, and we haven't even seen 4xxx series benchmarks. I'm not saying an X3 variant would be rejected - I'd be first in line for it. :smile:
 
Wouldn't you say it's on purpose?

That Nvidia is perhaps telegraphing a message about the new baby monster they are about to unleash? AMD delayed R700 a bit - perhaps to bump their clocks in response?

From my viewpoint, AMD is uncharacteristically quiet and Nvidia is unusually giddy with their own apparent success; surprising, as Nvidia is attempting to take on a much bigger company and AMD at the same time. You'd think they have several secret weapons they are still holding back. I do.

EDIT: Yes - about $110 for a GT200 chip, according to this link:

http://www.tgdaily.com/content/view/37554/135/

Margins will be decent if they sell the card at $600. Let's say the total cost of the card is $110 for the chip and $140 for the board, RAM and cooler, which is a bit on the high side; that still leaves 45% margins for retail and 60% margins for nV. As long as yields are good, $600 is a good starting place for nV. These aren't midrange or low-end cards, where there's a lot less to play with when it comes to pricing.
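
To make the pricing arithmetic concrete, here's a rough sketch using the ~$110 chip and ~$140 board figures from the post; reading the 60% and 45% figures as markups rather than gross margins is my own interpretation, not something the link confirms.

```python
# Back-of-envelope GT200 card economics using the numbers from the post:
# ~$110 for the chip plus ~$140 for board, RAM and cooler (~$250 total cost).
# Treating the 60% / 45% figures as markups is my own assumption.

chip_cost = 110.0
board_cost = 140.0
total_cost = chip_cost + board_cost               # ~$250 bill of materials

nv_markup = 0.60                                  # assumed nV markup over cost
retail_markup = 0.45                              # assumed retail/channel markup

nv_sell_price = total_cost * (1 + nv_markup)          # ~$400 to the channel
retail_price = nv_sell_price * (1 + retail_markup)    # ~$580 at the shelf

print(f"nV's card price to the channel: ~${nv_sell_price:.0f}")
print(f"Implied retail price: ~${retail_price:.0f} (vs. the rumoured ~$600)")
```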
 
If the TDP is over or near 225W, 6+8 pin is required, and at the moment it looks like that will be the case.
I thought about that when the mechanical drawings were released, and I guess it's going to depend on whether the user has a PCIe 2.0 motherboard. If that's the case, then just the 8-pin connector, or 2x 6-pin connectors, would be sufficient.

I guess they're assuming anyone who's willing to plunk down whatever money they're going to charge for it will have a motherboard and/or PSU to match.
 
No, PCIe 2.0 is still 75W per x16 slot.
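
To spell out why a TDP at or near 225W pushes you to 6+8: the spec budgets are 75W from the x16 slot, 75W per 6-pin and 150W per 8-pin. A quick tally of those limits (just the spec numbers, not board measurements):

```python
# PCIe power budgets per the spec: 75 W from the x16 slot,
# 75 W per 6-pin auxiliary connector, 150 W per 8-pin connector.
SLOT_W = 75
PIN6_W = 75
PIN8_W = 150

configs = {
    "slot only": SLOT_W,
    "slot + 6-pin": SLOT_W + PIN6_W,           # 150 W
    "slot + 6 + 6": SLOT_W + 2 * PIN6_W,       # 225 W
    "slot + 6 + 8": SLOT_W + PIN6_W + PIN8_W,  # 300 W
}

for name, watts in configs.items():
    print(f"{name:>14}: {watts} W budget")
# A TDP at or near 225 W leaves no headroom on 6+6, which is why 6+8 looks likely.
```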
 
http://www.incrysis.com/forums/viewtopic.php?id=20383

It seems likely that 6+6 power supplies will be fine - people who've been running 9800GX2s have already crossed this bridge.

This thread muddies the waters a bit:

http://www.evga.com/forums/tm.asp?m=297186

but it appears that 8-pin connectors on power supplies that provide 6+8 plugs configure the extra 2 pins to connect to a single ground wire.

So it appears that the 8-pin configuration is solely to reduce the amount of current passing through any one ground pin within the PCI Express connector. It doesn't alter the amount of current in the ground wire. So an adaptor that converts 6-pin to 8-pin is functionally equivalent - EDIT: although not strictly electrically equivalent - and this is arguably why an adaptor is considered "unsafe".

So the only remaining question is whether the power supply itself is beefy enough. I dare say any SLI-certified PS is up to the job, since 2x8800GTX will draw more power than a GTX 280.

Jawed
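
For anyone who wants the per-pin arithmetic behind that: at the 8-pin's 150W rating the total +12V current is about 12.5A, and spreading the return over ~5 ground/return pins instead of ~3 is what drops the current per pin. A rough sketch; the pin counts are the usual connector layout, an assumption for any particular PSU.

```python
# Per-pin current at the 8-pin connector's 150 W rating.
# Pin counts below are the usual connector layout (an assumption for any
# particular PSU); the PSU side may still share a single ground wire,
# as the EVGA thread suggests.

def amps_per_pin(watts, volts, pins):
    return watts / volts / pins

POWER_W = 150.0   # what an 8-pin connector is rated to carry
VOLTS = 12.0

print(f"Total +12V current at {POWER_W:.0f} W: {POWER_W / VOLTS:.1f} A")
print(f"Per ground pin, 6-pin style (~3 returns): {amps_per_pin(POWER_W, VOLTS, 3):.1f} A")
print(f"Per ground pin, 8-pin style (~5 returns): {amps_per_pin(POWER_W, VOLTS, 5):.1f} A")
```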
 
How can it be 100mm² bigger, and on 65nm, and only be 1.1 billion transistors? Surely it needs to be in the 1.5-1.6 billion ballpark?

Sorry if that's been covered in this thread and I missed it.
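
One way to sanity-check the 1.1 billion rumour is to scale from G92's density. Rough sketch below; G92's commonly quoted ~754M transistors on a ~324mm² 65nm die, and the rumoured ~576mm² GT200 die size, are assumptions on my part, not confirmed figures.

```python
# Sanity check on the transistor-count rumour by scaling from G92's density.
# G92's ~754M transistors on a ~324 mm^2 65nm die, and the rumoured
# ~576 mm^2 GT200 die size, are assumptions here, not confirmed figures.

g92_transistors_m = 754.0                    # millions
g92_area_mm2 = 324.0
density = g92_transistors_m / g92_area_mm2   # ~2.3 M transistors per mm^2

gt200_area_mm2 = 576.0                       # rumoured die size
estimate_m = gt200_area_mm2 * density

print(f"G92 density: ~{density:.2f} M transistors/mm^2")
print(f"A {gt200_area_mm2:.0f} mm^2 die at that density: ~{estimate_m / 1000:.2f} B transistors")
# ~1.3-1.4 B, which is why 1.1 B sounds low for a die that big.
```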
 
Thanks for the links, Jawed. I was worried about being able to use my HX620, but according to JonnyGuru's post in that second link (the EVGA thread), it seems buying a new PSU would accomplish nothing.
 
Do you guys think we could expect some kind of performance-mainstream or mainstream GPUs based on the GT200 architecture and a smaller process (55nm or 40nm) this year?
 
Too bad, because I doubt that G92b could be a decent competitor to RV770, especially when RV770 is said to be around 20-30% faster than the 9800 GTX (so G92, and we know that G92b = G92).

Maybe. Maybe not. A 55nm G92 should (in theory) be able to clock both its core and shader domains quite a bit higher.
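
Just to put numbers on how big a clock bump that would have to be: a rough sketch, assuming 9800 GTX reference clocks of 675/1688 MHz and (very optimistically) that performance scales linearly with clocks.

```python
# How big a clock bump would a G92b need to close a 20-30% gap, assuming
# (very optimistically) that performance scales linearly with clocks?
# 9800 GTX reference clocks assumed: 675 MHz core, 1688 MHz shader.

core_mhz, shader_mhz = 675, 1688

for gap in (0.20, 0.30):
    print(f"To close a {gap:.0%} gap: "
          f"~{core_mhz * (1 + gap):.0f} MHz core, "
          f"~{shader_mhz * (1 + gap):.0f} MHz shader")
# Memory bandwidth wouldn't scale with it, so treat this as a best case.
```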
 