NVIDIA GT200 Rumours & Speculation Thread

I think G80 is clearly the most successful GPU architecture in many years, even more so than R300.

What makes me smile is all the flak some people gave early adopters of the GF8, claiming that buying a first-gen card was a mug's game.

Lol, I wonder how many of those people are now purchasing cards 18 months later that are barely faster (or even slower) than the same cards they were trash-talking at launch.

Still glad you waited guys? ;)


Well, yeah... they're a lot cheaper. Especially price/performance products like the GT, especially in SLI.
 
The current G92 is made on TSMC and UMC 65nm nodes, while this new "B" version will be a "simple" die shrink to the 55nm half node, likely with substantially higher clocks at the same TDP.

I thought the rule of thumb was that half nodes bring more of a financial improvement (through smaller dies) than a performance improvement for a given design?
 
I thought the rule of thumb was that half nodes bring more of a financial improvement (through smaller dies) than a performance improvement for a given design?

G70@110nm = 334mm2
7800GTX 512 (call me hard to find at launch) = 550MHz
7800GTX 256 = 430MHz

G71@90nm = 196mm2
7900GTX = 650MHz
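Quick back-of-the-envelope on that, in case anyone wants to play with the numbers. Under ideal optical scaling, die area shrinks with the square of the node ratio; real chips never quite hit that, and the G92 area used for the 55nm case below is just the commonly quoted ~324mm2, so treat it as an assumption rather than a spec:

```python
def ideal_shrink_area(old_area_mm2, old_node_nm, new_node_nm):
    """Ideal die area after an optical shrink: scales with the square of the node ratio."""
    return old_area_mm2 * (new_node_nm / old_node_nm) ** 2

# G70 (110nm, 334mm^2) -> 90nm: ideal ~224mm^2; G71 actually landed at 196mm^2,
# but G71 also trimmed some logic, so it isn't a pure optical shrink.
print(round(ideal_shrink_area(334, 110, 90)))   # ~224

# Hypothetical G92 -> 55nm "G92B", using the commonly quoted ~324mm^2 for G92 (assumption):
print(round(ideal_shrink_area(324, 65, 55)))    # ~232
```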
 
Well, yeah... they're a lot cheaper. Especially price/performance products like the GT, especially in SLI.

The GTX isn't really that much cheaper. But yes, overall, there is a cost saving. However, putting up with a much slower GPU for 18 months, all for the sake of $100 or so? Not my idea of a great plan.
 
G70@110nm = 334mm2
7800GTX 512 (call me hard to find at launch) = 550MHz
7800GTX 256 = 430MHz

G71@90nm = 196mm2
7900GTX = 650MHz
That's not a half-node shrink. 130nm, 90nm, 65nm are the full nodes, 110nm, 80nm, 55nm are the half-nodes.
 
That's not a half-node shrink. 130nm, 90nm, 65nm are the full nodes, 110nm, 80nm, 55nm are the half-nodes.

It's not that simple.
If the high-performance version of a given half node is required for a GPU, it may end up being more expensive than the full node it derives from.

For instance, the TSMC 55nm half node comes in a bunch of options, which will determine the final price regardless of the die size of any given chip.

The 65nm full node has even more options, and it's likely also cheaper than the 55nm half node with equivalent options, so the balance between the price Nvidia or AMD must pay TSMC, yield rate and/or time to market, volume of orders and overall die size is always extremely volatile.

So I'm pretty sure that the 110nm half node version used on G70 was the high-performance one, while NV43 used a more economical version of the same half node.
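To illustrate why that balance matters, here's a minimal sketch of the usual per-good-die arithmetic. Every number in it (wafer price, defect density, die areas) is made up purely for illustration, not an actual TSMC or Nvidia figure:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross-die estimate: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_mm2):
    """Simple Poisson yield model: exp(-area * defect_density)."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

def cost_per_good_die(wafer_cost, die_area_mm2, defects_per_mm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2, defects_per_mm2)
    return wafer_cost / good_dies

# Illustrative only: a pricier 55nm wafer can still win per good die
# if the shrink cuts the die area enough and yields hold up.
print(cost_per_good_die(wafer_cost=5000, die_area_mm2=324, defects_per_mm2=0.002))  # "65nm" case
print(cost_per_good_die(wafer_cost=6000, die_area_mm2=230, defects_per_mm2=0.002))  # "55nm" case
```

Obviously the real contract numbers are under NDA; the point is just that a more expensive half-node wafer with more options can still come out ahead per good die if the shrink and the yields cooperate.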
 
Sure, sure... that girl(?) over at AT also knows the password for Area 51 and where Jimmy Hoffa's earthly remains are located, among other things :D
 
Hmm, such a high CPU score with a C2Q clocked at only 2.67GHz? Does anybody think this score could be true? ;)

[image: usrwp2.jpg]


My Q6600 @ 333x9 = 3000MHz scores 4700 points in the 3DMark06 CPU test under WinXP 32-bit.

I would say it's a fake score.
 
Hmm, such a high CPU score with a C2Q clocked at only 2.67GHz? Does anybody think this score could be true? ;)

I didn't know that GT200 was x86-capable and could help out your CPU in every app ;) :LOL:

The claimed 2560x1600 can also easily be exposed by the visible 4 -> 1280x1024.

But since some GT200 boards are supposedly around, just not public, I would think GPU-Z should be able to show the GPU and device ID.
 
Hmm, such a high CPU score with a C2Q clocked at only 2.67GHz? Does anybody think this score could be true? ;)

No. The CPU score is fucked up (too high): a 4GHz Penryn gives about 5900, more or less, and a 3GHz one gives 4597, so the odds of an old C2Q scoring what it scores in that pic are non-existent. Oh, wait, she's probably testing it on an 8-core Nehalem... gah, that doesn't work either, as the score would be too flimsy. It's a load of crock spilled by someone crying for attention :).
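For anyone who wants to sanity-check that, a quick sketch that just scales the scores quoted in this thread linearly with clock speed (linear scaling is only a rough approximation, and the reference points are forum numbers, not official ones):

```python
# Crude sanity check: for the same core count/architecture, the 3DMark06 CPU
# score scales roughly with clock, so extrapolate from known data points.

def estimate_score(known_score, known_clock_ghz, target_clock_ghz):
    return known_score * target_clock_ghz / known_clock_ghz

# Reference points quoted in this thread (treat as rough, not official):
# Q6600 @ 3.0GHz ~ 4700, quad Penryn @ 3.0GHz ~ 4597.
print(round(estimate_score(4700, 3.0, 2.67)))   # ~4183 - roughly where a 2.67GHz C2Q should land
print(round(estimate_score(4597, 3.0, 2.67)))   # ~4091
```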
 
Interesting.
Now if only this G92B could use GDDR5 memory... :-)

This is why I say it's going to be a while before G100 is out ;)
From what I heard, G100 should be quite a bit faster than anything G92-based, so a 55nm G92 shrink won't be a problem for G100 to handle.
If anything, this shrink will help them maintain a more solid line-up with less of a hole between G100 and G92...
 
Interesting.

From what I heard, G100 should be quite a bit faster than anything G92-based, so a 55nm G92 shrink won't be a problem for G100 to handle.
If anything, this shrink will help them maintain a more solid line-up with less of a hole between G100 and G92...

Hmm, quite a bit faster than G92? ;) But how much? You can tell us :D I hope at least 50% more.

The second question is when G100 will be released. If a G92B is in the plans, then the G100 release date could slip to 2009 :(
 
Hmm, quite a bit faster than G92? ;) But how much? You can tell us :D I hope at least 50% more.
G92(A) vs GT200 ~ 7900GTX vs 8800 GTX... ;)

But the competition doesn't sleep either, and R700XT is also supposed to be a more than significant upgrade over RV670XT.
 