Nvidia GT300 core: Speculation

Simple reasoning would tell you that if the ballpark in terms of performance between GT212 and GT300 is large enough, there's no particular reason to skip anything.
Forgive me for losing track here, but which GT212 are we talking about? The straight shrink of the GT200 design, or one of the enhanced versions? I was assuming the GT212 was pretty much a straight shrink. Performance would hopefully increase more than it did for the 55nm shrink but nothing spectacular.

212 appears hypothetically at timeframe X as the single-chip high end and slips down to the "performance" segment as soon as 300 hits the shelves. IMHLO the die size of the GT212 should be ~300mm2 despite its rumoured increase in clusters/SPs, meaning they can easily sell such a chip as a performance part and later on as a mainstream part without a problem.

Obviously they did the above last year with the GT200 above and the G92 below. But I was thinking designing a new performance part off the GT300 would be a more sensible strategy, à la G92 and G94. 300mm2 is prohibitive at 40nm for the moment; the leakage and variability are too great. We may need to wait 6 months or so to see a chip of that size.
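For what it's worth, here's a quick back-of-envelope check of that ~300mm2 guess. Everything in it is an assumption (the commonly cited GT200 die size, the rumoured ALU counts, and a purely hypothetical guess at how much of the die the shader array occupies), so treat it as a sketch, not a spec:

```python
# Rough die-size sanity check; all inputs are assumed/rumoured figures.
gt200_area_65nm = 576.0           # mm^2, commonly cited for GT200 at 65nm
ideal_scale = (40.0 / 65.0) ** 2  # ideal optical shrink 65nm -> 40nm, ~0.379

shrunk_area = gt200_area_65nm * ideal_scale
print(f"ideal 40nm shrink of GT200: {shrunk_area:.0f} mm^2")

# Grow the shader array by the rumoured ALU count (384 vs GT200's 240),
# assuming (hypothetically) the ALUs take up about half the die:
alu_growth = 384.0 / 240.0        # 1.6x
shader_fraction = 0.5             # assumed, not a real figure
grown_area = shrunk_area * (shader_fraction * alu_growth + (1.0 - shader_fraction))
print(f"with 1.6x the ALUs: {grown_area:.0f} mm^2")
```

Since real shrinks scale worse than the ideal (pads, analog and memory interfaces don't shrink), the result drifting up toward ~300mm2 seems plausible.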
 
Forgive me for losing track here, but which GT212 are we talking about? The straight shrink of the GT200 design, or one of the enhanced versions? I was assuming the GT212 was pretty much a straight shrink. Performance would hopefully increase more than it did for the 55nm shrink but nothing spectacular.

GT212 has 384 ALUs on 40nm. Read the earlier posts in this thread.
 
GT212 has 384 ALUs on 40nm. Read the earlier posts in this thread.
Which doesn't seem likely.
It might happen but it would be a strange move from Nvidia.

There are quite a few alternate explanations that would be more realistic than this first 40nm tweaked/enlarged G200.
 
Which doesn't seem likely.
It might happen but it would be a strange move from Nvidia.

Why?

There are quite a few alternate explanations that would be more realistic than this first 40nm tweaked/enlarged G200.

This "GT212" or whatever it's actually called shouldn't be their first 40nm chip this year anyway.
 
With that rumored SP count GT212 will be a huge chip even at 40nm, which is incidentally a new process. Supposing that's true, it leaves little room for GT300 to improve unless one expects an RV770-type miracle in less than two quarters. This also departs from the very sensible tick-tock (G80-G92) style Nvidia has employed.

I concur with the post above that the G2xx family is generally six months late. GT200 was supposed to be out in early 2008, and it makes the most sense that a shrink like GT212 (or whatever it's called) should have been out about now.
 
Charlie says Nvidia is delaying their 40nm parts from Q2 to Q3 and the GT212 is dead. Speculation.
 
Kind of makes sense, doesn't it?
The market is shifting downwards in both margins and gross volume, and Nvidia is in need of a G92/G92b replacement first and foremost.
Not only is it almost 18 months old now, but it still is very far from even the GTX 260, performance-wise, and even at 55nm, the G92 and G94 cores will never be as cheap to make as a RV740 at 40nm, for instance.
If reports of industry-wide losses are officially confirmed, cost-cutting should set in as soon as possible here too.

GT214 should be the top priority part for the mainstream market. With the GTX 285 and GTX 295 out now and filling that role, it's not like the GT212 was a true necessity just 3 or 4 months later...
GTX 260 (the original, 192 SP part) performance sold at 9600 GT price points - or lower - is what the market wants, I believe.
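A rough dies-per-wafer comparison illustrates that cost gap. The die areas below are assumptions taken from period rumours (G92b at 55nm vs an RV740-sized die at 40nm), not official figures, and the formula is just the classic approximation, ignoring yield:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Classic approximation: wafer area over die area, minus an
    edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

print(dies_per_wafer(231.0))  # G92b-sized die at 55nm (assumed area)
print(dies_per_wafer(137.0))  # RV740-sized die at 40nm (assumed area)
```

Even before yield differences, the smaller 40nm die gets roughly 1.75x as many candidate dies per wafer, which is the whole cost argument in a nutshell.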


Then again, Charlie also talked about a supposed Nvidia/Microsoft phone to be shown at CES in Las Vegas, and we all know how that turned out.
Maybe he meant "CES 2010"... ;)
 
Charlie says Nvidia is delaying their 40nm parts from Q2 to Q3 and the GT212 is dead. Speculation.
Are the delays to 40nm technical or commercial?

If technical, that could be more of the same trouble that 65nm and 55nm have been.

Alternatively, maybe the channel is saying "don't rush into 40nm, we've got piles of 55nm chips that will last a long time".

Jawed
 
Feels like FUD to me; GT216 must surely have taped out by now, and they'd still be in time for Q2 if they respinned it next month or possibly even later. So if this is true, it'd imply either that no 40nm chip has taped out yet (contradicted by TSMC's claims, IMO) or that the initial tape-out is a G98-like disaster where the damn thing is back, won't even boot, and can't be tested for a hilariously dumb reason.

I could perfectly believe that GT212 was pushed to Q3 while GT214 was canned and replaced by GT215 (also in Q3). What Charlie claims there, however, seems extremely fishy to me. Alternatively, maybe they're confident they'll get GT300 out by 2H09 and cancelled GT212 as a result - honestly not sure how that'd even be a negative, heh. Remember those claims that a certain G7x derivative was canned because performance wasn't impressive enough compared to a GX2 and G80 was coming anyway... Those were likely at least partially right, FWIW.

WRT the ZunePhone/Tegra thing, he claimed MWC in February, not CES... And he could very well be right, although also horribly wrong at the same time! ;) Too bad Charlie's jaw will probably drop when he realizes it doesn't harm their Windows Mobile partners and battery life is way better than an iPhone. Oh well, whatever, he'll probably claim he was right anyway. Good ole Charlie...
 
Alternatively, maybe they're confident they'll get GT300 out by 2H09 and cancelled GT212 as a result - honestly not sure how that'd even be a negative, heh.
I'm pretty sure they canceled GT212 because Windows 7 and DirectX 11 are early. Win7 looks like a Q3 release right now. Therefore it makes sense to fast-track the DX11 (GT300) chip, right?
Maybe ATI's DX11 part is coming early and NV is afraid that no-one will buy their DX10 chip when ATi offers a DX11 chip with similar performance?
 
So if this is true, it'd imply either that no 40nm chip has taped out yet (contradicted by TSMC's claims, IMO) or that the initial tape-out is a G98-like disaster where the damn thing is back, won't even boot, and can't be tested for a hilariously dumb reason.

Pardon my ignorance, but what happened to G98? If you feel it is way too OT, please PM me.
 
I'm pretty sure they canceled GT212 because Windows 7 and DirectX 11 are early. Win7 looks like a Q3 release right now. Therefore it makes sense to fast-track the DX11 (GT300) chip, right?
Since when can you fast-track a product by several months just out of sheer willpower? :p
Maybe ATI's DX11 part is coming early and NV is afraid that no-one will buy their DX10 chip when ATi offers a DX11 chip with similar performance?
Now here you might have put the finger on something... If NVIDIA had heard AMD might be early to DX11 several months ago, then it is possible that they might have started allocating more resources to that some time ago, and not just now that Win7 is looking like it's on the fast-track itself. Whether that can make a big difference on such a big project is another question entirely...

WRT G98, let's just say I heard from several people that A11 didn't even boot up, and leave it at that, okay? :) So they had to do a new tape-out and then they needed a respin (as you'd expect, really) and so forth...
 
Feels like FUD to me; GT216 must surely have taped out by now, and they'd still be in time for Q2 if they respinned it next month or possibly even later.
Yeah, it's a good point, with the end of Q2 over 5 months away.

So if this is true, it'd imply either that no 40nm chip has taped out yet (contradicted by TSMC's claims, IMO) or that the initial tape-out is a G98-like disaster where the damn thing is back, won't even boot, and can't be tested for a hilariously dumb reason.
What TSMC claims?

It still doesn't rule out that NVidia's been told to slow down by the channel, does it?

Jawed
 
What TSMC claims?
That their 3 lead customers are AMD, NVIDIA and Altera. And I've seen musings that Altera will be the last, and they've already got samples... Of course, I could be reading too much into that, or it might simply be wrong.

It still doesn't rule out that NVidia's been told to slow down by the channel, does it?
Absolutely, although I wouldn't say they told NV to "slow down"... I think it's more likely they told them "**** you" after stuffing the channel with 65nm inventory in a blind hope of being able to transition fully to 55nm.

The problem now is that NVIDIA has lots of inventory left and they probably don't see how to get rid of it. However, given that GT216 basically makes G92/G94 obsolete (and G96/GT200 to a lesser extent; they won't be in the same perf ballpark, but I don't see how perf/$ will be attractive for those two with GT216 on the market), I don't really see what they can do. If they delay it, they won't be able to get rid of that inventory at reasonable prices anyway due to RV740...

So this doesn't make any sense to me. I can see non-GT216 chips having been delayed/canned for both financial and engineering reasons (GT214 being canned and switching to a GDDR5 GT215 would make sense, for example), but I don't see how you get away from GT216 being released in Q2 and meeting the OEM back-to-school cycle. I could see how they might start shipping to OEMs earlier than to the channel, though... (à la RV610/RV630, although probably earlier)

Of course there is always the possibility that GT216 will get delayed sometime between now and Q2; my only point is that the roadmap for it likely hasn't changed much if at all, because if they needed a second respin they wouldn't know it by now anyway.

EDIT: Oh, one last thought: maybe GT212 is canned, but not because of GT300 or anything like that; maybe enthusiast-level sales are just so bad because of the economic crisis that they feel it's just not worth the money to keep working on it and even tape it out.
 
Since when can you fast-track a product by several months just out of sheer willpower? :p
I heard they booked a Uri Geller course for the design team. ;)
Now here you might have put the finger on something... If NVIDIA had heard AMD might be early to DX11 several months ago, then it is possible that they might have started allocating more resources to that some time ago, and not just now that Win7 is looking like it's on the fast-track itself. Whether that can make a big difference on such a big project is another question entirely...
The rumors about Win7's early arrival popped up pretty early, around October last year. I presume NVIDIA is well connected and heard or was informed even earlier.
 
The rumors about Win7's early arrival popped up pretty early, around October last year. I presume NVIDIA is well connected and heard or was informed even earlier.

Even so, I strongly doubt that Win7 will go RTM before late summer, at best.
That's smack in the middle of Q3, in line with the hypothetical GT300 release.
How much do you think sales would be affected by releasing it simultaneously with Win7?

AMD sure couldn't rely on DX11 support alone if visible speed increase per dollar wasn't there too.
I mean, G80 sold months before Vista RTM (retail, at least), not because it supported DX10, but because it was a beast running DX9 games against anything else on the market at the time.
 
Which, of course, raises the question: whose architecture is currently closer to what the D3D11 GPUs will deliver?

Jawed
 
Even so, I strongly doubt that Win7 will go RTM before late summer, at best.
That's smack in the middle of Q3, in line with the hypothetical GT300 release.
How much do you think sales would be affected by releasing it simultaneously with Win7?
I thought it was anticipated that GT300 is to be released in Q4? Anyway, the real question is how much an early Win7 release would have hurt GT212. I think that especially in economically tough times people want "future-proof", and GT212 would only have offered better performance, not DX11 support. A card in that price range is an investment even to the enthusiast crowd.

AMD sure couldn't rely on DX11 support alone if visible speed increase per dollar wasn't there too.
I mean, G80 sold months before Vista RTM (retail, at least), not because it supported DX10, but because it was a beast running DX9 games against anything else on the market at the time.
IMO one reason why G80 sold as well as it did was because it was "future-proof". People knew Vista and DX10 were around the corner, and there was a moderate amount of hype around them. DX10 support and good performance made G80 a no-brainer. I think NVIDIA is keen on repeating this, especially in light of the fact that Win7 will probably get much better marks than Vista.
 