Nvidia GT300 core: Speculation

Economic slump and suspected competitor parts. G300 started out as "the next $650 or $700 product."

I would suspect that sometime last year the plans were drastically changed, when they started on 40nm and saw what happened to the price spectrum. Postponing its launch a couple of months to re-engineer (cost-cut) seems a logical step. That's my reasoning behind the 2010 launch.
I suspect a re-engineering that would provide any significant amount of cost cutting would take far, far longer than a couple of months.
 

What if this started over two years ago, after it became clear that whatever road they took with G200 didn't lead them where they wanted? Then they got word of RV770 and were sure that postponing G300 was the way to go. G200 was supposed to launch at 55nm last year; 65nm was a backup plan. I could see the original G300 being slated for Q2 2009; they would've lost just eight months by deciding to change early.
 
Except the G8x line did very, very well for nVidia. Aside from some engineering mistakes with some laptop parts, they really have made out like bandits with these chips. They may or may not have decided to make a change based upon the G200's performance, but as that was basically a refresh part, they may not have had tremendous expectations for it either.
 
Chalnoth: Any graphics card that has no competition is successful. G80 was successful because R600 was late, power hungry, had broken ROPs, and its manufacturing process was too leaky to allow production of a higher-clocked model.

If the one-eyed man competes with the blind, he's bound to win in every respect...

From a technological standpoint, GT200 is a similar kind of product to G80. But GT200 (unfortunately for nVidia) has a competitor.
 
Well said. It's hard to judge the engineering of something w/o competition, although you can look at generation on generation improvement in various metrics.

GT200 definitely had some, especially in the area of power efficiency.

DK
 
I remember how GT200 fared in the early reviews versus two G92 chips in SLI.

G80 and friends' overall awesomeness seemed to really deaden the enthusiasm for GT200, particularly at the price point initially targeted.
 
Economic slump and suspected competitor parts. G300 started out as "the next $650 or $700 product."

I would suspect that sometime last year the plans were drastically changed, when they started on 40nm and saw what happened to the price spectrum. Postponing its launch a couple of months to re-engineer (cost-cut) seems a logical step. That's my reasoning behind the 2010 launch.

And your reasoning tells you that NV was so insane as to plan a, say, 2.4B-transistor chip at 55nm before? No one can be sure, but not once during 2008 did I hear that their X11 chip wouldn't be on 40nm; rather the contrary.

What if this started over two years ago, after it became clear that whatever road they took with G200 didn't lead them where they wanted? Then they got word of RV770 and were sure that postponing G300 was the way to go. G200 was supposed to launch at 55nm last year; 65nm was a backup plan. I could see the original G300 being slated for Q2 2009; they would've lost just eight months by deciding to change early.

65nm a backup plan for GT200? Where do they tell those fairy tales these days? The easiest way to find out why any D3D11 GPU isn't on shelves yet is to look in TSMC's 40nm direction and nothing else, IMHO.

Apart from all that, does anyone know for sure yet when they'll be able to release it?
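
For a rough sense of scale on that 2.4B-at-55nm point, here's a back-of-the-envelope sketch in Python. It uses GT200's known figures (~1.4B transistors at ~576 mm^2 on 65nm, roughly 470 mm^2 as GT200b on 55nm) as a density anchor; the 2.4B transistor count itself is just this thread's speculation, not a confirmed spec.

```python
# Back-of-the-envelope die-area estimate for a hypothetical 2.4B-transistor
# chip at 55nm, anchored to GT200's published figures: ~1.4B transistors on
# a ~576 mm^2 die at 65nm, shrinking to roughly 470 mm^2 at 55nm (GT200b).
# The 2.4B figure is this thread's speculation, not a confirmed spec.

GT200B_TRANSISTORS = 1.4e9   # GT200/GT200b transistor count
GT200B_AREA_MM2 = 470.0      # approximate GT200b die area at 55nm

speculated_transistors = 2.4e9  # hypothetical next-gen chip from the thread

# Assume the hypothetical chip reaches the same logic density as GT200b.
density_per_mm2 = GT200B_TRANSISTORS / GT200B_AREA_MM2
estimated_area_mm2 = speculated_transistors / density_per_mm2

print(f"Estimated die area at 55nm: {estimated_area_mm2:.0f} mm^2")
# -> ~800 mm^2, enormously larger than anything shipping at the time,
#    which is why a 55nm plan for such a chip looks implausible.
```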
 
And your reasoning tells you that NV was so insane as to plan a, say, 2.4B-transistor chip at 55nm before? No one can be sure, but not once during 2008 did I hear that their X11 chip wouldn't be on 40nm; rather the contrary.

ehm, no .. that's not what I said or thought.


65nm a backup plan for GT200? Where do they tell those fairy tales these days?

That was pretty obvious already, right? The G200b's used for the FX5800's were produced in July; they were revision 2. If the first revision of G200b had been usable, it would've been ready before GT200 launched, seeing how the dates on the G200-A2 package would be about the same as a speculative G200-B1.
 
ehm, no .. that's not what I said or thought.

Care to elaborate then? Because I can't figure out any other scenario that would make their X11 chip more cost-effective, since it sounds so far like quite a complex and sophisticated architecture overall.

That was pretty obvious already, right? The G200b's used for the FX5800's were produced in July; they were revision 2. If the first revision of G200b had been usable, it would've been ready before GT200 launched, seeing how the dates on the G200-A2 package would be about the same as a speculative G200-B1.

No, it wasn't and isn't obvious at all. The story about GT200@65nm has been mentioned in these forums more than once.

Your rather weird scenarios suggest that any IHV can change to process X or Y on short notice. I haven't the vaguest clue what exactly you mean by revisions, but if you mean A1/B1 silicon, figure out for yourself why it couldn't have been usable under usual conditions.
 
Care to elaborate then? Because I can't figure out any other scenario that would make their X11 chip more cost-effective, since it sounds so far like quite a complex and sophisticated architecture overall.

Indeed, but the market for a complex and sophisticated chip at $700 is all but gone; people want that product at $300 now, probably less.

No, it wasn't and isn't obvious at all. The story about GT200@65nm has been mentioned in these forums more than once.

Your rather weird scenarios suggest that any IHV can change to process X or Y on short notice. I haven't the vaguest clue what exactly you mean by revisions, but if you mean A1/B1 silicon, figure out for yourself why it couldn't have been usable under usual conditions.

Eh? Change on short notice? I'm talking about two years of development here, not two months.
65nm and 55nm G200 were produced so soon after one another that B1 was packaged at the same time as A2. It seems 55nm was going well enough for G9x that production of G200 at 55nm was given preference; G200 wasn't delayed for more than half a year.

This is something they saw coming for quite a while, just like they didn't get 40nm chips out as soon as they could have.
 
Eh? Change on short notice? I'm talking about two years of development here, not two months. 65nm and 55nm G200 were produced so soon after one another that B1 was packaged at the same time as A2. It seems 55nm was going well enough for G9x that production of G200 at 55nm was given preference; G200 wasn't delayed for more than half a year.

Hmmmm so you're saying that 65nm was a backup plan but they decided to switch to the backup plan 2 years before launch? That's a bit of fantasy right there.

Why doesn't the much more obvious and realistic scenario of 65nm GT200 being delayed make sense to you? This scenario makes a lot more sense in the context of the mess they made of G92's debut.

And the market for expensive graphics cards hasn't gone anywhere. The current competitive landscape (which has only existed for about a year) isn't yet an established trend. This time around cheap was king because the expensive stuff wasn't any faster. You can make assumptions about how that's going to hold up in the future, but I don't see any indication of it becoming law just yet.
 
Indeed, but the market for a complex and sophisticated chip at $700 is all but gone; people want that product at $300 now, probably less.

1. Despite the couple of weeks GTX280 stayed after launch at roughly $600 (which is quite a notch below your exaggerated 700 bucks), it dropped quickly to far more reasonable prices. In other words they can sell those for way less, and oh well, ask me if I care as a consumer whether the IHV makes X% less profit on that one, or none at all, in the end.

2. Workstation and server markets; need I really say more, or isn't it a fact that NV has dominated those markets for years now? In retrospect it can also mean that a high amount of expenses can be amortized here, even more so since we're talking way higher than $700 per high-end board, let alone Tesla clusters.

Where again is what "all but gone", and from which narrow and shortsighted perspective exactly?


Eh? Change on short notice? I'm talking about two years of development here, not two months.
65nm and 55nm G200 were produced so soon after one another that B1 was packaged at the same time as A2. It seems 55nm was going well enough for G9x that production of G200 at 55nm was given preference; G200 wasn't delayed for more than half a year.

This is something they saw coming for quite a while, just like they didn't get 40nm chips out as soon as they could have.

You do realize though that your theory doesn't make much sense, don't you? I won't doubt that 65nm GT200 faced a delay for whatever reason, but that still doesn't mean that 55nm GT200b development started later than its 65nm predecessor's. How and why the 65nm part can fit into that picture as a hypothetical "backup plan" is beyond my imagination.

Better yet, let's oversimplify things: I fail to see what you're trying to get at.
 
Hmmmm so you're saying that 65nm was a backup plan but they decided to switch to the backup plan 2 years before launch? That's a bit of fantasy right there.
Eh? WTF? What are you smoking? GT200 a backup plan for GT300? I was talking about G200/G200b, not a 65nm G300 or a 40nm G200 as a backup for that.

Why doesn't the much more obvious and realistic scenario of 65nm GT200 being delayed make sense to you? This scenario makes a lot more sense in the context of the mess they made of G92's debut.

I'm more along the lines that they didn't need to push G200, because they knew nothing new was coming from ATI until RV770. G200 was there to counter anything ATI could launch while priority was given to 55nm. If anything, production of G200b went quite well; B1 was finished before G92-B1.

And the market for expensive graphics cards hasn't gone anywhere. The current competitive landscape (which has only existed for about a year) isn't yet an established trend. This time around cheap was king because the expensive stuff wasn't any faster. You can make assumptions about how that's going to hold up in the future, but I don't see any indication of it becoming law just yet.

Maybe not from nV's side, but we already had some clear rumours about the cost of ATI's high-end part. If ATI's part launches at $300 ($25 below the cheapest GTX285) and is a good deal faster, a $530 price tag would not be justified for a GTX295, let alone a G300 that would perform like it or a bit above the 295. If we were to go back to the trend, G300 would launch at $700 and be beaten by two $300 products.

Graphics card pricing is in itself a short-term trend that cannot be ruled by "previous trend figures".
 
a $530 price tag would not be justified for a GTX295, let alone a G300 that would perform like it or a bit above the 295. If we were to go back to the trend, G300 would launch at $700 and be beaten by two $300 products.

The price of the "G300" (or whatever it's called) will be dictated by its performance and the competition's price/performance ratio compared to that. NV was forced to quickly adjust the street price for GT200 shortly after its launch, too.

If NV goes as far as to charge premium prices (and yes, that $700 figure is still way out of line IMHO), then it'll have to be quite a bit faster than whatever AMD delivers in its highest-end offering.

Finally, if their high-end X11 GPU performs merely a notch above today's GTX295, it would mean it's nothing more than a GT200 with more than twice the SPs and X11 slapped on.
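
To put that price/performance argument in concrete terms, here's a minimal sketch with entirely made-up numbers (the $300 rival price and the 40% performance lead are placeholders for illustration, not predictions):

```python
# Toy model of the pricing argument: a new part can only justify a premium
# roughly in proportion to its performance lead over the competition's
# price/performance. All numbers below are placeholders, not real specs.

def parity_price(new_perf: float, rival_perf: float, rival_price: float) -> float:
    """Price at which the new card merely matches the rival's perf per dollar."""
    return rival_price * (new_perf / rival_perf)

# Hypothetical scenario: rival high-end card at $300, new GPU 40% faster.
rival_price = 300.0
rival_perf = 100.0   # arbitrary performance index
new_perf = 140.0

print(f"Price/perf parity at ${parity_price(new_perf, rival_perf, rival_price):.0f}")
# -> $420: anything above that is a premium the raw performance alone
#    no longer justifies, which is the crux of the $700-vs-$300 dispute.
```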
 
1. Despite the couple of weeks GTX280 stayed after launch at roughly $600 (which is quite a notch below your exaggerated 700 bucks)
Thanks for getting your facts straight, mister; it was $655 and its price actually rose the first week!
[attached image: evgagtx280.gif — an EVGA GTX280 price chart]

And as you can read, I was talking about G300, which would (according to trend) be more expensive than G200 and thus launch around $700.

it dropped quickly to far more reasonable prices. In other words they can sell those for way less, and oh well, ask me if I care as a consumer whether the IHV makes X% less profit on that one, or none at all, in the end.

Indeed, we don't care... the R&D department obviously does.

2. Workstation and Server markets; need I really to say more than that or isn't it a fact that NV dominates those markets for years now? In retrospect it can also mean that a high amount of expenses can be amortized here, even more so since we're talking way higher than $700 per high end board, let alone Tesla clusters.
The workstation market has had an even worse shake-up than the desktop market over the last year. And the customers we have who are investing in GPGPU have actually gone for the cheaper approach and purchased desktop cards, while still linking their clustered machines with InfiniBand. Even with millions to spend, they are not THAT crazy.

You do realize though that your theory doesn't make much sense, don't you? I won't doubt that 65nm GT200 faced a delay for whatever reason, but that still doesn't mean that 55nm GT200b development started later than its 65nm predecessor's. How and why the 65nm part can fit into that picture as a hypothetical "backup plan" is beyond my imagination.

With parallel development tracks, a delay in the 65nm path and good results in the 55nm path (for both the 200 and the 92) would be the simplest reason. Why would nVidia put all its effort into G200 when G200b was merely weeks away from being completed as well?
 
The price of the "G300" (or whatever it's called) will be dictated by its performance and the competition's price/performance ratio compared to that. NV was forced to quickly adjust the street price for GT200 shortly after its launch, too.

If NV goes as far as to charge premium prices (and yes, that $700 figure is still way out of line IMHO), then it'll have to be quite a bit faster than whatever AMD delivers in its highest-end offering.

Finally, if their high-end X11 GPU performs merely a notch above today's GTX295, it would mean it's nothing more than a GT200 with more than twice the SPs and X11 slapped on.

Indeed, and if such a card were to appear next month (not saying it will), it would be feasible for nVidia and partners to charge $200 over the other premium part, the 295.

wait.. we agreed on something.. can I stop typing now? I'm tired :(
 
Thanks for getting your facts straight, mister; it was $655 and its price actually rose the first week!

Interesting hair-splitting nonetheless. It isn't 700, it's 655, no, 600... wait, did it or did it not drop within a short time?

And as you can read, I was talking about G300, which would (according to trend) be more expensive than G200 and thus launch around $700.

It could, yes, but will it? And yes, it's completely idiotic to assume any price before knowing the performance ballpark it'll play in.

Indeed, we don't care... the R&D department obviously does.

It remains to be seen why they haven't changed their strategy so far, if it's such a headache overall.

The workstation market has had an even worse shake-up than the desktop market over the last year. And the customers we have who are investing in GPGPU have actually gone for the cheaper approach and purchased desktop cards, while still linking their clustered machines with InfiniBand. Even with millions to spend, they are not THAT crazy.

Your customers obviously aren't any yardstick to go by. There's been a decline in the workstation market beyond doubt, but I haven't seen any groundbreaking changes in market share for those markets lately. Any funky graph you want to complement that with, too?

With parallel development tracks, a delay in the 65nm path and good results in the 55nm path (for both the 200 and the 92) would be the simplest reason. Why would nVidia put all its effort into G200 when G200b was merely weeks away from being completed as well?

That still doesn't make the 65nm part a backup solution. If they hadn't released the GTX280, they wouldn't have had anything (until the 55nm part was ready) to counter the 4870X2.

Indeed, and if such a card were to appear next month (not saying it will), it would be feasible for nVidia and partners to charge $200 over the other premium part, the 295.

In pure theory, if it were X% faster than the 295 and the competition had nothing to counter it by that time, yes, they could charge Y amount more for it. It doesn't look like it'll appear next month, or before AMD's X11 solution in any case, and no one knows yet how it'll perform. It could waltz all over a 295 with flying colours (wishful dreaming) or could even end up slightly slower in the end.

wait.. we agreed on something.. can I stop typing now? I'm tired

Did we really? If you really think so.... :rolleyes:
 
I remember how GT200 fared in the early reviews versus two G92 chips in SLI.

G80 and friends' overall awesomeness seemed to really deaden the enthusiasm for GT200, particularly at the price point initially targeted.

Don't you chime in to the "it's all about the bars' lengths" melody, pretty please? SLI and CrossFire oftentimes deliver longer bars, but worse compatibility and a worse user experience when it's crunch time.
 
Don't you chime in to the "it's all about the bars' lengths" melody, pretty please? SLI and CrossFire oftentimes deliver longer bars, but worse compatibility and a worse user experience when it's crunch time.

GT200 was given praise for being the fastest single-GPU solution, but even the reviews that noted the card's better consistency still tended to look wistfully at the longer peak-FPS bars of dual-GPU solutions.

I don't know how many review sites say "dual GPU is crap, so we're not going to include those in our charts", and I don't know how much of the enthusiast audience made the distinction, either.

If someone already had a dual-GPU setup, it was quite a price hike to transition once again.
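
For what it's worth, the "longer bars" effect is easy to show numerically: an AFR dual-GPU setup can post a higher average FPS than a single card while delivering visibly worse frame pacing. A toy sketch with invented frame-time traces (none of these numbers are from real benchmarks):

```python
# Why average-FPS bars can flatter AFR dual-GPU setups: alternating short and
# long frame intervals (microstutter) raise the FPS average, while perceived
# smoothness tracks the long intervals. Both frame-time traces are invented.

single_gpu_ms = [20.0] * 10    # steady 20 ms per frame -> 50 fps
dual_gpu_ms = [8.0, 25.0] * 5  # alternating 8/25 ms    -> ~61 fps average

def avg_fps(frame_times_ms):
    """Average frames per second over a trace of per-frame times in ms."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_ms(frame_times_ms):
    """Longest single frame interval, a crude proxy for perceived hitching."""
    return max(frame_times_ms)

for name, trace in [("single GPU", single_gpu_ms), ("dual GPU (AFR)", dual_gpu_ms)]:
    print(f"{name}: {avg_fps(trace):.0f} fps average, "
          f"worst frame {worst_frame_ms(trace):.0f} ms")
# single GPU:     50 fps average, worst frame 20 ms
# dual GPU (AFR): 61 fps average, worst frame 25 ms  <- longer bar, choppier feel
```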
 
Eh? WTF? What are you smoking? GT200 a backup plan for GT300? I was talking about G200/G200b, not a 65nm G300 or a 40nm G200 as a backup for that.

Something obviously much weaker than your stuff :LOL: Where did I say anything about GT300? You're the one making up the incredible hypotheses here....

Maybe not from nV's side, but we already had some clear rumours about the cost of ATI's high-end part. If ATI's part launches at $300 ($25 below the cheapest GTX285) and is a good deal faster, a $530 price tag would not be justified for a GTX295, let alone a G300 that would perform like it or a bit above the 295. If we were to go back to the trend, G300 would launch at $700 and be beaten by two $300 products.

Well, surely if you assume the best case for ATI and the worst case for Nvidia, like you're doing, then we'll see a repeat of the current situation. See, you assume ATI's new part is a blockbuster and that G300 will be lackluster (GTX 295 performance would be weak IMO). In that version of the world, Nvidia has no pricing power.

Now if I were to place a bet, I would actually lean towards a repeat of GT200 vs RV770, since we have reason to believe GT300 will have an even greater emphasis on general computation. So barring some revolution on the 3D software side, ATI's lean and mean approach should continue working just fine.
 