NVIDIA GT200 Rumours & Speculation Thread

Well, if you look only at nVidia's profits, then yes. It's kinda sad that people let themselves be ripped off like that.

We do?

I believe that is not the case. I would be happy to discuss it with you, however I don't think this is the right thread...

I believe you, but I don't agree. If they can already manufacture the cards, they could sell them at insanely high price points because of the lack of competition.

Me too (and unlike you, I'm being paid for it :) )

That may change soon for me; I will post while I am still free. =P

Agreed ... but I am new here and have no idea which thread.

As to GT200 being sent out early, no ... nVidia still has a lot of left-over cores they are dumping at bargain prices. The GX2 was simply created to kill the 3870x2, but the 9800 line is way cheaper to produce than the 8800 ever was. GT200 will also be cheap, and we should see GPUs based on it for at least another two years; it is nVidia's "new G80" and quite improved, I think.

. . . and I would ask: what other reason would ATi have to go, hat in hand, and ask to be swallowed up by AMD if they did NOT share the Fusion vision? Unless you believe, as some of us did [I did], that Fusion was a "ploy" to unbalance nVidia while AMD scrambled to fix Phenom and r700. But I don't think so anymore.

I believe AMD, using ATi's know-how, has managed to get a prototype of Fusion, and they have diverted their engineers away from lesser projects [note the conservative specs of r700]. They are content to attack the midrange and integrated graphics while they prepare their real weapon against Intel and nVidia [who is also eyeing VIA for an SiS division to make their OWN CPU-GPU], while Intel struggles to take everyone on at once. My analysis is that Intel has the most to lose and will do so without clear vision; Larrabee will be insufficient for at least 5 years, imo.

Actually, it hasn't -- yet.
And when it does, they'll withdraw it to enhance it, so it will return eventually.

Agreed ... my prediction is that DX10.1 will return when GT200 is announced.
nVidia holds all the GPU cards now [so to speak; but AMD is banking on their Phenom CPU, which is awesome for everything except high-end gaming, and then on Fusion, which they envision as their salvation].

The Rage Pro ... yes, I had the Fury 32, which was pretty good - probably the first good GPU from ATi.
 
Agreed ... but I am new here and have no idea which thread.
I was thinking about making a new one, but it seems to me this one would qualify as the right one:
http://forum.beyond3d.com/showthread.php?t=41627&page=42
apoppin said:
As to GT200 being sent out early, no ... nVidia still has a lot of left-over cores they are dumping at bargain prices. The GX2 was simply created to kill the 3870x2, but the 9800 line is way cheaper to produce than the 8800 ever was. GT200 will also be cheap, and we should see GPUs based on it for at least another two years; it is nVidia's "new G80" and quite improved, I think.

Cheap? It's supposed to be a big fat monolithic chip as far as I can tell. Nor do I think it's an improved G80. Look at the naming. G70 also had an innovative codename, but its original name was NV47 and it was a slightly improved NV40 with more pipelines. Right now, there's an empty spot at G90, a ninth-generation high-end chip which would complement the existing GeForce 9 line. And I think that this will in fact be the GT200. Why nVidia is launching it so late remains a mystery to me, but it could be that the design process took more time than originally expected. Or maybe the design is already complete, but nVidia saw they could make more money selling dual G92's, as the GT200 is more expensive to manufacture. And now, when RV770 and R700 come, GT200 can just be taken out of the fridge and sent to TSMC for fabbing...
 
No IHV is idiotic enough to hold back a ready product in anticipation of what the competition has. NV's next product family will appear when it's ready, and no, it wasn't rotting on some shelf until AMD released its RV770.

As for the rest of the ATI/AMD soap opera you guys are debating here, I don't see how it's related to the topic, but either way I'm glad that NVIDIA still has a serious competitor, for the moment.

And while you're there spinning funky theories: if AMD didn't have anything ready for this summer, NV could have considered not using an 8*64-bit GDDR3 memory configuration and instead waiting for wider GDDR5 availability with a smaller MC. Of course, you then end up in another web of nonsensical theories, since an IHV doesn't decide the layout of an architecture within a couple of weeks either.
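To put rough numbers on that trade-off (the per-pin data rates below are illustrative assumptions, not confirmed specs for either part), here is a minimal Python sketch of the standard peak-bandwidth arithmetic:

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width (bits) * per-pin data rate (Gb/s) / 8 bits per byte.
    return bus_width_bits * data_rate_gbps / 8

print(bandwidth_gb_s(8 * 64, 2.2))  # 8*64-bit GDDR3 at an assumed 2.2 Gb/s: ~140.8 GB/s
print(bandwidth_gb_s(256, 4.0))     # hypothetical 256-bit GDDR5 at an assumed 4.0 Gb/s: 128.0 GB/s

The point being that a memory controller half as wide could deliver comparable bandwidth once faster GDDR5 is available.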

For generations now, NV has used neither the smallest available manufacturing process nor the fastest RAM available, obviously to minimize risk as much as possible. The rest belongs to the "but but..." theory realm.
 
Then, why was R600 released in May when there's evidence that ATi actually had the chips (not just the design) ready in January?
 
Then, why was R600 released in May when there's evidence that ATi actually had the chips (not just the design) ready in January?

Because, from what I recall, they obviously had problems with the manufacturing process. Who told you, by the way, that there weren't earlier revision chips available long before January anyway?

In any case can we please get back on topic?
 
The manufacturing process was broken even after they released the chip. Sure, there were early revisions, but I've heard somebody say that the chips that went to retail in May were in fact manufactured in January or so.

Okay, back to the topic. Why are you so sure that nVidia didn't delay the GT200 on purpose? (Remember, it doesn't necessarily mean that they put the blueprints into the refrigerator; they could use the extra time to fix minor bugs, etc.) You yourself said that you expect GT200 to be manufactured on 65nm. So it would be quite a big chip, and since yields do not decrease linearly with increasing transistor count, manufacturing two G92's (or even better: G92b's) could be cheaper than one GT200 while offering almost the same performance. I do admit that this theory has flaws, but it's not all that impossible.
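The yield argument can be made concrete with the standard Poisson defect-density model; the die areas and defect density below are illustrative assumptions, not real figures for either chip:

import math

def die_yield(area_cm2, defects_per_cm2):
    # Poisson model: fraction of defect-free dice = exp(-D0 * A).
    return math.exp(-defects_per_cm2 * area_cm2)

D0 = 0.5                       # assumed defects per cm^2
small = die_yield(3.3, D0)     # assumed ~330 mm^2 G92-class die
big = die_yield(5.8, D0)       # assumed ~580 mm^2 GT200-class die

# Silicon cost per good die scales roughly with area / yield:
print(2 * 3.3 / small)  # two small dice: ~34 area-units of wafer per good pair
print(5.8 / big)        # one big die:    ~105 area-units of wafer per good die

Because yield falls off exponentially with area, the big die here costs roughly three times as much good-silicon area, which is exactly why two smaller chips can undercut one monolithic one.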
 
From what I remember, but I could be wrong, they had A12 in January, not A13. There was indeed an unusual delay between A13 coming back and mass production, but it wasn't anywhere near that long, I think.

Now, could we all PLEASE remember that it takes more than 2 months just to get the chips back from the fab from the second you decide to start mass production? And then you need to put them on the PCBs etc. and ship them to partners and then stores and... Unless you have insider information or a very clear picture of the length of each phase, you can't conclude anything about when a product should have been available based on functional samples being back.
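For a rough sense of scale, summing plausible lead times shows why samples in January and retail in May need not contradict each other; every stage length below is an illustrative assumption, not an insider figure:

stages_weeks = {
    "wafer fab cycle after mass production starts": 10,  # "more than 2 months", per the post above
    "assembly, test, and packaging": 3,
    "board production at partners": 3,
    "shipping and channel distribution": 3,
}
total = sum(stages_weeks.values())
print(total, "weeks, or about", round(total / 4.33, 1), "months")  # 19 weeks, ~4.4 months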

Neither GT200 nor R7xx has been 'kept back', and both will be released pretty much as early as possible. So can the conspiracy theories please die now?
 
From what I remember, but I could be wrong, they had A12 in January, not A13.
I frankly don't know, but you're probably right...
Now, could we all PLEASE remember that it takes more than 2 months just to get the chips back from the fab from the second you decide to start mass production? And then you need to put them on the PCBs etc. and ship them to partners and then stores and...
I know, but it was already mentioned here that nVidia probably has more information about ATi's upcoming products than we do. Actually, I think that nVidia's current line-up closely reflects their long-term plans: GT200 was meant to arrive later than the G9x family of chips. Given the current situation in the game market, it's logical to stay with the G8x architecture. But the timing of high-end product launches just seems somehow erratic; that's what's weird here.
 
I know, but it was already mentioned here that nVidia probably has more information about ATi's upcoming products than we do. Actually, I think that nVidia's current line-up closely reflects their long-term plans: GT200 was meant to arrive later than the G9x family of chips. Given the current situation in the game market, it's logical to stay with the G8x architecture. But the timing of high-end product launches just seems somehow erratic; that's what's weird here.

Actually, I'm fairly sure ATI has better information on what Nvidia has than we do. :p

If GT200 were ready and Nvidia had enough quantity to at least somewhat meet projected demand, it would be out there for sale.

They have never in the past held onto a chip, nor will they in the future. Storage isn't cheap, and if you have a stockpile of product just sitting there, it isn't making you money; in fact, every minute it sits there, you are losing money. Not to mention that if they miscalculate and it's actually slower than the competition, they are going to be up poop creek without a paddle. By selling as soon as possible you not only minimize losses from stock sitting around, you maximize profits - especially when there's no competition. Even if the competition then launches a faster product, it doesn't matter as much, since you've already made back a significant portion of your investment.

Therefore, even if Nvidia has working final silicon and actually has working boards, it's obvious they don't have enough to meet channel demand; otherwise we'd be buying them as we speak.

BTW - something actually on topic...

Considering the strides and effort put forth by AMD in improving multi-GPU rendering and making it more consumer friendly...has anyone heard of anything Nvidia is doing with regards to multi-GPU rendering to coincide with the GT200 release?

Regards,
SB
 
I know, but it was already mentioned here that nVidia probably has more information about ATi's upcoming products than we do.

You really think that Sapphire, aka the closest buddy to ATi, would completely sell them out? Or that some random person/developer decided to stake their career on it?

IMO, some of the more recent stuff is pure BS, just meant to get a rise out of us, the people who actually take the time to read it...
 
I'm not saying that Sapphire or one of their employees would sell them out. I don't really know. But do you think industry espionage doesn't exist?
 
I do not think it is a matter of espionage. If you have talented people, they will know what is approximately possible with typical architectures at certain price points. I think if you ask an ATi scientist what is possible with an nV type of architecture within a certain transistor budget, he (or she?) will have a good estimate.
 
Considering the strides and effort put forth by AMD in improving multi-GPU rendering and making it more consumer friendly...has anyone heard of anything Nvidia is doing with regards to multi-GPU rendering to coincide with the GT200 release?

I doubt anything will change on the performance/architecture side of SLI. Usability issues like multi-monitor support are the most likely candidates for improvement. Other than that, SLI is arguably still ahead, if you consider better profile support and higher performance to be more consumer friendly.
 
I was thinking about making a new one, but it seems to me this one would qualify as the right one:
http://forum.beyond3d.com/showthread.php?t=41627&page=42

Cheap? It's supposed to be a big fat monolithic chip as far as I can tell. Nor do I think it's an improved G80. Look at the naming. G70 also had an innovative codename, but its original name was NV47 and it was a slightly improved NV40 with more pipelines. Right now, there's an empty spot at G90, a ninth-generation high-end chip which would complement the existing GeForce 9 line. And I think that this will in fact be the GT200. Why nVidia is launching it so late remains a mystery to me, but it could be that the design process took more time than originally expected. Or maybe the design is already complete, but nVidia saw they could make more money selling dual G92's, as the GT200 is more expensive to manufacture. And now, when RV770 and R700 come, GT200 can just be taken out of the fridge and sent to TSMC for fabbing...

Do you know what the "T" in GT stands for? Think 'scientist', as it is a new name for a brand new architecture; I believe they have been working on it for almost 5 years.

nVidia is launching it "late" because, as we noted, it evidently taped out in December. This is simply to spoil AMD's launch party and to crush r700. They used their 'extra', cheap g92b/g94 cores as the 9800 series to beat the 3870/3850 series in both price and performance - just barely what was "necessary" [to max their own profit] - and their sandwich chip, the GX2, is the interim answer to the other sandwich, the 3870x2. This is all marketing strategy, something nVidia has got right since their DustBuster debacle.

GT200 is adaptable, even more than G80. It is designed to have a cheap core for the lesser GPUs and an expensive one for the top performers. NVIDIA is obviously [to me] using GT200 as an "all purpose" answer - the true replacement for g80 - for the next two years. It also appears to me that they are so confident in their GT200 as to appear smug. Last time, ATi went running to momma AMD because they knew they could not use r600 to compete with g80's GTX; this time AMD will just hang on with CrossFireX and develop Phenom ... and Fusion.
 
GT200 is adaptable, even more than G80. It is designed to have a cheap core for the lesser GPUs and an expensive one for the top performers. NVIDIA is obviously [to me] using GT200 as an "all purpose" answer - the true replacement for g80 - for the next two years. It also appears to me that they are so confident in their GT200 as to appear smug. Last time, ATi went running to momma AMD because they knew they could not use r600 to compete with g80's GTX; this time AMD will just hang on with CrossFireX and develop Phenom ... and Fusion.

Don't take this too seriously, but aren't you just applying double standards on this matter?
First you say that "GT200", like "G8x"/"G9x", took several years to develop. Then you say that ATI ran over to "momma AMD" because they couldn't compete on R600 alone.
Wasn't the R6xx project mostly done and proceeding to fabrication and testing by the time the merger with AMD took place?
Now, I'm not sure how long it took between the beginning of negotiations and the actual merger, but I'm almost certain that AMD couldn't possibly have done much in the way of influencing design, testing, or production decisions regarding R6xx products, at least not until two quarters past the market introduction date.

I'm betting that both R600 and G80 are very capable designs, and the only things keeping R6xx in limbo, and eventually at a performance deficit against G80, were trouble with the TSMC 80nm half-node for one, and, dare I say it, outright incompetent interference by the marketing boys from AMD in ATI's affairs.
AMD had little to no know-how in properly marketing graphics solutions, and the debacle of the merger had only one consequence: aggressively cutting production and development costs on the things they understood the least, GPUs.
 
Do you know what the "T" in GT stands for? Think 'scientist', as it is a new name for a brand new architecture; I believe they have been working on it for almost 5 years.
The T stands for Marketing, and what you've described is exactly what they want you to believe: that GT200 is something big, powerful, and innovative. Even if it isn't (and I predict it isn't, which is the reason for a codename that smells like marketing).
This is all marketing strategy, something nVidia has got right since their DustBuster debacle.
Yes, you're right, this *all* is a marketing strategy.
GT200 is adaptable, even more than G80. It is designed to have a cheap core for the lesser GPUs and an expensive one for the top performers. NVIDIA is obviously [to me] using GT200 as an "all purpose" answer - the true replacement for g80 - for the next two years.
A replacement for G80 for the next two years? Such a chip had better be loaded with some future-proof technologies. Yet it isn't. DX11 is still not here, the majority of games are somewhere halfway between DX9 and DX10, and, most importantly, the present G8x/G9x architecture excels in these games.
Last time, ATi went running to momma AMD because they knew they could not use r600 to compete with g80's GTX; this time AMD will just hang on with CrossFireX and develop Phenom ... and Fusion.
Please, let's not litter this thread with this discussion and continue here: http://forum.beyond3d.com/showthread.php?t=41627&page=42
 
Do you know what the "T" in GT stands for? Think 'scientist', as it is a new name for a brand new architecture; I believe they have been working on it for almost 5 years.

nVidia is launching it "late" because, as we noted, it evidently taped out in December. This is simply to spoil AMD's launch party and to crush r700. They used their 'extra', cheap g92b/g94 cores as the 9800 series to beat the 3870/3850 series in both price and performance - just barely what was "necessary" [to max their own profit] - and their sandwich chip, the GX2, is the interim answer to the other sandwich, the 3870x2. This is all marketing strategy, something nVidia has got right since their DustBuster debacle.

GT200 is adaptable, even more than G80. It is designed to have a cheap core for the lesser GPUs and an expensive one for the top performers. NVIDIA is obviously [to me] using GT200 as an "all purpose" answer - the true replacement for g80 - for the next two years. It also appears to me that they are so confident in their GT200 as to appear smug. Last time, ATi went running to momma AMD because they knew they could not use r600 to compete with g80's GTX; this time AMD will just hang on with CrossFireX and develop Phenom ... and Fusion.

Do you ever post anything but baseless personal speculation worded to sound like it was some huge, groundbreaking, visionary discovery? Did this stuff induce joy in the "natives" of other places you've visited? The T in GT has nothing to do with what you're imagining it to stand for, your imagination WRT how the merger affected/was affected by the R600 is at best wild, and don't even get me started on the musings about Cuda the barracuda...

Really, has common sense become a commodity that's so out of fashion that everyone ceased to employ it?
 
Do you ever post anything but baseless personal speculation worded to sound like it was some huge, groundbreaking, visionary discovery? Did this stuff induce joy in the "natives" of other places you've visited? The T in GT has nothing to do with what you're imagining it to stand for, your imagination WRT how the merger affected/was affected by the R600 is at best wild, and don't even get me started on the musings about Cuda the barracuda...

Really, has common sense become a commodity that's so out of fashion that everyone ceased to employ it?

Almost never. You need to understand that this is a *summary* of my views, gathered over the last decade - elsewhere. I always make sure my readers understand it is my own speculation. However, if you want to make a specific point about what I said, I am glad to defend and explain my views.

It appears that you are attacking the messenger - me - because you don't like my message. That is OK with me, but be fair and point out what you disagree with, to give me a chance to respond.

What you are apparently doing is using tactics suited to that other forum I was formerly associated with. I was hoping for better here.
I DO know what the "T" in GT stands for; clearly you don't. =P
It is a brand new architecture.

I do agree with you about the "lack of common sense", and I am glad you also included yourself with the general disclaimer "everyone ceased to employ it" - at times, yes.
 
It appears that you are attacking the messenger - me - because you don't like my message. That is OK with me, but be fair and point out what you disagree with, to give me a chance to respond.

I DO know what the "T" in GT stands for; clearly you don't.

I essentially don't care about your message because it's noise... as a consequence, there's nothing for me to agree or disagree with. I also am not attacking you; I was genuinely curious. Thanks for at least giving an honest answer.

As for the T... umm, no. Really, no. But don't stop imagining things; it seems to be working out nicely. I'll leave you alone, now that I've figured out what your aim is and that nothing interesting is scheduled. Cheers.
 
I essentially don't care about your message because it's noise... as a consequence, there's nothing for me to agree or disagree with. I also am not attacking you; I was genuinely curious. Thanks for at least giving an honest answer.

As for the T... umm, no. Really, no. But don't stop imagining things; it seems to be working out nicely. I'll leave you alone, now that I've figured out what your aim is and that nothing interesting is scheduled. Cheers.

NP, if that is what you really believe, then we are OK and will agree to leave my "noise" alone. I am not so sure what you mean by my "aim", though.

I aim to share what I have learned and also to learn from you guys, especially about benchmarking and IQ comparisons. Would you kindly point me in the right direction? I will get the heck out of this type of discussion, as my off-the-wall views and summaries of my analysis are evidently too controversial for here.

. . . and you will be shocked about the 'T' as new architecture, I think ... wait and see.
=D
 