NVIDIA GT200 Rumours & Speculation Thread

Status
Not open for further replies.
What reasons? Specify. They've done it with every other architecture. I really don't care either way, I just figured they would; I never buy the sandwich cards anyway...
 
GDDR5 wouldn't cause them not to... they'd make a new architecture with a GDDR5 controller a few months later that would wipe the floor with it.
 
That doesn't mean that they won't; they'll just avoid building one if they can...
I agree completely, though: a single powerful GPU is better than a sandwiched card. I'm just saying that if the 9800 GX2 brought in enough revenue, they might try it again.
 
=>Arun: You can't believe what NVIDIA representatives say. One day they tell you the GeForce 7 is the best solution on the market; the next day they launch G80 and claim the whole GF7 line was a mistake they learnt from :)
 
Of course, it's all speculation about the far-off future. I'm more concerned about the upcoming launch of GT200. Anybody got any ideas about pricing?
 
That doesn't mean that they won't; they'll just avoid building one if they can...
I agree completely, though: a single powerful GPU is better than a sandwiched card. I'm just saying that if the 9800 GX2 brought in enough revenue, they might try it again.

Exactly. They will definitely try to go monolithic every generation if they can. But if they have to do a GX2 to compete they certainly will. And the odds that they will have to are just going to increase as AMD pushes their own cost effective multi-GPU solutions.
 
My point is that the incremental cost to go monolithic for them has actually gone *down*. And Lukfi, I'm not aware of NV ever having said the GF7 Series was a mistake; management loved that series, it had awesome margins, awesome ASPs, awesome market share.

And sorry, but for this kind of thing, Jen-Hsun is honest the vast majority of the time; sometimes he's simply wrong or misinformed, but he has better things to do than to lie outright to investors. I've listened to more than enough NV CCs to know he's not the kind of guy who'll try to downright confuse everyone even if it's in his interests. He's got a PR and marketing team to do that, though.
 
Well, in defense of Lukfi, they did learn from mistakes they made in the 7 series and applied those lessons in the 8 series (which produced an immensely powerful GPU), but I don't remember them saying the series as a whole was a mistake. I liked the 7 series; sure, it had flaws, but it was a solid architecture overall.
And sorry, but for this kind of thing, Jen-Hsun is honest the vast majority of the time; sometimes he's simply wrong or misinformed, but he has better things to do than to lie outright to investors. I've listened to more than enough NV CCs to know he's not the kind of guy who'll try to downright confuse everyone even if it's in his interests. He's got a PR and marketing team to do that, though.

I wasn't saying he was lying; I was agreeing with him about sandwiched cards. But at the same time, I was saying that NV is never going to rule out something that can regain the performance crown.
 
Reread Jen-Hsun's reply; he clearly hints at the 9800 GX2 there. And Lukfi, I am absolutely positive that NVIDIA always loved and still loves the GF7 series. In fact, they initially expected a substantially slower transition than what actually happened (i.e. they expected OEMs to stick to GF7 for a variety of designs in the 2007 Back-to-School cycle, but nearly every design switched to G84/G86 or RV610/RV630).
 
Shall we write down a timeline with new chip releases, manufacturing processes, transistor counts and according die sizes through the past years? Something tells me that you'll be completely wrong.

Actually, aside from 3 architectures, NV has averaged an architectural transition every 13-16 months. Here's my timetable (source: wiki):

- NV1 -> Riva 128: 19 months (September '95 to April '97).
- Riva 128 -> TNT: only a little over a year (April '97 to June '98).
- TNT -> GeForce: only slightly longer (June '98 to October '99).
- GeForce -> GeForce 3: the same length as the previous transition (October '99 to February '01).
- GeForce 3 -> FX: 2+ years, breaking the trend (February '01 to March '03).
- FX -> NV40: NV's quickest yet (March '03 to April '04).
- NV40 -> G80: the longest yet (April '04 to November '06).

So, other than NV1, NV30 and NV50 (G80), it is clear that NV tends to transition to a new architecture every 13-16 months on average, with the most recent transition to GT200 only slightly outside this range at ~18 months.
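The month gaps above are easy to check. A minimal sketch, using only the launch dates listed in the post (month precision; the dates themselves are the poster's, not verified):

```python
from datetime import date

# Architecture launch dates as given in the post above (day fixed to 1st,
# since only month and year were stated).
launches = [
    ("NV1", date(1995, 9, 1)),
    ("Riva 128", date(1997, 4, 1)),
    ("TNT", date(1998, 6, 1)),
    ("GeForce", date(1999, 10, 1)),
    ("GeForce 3", date(2001, 2, 1)),
    ("GeForce FX", date(2003, 3, 1)),
    ("NV40", date(2004, 4, 1)),
    ("G80", date(2006, 11, 1)),
]

def months_between(a, b):
    """Whole months from date a to date b."""
    return (b.year - a.year) * 12 + (b.month - a.month)

for (prev_name, prev), (name, cur) in zip(launches, launches[1:]):
    print(f"{prev_name} -> {name}: {months_between(prev, cur)} months")
```

Running this reproduces the post's figures, e.g. 19 months for NV1 -> Riva 128 and 31 months for NV40 -> G80, the outlier transitions.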

"Tesla 2" is nothing surprising. It may be nothing more than the FP64 support.

Tesla 2 most certainly does feature DP support, but I'm sure that's not the only new feature.

They keep mentioning a 2nd-generation architecture. Possibly a 2nd-generation "unified" architecture, meaning they've tweaked or improved areas where the first-generation unified architecture was lacking, like its triangle setup. They could have changed the MADD+MUL configuration to dual MADD. But I'm still surprised there's no word on DX10.1.
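To see why the MADD+MUL vs. dual-MADD distinction matters, here is a rough sketch of peak shader throughput under each issue configuration. The SP count (240) and shader clock (1.3 GHz) are purely illustrative assumptions for the example, not confirmed GT200 specs:

```python
# Peak shader throughput = SPs * shader clock * flops per SP per clock.
# A MADD counts as 2 flops (multiply + add), a MUL as 1 flop.
def shader_gflops(num_sps, shader_clock_ghz, flops_per_sp_per_clock):
    return num_sps * shader_clock_ghz * flops_per_sp_per_clock

sps, clk = 240, 1.3  # hypothetical numbers, for illustration only

madd_plus_mul = shader_gflops(sps, clk, 3)  # MADD (2) + MUL (1) = 3 flops/clk
dual_madd     = shader_gflops(sps, clk, 4)  # 2x MADD = 4 flops/clk

print(f"MADD+MUL:  {madd_plus_mul:.0f} GFLOPS")
print(f"Dual MADD: {dual_madd:.0f} GFLOPS")
```

Under these assumptions, moving from MADD+MUL to dual MADD lifts theoretical peak throughput by a third, which is why the ALU configuration is such a recurring rumour topic.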

If they haven't fixed the setup bottleneck, then there would be almost no point in increasing functional unit count.

Basically, it is a reworked G8x/G9x. By the way, these rumours all lack one important piece of information about the chip that I'm sure will surprise you :p

I've noticed one consistent omission across all these rumours, and that is the TA/TF count. Shader-based texturing, anyone?
 