NVIDIA GT200 Rumours & Speculation Thread

Maybe it stands for Twin. I heard some rumors that GT200 was going to be a dual chip solution... ;)
 
I aim to share what I've learned and also to learn from you guys.

Yet you haven't shared a single thing. Claiming to know "something" while not saying anything at all doesn't really go over well here unless you have established credibility in such things, which you certainly haven't.
 
[OT] Besides, there are many other people here who have watched the industry for a long time, and they have no idea what 'GT' might mean, with the exception of PlayStation owners for whom it is Gran Turismo. [/OT]
 
Maybe it stands for Twin. I heard some rumors that GT200 was going to be a dual chip solution... ;)

What? A dual-chip solution? So it could mean it will be two G92Bs, something like the GF9800GX2 :( ?

But what about these rumours then?
http://translate.google.com/transla...&hl=en&ie=UTF-8&oe=UTF-8&prev=/language_tools

Single core, GF9900GTX, 512-bit MC / 32 ROPs / recommended PSU 550-600 W / July release / GF9900GTX SLI can supposedly run Crysis at playable fps at 2560x1600 with 4xAA and Very High details. It sounds very promising if true.

Moreover, there are other rumours saying GT200 will be a single GPU with a die size bigger than G80's (above 500 mm²).
 
I think apoppin actually knows what the "T" stands for. However, I don't think he knows what the "200" stands for... ;) That's all I'm going to say on this matter.
 
Total BS. Or Trash. Or Thoroughly full of it. Or quite simply, the T isn't there and hasn't been there. There is no T there. Oh and the number is likely wrong as well.

Aaron spink
speaking for myself inc.

Why would, for example, GPU-Z be recognizing GT200 already if it's wrong?
 
Total BS. Or Trash. Or Thoroughly full of it. Or quite simply, the T isn't there and hasn't been there. There is no T there. Oh and the number is likely wrong as well.
It does stand for something, but honestly it's not worth wasting our time on. If anyone wants to leak it, feel free, but don't expect it to be a discovery of cosmic proportions.
 
Why would, for example, GPU-Z be recognizing GT200 already if it's wrong?

GPU-Z only needs to read device IDs, much like driver INF files do. The code isn't that complex; it could basically be one giant ugly if/then/else block, though more likely it uses a hashtable or dictionary lookup keyed on a combination of the vendor and device IDs, with an optional subdevice ID.

Code:
if (vendorId == 0x10DE) {              // NVIDIA vendor ID
  if (deviceId == 0x00F1) {
    chip = "NVIDIA GeForce 6600 GT";
  } else if (deviceId == 0x0244) {
    chip = "NVIDIA GeForce Go 6150";
  } else {
    // ... more NVIDIA device IDs ...
  }
} else if (vendorId == 0x1002) {       // ATI vendor ID
  // ...
}
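
For what it's worth, the dictionary-style lookup mentioned above might look something like this -- just a sketch, not GPU-Z's actual code; the table contents and the LookupChip name are made up for illustration:

Code:
// Sketch of a hashtable/dictionary lookup keyed on vendor + device ID.
// Not GPU-Z's real code; entries and names here are illustrative only.
#include <cstdint>
#include <string>
#include <unordered_map>

static const std::unordered_map<uint32_t, std::string> kGpuDb = {
    { (0x10DEu << 16) | 0x00F1, "NVIDIA GeForce 6600 GT" },
    { (0x10DEu << 16) | 0x0244, "NVIDIA GeForce Go 6150" },
    // ... ATI entries under vendor 0x1002, and so on ...
};

std::string LookupChip(uint16_t vendorId, uint16_t deviceId) {
    // Fold the PCI vendor and device IDs into one 32-bit key.
    const uint32_t key = (uint32_t(vendorId) << 16) | deviceId;
    auto it = kGpuDb.find(key);
    return it != kGpuDb.end() ? it->second : "Unknown device";
}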
 
I think apoppin actually knows what the "T" stands for. However, I don't think he knows what the "200" stands for... ;) That's all I'm going to say on this matter.

Gah! So his "think scientist" clue actually makes sense? WTF does that mean? :D
 
Why would, for example, GPU-Z be recognizing GT200 already if it's wrong?

'Cause GPU-Z is pointless? No, really, it's a pointless program. It's just mapping the vendor and device ID onto a DB entry. Nothing really advanced about that.

If someone wanted to, they could take the code and make it recognize the GTFXTNTGPSUGH100000000000000000000000.

OTOH, CPU-Z is actually reading and presenting architecturally defined registers. Until there is a standard register definition set from the vendors, GPU-Z will just be something that looks up device IDs in a DB, i.e., about the same thing as Beyond3D's GPU DB.
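
For contrast, here's a minimal sketch of what "reading architecturally defined registers" means on the CPU side: CPUID leaf 0 returns the vendor string straight from the hardware, no device-ID database needed. This assumes GCC/Clang on x86 with <cpuid.h>, and obviously has nothing to do with CPU-Z's actual code:

Code:
#include <cpuid.h>   // GCC/Clang x86 intrinsic wrapper
#include <cstdio>
#include <cstring>

int main() {
    unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;
    char vendor[13] = {0};
    // CPUID leaf 0: the vendor string comes back in EBX, EDX, ECX.
    if (__get_cpuid(0, &eax, &ebx, &ecx, &edx)) {
        std::memcpy(vendor + 0, &ebx, 4);
        std::memcpy(vendor + 4, &edx, 4);
        std::memcpy(vendor + 8, &ecx, 4);
        std::printf("CPU vendor: %s\n", vendor);  // e.g. "GenuineIntel"
    }
    return 0;
}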

Aaron Spink
speaking for myself inc.
 
Gah! So his "think scientist" clue actually makes sense? WTF does that mean? :D

Supposed to be a snarky reference to Tesla, but it's useless. It's like some big secret: oh, NVIDIA is going to do exactly what they have already been doing, but now it's in the chip name. See, now everyone should be scared! And I mean SCARED! 'Cause they are putting it in the name, and my oh-so-prophetic vision has seen that everyone else is going straight to hell!!!

Let me give everyone a clue: ignore is a function for a reason. It's because posts by apoppin exist.

Aaron spink
speaking for myself inc.
 
Haha, you're kidding, right? Wow, that's a pretty weak basis for proclamations of Jen-Hsun's impending domination of the GPU world. I don't believe in ignore functions, but those posts are a bit on the loony side, for real.
 
Silly question time. Everyone's aware of this slide ("Era of Visual Computing," which marks Tesla 2 for 2008 and "Next Gen" for 2009, both under the "Programmable Graphics: CUDA-DX11" umbrella) from this Bit-Tech interview with Kirk, right? I'm guessing Tesla 2 is "GT200"?
 
Seems like they were highlighting new-generation chips but forgot about one of their lines... heh, the FX series didn't make it into that chart?
 