NVIDIA GT200 Rumours & Speculation Thread

No, the T has a very specific meaning and it doesn't matter in the slightest. Don't waste your time trying to figure out what it means, it really wouldn't tell you anything. As for dual-G94, hah, where will the stupidity end?
 
I wonder if it's worth entertaining the thought that GT200 is G92 "done right", i.e. with double-precision functioning properly?

Could G92 have faulty double-precision, which is why NVIDIA missed that boat?

Though I'm loath to suggest as much, this might be why the blurred boundaries between G92/9900/GT200 exist.

Jawed
 
http://www.nordichardware.com/news,7578.html

Several sources are reporting on the upcoming GT200 core from NVIDIA, which will be named GeForce 9900. The stories don't match, though, when it comes to the specifics. While Expreview claims that GT200 will become the 9900GX2 and 9900GTX, VR-Zone says GeForce 9900GTX and 9900GT. None of them are very specific when it comes to the exact specifications. A user over at Chiphell is, though. He says that GT200 will in fact be a dual-core GPU based on the G9x architecture.

While we're certain that the GT200 is hot and will be a huge chip, it looks like it could very well be a dual-core GPU. The user says GT200 is really two G94 chips slapped together, which means a total of 64 TMUs, 32 ROPs and 128 shader processors. It would also result in a 512-bit memory bus and more than 900 million transistors. Hot in every sense of the word.

Dual-core makes sense on a few points: GT200 is not a proper code-name for an NVIDIA core, but would sooner indicate something like a dual-core spin. The GeForce 9900 series name also suggests that GT200 is another spin of the G9x architecture, and not a true next-generation architecture.

The launch date is supposed to be early Q2.

Alas, apparently still no DirectX 10.1 support.

PS GT = Twin?

;)

If that's the case, finally a real 8800 GTX replacement.
 
2x G94 is basically 9600 GT SLI. Fast, but it's not bleeding edge. If it's 2x G9x then it would need to be G92, perhaps that G92b that's been rumoured.
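
For what it's worth, a quick back-of-envelope in Python, naively doubling G94's commonly quoted 9600 GT specs (32 TMUs, 16 ROPs, 64 SPs, 256-bit bus, ~505M transistors) and ignoring any shared logic, lines up with the Chiphell numbers:

Code:
# Sanity check of the "two G94s slapped together" claim by doubling
# G94's commonly cited specs. This ignores shared logic, so treat the
# transistor count in particular as a rough upper bound.
g94 = {"TMUs": 32, "ROPs": 16, "SPs": 64, "bus_bits": 256, "transistors_M": 505}
gt200_rumour = {spec: 2 * count for spec, count in g94.items()}
print(gt200_rumour)
# -> {'TMUs': 64, 'ROPs': 32, 'SPs': 128, 'bus_bits': 512, 'transistors_M': 1010}

So 64 TMUs, 32 ROPs, 128 SPs, a 512-bit bus and "more than 900 million transistors" all fall straight out of the doubling.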

Anyway, I'm keeping my fingers crossed that GT200 is not G9x based and not dual-chip (while delivering significantly more performance than 9800GX2).

Jawed
 
I wouldn't trust any (news) website which uses the term "dual-core GPU". On a similar note, can someone tell me why all these news sites are quoting forum users' speculation as news and rumors? What makes these people credible? AFAIK I could register there and post some baseless speculation, or am I missing something?
 
On April 1st you're allowed to post any sort of nonsense anywhere ;) And since we're here I'm opening bets that NV will charge nearly a buck for each square millimeter....uhhm anyone?
 
Note that the GeForce 9 name brings Hybrid SLI compatibility, especially Hybrid Power. Quite a big deal in my view, but it sucks that the chipsets are delayed by something like six months!
So I built my PC around an nForce 520 chipset.

Anyway, card namings aren't a big deal IMO. AMD does the same: the HD3000 series is the same generation as the HD2000. Model numbers are just there to market the thing, and you'll always have to go and read stuff to know what's behind them.
 
On April 1st you're allowed to post any sort of nonsense anywhere ;) And since we're here I'm opening bets that NV will charge nearly a buck for each square millimeter....uhhm anyone?
If that equates to about ten bucks per fps in Crysis running at my LCD's native resolution, I'd be (more than) fine with paying 600 bucks, too.
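
For fun, the implied maths, assuming a purely hypothetical ~600 mm² die (my guess, not a confirmed figure):

Code:
# Pure guesswork: a hypothetical ~600 mm^2 die at "a buck per square millimetre".
die_area_mm2 = 600
price_usd = die_area_mm2 * 1.0   # ~$600 for the chip
crysis_fps = price_usd / 10.0    # at $10 per fps, that implies ~60 fps
print(price_usd, crysis_fps)     # 600.0 60.0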
 
My God, what if they :!: CHANGED THE NAME TO SOMETHING NEW!!!! :!: And I don't mean just some goofy 10th-edition suffix here!

*shocked silence*

I digress, however. It is just too much to ask! :)
 
aaronspink said:
And yes, I'm going to keep this crusade up until everyone ridicules anyone who refers to a single ALU as a processor or a core. Either you count a single-bit XOR as a "processor/core", or you count something that actually is a processor as a processor.
You are, of course, entitled to your opinion. However, this particular name didn't come from the Marketing department.
 
Why wouldn't this product support DX10.1 when ATI already supports it and will continue to in their upcoming HD4xxx series? It's a bit of a surprise to me...
 