NVIDIA GT200 Rumours & Speculation Thread

R600 could keep 16 ROPs even with a 512-bit memory bus, because R520/R580 had 8 32-bit channels, so ATi just doubled the width per ROP block. nVidia can't do that since they're already at 64-bit channels. So I think the number will be 32, even if they won't be used to their full potential. However, ROPs do more than just draw pixels to the framebuffer, so an increase might actually be useful. Remember the speculation about ATi doubling Z-op performance on RV770's ROPs? And remember how this helped RV530 compared to RV515?
 
It could just as well have 16. I mean, seriously, is there anything pointing to us needing more than 16 ROPs?

Nvidia has been using 24 ROPs for over 1.5 years now, since G80 was released in fall 2006. That was with a 384-bit bus.

With the new 512-bit bus there will be more bandwidth, of course, and obviously there's a need to push more raw pixels to the screen, so why not 32 ROPs?
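For what it's worth, the ROP numbers being thrown around fall straight out of the memory-crossbar math: each of nVidia's ROP partitions hangs off a 64-bit memory channel and on G80 carries 4 ROPs (6 x 64-bit = 384-bit, 24 ROPs), while R5xx used 32-bit channels. A rough back-of-the-envelope sketch in Python; the 512-bit GT200 entry just carries the thread's speculation forward, and the per-partition figures are assumptions based on G80/R580, not a confirmed spec:

Code:
# Rough sketch: ROP count implied by bus width, channel width and ROPs per partition.
# Known points: R520/R580 = 256-bit bus, 8 x 32-bit channels, 16 ROPs;
# G80 = 384-bit bus, 6 x 64-bit partitions, 4 ROPs each = 24 ROPs.
# The 512-bit GT200 line is speculation only.

def rop_estimate(bus_width_bits, channel_width_bits, rops_per_partition):
    partitions = bus_width_bits // channel_width_bits
    return partitions, partitions * rops_per_partition

print(rop_estimate(256, 32, 2))   # R520/R580: (8, 16)
print(rop_estimate(384, 64, 4))   # G80:       (6, 24)
print(rop_estimate(512, 64, 4))   # GT200?:    (8, 32)  -- speculative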
 
BTW guys, what's the likelihood of Nvidia releasing a 9900 GX2 card (two GT200 GPUs) this year?

If they don't, wouldn't that mean R700 / 4870 X2 (two RV770 GPUs) might surpass the 9900 GTX?

Or do you think Nvidia will save 9900 GX2 for Q1/Q2 2009 ?

Edit: never mind, I forgot about the rumored July 2008 release of the 9900 GX2. Even if that slips a quarter, it's still 2008, in time to combat R700.
 
The supposed cooler looks almost identical to the one used for the 8800 GTS 512MB, except that it appears to have a bigger fan (80mm to 120mm?) and is a bit longer. It could have more than 3 heat pipes too.
 
BTW guys, what's the likelihood of Nvidia releasing a 9900 GX2 card (two GT200 GPUs) this year?
If the single-chip GT200 really requires a six-pin and an eight-pin PCIe power connector (as rumoured) it's possible that the power requirements of two GT200 chips together would be more than one could realistically deliver to a single card (not just in terms of power connectors but in terms of the amount of heat the cooler would need to dissipate).
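Putting rough numbers on that: the PCIe slot is specced to supply 75W, a 6-pin connector 75W and an 8-pin connector 150W, so a 6+8-pin board tops out at about 300W. Two chips at the rumoured >200W TDP each would already be past 400W before counting memory and VRM losses. A quick sanity check in Python, with the per-chip TDP treated as an assumption from the rumour mill:

Code:
# PCIe power delivery limits (per spec) vs. a hypothetical dual-GT200 board.
SLOT_W      = 75    # PCIe x16 slot
SIX_PIN_W   = 75    # 6-pin PCIe connector
EIGHT_PIN_W = 150   # 8-pin PCIe connector

board_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300W for a 6+8-pin card

gt200_tdp = 200                                   # rumoured per-chip TDP (assumption)
dual_chip = 2 * gt200_tdp                         # 400W+, ignoring memory/VRM overhead

print(board_budget, dual_chip, dual_chip > board_budget)   # 300 400 True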
 
http://www.vr-zone.com/articles/55nm_GeForce_9800_GT_Final_Clocks/5771.html

NVIDIA_G92.DEV_0605.1 = "NVIDIA GeForce 9800 GT" : 55nm
NVIDIA_G92.DEV_0614.1 = "NVIDIA GeForce 9800 GT" : 65nm

Apparently, Nvidia plans to clock the 9800 GT at 600MHz, like the 8800 GT, with the memory at 900MHz and the shader clock at 1500MHz.

So Nvidia went through all of this convoluted naming and renaming of G92 just to avoid diluting the value of the 8800 brand while stock levels were high late last year? What a complete and utter mess.
 
G92b = B1. :LOL:

That also explains why in the last conference call Jen-Hsun told the analysts that their upcoming products would have higher margins: they seem not to have changed anything, just shrunk it down to 55nm.
 
There won't be a GT200 GX2, I'm quite sure of it.

Everything indicates there certainly can be

IF nVidia needs it ..

They did make it rather clear that they much prefer a single powerful GPU, but to say without qualification "I am sure there won't be a GT200x2" seems pretty overconfident [to me].

What leads you to make your emphatic pronouncement, "there won't be"?
- Why not?
There certainly *could be*; nothing about the physical size seems to prohibit it - the only thing that would hold them back is competing with their own SLi and the lack of competition from AMD.

What if AMD comes up with a 4870 X2/X3 or X4 on a single slot that creams the GT200x1?
- I think we may eventually see a GT200x2


That also explains why in the last conference call Jen-Hsun told the analysts that their upcoming products would have higher margins: they seem not to have changed anything, just shrunk it down to 55nm.

That was my own opinion - that GT200 has awesome margins - and that nVidia's GT architecture, the "new and improved G80", will be their main GPU core for the next couple of years, the basis for all their high-end, mid and, eventually, mid-low variants.

So Nvidia went through all of this convoluted naming and renaming of G92 just to avoid diluting the value of the 8800 brand while stock levels were high late last year? What a complete and utter mess.
Evidently. They do this regularly with a major architecture change, and they are simply responding to a disorganized AMD.
- I think it is not a mess for their bottom line; just for their fans and HW enthusiasts.
 
There certainly *could be*; nothing about the physical size seems to prohibit it - the only thing that would hold them back is competing with their own SLi and the lack of competition from AMD.
- I think we may eventually see a GT200x2

Did NVIDIA make a G80 "X2" card? No. The G80 core is too big and too hot to stuff two of them onto a 'single' card. With G92's reduced die size and power consumption, it became much more feasible to put two cores that close together. But even with G92's higher efficiency, the 9800GX2 is only just adequately cooled.

If two GT200 cores (rumored TDP >200W each) were to be put on the same board, a dual-slot air cooler wouldn't be able to dissipate that kind of heat effectively. A triple-slot design would be cumbersome and perhaps too heavy, and there aren't enough enthusiasts with watercooling to justify a 'watercooling only' card.

If/when GT200 is shrunk to 55nm, it might be possible, but not likely before then.

I also really doubt a monster like GT200 has "awesome margins." Yields won't be great with such a big core, and in a time where $200 cards can run most games at high settings on 1920x1200 monitors, it's going to be a hard sell. It's not like average gamers are really clamoring for more performance right now.
 
What if AMD comes up with a 4870 X2/X3 or X4 on a single slot that creams the GT200x1?
- I think we may eventually see a GT200x2
If a single GT200 board already eats a 6-pin plus an 8-pin, I'm having a hard time believing there could be a GT200 X2.

That was my own opinion - that GT200 has awesome margins - and that nVidia's GT architecture, the "new and improved G80", will be their main GPU core for the next couple of years, the basis for all their high-end, mid and, eventually, mid-low variants.
GT200? I'm quite sure Jen-Hsun meant the G92 > G92b transition.
 
Welcome to the forums, Kaldskryke!
I agree with most of your points so I'm just replying to two things I don't agree with here:
I also really doubt a monster like GT200 has "awesome margins." Yields won't be great with such a big core,
Yields in the modern GPU world have little to do with die size, unless your design team is composed primarily of drunken monkeys.
and in a time where $200 cards can run most games at high settings on 1920x1200 monitors, it's going to be a hard sell. It's not like average gamers are really clamoring for more performance right now.
Crysis? 2560x1600? :) Many enthusiasts are definitely clamoring for more single-chip performance given Crysis' subpar multi-GPU scalability. Now, I do know that many people don't consider Crysis to be that awesome of a game, but the point remains that many people like to future-proof themselves and Crysis is obvious evidence that more games will soon also require that amount of GPU performance.
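Coming back to the yield point above, both positions can be roughed out with the usual Poisson die-yield model: the fraction of defect-free dice does drop quickly with area, but redundancy and salvage SKUs (fusing off a bad cluster and selling a cut-down part) recover a lot of single-defect dice. A toy sketch in Python with made-up defect density, die areas and redundancy share, purely illustrative:

Code:
import math

# Classic Poisson yield model: P(0 defects) = exp(-area * defect_density).
def perfect_yield(area_cm2, d0):
    return math.exp(-area_cm2 * d0)

# Crude salvage estimate: a die with exactly one defect can ship as a cut-down
# SKU if the defect lands in a redundant/fusable block (share of die area = r).
def salvage_yield(area_cm2, d0, r):
    p_one_defect = area_cm2 * d0 * math.exp(-area_cm2 * d0)
    return p_one_defect * r

d0 = 0.4                      # defects per cm^2 -- made-up, illustrative only
for area in (3.0, 5.5):       # hypothetical "mid-size" vs "huge" die, in cm^2
    full = perfect_yield(area, d0)
    usable = full + salvage_yield(area, d0, r=0.6)
    print(f"{area:.1f} cm^2: perfect {full:.0%}, usable incl. salvage {usable:.0%}")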
 
I don't know about that, Arun. Browsing the more mainstream forums around the net, people are clamoring more for something "new" than for something faster. Besides Crysis ... which arguably isn't that popular of a game ... people are playing everything out there quite comfortably. You have the 9600GT and 3870 that handle that stuff quite well for a relative pittance.
 
I dunno about that. I'm sure a lot of people would like to run the game at maxed settings with no visible slowdown and with high levels of AA/AF. You do that right now and your frame rate will be dipping into the 20s or lower...
 