NVIDIA GT200 Rumours & Speculation Thread

I don't mind the GPU using a lot of power when it's running a game, because that's when it's being put to use. Idle power draw is a pretty big deal for me, though, since I leave my PC on when it's not in use.

GT2xx is looking very promising!
 
It uses the stream processors for IQ improvements including interlacing; the actual decoding is done on an in-house Tensilica-based processor, just like AMD's UVD.
 
Ah, OK. Thanks!

Thinking about it again, it seemed like a sexy idea to be able to rely solely on NVIO for idle and video decoding, but under Vista there's still a bit of 3D acceleration required, so that wouldn't cut it anyway.
 
16C/512SP | 384-bit GDDR5 | 2.5TFlops+
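For reference, NVIDIA usually counts 3 flops per SP per shader clock (MAD + MUL) in its peak numbers, so 2.5 TFlops out of 512 SPs implies a fairly aggressive shader clock. A quick back-of-the-envelope sketch (the clocks here are my own assumptions, not anything leaked):

[CODE]
# Rough peak-FLOPS sketch for the rumoured 16C/512SP part.
# Assumes NVIDIA's usual counting: 3 flops per SP per shader clock (MAD + MUL).

def peak_gflops(stream_processors, shader_clock_ghz, flops_per_sp_per_clock=3):
    """Theoretical single-precision peak in GFLOPS."""
    return stream_processors * flops_per_sp_per_clock * shader_clock_ghz

# Known reference point: G80 (128 SPs @ 1.35 GHz) -> ~518 GFLOPS.
print(peak_gflops(128, 1.35))

# What shader clock would 512 SPs need to hit 2.5 TFLOPS?
required_clock_ghz = 2500 / (512 * 3)
print(round(required_clock_ghz, 2))  # ~1.63 GHz
[/CODE]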

What's the timing for 40nm and DX11? I don't expect to see a high-end DX10 card from Nvidia on 40nm.

I don't do predictions, but I'm sitting in the middle of the desert in Arizona at the moment and feel a bit enlightened, so what the heck:

Q4 2008: 55nm G92b, 8C, 256-bit GDDR3, ~$250 - $300
Q1 2009: 55nm GTX2xx, 10C, higher clocks, 512-bit GDDR3, >1 teraflop, ~$650
Q3 2009: 40nm GT2xx, 5C, ~600 GFlops, 256-bit GDDR3, ~$250 - $300
Q2 2010: 40nm DX11 architecture

Of course, this leaves a big gaping hole between $250 and $450 for the HD4870 to fill, but I don't see how Nvidia can fill that gap in the short term in any scenario.
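For what it's worth, the bandwidth side of that roadmap is easy to sanity-check: peak bandwidth is just bus width times effective data rate. The data rates below are my own guesses for illustration, not anything from a roadmap:

[CODE]
# Memory bandwidth sanity check: (bus width in bits / 8) * effective data rate in Gbps.
# The data rates here are assumed values, not leaked specs.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 2.0))  # 256-bit GDDR3 @ 2.0 Gbps -> 64 GB/s
print(bandwidth_gb_s(512, 2.2))  # 512-bit GDDR3 @ 2.2 Gbps -> ~141 GB/s
print(bandwidth_gb_s(384, 3.6))  # 384-bit GDDR5 @ 3.6 Gbps -> ~173 GB/s
[/CODE]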
 


One possible solution would be to deliver a signature edition of the G92b, signed by the president of Nvidia. :D
 

Geforce... 9600GX2!

:eek:
 
According to the slides, it's true.

NV is gonna promote IDLE power consumption a lot... with just a small mention of LOAD power consumption. :p But I guess most review sites will see through their PR plan. ;)

Realistically, how much of the time that a computer is on is spent at idle? I am guessing 90% of the time; more for a gamer? Two hours out of 24 is a lot of gaming for a working person!

And now that the high power draw at load is offset by a modest draw at idle, that is really green imo [pun intended].

I mean, my 8800GTX and OC'd E4300 are only ~120W draw at the outlet at idle; my TV is off. I may game for an hour or two tonight, IF I am lucky. My computer is on mostly 24/7. An extra 100W for 2 hours a day will not break me, and I can compensate elsewhere for my addiction. :p
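[For what that extra 100W actually works out to, here's a rough sketch; the electricity rate is my own assumption, so plug in your own:]

[CODE]
# Rough monthly cost of an extra 100 W for 2 hours of gaming per day.
# The electricity price is an assumption (~$0.10/kWh); adjust for your own rate.

extra_watts = 100
hours_per_day = 2
price_per_kwh = 0.10  # assumed $/kWh

kwh_per_month = extra_watts / 1000 * hours_per_day * 30
print(kwh_per_month)                   # 6 kWh per month
print(kwh_per_month * price_per_kwh)   # ~$0.60 per month
[/CODE]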

I dug these up [you probably saw them already; I AM slow]:

http://www.hexus.net/content/item.php?item=13600
[mostly about AMD, but mentions the Nvidia beast]

http://www.vr-zone.com/articles/GeForce_GTX_280_Cards_Lurking_In_Computex/5823.html
Here's a screenshot of the GeForce GTX 280 running on GeForce Release 177.23. We are told that this thing runs cool during idle and runs much faster than the 9800GX2. If you look hard enough, you will find some GT200 cards lurking around Computex.
 

Now we know that you have the fridge off during the night. :D


The 9600GX2 offers enough value for nVidia to try it, and it would probably edge out the 4850. But once the price wars come, nothing is guaranteed.

Still, given nVidia's "You can't one-up us" nature, they should be doing something along those lines.
 

No .. my water heater :p

Nvidia needs to think "PR" and green .. and they do have a GPU that, for 90% of its life, will idle at a low wattage. But, like the turbocharger in a small auto engine, it kicks ass when it is needed.
[Maybe I should offer to write copy; for AMD, I am that bad, I know =P]
The only thing I am really bugged about is their lack of DX10.1 for Tesla.

I also had a couple of links I forgot to add to the thread, not quoted by you, that you might find of interest [I am still stuck on 56K :oops:]. Hexus appears to be saying that the 4xx0 series is not so impressive, I guess [as expected industry-wide, I also thought; the variable being the X2 vs. a single GT280, in my own opinion]. VR-Zone appeared to say the 280 was cooler and faster than the GX2 and was spotted at Computex; I see the link has been posted already!

http://www.hexus.net/content/item.php?item=13600
 
apoppin said:
I mean, my 8800GTX and OC'd E4300 are only ~120W draw at the outlet at idle; my TV is off. I may game for an hour or two tonight, IF I am lucky. My computer is on mostly 24/7. An extra 100W for 2 hours a day will not break me, and I can compensate elsewhere for my addiction.
Nvidia needs to think "PR" and green .. and they do have a GPU that, for 90% of its life, will idle at a low wattage. But, like the turbocharger in a small auto engine, it kicks ass when it is needed.
[Maybe I should offer to write copy; for AMD, I am that bad, I know =P]
Why don't you think green and turn off your PC when you're not using it? You claim you can afford the 100W of power (or whatever), but if everyone thought like you, we'd be wasting gigawatts of power.

Using power while gaming is one thing, but you're wasting far more power while idle than you are using while gaming.
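To put rough numbers on that (the 120W idle figure is yours; the 22/2 hour split and the extra ~100W under load are assumptions taken from this thread):

[CODE]
# Daily energy split for a box left on 24/7: ~120 W idle for ~22 hours vs an
# extra ~100 W for ~2 hours of gaming. All figures are assumptions from this thread.

idle_watts, idle_hours = 120, 22
gaming_extra_watts, gaming_hours = 100, 2

idle_kwh = idle_watts * idle_hours / 1000                     # ~2.64 kWh/day sitting idle
gaming_extra_kwh = gaming_extra_watts * gaming_hours / 1000   # 0.2 kWh/day extra for gaming

print(idle_kwh, gaming_extra_kwh)  # the idle draw dwarfs the gaming surplus
[/CODE]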

-FUDie
 
Oh, for goodness' sake, why the discussion about "idle power"? This may be important for mobile graphics, but I'm assuming most of us plug our PCs into the wall.

Unless you're a power-saving, save-the-planet, tree-hugger type, the watts at idle are completely insignificant - the noise of the fan is far more important IMHO. Are we really falling for the latest IHV spin, i.e. "we slagged off ATI for the power draw of the 2900 XT, but now we're in the same boat, it's idle power that matters"?

Who gives a crap about idle power? If it's fast enough, folks will buy it regardless. The problem with the power-hungry 2900 XT was that it wasn't fast enough to justify a power supply upgrade. Maybe the GT200 will be, but I have my doubts given this "idle power" PR nonsense.
 

Great 4th post! :LOL:
 