NVIDIA GF100 & Friends speculation

So the worst-case scenario is that they should replace the current GTX 480s with the 580 and drop the price of the 480. Cayman is right around the corner. Imagine how foolish they will look if they repeat the GTX 2XX-era price cuts right after their launch.

How does that saying go? Fool me once, shame on you; fool me twice, shame on me!

Why is that a problem? Most companies cut prices once a competing product enters the market. They charge what people are willing to pay for their product, and when there is little competition people are willing to pay more. There is some truth to the idea that they might sell more overall at a slightly lower price, but if they're supply-constrained, a lower price doesn't really buy them anything but heartache anyway.
 
And 9800GT, GTX260M

8800GTS was the first iteration, then 9800GTX, then 9800GTX+, then GTS 250, and GTS 150 (OEM).

8800GT is the cut-down version, renamed to 9800GT, then GT 330 (OEM?).

GTX 280M is the full mobile version, renamed later to GTX 285M.
GTX 260M is the cut-down version, but was never renamed.

So that's one chip used for 11 products; this is indeed the most used chip evar!

However, let's be real, that only happened because it was so successful.
 
IF the 580 = 512CC, wouldn't it make sense that the 570 = 480CC? Effectively replacing the 480 with lower temps/consumption.
Possible, but personally, I believe 448 (but higher clocks, of course) is more likely.

Yields would have to improve tremendously to allow for a 480CC GTX 570, I think. Two SKUs with 15 or 16 active shader modules? Not impossible, but unlikely.
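
To spell out the core-count arithmetic behind these SKU guesses, here is a minimal sketch, assuming GF100/GF110 carries 32 CUDA cores per shader module (so a fully enabled 16-SM chip = 512CC):

```python
# Sketch of the SM-to-CUDA-core arithmetic behind the SKU guesses above.
# Assumption: 32 CUDA cores per shader module (SM), consistent with
# a fully enabled 16-SM chip yielding 512 cores.
CORES_PER_SM = 32

for active_sms in (16, 15, 14):
    print(f"{active_sms} SMs active -> {active_sms * CORES_PER_SM} CUDA cores")

# 16 SMs -> 512 CC (the rumoured GTX 580)
# 15 SMs -> 480 CC (a hypothetical 480CC GTX 570)
# 14 SMs -> 448 CC (the 448CC guess above)
```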


So that's one chip used for 11 products; this is indeed the most used chip evar!
You forgot 8800GS, which got renamed to 9600GSO later ;)
 

You forgot 8800GS, which got renamed to 9600GSO later ;)

I believe the 8800GS was based off of G80.
 
However, let's be real, that only happened because it was so successful.

That and it's a damn good chip.
 
I believe the 8800GS was based off of G80.
8800GS was G92-based

8800GS 96SPs/575MHz/192bit/384MB
8800GT 256MB
8800GT 512MB
8800GT 1024MB
8800GTS 512MB
9600GSO 96SPs/550MHz/192bit/384MB
9800GT
9800GTX
9800GTX+
9800GX2
GTS150
GTS240
GTS250
GT330
+mobile models
 
That and it's a damn good chip.

Well, it was great when it was launched. But in 2008 it was slower and bigger than RV770. G92b was smaller and faster than G92, but still a little bigger than RV770 and clearly slower.

NVIDIA just didn't have anything better at that manufacturing cost, so they kept making and renaming it.
 
Looks legit to me.
Sure it's legit, the graphs start at 0.8! :LOL:

I wonder why they didn't add a 6870 CrossFire to the SLI comparison. :rolleyes:

Also note the perf/watt graphs. That is not the 50% that was rumoured. If you take Stalker, which seems to show the expected 17% perf gain and ca. 27% efficiency gain, you get the GTX580 at ca. 92% of the TDP of the GTX480... while this should keep it under 300W in any case, it's still a bit disappointing, I think.
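
To double-check that estimate: relative TDP is just relative performance divided by relative perf/watt. A quick sketch, scaled against the GTX 480's 250W rating:

```python
# Back-of-the-envelope check: relative power = relative perf / relative perf-per-watt.
# Gains taken from the Stalker bar in the leaked graphs.
perf_gain = 1.17        # ~17% more performance than GTX 480
efficiency_gain = 1.27  # ~27% better perf/watt than GTX 480

relative_tdp = perf_gain / efficiency_gain
print(f"GTX 580 TDP relative to GTX 480: {relative_tdp:.0%}")  # ~92%

# Scaled against the GTX 480's 250W TDP:
print(f"Implied board power: ~{250 * relative_tdp:.0f}W")      # ~230W
```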
 
Sure it's legit, the graphs start at 0.8! :LOL:

brainfart..
 