NVIDIA GT200 Rumours & Speculation Thread

Maybe interesting:
[Image: vantageperffeaturerv770so1.jpg - Vantage feature-test results for RV770]

sources:
http://bbs.chiphell.com/viewthread.php?tid=24334&extra=page=1
http://xfastest.com/viewthread.php?tid=10925&extra=&page=1

In the Perlin Noise test, G200 seems to have ~30% higher per-FLOP efficiency.
 
Of course nVidia chips have better efficiency per theoretical FLOP. But ATi's ALUs are probably more efficient per transistor.
 
Perlin Noise is texture-limited on RV770 in 3DMark06 - I don't know if that's the case in Vantage, though.

If the purest math test out there is texture-limited on RV770, that doesn't bode well for gaming scenarios. Still, the perf/mm^2 and probably the perf/watt advantage seem to be firmly in AMD's grasp.
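
For what it's worth, here's a minimal sketch of how such a per-FLOP figure falls out of the raw numbers. The benchmark scores below are placeholders (the real ones are in the linked image); only the theoretical peaks are the commonly quoted specs (933 GFLOPS for GTX 280 counting the MUL, 1200 GFLOPS for RV770):

def per_flop_efficiency(score, theoretical_gflops):
    # Benchmark score normalised by theoretical peak throughput.
    return score / theoretical_gflops

g200  = per_flop_efficiency(score=100.0, theoretical_gflops=933.0)   # placeholder score
rv770 = per_flop_efficiency(score=100.0, theoretical_gflops=1200.0)  # placeholder score

print(f"G200 advantage per FLOP: {g200 / rv770 - 1:.0%}")  # ~29%

With equal raw scores, the 1200/933 FLOPS ratio alone already gives G200 a ~29% per-FLOP edge, which is in the same ballpark as the ~30% mentioned above.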
 
The GTX260 has the same core as the GTX280. Those prices would completely kill all of Nvidia's margins there. They come from the same wafers.

Also, if they price the GTX260 that low, nobody will buy the GTX280.

ATI has a big chance to win market share back with RV770. That's almost certain.

The first 8800GTS came from the same wafers as the 8800GTX and 8800Ultra as well. Still, the 8800GTS was priced VERY competitively. I believe the 320 MB model was introduced at about €300 here, and I got mine for around €250 not much later. The 8800GTX was in the €450-500 region and the 8800Ultra was over €550.
I'm expecting the GTX260 to be the '8800GTS' of the new generation. So €300-350 is what I think nVidia will be aiming at, with the GTX280 somewhere in the €450+ region.
Then they're basically doing the same as what they did with G80.
 
Except GT200 is considerably larger than G80, isn't it...
 
Except that the 320MB GTS came 4 months after G80's launch .. :LOL:
 
Except GT200 is considerably larger than G80, isn't it...

Is it? I didn't bother to check, but G80 was made on 90 nm, not 65 nm... so the physical die size might not be all that different.

Besides that, G80 was two years ago; production costs are different these days.
Heck, even the 7800 was a very expensive card at its introduction, as were the 6800, the 5800, etc.

Aside from that, the GTX260 will be a 'broken' GTX280, and that changes the economics. In a technical sense it's just 'leftovers'. The die size itself isn't all that important... What's more important is how much they can improve yields by disabling various broken modules on the chip, and how much performance is left.
Again, look at the 8800GTS, which is basically a 'broken' 8800GTX/Ultra. By simply disabling 32 of the 128 processing units and reducing the clockspeed a bit, they had a card that was only about 20% slower, but which they could sell at about 60% of the price of an 8800GTX, making it an incredible bang-for-the-buck card, and probably ATi's biggest nightmare at the time (even their 2900XT had problems competing).
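
A quick back-of-the-envelope check on that bang-for-the-buck claim, using the rough figures from the post above (the absolute prices are illustrative, not historical quotes):

gtx_price, gtx_perf = 500.0, 1.00  # 8800GTX as the baseline
gts_price = 0.60 * gtx_price       # 'about 60% of the price'
gts_perf  = 0.80 * gtx_perf        # 'about 20% slower'

# Performance per unit of money for each card.
gtx_value = gtx_perf / gtx_price
gts_value = gts_perf / gts_price

print(f"8800GTS perf/price advantage: {gts_value / gtx_value - 1:.0%}")  # 33%

So 0.80 / 0.60 works out to roughly a third more performance per euro, which is why the GTS looked so good on paper.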
 
Is it? I didn't bother to check, but G80 was made on 90 nm, not 65 nm... so the physical die size might not be all that different.
484 mm^2 (G80) vs ~575 mm^2 (G200) - ~20 percent larger.

Besides that, G80 was two years ago; production costs are different these days.
While true, I doubt the cost per wafer (for a smaller process) has really gone down.
Aside from that, the GTX260 will be a 'broken' GTX280, and that changes the economics. In a technical sense it's just 'leftovers'. The die size itself isn't all that important... What's more important is how much they can improve yields by disabling various broken modules on the chip, and how much performance is left.
Again, look at the 8800GTS, which is basically a 'broken' 8800GTX/Ultra. By simply disabling 32 of the 128 processing units and reducing the clockspeed a bit, they had a card that was only about 20% slower, but which they could sell at about 60% of the price of an 8800GTX,
It's not that simple. If yields are good, nvidia might have used fully working chips in 8800GTS cards. And what you can sell a card for and what you will actually charge are not the same.
making it an incredible bang-for-the-buck card, and probably ATi's biggest nightmare at the time (even their 2900XT had problems competing).
Yes, but that had more to do with R600 (which also fell in the expensive category for manufacturing costs) than with the 8800GTS.
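
For a feel of what those die sizes mean at the wafer level, here's a rough gross-dies-per-wafer estimate using the standard approximation for a 300 mm wafer. It ignores yield, scribe lines and edge-exclusion details, so treat the numbers as relative indicators only:

import math

def gross_dies(die_area_mm2, wafer_diameter_mm=300.0):
    # Classic approximation: wafer area / die area, minus an edge-loss term.
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

print(gross_dies(484.0))  # G80-sized die:  ~115 candidates per wafer
print(gross_dies(575.0))  # G200-sized die: ~95 candidates per wafer

That's about 17% fewer candidate dies per wafer before yield even enters the picture - which is why salvage parts like the GTX260 matter so much for a chip this size.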
 
It's not that simple. If yields are good, nvidia might have used fully working chips in 8800GTS cards.

Does that matter from an economic point of view, then? Production costs are the same. There's just the notion that the GPUs had more potential than what they were sold for. So, economically, nVidia based their pricing on an overly pessimistic estimate of their yields. Still, their pricing was economically sound, because nVidia has done well throughout the G80 era.


Yes, but that had more to do with R600 (which also fell in the expensive category for manufacturing costs) than with the 8800GTS.

Depends on how you look at it. nVidia couldn't know it was *that* competitive with the 8800GTS, because R600 was introduced months later. Still, that's the price point nVidia picked, based on their analysis of production costs, yields, and performance after disabling various modules. It all makes perfect sense economically. The 8800GTS was not a GPU nVidia was losing money on, and they weren't pressured by the competition... In fact, it put the competition under pressure before they even knew it.
 
It's not that simple. If yields are good, nvidia might have used fully working chips in 8800GTS cards. And what you can sell a card for and what you will actually charge are not the same.
I think NVIDIA knows how 'good' yields will be ;-)
That's why we'll see a pretty stripped-down GTX260 soon.
And when (and if; remember the G200b talk?) yields go up, they'll most likely switch the GTX260 to a G2xx chip with 192 SPs and (probably) a 256-bit GDDR5 bus.
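
As a rough sanity check on that speculated memory setup, the sketch below compares peak bandwidth of a 448-bit GDDR3 configuration (the rumoured GTX260 arrangement) against a hypothetical 256-bit GDDR5 one. The 3.6 Gbps GDDR5 data rate is an assumption (it's what RV770 boards are said to ship with); nothing here is confirmed for any G2xx part:

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Peak bandwidth = bus width in bytes * effective per-pin data rate.
    return bus_width_bits / 8 * data_rate_gbps

print(f"448-bit GDDR3 @ 2.0 Gbps: {bandwidth_gb_s(448, 2.0):.0f} GB/s")  # ~112 GB/s
print(f"256-bit GDDR5 @ 3.6 Gbps: {bandwidth_gb_s(256, 3.6):.0f} GB/s")  # ~115 GB/s

So a 256-bit GDDR5 bus could plausibly match the wider GDDR3 one while saving pads and board cost, which is presumably the appeal of such a switch.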
 