I forgot to mention that a new release of ForceWare is on the way.
So reviewers still don't have "review drivers"? Kinda late, isn't it?
I wouldn't be surprised if NVidia has a special Crysis driver - they'd want to keep something up their sleeve.
Jawed
I think that is highly improbable, as Crysis is a shader-intensive program.
But it should also respond well to efficiency gains, and double the bandwidth of the 9800GTX should go a long way, much further than indicated there.
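A quick sanity check on the "double the bandwidth" remark - a minimal sketch, assuming the GTX280 memory specs that were circulating at the time (512-bit bus, ~1107 MHz GDDR3), which are not confirmed:
[code]
# Memory bandwidth behind the "double the bandwidth of 9800GTX" claim.
# The GTX280 numbers are the rumoured specs at the time, not confirmed.
def bandwidth_gb_s(bus_width_bits, effective_mem_mhz):
    # bytes per transfer * transfers per second, reported in GB/s
    return bus_width_bits / 8.0 * effective_mem_mhz * 1e6 / 1e9

gf9800gtx = bandwidth_gb_s(256, 2200)   # ~70 GB/s
gtx280    = bandwidth_gb_s(512, 2214)   # ~142 GB/s (rumoured)

print("GTX280 has %.1fx the bandwidth of the 9800GTX" % (gtx280 / gf9800gtx))
[/code]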
Maybe interesting:
sources:
http://bbs.chiphell.com/viewthread.php?tid=24334&extra=page=1
http://xfastest.com/viewthread.php?tid=10925&extra=&page=1
In Perlin Noise G200 seems to have a ~30% higher per FLOP efficiency.
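For what it's worth, a minimal sketch of how such a per-FLOP comparison works - the fps figures below are placeholders, not the numbers from those links; only the peak GFLOPS ratings (933 for GT200, 1200 for RV770) are the usual theoretical values:
[code]
# Per-FLOP efficiency = score (fps) divided by theoretical peak GFLOPS.
# The fps values are made-up placeholders; substitute the real Perlin Noise
# scores from the linked threads to check the ~30% figure.
def per_flop_efficiency(fps, peak_gflops):
    return fps / peak_gflops

gt200_eff = per_flop_efficiency(fps=300.0, peak_gflops=933.0)    # GTX280, placeholder fps
rv770_eff = per_flop_efficiency(fps=295.0, peak_gflops=1200.0)   # HD4870, placeholder fps

print("GT200 per-FLOP advantage: %.0f%%" % ((gt200_eff / rv770_eff - 1) * 100))
[/code]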
Perlin noise is texture-limited on RV770 in 3DMk06 - I don't know if that's the case in Vantage though.
The GTX260 has the same core as the GTX280; they come from the same wafers. Those prices would completely kill all of Nvidia's margins there.
Also, if they price the GTX260 at that price, everyone will just buy the GTX280.
ATI has a big chance to win back market share with RV770 - that's almost certain.
And yes, I'm aware of the supposed 4870 3DMark scores.
What everyone seems to forget is that R6x0 was also very fast in 3DMark.
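Some very rough numbers behind the wafer/margin point a few lines up - a minimal sketch where the wafer price and defect density are pure guesses; only the ~575 mm^2 die size comes from the rumours:
[code]
import math

# Back-of-the-envelope cost per good die, to illustrate the margin argument.
# Wafer cost and defect density are guessed placeholders.
wafer_cost_usd  = 5000.0    # placeholder for a 65 nm, 300 mm wafer
wafer_diameter  = 300.0     # mm
die_area_mm2    = 575.0     # rumoured GT200 die size
defects_per_mm2 = 0.0005    # placeholder defect density

# candidate dies per wafer, with ~10% lost to the wafer edge
dies_per_wafer = math.floor(math.pi * (wafer_diameter / 2) ** 2 / die_area_mm2 * 0.9)
# simple Poisson yield model: fraction of dies with zero defects
yield_fraction = math.exp(-defects_per_mm2 * die_area_mm2)

cost_per_good_die = wafer_cost_usd / (dies_per_wafer * yield_fraction)
print("%d candidate dies, %.0f%% yield, ~$%.0f per fully working die"
      % (dies_per_wafer, yield_fraction * 100, cost_per_good_die))
[/code]
Salvaging partly broken dies as GTX260s (the point made further down) effectively raises that yield and lowers the cost per sellable chip.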
The first 8800GTS came from the same wafers as the 8800GTX and 8800Ultra as well. Still, the 8800GTS was priced VERY competitively. I believe the 320MB model was introduced at about 300e here; I got mine for around 250e not much later. The 8800GTX was in the 450-500e region and the 8800Ultra was over 550e.
I'm expecting the GTX260 to be the '8800GTS' of the new generation. So 300-350e is what I think nVidia will be aiming at, with GTX280 somewhere in the 450+ region.
Then they're basically doing the same as what they did with G80.
Except GT200 is considerably larger than G80, isn't it...
Except that the 320MB GTS came 4 months after G80's launch ..
...And you can't make anything out of this without knowing a couple of hundred other parameters -)
Is it? I didn't bother to check, but G80 was made on 90 nm, not 65 nm... So the physical die size might not be all that different.
484 mm^2 (G80) vs ~575 mm^2 (G200) - about 20 percent larger.
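Putting numbers on both sides of that exchange - the 484/575 mm^2 figures are the ones quoted above, and the shrink factor is just the idealised 90 nm -> 65 nm scaling (real shrinks never scale that well):
[code]
# Die area comparison from the figures above, plus an idealised node-shrink check.
g80_area_mm2  = 484.0   # G80 at 90 nm
g200_area_mm2 = 575.0   # rumoured G200 at 65 nm

print("G200 is ~%.0f%% larger than G80" % ((g200_area_mm2 / g80_area_mm2 - 1) * 100))

# If G80's layout shrank perfectly, area would scale with the square of the feature size.
ideal_shrink = (65.0 / 90.0) ** 2
print("An ideally shrunk 65 nm G80 would be ~%.0f mm^2" % (g80_area_mm2 * ideal_shrink))
[/code]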
Besides that, G80 was 2 years ago; these days production costs are different.
While true, I doubt the cost per wafer (for a smaller process) has really gone down.
Aside from that, GTX260 will be a 'broken' GTX280, so that changes the economics. In the technical sense it's just 'leftovers'. The die size itself isn't all that important... What's more important is how much they can improve yields by disabling various broken modules on the chip, and how much performance is left.
Again, look at the 8800GTS, which is basically a 'broken' 8800GTX/Ultra. By simply disabling 32 of the 128 processing units and reducing the clock speed a bit, they had a card that was only about 20% slower, but which they could sell at only about 60% of the price of an 8800GTX, making it an incredible bang-for-the-buck card, and probably ATi's biggest nightmare at the time (even their 2900XT had problems competing).
It's not that simple. If yields are good, nvidia might have used fully working chips in 8800GTS cards. And what you can sell a card for and what you will actually charge is not the same.
Yes, but that had more to do with R600 (which also fell in the expensive category for manufacturing costs) than with the 8800GTS.
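The 8800GTS bang-for-the-buck claim above, as plain arithmetic - a minimal sketch using the approximate euro prices mentioned earlier in the thread, not measured data:
[code]
# Performance per euro for the salvage-part argument: "about 20% slower"
# at "about 60% of the price" of an 8800GTX. Prices are the rough figures
# quoted earlier in the thread (450-500e for the GTX), not exact.
gtx_perf, gtx_price = 1.00, 475.0          # 8800GTX baseline
gts_perf, gts_price = 0.80, 475.0 * 0.60   # 8800GTS as described above

gtx_value = gtx_perf / gtx_price
gts_value = gts_perf / gts_price

print("8800GTS: ~%.0f%% more performance per euro than the 8800GTX"
      % ((gts_value / gtx_value - 1) * 100))
[/code]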
I think NVIDIA knows how 'good' yields will be -)
That's why we'll see a pretty stripped-down GTX260 soon.
To compete with G92b?