NVIDIA: Beyond G80...

It does sound like a peculiar configuration with 160 ALUs, but if it's true & they binned the fully functioning chips, then it would be interesting, as the rest of the SKU doesn't require alteration. They still have their 384-bit bus & may be able to use higher-speed GDDR4. Now if you tell me they also had redundant ROPs... ;)
 
Is it true that shader units don't have a bandwidth cost? Longer shaders don't cost bandwidth directly, but indirectly, with more shader units you may bump up the resolution, and that will have a TMU/bandwidth cost?
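For what it's worth, here's a rough back-of-envelope sketch (plain Python, with made-up per-pixel figures, not real G80 numbers) of why that's the usual answer: a longer shader only adds ALU time per pixel, while memory bandwidth goes to texture fetches and colour/Z writes, both of which scale with the pixel count you push.

# Illustrative only: pure ALU work costs no memory bandwidth, but raising
# the resolution (which extra shader power enables) does, since texture
# fetches and colour/Z writes scale with pixel count.
def frame_bandwidth_gb(width, height, fps=60,
                       bytes_per_pixel_written=8,   # assumed: 4B colour + 4B Z
                       texels_per_pixel=4,          # assumed average texture samples
                       bytes_per_texel=4):          # assumed 32-bit uncompressed texels
    pixels = width * height
    rop_bytes = pixels * bytes_per_pixel_written
    tex_bytes = pixels * texels_per_pixel * bytes_per_texel
    return (rop_bytes + tex_bytes) * fps / 1e9       # GB/s

for res in [(1280, 1024), (1600, 1200), (1920, 1200), (2560, 1600)]:
    print(res, round(frame_bandwidth_gb(*res), 1), "GB/s")

# Doubling the shader length doubles ALU time per pixel but adds nothing
# to either term above; only the pixel/texel counts touch memory.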
 
That chart was made to counter the OCW sheet. DOH!

It's a real FUD fight now... check how similar it is to the R600 OCW sheet.
 
Nvidia GeForce 8600 series will launch in the middle of 2Q, says sources
Nvidia is preparing to launch a series of GeForce 8600 cards in the middle of the second quarter of 2007 ... . The GeForce 8600 series will be priced to target mainstream markets, the sources add.

According to the release plan several versions are currently being prepared, the sources claim. The GeForce 8600 GT with 512MB memory will be priced between US$219 and US$229, while a 256MB version will be available at US$199. Additionally, targeting an even lower price point will be the GeForce 8600GS at between US$159 and US$165, the sources have revealed.

DirectX 10 graphics cards will be the main focus for both Nvidia and AMD (who is now a player in the graphics card market after the acquisition of ATI) at the upcoming CeBIT trade show to be held from March 15 to 21, 2007, according to industry sources. ...
And now vr-zone:
http://www.vr-zone.com/?i=4682
NVIDIA is bringing back the GTS naming for their mainstream cards and is preparing to launch their GeForce 8600 series, codenamed G84, in April. There will be 2 models, the GeForce 8600 GTS and GeForce 8600 GT, priced at $249-$299 and $199-$249 respectively. Interestingly, there is no G81 or GeForce 8900 series, we heard. There is something more powerful brewing over at NVIDIA.
I like DT's news more. ;)
 
No? You don't think a GX2 sounds more powerful than a hypothetical 80nm G81? Though, of course, I suppose it's possible the GX2 cores could be 80nm...

Nah, just semantics. "A more powerful solution" is not used for 2x the old thing, I think.
 
I would think that GX2-style solutions are not the flavor at NV right now, as SLI support in DX9 (and even more so DX10) is still very early with respect to stability and performance on the driver side in Vista.
 
Last time I heard, SLI was very unstable in DX10. What driver are you using?


I only have one 8800 GTX.

My comment was going by the large number of very disappointed SLI users who have posted performance losses with two 8800 GTXs in Vista (and I feel for the 7950 users who only have one GPU enabled in Vista at the moment!).
 
IMO a GX2 was feasible in the G7x era. Not only were those chips cheap to make, they were power efficient as well. DX10 wasn't a problem because Vista wasn't going to be released for a long time, and SLI was already a mature/stable technology.

But currently, DX10 SLI on G80 is pretty poor and unstable. Didn't NVIDIA say in an interview not so long ago that they're aiming to stabilise DX9/DX10 SLI for G80 by April? Also, G80s are huge in terms of transistor count/die space, run very hot, and consume quite a bit of power.

So a GX2 card for this generation doesn't sound realistic, unless they cut down the specs to a point where it does. The 7950 GX2 already produced quite a bit of heat (single-slot cooling per GPU), and a GX2 card based on G8x doesn't sound too promising.

Guess we will have to see what the R600 is like first.
 