NVIDIA: Beyond G80...

If an 8800 Ultra launches in two weeks I'll be the most surprised guy in the room. I've been surprised before, of course... I'd think it'd make more sense for NVIDIA to wait for the R600 launch, to know for sure what they're shooting at, at this point.

A cool grand can cover quite a generous bill of materials; we have to factor that in too.
But yes, waiting to see what the 1GB GDDR4 R600 can do (especially how much it will cost, both to end users and to AMD) would seem like the wiser move, in my opinion.

Who knows? We might even get a 65nm GPU and a 768-bit bus from NVIDIA; anything is possible at $1K. ;)
 
How much advance notice did we have on the 7800 GTX 512MB version? Not very much from what I recall.

More than that, and (going by the published rumors at places like DT and the Inq) there wasn't another NV launch in between to suck up reviewer time/attention either.
 
Well, let's hope it isn't like the 7800 GTX 512 launch: an in-and-out hard "launch" with bad SLI support. There were rumors of the Ultra NV47 that summer, before the release in late fall.
 
Say NV has been stockpiling Ultra G80 chips for a few months now; it could be that they need to launch now if a real refresh chip is coming in Q3 2007. They need to have the Ultras out at least 3 months earlier.
 
I don't have high hopes for the Ultra launch; we will see, though.
 
Interesting post at [H] from the author of the gpudip benchmark suite.

http://www.hardforum.com/showpost.php?p=1030910236&postcount=72

tertsi said:
0.5 TFlops, and not just with MAD instructions... well, that's easy with G80 and a "future driver release" (hopefully coming soon) :)

G80@575MHz + 97.92 series drivers
-------------------------------------------------
gpudip 18: cel shading - 23 70 1 512 768 10000 - 682.3311 msec - 403.3983 GFlop/sec

G80@575MHz + "future driver release"
-------------------------------------------------
gpudip 18: cel shading - 23 70 1 512 768 10000 - 584.9589 msec - 470.5480 GFlop/sec

G80@702MHz + "future driver release"
-------------------------------------------------
gpudip 18: cel shading - 23 70 1 512 768 10000 - 496.1952 msec - 554.7236 GFlop/sec

Draw your own conclusions :)
 
Hmmm, some reverse engineering reveals the following:

128 SPs * 3 flops * 1350 MHz = 518,400 MFlop/s theoretical; measured 470,548, so ~90% efficiency.

Assuming the shader clock scales with the core clock, 700 MHz would result in a ~1600 MHz shader clock. And...

128 * 3 * 1600 = 614,400 MFlop/s theoretical; measured 554,723, again ~90% efficiency.

So either the G80 Ultra is a 128 SP, 700 MHz core / 1600 MHz shader part, or tertsi made this all up just like I did :)
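
For anyone who wants to replay that arithmetic, here's a quick script (my own sketch; the 128 SP / 3 flops-per-clock figures and the Ultra clocks are the rumored numbers above, not confirmed specs):

# Back-of-envelope check of the gpudip numbers quoted above.
# 128 SPs issuing 3 flops/clock (MAD + MUL) is the usual G80 accounting.
def theoretical_mflops(sps, flops_per_clock, shader_mhz):
    return sps * flops_per_clock * shader_mhz  # MFlop/s

for name, shader_mhz, measured in [
    ("G80 GTX, 1350 MHz shader", 1350, 470548),
    ("G80 Ultra?, 1600 MHz shader", 1600, 554723),
]:
    peak = theoretical_mflops(128, 3, shader_mhz)
    print(f"{name}: peak {peak} MFlop/s, measured {measured}, "
          f"{measured / peak:.0%} efficiency")

# Clock-ratio sanity check: 1350 / 575 = ~2.35, and 2.35 * 700 = ~1645 MHz,
# reasonably close to the assumed 1600 MHz shader clock.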
 
Hmm, an 8800 GTX gaining almost 70 GFlop/s just from a new set of drivers?
Interesting...


Edit: trini, will the ~2.3 ratio between the main core clock and the shader clock hold in the Ultra as well?
 
Looks that way.

Anyway, isn't a 125 MHz increase a little steep for a 90nm core?
There are water-cooled versions at less than 650 MHz already...
 
Could be another 7800 GTX 512 after all - pushing the process to the limit. If that's what they did, I foresee another fiasco.
 
MUL opened up to general shading/CUDA? Could the difference in flop count from theoretical be accounted for by the MUL also doing the ops it's currently doing?
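
Quick arithmetic on that (my own numbers, using the usual MAD = 2 flops, MUL = 1 flop accounting):

# If the MUL were unavailable to general shading, a 1350 MHz G80 would
# peak at MAD-only throughput, which the measured figure already beats:
mad_only = 128 * 2 * 1350   # 345600 MFlop/s, MAD only
with_mul = 128 * 3 * 1350   # 518400 MFlop/s, MAD + MUL co-issue
measured = 470548
print(measured > mad_only)  # True: the MUL is contributing in this test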
 
At least the 7800 GTX 512 didn't have a rumored 1000 dollar price tag at launch. :D

I wonder why they priced it so high, unless it's all rumour, of course.

So pretty much an 8800 Ultra is:

G80
90nm
700 MHz core clock
1600 MHz shader clock
128 SP

No rumours on its memory interface, memory type (e.g. GDDR3 or GDDR4), frame buffer size, bandwidth, etc. (quick arithmetic on that below).

Maybe we are missing something from the specs that justifies this rumoured $999 price tag. If NVIDIA is confident with such pricing, I expect this card to outperform the GTX by a good margin (20-30%, or maybe even a gap similar to the current GTS vs GTX).
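
For what it's worth, memory bandwidth is just bus width times effective data rate. Here's that arithmetic (my own sketch) applied to the known stock GTX configuration and to a couple of purely hypothetical Ultra setups; the Ultra numbers are illustrations, not rumored specs:

# Bandwidth (GB/s) = (bus width in bits / 8) * effective data rate in GT/s
def bandwidth_gbs(bus_bits, data_rate_mtps):
    return bus_bits / 8 * data_rate_mtps / 1000

print(bandwidth_gbs(384, 1800))  # stock 8800 GTX: 384-bit, 900 MHz GDDR3 -> 86.4 GB/s
print(bandwidth_gbs(384, 2200))  # hypothetical faster GDDR3 -> 105.6 GB/s
print(bandwidth_gbs(384, 2400))  # hypothetical GDDR4 -> 115.2 GB/s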
 

And what implications does this "future driver" have for G84/G86 performance?
The regular 8800 GTX seems to benefit from it quite a lot.
Another interesting question.
 

Maybe the $999 card is a water-cooling kit version, like the current BFG?
 
It seems that the software is counting special function FLOPs.

Special functions, such as reciprocal or sine, tend not to get counted (at least round here) because the mathematical operations involved vary and there's no way of generalising a special function ALU's FLOP count to a single figure.

Additionally, these special functions are not so heavily used in code, so there's less importance attached to the actual number, whatever it might be. Though you could argue that reciprocal or square root are pretty common in graphics and should be counted.

But this software can be programmed to "decide" that a reciprocal is worth 1 FLOP, a square root 3 FLOPs (or whatever...) etc.

Jawed
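
To illustrate the kind of weighting Jawed describes, here's a toy tally (my own sketch; gpudip's actual internals and weights aren't public, so every number below is made up):

# Toy FLOP accounting with arbitrary per-instruction weights.
FLOP_WEIGHTS = {
    "mad": 2,  # multiply-add: conventionally counted as 2 flops
    "mul": 1,
    "add": 1,
    "rcp": 1,  # reciprocal: "decided" to be worth 1 flop
    "rsq": 3,  # reciprocal square root: "decided" to be worth 3 flops
}

def gflops(issued_per_clock, shader_mhz, sps):
    # Weighted flops issued per clock per SP, scaled by clock and SP count.
    flops_per_clock = sum(FLOP_WEIGHTS[op] * n
                          for op, n in issued_per_clock.items())
    return sps * flops_per_clock * shader_mhz / 1000  # GFlop/s

# e.g. one MAD plus one RCP co-issued per clock on a 1350 MHz, 128 SP G80:
print(gflops({"mad": 1, "rcp": 1}, 1350, 128))  # 518.4 GFlop/s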
 