NVIDIA: Beyond G80...


Fruitzilla writes that D stands for desktop and M for mainstream, so if they're not wrong, then this is not a mobile GPU.

If this is a desktop performance part, then that's cool; hopefully it's a real mainstream card (not a fake one like the 8600/HD 2600) coming with a 256-bit bus.
This could be another slap in AMD's face in the desktop segment if AMD can't release RV670 in time (Q3); I'm 95% sure they'll miss the "in time" release again.
 
Fruitzilla writes that D stands for desktop and M for mainstream, so if they're not wrong, then this is not a mobile GPU.

Sure, but what would stop them from using it as a mobile part? ;)


Another thought would be that "mainstream" doesn't refer to performance, so it could be G98, and the GF88M could use another GPU.
 
The whole 8950 GX2 story is a cautionary tale about rumor mills and the rumors they... ahem... mill. It seems pretty clear that NVIDIA is going to ride the current G80 lineup until Q4. With the kind of challenger R600 turned out to be, I can't blame them.
 
Time to dust off the old thread with some juicy fresh rumours yet again ;) :
G92 is 65nm & 256-bit memory bus

We heard about an upcoming GPU from NVIDIA with a 256-bit memory interface earlier this month and thought it was the rumored 256-bit version of G84. Now we have learned that it is actually the next-generation G92, a 65nm performance part supporting PCI Express 2.0 and a 256-bit memory interface. G92's performance lies between the GeForce 8800 GTS and 8600 GTS, so it is not the highest-end GPU of the G9x series. There might be a G90 which we haven't heard about yet, and NVIDIA gave clues at a recent analyst conference that their next-generation GPU will deliver close to 1 TFLOPS of performance. G9x could support double-precision FP too. Sampling will start in September, with launch slated for the November timeframe.

http://www.vr-zone.com/?i=5092
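As a sanity check on that "close to 1 TFLOPS" hint, here is a rough sketch using G80-style marketing FLOPS counting (MADD + MUL = 3 flops per SP per clock). The G92 configurations tried in the loop are purely illustrative guesses, not confirmed specs:

```python
# Back-of-envelope check of the "~1 TFLOP" claim, assuming G80-style
# marketing FLOPS counting (MADD + MUL = 3 flops per SP per clock).
# The configurations below are illustrative guesses, not confirmed specs.

def marketing_gflops(sps, clock_ghz, flops_per_clock=3):
    """Peak shader throughput in GFLOPS."""
    return sps * clock_ghz * flops_per_clock

# G80 reference point: 128 SPs @ 1.35 GHz on the 8800 GTX.
print(marketing_gflops(128, 1.35))   # ~518 GFLOPS

# Hypothetical configs that would land near 1 TFLOP:
for sps, clk in [(128, 2.6), (160, 2.1), (192, 1.75)]:
    print(sps, clk, round(marketing_gflops(sps, clk)))
```

The point being: on this counting scheme, 1 TFLOP is reachable with a plausible bump in either SP count or shader clock, so the figure alone doesn't pin down the configuration.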
 
G92 will not outdo the 8800 GTS?
Do they conclude this from the 256-bit memory controller? Because NV said at this conference that G92 will be the ~1 TFLOP GPU, so I don't think this performance estimate fits.
 
VR-Zone said:
G92 performance lies between GeForce 8800 GTS and 8600 GTS
Errr, I think they 'slightly' misunderstood G92's performance target. I'm pretty sure we're talking >= 8800 Ultra here for the highest-end single-chip model.
 
G92 will not outdo the 8800 GTS?
Do they conclude this from the 256-bit memory controller? Because NV said at this conference that G92 will be the ~1 TFLOP GPU, so I don't think this performance estimate fits.

They could *in theory* saddle it with very high-speed GDDR4 (up to 2.8/3.2GHz), but I don't think anyone said "G92" when referring to the 1 TFlop figure.
They said "our next generation will...", that was it.
My best guess is that G92 will perhaps have a performance level ranging between the 8800 GTS 640MB and the 8800 GTX, as a performance mainstream follow-up to the 8600 GTS.
But that's just a guess...
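For what it's worth, the bandwidth arithmetic does work out: a quick sketch, assuming the quoted 2.8/3.2GHz GDDR4 speeds are effective data rates (the GTX/Ultra numbers are their known GDDR3 effective rates; the rest is speculation):

```python
# Rough peak memory-bandwidth comparison. The 256-bit GDDR4 configs are
# speculative; 8800 GTX/Ultra figures use their known effective data rates.

def bandwidth_gbs(bus_bits, effective_ghz):
    """Peak bandwidth in GB/s: bus width in bytes times effective data rate."""
    return (bus_bits / 8) * effective_ghz

print(bandwidth_gbs(384, 1.8))    # 8800 GTX:   86.4 GB/s
print(bandwidth_gbs(384, 2.16))   # 8800 Ultra: ~103.7 GB/s
print(bandwidth_gbs(256, 2.8))    # 256-bit + 2.8 GHz GDDR4: 89.6 GB/s
print(bandwidth_gbs(256, 3.2))    # 256-bit + 3.2 GHz GDDR4: 102.4 GB/s
```

So a 256-bit bus with fast GDDR4 could, on paper, match the Ultra's bandwidth despite the narrower interface.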
 
If you can quote him directly saying the "G92" moniker I might change my wording, but until then... :D
Oh, I didn't say NVIDIA ever said G92. In fact, I'm very confident nobody at NVIDIA ever said G92. But you said 'anyone' and, last I heard, I am part of that category! ;)

I think I already said this a couple of times in the past, but my prediction for G92 is 192SPs @ 2.2-2.5GHz+, 32 TMUs with free trilinear @ ~750MHz, 16 ROPs for depth & stencil, and blending+triangle setup done in the shader core. And you'd have a 256-bit memory bus with 1.4GHz GDDR4 for the highest-end single-chip model. Probably ~1.2GHz for the dual-chip SKU too, I guess.

If my speculation is right and these specs are roughly accurate, it shouldn't be very hard to conclude that it should be faster than the 8800 Ultra on average...
 
I find it hard to believe they'll go down to 256-bit for the high end while ATI has 512-bit there right now.
IMO, NV's next high end should be at least 384-bit again, if not 512...
 
but my prediction for G92 is 192SPs @ 2.2-2.5GHz+, 32 TMUs with free trilinear

Why not 24 TMUs with free trilinear, or 48 TMUs/TAUs (G84/G86 style) and 32 SPs per cluster?

I find it hard to believe they'll go down to 256-bit for the high end while ATI has 512-bit there right now.

G92 (x2) vs RV670 (x2) ;)
 
You don't see it much any more, but back when DX9 was coming out, I remember seeing "64-bit" and "128-bit" used a lot for FP16 and FP32 (counting all four RGBA channels of a pixel). If G92 does support double precision, that would be FP64, which by the same counting is 256-bit, right?

Now how likely that explanation is, I have no idea.
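For reference, that DX9-era convention counted bits across all four color channels of a pixel rather than per value, which is unrelated to memory bus width. A trivial illustration:

```python
# The DX9-era "bits per pixel" labels counted all four RGBA channels,
# not a single value; this is a naming convention, unrelated to bus width.

CHANNELS = 4  # RGBA

for name, bits_per_channel in [("FP16", 16), ("FP32", 32), ("FP64", 64)]:
    print(name, CHANNELS * bits_per_channel, "bits per RGBA pixel")
```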
 
IMO, NV's next high end should be at least 384-bit again, if not 512...
You could think of G92 as being a $199-$399 product (possibly not that low at launch, but eventually) with dual-chip solutions hitting anything higher. In fact, I wouldn't even be very surprised if they didn't go as high as $399 with the single-chip, but we'll see.
AnarchX said:
Why not 24 TMUs free tri or 48 TMUs/TAUs (G84/G86 style) and 32 SPs per Cluster?
Because then the ALU:TEX ratio would probably be too high for NVIDIA's liking. After all, if they increase the SP clocks to 2.4GHz+ and increase the unit count by 50%, they've already doubled the ALU:TEX throughput ratio. Why they would want to do more than that is beyond me... It's not strictly impossible, I guess, but it does feel a tad too extreme to me.
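A quick check of that "already doubled" claim, taking the 8800 GTX as the baseline (128 SPs @ 1.35 GHz, 32 texture address units @ 575 MHz) against the speculated G92 config from earlier in the thread (192 SPs @ 2.4 GHz, 32 TMUs @ 750 MHz); all G92 numbers are speculative:

```python
# ALU:TEX ratio check: 8800 GTX baseline (128 SPs @ 1.35 GHz, 32 TAs
# @ 575 MHz) vs the speculated G92 (192 SPs @ 2.4 GHz, 32 TMUs @ 750 MHz).

def throughput(units, ghz):
    return units * ghz

g80_ratio = throughput(128, 1.35) / throughput(32, 0.575)
g92_ratio = throughput(192, 2.4) / throughput(32, 0.75)

print(g92_ratio / g80_ratio)   # ~2.0: the ALU:TEX ratio roughly doubles
```

So under these assumed specs the ratio shift comes out at almost exactly 2x, which matches the argument for not widening the gap further.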
 