The G92 Architecture Rumours & Speculation Thread

Status
Not open for further replies.
If you're talking about G96 which replaced G94 on the roadmap and is scheduled for March/April 2008, it's not what I was talking about....


Only if we think the G90s were ever supposed to be released in the first place.
 
Well, I haven't seen one personally, and it's not inside my 60 day rule yet (to my knowledge), but the tea leaves and the whispers in the wind all point in that direction. And without another high-end chip in sight, seems fairly logical to me. I can tell you that around the time of the G80 release we asked about a GX2, and were told then that there was unlikely to be one for that generation, but it'd probably make a comeback the next. G92 is next.
 
Boy, is he a supreme hater or what?
He even calls the 30 inch Dell 3007 "one of the worst monitors on the face of the planet, avoid it at all costs".
:LOL:
 
Certainly a bit of a rant there from Charlie.

An IHV cherry-picking benchmarks/resolutions/etc for a product release to make their product look better! Who'd have thunk it?
 
PConline measured a die size of 330mm² in their review of the 8800 GT. :oops:

And without another high-end chip in sight, seems fairly logical to me.
Why is G80@65nm = G92 not a high-end chip?
Put a dual-slot cooler on it, raise VGPU to 1.25-1.3V, clock it up to ~800/2400MHz, and buy some GDDR4 from Samsung at 1.2GHz+.

In short, you would get a solution that can beat 2xRV670 in most cases, with a higher margin than the G80 SKUs, plus the advantage of 384-bit bandwidth and 768MB of memory, which is optimal for enthusiast gaming at the moment and in upcoming games, until the next high-end SKU arrives.

If NV really goes GX2 again, they must be insane, or there is a strong reason that doesn't allow higher clocks like those above... :???:
 
Certainly a bit of a rant there from Charlie.

An IHV cherry-picking benchmarks/resolutions/etc for a product release to make their product look better! Who'd have thunk it?
Cherry-picked benchmarks are one thing; flat-out lies about competing products are another. The latter is what Charlie thinks Nvidia is guilty of: that's what's got him so riled.
 
Maybe he expected "A Perfect 10"?

Personally, I think both G92 and RV670 have a perfect feature set/performance balance for the market segment they're aimed at. ;)

Cherry-picked benchmarks are one thing; flat-out lies about competing products are another. The latter is what Charlie thinks Nvidia is guilty of: that's what's got him so riled.

Then why does he stay quiet about it whenever it's AMD/ATI doing it?
If you read his posts (I wouldn't call them "stories") there's always a sense of an almost personal grudge against Nvidia, for reasons I can't quite ascertain.
 
Cherry-picked benchmarks are one thing; flat-out lies about competing products are another. The latter is what Charlie thinks Nvidia is guilty of: that's what's got him so riled.

I just love how he gets so riled up over NV using AA when benchmarking a high end GPU. Oh the humanity!
 
PConline measured a die size of 330mm² in their review of the 8800 GT. :oops:
Blimey - I can't make out if the calipers are measuring the die or the die+sealant. But if it's really 330mm² then 289mm² was an entertaining diversion :LOL: and makes the comparison with RV670's 194mm² even more fun.
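Putting rough numbers on that comparison (a quick sketch using only the die-area figures quoted in this thread, which are themselves unconfirmed):

```python
# Die areas quoted in the thread (mm²) - rumoured/measured, not official specs
g92_measured = 330   # PConline caliper measurement
g92_rumoured = 289   # the earlier rumoured figure
rv670 = 194          # RV670 figure cited above

# How much bigger G92 would be than RV670 under each figure
ratio_measured = g92_measured / rv670   # ~1.70x
ratio_rumoured = g92_rumoured / rv670   # ~1.49x

print(f"G92/RV670 (measured): {ratio_measured:.2f}x")
print(f"G92/RV670 (rumoured): {ratio_rumoured:.2f}x")
```

So if the 330mm² measurement holds, G92 would be roughly 70% larger than RV670 rather than the ~50% the 289mm² figure implied.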

Why is G80@65nm = G92 not a high-end chip?
Put a dual-slot cooler on it, raise VGPU to 1.25-1.3V, clock it up to ~800/2400MHz, and buy some GDDR4 from Samsung at 1.2GHz+.
That memory would be too slow to achieve 8800GTX performance (though it'll be interesting to see how close). And there's still the question of whether the chip can support GDDR4.

I think you're relying too much upon it being 384-bit.

Jawed
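Jawed's objection can be checked with back-of-the-envelope arithmetic (a rough sketch; the bus widths and memory clocks are the speculated figures from this thread, not confirmed specs):

```python
# Rough memory-bandwidth arithmetic for the configurations discussed above.
# GDDR3/GDDR4 are double-pumped: two transfers per memory clock.
def bandwidth_gbs(mem_clock_mhz, bus_bits, pumping=2):
    """Peak bandwidth in GB/s = clock (MHz) * transfers/clock * bus bytes / 1000."""
    return mem_clock_mhz * pumping * (bus_bits / 8) / 1000

# 8800 GTX: 900 MHz GDDR3 on a 384-bit bus
gtx = bandwidth_gbs(900, 384)        # 86.4 GB/s

# Proposed 1.2 GHz GDDR4, but on a 256-bit bus (if G92 turns out not to be 384-bit)
g92_256 = bandwidth_gbs(1200, 256)   # 76.8 GB/s - short of the GTX

# The same memory on a 384-bit bus would comfortably exceed the GTX
g92_384 = bandwidth_gbs(1200, 384)   # 115.2 GB/s

print(gtx, g92_256, g92_384)
```

Which is presumably the point: 1.2GHz GDDR4 only beats the 8800 GTX's bandwidth if the 384-bit assumption holds; on 256-bit it falls about 10% short.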
 
Cherry-picked benchmarks are one thing; flat-out lies about competing products are another. The latter is what Charlie thinks Nvidia is guilty of: that's what's got him so riled.

No, Charlie has been an NV hater for over a year now and to date has never reported a single bad thing about ATI.
 
Then why does he stay quiet about it whenever it's AMD/ATI doing it?
If you read his posts (I wouldn't call them "stories") there's always a sense of an almost personal grudge against Nvidia, for reasons I can't quite ascertain.


Exactly!
 
CJ said:
If yields really were that good they'd come out with a full ASIC configuration to blow the competition out of the water before they even got the chance to launch it just like they did with G80... but instead they opted for the 'saver' 112 SPs
First you say they needed to upgrade the specs to compete; now you are implying they downgraded the specs for yields... make up your mind. Furthermore, unless you can cite the MSRP of the GT's competition, you're in no position to determine if anything is or is not getting "blown out of the water."
 