The G92 Architecture Rumours & Speculation Thread

As said, out of thin air. And I think they'd only reveal/release that if ATI's offerings are competitive, otherwise not (and with all due respect, I don't think ATI/AMD will hit the jackpot, based on the happenings of the last few gens). And although I'm saying this, if RV670 should indeed consume only 80-something Watts, that is my next card regardless of what nV offers.
 
I'll make one of my crystal-ball, unfounded, out-of-thin-air predictions again: 192 SPs, a 512-bit bus and 1 GB of memory for the next nV high-end part (next as in "this round"), trouncing AMD/ATI into oblivion. Big and bold.

Hey, I almost totally agree. :D

Only 2x96, 2x256-bit, 1024MB. ;)

I guess we'll see. I think Utt's guess sounds fairly logical. I agree on the three single-core SKUs (obviously G92 has a possible 8 TCPs, as the push from the original 6 to 7 on the GT has shown, as well as the rumored G92_300 almost certainly being 8 TCPs) as well as a dual part, although I still think it'll be with slightly cut-down cores...although I bet they wish they didn't have to. I just don't see them fitting two full-fledged 8TCP G92s on one part for fear of a TDP over 225W, which surely that would be. I don't think ATi or Nvidia would be ballsy enough to cross that barrier...yet. I know the spec allows for 300W, but I think they'll wait until PCI-E 2.0 becomes more the standard than the elite. I doubt there is a single high-end core with more than 8 TCPs, but I also doubt they would surrender the high-end to a dual RV670. That leads me to the conclusion of a 2x6TCP part that will equal if not exceed R680, all while staying on a fairly even keel with power consumption (<225W). I suppose a 2x7TCP part could also be possible, but that to me would seem to be pushing the envelope right to the edge.
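A quick back-of-the-envelope check on that <225W reasoning, as a rough sketch only: the 110W/7-TCP data point is the 8800 GT figure cited later in the thread, while the linear per-cluster scaling and the ~15W of non-GPU board/memory power per core are my own assumptions.

[code]
# Back-of-the-envelope TDP check for the dual-core speculation above.
# Assumption (mine, not from the thread): core power scales roughly
# linearly with cluster count, with ~15 W of board overhead per core.
GT_TDP_W = 110  # rumored 8800 GT board TDP (cited later in the thread)
GT_TCPS = 7     # the GT's cluster count per the post above
BOARD_W = 15    # assumed non-GPU board/memory power, my guess

core_w_per_tcp = (GT_TDP_W - BOARD_W) / GT_TCPS  # ~13.6 W per cluster

for tcps in (6, 7, 8):
    dual_tdp = 2 * (core_w_per_tcp * tcps + BOARD_W)
    verdict = "fits" if dual_tdp <= 225 else "exceeds"
    print(f"2x{tcps} TCP: ~{dual_tdp:.0f} W ({verdict} a 225 W budget)")
[/code]

By this crude math a 2x6 part has comfortable headroom (~193W), 2x7 lands right at the edge (~220W), and 2x8 blows the budget (~247W), which lines up with the post above.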
 
I think Jawed is way too much into tech to see the "reality" ;) (no pun Jawed, you're just almost a living supercomputer and ahead of the retail happenings)
 
I think Jawed is way too much into tech to see the "reality" ;) (no pun Jawed, you're just almost a living supercomputer and ahead of the retail happenings)


But we love him all the same.

Or should I say:

01110111 01100101 01101100 01101111 01110110 01100101 01111001 01100001 01001010 01000001 01010111 01000101 01000100
 
I think Jawed is way too much into tech to see the "reality" ;) (no pun Jawed, you're just almost a living supercomputer and ahead of the retail happenings)
Hey, I think Turtle meant Arun's guesses...

42 6C 75 73 68 21 20 48 65 78 20 69 73 20 6E 69 63 65 72 20 3A 44

Jawed
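For anyone who doesn't sight-read binary or hex, the two messages above decode with a couple of one-liners (plain ASCII; nothing assumed beyond the strings themselves):

[code]
# Decode the two in-joke messages above: 8-bit binary and hex ASCII.
binary_msg = ("01110111 01100101 01101100 01101111 01110110 01100101 "
              "01111001 01100001 01001010 01000001 01010111 01000101 01000100")
hex_msg = "42 6C 75 73 68 21 20 48 65 78 20 69 73 20 6E 69 63 65 72 20 3A 44"

print("".join(chr(int(b, 2)) for b in binary_msg.split()))  # weloveyaJAWED
print("".join(chr(int(h, 16)) for h in hex_msg.split()))    # Blush! Hex is nicer :D
[/code]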
 
Because the people on this site who actually know something never say anything... which is probably why they know something.
 
I suppose a 2x7TCP part could also be possible, but that to me would seem to be pushing the envelope right to the edge.
Maybe NVidia's saving the 8 cluster dies for the GX2, for a repeat of the 7800GTX-512 effect.

Jawed
 
One thing I'm having trouble understanding is how they (supposedly) are getting such good yields for the GT part... considering the SRP of the 320MB GTS something seems very odd.
 
One thing I'm having trouble understanding is how they (supposedly) are getting such good yields for the GT part... considering the SRP of the 320MB GTS something seems very odd.

Maybe they were just making a killing on G80 in general.
 
Why do you think NV will offer a GX2 for consumers again?

In my opinion it collides with 3-way SLI; more than 3 GPUs in AFR doesn't make much sense, as we saw with the 79x0GX2, which could never offer 4-way AFR under D3D9, and Quad-SLI is not supported under Vista.
The other problem is the latency of the AFR method; the 2 frames of lag with 3-way are only just acceptable.

Also, the 289mm² die, which I think includes a full "G80", is too big for such a SKU; G71 was ~200mm², like RV670...

The 8800GT already reaches its 600/1500MHz at 1.05V with a 110W TDP, so the often-rumored ~800/2400MHz should be no problem at roughly an 8800GTX-level TDP.

Paired with some fast GDDR4, this SKU would IMO be enough to beat a dual RV670, which I think is, in the best case, 2x R600XT.

Another advantage would be: 3 high-end GPUs > 2x2 performance GPUs...;)
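That clock claim is easy to sanity-check with the crude CMOS approximation P ~ f·V². A minimal sketch: only the 600MHz/1.05V/110W point comes from the post above; the 1.15V target voltage is my assumption.

[code]
# Crude dynamic-power scaling: P ~ f * V^2.
# Only the 600 MHz / 1.05 V / 110 W point comes from the post above;
# the 1.15 V target voltage is my assumption.
base_mhz, base_v, base_w = 600, 1.05, 110
target_mhz, target_v = 800, 1.15

est_w = base_w * (target_mhz / base_mhz) * (target_v / base_v) ** 2
print(f"~{est_w:.0f} W at {target_mhz} MHz / {target_v} V")  # ~176 W
[/code]

~176W lands in roughly 8800GTX territory, so the claim is at least self-consistent.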
 
I'm surprised Gibbo from overclockers.co.uk has not been quoted yet with his normal nuggets of information, nuggets of disinformation and marketing to sell products through the business he represents.

Here is what he says

"Hi there

Now that's misleading people by quite some margin.

I've got several brands on order and none of them will be that cheap delivered.

OcUK will be selling BFG/EVGA branded 512MB models in the £140 - £160+VAT region, the more expensive ones are overclocked.

We shall have our own brand of 512MB ones which we should be able to do for around £120-£130+VAT.

For those wondering, they are quicker than the 8800 GTS 320MB and no doubt as quick, and quicker in some games, than the 640MB too, so they are better value.

They have faster core, shader and memory speeds than the GTS, along with more stream processors. The only weakness is the 256-bit interface, which is not really a weakness as such.

The 256MB will be cheaper, but the 512MB will be available first and in better quantity; the 256MB version should be around £20-£30 cheaper"

Gibbo also says "End of this month, can't put on pre-order due to NDA at the moment."

Never stopped them before; maybe the real gangstas at nvidia decided to put a horse's head in his bed? :D



They are selling the GTS 320MB starting from £178, so this is a fair bit cheaper given similar performance. Who is going to buy a 640MB GTS for over £200? Not me, if I were in the market.

Looking at the 8600GTS, it looks to be under pressure from the 256MB version as well.
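To put Gibbo's ex-VAT ranges next to that £178 figure, a quick conversion, assuming the UK's 17.5% VAT rate of the time:

[code]
# Convert Gibbo's ex-VAT ranges to VAT-inclusive prices, assuming the
# UK's 17.5% VAT rate of the time.
VAT = 0.175

for label, lo, hi in [("BFG/EVGA 512MB", 140, 160),
                      ("OcUK own-brand 512MB", 120, 130)]:
    print(f"{label}: GBP {lo * (1 + VAT):.0f}-{hi * (1 + VAT):.0f} inc VAT")
[/code]

So roughly £165-£188 for the branded cards and £141-£153 for the own-brand ones, against £178 for the GTS 320MB.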
 
Since we've talked in the past about what the clock for G92 was supposed to be: if you read the news, some sites have said G92 runs at an 800+ MHz core clock.

Do you think the 8800GT is underclocked?
 
Well, according to that Asian site it has 16 TMUs & 16 ROPs. If this is really the case, I can't see how it's beating the GTS cards and coming so close to the GTX. I don't think the 16 extra SPs explain the performance gap. Something else we don't know about?
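The raw shader arithmetic at least makes the GTX comparison plausible. A rough sketch using the commonly rumored SP counts and shader clocks, with the usual (and debatable) 3 flops/clock MADD+MUL assumption for the G80 family:

[code]
# Theoretical shader throughput from the commonly rumored specs.
# Assumes 3 flops/clock (MADD + MUL) per SP, the usual G80-family figure.
cards = {
    "8800 GT":  (112, 1500),  # SPs, shader clock in MHz
    "8800 GTS": (96, 1200),
    "8800 GTX": (128, 1350),
}

for name, (sps, shader_mhz) in cards.items():
    gflops = sps * shader_mhz * 3 / 1000
    print(f"{name}: ~{gflops:.0f} GFLOPS")
[/code]

On that math the GT (~504 GFLOPS) sits within ~3% of the GTX (~518) in theoretical shader throughput, so the TMU/ROP figures may matter less than they look.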
 
Why should a 289mm² die have only a 256-bit bus and 8 clusters (32 TMUs/128 SPs)?

I think the most obvious guess is that Nvidia had to add a bit to the chip for DX10.1 certification. Unless rumors are correct and they are skipping DX10.1 this fall. But that would seem odd; I've never known Nvidia to skip out on a marketing checkmark if they could help it.

Either way, I'm hoping this round will be more competitive in the same timeframe (rather than ATI always being late and giving Nvidia a virtual monopoly for a few months). I'm rubbing my hands gleefully at the thought of real price competition.

Regards,
SB
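On the die-size question a few posts up, a naive optical-shrink estimate shows how much slack 289mm² actually leaves. A sketch only: the ~484mm² G80 figure and the perfect (65/90)² area-scaling assumption are mine; the 289mm² G92 figure comes from the thread.

[code]
# Naive optical-shrink estimate for the die-size question above.
# The ~484 mm^2 G80 figure and the ideal (65/90)^2 area scaling are
# my assumptions; the 289 mm^2 G92 figure comes from the thread.
g80_area_mm2 = 484
shrink = (65 / 90) ** 2  # ~0.52 for a perfect shrink

ideal = g80_area_mm2 * shrink
print(f"Ideal G80 shrink to 65 nm: ~{ideal:.0f} mm^2 vs G92's ~289 mm^2")
[/code]

The ~37mm² gap over a perfect shrink would be room for things like NVIO/VP2 integration and, per SB's guess, any DX10.1 additions.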
 