The G92 Architecture Rumours & Speculation Thread

If G92 really is the 4-cluster chip the rumour mill suggests, then it is far from beating an 8800 Ultra, even with high clocks and small optimizations.

My guess is still a dual-GPU SKU for the high end, but perhaps with some advanced technology: a dual-die package in which both GPUs share a single memory controller equipped with 1GB of fast GDDR4 (1.4GHz+), plus Hybrid-SLI, so that one GPU is disabled in 2D mode.
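
For a rough sense of why a 4-cluster part would struggle against the 8800 Ultra, here is a back-of-the-envelope MADD-throughput sketch. The 8800 Ultra numbers are its public specs (8 clusters of 16 SPs at 1.512GHz); the 4-cluster configuration and its 2.0GHz shader clock are purely assumed values for illustration, not anything rumoured.

```python
# Peak MADD throughput for a G80-style shader core (2 flops per SP per clock).
# 8800 Ultra figures are the public specs; the "4-cluster part" entry is a
# hypothetical configuration with an assumed 2.0GHz shader clock.

def madd_gflops(clusters, sps_per_cluster, shader_mhz):
    return clusters * sps_per_cluster * shader_mhz * 2 / 1000.0

ultra  = madd_gflops(8, 16, 1512)  # 8800 Ultra: 128 SPs @ 1.512GHz -> ~387 GFLOPS
g92_4c = madd_gflops(4, 16, 2000)  # assumed 4 clusters @ 2.0GHz    -> ~256 GFLOPS

print(f"8800 Ultra           : {ultra:.0f} GFLOPS")
print(f"4-cluster part @2GHz : {g92_4c:.0f} GFLOPS")
```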
 
8 clusters, 6 ROP partitions and a VP2? :???:
[Image: nvspoctq4.jpg]


And this slide seems to say that all GeForce 8 cards will support DX10.1 (via a driver update):
[Image: g8xdx101di8.jpg]


source
 
8 clusters, 6 ROP partitions and a VP2? :???:
Considering that part was merely a cropped G80 diagram, I wouldn't exclude a 'bad copy-paste'... But still, strange! 6 Quad ROPs would certainly go against everything said so far...
 
But still, strange! 6 Quad ROPs would certainly go against everything said so far...

That was said for G92 (the new mid-range/performance part) and not for the hidden "G90". ;)
Also, an order from NV to Samsung was a small indication that there might be a 384-bit GPU...
 
That was said for G92 (the new mid-range/performance part) and not for the hidden "G90". ;)
I wouldn't exclude a 55nm shrink with GDDR5, but I am very skeptical about the existence of a real G90. I'm sure NVIDIA knows by now that AMD is going for a multi-GPU strategy going forward, with R700 and maybe RV670, so why shouldn't they do the same? Both from a financial and a mindshare point of view, there is no reason for them to create a GPU with $399+ as its sole market segment anymore.

Of course, I could be horribly wrong. I know nothing about G90, and have never heard about it from anyone reliable. That doesn't mean it doesn't exist of course - but call me skeptical for now!

Also, an order from NV to Samsung was a small indication that there might be a 384-bit GPU...
Or a 192-bit one... ;) Can you say anything more about this? I'm especially interested in the kind of memory G9x will use. As I said above, GDDR3 vs GDDR4 would be a very important factor in figuring out G92's target markets. There is no way a 256-bit GDDR3 GPU could compete against an 8800 Ultra, but with GDDR4... In fact, GDDR3 could be very limiting for G96 too, I would presume. Unless that supports GDDR5 too, which I doubt.
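
To put rough numbers on that bandwidth question, here is a quick sketch. The 8800 Ultra line uses its known specs (384-bit, 2.16GT/s GDDR3); the 256-bit GDDR3 and GDDR4 data rates are assumed values chosen only to illustrate the gap.

```python
# Memory bandwidth in GB/s = bus width in bytes * effective data rate in GT/s.

def bandwidth_gbs(bus_bits, effective_gts):
    return bus_bits / 8 * effective_gts

print(bandwidth_gbs(384, 2.16))  # 8800 Ultra: 384-bit GDDR3 @ 2.16GT/s -> ~103.7 GB/s
print(bandwidth_gbs(256, 2.0))   # assumed 256-bit GDDR3 @ 2.0GT/s      -> 64.0 GB/s
print(bandwidth_gbs(256, 2.8))   # assumed 256-bit GDDR4 @ 2.8GT/s      -> ~89.6 GB/s
```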
 
Of course, I could be horribly wrong. I know nothing about G90, and have never heard about it from anyone reliable. That doesn't mean it doesn't exist of course - but call me skeptical for now!

I think even if they have a G90 somewhere, and it is the real high-end, they can keep tweaking it until AMD comes out with a new high-end announcement. After all, the G80s are selling well and given the time, they are probably quite a bit cheaper to produce. Why should they pull out a G90 now, while the G80 is still king-of-the-hill? Unless they had it ready shortly after the G80 release already (but that would have been the G81 refresh chip?).

The only argument I've seen so far is that they told their investors a new high-end is coming - is there any press release or other source for that?
 
I wouldn't exclude a 55nm shrink with GDDR5, but I am very skeptical about the existence of a real G90. I'm sure NVIDIA knows by now that AMD is going for a multi-GPU strategy going forward, with R700 and maybe RV670, so why shouldn't they do the same? Both from a financial and a mindshare point of view, there is no reason for them to create a GPU with $399+ as its sole market segment anymore.

Why would NVIDIA stop producing large GPUs? It seems to be working very well for them. Whether ATI actually will or not is still just a rumor, but ATI is clearly not executing as well as NVIDIA in many areas.

NVIDIA has the R&D capability to do the big designs and with reconfigurability it seems like the economics are only getting better.
 
Why would NVIDIA stop producing large GPUs? It seems to be working very well for them.
It does at the top end, yes. But when your die size disparity is that large, you need four different chips to fill all pricepoints. ATI already did that, and it's because they realized how expensive and complex that was that they seem to be switching to dual-die at the top.

NVIDIA is still on three chips (they kinda sorta were on four in the NV4x era, but never mind that), however the price they had to pay for that is clear: the gap between the 8600GTS and the 8800GTS is huge. There is no easy solution to that problem: either you add an extra chip to the line-up (RIP, G82) or you use a dual-die solution for the enthusiast segment.
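
A rough illustration of that gap, if I have the specs right (peak MADD throughput and memory bandwidth only, so take it as a sketch rather than a real performance comparison):

```python
# Peak MADD GFLOPS (2 flops/SP/clock) and memory bandwidth (GB/s) from public specs.

def madd_gflops(sps, shader_mhz):
    return sps * shader_mhz * 2 / 1000.0

def bandwidth_gbs(bus_bits, effective_gts):
    return bus_bits / 8 * effective_gts

# 8600 GTS: 32 SPs @ 1.45GHz, 128-bit GDDR3 @ 2.0GT/s -> ~93 GFLOPS, 32 GB/s
print("8600 GTS:", madd_gflops(32, 1450), "GFLOPS,", bandwidth_gbs(128, 2.0), "GB/s")
# 8800 GTS: 96 SPs @ 1.188GHz, 320-bit GDDR3 @ 1.6GT/s -> ~228 GFLOPS, 64 GB/s
print("8800 GTS:", madd_gflops(96, 1188), "GFLOPS,", bandwidth_gbs(320, 1.6), "GB/s")
```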

Because my premise was that I don't think having 4 chips makes much financial sense, my conclusion was that NVIDIA would naturally be tempted to also get rid of 'monster chips'. Remember that before G80, the largest chip NVIDIA ever made was G70, which was in the low 300mm²s, iirc. This is not really changing strategy per se; it's just going back to the previous one they had with G71.
 
After all, the G80s are selling well and given the time, they are probably quite a bit cheaper to produce. Why should they pull out a G90 now, while the G80 is still king-of-the-hill?

Are you supposed to have mercy for the weak in business? Why wouldn't Nvidia take the opportunity to further harm the competition? In the long run, Nvidia stands to gain much more if they can completely eliminate the competition than if they're slightly more profitable right now without G90 or some other monster GPU.
 
nV would actually end up losing possible sales if they don't put out a new flagship. They have to give people a reason to upgrade their cards; a person who has purchased an 8800 GTS isn't going to upgrade to a GTX or Ultra.

It's possible people with lesser cards would upgrade, but they usually won't go higher than mid-range because of cost.
 
Why even produce G90 when it's clear ATI won't have an enthusiast response to G80 until R700 comes out perhaps a year from now? Save the R&D budget and switch the engineers over to other projects. It's the smart thing to do, IMHO.
 
Umh... maybe, or maybe not.
The biggest mistake for Nvidia would be to underestimate AMD...
 
Umh... maybe, or maybe not.
The biggest mistake for Nvidia would be to underestimate AMD...

It's not like AMD can just turn the ship around and come out with an enthusiast response to G80 this generation... R6xx (single GPU) cannot compete with G80 GTX/Ultra except in a few select benchmarks. They need multi-GPU to compete and clearly dual-R600 isn't enough to get the job done (not at its current price point anyway). Dual RV670 is a more likely candidate, but unless they can pull a miracle out of their collective ass with R680 I think we won't see a competitive enthusiast single GPU part from AMD until R700.
 
Why even produce G90 when it's clear ATI won't have an enthusiast response to G80 until R700 comes out perhaps a year from now? Save the R&D budget and switch the engineers over to other projects. It's the smart thing to do, IMHO.

That's absolutely NOT the smart thing to do. It's never a smart idea to rest on your ass and hope and pray no one surpasses you. Do I even need to mention NV30?
 
That's absolutely NOT the smart thing to do. It's never a smart idea to rest on your ass and hope and pray no one surpasses you. Do I even need to mention NV30?

I never said anything about resting on their ass. AMD has no chance of competing this generation at the high-end. G80 is so far out in front of R600 it would literally take a miracle for AMD to compete. A multi-GPU G9x part using "mid-range" or performance parts in the enthusiast segment is the smartest move here, due to economies of scale and potential ASPs & margins. G100 will take on R700, and not until late next year at that. Why design yet another monolithic enthusiast-class GPU if it has no competition but its predecessors? They'd only be robbing themselves of potential sales.
 
That sounds like 3DFX. They were #1 for years and lived off of what were basically just revisions of Voodoo 1. And then they got blown away.

Just because G80 is faster than R600 doesn't mean ATI won't solve that soon. Who knows? And the difference isn't that immense between the two until you turn on AA. I've seen a few game benches with R600 up around 8800GTX. If the manufacturing process had worked out better and they'd managed another 100 MHz or so, it probably would be pretty competitive. But they once again bet on a new process. And made weird decisions with regards to AA (and lots of other things too lol)...
 
Would a multi-GPU single part be capable of running multi-monitor?

"Why design yet another monolithic enthusiast-class X if it has no competition but its predecessors?"

By that same token, we'd still be running Intel 8086 CPUs and 3dfx Voodoo 1's. Now do you see why that question seems so silly to me? It's called technology advancement.
 
It does at the top end, yes. But when your die size disparity is that large, you need four different chips to fill all pricepoints. ATI already did that, and it's because they realized how expensive and complex that was that they seem to be switching to dual-die at the top.

NVIDIA is still on three chips (they kinda sorta were on four in the NV4x era, but never mind that), however the price they had to pay for that is clear: the gap between the 8600GTS and the 8800GTS is huge. There is no easy solution to that problem: either you add an extra chip to the line-up (RIP, G82) or you use a dual-die solution for the enthusiast segment.

Personally, I think the performance disparity between the 8600 and 8800 GTS is more of an anomaly due to process, architecture, and economics. Clearly DX10 required a big step up in transistors, and obviously NVIDIA didn't want to hold back on performance at the high end or risk a new process, so they created a monster of a chip. The custom design of course helped with the power consumption and thermals. The 8600 sells in a much more price-sensitive market, so die size is not really flexible there.

The 80 and 90 nm nodes do not seem like the sweet spot for DX10. At 65 nm NVIDIA can shrink the die size of a high-end chip and increase the performance of the mainstream parts. Whether the delta will be quite the same as in the NV40 era remains to be seen.
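
As a rough scaling sketch: an ideal optical shrink scales area with the square of the node ratio, ignoring everything that makes real shrinks less than ideal. The ~480mm² figure for G80 is the commonly quoted estimate, not an official number.

```python
# Ideal area scaling for a die shrink: area scales with (new_node / old_node)^2.

def shrunk_area(area_mm2, old_nm, new_nm):
    return area_mm2 * (new_nm / old_nm) ** 2

print(shrunk_area(480, 90, 65))  # G80-class die, 90nm -> 65nm: ~250 mm2 in the ideal case
```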
 