The G92 Architecture Rumours & Speculation Thread

It looks like AMD really succeeded in crashing nV's party by releasing the HD2900Pro, which offers better performance for the same price one and a half months sooner.

Do not forget the better power-efficiency, PureVideo HD and D3D10.1...

And we should not underestimate NV, because they are supposed to be very confident of their G92-lineup. ;)
 

You do realize that by the time G92 is released, RV670 will be released too (middle of November) and well... you've seen the power efficiency numbers and it has UVD and D3D10.1... ;)

The HD2900Pro is just a card to hold us over until the real thing and to spoil nV's party. In fact the HD2900Pro will go EOL in about 2 weeks' time. nV shouldn't be worrying about the HD2900Pro... it's RV670.

Well, NV40 was 8x2, wasn't it?

Wasn't that R420 with its extreme pipelines? :p
 
No, both IHVs claimed to be preparing 8-pipeline parts and strategically leaked such information to certain parties. In fact, in a final act of mockery, NVIDIA briefed the European paper press with slightly wrong specs on purpose, IIRC.
 
Why would there be a big difference between debugging an 8800 and an 8600? If the majority of the chip consists of replicated parts, it comes down to (almost) the same thing.


As shown by ever-increasing margins, I don't see indications that yields are a big problem right now. I'm sure there will be some point in the future where they may be, but it's not common for companies to stop pushing the envelope before they get burned. ;)


... that are still facing the problem of not having a very high bandwidth data exchange interface.

High-end customers better hope that we're quite a bit away from only multi-chip solutions.

I'd argue that it's harder to find a package in an apartment building of 100 apartments than in one of only 50, even though the apartments are identical copies of each other. Chasing a bug in a huge billion-transistor chip may be a tad more problematic than chasing one in a 500-million one, even though they're both comprised of similar building blocks.
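The same size argument bites on the manufacturing side too. The textbook Poisson defect-yield model says the fraction of clean dies falls exponentially with die area, which is the usual case people make for two smaller dies over one monolithic one. A rough sketch; the defect density, die areas, and wafer area below are illustrative assumptions, not anything either IHV has published, and the Poisson model itself is a simplification:

```python
import math

WAFER_AREA = 706.9  # usable cm^2 on a 300 mm wafer, ignoring edge loss

def poisson_yield(area_cm2, defects_per_cm2):
    """Classic Poisson model: fraction of defect-free dies, Y = exp(-A * D)."""
    return math.exp(-area_cm2 * defects_per_cm2)

def good_dies_per_wafer(area_cm2, defects_per_cm2):
    """Candidate dies per wafer times the fraction that come out clean."""
    return (WAFER_AREA / area_cm2) * poisson_yield(area_cm2, defects_per_cm2)

D = 0.5  # hypothetical defect density in defects/cm^2, purely illustrative

mono = good_dies_per_wafer(4.8, D)   # one big ~480 mm^2 die per GPU
half = good_dies_per_wafer(2.4, D)   # two ~240 mm^2 dies per GPU

print(f"good monolithic dies per wafer: {mono:.0f}")      # ~13
print(f"good half-size dies per wafer:  {half:.0f}")      # ~89
print(f"dual-die GPUs per wafer:        {half / 2:.0f}")  # ~44
```

Under those made-up numbers a wafer gives roughly three times as many dual-die GPUs as monolithic ones, though pairing, packaging and the interconnect eat into that advantage.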

Ever-increasing margins on which segment? The prices have also been ever increasing... I guess it depends on whether or not $1000 cards are economically viable... the Ultra hasn't yet answered that question, I think.

I agree on the issues WRT high-speed interconnects between the chips, and bear in mind that I'm not saying the move into multi-chip/multi-die for the high-end is something that'll come into full force tomorrow. My guess (I underline the word guess) is that once we get to multiple dies on a single package, a high-speed solution will be in place, perhaps an evolution of HT or something similar. With enough incentive, this is hardly an unsolvable problem, IMHO.

High-end customers better hope for some form of actual competition, be it with single chips or with multiple ones, as the current pricing scheme is fairly ridiculous. :D
 
No, both IHVs claimed to be preparing 8-pipeline parts and strategically leaked such information to certain parties. In fact, in a final act of mockery, NVIDIA briefed the European paper press with slightly wrong specs on purpose, IIRC.

Ah yes... the big misinformation campaign. Didn't one of them (I believe it was ATi) even give 8-pipe versions to the developers, which confused everyone even more?
 
No, both IHVs claimed to be preparing 8-pipeline parts and strategically leaked such information to certain parties. In fact, in a final act of mockery, NVIDIA briefed the European paper press with slightly wrong specs on purpose, IIRC.

You are saying that European periodicals printed misinformation prior to launch? Even if so, what of the possibility that that info was just rehashed rumors or speculation on the part of the press? Not that PR types wouldn't stoop so low, but that does seem awfully bizarre. What would the point of that be, other than to have the paper press look embarrassed? It's not like print journalists are necessarily any more diligent or ethical than some of their notorious online comrades.
 
You are saying that European periodicals printed misinformation prior to launch? Even if so, what of the possibility that that info was just rehashed rumors or speculation on the part of the press?
It wasn't - I don't remember all the details though; that was 4 years ago or so, after all! I can't remember if NV contacted them to correct the information post-briefing/pre-publishing, but it wouldn't surprise me. Anyway, they obviously didn't give wrong data on things they insisted upon or that the paper press cared about.

It was stuff that techies and ATI would have cared about, however. The goal obviously was to make everyone think that NV40 was more of a logical evolution of NV3x than it really was, and thus underestimate it.

CJ: Once again, I don't remember, but it wouldn't surprise me...
 
Yes, but could Nvidia launch the 8800GT with all 128 SPs enabled?
They already have a 128 SP part -- that's G80, and it's on 90nm. Why in the hell wouldn't they be able to do essentially the same part on 65nm???
Something doesn't add up in the known NV's G9x line... I smell a smokescreen too =)
P.S. And what's G96???
 
It wasn't - I don't remember all the details though; that was 4 years ago or so, after all! I can't remember if NV contacted them to correct the information post-briefing/pre-publishing, but it wouldn't surprise me. Anyway, they obviously didn't give wrong data on things they insisted upon or that the paper press cared about.

It was stuff that techies and ATI would have cared about, however. The goal obviously was to make everyone think that NV40 was more of a logical evolution of NV3x than it really was, and thus underestimate it.

CJ: Once again, I don't remember, but it wouldn't surprise me...

A trick they repeated with G70, but even more so. "No, you don't have to have unified pipelines," said Dr. James T. Kirk, PhD. "Warp factor 5, Mr. Sulu..."

Strangely, Nvidia have said nothing recently to throw us off the track. In fact they have said nothing at all, which is probably actually quite worrying ;)
 
P.S. And what's G96???
I think G96 is a G84 replacement, while G98 is a G86 replacement. (ohh, look what that pattern gives us for G92-G80! heh... NVIDIA's codenames are wonderfully illogical though so that doesn't really mean anything)
 
Strangely, Nvidia have said nothing recently to throw us off the track. In fact they have said nothing at all, which is probably actually quite worrying ;)
Well, they were talking non-stop during NV30 development, so I wouldn't worry if they suddenly became quiet 8)
 
Ah yes... the big misinformation campaign. Didn't one of them (I believe it was ATi) even give 8-pipe versions to the developers, which confused everyone even more?

It was 12 pipelines, when ATI heard before the launch that Nvidia would ship an 8x2 product.
 
I think G96 is a G84 replacement, while G98 is a G86 replacement. (ohh, look what that pattern gives us for G92-G80! heh... NVIDIA's codenames are wonderfully illogical though so that doesn't really mean anything)
+2 eh? 10 TCPs for G92? :D
 
I was hoping that they'd release the GeForce 9 series this November, because we're nearly at the end of the year. (I know this sounds silly, but since we can't trust any info, I've got nothing to believe in.) If you take a look at history, you'll see that Nvidia has released a new GPU every year except for the GeForce 4 and FX series (both released in the same year, 2002).
 
Just heard from a fairly reliable source that the G92 aka GF8700 performs in between a GF8600GTS and a GF8800GTS. It looks like AMD really succeeded in crashing nV's party by releasing the HD2900Pro, which offers better performance for the same price one and a half months sooner. And G92 will go up against RV670 in November... and that fight should be in favor of the RV670...

There will be two versions of the GF8700, a GTS and a GX2, which was sort of confirmed by Kinc yesterday, who said that there will be a GX2 version of the "die shrink".

I hope it won't be a GX2-like card.

Then how does NVIDIA want to win against ATI's RV670? It is said that G92 is between the GF8600GTS and GF8800GTS, but RV670 is about R600 performance, so...?
 
They already have a 128 SP part -- that's G80, and it's on 90nm. Why in the hell wouldn't they be able to do essentially the same part on 65nm???

Yeah, that doesn't sit right with me either. If they were in fact going for a baseline chip which they will use in dual-chip configurations at the high end, a 64-shader part at 65nm seems to be aiming extremely low for the baseline. They could simply shrink G80 and have a beast of a performance/mainstream product at great margins. So exactly why would they aim so low? They can't be that greedy... or can they? :smile:
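For what it's worth, the back-of-the-envelope behind that "simply shrink G80" idea looks like this. It assumes an ideal linear shrink (real ports never recover the full amount) and the commonly cited ~480 mm^2 estimate for G80 at 90nm:

```python
# Ideal optical-shrink arithmetic: die area scales with the square of the
# feature-size ratio. Real shrinks fall short of this, so it's an upper bound.
g80_area_mm2 = 480.0                 # commonly cited estimate for G80 at 90nm
area_scale = (65 / 90) ** 2          # ~0.52x the area for the same transistors

shrunk_area = g80_area_mm2 * area_scale
print(f"ideal 65nm G80 shrink: ~{shrunk_area:.0f} mm^2")         # ~250 mm^2
print(f"candidate dies per wafer: ~{1 / area_scale:.1f}x more")  # ~1.9x
```

Roughly twice the candidate dies per wafer, on top of whatever yield gain the smaller die brings, which is why a straight shrink at great margins doesn't sound far-fetched.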
 
Another thought:

Maybe some people misinterpret the dual-card rumor and this will be exclusive to Tesla (and maybe Quadro) to get more power into the 1U racks (4->8 GPUs + higher clocks @ the same power envelope).

In November enthusiasts will maybe see the hidden G90 (a G80 @ 65nm with the rumored clocks of ~0.8/2.4/1.6GHz + D3D10.1, IO+VP3), which will compete against dual RV670 (look at the bandwidth...).
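On the "look at the bandwidth" aside, the arithmetic is just bus width times effective data rate. Only the 8800GTX line below is a shipping configuration; the 65nm G80 and RV670 data rates are guesses built on the rumored clocks:

```python
def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers/s."""
    return (bus_width_bits / 8) * data_rate_gtps

# Shipping reference point:
print(bandwidth_gbps(384, 1.8))      # 8800GTX: 384-bit @ 900MHz DDR = 86.4 GB/s

# Guesswork, reading the rumored 1.6GHz as a double-data-rate memory clock:
print(bandwidth_gbps(384, 3.2))      # 65nm G80: 384-bit @ 3.2 GT/s = 153.6 GB/s
print(2 * bandwidth_gbps(256, 2.3))  # dual RV670: two 256-bit buses = 147.2 GB/s
```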

Imagine this in Triple-SLI, which NV will launch soon with nF780a/i -> no signs of a rebirth of Quad-SLI ;)

G92 will be a different GPU, which, as the rumors suppose, is the new mid-range part.
Putting two 4C chips together, with all the redundancy (transistors, memory) and the problems of multi-GPU (lag, inhomogeneous frame times), would not make much sense, would it?
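On the "inhomogeneous frame times" point: with alternate-frame rendering the two GPUs finish frames in bunched pairs, so present-to-present intervals alternate short/long even when the average frame rate looks fine. A toy timeline, with all timings hypothetical:

```python
# Toy AFR timeline: frames are submitted every `dispatch_gap` ms and one GPU
# takes `render_time` ms per frame; each frame must wait for the *same* GPU's
# previous frame (two GPUs alternate), so presents bunch into uneven pairs.
render_time = 30.0   # ms per frame on one GPU (hypothetical)
dispatch_gap = 5.0   # ms between consecutive frame submissions (hypothetical)

presents = []
for frame in range(8):
    submit = frame * dispatch_gap
    same_gpu_free = presents[-2] if frame >= 2 else 0.0
    presents.append(max(submit, same_gpu_free) + render_time)

intervals = [round(b - a, 1) for a, b in zip(presents, presents[1:])]
print(intervals)  # [5.0, 25.0, 5.0, 25.0, ...] instead of a steady 15 ms
```

Average throughput is the 15 ms per frame that two GPUs promise, but delivery alternates 5/25 ms, which is exactly the frame-time complaint.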
 
Yeah, that doesn't sit right with me either. If they were in fact going for a baseline chip which they will use in dual-chip configurations at the high end, a 64-shader part at 65nm seems to be aiming extremely low for the baseline. They could simply shrink G80 and have a beast of a performance/mainstream product at great margins. So exactly why would they aim so low? They can't be that greedy... or can they? :smile:
I see two possibilities right now:
1. All we know about G92 is wrong; this is a hi-end chip aimed at beating G80 performance levels by quite a good margin (a dual-chip board configuration (GX2) is possible if this chip is just a shrink of G80 to 65nm + higher clockspeeds). This essentially puts G92 in RV670 territory (which is assumed to be a shrink of R600 to 55nm, plus a 256-bit bus instead of 512-bit).
2. G92 is GF8700 (3-4 TCPs, 256-bit bus, 8600GTS-8800GTS performance levels). But this means that there should be another G9x -- the one that would be hi-end and would compete with RV670 either in SLI or as one chip (as a GX2 as I described above, or as G80U vs R600CF now, if this hi-end G9x has more than 128 SPs, a 384-512-bit bus, etc.).
For now I'm leaning towards possibility 2.1 :) The one where G92 is GF8700, but we're missing a shrunk G80 (G90? G91? who knows...) and this hi-end G9x will either pop out of nowhere in November (doubtful, yeah) or will come out in the 1st Q of 2008 (more likely).
 