Nvidia "D8E" high-end solution: what can we expect in 2008?

Yeah, it is depressing if it's just G92 x 2. I'm not really interested in inefficient SLI or Crossfire tech that needs per-game developer attention...

Agreed. A mere 50% faster than the GTX *when it works* after nearly 2 years??

That's pretty pathetic.

Then again, that's what lack of competition gets you. Let's hope ATI uses this opportunity to catch up! NV got complacent when they took a serious lead with the GF4 family; here's hoping history is repeating itself.
 
That's pretty pathetic.

Pathetic. Yes.
Unexpected. No.

How much faster than 7800 GTX was the 7900 GTX ?
How much faster than Geforce 3 Ti 500 was the Geforce 4 Ti 4600 ?
How much faster than Geforce 256 DDR was the Geforce 2 Ultra ?
Etc, etc...

Exactly.
Most mid-life "by-products", or "mild updates" if you will, of a given technological generation of Nvidia GPUs were never that much faster than their predecessors.
These tend to arrive some 8 to 12 months into a generation's market life, for Nvidia at least.


It may seem like a "new 7950 GX2" at first, but as Vista adoption increases throughout 2008, so will the chances that multi-GPU setups become more effective, as certain Windows XP/DX9-era technical limitations are removed or become less important overall.
After all, this is a DX10/DX9 Vista performance monster, not exactly something that anyone still using XP would be inclined to invest in.
 
Ugh, really disappointed. I don't want a pseudo-SLI card.
 
Pathetic. Yes.
Unexpected. No.

How much faster than 7800 GTX was the 7900 GTX ?
How much faster than Geforce 3 Ti 500 was the Geforce 4 Ti 4600 ?
How much faster than Geforce 256 DDR was the Geforce 2 Ultra ?
Etc, etc...

I think you mean (at least I would say, using your format):

How much faster than the 6800 Ultra was the 7800 GTX?
(Since the 7800 GTX was the refresh of the 6800.)
 
The 9800 GX2 sounds like D8E, not D9E.

Two GeForce 8800s on a card, not GeForce 9 series.

It's still "NV50" G80/G92-based, not the refresh.

Though I haven't been keeping up with all the recently released and upcoming products using Nvidia's latest way of naming chips & cards.


Given that AMD isn't going to have R700 until late this year at the soonest, hopefully Nvidia will launch the actual high-end refresh of G80/GF8800 in the fall of 2008: the D9E, or what should've been called G90 (what I still call "NV55"), the supposed 1 TFLOP GPU that was mentioned last year.
 
I think you mean (at least I would say, using your format):

How much faster than the 6800 Ultra was the 7800 GTX?
(Since the 7800 GTX was the refresh of the 6800.)

Yeah, that was a somewhat stretched-out time frame for the basic GF6/GF7 architecture to last, but let's not forget that the 6800 Ultra lasted considerably longer than either the 7800 GTX or the 7900 GTX, and that there were some minor improvements in the general design going from GF6 to GF7 (which didn't happen between the two top-end GF7s).
The 7800 GTX 256MB was fast, yes, but not quite "twice as fast" as the 6800 Ultra.
 
I still have an older computer with an ATI AIW Radeon 9800 Pro; it's time to get :) an Nvidia GF9800GX2 or, in 2008, an R700 ATI Radeon 4600Ti :LOL:

Edit: -OR- GF4800Ti AGP8X = Radeon 4800Ti PCIE-2.0 :)
 
I think that the point to remember is that, as in the world of CPUs, increased parallelism with GPUs (i.e. multi-core/multi-chip) is ultimately the only way we're going to see significant improvements in performance in the future. A 'monster' chip with 1 billion+ transistors may just about be feasible now, but it isn't sensible to expect that we can keep increasing the size of GPUs to the same degree in the future.

All the IHVs know this is the case and that they're going to have to bite the bullet on multi-core/multi-chip eventually, so it doesn't seem to me a coincidence that they've both decided now is the time for the change. In fact, it seems to me that this is probably the ideal point in time. Both IHVs now have a good handle on the new DX10 technology, so they can focus their efforts on their SLI/Crossfire technology, as this will be extremely important in the future.

I expect it may be a slightly rough transition - we've seen many performance and compatibility problems with both SLI and Crossfire in recent months - but ultimately I think we'll benefit in the long term, especially if the software development houses jump on board and design with SLI/Crossfire in mind. Think of it as an investment - pay now, profit in the future.

P.S. When I say multi-core in regard to GPUs, I mean, of course, more than one chip on a single package.
 
Pathetic. Yes.
Unexpected. No.

How much faster than 7800 GTX was the 7900 GTX ?
How much faster than Geforce 3 Ti 500 was the Geforce 4 Ti 4600 ?
How much faster than Geforce 256 DDR was the Geforce 2 Ultra ?
Etc, etc...

Exactly.
Most mid-life "by-products", or "mild updates" if you will, of a given technological generation of Nvidia GPUs were never that much faster than their predecessors.
These tend to arrive some 8 to 12 months into a generation's market life, for Nvidia at least.

I disagree. We are talking about the performance jump between the first of each new GPU generation.

In that case the comparisons are:

GeForce 2 GTS --> GeForce 3 : 10 Months
GeForce 3 --> GeForce 4 Ti4600 : 12 Months
GeForce 4 Ti4600 --> GeForce 5800 Ultra : 9 Months
GeForce 5800 Ultra --> GeForce 6800 Ultra : 17 Months
GeForce 6800 Ultra --> GeForce 7800GTX : 14 Months
GeForce 7800 GTX --> GeForce 8800 GTX : 17 Months
GeForce 8800 GTX --> GeForce 9800 GX2 : 16 Months (assuming March launch)
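For what it's worth, those gaps can be sanity-checked with a few lines of Python. The launch dates below are approximate (roughly the announcement dates, from memory), so treat this as a rough cross-check rather than official figures:

```python
# Approximate (announcement) launch dates, as (name, year, month).
# These dates are assumptions from memory, not official NVIDIA data.
launches = [
    ("GeForce 2 GTS", 2000, 4),
    ("GeForce 3", 2001, 2),
    ("GeForce 4 Ti 4600", 2002, 2),
    ("GeForce FX 5800 Ultra", 2002, 11),
    ("GeForce 6800 Ultra", 2004, 4),
    ("GeForce 7800 GTX", 2005, 6),
    ("GeForce 8800 GTX", 2006, 11),
    ("GeForce 9800 GX2", 2008, 3),  # assuming a March 2008 launch
]

def month_gaps(items):
    """Whole-month gap between each consecutive pair of launches."""
    gaps = []
    for (_, y0, m0), (name, y1, m1) in zip(items, items[1:]):
        gaps.append((name, (y1 - y0) * 12 + (m1 - m0)))
    return gaps

for name, gap in month_gaps(launches):
    print(f"{name}: {gap} months after its predecessor")
```

With those dates it reproduces the 10/12/9/17/14/17/16-month figures above.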

OK, so the timeframe itself is actually pretty representative of past transitions. The problem is that past transitions were always to a significantly enhanced architecture, while this time we see virtually no change at all.

In addition, most of those past transitions resulted in near-100% performance increases (or even more in some cases). This time we get about 50%, and it's achieved by a dual-GPU solution! If we only consider single-GPU solutions (which is fair, given the problems that can arise from SLI), then we are getting a mere 30% performance improvement for the same wait.

Note that I didn't include any refresh cards within each generation in the above comparison. It's all start to start. In fact, had I done so, the picture would look even worse, given that we are looking at a 10-month gap between the 8800 Ultra and the 9800 GTX and receiving something along the lines of a 10-15% performance boost. I don't recall there ever being such a small gap, with the possible exception of GF2 Ultra --> GF3. But at least that came with major feature enhancements.

No, I'm sorry, but even considering NV's past performance, this is pathetic.
 
I think you mean (at least I would say, using your format):

How much faster than the 6800 Ultra was the 7800 GTX?
(Since the 7800 GTX was the refresh of the 6800.)

Well, technically he's right. NV40 is a different generation than G70, so the 7900 GTX is to be considered the refresh of the 7800 GTX.

The problem's not that, imho.
The problem is the timeframe in which these refreshes take place. Not only did nVIDIA have more than a year to provide us with a "serious" refresh rather than the ridiculous 8800 Ultra, but now we have to bear with a GX2 solution for an upgrade. The problem is not the release of the new generation; it is that it doesn't provide us with a serious refresh in the high end.
So why should I not go with another 8800GTX instead of a GX2? Will the 65nm process and the G92 tweaks be enough to make this card a more attractive solution than purchasing another GTX for SLI? And the card should have been called the 8900GX2, and if we see a single solution, the 8900GTX.

Does anybody know for sure if we are to see a single 9800GTX?
 
Does anybody know for sure if we are to see a single 9800GTX?


HardOCP stated February/March for the 9800GTX and March/April for the 9800GT. The GF9800GTX should support 3-way SLI.

I've heard March for the release of D9E as "the follow-up to 8800GTX/Ultra". So if D8E = 9800GX2 and D9E = 9800GTX/GT, then that info seems about right.
 
So if D8E = 9800GX2 and D9E = 9800GTX/GT then that info seems about right.

Why should 9800GX2 be D8E? :???:

Imo:
D9E-GX2 (aka D9E-40) is the 9800GX2.
D9E-GTX (aka D9E-20) is the 9800GTX, probably just a highly clocked G92 (A3 rev?) with 0.83ns GDDR3.
 
I disagree. We are talking about the performance jump between the first of each new GPU generation.

In that case the comparisons are:

GeForce 2 GTS --> GeForce 3 : 10 Months
GeForce 3 --> GeForce 4 Ti4600 : 12 Months
GeForce 4 Ti4600 --> GeForce 5800 Ultra : 9 Months
GeForce 5800 Ultra --> GeForce 6800 Ultra : 17 Months
GeForce 6800 Ultra --> GeForce 7800GTX : 14 Months
GeForce 7800 GTX --> GeForce 8800 GTX : 17 Months
GeForce 8800 GTX --> GeForce 9800 GX2 : 16 Months (assuming March launch)

Don't be fooled by some of the names.
Geforce 2 GTS wasn't much more than a die shrink of the original Geforce 256.
Geforce 4 Ti was basically a shrunk Geforce 3 with an extra vertex shader unit and some more pixel/vertex shader programming capabilities (still under the same SM 1.x/DX8.x umbrella as the GF3).
Geforce 6800, 7800 and 7900 had a lot more in common than most people think, so the 7800 wasn't exactly "new" either.

So, that timetable isn't entirely accurate.
In fact, Nvidia's timetable shares a lot with Intel's:

- Generation debut.
- Mid-life generation update (after roughly 12 to 18 months), usually accompanied by a die shrink.
- New generation (24 to 28 months after the preceding generation's first top-of-the-line part; depends on the quality of TSMC's and Nvidia's design/fab-process execution at any given time).

This "9800 GX2" falls right in line with the 6800/7800/7900 timeline, if not an even speedier one.
It's a "carrier" of the GF8 tech in the high-end until the holiday season.
A "true" new generation will likely appear 24 months after the 8800 GTX, so that means October/November of this year.
 
Don't be fooled by some of the names.
Geforce 2 GTS wasn't much more than a die shrink of the original Geforce 256.
Geforce 4 Ti was basically a shrunk Geforce 3 with an extra vertex shader unit and some more pixel/vertex shader programming capabilities (still under the same SM 1.x/DX8.x umbrella as the GF3).
Geforce 6800, 7800 and 7900 had a lot more in common than most people think, so the 7800 wasn't exactly "new" either.

So, that timetable isn't entirely accurate.
In fact, Nvidia's timetable shares a lot with Intel's:

- Generation debut.
- Mid-life generation update (after roughly 12 to 18 months), usually accompanied by a die shrink.
- New generation (24 to 28 months after the preceding generation's first top-of-the-line part; depends on the quality of TSMC's and Nvidia's design/fab-process execution at any given time).

This "9800 GX2" falls right in line with the 6800/7800/7900 timeline, if not an even speedier one.
It's a "carrier" of the GF8 tech in the high-end until the holiday season.
A "true" new generation will likely appear 24 months after the 8800 GTX, so that means October/November of this year.

Yeah, I realise that the 9800 is more of a "tock" than a "tick", but it's still poor compared to previous "tocks".

For a start, the previous "tocks" (GF2, GF4 Ti, GF7800) all brought reasonable architectural changes over the "tick". They weren't total redesigns, but things did change and improve architecturally to a noticeable degree. However, the 9800 really is very little more than a die-shrunk 8800 (assuming it's using the G92 core).

Also those "tocks" all brought significant performance increases over the "ticks" (GF SDR, GF3, GF6800). Certainly more than the 30% expected from the 9800GTX over the 8800GTX.

Architecturally and speed-wise, I would say the 9800GTX falls more in line with a refresh of the 8800GTX's "tick" (in the same way as the Ultra) rather than being the "tock" to that "tick". However, timeframe-wise and naming-wise, NV are clearly positioning it as a "tock".

In other words, the 9800GTX should have been launched 4-5 months ago and named the 8900GTX and what we are about to get as the 9800GTX should feature more architectural improvements and at least a 50% performance boost.
 
Well, technically he's right. NV40 is a different generation than G70, so the 7900 GTX is to be considered the refresh of the 7800 GTX.

The problem's not that, imho.
The problem is the timeframe in which these refreshes take place. Not only did nVIDIA have more than a year to provide us with a "serious" refresh rather than the ridiculous 8800 Ultra, but now we have to bear with a GX2 solution for an upgrade. The problem is not the release of the new generation; it is that it doesn't provide us with a serious refresh in the high end.
So why should I not go with another 8800GTX instead of a GX2? Will the 65nm process and the G92 tweaks be enough to make this card a more attractive solution than purchasing another GTX for SLI? And the card should have been called the 8900GX2, and if we see a single solution, the 8900GTX.

Does anybody know for sure if we are to see a single 9800GTX?

I somehow have the feeling that if you went out tomorrow and bought a second G80 (or sold the existing one and bought a GX2), you might regret that investment in a relatively short time. Exactly how long did the G71 GX2 and the 7900GTX last? It might sound naive, but that G9x/GX2 thingy sounds to me more like a gap-filler than anything else.

If it were possible for both IHVs to deliver significantly faster GPUs than currently available, I'm pretty sure they would.
 
I somehow have the feeling that if you went out tomorrow and bought a second G80 (or sold the existing one and bought a GX2), you might regret that investment in a relatively short time. Exactly how long did the G71 GX2 and the 7900GTX last? It might sound naive, but that G9x/GX2 thingy sounds to me more like a gap-filler than anything else.

If it were possible for both IHVs to deliver significantly faster GPUs than currently available, I'm pretty sure they would.

If it suited them. If it were possible to deliver significantly faster GPUs whilst spending just as much as they're spending by implementing these multi-chip solutions, and risking just as little, they probably would. If it would mean spending significantly more in order to shorten the R&D cycle and taking more risks with a more complex architecture, a newer process and whatever, I'm not sure they'd do it. These guys don't exist in order to innovate; their goal is to make money. Innovation is a means to that end, but it induces risks. If they can innovate less whilst making the same money or more, they will, even if the community cries itself to sleep because Crysis isn't playable with AA and AF at 1920x1200, IMHO.
 
I somehow have the feeling that if you went out tomorrow and bought a second G80 (or sold the existing one and bought a GX2), you might regret that investment in a relatively short time. Exactly how long did the G71 GX2 and the 7900GTX last? It might sound naive, but that G9x/GX2 thingy sounds to me more like a gap-filler than anything else.

Can't recall, m8. 7-8 months? It's still a decent amount of time for some ppl, if it's paired with a 30-40% increase. But if we are to see the 9800GTX in March and the new-gen high end in mid-summer, for example, then by all means it's not worth it.

If it were possible for both IHVs to deliver significantly faster GPUs than currently available, I'm pretty sure they would.
IMHO it's all a matter of perspective. Somehow I doubt (though I could be wrong here) that Nvidia couldn't have provided us with a "proper" refresh some time ago (e.g. by the time of the mainstream G92 release). I think the keyword here is competition, and in this mess I cannot solely blame Nvidia; it's a profitable company, after all.
 
If the only things we get this year are refreshes that still make Crysis unplayable, then both Nvidia and AMD deserve to have Intel rescue all of us and crush them.

Go Larrabee !!!
 