nvidia "D8E" High End solution, what can we expect in 2008?

I for one hope something more than a G92 refresh of the old king of the hill is very near...

Why not? They can make a bunch of money with a G92 55nm shrink, and it doesn't take much money or resources to do it.
They've got what a company needs for success: the game devs.
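Rough back-of-the-envelope on why a shrink is nearly free money per chip (the die area and the gross-die formula below are my assumptions for illustration, not published NVIDIA figures):

```python
import math

# Back-of-the-envelope: how many candidate dies fit on a 300mm wafer
# before and after an optical shrink. The 65nm die area is an assumed
# ballpark for G92, and the shrink is treated as ideal linear scaling.
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float) -> float:
    """Crude gross-die estimate with a standard edge-loss correction term."""
    d = WAFER_DIAMETER_MM
    return (math.pi * (d / 2) ** 2) / die_area_mm2 - (math.pi * d) / math.sqrt(2 * die_area_mm2)

die_65nm = 324.0                      # assumed G92 die area at 65nm, mm^2
die_55nm = die_65nm * (55 / 65) ** 2  # ideal shrink: area scales with the square

for label, area in [("65nm", die_65nm), ("55nm", die_55nm)]:
    print(f"{label}: {area:.0f} mm^2 -> ~{dies_per_wafer(area):.0f} gross dies/wafer")
# Roughly 40%+ more dies from the same wafer spend -- that's where
# the "bunch of money" comes from, with no new architecture work needed.
```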
 
Single-core DX11 GPU in late Q3 at 55nm? ;)

NV has such a huge lead that I think G92 55nm shrink-based cards can beat anything ATi can offer in the next 12 months.
The only reason NV would do more than this is if they fear Intel's Larrabee.

12 months? That's a bit much... Even R680 should be able to take on a 55nm G92, unless NV switches to GDDR4/5 to provide some much-needed bandwidth.
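For a sense of the bandwidth gap being talked about (the GDDR5 line is purely hypothetical, and the board clocks are typical retail figures I'm assuming):

```python
# Memory bandwidth = bus width (bits) / 8 * effective data rate (GT/s).
def bandwidth_gbps(bus_bits: int, effective_ghz: float) -> float:
    return bus_bits / 8 * effective_ghz

configs = {
    "G80 (8800GTX, 384-bit GDDR3 @ 1.8GHz eff.)":      (384, 1.8),
    "G92 (8800GTS 512, 256-bit GDDR3 @ 1.94GHz eff.)": (256, 1.94),
    "Hypothetical G92 + GDDR5 @ 3.6GHz eff.":          (256, 3.6),
}
for name, (bus, clk) in configs.items():
    print(f"{name}: {bandwidth_gbps(bus, clk):.1f} GB/s")
# G92's narrower 256-bit bus gives up roughly a quarter of G80's bandwidth;
# a faster memory type on the same bus would more than close that gap.
```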
 
It's been even more than 12 months since G80's introduction and AMD still hasn't beaten it.
So this 12-month lead is more or less accurate -- if we're talking about the high-end segment.
NV is all about secrecy right now.
Think about pre-G80 times -- most of us were sure that R600 would end up better than G80.
I have a feeling that we're in for something similar this year.
If not, then NV's an ass and wasted the lead they had with G80. I'm not saying that this is impossible (I for one still remember NV30...), but the chances of such a development are somewhat slim, don't you think?
 

We know what G92 is capable of, and it certainly isn't the leap over G80 that G80 was over G71, so the prior 12+ month lead doesn't apply here.
 

Yeah, but isn't the problem that R680 is basically only a die-shrunk version of R600 (times 2)? If R600 couldn't beat G80, then why would R680 beat G92 (times 2)?

I think our best hope at the moment is for ATi not to lag too far behind, G100 to be a fairly modest improvement, and R700 to be ATi's next R300 :D.
 

My point was that R680 is ATi's response to G80 (sad that it takes 2x R6xx GPUs to match 1x G80). 2x G92 will probably be faster than R680, but I doubt it'll be by a significant margin.
 

Ah, in that case I agree. Kind of, anyway. If the performance numbers that have been leaked so far are to be believed, the 9800GTX at 30% faster than the 8800GTX should fall roughly in line with R680, which has been mentioned as being 15% faster than the 8800 Ultra.

The interesting thing for me is that it sounds like the 9800GX2 won't be all that much faster than the 9800GTX anyway, and that's only when it works!
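Quick sanity check that those two rumors really do line up (the Ultra-over-GTX gap is my assumption; the 30% and 15% figures are the leaked numbers above):

```python
# Normalize both rumored cards to the same baseline (8800GTX = 1.0).
# The ~10% Ultra-over-GTX gap is an assumption from typical reviews;
# the 30% and 15% deltas are the leaked figures quoted in the thread.
gtx_8800 = 1.00
ultra_8800 = gtx_8800 * 1.10   # assumed Ultra advantage over the GTX

gtx_9800 = gtx_8800 * 1.30     # rumor: 9800GTX = 8800GTX + 30%
r680 = ultra_8800 * 1.15       # rumor: R680 = 8800 Ultra + 15%

print(f"9800GTX: {gtx_9800:.2f}x an 8800GTX")
print(f"R680:    {r680:.2f}x an 8800GTX")
# 1.30x vs ~1.27x -- close enough that "roughly in line" holds.
```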
 

What was said was that the 9800 GX2 is at least 30% faster than an 8800 Ultra.
That could imply it's based on a worst-case scenario, and it also means we don't know yet how much faster it can get once the driver is properly tuned for dual-GPU (or even quad-GPU) operation under DX10, for instance.
Personally, I'm more interested to know its price range and how much more expensive it will end up being than its direct competitor, now that we know that R680 should hover around the $450 mark.
 

How can it be "at least" 30% faster in games that SLI doesn't work in? Answer: it won't be. Therefore, the "at least 30% faster" is either patently false, or must be amended to read "at least 30% faster* *in 3D applications for which there is an SLI profile".
 

I take it to mean at least 30% faster *on average*, i.e. it may turn out to be even faster *on average*.

In games where SLI doesn't work well it will probably be slower, which is why I wouldn't get one over a 9800GTX.
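A toy illustration of why both readings can be true at once (the per-game numbers are made up, purely for the shape of the argument):

```python
# Made-up per-game speedups of a dual-GPU card over a single 8800 Ultra.
# Games with a working SLI profile scale well; one without a profile
# actually regresses -- yet the average still clears the +30% claim.
speedups = {
    "game A (SLI profile)": 1.60,
    "game B (SLI profile)": 1.45,
    "game C (SLI profile)": 1.50,
    "game D (no profile)":  0.95,   # single-GPU fallback, slightly slower
}

average = sum(speedups.values()) / len(speedups)
print(f"average speedup: {average:.2f}x")            # ~1.38x -> "30% faster on average"
print(f"worst case: {min(speedups.values()):.2f}x")  # <1.0x in the unprofiled game
```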
 
I don't follow 3DMark scores all that much, but isn't that score crap compared to the R680, which scores something like 18K at default?

A factory-OC'd 3870X2 from MSI scores 16184 on a Q6600 @ 3GHz. Sounds about right for the dual G92 board in comparison, considering the overclocks. IOW: those rumors of default 06 runs scoring 18k were pure B.S.
 

According to that link it's slower than a factory-O/C'd GTS 512! The CPU won't make much difference, being a 2.83GHz Yorkfield vs a 3GHz Kentsfield.

That's pretty crap if you ask me. I'm guessing either it's running at a higher resolution, or maybe 06 just isn't using both chips at the moment.

Or it's fake, of course!
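On the "CPU won't make much difference" point: if I remember Futuremark's whitepaper right, the 06 total weights graphics far more heavily than CPU (I'm quoting the formula from memory, and the subscores are made up, so treat the whole thing as an assumption):

```python
# 3DMark06 total as I recall it from Futuremark's whitepaper (an assumption):
# a weighted harmonic mean of the graphics score GS = 0.5*(SM2 + SM3)
# and the CPU score, weighted 1.7 : 0.3 in graphics' favor.
def score_06(sm2: float, sm3: float, cpu: float) -> float:
    gs = 0.5 * (sm2 + sm3)
    return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu) / 2)

# Made-up subscores: the same graphics card on the two CPUs in question.
sm2, sm3 = 6500.0, 6500.0
for cpu_name, cpu_score in [("3GHz Kentsfield", 4200.0), ("2.83GHz Yorkfield", 4000.0)]:
    print(f"{cpu_name}: {score_06(sm2, sm3, cpu_score):.0f}")
# The ~5% CPU subscore gap moves the total by only ~1%,
# so comparing the two systems' overall scores is still fair.
```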
 
Sorry, I was talking about the forum post with the 9800X2 scores being fake.

Gotcha. The dual G92 06 score still seems reasonable considering the O/Cs in the TT review, especially when you consider that ATi R6xx hardware has always been very strong in 3DMark06.
 