Nvidia "D8E" high-end solution: what can we expect in 2008?

G98 is low-end.
New 8800GTS will use G92 with 8 TCPs running.

The comma in-between "G98" and "8800 GTS 512" was a separator. He wasn't saying 8800 GTS 512 = G98.

I heard that the 9800 series (or whatever it's called) is coming out before the end of the year.

Source? I have yet to see solid evidence of this, and certainly not from a reliable source. All I hear about GF9 series are interpretation-based rumors from before the 8800 GT launch, all of which assumed that G92 = GF9800.
 
Source? I have yet to see solid evidence of this, and certainly not from a reliable source. All I hear about GF9 series are interpretation-based rumors from before the 8800 GT launch, all of which assumed that G92 = GF9800.
Why shouldn't G92 also be GF9800, when G86 is also GF9300? ;)

In January or early February, I bet NV will be able to unveil G92's full potential, maybe through better yields, against the HD3870 X2 with a single-GPU SKU.
 
Why shouldn't G92 also be GF9800, when G86 is also GF9300? ;)

In January or early February, I bet NV will be able to unveil G92's full potential, maybe through better yields, against the HD3870 X2 with a single-GPU SKU.

I'd expect more from the 9800 than what G92 can offer in a single GPU card, even in a "full" 8TCP configuration. Unless there's huge clock headroom left over what the 7TCP G92 (8800 GT) offers. Also, where do you get the extra bandwidth from? That 256-bit bus doesn't cut it at the high-end, even with the fastest GDDR4 currently available, so unless they're hiding a larger memory interface in that die, they'd need GDDR5 to ship in volume before they could realistically sell such a card at the high-end.
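To put rough numbers on the bandwidth argument, here's a back-of-the-envelope sketch. The GDDR4 data rate is an assumption for "fastest currently available", and the comparison card is the 8800 GTX; treat the exact figures as illustrative.

```python
# Peak memory bandwidth = (bus width in bytes) * (effective DDR data rate).
# Data rates below are assumptions, not quoted specs.

def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and effective data rate."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# Hypothetical high-end G92: 256-bit bus with 2.4 Gbps GDDR4
g92_256 = bandwidth_gb_s(256, 2400)
# 8800 GTX (G80): 384-bit bus with 900 MHz GDDR3 (1.8 Gbps effective)
g80_384 = bandwidth_gb_s(384, 1800)

print(f"256-bit @ 2.4 Gbps: {g92_256:.1f} GB/s")  # 76.8 GB/s
print(f"384-bit @ 1.8 Gbps: {g80_384:.1f} GB/s")  # 86.4 GB/s
```

Even with top-bin GDDR4, the 256-bit part comes in below a year-old GTX, which is the point about the bus not cutting it at the high end.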

The only high-end part that is feasible given these facts is a GX2-like configuration using mobile G92s. I doubt they'd name such a part GF 9800.
 
I still doubt that G92 is only 256-bit, since there are rumors about prototype cards with a G92 and 12 memory chips.

And there is absolutely no indication that NV will go back to Quad-SLI.
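If the prototype rumor is right, the bus width follows directly from the chip count, since GDDR3/GDDR4 devices each have a 32-bit interface:

```python
# Total bus width is simply 32 bits per memory chip (standard GDDR3/GDDR4
# device width). Chip counts here come from the rumor being discussed.

BITS_PER_CHIP = 32

def bus_width(num_chips: int) -> int:
    """External memory bus width implied by the number of DRAM devices."""
    return num_chips * BITS_PER_CHIP

print(bus_width(12))  # 12 chips -> 384-bit, wider than 256-bit
print(bus_width(8))   # 8 chips  -> 256-bit (the 8800 GT layout)
```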
 
That could explain the transistor count too.

AMD went from 512-bit to 256-bit and got a nice reduction, but that isn't the case here with Nvidia.

(NVIO was a separate chip for Nvidia and UVD was missing for AMD, so they cancel each other out)
 
That could explain the transistor count too.

AMD went from 512-bit to 256-bit and got a nice reduction, but that isn't the case here with Nvidia.

(NVIO was a separate chip for Nvidia and UVD was missing for AMD, so they cancel each other out)

I don't believe the situations are directly comparable. AMD had a 512-bit (external) MC interface with 1024-bit internal precision and a ring bus w/ accompanying ring stops to accommodate. This is far more complex than NV's 6x64-bit + crossbar approach.
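A simple way to see the complexity gap: on a ring bus, traffic hops between ring stops before reaching its destination, while a crossbar connects any client to any memory partition directly. The stop counts below are illustrative, not actual R600 or G80 figures:

```python
# Average shortest-path hop count between two distinct stops on a
# bidirectional ring, vs a crossbar's single hop. Illustrative only.

def avg_ring_hops(n_stops: int) -> float:
    """Mean hop count over all source/destination pairs on a ring."""
    total = sum(min(d, n_stops - d) for d in range(1, n_stops))
    return total / (n_stops - 1)

for stops in (4, 8):
    print(f"{stops}-stop ring: {avg_ring_hops(stops):.2f} avg hops (crossbar: 1)")
```

The extra stops and buffering are exactly the kind of overhead a crossbar avoids, which is why shrinking a ring-bus design yields different savings than shrinking a crossbar one.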
 
I don't believe the situations are directly comparable. AMD had a 512-bit (external) MC interface with 1024-bit internal precision and a ring bus w/ accompanying ring stops to accommodate. This is far more complex than NV's 6x64-bit + crossbar approach.

And I can bet my ass that NVIO is a lot, lot bigger than the space UVD takes.
 
I still doubt that G92 is only 256-bit, since there are rumors about prototype cards with a G92 and 12 memory chips.

And there is absolutely no indication that NV will go back to Quad-SLI.

Any sources for these rumours? I haven't heard about them yet...
The other thing is: if GF9800 (or GF8900 or D8E, whatever :) ) will be based on the G9x architecture, then how will it support DX10.1, when as we know G92 doesn't have DX10.1 support?
 
[QUOTE]
Source? I have yet to see solid evidence of this, and certainly not from a reliable source. All I hear about GF9 series are interpretation-based rumors from before the 8800 GT launch, all of which assumed that G92 = GF9800.[/QUOTE]

I don't have a source that you would know, and I'm not really inclined to say who it was or where it came from. He could be wrong, but I highly doubt it.
 
Any sources for these rumours? I haven't heard about them yet...
The other thing is: if GF9800 (or GF8900 or D8E, whatever :) ) will be based on the G9x architecture, then how will it support DX10.1, when as we know G92 doesn't have DX10.1 support?

Technically, neither do the HD38xx cards for now (until Microsoft releases their Service Pack 1 for Vista, which might not be out until the Spring). ;)
 
Technically, neither do the HD38xx cards for now (until Microsoft releases their Service Pack 1 for Vista, which might not be out until the Spring). ;)

DX10.1 is available through the latest DX SDK. If you want, it's there.

Of course it's all academic since there won't be a single game that uses DX10.1 for quite some time.
 
That could explain the transistor count too.

AMD went from 512-bit to 256-bit and got a nice reduction, but that isn't the case here with Nvidia.

(NVIO was a separate chip for Nvidia and UVD was missing for AMD, so they cancel each other out)
They don't.

ATI went from a 1024-bit ring bus to a 512-bit one and added UVD, probably disabling some redundant parts at the same time.

nVidia reduced the memory controller by a third, removed a third of the ROPs, included the HD video processor, and added texture fetch units and the IO controller.

A rev. 2 8800 GTS could have 24 ROPs, but that seems unlikely given the transistor count, which limits the SP count to what it was with G80. My guess is the GTS will end up with 128 SPs, 16 ROPs, 64 TMUs and the same 256-bit bus, with frequencies such as 650 MHz ROP (+8%), 1500 MHz SP (+14%) and 1050 MHz RAM (+16%).
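Sanity-checking the bandwidth of that guessed config against the old G80-based GTS (the old GTS clocks are from memory, so treat them as assumptions):

```python
# Compare peak bandwidth of the guessed rev. 2 GTS (256-bit, 1050 MHz GDDR3,
# i.e. 2.1 Gbps effective) vs the G80-based 8800 GTS (320-bit, 800 MHz,
# 1.6 Gbps effective). Old-GTS clocks are an assumption.

def bandwidth_gb_s(bus_bits: int, mem_mhz: float) -> float:
    """Peak bandwidth in GB/s; mem_mhz is the base clock, doubled for DDR."""
    return bus_bits / 8 * (2 * mem_mhz) * 1e6 / 1e9

new_gts = bandwidth_gb_s(256, 1050)  # guessed rev. 2 GTS
old_gts = bandwidth_gb_s(320, 800)   # G80-based 8800 GTS

print(f"guessed GTS: {new_gts:.1f} GB/s")  # 67.2 GB/s
print(f"old GTS:     {old_gts:.1f} GB/s")  # 64.0 GB/s
```

So the narrower bus roughly breaks even with the old GTS only because of the faster memory, which fits the "same 256-bit bus" guess for a mid-range refresh rather than a true high-end part.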
 