I heard that the 9800 series (or whatever it's called) is coming out before the end of the year.
I hope so .. the G80 has been here like forever.
What particular monitor is that on? Is such behaviour (stretching of content) standard across most/all CRTs? It almost sounds like a good thing.
G98 is low-end.
The new 8800 GTS will use G92 with all 8 TPCs enabled.
Why shouldn't G92 also be the GF9800, when G86 is also the GF9300?

Source? I have yet to see solid evidence of this, and certainly not from a reliable source. All I hear about the GF9 series are interpretation-based rumors from before the 8800 GT launch, all of which assumed that G92 = GF9800.
In January or early February, I bet NV will be able to unveil G92's full potential against the HD 3870 X2 with a single-GPU SKU, maybe thanks to better yields.
I still doubt that G92 is only 256-bit, since there are rumors about prototype cards with a G92 and 12 memory chips.
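A quick sanity check of that rumor's arithmetic (assuming each memory device exposes the standard 32-bit GDDR interface, which is my assumption, not something stated in the thread):

```python
# Each GDDR3/GDDR4 chip of this era typically has a 32-bit interface (assumption).
BITS_PER_CHIP = 32

def bus_width(num_chips, bits_per_chip=BITS_PER_CHIP):
    """Total memory bus width implied by a given chip count."""
    return num_chips * bits_per_chip

print(bus_width(8))   # 256 -> the shipping 8800 GT / GTS 512 configuration
print(bus_width(12))  # 384 -> what the rumored 12-chip prototype boards would imply
```

So 12 chips on a board only makes sense with a 384-bit bus (or with chips running in clamshell pairs, which would be unusual here).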
And there is absolutely no indication that NV will take the step back to Quad-SLI.
That could explain the transistor count too.
AMD went from 512-bit to 256-bit and got a nice reduction, but that isn't the case here with Nvidia.
(NVIO was a separate chip for Nvidia and UVD was missing for AMD, so they cancel each other out)
I don't believe the situations are directly comparable. AMD had a 512-bit (external) MC interface with 1024-bit internal precision and a ring bus with accompanying ring stops to accommodate. This is far more complex than NV's 6x64-bit + crossbar approach.
And I can bet my ass that NVIO is a lot, lot bigger than the space UVD takes up.
Any sources for these rumours? I haven't heard about them yet...
The other thing is: if the GF9800 (or GF8900 or D8E, whatever) is based on the G9x architecture, then how will it support DX10.1 when, as we know, G92 doesn't have DX10.1 support?
Technically, neither do the HD38xx cards for now (until Microsoft releases their Service Pack 1 for Vista, which might not be out until the Spring).
They don't.
If the full G92 does indeed have a 384-bit bus, then how does the frame buffer size work out?
The "full" G92 wouldn't be the GTS 512 then.
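For what it's worth, here is the frame-buffer arithmetic a 384-bit bus would force, assuming 32-bit chips (so 12 of them) and the common per-chip densities of this generation; the density values are my assumptions, not from the thread:

```python
# Chip count implied by a 384-bit bus with 32-bit devices.
CHIPS = 384 // 32  # 12 chips

# Common GDDR densities of the era: 256 Mbit, 512 Mbit, 1 Gbit parts.
for density_mb in (32, 64, 128):
    total_mb = CHIPS * density_mb
    print(f"{CHIPS} chips x {density_mb} MB = {total_mb} MB")
# 12 x 32 MB  = 384 MB
# 12 x 64 MB  = 768 MB  (the G80-based 8800 GTX configuration)
# 12 x 128 MB = 1536 MB
```

So a 384-bit G92 card would land on "odd" sizes like 384 MB or 768 MB rather than the 512 MB of the GTS 512, which is consistent with the point that the GTS 512 can't be the full 384-bit configuration.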