dan2097 said:
In a way the GFFX was the most impressive card Nvidia has ever released in terms of pushing forward performance; ATI just managed to do even better. Both the 9700s and FX 5800s were large upgrades from the GeForce 4 Ti series.
Was it? IIRC, the GF4 was also twice as fast as the GF3 when it came to AA performance, as was the 5800U compared to the GF4. (The 9700P was more than twice as fast as the 8500, but that's not taking into account MSAA vs. SSAA.)
"Had R300 not launched" is not a very interesting question. "Had R300 launched without DX9 or gamma-corrected, sparse-sampled AA" is a more interesting one. Would the FX 5800/U still have been (obnoxiously) loud, double-decker solutions? Would they have been clocked anywhere near where they were? How would performance have fared in that case?
The FX series wasn't that interesting (from my end-user perspective) because nV basically presented a 4x2 DX8 architecture for the third time, with not many improvements besides speed (raw clocks, double stencil). OTOH, this relative feature stability allowed nV to maintain its driver advantage, and it probably helped devs by not changing the featureset too greatly.
(Edit: I have to say that last paragraph was written with the R300 in mind. If nV had released the 5800/U to no competition, I'm sure I would've welcomed its huge speed boost over the GF4.)
Ailuros, are you just saying that I was being (unduly) flippant, or that DX9 features so far ahead of hardware that can actually run them quickly aren't important to devs? If the latter, Doom 3 was built with the GF1 featureset in mind, and apparently JC's next engine will be built with the FX's featureset in mind. If the former, well, it's easy to get carried away on a forum.
BTW, as to the R300 without DX9: aren't Deus Ex: IW and Thief 3 running on DX8 engines? IIRC, R300 trounced (still trounces?) NV30 in those two games.