Chalnoth said: What you are speaking of, mistakenly, is of course a result of nVidia's choice to retain integer units in the original NV3x lineup (NV30-34), in conjunction with Microsoft's refusal to support integer types.
Ah, yes. It's all MS's fault, of course. The fact that some IHV managed to implement a full FP part, get it to market first, and run it at great speed should indeed tell you something (namely, that Nvidia got badly burned by their old strategy of designing their next-gen part as a faster previous-gen part, plus some next-gen features thrown in for OEM checkboxes).
Since the NV30-34 suffer a huge performance hit when going all FP (they lose about 2/3rds of their available power), nVidia has been forced to use integer precision anyway under DX9.
Of course, Nvidia would never decide on their own to hack image quality in order to gain some speed, would they? In Chalnoth's wonderful NV-colored world, they are forced into it by bad, evil MS with its forward-looking standards, and by bad, nasty ATI, which releases parts that follow the spec while offering great speed and image quality. Just like BB is forced to lie, bully and spew BS all day long to feed his family.
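(An aside, for anyone who wants to see what this precision fight looks like in practice: below is a minimal sketch, assuming the DirectX 9 SDK and d3dx9.lib, that compiles the same ps_2_0 shader twice, once at full precision and once with D3DXSHADER_PARTIALPRECISION, which forces the _pp hint on every instruction. The _pp hint is the only concession DX9 makes to lower precision; it lets NV3x drop from FP32 to FP16, but there is no integer path at all, which is the root of the complaint above.)

// Minimal sketch (DirectX 9 SDK, link against d3dx9.lib): the same ps_2_0
// shader compiled at full and at partial precision. On NV3x the _pp hint
// allows FP16 registers instead of FP32; DX9 offers no integer precision.
#include <d3dx9.h>
#include <stdio.h>

static const char kShader[] =
    "sampler s0 : register(s0);                  \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR  \n"
    "{                                           \n"
    "    return tex2D(s0, uv) * 0.5f;            \n"
    "}                                           \n";

static void Compile(DWORD flags, const char* label)
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(kShader, sizeof(kShader) - 1,
                                   NULL, NULL, "main", "ps_2_0",
                                   flags, &code, &errors, NULL);
    printf("%s: %s\n", label, SUCCEEDED(hr) ? "compiled" : "failed");
    if (code)   code->Release();
    if (errors) errors->Release();
}

int main()
{
    Compile(0, "full precision (FP32 on NV3x)");
    // Forces _pp on every instruction; the driver may then use FP16.
    Compile(D3DXSHADER_PARTIALPRECISION, "partial precision (FP16 hint)");
    return 0;
}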
It would have been vastly better for everybody involved if Microsoft had just supported integer types.
"Everybody", in Chalnoth's NV-colored vision, means "Nvidia and the mindless Nvidia fanb0ys who bought what they knew was an inferior product and went into denial afterward".
5900 Ultra.
I'm not sure if your Nvidia driver is replacing every mention of it, but he said "0095 Ultra" (reversing the order of the digits so your driver won't catch it; hopefully you won't mind the dip in 2D rendering performance).
Integrated chipsets.
Nvidia's excellent sales in the high-end market indeed prove that Intel is Nvidia's main competitor... It looks like NV has pretty much dropped out of the high end and no longer considers itself in competition with ATI.
False. ATI had DX9 hardware first, so developers started developing DX9 titles on ATI hardware. ATI got a head start.
Denial at its finest...
ATI doesn't do any trilinear on any texture stage but the first (when aniso is selected from the control panel).
And it does full trilinear when you specify application-controlled filtering and the app has a setting for trilinear. How can you get full trilinear filtering with the Cheatonators? You can't.
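To spell out what "application-controlled" means at the API level, here's a minimal DX9 sketch (assuming an already-created IDirect3DDevice9*; the eight-stage loop bound is just illustrative). This is what an app with a trilinear setting requests on every stage:

#include <d3d9.h>

// Minimal sketch: ask for trilinear filtering on every texture stage.
// Trilinear = linear min/mag filtering plus linear interpolation between
// mip levels (D3DSAMP_MIPFILTER = D3DTEXF_LINEAR); a point mip filter
// here would give plain bilinear instead.
void RequestTrilinearEverywhere(IDirect3DDevice9* dev)
{
    for (DWORD stage = 0; stage < 8; ++stage) {
        dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
}

With application-controlled filtering, a request like this is honored on all stages; with the control-panel aniso override, only the first stage gets the full treatment, which is exactly the distinction being made here.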
Specifically: what has S3 sacrificed to get that performance?
Well, they have many options, ranging from butchering filtering to replacing shaders with hand-tuned versions (and calling it a generic compiler) to static clip planes. I trust your answer was ironic, or was it a case of the pot calling the kettle black at its finest?
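(And since "replacing shaders with hand-tuned versions" keeps coming up: purely as an illustration of the alleged technique, here's a hypothetical sketch. Every name in it is made up; no real driver code or API is being quoted.)

#include <cstdint>
#include <cstddef>
#include <unordered_map>
#include <vector>

// Hypothetical sketch of benchmark shader replacement: hash the incoming
// bytecode and, on a match with a known shader, hand back a pre-tuned
// (cheaper, possibly lower-quality) version instead of running the
// application's own code. A genuinely generic compiler would not need a
// lookup table of specific shaders.

// FNV-1a hash over the shader token stream.
static uint64_t HashBytecode(const uint32_t* tokens, size_t count)
{
    uint64_t h = 1469598103934665603ULL;
    for (size_t i = 0; i < count; ++i) {
        h ^= tokens[i];
        h *= 1099511628211ULL;
    }
    return h;
}

// Hash of a known application shader -> hand-tuned replacement bytecode.
static std::unordered_map<uint64_t, std::vector<uint32_t>> g_replacements;

// Hypothetical hook at shader-creation time inside a driver.
const uint32_t* MaybeReplaceShader(const uint32_t* tokens, size_t count)
{
    auto it = g_replacements.find(HashBytecode(tokens, count));
    // On a hash hit, substitute the tuned shader: detection, not compilation.
    return (it != g_replacements.end()) ? it->second.data() : tokens;
}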
Regardless, the 5600 line is being replaced by the much better 5700 line.
Which is a damn fine consoling thought for all the people who bought Nvidia's lies and a 5600...