Testiculus Giganticus said:
Its architecture is a bit different from the NV3x, so it doesn't like some things that its little brother loved
Let's get the facts straight:
fact: BOTH the 6800U and the 9800 show a significant performance increase when running the NV3x codepath compared to the normal DX9 path.
That means that part of the NV3x codepath is NOT specific to the NV3x, but improves performance on all video cards. Which part? Well, look here:
fact: The NV3x codepath uses fewer PS2.0 shaders and more PS1.x shaders.
fact: BOTH the 6800U and the 9800 produce the same artefacts when running the NV3x path, and both look fine on the normal codepath.
So the shared performance increase comes from running PS1.x shaders, which give a much worse image quality.
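To make that concrete, here's a minimal sketch (my own, not Crytek's code) of the mechanism: the same HLSL effect compiled for ps_2_0 and for ps_1_4 with D3DXCompileShader from the DX9 SDK. The shader source and names are made up for illustration, and I'm assuming a 2004-era D3DX that still accepts ps_1_x targets. The point is just that the ps_1_4 output does less work at lower precision, so it's faster on any card and worse-looking on any card.

```cpp
#include <d3dx9.h>   // DirectX 9 SDK; link with d3dx9.lib
#include <stdio.h>

// Kept deliberately trivial so it compiles for both targets. A real PS2.0
// lighting shader would have to be cut down or approximated to fit within
// PS1.x instruction and precision limits -- which is exactly where the
// image quality goes.
static const char g_shader[] =
    "sampler base : register(s0);\n"
    "float4 tint  : register(c0);\n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{ return tex2D(base, uv) * tint; }\n";

static void CompileFor(const char* profile)
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(g_shader, sizeof(g_shader) - 1,
                                   NULL, NULL, "main", profile, 0,
                                   &code, &errors, NULL);
    if (SUCCEEDED(hr)) {
        // ps_1_x code runs at low fixed-point precision on DX8-class
        // hardware; ps_2_0 guarantees at least 24-bit float precision.
        printf("%s: %lu bytes of compiled code\n",
               profile, (unsigned long)code->GetBufferSize());
    } else if (errors) {
        printf("%s failed: %s\n", profile,
               (const char*)errors->GetBufferPointer());
    }
    if (code)   code->Release();
    if (errors) errors->Release();
}

int main()
{
    CompileFor("ps_2_0");  // full-quality path
    CompileFor("ps_1_4");  // simpler, lower-precision fallback
    return 0;
}
```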
Now, it might be that the NV3x codepath also includes partial-precision (pp) hints (illustrated below). It might be that those only help the NV3x. And it might be that the NV3x gains much more from switching from PS2.0 to PS1.x than other cards do.
But the fact remains:
Part of the performance increase of the NV3x codepath comes from simply running at lower image quality, produced by simpler PS1.x shaders that use less computing power.
And that will improve performance on ANY video card, as demonstrated by the performance increase on both the 6800U and the 9800.
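As for those pp hints, here's what they actually look like; this is a sketch assuming the stock DX9 SDK tools, with made-up shader source. Declaring HLSL variables as half instead of float makes the compiler tag the resulting ps_2_0 instructions with the _pp (partial precision) modifier. The NV3x can then execute them at fp16 instead of fp32, which is why the hint helps it so much; R3xx hardware always computes at fp24 and ignores the hint, which is why this part of the path really is NV3x-specific.

```cpp
#include <d3dx9.h>   // DirectX 9 SDK; link with d3dx9.lib
#include <stdio.h>

// Using 'half' asks for partial precision; 'float' would demand full
// precision for the same math.
static const char g_halfShader[] =
    "sampler base : register(s0);\n"
    "half4 main(float2 uv : TEXCOORD0) : COLOR\n"
    "{ half4 c = tex2D(base, uv); return c * c; }\n";

int main()
{
    LPD3DXBUFFER code = NULL, asmText = NULL;
    if (SUCCEEDED(D3DXCompileShader(g_halfShader, sizeof(g_halfShader) - 1,
                                    NULL, NULL, "main", "ps_2_0", 0,
                                    &code, NULL, NULL))) {
        // The disassembly shows the hint, e.g. "mul_pp r0, r0, r0".
        // Strip the _pp suffix and the NV3x must run the op at full fp32,
        // at a large speed cost; an R3xx runs it identically either way.
        if (SUCCEEDED(D3DXDisassembleShader(
                (const DWORD*)code->GetBufferPointer(), FALSE, NULL,
                &asmText))) {
            printf("%s\n", (const char*)asmText->GetBufferPointer());
            asmText->Release();
        }
        code->Release();
    }
    return 0;
}
```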
Testiculus Giganticus said:
Finally, though it has the muscle for full quality, it is forced to have those nice "optimizations" that were necessary before
Do you honestly believe it's an accident that the NV40 is running the NV3x codepath? It's bound to have a different device ID than any NV3x chip... So that would mean Far Cry defaults to the NV3x codepath for unknown cards? :?
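For reference, this is how detection normally works; the sketch below is not Far Cry's actual code, and the NV3x device IDs in it are examples, not an authoritative list. D3D9 hands the engine the card's PCI vendor and device ID, and a path keyed to known NV3x IDs simply would not match the NV40's new ID, so an unrecognized card would fall through to the default DX9 path:

```cpp
#include <d3d9.h>
#include <stdio.h>
#pragma comment(lib, "d3d9.lib")

static bool IsKnownNV3x(const D3DADAPTER_IDENTIFIER9& id)
{
    if (id.VendorId != 0x10DE)      // 0x10DE = NVIDIA's PCI vendor ID
        return false;
    // Hypothetical whitelist of NV3x device IDs; a new chip like the NV40
    // carries an ID that is not on any such list.
    const WORD nv3x[] = { 0x0301 /* FX 5800 Ultra */, 0x0302 /* FX 5800 */ };
    for (WORD d : nv3x)
        if (id.DeviceId == d) return true;
    return false;
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id;
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        printf("%s (vendor %04lX, device %04lX)\n",
               id.Description, id.VendorId, id.DeviceId);
        printf("NV3x path: %s\n", IsKnownNV3x(id) ? "yes" : "no");
        // An unrecognized ID would normally fall through to the default
        // DX9 path -- which is the whole point of the argument above.
    }
    d3d->Release();
    return 0;
}
```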
Come on, it's so obvious what's going on. When the 1.1 patch was released, the NV40 was ready. Developers were fully informed about it and had probably been running samples already. The patch was even claimed to already include PS3.0 paths for the NV40.
So much for not being aware of the NV40...
It is inconceivable that the NV40 is running the NV3x path by accident. It is running that path because NVidia and/or Crytek WANT it to run that way.
Why? Because Crytek thinks it's not fast enough to render at full image quality? No way! It runs the normal DX9 path slightly faster than the Radeon 9800XT does.
There's only one option left: because NVidia didn't like the very small performance lead over the 9800XT, they pressed Crytek to have the NV40 run a simpler codepath, exchanging image quality for performance.
Does that make NVidia the devil incarnate? Well... you decide!