xGL said:....
nVidia did say, after all:
Since we know that obtaining the best pixel shader performance from the GeForce FX GPUs currently requires some specialized work, our developer technology team works very closely with game developers. Part of this is understanding that in many cases promoting PS 1.4 (DirectX 8) to PS 2.0 (DirectX 9) provides no image quality benefit. Sometimes this involves converting 32-bit floating point precision shader operations into 16-bit floating point precision shaders in order to obtain the performance benefit of this mode with no image quality degradation. Our goal is to provide our consumers the best experience possible, and that means games must both look and run great.
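To put some rough numbers on the 32-bit vs 16-bit part, since they make it sound harmless: FP32 keeps a 23-bit mantissa, FP16 only 10 bits, so long math chains and high specular powers are where the gap tends to show first. A quick numpy sketch of the idea (the Blinn-style specular expression here is a made-up illustration, not code from any actual game shader):

[code]
import numpy as np

# Toy Blinn-style specular term: high exponents amplify small rounding
# differences, which is where reduced precision tends to show up first.
# (Made-up illustration, not code from any actual game shader.)
n_dot_h = 0.998
power   = 64.0

full    = np.float32(n_dot_h) ** np.float32(power)   # FP32: 23-bit mantissa
partial = np.float16(n_dot_h) ** np.float16(power)   # FP16: 10-bit mantissa

print(f"FP32 result: {float(full):.6f}")
print(f"FP16 result: {float(partial):.6f}")
print(f"difference:  {abs(float(full) - float(partial)):.6f}")
[/code]

Whether a difference on that order is visible depends entirely on what the shader does with the value afterwards, which is exactly why "no image quality degradation" is a per-shader claim, not a blanket one.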
Yay, let's go back to DX8 specs instead of DX9!
If DX8 shaders are just as good, why should we buy their super DX9 cards when DX8 cards would do the job just fine?
Also, I've heard of nVidia dropping all of its precision down to 12-bit in these new drivers. Only time will tell, but you can be sure sites like Beyond3D or ExtremeTech will tell us more about this.
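If that 12-bit figure means an FX12-style fixed-point format like the NV3x register-combiner path uses (my assumption, nobody has confirmed what format is meant), it would be roughly a [-2, 2) range in uniform steps of 1/1024. A quick sketch of what values would round to (the quantize_fx12 helper is hypothetical, just to show the step size):

[code]
import numpy as np

# Assumption: "12-bit" means an FX12-style fixed-point format, i.e. 12 bits
# spanning roughly [-2, 2), giving a uniform step of 1/1024. That's a guess
# at the format, not anything nVidia has spelled out.
FX12_STEP = 1.0 / 1024.0

def quantize_fx12(x):
    # Hypothetical helper: round to the nearest FX12 step and clamp to range.
    q = round(x / FX12_STEP) * FX12_STEP
    return max(-2.0, min(2.0 - FX12_STEP, q))

for v in (0.1, 1.0 / 3.0, 0.998):
    print(f"value {v:.6f} -> fp16 {float(np.float16(v)):.6f}, "
          f"fx12 {quantize_fx12(v):.6f}")
[/code]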
You know, nVidia's amazing--every time they do an "optimization" which decreases IQ they are quick to point out that the reason they're doing it is because it doesn't decrease IQ.... (Like, who needs all the visible textures in UT2K3 trilineared, anyway, since bilinear filtering results in no IQ loss?) Why don't they just come out and admit that they hadn't planned on supporting DX9 in hardware with nV3x all along, and that what motivates them to "optimize" for UT2K3 are benchmark scores instead of IQ? It would certainly make life simpler for them, for sure.
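For anyone who hasn't followed the UT2K3 filtering argument: trilinear is just a blend between filtered samples from the two nearest mip levels, so forcing bilinear on most texture stages throws that blend away and brings back visible mip-level transitions. A rough 1D sketch of the idea (toy code, obviously nothing to do with the actual driver):

[code]
import numpy as np

def bilinear_1d(mip, u):
    # Linear filtering within a single (1D) mip level.
    x = u * (len(mip) - 1)
    i = int(np.floor(x))
    frac = x - i
    i1 = min(i + 1, len(mip) - 1)
    return (1.0 - frac) * mip[i] + frac * mip[i1]

def trilinear_1d(mips, u, lod):
    # Blend between the two nearest mip levels; bilinear simply drops this blend.
    lo = int(np.floor(lod))
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - lo
    return (1.0 - frac) * bilinear_1d(mips[lo], u) + frac * bilinear_1d(mips[hi], u)

# Two mip levels of a toy 1D texture (level 1 is a blurrier, half-size copy).
mips = [np.array([0.0, 1.0, 0.0, 1.0, 0.0]),
        np.array([0.5, 0.5, 0.5])]

print("trilinear at lod 0.5:", trilinear_1d(mips, 0.3, 0.5))
print("bilinear  at lod 0.5 (snaps to level 0):", bilinear_1d(mips[0], 0.3))
[/code]

The two results differ exactly where the mip blend matters, which is why the claim that bilinear-only filtering costs nothing in image quality doesn't hold up.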