Irrespective of whether or not supporting an integer calculation format was a good idea, Microsoft's refusal to support any integer shading format makes things much harder on the FX line (below the NV35). Without API support, and since the NV30-34 need to calculate many things at integer precision for decent performance, programmers will be unable to make the NV30-34 look good and perform well in all cases.
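For reference, the NV30 is usually described as having three arithmetic precisions, a 12-bit fixed-point path (FX12), FP16, and FP32, while PS 2.0 only exposes floating point plus a partial-precision hint. A minimal sketch of how those precisions compare, assuming the commonly reported FX12 layout of roughly [-2, 2) in 1/1024 steps (the exact format was never published):

```python
# Sketch of the three arithmetic precisions attributed to the NV30.
# FX12's exact layout isn't public; this assumes the commonly reported
# form: 12-bit fixed point, ~[-2, 2) range, steps of 1/1024.
import numpy as np

FX12_STEP = 1.0 / 1024.0

def fx12(x):
    """Quantize to the assumed FX12 grid, clamping to its range."""
    q = round(x / FX12_STEP) * FX12_STEP
    return max(-2.0, min(2.0 - FX12_STEP, q))

def fp16(x):
    """Round-trip through IEEE half precision as a stand-in for FP16 registers."""
    return float(np.float16(x))

x = 0.123456789
print(f"FP32: {float(np.float32(x)):.9f}")  # keeps ~7 significant digits
print(f"FP16: {fp16(x):.9f}")               # keeps ~3-4 significant digits
print(f"FX12: {fx12(x):.9f}")               # snaps to the nearest 1/1024
print(f"FX12 can't hold 3.0: {fx12(3.0)}")  # clamps at the top of its range
```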
Heathen said:
Microsoft's spec is holding it back.

Interesting placement of blame. Especially considering all the hype Nvidia were pumping out. "A dawn of cinematic computing." Not sure whether to laugh or cry.

It was Nvidia's choice to put such an unbalanced design together.
If there were API support for integer types, programmers could reserve FP for the places it is actually needed. Since nVidia is essentially forced to break with the spec and substitute integer calculations anyway, the driver has to guess where that substitution is safe, and many of those guesses will be wrong, causing quality loss.
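To unpack what guessing wrong means here: with no integer type in the shader itself, the driver has to decide, per operation, which calculations can be demoted to fixed point without changing the output. A toy sketch of one way such a guess could work, using interval analysis over a made-up shader op (nVidia's actual heuristics are not public):

```python
# Toy version of the guess a driver would have to make: an operation can be
# demoted to fixed point only if its inputs and result provably fit FX12's
# assumed range. The shader "IR" here is invented for illustration.

FX12_LO, FX12_HI = -2.0, 2.0   # assumed FX12 range

def mul_range(a, b):
    """Interval of a*b given operand intervals a and b."""
    corners = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return min(corners), max(corners)

def add_range(a, b):
    return a[0] + b[0], a[1] + b[1]

def safe_for_fixed_point(lo, hi):
    return FX12_LO <= lo and hi < FX12_HI

# Modulating two texture colours: both operands are known to sit in [0, 1],
# so the result fits easily. Demote to fixed point, run fast.
print(safe_for_fixed_point(*mul_range((0.0, 1.0), (0.0, 1.0))))      # True

# Adding a texture colour to an interpolated world-space distance: the range
# is effectively unbounded as far as the driver knows. Must stay FP.
print(safe_for_fixed_point(*add_range((0.0, 1000.0), (0.0, 1.0))))   # False
```

The quality loss shows up when the guess errs the other way: assuming a narrow value range that the shader's real data exceeds.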
I say it's one hell of a lot easier for Microsoft to add integer types to the spec than it is for nVidia to change the hardware.
Depending on the math, one can use integer formats with no noticeable quality loss. Remember that the display output will always be 8-bit integer anyway, while nVidia's integer calculation format is 12-bit. I will admit it is more than possible that nVidia miscalculated how many operations should use the integer format, but it is absolutely silly to think that all calculations need floating point. Why do you think we still have integer units on CPUs?
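That claim is easy to test numerically. In the sketch below (same assumed FX12 grid as earlier: 1/1024 steps clamped to roughly [-2, 2)), a single colour modulate lands on exactly the same 8-bit framebuffer byte in fixed point as in full float, while a sub-step difference that later gets amplified does not:

```python
# Does 12-bit fixed point survive the trip to an 8-bit framebuffer?
FX12_STEP = 1.0 / 1024.0

def fx12(x):
    q = round(x / FX12_STEP) * FX12_STEP
    return max(-2.0, min(2.0 - FX12_STEP, q))

def to_byte(x):
    """Final framebuffer write: clamp to [0, 1] and quantize to 8 bits."""
    return round(max(0.0, min(1.0, x)) * 255)

# A plain diffuse * lightmap modulate: FX12's 1/1024 step is finer than the
# framebuffer's 1/255 step, so both paths land on the same byte.
d, l = 0.8157, 0.5412
print(to_byte(d * l), to_byte(fx12(fx12(d) * fx12(l))))      # 113 113

# But a sub-step difference that gets amplified later is another story:
# (a - b) is smaller than 1/1024, so fixed point rounds it to zero before
# a later x16 scale can make it visible.
a, b = 0.5003, 0.5000
exact = (a - b) * 16.0
quant = fx12(fx12(a) - fx12(b))
for _ in range(4):                     # x16 as four in-range doublings
    quant = fx12(quant * 2.0)
print(to_byte(exact), to_byte(quant))  # 1 0, a visible difference
```

The general pattern: short chains of well-ranged colour operations are safe at 12 bits, while long chains, cancellation, or out-of-range intermediates are where floating point earns its keep.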