3DMark03 obviously caused a lot of contention in some quarters due to the performance of the PS2.0 tests. Part of the reason for this was that you chose to use full precision float shaders, and NVIDIA's hardware wasn't optimal under these conditions, despite full precision being the DirectX default. Would you like to expand on the reasons for this choice? Will 3DMark04 be utilising a mix of full and partial precision shaders in the game tests, dependent on whether the quality allows it? If so, will you be offering options to test partial, mixed and full precision modes so that performance and quality can be compared?
Patric: Full precision is indeed the DirectX default, and in the very few places where we used float shaders in 3DMark03, full precision was needed to maintain image quality. Then again, most materials in ’03 used 1.x shaders, which are all fixed point. Now all shaders are SM 2.0 or 3.0, and many materials look exactly the same in full and partial precision. We therefore use a precision mix as the default: materials that render identically in half precision are allowed to use it if the hardware supports it, while materials that would lose image quality in half precision are forced to full precision. We plan on adding a switch to force full precision in all shaders for comparison, but that’s only an optional setting.
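For readers unfamiliar with how such a precision mix is expressed, the sketch below is a minimal, hypothetical HLSL pixel shader (not Futuremark's actual material code) showing the mechanism Patric describes: declaring intermediates as half hints the ps_2_0 compiler to emit partial-precision (_pp) instructions, which hardware without partial-precision support simply executes at full precision, while declaring them as float keeps full precision throughout.

```hlsl
// Hypothetical material shader for illustration only.
// 'half' marks values where reduced (16-bit) precision is acceptable;
// the HLSL compiler emits _pp modifiers for these in ps_2_0, and
// hardware lacking partial precision runs them at full precision instead.

sampler2D diffuseMap : register(s0);

float4 ps_main(float2 uv       : TEXCOORD0,
               float3 lightVec : TEXCOORD1) : COLOR
{
    // Colour math on this material looks identical at half precision,
    // so partial precision is allowed here.
    half4 albedo = tex2D(diffuseMap, uv);
    half3 L      = normalize((half3)lightVec);
    half  nDotL  = saturate(dot(half3(0.0h, 0.0h, 1.0h), L));

    // A material that visibly degrades at half precision would instead
    // keep its intermediates as 'float', forcing full precision.
    return albedo * nDotL;
}
```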