I've noticed an increased use of Mandelbrot demos to point out the rendering precision differences between the various cards on the basis of visual quality. Specifically, the NV30/31/34 have been accused of running at FX12 in such tests, again on the basis of visual differences between those cards and the Radeon 9500/9700s.
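(For anyone wondering why Mandelbrot renderers make such handy precision tests: the iteration z = z^2 + c runs dozens of times per pixel, so any rounding error in the shader's number format gets amplified on every pass. Here's a quick Python sketch of my own, not anything from Humus' actual shader, that quantises every intermediate to a given number of fractional bits to mimic a fixed-point pipeline:

# Sketch only: quantize() mimics a fixed-point format with
# frac_bits fractional bits (FX12 keeps roughly 10), and the
# iteration z = z^2 + c re-amplifies that rounding every pass.

def quantize(x, frac_bits):
    """Round x to the nearest multiple of 2^-frac_bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

def mandel_iters(cr, ci, frac_bits, max_iters=64):
    """Iteration count before escape, with every intermediate quantized."""
    zr, zi = 0.0, 0.0
    for i in range(max_iters):
        zr, zi = (quantize(zr * zr - zi * zi + cr, frac_bits),
                  quantize(2.0 * zr * zi + ci, frac_bits))
        if zr * zr + zi * zi > 4.0:
            return i
    return max_iters

# Same point near the set boundary at three precisions; the escape
# counts (and hence the rendered colour) will often disagree.
c = (-0.7453, 0.1127)
for bits in (10, 16, 23):   # ~FX12 fraction, ~FP24 mantissa, ~FP32 mantissa
    print(bits, "bits ->", mandel_iters(c[0], c[1], bits))

At a deep enough zoom, neighbouring pixels can even quantise to the same representable value of c, which is exactly the sort of blockiness visible in the screenshots below.)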
For example, in this review:
http://www.hardware.fr/articles/468/page4.html
we see two such images taken from Humus' MSR demo for D3D. The 5600 produces nasty, blocky-looking images, unlike the 9500, which is all very svelte and pretty. Ergo, the reviewer concludes that the 5600 is using FX12 for PS2.0 shaders; all very evil and underhanded.
However, somebody over at the FM forums has run the OpenGL version, which presumably runs the 5600 at FP32 unless the ARB_precision_hint_fastest option is specified, and came up with these images:
5600
9700
Now, at face value, you'd probably think, "Well, so what? FX12 is worse than FP24; FP24 is worse than FP32." What I'm actually wondering, though, is whether such visual differences are sufficient evidence for the claim that FX12 is being used; after all, the differences between FP24 and FP32 are similar to those between the claimed FX12 and FP24.
Given that pcchen's precision tester hasn't quite cleared up this malarkey with the default PS2.0 shader precision in the NV30/31 (i.e. it measures 10 bits, but is that the 10 fractional bits of FX12 or the 10-bit mantissa of FP16?) and the inconclusive (to my mind at least) MSR results, are we ever likely to get to the bottom of this issue, short of coshing an NVIDIA employee over the head at the forthcoming GDC/ECTS in London and blackmailing Kirk into telling us the truth, lest we send bits of the employee back via airmail?
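For what it's worth, the arithmetic behind that 10-bit ambiguity is easy to sketch (Python below; this is my framing, not pcchen's actual tester). FX12 is commonly described as s1.10 fixed point, so it resolves a uniform step of 2^-10 everywhere in its range, while FP16's 10-bit mantissa gives a step of 2^-10 near 1.0 that scales with the exponent elsewhere. A tester probing around 1.0 therefore reads "about 10 bits" either way:

import math

def fx12_step():
    # FX12 (commonly described as s1.10 fixed point: sign, one integer
    # bit, ten fractional bits) has a uniform step of 2^-10 everywhere.
    return 2.0 ** -10

def fp16_step_near(value):
    # FP16 has a 10-bit mantissa, so its step size scales with the
    # exponent of the value being represented.
    exponent = math.floor(math.log2(abs(value)))
    return 2.0 ** (exponent - 10)

print("FX12 step anywhere:   ", fx12_step())            # 0.0009765625
print("FP16 step near 1.0:   ", fp16_step_near(1.0))    # 0.0009765625 -- identical
print("FP16 step near 0.125: ", fp16_step_near(0.125))  # 2^-13, eight times finer

So a measurement taken around 1.0 can't separate the two; probing small values inside FX12's [-2, 2) range, where a float's step shrinks but fixed point's doesn't, would be one way to actually tell them apart.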