The average Joe who downloads the application or installs it from a magazine CD will never read anything about PS1.4.
That's the whole point. It was next to impossible to download it or grab it off a CD without seeing a reference to the new test, which was clearly identified as a PS1.4 test (JUST like the two examples you have pulled up).
Seriously, you guys are acting like the Advanced Pixel Shader's only mention of PS1.4 was on display at the offices, in the basement, without stairs, behind a door that says "Beware of the leopard." Sorry, try again.
I guess if I went to forums around the globe and posted "Hey guys, I just discovered the APS has PS1.4!" the result would be a bunch of people going "Wow! I never knew that!" Give it a rest.
The APS test already needs two passes on non-PS1.4 hardware as opposed to one on Radeon; what more do you want? An artificial scenario that strongly favours PS1.4 coding, even though showcasing PS1.4 wasn't the intent of this test?
There is the point of disagreement. The results on a Radeon already disprove this (just disable PS1.4). The "point" of the test is pretty clear: that PS1.4 performs similarly to or worse than PS1.1 (as illustrated by the scores, or by any amount of unbiased testing), and the test showcases this message nicely.
I'm okay with that. Really. It just goes hand in hand with my prediction that this will not be the case with 3dmark03. I guarantee it. Any DX9.0 feature tests will be used as showcases (once again) to make the same kinds of fictional statements about performance: the same kinds that 3dmark99, 2000, 2001 and 2001SE have flip-flopped on, concerning levels of complexity, static/prepathed HW T&L, excruciatingly specialized texturing, and a PS1.4 test that gains little to no performance over its PS1.0/1.1 runtime performance.
I'm waiting for 3dmark03 to either prove or disprove this theory.
It all comes down to the design approach of a benchmark. If all IHVs are set aside and the focus becomes the API, the resulting tests will be substantially different. The moment coding becomes custom-tailored to one or more IHVs is the moment the usefulness of said tool diminishes, and at that same moment it becomes nothing more than another marketing tool.
As Democoder put it: to test shader performance, take something generic and not tied to any IHV's capabilities. Take something non-realtime in Renderman or similar, create the highest level of complexity the API supports, write the most optimal methods (by API, not by IHV) for the lower-level fallbacks, then let it rip. The only code changes from that point forward would be for possible bugs, not optimizations.
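To make the API-first idea concrete, here's a rough sketch of a capability-driven fallback chain of the kind described above. All the names, version numbers, and pass counts are purely illustrative (not anything from 3DMark's actual internals); the point is that path selection keys off the pixel-shader version the API reports, never off which IHV made the card.

```python
# Illustrative fallback table: each entry is (minimum pixel-shader
# version required, path name, passes needed to render the same
# effect at full complexity). Values are hypothetical.
SHADER_PATHS = [
    (1.4, "ps_1_4_single_pass", 1),  # PS1.4: whole effect in one pass
    (1.1, "ps_1_1_multi_pass", 2),   # PS1.1: same effect split over two passes
    (0.0, "fixed_function", 4),      # no pixel shaders: fixed-function multipass
]

def pick_shader_path(supported_ps_version: float):
    """Return the best (path name, pass count) the hardware can run.

    The benchmark is written once against the API's capability levels;
    it never branches on the vendor of the card.
    """
    for required, name, passes in SHADER_PATHS:
        if supported_ps_version >= required:
            return name, passes
    raise RuntimeError("no usable rendering path")
```

Under this scheme, `pick_shader_path(1.4)` lands on the one-pass path and `pick_shader_path(1.1)` on the two-pass path, which is exactly the Radeon-versus-the-rest split being argued about, but arrived at from the API side rather than coded for any one vendor.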
You don't create Utopian scenarios in a benchmark for any particular feature set.