Wasn't Half-Life 2 reported to be system/CPU-limited for the 9800 (non-SE) family at 1024x768 with no AA or AF? Wouldn't 95 fps then be primarily a function of a fast system (an Athlon 64, perhaps), even if full DX9 settings were used and the result indicates that floating-point shading problems are no longer dramatically hobbling performance?
As for UT2003, I find those numbers pretty believable for some flyby maps at a decent quality level (as in, not like the Quake 3, etc., numbers put out before the NV30 launch). However, was the source for all the numbers nVidia or some third party? I just ran a quick Benchmark.exe at 1600x1200, without closing any programs, on a 3.0GHz/250FSB/200MEM system with the card at 432/398 versus their 466/366, vsync set to application preference:
No AA, No AF - 127 fps average (152 fps asbestos, 102 fps antalus).
4xAA 16xAF (Control panel Quality) - 73 fps average (85 fps asbestos, 61 fps antalus).
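For what it's worth, the averages above are just the mean of the two flyby runs; a quick sketch to check the arithmetic (map names as quoted above):

```python
# Mean of the two flyby results quoted above (asbestos, antalus).
no_aa_avg = (152 + 102) / 2   # no AA, no AF
aa_af_avg = (85 + 61) / 2     # 4xAA 16xAF

print(no_aa_avg)  # 127.0
print(aa_af_avg)  # 73.0
```

Both come out exactly as reported, so the quoted averages are at least internally consistent.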
Plenty of opportunity here for assumptions to distort things without knowing the system and maps used across the board, especially given how dependent flybys are on system performance. For example, were the 9800 Pro numbers pulled from the same map, settings, and system, or from some other undisclosed source (and therefore completely useless in the context given, rather than simply suspect)?
The included HALO benchmark mode drops from 33.56 fps at 1024x768 to 26.33 fps at 1600x1200 (vsync off, all settings maximum, no programs closed). This again makes me wonder what system and settings they used, because the system seems to have a large impact.
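A quick back-of-the-envelope calculation shows why I suspect the system matters so much here: 1600x1200 pushes roughly 2.44x the pixels of 1024x768, yet the framerate only drops about 21%, which points at a CPU/system bottleneck rather than the GPU. A sketch of that arithmetic:

```python
# Pixel-count ratio between the two resolutions vs. the observed fps drop
# (HALO numbers from my run above).
pixels_low = 1024 * 768     # 786,432 pixels
pixels_high = 1600 * 1200   # 1,920,000 pixels

pixel_ratio = pixels_high / pixels_low   # ~2.44x the fill/shading work
fps_drop = 1 - 26.33 / 33.56             # only ~21.5% slower

print(round(pixel_ratio, 2))       # 2.44
print(round(fps_drop * 100, 1))    # 21.5
```

If the card were the bottleneck, 2.44x the pixels should cost far more than 21% of the framerate.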