...so we got our hands on them and ran 'em. Here are the results we found,
along with the Radeon results (from SharkyExtreme):
<<Nvidia 6.16 vs ATI Quake3.xls>>
...is it just me, or is it a coincidence that the new 6.16 drivers give the GTS
just enough performance to put it over the top in the 32bpp numbers, where
ATI used to have the advantage? Maybe I just believe in conspiracies, but
something smells bad about this whole thing. Of course I'm biased, but I
find it all very suspicious. Plus I just don't trust nvidia.
Has anyone:
- Run the new 6.16 drivers on a demo set other than the "usual" ones
(demo001, demo002, Quaver, etc.)? Do the 6.16 drivers show a benefit across
the board on all demo sets, or are they just optimized for the ones
reviewers typically use? (See the parsing sketch after this list.)
- Actually measured a real-world performance benefit? The ideal thing here
would be to put 2 identical GTS systems side by side, one with the 6.16
drivers and one with the older (5.22) drivers, then put one of them in
"follow" mode on the network...turn on the framerate counter in both and
run around a real-world gameworld...is there a real-world perf. benefit
with the new drivers (at the high-rez, 32bpp settings, which is where they
seem to have made the biggest perf. gains)? If not, then they've just
optimized the benchmark and not the gameplay. If you can't set up a
"follow" scenario like this, then just pick some spots in your favorite
maps and measure perf. there during real-world gameplay with both the old
and new drivers (see the second sketch below).
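If anyone wants to automate the demo-set comparison, here's a minimal
sketch in Python. It assumes you've captured the console output of each
timedemo run to a text file (the directory layout and demo names below are
made up, and the regex may need tweaking to match the exact summary line
your build prints):

    import re

    # Matches a timedemo summary line along the lines of:
    #   1260 frames 35.1 seconds 35.9 fps
    # (adjust if your console output is formatted differently)
    TIMEDEMO_RE = re.compile(r"(\d+)\s+frames\s+([\d.]+)\s+seconds\s+([\d.]+)\s+fps")

    def parse_fps(logfile):
        # Return the fps figure from a captured timedemo console log
        with open(logfile) as f:
            for line in f:
                m = TIMEDEMO_RE.search(line)
                if m:
                    return float(m.group(3))
        raise ValueError(f"no timedemo summary found in {logfile}")

    # One captured log per demo per driver version, e.g.
    # logs/5.22/demo001.txt and logs/6.16/demo001.txt (hypothetical layout)
    demos = ["demo001", "demo002", "quaver", "mydemo1", "mydemo2"]

    for demo in demos:
        old = parse_fps(f"logs/5.22/{demo}.txt")
        new = parse_fps(f"logs/6.16/{demo}.txt")
        gain = (new - old) / old * 100
        print(f"{demo:10s} 5.22: {old:6.1f} fps   6.16: {new:6.1f} fps   ({gain:+.1f}%)")

If the gains only show up on demo001/demo002/Quaver and not on freshly
recorded demos, that would say a lot by itself.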
...in a nutshell, I'd sure like to see someone look hard at these drivers
and poke into whether they're just tuned for the Q3 benchmarks or whether
they actually buy the user anything in terms of real-world game
performance...
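...and for the real-world half, even crude numbers would settle it. A
minimal sketch along the same lines, assuming you've written the
framerate-counter readings from matching spots into plain text files, one
number per line (both filenames are made up):

    import statistics

    def load_samples(path):
        # One fps reading per line, taken at the same map spots on each run
        with open(path) as f:
            return [float(line) for line in f if line.strip()]

    old = load_samples("gameplay_5.22.txt")
    new = load_samples("gameplay_6.16.txt")

    for name, samples in (("5.22", old), ("6.16", new)):
        print(f"{name}: mean {statistics.mean(samples):.1f} fps, "
              f"min {min(samples):.1f}, max {max(samples):.1f}")

    gain = (statistics.mean(new) - statistics.mean(old)) / statistics.mean(old) * 100
    print(f"real-world mean change: {gain:+.1f}%")

If the timedemo numbers jump but these don't move, the "optimization" is
for reviewers, not for players.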
Scott