PatrickL said:
"The whole problem with shader replacement, when the game is used as a benchmark, is that the benchmark becomes an indication only of the card's performance in that game, and even more narrowly in that version of the game."

Which is why you never buy a video card based on just one benchmark (or, indeed, one review). I think that's pretty much common sense.
However, if you had been foolish enough to buy an NV30-class card, isn't it the least Nvidia can do to try to give you decent performance? Don't those end users deserve that? I find it a little hard to swallow that ATI fans want to deny gamers large performance increases on the grounds that the game might be used for benchmarks (especially the same ATI users who vocally defended ATI when it did exactly the same thing with Quake 3).
I don't think there's anything wrong with per-game optimisations as long as A) there is no discernible effect on IQ, and B) there is a degree of openness about the fact that they're being done. Is it really any different from Valve working closely with ATI's coders to make sure HL2 runs well on their hardware, or from Nvidia working directly with developers to help them write the most efficient shaders possible?
All I'd ask is that if Nvidia are doing optimisations for one particular game, they should say so openly (the same goes for ATI). If the optimisations are more generic, though, that's their business. Again, as far as I know, no one has yet proved that Nvidia have done anything.