Borsti said:
Ouch... that's a copy error in the charts. The FSAA results in that table were run in Medium Quality, not HQ.
That's a pretty big error.
For the record, you also made a typographical error here:
http://www6.tomshardware.com/graphic/20030512/geforce_fx_5900-11.html
All of the charts on this page say "Medium Quality", though you typed "High Quality" as the title of the page.
I did not want to post more HQ results until I knew more about what happens in Doom III's HQ modes. At the time we published the article I was still waiting for a response from John. My guess was that AF is on in HQ, but I was not sure at that time.
I don't understand. Whether AF is on or off in HQ, why should that prevent you from doing more tests? Or do you think it's only being turned on for one card, and not others?
For the record, it has been said that HQ turns on "8X AF" and turns off texture compression. (I can't, of course, verify that myself.) It sounds to me like, assuming the AF that is turned on with the FX is its "application" AF, the FX AF just takes a much bigger performance hit than ATI's AF. That is nothing new, and actually explains the results quite nicely.
I assume that, realizing this, nVidia will in the future not (at least not easily) allow their "high quality" AF to be turned on by Doom3... anyone care to make a wager on that?
ATI may have told [H] about the Cat 3.2 and the 256 MB card, but they did not tell me.
Did you ask ATI what the possible implications were of not using the drivers they gave you, and reverting back to older drivers?
In other words, why were the HQ benchmarks basically discarded? Why wasn't AA tested at those settings? These are 256 MB $500 cards. That would be one of the FIRST things I tested.
This question doesn't pertain just to you, but to [H] and Anand as well... NO HQ benchmarks?! For these cards? Interestingly, Anand claimed that HQ scores didn't change much from Medium Quality scores.
YOUR benchmarks seem to go against his claims, at least with nVidia hardware: nVidia hardware takes a major hit, while ATI takes a relatively minor one.
Seems to me that would be a highly relevant characteristic to take note of, and yet not one web site picked up on it. Your review came closest, because at least you ran one set of tests, though you seem to pass off the large hit as a bug rather than consider that it might be a genuine characteristic of the hardware.
Honest question: did nVidia request that you benchmark medium quality, or suggest that you not benchmark high quality?
"Do you only believe nVidia when they tell you there are driver issues?"... do I read an imputation here? What exactly do you want to say with this?
I'm saying that when the "High Quality" results didn't go as "planned" (nVidia was "only" on par with ATI), you asked nVidia, "Why?" They said possible bug, which was apparently enough for you not to bother doing further high quality testing.
And yet, when the Cat 3.4's didn't work as expected, you just used the 3.2's without apparently asking ATI what the implications of that could be?
The biggest problem with all the Doom III benchmarking was the limited time. We only had one day to play with it.
I completely understand this problem.
And this is one main reason why I so detest the manner in which the whole Doom3 benchmark thing was handled. With a brand new benchmark, such limited time, and no ability to go back and investigate issues, things like this are bound to happen. I'm not saying I blame you for running the benchmarks (who wouldn't, given the chance to run Doom3?). But nVidia and Id did the whole community a HUGE disservice by introducing Doom3 performance this way. It's not fair to you, and it's not fair to us, the consumers.