Borsti said:
My results are correct, as far as I can tell. I messed up the table description, but that does not change anything about the results (numbers).
It should change the conclusions considerably. Based on the "Wrong" labels, it looks like the NV35 does better on "medium settings without AA and without Aniso", and the NV35 ALSO does better when cranking up the image quality to maximum, plus AA.
In short, based on the old and incorrect labels, the NV35 performs better at minimal AND maximum quality.
NOW, with the correct labels, it looks like NV35 performs better in medium quality...but crank up the in-game quality, and the R350 pulls even. Dunno what ultimately happens when you crank up in-game quality AND AA settings. That paints a pretty different picture.
I don't know why Anand said that there's no difference between HQ and medium. You have to ask him. I found a big difference between those settings.
Clarification: did you find a big difference in performance, or in quality (or both)? I'll assume performance, because that's what I said Anand did not find all that different.
I'll have to refute that though. According to your tests,
The GeForceFX took a significant performance hit when going from medium to high quality (83 to 55 FPS at 1024).
However, the Radeon did not. It only went from 68 to 61 FPS.
Again, this is my point. Based on your numbers, the FX takes a major performance penalty when going from medium to high quality, and the Radeon 9800 does not.
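To make the size of that penalty concrete, here is a quick back-of-the-envelope sketch (plain Python, using only the FPS figures quoted above; the percentages are derived from them, not from the original article):

```python
# Relative performance drop going from medium to high quality,
# using the 1024 resolution FPS figures quoted above.
def quality_penalty(medium_fps, high_fps):
    """Percent of frame rate lost when switching medium -> high quality."""
    return (medium_fps - high_fps) / medium_fps * 100

print(f"GeForce FX:  {quality_penalty(83, 55):.0f}% drop")   # roughly 34%
print(f"Radeon 9800: {quality_penalty(68, 61):.0f}% drop")   # roughly 10%
```

Roughly a one-third drop for the FX versus about a tenth for the Radeon.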
There was a performance difference between the AF the app wants and the AF setting the driver is forcing. I spoke to JC about this and he is thinking about solutions for that.
I don't know what you're trying to say here. The app calls for 8X aniso....how do you know there is a performance difference between what is asked for, vs. what is delivered? What is your baseline for measurement?
The default AF rendering in Detonator FX is Quality - as well as in ATI's driver. There was a little difference between HQ and the AF driver settings (about 0.5 to 0.2 FPS depending on the resolution). So HQ shows correct numbers for the high-quality AF modes of both cards.
Again, I don't follow you.
Little performance difference between what? Do you mean this:
1) Turning off "forced" Aniso in the driver control panel, and letting Doom3 set the aniso level.
and
2) Turning on "Forced 8X Aniso" with default quality in the driver control panel....and running doom3 in what? High Quality or Medium Quality mode?
You seem to be saying now (correct me if I'm wrong) that whether Aniso is forced by the panel, or from within Doom3 internally, the results are very close.
If that's the case, then why is there any reason to suspect there are problems with Aniso in nVidia's drivers?
One other reason why I did not do more HQ testing was simply time. I thought it was more important to run tests with as many cards as I could. And it takes a lot of time to run these tests. I had to decide between detailed tests with two cards or running as many cards as I could. In the end I did a mix of both.
Understood.
There were also no rules from NV that we had to test specific quality modes only or anything else.
That's good to hear.
From what I have learned of id's policy so far, they do not publish anything until it's done.
Their statements are to the contrary though in this case. (You can't post screen-shots, etc., because well, it's not "done" enough.)
So I still trust those numbers. JC wouldn't have allowed this if there were still trouble with the codepaths.
To be clear, I don't doubt that Carmack believes that HIS ENGINE CODE is pretty near final with only minor tweaks left. What about ATI's drivers though? Does he have ATI driver builds that the public does not? Does he know how much, if any, headroom is left in whatever ATI drivers he does have?
I know that ATI feels handicapped here. But it's not my fault that they didn't optimize the driver yet (as they say).
I think that's a poor and misguided attitude to take. I would agree IF the benchmark were either public, or available to ATI and known to be publicly tested with some notice. The point is, you have no idea if ATI has driver builds that are more Doom3-optimized than the Cat 3.2s.
I also know that ATI would have liked to do the same thing with Half-Life 2, since it's developed on ATI cards and the drivers are already optimized for the game.
The publicly available drivers? Maybe, maybe not.
The reason why that did not happen was that Valve did not want to do this (yet).
Kudos to them. If they aren't ready, then they shouldn't do it.
I wouldn't have any problem running HL2 benchmarks at this time.
I wouldn't either, if all vendors had ample notice that the benchmarks were going to be done, to make sure that any driver optimizations they may have in house make it into the testing drivers.
If the game developer says that they're ready for a performance evaluation, there's no reason not to do that. If one company has neglected their optimization for a certain game so far - well, shame on them!
Shame on you.
If one company knows that the game / demo is going to be released by date X, THEN shame on that company if they don't get drivers out to support it. If one company is blindsided by an unknown public display, SHAME ON ANYONE involved. This includes ID, NVIDIA, and yes, you.
ATI is well aware of the development status of Doom III.
They were NOT aware of this public doom3 benchmarking.
They also know that we'll see a test version of it in the near future.
And if that is true, and if ATI doesn't get any Doom3 optimizations in their drivers by the time the test version THEY KNOW OF is released, then yes, shame on them.
So shame on ATI if they have not optimized their driver yet.
Shame on you for assuming that they don't have drivers in house that are not released to the public.
I don't want to worry about whether it fits one company's marketing when I run benchmarks with a game, even if it's not yet released. Why should I care?
Because your readers expect benchmarks to be done on as fair and level a playing field as possible, perhaps?
Everybody says that they are working very closely with the game developers, but if you run some benchmark everything is suddenly different?
You seem to basically miss the premise that publicly released drivers are not the same as in-house/development drivers. There's no sense in publicly releasing drivers that have optimizations for a game or benchmark that is not available, if the suite of optimizations has not been tested enough to ensure it doesn't cause other problems for already-shipping titles and benchmarks.
Let's take HL2 as a (hypothetical) example. Say you want to buy a new card right now and you're waiting for that game. If I run benchmarks at this time with an alpha, the ATI cards might look better. So what's the conclusion for that guy? He'll buy an ATI card. Is that unfair?
That all depends on the circumstances surrounding the test!
Unfair for whom? For NV or for the consumer?
It could be unfair to BOTH. That's the point.
One can argue that there might be an optimization but it's not implemented in the public drivers yet. But why should they make such a silly decision?
SILLY?!
I explained it above, and someone else also explained it. Why SHOULD a company release drivers to the public that have optimizations for a game that doesn't (or shouldn't) exist yet!? Possible upside? NONE. Possible downside...negative effects for other games.