Doom3 benches revisited.

My whole take is this: NONE of the Doom3 benchmarks accurately portray how the 9800 Pro will perform when Doom 3 is actually released. I thought these sites were trying to help consumers make a knowledgeable purchasing decision. Comparing the two products' benchmarks under these conditions can only mislead.
 
Joe DeFuria said:
Borsti said:
So it looks like the performance drop of NV35 in Quality mode has nothing to do with Aniso at all.

Possibly... or forcing Aniso on in the driver doesn't do anything at all in Doom3. (Did you check the image quality?)

It is very surprising to see almost no performance drop with 8X "quality" aniso on the FX. This is unlike any other situation I know of. Look at your own UT benchmarks.

In short... looking at your data (the medium quality numbers), I would be more inclined to suspect that nVidia has a driver bug that DOESN'T ACTUALLY TURN ON aniso from the control panel, or perhaps turns on a different setting (performance) than the one selected. And that the performance drop between the medium and high quality Doom3 settings is in fact a combination of proper aniso actually being turned on and higher quality (more bandwidth-sucking) textures.


Borsti said:
Seems to be trouble with the textures or something. I feel very bad that I did not run more HQ tests with the R350; that would make things much clearer now.


Again, I rather suspect that "forcing on" Aniso isn't working properly with Doom3.
[dream caster: I highlighted those words in red]

:devilish: For a "bug", this one would be strangely similar to the 3DMark2003 "bug" in Detonator 44.03, and just as convenient (at least for inflating benchmark scores)!
 
I just noticed something interesting with the Doom 3 benchmarks in the THG and HardOCP articles.

GeForce FX 5900 Ultra
Medium Quality, 4x AA, no AF (from THG - http://www6.tomshardware.com/graphic/20030512/geforce_fx_5900-13.html)
1024x768 - 57.1 fps
1280x1024 - 38.1 fps
1600x1200 - 27.7 fps
Medium Quality, 4x AA, 8x AF (from HardOCP - http://www.hardocp.com/article.html?art=NDc0LDI=)
1024x768 - 57.2 fps
1280x1024 - 38.1 fps
1600x1200 - 27.6 fps

Notice how the scores are within 0.1 fps of each other, despite 8x AF supposedly being enabled in the second set of tests. Both sites used P4 3.0GHz CPUs.

Radeon 9800 Pro
Medium Quality, 4x AA, no AF (from THG - http://www6.tomshardware.com/graphic/20030512/geforce_fx_5900-13.html)
1024x768 - 43.3 fps
1280x1024 - 29.2 fps
1600x1200 - 17.3 fps
Medium Quality, 4x AA, 8x AF (from HardOCP - http://www.hardocp.com/article.html?art=NDc0LDI=)
1024x768 - 37.8 fps
1280x1024 - 26.2 fps
1600x1200 - 16.5 fps

Notice how the 9800 scores significantly lower with AF enabled, as would be expected.
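
For what it's worth, here's a quick back-of-the-envelope sketch of the relative drops, using the numbers quoted above (the variable names and layout are just mine for illustration):

    /* Percent drop from "no AF" to "8x AF", medium quality, 4x AA,
       using the THG and HardOCP figures quoted above. */
    #include <stdio.h>

    int main(void) {
        /* {no-AF fps, 8x-AF fps} at 1024x768, 1280x1024, 1600x1200 */
        double fx5900[3][2] = {{57.1, 57.2}, {38.1, 38.1}, {27.7, 27.6}};
        double r9800[3][2]  = {{43.3, 37.8}, {29.2, 26.2}, {17.3, 16.5}};
        const char *res[3]  = {"1024x768", "1280x1024", "1600x1200"};

        for (int i = 0; i < 3; i++) {
            double fxDrop = 100.0 * (fx5900[i][0] - fx5900[i][1]) / fx5900[i][0];
            double rDrop  = 100.0 * (r9800[i][0]  - r9800[i][1])  / r9800[i][0];
            printf("%-10s  5900U: %+5.1f%%   9800P: %+5.1f%%\n", res[i], fxDrop, rDrop);
        }
        return 0;
    }

That works out to roughly 0% (within measurement noise) on the 5900 Ultra versus a 5-13% drop on the 9800 Pro.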

This raises the question of whether AF was actually being enabled on the GeForce FX cards in the Doom 3 previews. Given how much this game stresses the graphics hardware, it seems highly unlikely that enabling 8x AF would have no performance impact whatsoever.

Even more interesting is the news that the High Quality mode in Doom 3 enables AF from within the app. Only THG ran benchmarks in this mode - and in those benchmarks the 9800 Pro matched or outperformed the 5900 Ultra.

If it is true that AF was not actually being enabled on the GeForce cards for the tests (whether by cheat, driver bug, or game bug), but it was correctly enabled on the Radeon cards, then a good portion of the Doom 3 benchmarks we saw posted would be grossly inaccurate. In fact, only the High Quality tests on THG would give a legitimate picture of relative performance with AF - and if all the test results looked like that, I think the conclusions drawn by the articles would have sounded much different than they do now. Especially considering that anyone who pays $300+ for a gfx card these days is going to expect to keep AF on all the time.

Pity no one was allowed to take screenshots to prove whether or not AF was getting correctly enabled... but I guess we now know why Nvidia might have wanted it that way!
 
That is interesting. Either nvidia does AF for free (impossible, isn't it?)... or there is no AF being applied. There is no difference in frame rate between no AF and 8x AF on the GeForce FX 5900. That is interesting and noteworthy, GV. Thanks.
 
One would say: something is fishy in the state of Denmark ;)

I don't know if it has the same meaning as it does here: "es ist etwas faul im Staate Dänemark" ("something is rotten in the state of Denmark").

I hope B3D gets the opportunity to do Doom3 benchmarks; it would be nice, for once, to have some reliable numbers.
 
tEd said:
One would say: something is fishy in the state of Denmark ;)

I don't know if it has the same meaning as it does here: "es ist etwas faul im Staate Dänemark" ("something is rotten in the state of Denmark").

I hope B3D gets the opportunity to do Doom3 benchmarks; it would be nice, for once, to have some reliable numbers.
That would be something SOME people wouldn't want.
 
Wouldn't surprise me. Seems to be a lot of that lately: driver settings not doing what you tell them to do. Reviewers need to spend more time actually watching what they're benching to at least have a clue as to whether the settings are taking effect or not.

As a side note, I see that Lars updated his review to reflect that the AA numbers were actually at the medium quality setting and not high quality. Now if only Anand would get around to fixing his.
 
Nothing fishy at all. The drivers only apply the level of AF specified in the driver panel if the application itself doesn't specify an AF level. If the app does specify an AF level then that one is used. To put it another way, the settings in the driver panel are used as a default, but can be overridden by the application.

This is nothing new. I seem to remember it happening in JK2 too (though slightly differently: AFAIK, if you set AF off in game, the game would simply not specify an AF level to the drivers, so the driver default was used, whereas in D3 I think the game specifies an AF level of 1 if AF is not selected). It also works like that in Tribes2.

JC has said that this is the way he wants drivers to do it, so I don't see anything wrong with it. (Or, nothing wrong except the reviewers not having a clue)
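
To be concrete about what "the app specifies an AF level" means: in OpenGL, an application can request anisotropy per texture through the EXT_texture_filter_anisotropic extension, something along these lines (a minimal sketch only; I don't know exactly what Doom3's renderer does, and the function and variable names here are made up):

    #include <GL/gl.h>
    #include <GL/glext.h>   /* tokens for EXT_texture_filter_anisotropic */

    /* Hypothetical helper: set the AF level the application wants on one texture. */
    static void setTextureAniso(GLuint textureId, int wantAniso)
    {
        GLfloat maxAniso = 1.0f;
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);

        /* Requesting 1.0 means "no anisotropy"; under the behaviour described
           above, the driver should honour that rather than force the control
           panel setting over it. */
        GLfloat requested = wantAniso ? (maxAniso < 8.0f ? maxAniso : 8.0f) : 1.0f;

        glBindTexture(GL_TEXTURE_2D, textureId);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
    }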

I think NVIDIA deserves getting ripped a new one for the 3dmark2003 stuff; that was so wrong (and couldn’t in any possible way have been a bug). I don’t think they deserve getting ripped a new one for handling AF selection the right way (I agree with JC on this one, this is the right way to do it). They should have tried to make people aware of what they were doing, but not doing so is still not cheating. Or maybe they could include a switch in the driver panels: Use as default / Force.
 
Thowllly said:
Nothing fishy at all. The drivers only apply the level of AF specified in the driver panel if the application itself doesn't specify an AF level. If the app does specify an AF level then that one is used. To put it another way, the settings in the driver panel are used as a default, but can be overridden by the application.

The thing is, AFAIK, in Medium Quality, the app is NOT specifying an AF level. It's just bilinear (or possibly trilinear). So why would there be no AF if:

1) Doom3 does not specify AF
2) The control panel App does.

Or are you saying that in medium quality, Doom3 is "forcing" Anisotropic filtering off, and the drivers should obey that?
 
Joe DeFuria said:
are you saying that in medium quality, Doom3 is "forcing" Anisotropic filtering off, and the drivers should obey that?
Yep, that is what I think happens. Most games don't specify an AF level at all. If a game specifies 1x AF then the drivers obey that (at least they do in OpenGL, I don't know about D3D...)
 
Thowllly said:
Yep, that is what I think happens. Most games don't specify an AF level at all. If a game specifies 1x AF then the drivers obey that (at least they do in OpenGL, I don't know about D3D...)

Just out of curiosity... is that how Quake3 works? Do you know for a fact that's how Doom3 works, or are you just guessing?

In any case, we are both saying essentially the same thing: the REVIEWER needs to look at the resultant image quality and make a determination about what's going on.

In this case, it appears that nVidia's and ATI's drivers are behaving differently. I'm less interested in determining which one is more "proper" (should ATI NOT be forcing Aniso on in medium quality?) than in just knowing what is going on.

Sadly, I guess we won't know until there's some official demo/test release.
 
Whether or not this is a cheat, it still invalidates the results. And of course, Nvidia isn't going to complain about it, but I'm sure ATI will.

At least the ATI control panel is explicit about how things like this should be handled. If you check the Application Preference box, then the app decides the AF level. If you uncheck it, then the control panel slider decides the AF level.

From the reviews I've seen of Detonator FX, its control panel no longer has an Application setting. That means if you specify, say, 8x AF, you can't be sure whether or not an app is going to override this, which is much more confusing. So much so, in fact, that it led several well-known reviewers to post erroneous results and conclusions for Doom 3.

You can blame Nvidia for not being clearer about what was going on, or you can blame the reviewers for not checking what was going on more carefully. Either way, I think someone should be doing something about it. I feel much more comfortable with my 9700 Pro now, but I wonder how many other people would decide to "upgrade" to a FX 5900 on the basis of the invalid results.
 
GraphixViolence said:
Either way, I think someone should be doing something about it. I feel much more comfortable with my 9700 Pro now, but I wonder how many other people would decide to "upgrade" to a FX 5900 on the basis of the invalid results.

The only saving grace right now is that the 5900 is not yet available. So even if people want to "upgrade" to a 5900... they can't.

I can only hope that by the time they actually become available, that this mess can be cleared up. (Both the 3DMark and the Doom3 messes). I don't care which choice people make, as long as it's a properly informed one.
 
Thowllly said:
Yep, that is what I think happens. Most games don't specify an AF level at all. If a game specifies 1x AF then the drivers obey that (at least they do in OpenGL, I don't know about D3D...)

ATi's drivers are specifically designed to override any game settings in this regard, in both OpenGL and D3D. That's why there's an "Application Preference" checkbox in the ATi drivers for both OpenGL and D3D. When this box is checked, the FSAA and AF settings become inaccessible in the control panel and are controlled exclusively by the application. When the box is cleared, the drivers override any internal application settings for AF and FSAA to the desired degree. That's the "right way" to do it, of course, since it gives the end user total control over his display.

So with the ATi drivers, in the event an application specifies 1x AF, the drivers will only provide 1x AF if the Application Preference box is checked--otherwise the drivers will override the application's settings to match the control-panel settings.
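
To illustrate the difference between the two behaviours being argued about in this thread, here is a rough sketch (not actual driver code; all names are made up for illustration, and appLevel < 0 stands for "the application never specified a level"):

    /* ATi model, as described above: the "Application Preference" checkbox
       decides who wins. */
    float atiEffectiveAniso(int appPreference, float panelLevel, float appLevel)
    {
        return appPreference ? appLevel : panelLevel;
    }

    /* "Panel as default" model (what Thowllly describes): the panel setting
       only applies when the application stays silent. */
    float defaultEffectiveAniso(float panelLevel, float appLevel)
    {
        return (appLevel < 0.0f) ? panelLevel : appLevel;
    }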
 