Doom3 benches revisited.

jjayb

Has anyone else noticed that two of the three sites that ran the benchmarks didn't use high quality settings? They used medium quality. The one site that did use high quality (Tom's) showed the 9800 beating the NV35 in two of the three tests.

Why weren't high quality settings used at the other two sites? By high quality I'm not talking about AA and AF, I'm talking about the in-game quality settings.

I can't believe I didn't notice this before. Switching to high quality settings gives a very different picture of 9800 vs. NV35 performance in Doom 3. Who here is actually going to play Doom 3 at medium quality on their 9800 or NV35?
 
jjayb said:
I can't believe I didn't notice this before. Switching to high quality settings gives a very different picture of 9800 vs. NV35 performance in Doom 3. Who here is actually going to play Doom 3 at medium quality on their 9800 or NV35?
1337 (translated: elite) multiplayer frag-fest tournament monkeys. :p Actually, I'm wrong, they'll use minimum quality settings in order to max out their framerates. :D
 
Of course, we don't know what the difference is between medium and high quality, so it's kind of a moot point until we do.
 
Of course, we don't know what the difference is between medium and high quality, so it's kind of a moot point until we do.

Of course we know the difference. Medium quality is a step below high quality.

The point, which is not moot in my opinion, is that the medium quality tests show the NV35 stomping the 9800, while the high quality tests show the 9800 edging out the NV35 in two of the three tests.

The impression I get from reading the forums around the net, and from the articles in question, is that the NV35 is stomping the 9800. Yet once you move off medium quality, that isn't the case at all.
 
You do realize the only difference between Doom 3's medium and high settings is higher samples of aniso and AA?

Then why do the scores for both cards drop between the medium and high quality tests when no aniso or AA is applied?
 
surfhurleydude said:
You do realize the only difference between Doom 3's medium and high settings is higher samples of aniso and AA?

AFAIR:

high quality = no TC and 8xAF
medium quality = TC and 0xAF

according to J.C.
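
To put some rough numbers on why disabling TC matters, here's a back-of-the-envelope sketch. The ratios are the standard S3TC/DXT1 figures (4 bits per pixel compressed vs. 32 bits uncompressed raw RGBA); how Doom 3 actually stores its textures is an assumption on my part, not something confirmed.

```python
# Rough footprint of a single 512x512 texture with a full mip chain.
# DXT1-style compression is ~4 bits/pixel; raw RGBA is 32 bits/pixel.

def texture_bytes(width, height, bits_per_pixel, mipmaps=True):
    total = width * height * bits_per_pixel // 8
    # A full mip chain adds roughly one third on top of the base level.
    return total * 4 // 3 if mipmaps else total

uncompressed = texture_bytes(512, 512, 32)  # "no TC", as in high quality
compressed   = texture_bytes(512, 512, 4)   # TC on, as in medium quality

print(f"uncompressed: {uncompressed / 2**20:.2f} MB")  # ~1.33 MB
print(f"compressed:   {compressed   / 2**20:.2f} MB")  # ~0.17 MB
```

By that math, turning TC off multiplies the texture footprint roughly eightfold, which is exactly the kind of thing that would hurt once textures stop fitting in local memory.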
 
Thought I'd post these numbers from Tom's for anyone too lazy to look.

Medium Quality (9800 / NV35, fps)

1024x768: 68.7 / 83.0
1280x1024: 46.8 / 56.9
1600x1200: 31.4 / 42.0

High Quality (9800 / NV35, fps)

1024x768: 61.0 / 55.0
1280x1024: 42.8 / 41.6
1600x1200: 29.5 / 32.9

The high quality numbers surely paint a different picture of the cards than the medium quality numbers alone.
I can't remember the last time I saw a review where a benchmark was run at medium quality and medium quality only. These are $500 video cards; you'd think they'd be tested at high quality.
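
To make that concrete, here's the medium-to-high drop worked out per card from Tom's numbers above (just arithmetic on the figures already quoted, nothing assumed):

```python
# (medium fps, high fps) per resolution, from Tom's numbers above.
scores = {
    "1024x768":  {"9800": (68.7, 61.0), "NV35": (83.0, 55.0)},
    "1280x1024": {"9800": (46.8, 42.8), "NV35": (56.9, 41.6)},
    "1600x1200": {"9800": (31.4, 29.5), "NV35": (42.0, 32.9)},
}

for res, cards in scores.items():
    for card, (medium, high) in cards.items():
        drop = (medium - high) / medium * 100
        print(f"{res} {card}: {medium} -> {high} fps ({drop:.1f}% drop)")
```

The 9800 only gives up roughly 6-11% going to high quality, while the NV35 loses roughly 22-34%, which is why the ranking flips.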
 
I'm confused.

I would expect the 256MB card to have an advantage when texture compression is turned off, but the 128MB card stays right with it, debunking that theory. And when moving to 1600x1200 you would think the ATI cards would widen the gap over the NV cards, not suddenly drop 30% in framerate and start losing.

Something doesn't add up...

Is ATI still storing textures purely in AGP memory?
 
Ilfirin said:
I'm confused.

I would expect the 256MB card to have an advantage when texture compression is turned off, but the 128MB card stays right with it, debunking that theory. And when moving to 1600x1200 you would think the ATI cards would widen the gap over the NV cards, not suddenly drop 30% in framerate and start losing.

Something doesn't add up...

Actually, the reason the ATI card drops at 1600x1200 might just be that the larger framebuffer is eating into the texture RAM.
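
A rough framebuffer estimate shows how much room the higher resolutions can claim. This is a sketch under assumptions: double-buffered 32-bit color plus a 32-bit depth/stencil buffer (three 4-byte-per-pixel surfaces) and no AA; AA would multiply these figures further.

```python
# Approximate on-card framebuffer footprint: front + back color buffers
# plus one depth/stencil buffer, all assumed at 4 bytes per pixel.
def framebuffer_mb(width, height, buffers=3, bytes_per_pixel=4):
    return width * height * buffers * bytes_per_pixel / 2**20

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.1f} MB")
```

That's roughly 9 MB at 1024x768 versus roughly 22 MB at 1600x1200, and with uncompressed textures on the card, the extra ~13 MB comes straight out of texture space.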
 
Tom's site is also the site that found the NV35 increasing in speed with HQ when AA was turned on...

If you read his descriptions:
At first, we see the ATi cards lead the pack. The FX 5900 Ultra only overtakes the competition at 1600 x 1200. According to NVIDIA, this may be a driver problem. The NVIDIA-optimized anisotropic filtering may have trouble with the anisotropic levels Doom III uses.
So it appears that however D3 turned on AF at this quality setting, the NV drivers didn't really like it.

Anand's comment on medium vs high quality:
The only options that we could change in game were the quality settings which we could set to low, medium or high. We benchmarked medium and high, but on the fastest cards the performance drop in high quality mode was negligible so we stuck to reporting the medium detail level scores due to time constraints.
This also makes it possible to manually compare how much of an effect AA and AF have on the game, since HQ would automatically enable at least AF.

(If it hasn't been mentioned yet:) The ATI R9800-256 in the D3 benches used the Cat 3.2 drivers, which do not recognize 256MB of RAM. Any conclusions about the effect of RAM drawn from the 128MB vs. 256MB ATI cards are invalid.
 
Tom's site is also the site that found the NV35 increasing in speed with HQ when AA was turned on...

Funny you mention it. So was Anand:

http://www.beyond3d.com/forum/viewtopic.php?t=5956


Anand's comment on medium vs high quality:
The only options that we could change in game were the quality settings which we could set to low, medium or high. We benchmarked medium and high, but on the fastest cards the performance drop in high quality mode was negligible so we stuck to reporting the medium detail level scores due to time constraints.

That's what gets me. If the difference was negligible, why did they use the medium quality scores? Who uses medium quality only in a review? If anything, I would have thought they would use the high quality scores. It just doesn't make sense to me. And HardOCP decided the same thing?
 
RussSchultz said:
But wouldn't the 128 meg GFFX have that same problem?

(Or was the GFFX 256MB and the 9800 128MB?)

Oh, it was a 256MB GFFX. That still doesn't explain why the 256MB 9800 Pro sees almost identical speed decreases to the 128MB version.
 
jjayb said:
That's what gets me. If the difference was negligible, why did they use the medium quality scores? Who uses medium quality only in a review? If anything, I would have thought they would use the high quality scores. It just doesn't make sense to me. And HardOCP decided the same thing?

If you wanted to see what effect enabling AF had on the framerate, and the HQ setting does in fact auto-engage some kind of AF, then you must use a setting that doesn't turn AF on automatically. That would be a reason for using the medium quality setting.

That's a justification for using the medium quality setting. It may be neither correct nor a good reason, but I could see it being one.
 
Ilfirin,
Any Doom 3 tests using the Cat 3.2 drivers on a 256MB Radeon 9800 Pro were treating it as a 128MB Radeon 9800 Pro, according to ATI's statements presented in one or more of the previews. Since Cat 3.4 seems to have had problems with Doom 3, I believe that covers all of them.

Actually, if that statement is correct, it holds for any benchmark run with the Cat 3.2 drivers on a 256MB Radeon 9800 Pro, not just Doom 3.
 