Tomb Raider: AOD DX9 Benchmarks

One thing I'd like to see is a benchmark with PS2.0 completely disabled, to see how much impact it has on the different cards. IMO it's important to rule out the possibility that the lackluster performance on nVidia cards is caused by something other than the DX9 shaders. The 5200's default settings values are already a hint (that the shaders _are_ the limiting factor), but a comparison of the high end boards would be more interesting.
 
Great article Dave...
a friend has an FX5200 and he says the card can use ALL the visual candy in this game....
so....here's my question
Can you enable the PS2.0 effects on the 5200/5600 FX cards? Does the card really use these effects, or does it just "say" that it does?

Thanks
 
I'm curious as to why this is "only" a beta article, and not an official one.

(Are you waiting for feedback from the IHVs? Not quite as comfortable with TR benchmarking as you'd like?)

This article is great stuff...not only good content, but content not found anywhere else on the 'net.

One suggestion:

http://www.beyond3d.com/misc/traod_dx9perf/index.php?p=5

It would be great to see a screenshot comparison of the output at the default settings. It would be very useful to see the "visual quality" difference at similar performance levels, rather than just knowing which settings are enabled / disabled.
 
Very interesting article thanks...

A picture is worth a thousand words though. Any chance of backing up the benchies with some IQ shots?

Cheers

loGan
 
I noticed that fog was disabled for all cards in the default settings. Is there an issue with it? How much does it affect performance?

It would be interesting to see an apples to apples comparison with fog turned off.
 
A lot depends on how the game is written... maybe this game is aimed more at ATI's implementation of shaders and not NVidia's. Some of this is evident from the settings... it appears that NVidia's cards are more stripped down in settings - what would the benchmarks be like if they were equal?
 
A lot depends on how the game is written... maybe this game is aimed more at ATI's implementation of shaders and not NVidia's.

Well, TR is in fact an "nVidia...the way it's meant to be played" title, so either:

1) The game was developed in part with decent consideration of nVidia's architecture, but that still doesn't make up for the lack of hardware capability.

2) "The way it's meant to be played" doesn't mean much at all in terms of user experience...it's just a co-marketing campaign.

(Probably, it's a combination of the above.)

guest - another said:
...NVidia's are more stripped down in settings - what would the benchmarks be like if they were equal.

?

The majority of the article has benchmarks where all the settings are the same....and the FX line-up gets trounced.
 
One thing I'd like to know is what the Cg compiler is doing for the GeforceFXs.

1) Do both the Cg and DX HLSL versions use the same code and just compile differently, with Cg minimizing register usage and DX HLSL minimizing instruction counts?

2) Or is there HLSL code specifically optimized for Cg, making use of the various precisions available to the GeForce FX? (A sketch of what that might look like is below.)
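For illustration only - this is speculation on my part, not anything confirmed about TR:AOD's actual shader path - a developer doesn't even need separate Cg source to nudge the FX toward its faster 16-bit path; the stock D3DX compiler can be asked to emit partial-precision ops via a flag. A minimal C++ sketch, where the file name "water.psh", the entry point "main", and the helper CompileWaterShader are all hypothetical:

```cpp
#include <d3dx9shader.h>  // link against d3dx9.lib

// Hypothetical example: compile one HLSL source two ways.
// D3DXSHADER_PARTIALPRECISION asks the compiler to mark ops as
// _pp (16-bit), which the GeForce FX runs considerably faster
// than full 32-bit precision; Radeons ignore the hint and run
// everything at their native 24-bit anyway.
LPD3DXBUFFER CompileWaterShader(BOOL partialPrecision)
{
    LPD3DXBUFFER shader = NULL;
    LPD3DXBUFFER errors = NULL;
    DWORD flags = partialPrecision ? D3DXSHADER_PARTIALPRECISION : 0;

    HRESULT hr = D3DXCompileShaderFromFile(
        "water.psh",       // hypothetical shader source file
        NULL, NULL,        // no macros, no custom include handler
        "main", "ps_2_0",  // entry point and target profile
        flags,
        &shader, &errors, NULL);

    if (FAILED(hr) && errors)
        OutputDebugStringA((char*)errors->GetBufferPointer());
    return SUCCEEDED(hr) ? shader : NULL;
}
```

Whether Core actually does anything like this for the Cg path is exactly what I'd like to know.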
 
Not only does this discredit Nvidia, but also Futuremark. Nv is cheating more heavily than was previously thought, and Futuremark are (now) endorsing it.

Let's just face it, Nvidia fokked up their dx9 line.
 
Joe DeFuria said:
A lot depends on how the game is written... maybe this game is aimed more at ATI's implementation of shaders and not NVidia's.

Well, TR is in fact an "nVidia...the way it's meant to be played" title, so either:

1) The game was developed in part with decent consideration of nVidia's architecture, but that still doesn't make up for the lack of hardware capability.

2) "The way it's meant to be played" doesn't mean much at all in terms of user experience...it's just a co-marketing campaign.

(Probably, it's a combination of the above.)
:LOL: I just found out it was a TWIMTBP title a few minutes ago and found it bleeding hysterical! :LOL:
 
Joe DeFuria said:
Anonymous said:
ahem, in the last test, the FX5200 is faster than the FX5600? how is that possible?

Read the first table.

http://www.beyond3d.com/misc/traod_dx9perf/index.php?p=6

The 5200 has ALL PS2.0 settings disabled.

The 5600 (and all ATI cards) are using at least some PS2.0 effects, with the ATI cards using more PS2.0 effects than the 5600.
About page 6... There's the following paragraph:
page 6 of TRAOD benchmark article said:
By default we see a similar pattern with the lower end boards than the high end boards - all the extra PS2.0 effects are turned off for the FX boards and enabled for the Radeons. You'll note that despite this being DX9 capable, all PS2.0 effects are disabled for it, effectively limiting it to DX8 functionality.
In the last sentence (which I've put in bold), it's not obvious what "it" is. I presume you mean the FX5200, but it should be made clearer.
 
I've done some quick shots showing a few of the differences between the various "preset" modes you can use (fixed function, PS1.1, 1.4 and 2.0), plus one with everything on and set to the maximum value. Each image is 1600 x 1200 and is between 500kB and 800kB:

Fixed function
http://freespace.virgin.net/neeyik.uk/tr_aod_shots/ff.jpg

PS1.1
http://freespace.virgin.net/neeyik.uk/tr_aod_shots/ps11.jpg

PS1.4
http://freespace.virgin.net/neeyik.uk/tr_aod_shots/ps14.jpg

PS2.0
http://freespace.virgin.net/neeyik.uk/tr_aod_shots/ps20.jpg

Maximum
http://freespace.virgin.net/neeyik.uk/tr_aod_shots/max.jpg

From my own testing, it's the post-processing screen effects that kill the 5900U. Disabling the glow, heat wave and DoF effects makes the B3D-level 1024 x 768 test result jump from 24.7 fps to 67.3 fps (and note that all the other PS2.0 shaders are still active).
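To put that in frame-time terms, here's the quick arithmetic on those two numbers (assuming the rest of the frame's work is unchanged between the runs):

```cpp
#include <cstdio>

// Convert the 5900U results above into per-frame cost.
int main()
{
    const double fpsEffectsOn  = 24.7;  // glow + heat wave + DoF enabled
    const double fpsEffectsOff = 67.3;  // post-processing disabled

    double msOn  = 1000.0 / fpsEffectsOn;   // ~40.5 ms per frame
    double msOff = 1000.0 / fpsEffectsOff;  // ~14.9 ms per frame

    // Rough cost of the post-processing pass alone: ~25.6 ms,
    // i.e. well over half of every frame at these settings.
    printf("Post-processing cost: %.1f ms/frame\n", msOn - msOff);
    return 0;
}
```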
 
Re: Great job, Rev...

Reverend said:
SmuvMoney said:
Great article, Reverend. :D <snip>
The article that this thread is about was written by Dave, not me.

I'm sorry - my reading comprehension left me on that one. :oops: When Dave mentioned that Reverend introduced the new TR:AOD benchmark, I mistook that for credit for the article as well. So I say to Reverend - great new benchmark, and I say to Dave - great article. :D No more late night Beyond 3D forum reading for me...
 
OK that was me above...

ARGH - I forgot to log in...

Neeyik, that is an interesting point. Dave, can you confirm this or was this already in the article in some way, shape, or form?

Does anyone have an ATi DX9 card and TR:AOD to check what increase removing the DoF, glow, and heat wave effects gives? I don't have the latter, unfortunately. It would be nice to see if ATi also shows a significant increase.
 
Great look at things. Should answer questions about the actual meaning of the TWIMTBP ad campaign for software titles (that there is no meaning apart from advertising); should answer anyone's questions about just what sort of DX9 card the 5200 actually is; and should explain exactly why nVidia quit the FutureMark program last year (which is really kind of funny, considering that this test meets nVidia's criteria for a "real 3D game" - it seems if anything performance might be worse in "real DX9 3D games" than in 3DMark03).
 
I still have a (stupid?) question...
Can you enable the PS2.0 effects on the 5200/5600 FX cards? Does the card REALLY use these effects, or does it just "say" that it does?

thanks
 
God, can't people read?

The game, when using default settings, disables PS2.0 effects on an FX5200 - which is the last test run by Dave.

Dave initially tested with the PS2.0 effects (DoF etc.) enabled. See how the FX5200 outperforms an FX5600 when PS2.0 is disabled; now look at the performance when they are enabled, in the tests on the first two pages of the article.
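And on the "does it REALLY use PS2.0" part: yes. The FX5200 exposes pixel shader 2.0 through the standard D3D9 caps, so when the effects are forced on they genuinely run on the hardware - just very slowly. A minimal sketch of the kind of caps query any DX9 game does at startup (illustrative only, not TR:AOD's actual code):

```cpp
#include <d3d9.h>
#include <cstdio>

int main()
{
    // Ask the D3D9 runtime what pixel shader version the
    // installed card reports.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // An FX5200 passes this check just like a 9800 Pro does -
    // the caps say nothing about how fast the shaders run,
    // which is why the game's defaults leave PS2.0 off on it.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        printf("Card reports PS2.0 support\n");

    d3d->Release();
    return 0;
}
```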
 