Tomb Raider: AOD DX9 Benchmarks

digitalwanderer said:
eSa said:
9800 Pro. It's faster and has better image quality.
Gabe Newell said that?!? :oops:

Well, so it seems. At least that's what a user called "SaiboT" seems to have got as an e-mail reply when he asked Gabe about the subject.

There are other gems in that thread too, like one guy asking how well his 2.0 GHz P4 + Radeon 9600 will do in the game, and Gabe answering 30-40 fps at 1024x768...


Oh, and these bits are there too:

"
Hey Gabe, just a few quick questions:
- Will I get a minimum of 30 fps with 512 MB of DDR RAM, an AMD Athlon XP 2700+, and a GeForce FX 5600?


You should be right around 30 FPS if you run with fairly high detail levels.
To get 30 FPS with a 5600 you'll probably have to turn off high dynamic
range lighting and character bump mapping (this will be done for you
automatically or you can tweak it). "



" Hi Gabe,
A lot of people seem to be arguing over what system the game was run on at E3. To end all the confusion, could you please tell me the system's specs? Also, if you had FSAA or AF on, what detail settings, and the fps you were getting. Thanks for clearing this up, and great work on HL2.



Dell XPS, 2.8 GHz P-IV, ATI Radeon 9800 128 MB RAM. Anti-aliasing off, anisotropic filtering off. 60 FPS. "
 
Eww, AA & AF off and 60fps? I was hoping I could squeak out 4xAA & 8xAF at 1024x768 in the 40-50fps range; looks like that'll be damned hard to pull off!

Thanks for all the info eSa, it's very much appreciated.
 
Digitalwanderer,

I sent this email to Mr. Newell a little before I read your msg. Hopefully he'll respond. :)

I've been quite interested in knowing what performance numbers for HL2 I can expect with my system. I just built it.

Pentium 4 3.0GHz C @ 3.12GHz (208MHz FSB)
1GB Corsair DDR dual channel (2 x 512MB sticks)
Radeon 9800 Pro 256MB, 378MHz core/351MHz memory @ 405MHz core/371MHz memory

I force 4xAA and 16xAF and Quality filtering in the ATI control panel. With my rig, what do you expect I should be able to get with and without AA/AF? I'd really like to play with AA and AF at the levels I stipulated, but this game looks like it could bring my system to its knees, so I'm flexible.

Thank you for any reply you see fit to give, if any. :)
 
Here's something of note...

Having just finished off some 5900 testing with two driver revisions (vendor supplied and latest NVIDIA reference), TR performance alters somewhat. Picking just one resolution, 1024x768, with all the B3D settings enabled (but no AA/AF), I get the following on a 5900U:

44.03 = 11.0 fps
45.23 = 24.7 fps

So what, you may think. Well, a discussion in another forum prompted me to dig out Marco's fill rate tester and give it a bash on a 5900:

Fillrate Tester
--------------------------
Display adapter: NVIDIA GeForce FX 5900
Driver version: 6.14.10.4523
Display mode: 1152x864 X8R8G8B8 85Hz
Z-Buffer format: D24S8
--------------------------

FFP - Pure fillrate - 1578.568604M pixels/sec
FFP - Z pixel rate - 1576.726807M pixels/sec
FFP - Single texture - 1486.767090M pixels/sec
FFP - Dual texture - 1384.217407M pixels/sec
FFP - Triple texture - 706.989380M pixels/sec
FFP - Quad texture - 482.227875M pixels/sec
PS 1.1 - Simple - 793.391479M pixels/sec
PS 1.4 - Simple - 746.848145M pixels/sec
PS 2.0 - Simple - 375.354065M pixels/sec
PS 2.0 PP - Simple - 499.227448M pixels/sec
PS 2.0 - Longer - 150.545624M pixels/sec
PS 2.0 PP - Longer - 300.501587M pixels/sec
PS 2.0 - Longer 4 Registers - 161.009705M pixels/sec
PS 2.0 PP - Longer 4 Registers - 374.919739M pixels/sec
PS 2.0 - Per Pixel Lighting - 73.962013M pixels/sec
PS 2.0 PP - Per Pixel Lighting - 93.716530M pixels/sec

What's particularly interesting is that the longer 4 register fill rates are higher than the "normal" longer tests. Now if you look at Dave's 5900U preview using the 44.03 drivers, his fill rate results for the longer tests are as follows:

http://www.beyond3d.com//previews/nvidia/nv35/index.php?p=21

PS 2.0 - Longer = 242.0
PS 2.0 PP - Longer = 340.3
PS 2.0 - Longer 4 Registers = 203.9
PS 2.0 PP - Longer 4 Registers = 273.4

The 44.03 drivers show a relatively large decrease in FP16 fill rate with the longer 4 register test, whereas the 45.23s show an increase. This would tie in with the increase in TR performance (Splinter Cell also runs better, though not to the same extent). Now, unless there is shader fiddling in all these tests ;) this would suggest that some fix for the poor shader performance has gone in. I'm not by any means suggesting that it'll leap up to R9800 Pro level; I'm just suggesting that, as Carmack stated a while ago, the NV3x pipelines are evil worms that still have more to offer. How much more is anyone's guess.
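
To make that concrete, here is a quick Python sketch that just replays the arithmetic on the figures quoted above; nothing in it is a new measurement, and since the two runs come from different setups/resolutions, only the within-driver ratios are worth looking at:

# FP16 (partial precision) fill rates copied from this post (45.23) and
# from Dave's 5900U preview (44.03), in Mpixels/sec.
results = {
    "44.03": {"PP Longer": 340.3, "PP Longer 4 Reg": 273.4},
    "45.23": {"PP Longer": 300.5, "PP Longer 4 Reg": 374.9},
}

for driver, r in results.items():
    # Within one driver revision: does adding register pressure help or
    # hurt FP16 throughput?
    delta = (r["PP Longer 4 Reg"] / r["PP Longer"] - 1) * 100
    print(f"{driver}: PP 'Longer 4 Registers' vs PP 'Longer': {delta:+.1f}%")

# Tomb Raider: AOD, 1024x768, B3D settings, no AA/AF (fps from this post).
tr = {"44.03": 11.0, "45.23": 24.7}
print(f"TR:AOD change between drivers: {(tr['45.23'] / tr['44.03'] - 1) * 100:+.0f}%")

That works out to roughly -20% on the 44.03s, +25% on the 45.23s, and a ~125% jump in TR:AOD.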
 
I just read at DH that the 50.* Detonators will be out later today.

Might be nice to re-test the FX series?
 
hjs said:
digitalwanderer said:
hjs said:
I just read at DH that the 50.* Detonators will be out later today.

Might be nice to re-test the FX series?
Link please?

http://www.driverheaven.net/showthread.php?s=&threadid=24785
I thought that's the thread you were talking about. Did you read the second or so post in that thread by the guy they got the story from?
Mark Fox said:
well, no I didn't say you'll be installing 50.xx drivers - hehe...I wish - just proof with a "sample" you can download yourself, all will be revealed as soon as w2s gets working well again, its on and off all the time at the moment

Then the story came out on Warp2search here:
http://www.warp2search.net/modules.php?name=News&file=article&sid=14098

It's just that they found some files referring to det 5 in some nForce2 drivers, nothing more.

No news here, sorry.
 
digitalwanderer said:
No news here, sorry.

Yeah, I saw that :(

Then again, my Ti 4200 is dying, I think; I just got my old Radeon 32MB DDR back in (one of the first in the Netherlands ;) ).
I guess it's time for a Radeon 9800 (Pro) with a Zalman 8)
 
http://forum.aquamark3.com/showthread.php?s=&postid=1686#post1686

From Wire at Massive

AquaMark3 will not show a 50% or higher performance difference on the high-end cards. If I remember correctly, Tom's Hardware already made some benchmarks using the rather similar Aquanox2.

I have no idea why Tomb Raider would be so slow, especially considering that its Playstation2-quality graphics don't justify using complicated shaders. Maybe they should not have used Cg, even on NVIDIA hardware. We're using a combination of directly coded assembly, some HLSL shaders and some hand-optimized assembly of HLSL shaders.

* * *

Well, not a 50% or more difference between the R350 and NV35, but I'd call >10% significant, and at >20% I'd want to be on the better card.
 
Got a question. I could be wrong, but I'm just curious on behalf of all the people that bought the NV cards. You tested on a top-of-the-line system; Intel's P4 3.0GHz with the 800MHz FSB is about as good as it gets. I just upgraded my system from a P4 1.4GHz (400MHz FSB, 500MHz memory) to a P4 2.6 (OC'd to 3.0) with an 800MHz FSB and 500MHz dual-channel DDR, and I have an ATI 9500 Pro. Before I upgraded, I could not really get the best out of my card; I could hardly turn on any eye candy at all, and my 3DMark 2001 scores were a lousy 6000. With my new system and the same video card, I can turn on all the eye candy I want (a big plus) and my scores went to 1500+.

My question is: how badly are the NV cards going to run on a low-end system or an Athlon system? Not to put down AMD, but it is well seen through all the benchmarking that Intel is king with its 800MHz FSB when it comes to 3D gaming, although AMD may regain the crown with its 64-bit parts, and I hope Intel can keep up with its next gen. My point is, if these cards are this bad on a top-of-the-line system, how bad will it be with an old 533MHz FSB Pentium, or an AMD? I think it would be a good thing to know before more people purchase this card.
 
Is there any chance you could release some kind of "public" demo, so we can compare the results, while keeping your testing demo unrevealed so no one could cheat in the drivers?
 
FX5900

First of all, really great review B3D did on the cards (it didn't gun down ATI or NVIDIA).

Second, I'm really distressed: I bought the FX 5900 Ultra about a month ago, and what I'm reading is really not what I was hoping for or spent the money on.

Does this mean that the FX cards can't handle PS 2.0, or that the drivers aren't optimized?!?

Please, guys, some constructive advice.

Do you believe the Detonator 50.xx drivers might be a solution, or must I sell and get an ATI card, or wait and see what happens?
 
If the drivers are still not "optimized" for best performance (with suitable image quality), then, keeping in mind how long the NV35 has been out, it seems little to no further optimization or driver efficiency will be gained (unless the software development team experiences some sort of uber-enlightenment).
 
While using anything DX9 with the NV30/35 looks rather disastrous in terms of performance, is it likely that shadows may be an area where there are gains to be made?

I believe another forum post suggested that HL2 will use shadow buffers, which would seem to have little performance hit versus stencil buffers.
 
THe_KELRaTH said:
While using anything DX9 with the NV30/35 looks rather disastrous in terms of performance, is it likely that shadows may be an area where there are gains to be made?

I believe another forum post suggested that HL2 will use shadow buffers, which would seem to have little performance hit versus stencil buffers.
I seem to recall reading that NVIDIA cards can use the fixed-function implementation of shadow buffers alongside the programmable pipeline... Conversely, ATI cards must either do one or the other: use the fixed-function pipeline for everything, or do shadows along with everything else in the pixel shader. If this really is the case, then the GeForce FX may indeed be given somewhat of a break...
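
For anyone wondering what the shadow buffer approach actually boils down to, here's a tiny Python sketch of just the core idea (illustrative names only, not Valve's or anyone's actual code): a shadow buffer test is a single depth compare per pixel against a depth image rendered from the light's point of view, which is why it's cheap, and why hardware that can do that compare during fixed-function texture sampling gets it almost for free. Stencil shadows, by contrast, spend fill rate rendering extruded shadow-volume geometry in extra passes.

def in_shadow(pixel_depth_from_light, shadow_map_depth, bias=0.0005):
    # The pixel is shadowed if something else was closer to the light at this
    # location when the shadow map was rendered; the small bias avoids
    # self-shadowing ("shadow acne") caused by limited depth precision.
    return pixel_depth_from_light > shadow_map_depth + bias

print(in_shadow(0.42, 0.40))  # True: an occluder sits between the light and the pixel
print(in_shadow(0.40, 0.40))  # False: nothing closer, so the pixel is lit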
 
Gaal Dornik said:
Is there any chance you could release some kind of "public" demo, so we can compare the results, while keeping your testing demo unrevealed so no one could cheat in the drivers?

I'm curious about this too, as I want to test my own setup and compare it against your results. Maybe the low score is an anomaly with your setup? Maybe others will get different results with the various hardware configurations that are out there?

I just think a public demo would be nice, considering the situation the TR:AOD benchmark has created. Kind of like how 3DMark was used to show how good your system was, TR:AOD could do the same for DX9 measurements, letting people know what others are getting with the various setups out there.
 
Gaal Dornik said:
Is there any chance you could release some kind of "public" demo, so we can compare the results, while keeping your testing demo unrevealed so no one could cheat in the drivers?
At the moment, no. Beyond3D has an exclusive version of the EXE which is not available to the public yet. As soon as EA (who are responsible for releasing patches to the public) get around to releasing it, we should have all of the tools we need to make our own demos. Of course, you will have to purchase the game as usual.
 