Tomb Raider: AOD DX9 Benchmarks

As someone who was days away from ordering a Leadtek 5900 Ultra, I'm very much interested to see Nvidia's reaction to these benchmarks. If they can't give me pause for thought, I for one will now be getting a 9800 Pro....those numbers are just way too worrying.
 
Wow - great article - stunning result - as Reverend knows, I am trying to get Massive Development to also comment on what they have found in Aquamark 3 for PS 2.0 on NVidia vs ATi.

I am stunned to see the difference is this huge. I wait in bemusement to see how NVidia responds: 1) PS2.0 is overused and/or less important than PS1.1 - 1.3; and/or 2) our Det 50.xx drivers will fix this without sacrificing image quality; and/or 3) PS2.0 alone does not determine your game experience, and on balance we are faster because ...

Impressive article. I wonder if this trend will be repeated across other looming TWIMTBP DX9 titles?
 
Very interesting. I, too, am curious to see nV's and ATi's response to this, both in terms of PR and future driver developments and/or improvements. Can we still assume Cg is early and therefore not fully up to speed?
 
Re: Question?

digitalwanderer said:
Have you received any reaction from nVidia or ATi over the article, and if so what? :?:
NVIDIA emailed me, asking why AA wasn't tested.

I said (in summary, not verbatim) that there's a very nasty bug when AA is enabled at certain AA level+resolution combinations, but that this only happens on a 128MB 5900 (it works fine on a 256MB 5900).

That, and that the performance gets even worse :)

I conveniently forgot to tell NV that the game looks horrendous with DOF enabled when running with the default DX9 compiler and their 45.23 drivers (well, actually I did tell NV about this a number of days ago, but that was to different NV personnel, not the one that emailed me upon reading this article). "Conveniently forgot" because I wanted this to be shown in the Albatron GFFX 5900 review later today.

Nothing from ATI to me but they're more likely to email Dave.
 
BTW, wrt the various TWIMTBP references.

I thought it was understood that marketing campaigns have very little (very little) to do with how game programmers program their game? I think Tim Sweeney pretty much said the same in one of our interviews here.

It is PR for NVIDIA, nothing more. Don't read too much into it... we're not little kids here.
 
Reverend said:
BTW, wrt the various TWIMTBP references.

I thought it was understood that marketing campaigns have very little (very little) to do with how game programmers program their game? I think Tim Sweeney pretty much said the same in one of our interviews here.

It is PR for NVIDIA, nothing more. Don't read too much into it... we're not little kids here.
Primarily because there is so much FUD out there about the TWIMTBP program and what it means; a lot of people don't understand that yet. It's funny to me because this rather blatantly shows that you are 100% correct in your assessment of the TWIMTBP program as being absolutely nothing more than a PR move. :)

Thanks for sharing the nVidia reaction, it's much appreciated.
 
A few may be wondering why this is a "beta" article by Dave. I thought it was pretty clear.

Dave didn't include screenshots, nor expand on any IQ issues that definitely exist (with GFFXs). He didn't explain why the attempts at "apples-to-apples" settings were made [although the two links (re our readme and settings pages) he provided in the article should be sufficient], he did not reveal any bugs per video card+driver combination (of which there are), etc. etc.

In short, I think Dave just wanted to get this (performance data, nothing much else) out quickly (although I think it would've made a little bit more sense to all who read it if this article came out after the Albatron GFFX5900 review... lots of relevant info in this coming review). It is "beta" because it is incomplete... as confirmed by some of you here asking for screenies and such :)
 
I thought it was understood that marketing campaigns have very little (very little) to do with how game programmers program their game?

That is actually not the case, though. I'm sure there are levels of involvement, but at the Dusk-till-Dawn developer event they went into some detail as to the levels it goes to - it can be much, much more than a co-marketing/branding thing.
 
Reverend said:
although I think it would've made a little bit more sense to all who read it if this article came out after the Albatron GFFX5900 review... lots of relevant info in this coming review
Hey Rev, when is the GFFX5900 review coming out? You've sold me on it already. :LOL:

Dave Baumann said:
I'm sure there are levels of involvement, but at the Dusk-till-Dawn developer event they went into some detail as to the levels it goes to - it can be much, much more than a co-marketing/branding thing.
So there are actually TWIMTBP games that have more/better graphics on nVidia cards than FX cards? Well don't that stick a fly in the pudding, how are we to know when it's just a branding thing and when it's a real performance thing?
 
A few may be wondering why this is a "beta" article by Dave. I thought it was pretty clear.

Dave didn't include screenshots, nor expand on any IQ issues that definitely exist (with GFFXs).

Sorry, yes - this wasn't supposed to be an expansive article on the differences between the various settings, but to answer the call from a few people to get some kind of comparative baseline performance. The article was only supposed to be a small thing to address those who asked for this - unfortunately it spread far wider than I had anticipated. Had I gone for a fuller article I would have included some IQ comparisons and probably gone into more depth as to why we are seeing such wildly fluctuating performance where Pixel Shader operations are in place (which I may have to do separately now anyway).

However, I was aware that Rev would be including more IQ comparisons in his next review, as well as those already shown in the 9600 review.
 
Anonymous said:
Can you enable the PS2.0 effects on the 5200/5600 FX cards? Does the card really use this effect, or does it just "say" that it does?
Thanks
Yes you can, and it works as intended.
 
digitalwanderer said:
Reverend said:
although I think it would've made a little bit more sense to all who read it if this article came out after the Albatron GFFX5900 review... lots of relevant info in this coming review
Hey Rev, when is the GFFX5900 review coming out? You've sold me on it already. :LOL:
I think Dave will be preparing for bed soon... so I'd say in about 7 to 8 hours' time! :)
 
Anonymous said:
I noticed that fog was disabled for all cards in the default settings. Is there an issue with it? How much does it affect performance?

I did a quick test on the 5900 at 640x480 (which is still very shader limited) and there was about 1 FPS difference with Fog on or off, so I don't think this is an issue.
 
Ostsol said:
One thing I'd like to know is what the Cg compiler is doing for the GeforceFXs.

1) Do both the Cg and DX HLSL versions use the same code and just compile differently? Cg minimizing register usage and DX HLSL minimizing instruction counts?

2) Or is there optimized HLSL code for Cg, making use of the various precisions available to the GeforceFX?
I'm trying to find out. The battle of compilers is very intriguing to me because there's such a huge difference between the two in terms of IQ output when the 2.0 pixel shader for DOF is used.
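To make Ostsol's precision question concrete, here's a rough sketch of the kind of difference involved. This is a hypothetical fragment I've written for illustration only - it is not code from the game or either compiler's output, and the sampler/uniform names (sceneTex, blurTex, focusDepth, focusRange) are invented. The idea: a standard DX9 HLSL ps_2_0 compile works in full-precision float registers, while Cg can target the GFFX's 16-bit half registers, which reduces register pressure on NV3x hardware:

```hlsl
// Illustrative DOF-style fragment only; names are invented, not from TR:AOD.
sampler2D sceneTex;    // sharp scene render
sampler2D blurTex;     // pre-blurred scene render
float focusDepth;      // depth of the focal plane
float focusRange;      // how quickly blur ramps up with distance from focus

// Full-precision version, as a plain DX9 HLSL ps_2_0 compile would run it:
float4 DofFull(float2 uv : TEXCOORD0) : COLOR
{
    float4 sharp = tex2D(sceneTex, uv);
    float blur   = saturate(abs(sharp.a - focusDepth) * focusRange);
    return lerp(sharp, tex2D(blurTex, uv), blur);
}

// Half-precision version: Cg can map these to NV3x's 16-bit registers,
// roughly halving register pressure on a GFFX, at some cost in precision:
half4 DofHalf(float2 uv : TEXCOORD0) : COLOR
{
    half4 sharp = tex2D(sceneTex, uv);
    half blur   = saturate(abs(sharp.a - focusDepth) * focusRange);
    return lerp(sharp, tex2D(blurTex, uv), blur);
}
```

Whether the shipping Cg path actually does something like this is exactly what's still being looked into; the point is only that NV3x pays far less per half register than per float register, while R3x0 runs everything at 24-bit regardless.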
 
Reverend said:
I think Dave will be preparing for bed soon... so I'd say in about 7 to 8 hours' time! :)
I keep telling you folks; sleep causes cancer and is an entirely unsuitable substitute for caffeine. ;)

Ok, ok...I'll wait. :rolleyes: :LOL:
 
Pete said:
Very interesting. I, too, am curious to see nV's and ATi's response to this, both in terms of PR and future driver developments and/or improvements. Can we still assume Cg is early and therefore not fully up to speed?
Actually, if you have a GFFX board, you will definitely want to use the Cg compiler instead of the DX9HLSL compiler, due to:

1) massive IQ issue using DX9HLSL compiler with GFFX+45.23 drivers when DOF is enabled... doesn't happen when using Cg compiler
2) there's slightly to much better performance (depending on resolution, percentage-wise) with the Cg compiler when AA is applied... almost identical performance between the two compilers without AA, however.
 