nVidia discrepancies in FarCry benchmarks?

Re: Driver heaven test is invalid!

hstewarth said:
From the Driver Heaven forums:

I think changing the device ID is not a correct way to test whether anything is cheated or tuned for performance.

It proves nothing more than that the NV40 didn't perform well on the R3x0 path, or the R3x0 didn't perform well on the NV40 path.

I think what Driver Heaven did is an invalid test. One should not judge one card by tweaking it to look like another card. The cards have different specs, and depending on how the game is designed, exceptions could happen and performance could be altered, beyond any code specific to the intended hardware.

Now, if the game had true SM 2.0 and SM 3.0 options as defined by Microsoft for DirectX, testing those would only compare the difference between code paths on the standard API, without any card-specific changes by the game developer.

Also, the 6800 Ultra will not be in stores until Memorial Day... so these tests were done on early versions of both hardware and drivers, so they're really not valid anyway.

The tests were done on an early version of 6800 Ultra hardware? :oops: :rolleyes:
 
Re: Driver heaven test is invalid!

hstewarth said:
From the Driver Heaven forums:

I think changing the device ID is not a correct way to test whether anything is cheated or tuned for performance.

It proves nothing more than that the NV40 didn't perform well on the R3x0 path, or the R3x0 didn't perform well on the NV40 path.

I think what Driver Heaven did is an invalid test. One should not judge one card by tweaking it to look like another card. The cards have different specs, and depending on how the game is designed, exceptions could happen and performance could be altered, beyond any code specific to the intended hardware.

Now, if the game had true SM 2.0 and SM 3.0 options as defined by Microsoft for DirectX, testing those would only compare the difference between code paths on the standard API, without any card-specific changes by the game developer.

Also, the 6800 Ultra will not be in stores until Memorial Day... so these tests were done on early versions of both hardware and drivers, so they're really not valid anyway.

Yes, NV hasn't had time to "optimize" the game and release the "optimized" drivers yet. -_-
 
Re: Driver heaven test is invalid!

hstewarth said:
From the Driver Heaven forums:

I think changing the device ID is not a correct way to test whether anything is cheated or tuned for performance.

It proves nothing more than that the NV40 didn't perform well on the R3x0 path, or the R3x0 didn't perform well on the NV40 path.

I think what Driver Heaven did is an invalid test. One should not judge one card by tweaking it to look like another card. The cards have different specs, and depending on how the game is designed, exceptions could happen and performance could be altered, beyond any code specific to the intended hardware.

Now, if the game had true SM 2.0 and SM 3.0 options as defined by Microsoft for DirectX, testing those would only compare the difference between code paths on the standard API, without any card-specific changes by the game developer.

Also, the 6800 Ultra will not be in stores until Memorial Day... so these tests were done on early versions of both hardware and drivers, so they're really not valid anyway.

What? OK, so we should NEVER EVER compare previews with current hardware? Wrong! Always compare; just keep revisiting the benchmarks/game performance. The "paths" are just more tests in the whole picture. Your statement sounds like one shouldn't use these tests to make a purchase because they are so different and one card is too new, not for sale, and might/will improve by release. WELL DUH!
 
I'm surprised they didn't go the other way and label the 9800XT as a GeForce 6800U to see if that increased performance a lot. I figure that calling one card a different card, no matter which one, stands a good chance of making it issue improper calls to the hardware and cause timing issues and all sorts of minor silliness, which could certainly cause delays, so I don't see the "kneecapped 6800U" results as too surprising. But if other cards sped UP by being labelled a 6800U...? That would go much further in pointing out the invalidity of certain benchmarks, and help more in discovering exactly what's going on.
 
cthellis42 said:
I'm surprised they didn't go the other way and label the 9800XT as a GeForce 6800U to see if that increased performance a lot. I figure that calling one card a different card, no matter which one, stands a good chance of making it issue improper calls to the hardware and cause timing issues and all sorts of minor silliness, which could certainly cause delays, so I don't see the "kneecapped 6800U" results as too surprising. But if other cards sped UP by being labelled a 6800U...? That would go much further in pointing out the invalidity of certain benchmarks, and help more in discovering exactly what's going on.

I'm pretty sure that the ONLY effect of setting the device ID to an ATI DX9 part is that FarCry doesn't use the crap-quality nVidia path, and instead uses the standard "DX9" path within Far Cry. In fact, if there were another DX9 device out there, I'm sure setting the device ID to it would not affect the R3x0 cards and would have the same effect on the NV cards. The basic issue is that, because of the performance issues with the NV30 card range, it appears as if Far Cry detects whether the card is an NV card and, instead of using the full DX9 shader path, uses a cut-down version to increase performance.
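To illustrate the mechanism, here is a minimal hypothetical sketch, not FarCry's actual code; ChooseRenderPath and the path labels are invented, but the adapter query is the standard D3D9 call a game would use:

```cpp
#include <d3d9.h>
#include <cstdio>

#pragma comment(lib, "d3d9.lib")

// Hypothetical sketch of device-ID-based path selection (not FarCry's
// actual code). A game can query the adapter identifier and branch on
// the PCI vendor ID before choosing its shader path.
const char* ChooseRenderPath(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return "default path";
    if (id.VendorId == 0x10DE)                  // NVIDIA's PCI vendor ID
        return "NV3x path (PS1.1, partial precision)";
    return "standard DX9 path (full PS2.0)";    // ATI (0x1002) and others
}

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;
    std::printf("selected: %s\n", ChooseRenderPath(d3d));
    d3d->Release();
    return 0;
}
```

With logic like this, spoofing the reported IDs is enough to push the card onto the standard DX9 branch, which is exactly the behaviour described above.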

Aaron Spink
speaking for myself inc.
 
Re: Driver heaven test is invalid!

hstewarth said:
From the Driver Heaven forums:

I think changing the device ID is not a correct way to test whether anything is cheated or tuned for performance.

It proves nothing more than that the NV40 didn't perform well on the R3x0 path, or the R3x0 didn't perform well on the NV40 path.

Oh, someone from the Driver Heaven forums is an authority. More likely, he's just another fanboi.

I think what Driver Heaven did is an invalid test. One should not judge one card by tweaking it to look like another card. The cards have different specs, and depending on how the game is designed, exceptions could happen and performance could be altered, beyond any code specific to the intended hardware.

Hmm, could that be you on the Driver Heaven forums? The writing is certainly similar.

Anyway... it is a perfectly valid test. They didn't tweak it to look like another card. Instead, they simply tweaked it not to run the low-quality NV30 path. The only difference appears to be the switching of sub-par 1.1 shaders for more accurate and visually pleasing 2.0 shaders.

Now, if the game had true SM 2.0 and SM 3.0 options as defined by Microsoft for DirectX, testing those would only compare the difference between code paths on the standard API, without any card-specific changes by the game developer.

Gee, guess what a DX9 card without an NV vendor ID runs: the standard DX9 path of the game. Hey, perfect.

Also, the 6800 Ultra will not be in stores until Memorial Day... so these tests were done on early versions of both hardware and drivers, so they're really not valid anyway.

So when the initial reviews that had the 6800 performing great were released, it was proof that nVidia was going to trounce ATI; now that more detailed tests come out, they're all invalid because the drivers and hardware are "early versions".

Sure, sure, whatever.

Aaron Spink
speaking for myself inc.
 
I see it as NV paying off devs to speed things up by lowering IQ.
I think that this year, NV will pay/threaten devs to trade IQ for speed via code keyed to an ID string.
This way, NV can't be seen as driver hackers. You will see the drivers cleaned out for the most part, but some hacks will remain for the devs that refuse to do NV's bidding.
 
{Sniping}Waste said:
Y'all forget that money talks. All NV had to do was say "do this or no money for you."
Both NV and Crytek are keeping their mouths shut on this. I would think a new patch would have come out by now.

Yeah, cool, let's patch a game for a piece of hardware that hasn't even shipped yet, still has a core revision to go, is running on a codepath not suited for the card, and is on pre-beta drivers.

You ATI kids really, really need to cool down a bit.
 
Waltar said:
Yeah, cool, let's patch a game for a piece of hardware that hasn't even shipped yet, still has a core revision to go, is running on a codepath not suited for the card, and is on pre-beta drivers.

The FX series has the same banding problems. It's a patch 1.1 problem: the same patch that gave the FX series a big PS 2.0 boost.

Waltar said:
You ATI kids really, really need to cool down a bit.

And you really shouldn't talk down to people and insult them if you want to be taken seriously.
 
The FX series has the same banding problems. It's a patch 1.1 problem: the same patch that gave the FX series a big PS 2.0 boost.
Yes, and since the banding problem happens by default with the 6800U, benching FarCry shouldn't even be done until a proper patch comes out from Crytek that allows full precision for the GF6. Simple as that.


fallguy said:
And you really shouldn't talk down to people and insult them if you want to be taken seriously.

I'm not sure how telling someone to calm down over a video card debate can be considered an insult. :rolleyes:
 
They sent the card out for a preview, deal with the results. The game is a TWIMTBP game too.

You didn't tell them to just calm down, you called them "kids". You meant it to belittle them; if you didn't, you wouldn't have said it. You make yourself look childish and hard to take seriously if you have to resort to insults.
 

… forcing the XT to run as an FX increased performance (it was using lower shaders).

Amazing, the 6800U is virtually being treated as a DX8 card like the NV3x’s. I wonder if the 6800 is defaulting to a "DX8" path for HL2 and other games.
 
I think Crytek was downloading too much warez to finish up a patch for the NV40 yet. And why would they release a patch for the preview? That WOULD look bad. I bet a patch will come out when the new cards ship. No BIG DEAL! Like it's news that the NV3x gets preferred treatment on PS2.0... IT'S BEEN GOING ON FOR 18 MONTHS! By gawd. What we did learn is that FarCry is not the HL2-style DX9 game that all the PR/marketing/web review sites have said it is. It's just a cool game.
 
Hello everyone, first post here...

Anyway, after reading much info on this matter and confusing myself, I scrapped my first attempt at a post, only to conclude the following from my own little test.

I can confirm the same situation with the following setup:

5900NU 473-906
56.72 drivers
FarCry v 1.1 Level Archive checkpoint ~2?
Settings Med except Texture, Water, Lighting all Ultra High.
3DAnalyze v 2.33

Forcing my 5900NU to use the 9800Pro ID, I can see that it does indeed render the floor tiles correctly.

However, what I have failed to read, or see anyone else post, is this.

Simply put, I forced the 9800Pro ID but tested many times forcing max PS 1.1 and then 1.4, and indeed found artifacting (or banding?) that is slightly different from the banding effect of the default nVidia ID path.

So there you have it. However, I wanted to test the 9800Pro ID in other parts of the game. In the level Fort, looking through the jungle, I had almost 90% blue in the background; it wasn't rendered correctly at all. I triple-checked the in-game settings and 3DAnalyze, including closing and relaunching both the game and 3DAnalyze. Still the same deal.

So what does this mean? Hmm... I saw what Tommti-Systems had to say about that particular level, Archive, but what about the rendering errors in other levels of the game? All this hoopla about forcing the 6800U to run with the ATI ID in their benchmark is incorrect, to say the least, as an apples-to-apples performance comparison. Shame on them and Driver Heaven for jumping the gun.

In their test case, "Archive", there does seem to be a difference in shader usage; however, that doesn't explain how this same setup can incorrectly render other parts of the game.
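For what it's worth, what a tool like 3DAnalyze conceptually does is intercept the identifier the game reads and rewrite it before any vendor check runs. A minimal sketch of just that rewriting step follows; SpoofAdapterIdentifier is a hypothetical name and the device ID value is only illustrative:

```cpp
#include <d3d9.h>

// Hypothetical sketch of the rewrite an interception layer (such as a
// d3d9 wrapper of the kind 3DAnalyze uses) performs on the structure
// returned by IDirect3D9::GetAdapterIdentifier. With the IDs rewritten,
// a vendor check like the one sketched earlier in the thread takes the
// standard DX9 branch instead of the NV3x one.
void SpoofAdapterIdentifier(D3DADAPTER_IDENTIFIER9* id)
{
    id->VendorId = 0x1002;  // ATI's PCI vendor ID
    id->DeviceId = 0x4E48;  // a Radeon 9800-series device ID (illustrative)
}
```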
 
Since ATI can't run at less than FP24, if running PS1.4 gives artifacts, it means the shaders used under 1.4 are different and/or buggy. If it were just FP16 precision causing the problem, you'd expect no difference on the R300 running the PS1.4 path. That means something else is causing it; it could be that other approximations are being used, such as lower-precision normalization, no normalization at all, etc.
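To make the precision argument concrete, here is a small self-contained sketch (assuming normal values well inside FP16's exponent range, and truncation rather than rounding) of how much coarser FP16 steps are than FP32; FP24 sits in between, which is why a partial-precision path can band where the R3x0 path does not:

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Quantize a 32-bit float to FP16 (s10e5) mantissa resolution and back.
// Simplified sketch: assumes normal values well inside FP16's range,
// truncates instead of rounding, no NaN/Inf/denormal handling.
float ThroughFp16(float f)
{
    uint32_t b;
    std::memcpy(&b, &f, sizeof b);
    b &= 0xFFFFE000u;           // drop the low 13 mantissa bits: 23 -> 10
    float r;
    std::memcpy(&r, &b, sizeof r);
    return r;
}

int main()
{
    // A smooth gradient collapses into coarse steps at FP16 resolution;
    // near 0.5 the FP16 step is 2^-11 (about 0.00049), so several nearby
    // inputs land on the same output value, the numerical root of banding.
    for (float x = 0.5000f; x <= 0.5016f; x += 0.0002f)
        std::printf("%.6f -> %.6f\n", x, ThroughFp16(x));
    return 0;
}
```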
 
I find it amazing that so many people say that the NV40 needs a special path or code to run Far Cry properly. I mean, the ONLY reason special paths were made for the NV30 is that it was unable to run DX9 properly, and they were the only way people could get their games to run on those cards. The NV40 (I thought) had corrected all these issues, so it should be able to run the standard path with no problems. If the NV40 really does need a special NV40 path to run, as quite a few are saying, then it is not all it's hyped up to be and still needs help. That is sure what it sounds like. I can't wait until the NDA is lifted on the R420 so we can see what is actually going on here, because until then, I doubt we will know.
 
Bry said:
I find it amazing that so many people say that the NV40 needs a special path or code to run Far Cry properly. I mean, the ONLY reason special paths were made for the NV30 is that it was unable to run DX9 properly, and they were the only way people could get their games to run on those cards. The NV40 (I thought) had corrected all these issues, so it should be able to run the standard path with no problems. If the NV40 really does need a special NV40 path to run, as quite a few are saying, then it is not all it's hyped up to be and still needs help. That is sure what it sounds like. I can't wait until the NDA is lifted on the R420 so we can see what is actually going on here, because until then, I doubt we will know.

Since the NV40 is a PS3.0 card, it requires DX9.0c to be installed on the computer in order to use that shader model. Microsoft has not yet released 9.0c, so the card is forced to run using the NV3x codepath. The computer on which nVidia showcased Far Cry during the presentation did have DX9.0c installed, and as you should have been able to see, the image looks very much like current 9800 XT images.
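As a rough sketch of how such a shader-model gate can work in practice (this uses the standard D3D9 caps query; the branch messages are invented):

```cpp
#include <d3d9.h>
#include <cstdio>

#pragma comment(lib, "d3d9.lib")

// Sketch of a shader-model gate: even on SM3.0 hardware, the reported
// PixelShaderVersion depends on the installed runtime and driver, so
// without DX9.0c a game would take one of the fallback branches below.
int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
            std::puts("PS3.0 path available");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::puts("falling back to the PS2.0 / DX9 path");
        else
            std::puts("falling back to a PS1.x path");
    }
    d3d->Release();
    return 0;
}
```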
 
I have to admit I'm confused. What is unfair about making the NV40 pretend it's an R3xx and run the DX9 path? Surely that's a perfectly fair PS2.0 test, since the path is supposed to be for all DX9 cards?

What am I missing?
 
cthellis42 said:
I'm surprised they didn't go the other way and label the 9800XT as a GeForce 6800U to see if that increased performance a lot.
He did, and it did, though not by "a lot", just by a bit... it's in the thread at DH. ;)
 