nVidia discrepancies in FarCry benchmarks?

Radeon forced to run the NV ID/Path = Avg: 50.011 - Min: 28.

In response to the comment saying the 59xx had display issues when forced to run the Radeon ID... I played through a great deal of the game with the Radeon ID and no display issues were experienced. Must be specific to the NV3x series.
 
Veridian3 said:
In response to the comment saying the 59xx had display issues when forced to run the Radeon ID... I played through a great deal of the game with the Radeon ID and no display issues were experienced. Must be specific to the NV3x series.

I tested again using the 60.72 betas with the same setup. This time, with these drivers, the other levels in the game rendered correctly.

Also of note: between the two IDs in these other levels, per the screenshots, I saw no noticeable difference in rendering quality, only a decrease in framerate when running the ATI ID.

One other note: there were visible differences in the Archive level between the 56.72 and 60.72 driver sets as well. The 60.72 driver appears to render the image more correctly, showing the tiles, along with an increase in framerate.

There is still banding with the NV ID but not with the ATI ID. But again, between these two the framerates are exactly the same on this card.

Even though the article is about benchmarking the 6800U, if you have time, check out what I did for yourself; it has really sparked my interest.

Also, there are plenty of bugs in the 56.72 driver set itself.
 
At first I wasn't sure what to think about this, but after reading through exactly what's been done I'm still curious as to why nVidia would have the initial set of nV40 drivers converting down to ps1.x from 2.x, since nV40 shouldn't need that crutch (theoretically). Has anyone tried the same tests with Halo, using the command-line switches to test between ps1.x and 2.0? It'd be interesting to see some numbers there, and ps1.x to ps2.0 rendering IQ differences in Halo are pretty obvious, so it'd be easy to make some pertinent observations. Also, DX9.0c isn't required for nV40 to render ps2.0, just as it isn't required for R3x0. There might be at least some smoke here--if not a raging inferno (although, considering the official nVidia-announced specs for the product, why there'd even be smoke is kind of baffling). Ah, well, I await further investigation with interest...
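If anyone wants to try that, here's a rough sketch of how the two Halo runs could be scripted. A sketch only: the -use11 / -use20 / -timedemo switches are the commonly documented Halo PC command-line options, and the install path is just a placeholder.

[code]
# Rough sketch only: launch Halo's built-in timedemo twice, once forced to
# PS 1.1 and once to PS 2.0. The switches are the commonly documented Halo PC
# command-line options; the executable path is a placeholder for your install.
import subprocess

HALO_EXE = r"C:\Games\Halo\halo.exe"   # placeholder path

for switch in ("-use11", "-use20"):
    print(f"Running timedemo with {switch} ...")
    # -timedemo runs the built-in benchmark and records its results before exiting
    subprocess.run([HALO_EXE, switch, "-timedemo"], check=True)

print("Compare the two timedemo results, and grab screenshots for the IQ differences.")
[/code]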
 
Ooo boy, summer is just around the corner but these forums are already on fire :LOL: A bit of a chill-out mood for everyone, cold drinks, and everything should be fine. Though an extraordinary finding, a consequence of exceptional journalism, this proves... umm... yeah... I've got it... nothing. Why? Fact: the NV30 path uses PS 1.x shaders wherever possible in FarCry. Fact: the R300 path uses fewer PS 1.x shaders and more 2.0 ones. Fact: the R300 likes some things that NV doesn't, and vice versa. The straight DX9 path = R300? Mmm, perhaps a bit tweaked. ATM, the NV30 path hurts the NV40 more than it helps, but... ahem... that's it... THE DAMN CARD'S NOT RELEASED ;) And now we can all go back to Ibiza.
 
Testiculus Giganticus said:
Ooo boy, summer is just around the corner but these forums are already on fire :LOL: A bit of a chill-out mood for everyone, cold drinks, and everything should be fine.

I'm chillin' here, my friend. 8)

I just really hope Nvidia doesn't disappoint me this year. I can understand everything that went down; I mean, I paid less than $200 for my card 6 months ago. In fact, I bought 3 video cards for other systems for less than the price of a 9800XT or 5950U.

Anyway, the point is that I don't like having a product that renders an image incorrectly, IMO. I would rather it be @ss slow but correct, and let me tweak the game and the drivers to my liking. Time will tell.
 
WaltC said:
At first I wasn't sure what to think about this, but after reading through exactly what's been done I'm still curious as to why nVidia would have the initial set of nV40 drivers converting down to ps1.x from 2.x, t...
I'm going with the dreaded "Quack" issue. They (nVidia) get one pass on the issue until the next driver release. But isn't it really FarCry that sees the nVidia name and sets 2.0 down to 1.1 (which would be no problem for a GF4), unless you set the settings to uber-high, in which case you get more 2.0 lighting?
 
At first I wasn't sure what to think about this, but after reading through exactly what's been done I'm still curious as to why nVidia would have the initial set of nV40 drivers converting down to ps1.x from 2.x, since nV40 shouldn't need that crutch (theoretically).

You are correct, the NV40 in no way needs lower quality shaders. In fact, there have been some tests conducted in the initial set of reviews which suggest that the NV40 may actually be faster at times using PS 2.0 than using PS 1.1! Note that the NV40 was being detected as NV3x hardware by FarCry using the v1.1 patch. I'd expect things to change noticeably in the future when NV and Crytek smooth things out for the NV4x cards. Also keep in mind that FarCry does not necessarily use an abundance of PS 2.0 shaders as initially believed (see the Firingsquad.com article), but in reality uses mostly PS 1.1 shaders, with PS 2.0 shaders used primarily for lighting. Of course, when the PS 3.0 patched version of FarCry comes out, things may change for the better for NV40, but that cannot be used until DirectX 9.0c is out (right now, it can only run on the DirectX 9.0c beta, available to a select few).
 
jimmyjames123 said:
Note that the NV40 was being detected as NV3x hardware by FarCry using the v1.1 patch.

That seems a likely explanation--I keep forgetting how extensively nVidia was pushing "optimization" last year, involving developers writing paths eschewing ps2.0 so that nV3x could turn in decent framerates. (I haven't had an nVidia product myself since 9/'02, so that kind of thing never impacted me personally last year, and it's easy to forget how it worked with some games.) Jeeps, I imagine the poor developers who went to all that trouble last year to write custom paths for nV3x to use as a crutch are going to be delighted by having to undo all of that for nV4x. Maybe the quickest solution for nV4x for a developer is simply to have nV4x, when identified by the application, run the same path the app assigns to R3x0? Might save a bunch of needless work on their part in undoing nV3x optimization, while leaving nV3x paths intact for folks still unfortunate enough to have to endure them.

(Don't really mean to sound less than generous here, but it's been a long day...)
 
IgnorancePersonified said:
Quitch said:
What am I missing?

The glazed eyes and flecks of foam at the mouth that mark a fanboi!

Don't waste my time, or anyone else's, with this rubbish.

My apologies to everyone for the fact that I had to make this post at all.
 
Quitch said:
IgnorancePersonified said:
Quitch said:
What am I missing?

The glazed eyes and flecks of foam at the mouth that mark a fanboi!

Don't waste my time, or anyone else's, with this rubbish.

My apologies to everyone for the fact that I had to make this post at all.

I think I.P. was saying that you were missing those negative qualities, though I'd still agree that the statement probably didn't help matters any.
 
Testiculus Giganticus said:
I feel the need to repeat this: the NV30 path hurts the NV40 more than it helps it.

How so? It's proven that the NV40 is considerably faster in the NV3x codepath than in the DX9 codepath. Performance-wise, it helps...

Hardly surprising, as you're simply working with a simpler world to render...
 
Well, for starters, because it provides an INCREDIBLE amount of ammo for all the people out there who feel that NV is the devil incarnate, and that can't be good. Its architecture is a bit different from the NV3x, so it doesn't like some things that its little brother loved ;) Finally, though it has the muscle for full quality, it is forced to have those nice "optimizations" that were necessary before
 
Testiculus Giganticus said:
Well, for starters, because it provides an INCREDIBLE amount of ammo for all the people out there who feel that NV is the devil incarnate, and that can't be good.
Awww....sure it can be! ;)
 
Testiculus Giganticus said:
Its architecture is a bit different from the NV3x, so it doesn't like some things that its little brother loved ;)

Let's get the facts straight:

fact: BOTH the 6800u and the 9800 have a significant performance increase when running the NV3x codepath compared to the normal DX9 path.

That means that part of the NV3x codepath is NOT specific to the NV3x, but improves performance on all video cards. Which part? Well, look here:

fact: The NV3x codepath uses fewer PS2 shaders and more PS1.x shaders.
fact: BOTH the 6800u and 9800 produce the same artefacts when running the NV3x path, and look fine on the normal codepath.

This performance increase comes from running PS1.x shaders that give a much worse image quality.

Now, it might be that the NV3x codepath also includes _pp (partial precision) hints. It might be that those are only useful for the NV3x. It might be that the NV3x gains much more from switching from PS2 to PS1.x.

But the fact remains: part of the performance increase of the NV3x codepath comes from simply running at a lower image quality, produced by simpler PS1.x shaders that use less computing power.
And that will improve performance on ANY video card, as demonstrated by the 6800u and 9800 performance increases.

Testiculus Giganticus said:
Finally, though it has the muscle for full quality, it is forced to have those nice "optimizations" that were necessary before

Do you honestly believe it's an accident that the NV40 is running the NV3x codepath? It's bound to have a different ID... So that would mean FarCry defaults to the NV3x codepath for unknown cards? :?
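Just to illustrate the point (this is NOT Crytek's actual code; the device IDs below are made up and only the PCI vendor IDs are real), ID-based path selection would look roughly like this, and an unrecognised NV40 ID shouldn't land on the NV3x path by accident:

[code]
# Minimal sketch of vendor/device ID based path selection. Not FarCry's code.
# Vendor IDs 0x10DE (NVIDIA) and 0x1002 (ATI) are real; the device IDs are
# hypothetical placeholders.
NV3X_DEVICE_IDS = {0x0301, 0x0330}      # assumed FX 5800 / 5900 class parts

def pick_shader_path(vendor_id: int, device_id: int) -> str:
    if vendor_id == 0x10DE:             # NVIDIA
        if device_id in NV3X_DEVICE_IDS:
            return "nv3x_path"          # mostly PS 1.x, partial precision
        return "dx9_path"               # unknown NVIDIA part: full PS 2.0 path
    return "dx9_path"                   # ATI (0x1002) and everything else

# A brand-new NV40 device ID falls through to the DX9 path here, so ending up
# on the NV3x path takes a deliberate decision, not an accident.
print(pick_shader_path(0x10DE, 0x0040))   # hypothetical NV40 ID -> "dx9_path"
[/code]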

Come on, it's so obvious what's going on. When the 1.1 patch was released, the NV40 was ready. Developers were completely informed about it, and had probably been running samples already. The patch was even claimed to already include PS3 paths for the NV40.
So much for not being aware of the NV40... :rolleyes:

It is inconceivable that the NV40 is running the NV3x path by accident. It is running that path because NVidia and/or Crytek WANT it to run that way :!:

Why? Because Crytek thinks it's not fast enough to display at full image quality? No way! It's running the normal DX9 path slightly faster than the Radeon 9800XT.
There's only one option left: because NVidia didn't like the very small performance difference compared to the 9800XT, they pressed Crytek to have the NV40 run a simpler codepath, trading image quality for performance.

Does that make NVidia the devil incarnate? Well... you decide! :devilish:
 
Ylandro hits the nail right on the head!!

"Oh no, that can't be," say the Nvidia fans, "it's this and this and that and whatever."

Two weeks after the game launched there was a patch; read what it says:

And why would anyone be surprised and make excuses? This wouldn't be the first time, would it? Just because this is a new card doesn't mean they won't do everything in their power to dupe the gullible.

Oh yes, and not that long ago they showed us PS 3.0 shaders vs. 2.0 in FarCry (LOL). Oh yes, that was a mistake! LOL. Either they have the stupidest PR people, who don't know how to make certain a topic is properly represented, or it was an intentional misleading of the public, hoping that most who saw it would say "look how much better 3.0 is than 2.0" and walk away, never to find out the true facts.

I have been in sales at the manufacturer level long enough to know how the game is played; the graphics industry is obviously no different when the stakes are this high.
 
Evildeus pointed to this at www.vr-zone.com

News : Software : NVIDIA Forceware 61.11 Drivers Info
NVIDIA has a new driver for the GeForce 6800 series -- version 61.11. Compared to the 60.72 driver, this new driver features:

Around 5% boost in Aquamark
Around 5% boost in X2 performance
Around 5% boost in Tomb Raider: Angel of Darkness performance
Around 10% boost to Call of Duty performance
Over 20% improvement in Far Cry performance (fixed a z cull bug)


So it looks to have been bugged, and if you add 20% it no longer looks like it just performs slightly better than the 9800XT, so these amateur theories no longer hold water.


 
dizietsma said:
Evildeus pointed to this at www.vr-zone.com

News : Software : NVIDIA Forceware 61.11 Drivers Info
NVIDIA has a new driver for the GeForce 6800 series -- version 61.11. Compared to the 60.72 driver, this new driver features:

Around 5% boost in Aquamark
Around 5% boost in X2 performance
Around 5% boost in Tomb Raider: Angel of Darkness performance
Around 10% boost to Call of Duty performance
Over 20% improvement in Far Cry performance (fixed a z cull bug)


So it looks to have been bugged, and if you add 20% it no longer looks like it just performs slightly better than the 9800XT, so these amateur theories no longer hold water.



20% faster running the normal DX9 path or the NV30 path? ;)
So far the "amateur theories" seem more believable than anything Nvidia has said in a LONG LONG time.
 
According to HFR, there are no significant differences in FPS between the NV3* and R3** paths with the 6800U, plus the R3** path gets rid of the quality issues of the NV3* path.
 
Evildeus said:
According to HFR, there are no significant differences in FPS between the NV3* and R3** paths with the 6800U,

HFR = ?? Do you have a link to their numbers?


Driverheaven reported the following numbers:

6800u Nv3* path: Min fps 32, Avg fps 60
6800u R3** path: Min fps 20, Avg fps 48

(http://www.driverheaven.net/showthread.php?s=&threadid=44253&perpage=15&pagenumber=1)

That is a 20% performance drop in average fps, and a nearly 40% drop in minimum fps.

I call that a significant difference. ;)
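For what it's worth, here is the arithmetic behind those percentages, plus (purely speculative, using only the figures quoted above) what the 61.11 driver's claimed "over 20%" Far Cry gain would mean if it applied to the R3** path numbers:

[code]
# Quick arithmetic on the Driverheaven figures quoted above.
nv3x_avg, nv3x_min = 60, 32   # 6800u, NV3* path
r3xx_avg, r3xx_min = 48, 20   # 6800u, R3** path

avg_drop = (nv3x_avg - r3xx_avg) / nv3x_avg * 100   # 20.0%
min_drop = (nv3x_min - r3xx_min) / nv3x_min * 100   # 37.5%, i.e. nearly 40%
print(f"avg drop: {avg_drop:.1f}%  min drop: {min_drop:.1f}%")

# Speculative: if the claimed 20%+ Far Cry gain in the 61.11 driver applied to
# the R3** path, its average would land right next to the NV3* path number.
print(f"R3** avg +20%: {r3xx_avg * 1.2:.0f} fps vs NV3* avg: {nv3x_avg} fps")
[/code]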
 