NV shader "optimizations"

flexy

Newcomer
http://www20.tomshardware.com/graphic/20040414/geforce_6800-46.html

In these shots you can clearly see that Nvidia uses 'driver optimizations' and 'rough pixels'.

a) BUG ????
b) "Optimizations" ? <- if yes, why ? I thought nv40 is fast enough ?

Do you think there will be a performance hit once Nvidia decides to remove the 'driver/shader optimizations' ? If yes, how big ?

When will Nvidia remove these 'optimizations'?

Comparing the *performance* of two cards where ONE card renders/calculates *every* pixel RIGHT and the other card skips every 16th/32nd pixel in certain shader operations is just ridiculous and worthless.....

Also..does anyone have information whether this behavior is only apparent in FarCry or also elsewhere ? (Depending on the used shaders etc...)

Edit: Typo in link corrected
 
Link is broken for me.

If you're talking about what I believe you are, it's a bug in Far Cry itself with the 1.1 patch applied.
 
Some of the banding in Far Cry I thought was caused by low-res normalization cube maps. Wouldn't it be more efficient to use shaders on the NV40 anyway?

Course I could be wrong.
 
flexy said:
http://www20.tomshardware.com/graphic/20040414/geforce_6800-46.html

In these shots you can clearly see that Nvidia uses 'driver optimizations' and 'rough pixels'.

Tomshardware wrote:

"However, with the current driver, the 6800 U was detected as PS1.1/2.0 hardware by the game"

This basically means that the game most likely still uses the GF FX path, where some of the PS2 shaders are replaced with PS1.1. The screenshots are most likely comparing PS1.1 shaders on the nVidia cards with a PS2 shader on the ATI.
 
If it's a bug in the game the rendering errors should be visible using all hardware, not just one... Does the article compare against NV3x, R3x0 etc, or is Tom just trying to invent a "scoop" (which I believe he hasn't had since the 1.13GHz P3 bug was discovered :LOL:)...
 
It's common that a game uses different code paths for different video cards. For example, since NV3X has slow pixel shader 2.0 performance, some games may want to avoid using pixel shader 2.0 on NVIDIA cards.
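As a rough illustration, a game's renderer setup might pick a shader path from the detected hardware along these lines (the names, types and criteria below are invented for the sketch, not taken from FarCry or any real engine):

// Hypothetical sketch of per-card shader path selection.
// All names and thresholds here are made up for illustration only.
#include <string>

enum class ShaderPath { PS11, PS20Partial, PS20Full };

struct GpuInfo {
    std::string vendor;  // e.g. "NVIDIA" or "ATI"
    int family;          // e.g. 30 for NV3x, 40 for NV4x
    bool supportsPS20;   // reported pixel shader 2.0 support
};

ShaderPath pickShaderPath(const GpuInfo& gpu) {
    if (!gpu.supportsPS20)
        return ShaderPath::PS11;         // DX8-class hardware
    if (gpu.vendor == "NVIDIA" && gpu.family < 40)
        return ShaderPath::PS20Partial;  // NV3x: trade precision/effects for speed
    return ShaderPath::PS20Full;         // R3xx, NV4x and newer
}

The point being that if such a check lumps NV4x in with NV3x, the new card inherits the reduced path even though it doesn't need it.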
 
Tim said:
Tomshardware wrote:

"However, with the current driver, the 6800 U was detected as PS1.1/2.0 hardware by the game"

This basically means that the game most likely still uses the GF FX path, where some of the PS2 shaders are replaced with PS1.1. The screenshots are most likely comparing PS1.1 shaders on the nVidia cards with a PS2 shader on the ATI.

As Hanners already pointed out, this is a bug introduced by the FarCry 1.1 patch.
In FarCry 1.0 my GeForceFX rendered the scene without these artefacts ("lighting quality" set to "high" -> PS 2.0 used). After I applied the 1.1 patch (which claimed to speed up FX rendering speed) these artefacts (plus a few others, like an overdone specular component on pipes) appeared UNLESS I lowered "lighting quality" one step (-> PS 1.1 used). So it's pretty clear PS 2.0 support was kinda "broken" by the patch.

Hope this clears up the confusion. All of the above was tested with different sets of NVidia drivers.

regards, alex
 
But why is there a difference between the FX & 6800 shots if they use the exact same path? (The 6800 having more effects than the FX.)
 
Right now I think it's very careless to talk about any optimizations based on early review drivers. The subject will get hot, however, when retail boards (or final revisions) are shipped with available WHQL drivers from NVIDIA. Currently, IMO it's just throwing words in the wind.
 
Kaotik said:
But why is there a difference between the FX & 6800 shots if they use the exact same path?

Because they are different cards?

Kaotik said:
(The 6800 having more effects than the FX.)

And judging by the shots, I would not say that the 6800 shows more artefacts than the 5950.

@volt: This is not a matter of review boards or not; I just wanted to point out that I don't think this is "driver optimizations" at work with the FarCry 1.1 patch installed.

regards, alex
 
My point is early drivers are supposed to be buggy. You can't expect an IHV to compile a perfectly working driver set for product introduction. Though obviously it's nice of them to point out the differences -- which don't appear to be valid for GeForce 6 running current drivers.

I'm not defending NVIDIA, rather pointing out the obvious. And I quote: "This comparison shows that the GeForce 6800 Ultra still uses optimized shader code in FarCry." I still don't know what to call this -- a bug or an optimization -- so how can this be a valid statement?
 
As has been discussed many times previously:

a) BUG ????
Probably not, but it's a new game on a new card with new drivers, so it's possible.

b) "Optimizations" ? <- if yes, why ? I thought nv40 is fast enough ?
Probably yes. NV40 is probably just using the NV3x path, which is riddled with IQ compromises to accommodate the "compromised" NV3x architecture. NV40 obviously has the speed to render as nicely as ATi, so it's probably just a matter of Crytek inserting a new path for NV4x. The new "mod" they spoke about at the 6800U launch will probably fix things.

Also..does anyone have information whether this behavior is only apparent in FarCry or also elsewhere ? (Depending on the used shaders etc...)
Well, the Lock On screenshots in the FiringSquad review show some banding, too, but I don't know if that's related. Again, given NV40's speed, it's probably a correctable error.

Do we really need five new threads on every subject? Most everything has been covered in the initial few (large) threads. Do your homework, ppl. :p
 
So it's pretty clear PS 2.0 support was kinda "broken" by the patch.

I wonder why. Is it possible that the patch alters some shaders that nVidia was replacing? I mean, it does work on ATI cards, right? If the code is written in standard DX9 it should work on both cards, shouldn't it? Unless nVidia's drivers don't like something about the code, which doesn't exactly speak well of nVidia's drivers -- unless the FarCry people are doing some wonky stuff with the patch.
 
Oh no, here we go again... the drivers are not even beta and people are claiming optimizations. Going by NVidia's past driver history, they never pull out the big guns until after the card is released anyway. I think the current benchmarks are going to be way under what the release ones will be (or at least after ATI releases their cards).
 
Eolirin said:
So it's pretty clear PS 2.0 support was kinda "broken" by the patch.

I wonder why. Is it possible that the patch alters some shaders that nVidia was replacing? I mean, it does work on ATI cards, right? If the code is written in standard DX9 it should work on both cards, shouldn't it? Unless nVidia's drivers don't like something about the code, which doesn't exactly speak well of nVidia's drivers -- unless the FarCry people are doing some wonky stuff with the patch.

You're forgetting that Nvidia cards have a custom code path to get decent performance. I doubt the 6800 will require this because the hardware, surprise surprise, is very similar to R3XX.
 
My question is what if there are games out there that use the low-quality nv3x "optimized" path but aren't recent enough for the developer to be willing to invest the time to create a new path for nv40.

In those cases the game would recognize the nv40 as an nv3x and use the low-quality path, so we would never get the full quality the nv40 is capable of, correct?

Thanks,
Mario64
 
The same could be said for a lot of games used in benchmarks, including by a certain website that likes to use warez and titles it a 'highly anticipated DX9 game'.
 
mario64 said:
My question is what if there are games out there that use the low-quality nv3x "optimized" path but aren't recent enough for the developer to be willing to invest the time to create a new path for nv40.

In those cases the game would recognize the nv40 as an nv3x and use the low-quality path, so we would never get the full quality the nv40 is capable of, correct?

Thanks,
Mario64

I'd assume NV4x can run standard paths just fine and I hope it will.
 
volt said:
mario64 said:
My question is what if there are games out there that use the low-quality nv3x "optimized" path but aren't recent enough for the developer to be willing to invest the time to create a new path for nv40.

In those cases the game would recognize the nv40 as an nv3x and use the low-quality path, so we would never get the full quality the nv40 is capable of, correct?

Thanks,
Mario64

I'd assume NV4x can run standard paths just fine and I hope it will.

if(chip == "NV4X") setPath("standard DX9");

Just 1 line of code. Surely the developers will muster enough strength to do that :)
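(In practice the "chip" check itself needs a bit of supporting code too, since the game first has to work out which chip it's running on. A rough sketch with D3D9 -- the NV4x device-ID range below is only an assumption for illustration, not a verified list:

// Sketch: map the adapter's PCI IDs to a chip family via D3D9.
// The NV4x device-ID range here is assumed, not exhaustive or verified.
#include <windows.h>
#include <d3d9.h>

bool looksLikeNV4x(IDirect3D9* d3d) {
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;
    const bool nvidia = (id.VendorId == 0x10DE);                           // NVIDIA PCI vendor ID
    const bool nv4x   = (id.DeviceId >= 0x0040 && id.DeviceId <= 0x00FF);  // assumed range
    return nvidia && nv4x;
}

Still not exactly heavy lifting, though.)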
 
Sure doesn't take a lot of code, but it requires that the theoretical game gets a patch released to fix the problem. That requires a whole lot of effort IF there wasn't already going to be a patch released.

But it comes down to whether people can actually list games for which this might be a problem. The only one I could think of is FarCry, and well, more than likely it's going to be patched again.
 