Filtering verdict

OK kids, I know you've been chucking this hot potato back and forth plenty recently, but I've been a terribly busy bee of late and haven't been able to join in on the festivities. So...what's the bottom line regarding the latest filtering 'scandal'? Does it boil down to ATI not actually doing anything worse than NV's filtering optimisations, but suffering from the fact that they (ATI) have been playing holier-than-thou regarding optimisation?


EDIT:
No we don't need another of these threads, but one or two well-informed posts might be useful for those who don't have time to read through 26,378 posts.
 
OK, Christ, I didn't realise how out of hand it has become; there are already "can someone sum this up" threads. And even those threads are long and complicated. I may just let this one slip under the radar, it's rather dull.

In more interesting news, I notice that NV40 is still running partial precision shaders in Far Cry with the latest driver released this week, which is worrying for NV fans I would think. What is the good of shader 3 support if NV40 is too slow to run complex PS2.0 ops?
 
Basically we have two videos showing the problem. Both seem to be game engine problems and not filtering problems.
 
In more interesting news, I notice that NV40 is still running partial precision shaders in Far Cry with the latest driver released this week, which is worrying for NV fans I would think. What is the good of shader 3 support if NV40 is too slow to run complex PS2.0 ops?

I seriously doubt there is anything to worry about for 6800U owners. Until DirectX 9.0c is available and the SM 3.0 add-on for FarCry is released, 6800 owners will not be running the game optimally. Keep in mind that FarCry, in its current form, utilizes mostly PS 1.1 shaders, with PS 2.0 shaders used primarily for lighting effects. To be realistic, neither the forced R300 path nor the NV3x path should be considered optimal for the NV4x.
 
jimmyjames123 said:
In more interesting news, I notice that NV40 is still running partial precision shaders in Far Cry with the latest driver released this week, which is worrying for NV fans I would think. What is the good of shader 3 support if NV40 is too slow to run complex PS2.0 ops?
I seriously doubt there is anything to worry about for 6800U owners. Until DirectX 9.0c is available and the SM 3.0 add-on for FarCry is released, 6800 owners will not be running the game optimally. Keep in mind that FarCry, in its current form, utilizes mostly PS 1.1 shaders, with PS 2.0 shaders used primarily for lighting effects. To be realistic, neither the forced R300 path nor the NV3x path should be considered optimal for the NV4x.
For performance, the NV3x path should be optimal for the NV4x since it uses less complex shaders. You think things are going to get faster with more PS 2.0/3.0 shaders replacing PS 1.x shaders?

-FUDie
 
FUDie said:
jimmyjames123 said:
In more interesting news, I notice that NV40 is still running partial precision shaders in Far Cry with the latest driver released this week, which is worrying for NV fans I would think. What is the good of shader 3 support if NV40 is too slow to run complex PS2.0 ops?
I seriously doubt there is anything to worry about for 6800U owners. Until DirectX 9.0c is available and the SM 3.0 add-on for FarCry is released, 6800 owners will not be running the game optimally. Keep in mind that FarCry, in its current form, utilizes mostly PS 1.1 shaders, with PS 2.0 shaders used primarily for lighting effects. To be realistic, neither the forced R300 path nor the NV3x path should be considered optimal for the NV4x.
For performance, the NV3x path should be optimal for the NV4x since it uses less complex shaders. You think things are going to get faster with more PS 2.0/3.0 shaders replacing PS 1.x shaders?

-FUDie

Exactly! The R300 path has been described as a DX9 path, plain and simple. In any case, the fact is, NV40 is usually slower than R420 in Far Cry despite running at lower precision. The point here is this: are there implications for running the whole fleet of upcoming DX9, shader-intensive games, like *yawn, sorry* HL2? I suspect there are.
 
For performance, the NV3x path should be optimal for the NV4x since it uses less complex shaders. You think things are going to get faster with more PS 2.0/3.0 shaders replacing PS 1.x shaders?

Not necessarily. In some of the synthetic tests done, the NV40 was actually faster in some cases using PS 2.0 vs using PS 1.1. The NV3x cards share FP16/FP32 precision with the NV4x, but architecturally the differences are eye-opening (NV4x has a superscalar architecture, different pipeline structure, different AA/AF algorithms, FP16 blending, SM 3.0, etc). Also keep in mind that the forced R300 path apparently has some coding optimizations for the R3xx cards, hardly what one would call "optimal" for the NV4x. SM 3.0 is, of course, designed to help make certain effects more efficient vs PS 2.0. Still, "optimal" in this case would refer to a combination of image quality and performance.
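For anyone unsure why the FP16 ("partial precision") vs FP32 ("full precision") distinction matters at all: FP16 has only 10 mantissa bits, so long chains of shader arithmetic can drift visibly. A toy sketch of the effect, using numpy's float16/float32 as stand-ins for the hardware formats (the increment and loop count are made-up illustrative numbers, not from any real shader):

```python
import numpy as np

# Accumulate the same tiny increment in FP32 and FP16.
# Values are illustrative only.
full = np.float32(0.0)
half = np.float16(0.0)
for _ in range(10000):
    full = np.float32(full + np.float32(0.0001))
    half = np.float16(half + np.float16(0.0001))

# The FP32 sum lands near the exact answer of 1.0; the FP16 sum
# stalls far short of it once the gap between representable FP16
# values around the running total exceeds the increment.
print(float(full), float(half))
```

The same mechanism is why lighting math that looks fine at FP32 can band or wash out when a driver substitutes partial precision.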
 
jimmyjames123 said:
For performance, the NV3x path should be optimal for the NV4x since it uses less complex shaders. You think things are going to get faster with more PS 2.0/3.0 shaders replacing PS 1.x shaders?

Not necessarily. In some of the synthetic tests done, the NV40 was actually faster in some cases using PS 2.0 vs using PS 1.1. Also keep in mind that the forced R300 path apparently has some coding optimizations for the R3xx cards, hardly what one would call "optimal" for the NV4x. SM 3.0 is, of course, designed to help make certain effects more efficient vs PS 2.0. Still, "optimal" in this case would refer to a combination of image quality and performance.

Agreed. But if NV40's shaders really are as fast in PS2.0 as R420's, and considering Far Cry is in NV's The Way It's Meant To Be Played program, and that this problem is being constantly highlighted, why the hell haven't NVIDIA sorted it? Don't forget Far Cry patch 1.1 *added* support for NV40 - why didn't it do so with proper full precision shaders? Hmm?

One might conclude that NV40's shaders carry over some of the register space problems of NV30, though not to the same extent, and that Crytek are having a tough time getting Far Cry to run fast on NV40 at full precision, hence the ongoing nature of the problem.
 
Patch v1.1 on FarCry was designed to help boost NV3x performance, correct?

Just give it some time, and wait for the SM 3.0 add-on before jumping to any conclusions. One guy at NVN had the opportunity to play FarCry using NV40 and SM 3.0, and he was quite impressed.
 
jimmyjames123 said:
Patch v1.1 on FarCry was designed to help boost NV3x performance, correct?

Just give it some time, and wait for the SM 3.0 add-on before jumping to any conclusions. One guy at NVN had the opportunity to play FarCry using NV40 and SM 3.0, and he was quite impressed.

Yes, but it also added support for NV40 - try running Far Cry on NV40 without 1.1 applied and you'll see that the whole thing renders totally corrupted. Ultimately, until NV fix it, they can hardly complain about people pointing out that Far Cry runs both slower and with lower shader quality than the supposedly less sophisticated R420. And lest ye forget, NV30 is capable of running much more sophisticated shaders than R300, but we'd all agree that's irrelevant in the light of NV30's very poor performance.

Any neutral observer would have to conclude there might be a problem with shader performance on NV40.
 
jimmyjames123 said:
For performance, the NV3x path should be optimal for the NV4x since it uses less complex shaders. You think things are going to get faster with more PS 2.0/3.0 shaders replacing PS 1.x shaders?
Not necessarily. In some of the synthetic tests done, the NV40 was actually faster in some cases using PS 2.0 vs using PS 1.1.
Sounds like a driver issue, not a HW issue. There's no reason why PS 1.1 should not run at the same speed as, or faster than, PS 2.0; look at the R300, for example.
The NV3x cards share FP16/FP32 precision with the NV4x, but architecturally the differences are eye opening (NV4x has superscalar architecture, different pipeline structure, different AA/AF algorithms, FP16 blending, SM 3.0, etc).
What does any of this crap have to do with the issue at hand? Far Cry doesn't use FP16 blending. AA/AF algorithms don't factor into the discussion about shaders. Whether the pipeline is "superscalar" is just marketing; applications don't change their shaders because some chip is "superscalar".
Also keep in mind that the forced R300 path apparently has some coding optimizations for the R3xx cards, hardly what one would call "optimal" for the NV4x.
According to who? And was I talking about the R300 path?

You have a knack for trying to confuse the issue.
SM 3.0 is, of course, designed to help make certain effects more efficient vs PS 2.0. Still, "optimal" in this case would refer to a combination of image quality and performance.
Which is why I explicitly mentioned performance, not image quality.

-FUDie
 
jimmyjames123 said:
One guy at NVN had the opportunity to play FarCry using NV40 and SM 3.0, and he was quite impressed.

What you conveniently forget to mention is that while this same individual was impressed with the look of Far Cry in SM 3.0, he ALSO stated that it was VERY choppy and appeared to be running in the low 20's framerate-wise....so why don't we turn off the speculation until we actually have:

a) 6800 U cards in the hands of REVIEWERS, much less end users

b) DX 9.0C released (several months away is the latest I have heard)

c) Far Cry patch enabling SM 3.0....which will likely coincide with the release of DX 9.0C...

Until such time, anything and everything is nothing but pure speculation based on little to no actual facts....
 
caboosemoose said:
OK kids, I know you've been chucking this hot potato back and forth plenty recently, but I've been a terribly busy bee of late and haven't been able to join in on the festivities. So...what's the bottom line regarding the latest filtering 'scandal'. Does it boil down to ATI not actually doing anything worse than NV's filtering optimisations but suffering from the fact that they (ATI) have been playing holier than thou regarding optimisation?


EDIT:
No we don't need another of these threads, but one or two well-informed posts might be useful for those who don't have time to read through 26,378 posts.

To sum this up, there appears to be some shimmering under specific conditions in FarCry; however, it has not yet been shown to be a result of trylinear, and it has been suggested (with further screenshots) that the issue arises from the way the engine, or the effect it is using, renders the scene.

I'd say the ball is still up in the air on this one. Read the posts though, it will help you draw up your "IHV bias" list simply from the way people swing on the basis of very little.
 
Is there an R300 path in FarCry? I thought the R9500-9800/X800 used a standard DX9 path.
Or was it that the R300 path is a standard DX9 path? But then why the need for an R300 path? I'm confused :oops:
 
There really is no conclusion on an entirely subjective matter. Some people have said they can see it, some say they cannot. It's the same argument as brilinear versus trilinear.

Not many people are going to agree.
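For readers new to the terms: trilinear filtering blends two adjacent mip levels linearly across the whole transition, while "brilinear" only blends inside a narrow band around the mip boundary and snaps to a single mip elsewhere (cheaper, but the band edges are where the shimmering complaints come from). A toy sketch of the blend weights - the band width here is a made-up illustrative number, not a documented driver value:

```python
# Toy model of mip-level blend weights as a function of the
# LOD fraction (position within the finer->coarser transition).
def trilinear_weight(lod_fraction):
    # Full linear blend across the whole transition, in [0, 1].
    return lod_fraction

def brilinear_weight(lod_fraction, band=0.25):
    # Only blend inside a narrow band around the mip boundary.
    # band=0.25 is illustrative, not a real hardware constant.
    lo = 0.5 - band / 2
    hi = 0.5 + band / 2
    if lod_fraction <= lo:
        return 0.0  # sample only the finer mip
    if lod_fraction >= hi:
        return 1.0  # sample only the coarser mip
    return (lod_fraction - lo) / band  # steep blend inside the band

# Outside the band the two weights diverge, which is the visible
# difference people argue about.
print(trilinear_weight(0.2), brilinear_weight(0.2))
```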
 
ChrisRay said:
There really is no conclusion on an entirely subjective matter. Some people have said they can see it, some say they cannot. It's the same argument as brilinear versus trilinear.

Not many people are going to agree.
Actually, when brilinear first came out, I don't recall anyone saying they couldn't see the difference. As a matter of fact, the whole issue came to light right off the bat in Beyond3D.com's preview of the 5800 Ultra, because they could see it in games. This came about by mathematically comparing one image to another to see a difference.
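That kind of mathematical comparison is straightforward to reproduce: subtract one screenshot from the other per pixel and inspect the residual. A minimal sketch with numpy arrays standing in for two captured frames (the array contents are made up for illustration):

```python
import numpy as np

# Two hypothetical 8-bit grayscale "screenshots": a flat reference
# and a copy with one pixel rendered differently.
ref = np.zeros((4, 4), dtype=np.uint8)
test = ref.copy()
test[1, 2] = 12

# Widen to a signed type before subtracting so the difference
# can't wrap around, then take the per-pixel absolute difference.
diff = np.abs(ref.astype(np.int16) - test.astype(np.int16))
print(int(diff.max()), int(np.count_nonzero(diff)))
```

A nonzero residual flags changes a side-by-side eyeball comparison might miss, which is exactly how filtering substitutions were first caught.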
 
jvd said:
ChrisRay said:
There really is no conclusion on an entirely subjective matter. Some people have said they can see it, some say they cannot. It's the same argument as brilinear versus trilinear.

Not many people are going to agree.
Actually, when brilinear first came out, I don't recall anyone saying they couldn't see the difference.

I recall entire sites saying this.
 
Quitch said:
jvd said:
ChrisRay said:
There really is no conclusion on an entirely subjective matter. Some people have said they can see it, some say they cannot. It's the same argument as brilinear versus trilinear.

Not many people are going to agree.
Actually, when brilinear first came out, I don't recall anyone saying they couldn't see the difference.

I recall entire sites saying this.
Links please?
 