X800 texture shimmering: Far Cry video

Bjorn said:
Grestorn said:
It is neither acceptable that nVidia is forcing their brilinearity on us (currently only on the FX line, but that doesn't matter),

I agree with the rest of the post but I wouldn't say that forcing brilinear on the FX doesn't matter. Thankfully, I don't own an FX myself, but if I did, I would want a full trilinear option (and a non-angle-dependent option).

You can disable brilinear with the latest leaked 61.32 drivers for the GeForce FX cards.

There's a broken 16x mode, and a new mode called High Quality (which also seems intended for the 6800 series).

But most notably, 8x brilinear can be disabled with the trilinear optimisation removal option for GeForce FX users.
 
ChrisRay said:
Bjorn said:
Grestorn said:
It is neither acceptable that nVidia is forcing their brilinearity on us (currently only on the FX line, but that doesn't matter),

I agree with the rest of the post but I wouldn't say that forcing brilinear on the FX doesn't matter. Thankfully, I don't own an FX myself, but if I did, I would want a full trilinear option (and a non-angle-dependent option).

You can disable brilinear with the latest leaked 61.32 drivers for the GeForce FX cards.

There's a broken 16x mode, and a new mode called High Quality (which also seems intended for the 6800 series).

But most notably, 8x brilinear can be disabled with the trilinear optimisation removal option for GeForce FX users.


And now the fanatics have been left with no valid criticisms of the drivers for GF FX cards, and ATi has been left with no justification for continuing to force its optimizations without offering a choice.

One company has learnt and one hasn't.
 
radar1200gs said:
One company has learnt and one hasn't.

That's just for this particular issue though. Let's see how things pan out when we start to see more shader-intensive games.
 
Bjorn said:
radar1200gs said:
One company has learnt and one hasn't.

That's just for this particular issue though. Let's see how things pan out when we start to see more shader-intensive games.

Well at that point the GF-FX will suffer, unfortunately.

You can't magically add missing shader performance with software. You may be able to reschedule things to make the most of your available shading power.

So long as no illegal shader replacement happens, I'll be happy.
 
I find it very interesting to see how much effort people need to put in to convince others that ATI is optimising and that it causes IQ degradation (while increasing FPS)... why? Surely no ATi bias here?

At present it seems that, given a 6800 and the right driver set, nVidia is the company for image quality and ATi is not; they sacrifice IQ for speed.

The ATi card is quicker though, and this will matter more when newer games come out and you have to sacrifice that IQ to get smooth frame rates.
 
Bjorn said:
radar1200gs said:
One company has learnt and one hasn't.

That's just for this particular issue though. Let's see how things pan out when we start to see more shader-intensive games.

Agreed. Some people seem too ready to forget that there are still few or no 6800s available. When they do become available, FX 5900s and lower will remain on the market for quite a while, and that's with FP16 and even FX12 for some cards.
As it is now, Nvidia still sells chips that can barely be called DX9-compliant and that only perform well by sacrificing image quality.

It's too early to say Nvidia is willing to learn from the past. For all we know, future Detonator drivers could ship with worse trilinear again. They've done it before.
 
Can't post at Rage3D at the moment :( don't know why :(

Has anyone posted this there? Given that ATI said they would fix any place where the optimisation is noticeable, surely they should fix this?
 
ChrisRay said:
You can disable brilinear with the latest leaked 61.32 drivers for the GeForce FX cards.

There's a broken 16x mode, and a new mode called High Quality (which also seems intended for the 6800 series).

But most notably, 8x brilinear can be disabled with the trilinear optimisation removal option for GeForce FX users.

This just means that (a) Nvidia are trying it on with "leaked" drivers that won't make it into a full release, or (b) Nvidia don't care about the performance of the GFFX now that they have superseded it - they are happy for GFFXes to take the performance hit, because it just makes the NV40 look better again.

Nice of Nvidia to finally take the high moral ground now that it no longer matters to them. :rolleyes:
 
dizietsma said:
I find it very interesting to see how much effort people need to put in to convince others that ATI is optimising and that it causes IQ degradation (while increasing FPS)... why? Surely no ATi bias here?

No, the reason is different: ATI got caught only through very elaborate proof (meaning not simply by looking, but by doing blind measurements, bit comparisons and so on). They admitted they do it, said they have been doing it for a long time, and that they try not to hurt image quality itself.

This means a simple thing: the optimization has existed for a while, yet nobody ever noticed it or was hurt by it. The claim that ATI is cheating with lower quality was therefore... well... pulled out of thin air. Otherwise people would have complained from the first time it was implemented.

Nobody was hurt until someone could show "there possibly is something". From that point on, people started searching for proof in all kinds of forms to validate that there is something.

Brilinear, on the other hand, was obvious the first time you started a game with the new drivers.

This is why people have a hard time believing it really results in quality damage: over a year of experience from tons of users shows otherwise.

I've yet to see as much shimmering on my 9700 Pro as in that video anyway.
 
dizietsma said:
At present it seems that, given a 6800 and the right driver set, nVidia is the company for image quality and ATi is not; they sacrifice IQ for speed.
Even with this "toggle" for trilinear, investigation will still be required into these drivers - after all, hasn't someone already spotted optimisations in Aquamark despite all optimisations supposedly being off?
 
whql said:
dizietsma said:
At present it seems that, given a 6800 and the right driver set, nVidia is the company for image quality and ATi is not; they sacrifice IQ for speed.
Even with this "toggle" for trilinear, investigation will still be required into these drivers - after all, hasn't someone already spotted optimisations in Aquamark despite all optimisations supposedly being off?

I haven't heard anything like that, but then I don't visit heaps of forums. Link?
 
radar1200gs said:
And now the fanatics have been left with no valid criticisms of the drivers for GF FX cards, and ATi has been left with no justification for continuing to force its optimizations without offering a choice.

One company has learnt and one hasn't.

Yes, but nVidia appears to have learned only after the "fanatics" had been making their valid criticisms for the last 12 months, at least. It seems that the message may finally have been absorbed by nVidia that "It's not nice to cheat Mother Customer." Heh...;) Better late than never, I always say...

But...then we still have to slog through a lot of testing to determine whether or not the currently implemented disable switch for brilinear actually works for any and every 3D game you might like to test. If there's one thing nVidia's taught us well over the last several months, it's that the presence of "switches" in a driver control panel is no assurance that the switches are more than cosmetic, or that they actually perform the functions they purport to.

Still, though, early signs are somewhat encouraging that nVidia's possibly managed to drop some cotton/wax from its collective ears, finally, and at last, after all of this time. Or am I simply waxing too optimistic?...:D
 
I've never defended or excused nVidia for failing to provide a brilinear disable switch for the FX series; in fact, it was because of the FX's filtering issues that I decided not to purchase one. The shader performance wasn't really a concern by comparison.

nVidia were/are stupid for taking as long as they have to address the issue, but it is now being addressed.

As for the switch sometimes working, sometimes not, well, the drivers are still beta for a reason, you know (edit: I'm talking between releases here, not what WaltC was suggesting, btw)...
 
tEd said:
Grestorn said:
tEd said:
After trying Far Cry I can confirm that the X800 Pro does show more texture aliasing in that place. Pretty much I only see it when I use the flashlight though. After playing through around 1-2 levels, aside from that wall I didn't notice any problems with filtering anywhere else.

After all the testing I've done so far, I would conclude that whatever ATI does in that new AF mode, it can lead to more noticeable texture aliasing or light mipmap banding in some places. The question is whether you care about it. As long as I have enough performance headroom to get around it, I certainly will do so. If there is a time I need the 5-15 fps boost it can give me, I'm happy to take it for a little IQ decrease here and there. For me that time is not here yet.
Well, that's about exactly the same thing I said. It's not that you see these effects left and right; they appear only at very specific spots.



My point is: Brilinear can be a sensible optimization. But since there is currently no way for the game programmer to enable or disable it (through a DX flag), it has to be an option for the (advanced) user. There is just no other way.

I can tell you one thing though: the texture shimmering doesn't come from brilinear. It's the new AF mode they are using on the X800. You mentioned that you tried to get around the texture stage optimization with rTool; well, it doesn't work anymore. The new AF mode forces the texture stage optimization, and it doesn't matter whether you use rTool or the in-game AF setting, you can't disable it that way.

From my experience I would say brilinear is mostly fine. Having an option to disable it wouldn't hurt though.

Did you try the latest build of RadLinker? (Just want to rule out all possibilities.) I also agree that it doesn't sound like a problem originating from brilinear.
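
Just to illustrate Grestorn's point about the missing DX flag: under Direct3D 9 all a game can do is ask for trilinear through the sampler states; there is no state that distinguishes full trilinear from a driver's brilinear approximation, so that decision is made silently in the driver. A minimal sketch (the device pointer dev and the function name are just placeholders of mine):

#include <d3d9.h>

// Request classic trilinear filtering on texture stage 0. Whether the
// driver honours this with full trilinear or quietly substitutes
// brilinear is entirely outside the application's control.
void RequestTrilinear(IDirect3DDevice9* dev)
{
    dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR); // linear blend between mip levels = trilinear
}

That's the whole interface the game gets, which is why any finer control has to live in the driver control panel.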
 
radar1200gs said:
I haven't heard anything like that, but then I don't visit heaps of forums. Link?
Well, you visit these forums.

http://www.beyond3d.com/forum/viewtopic.php?p=295981#295981
http://www.beyond3d.com/forum/viewtopic.php?p=296002#296002

radar1200gs said:
I've never defended or excused nVidia for failing to provide a brilinear disable switch for the FX series; in fact, it was because of the FX's filtering issues that I decided not to purchase one. The shader performance wasn't really a concern by comparison.
So, what's your interest here? You just like bashing ATi, I assume?
 
Thanks to the article Quasar linked, users now seem to have the option to disable the new filtering optimizations:

"RV350TRPER" 1
"RV350ANTHRESH" 1
"R420AnisoLOD" 2

I guess it's more than likely that the 3rd-party tools will adopt this soon.
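
For anyone who wants to try it before the tweak tools catch up, here's a rough sketch of how a small tool might write those three values. The registry path below is my assumption (the display adapter class key; the trailing \0000 index varies per machine), and I haven't verified whether the driver expects REG_SZ or REG_DWORD, so treat it purely as illustration:

#include <windows.h>
#include <string.h>

// Assumed location of the display driver's settings; the "\0000" subkey
// index differs between systems, so check yours before running this.
static const char* kDriverKey =
    "SYSTEM\\CurrentControlSet\\Control\\Class\\"
    "{4D36E968-E325-11CE-BFC1-08002BE10318}\\0000";

// Write one named value under the driver key (REG_SZ type is an assumption).
static bool SetDriverValue(const char* name, const char* value)
{
    HKEY key;
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, kDriverKey, 0, KEY_SET_VALUE, &key) != ERROR_SUCCESS)
        return false;
    LONG rc = RegSetValueExA(key, name, 0, REG_SZ,
                             (const BYTE*)value, (DWORD)(strlen(value) + 1));
    RegCloseKey(key);
    return rc == ERROR_SUCCESS;
}

int main()
{
    // The three values quoted above, with the settings from the article.
    SetDriverValue("RV350TRPER",    "1");
    SetDriverValue("RV350ANTHRESH", "1");
    SetDriverValue("R420AnisoLOD",  "2");
    return 0;
}

A reboot (or at least a driver restart) would presumably be needed before the change takes effect.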
 
I wonder how many here who have found "problems" have reported them to ATI? Or is it that once they report them and ATI fixes them, they won't have anything to bitch about?
 
Bouncing Zabaglione Bros. said:
ChrisRay said:
You can disable brilinear with the latest leaked 61.32 drivers for the GeForce FX cards.

There's a broken 16x mode, and a new mode called High Quality (which also seems intended for the 6800 series).

But most notably, 8x brilinear can be disabled with the trilinear optimisation removal option for GeForce FX users.

This just means that (a) Nvidia are trying it on with "leaked" drivers that won't make it into a full release, or (b) Nvidia don't care about the performance of the GFFX now that they have superseded it - they are happy for GFFXes to take the performance hit, because it just makes the NV40 look better again.

Nice of Nvidia to finally take the high moral ground now that it no longer matters to them. :rolleyes:


I don't see how that's really relevant. The trilinear options are still there; the ability to disable them just happens to be in the drivers. It's not like Nvidia systematically removed the NV3x AF optimisations, they just allowed you to disable them.

I don't see how it's any different: if you disable brilinear on the NV40, it takes a performance hit. If you disable it on the NV3x, it takes a performance hit. Just how is this Nvidia trying to make the NV3x look bad?

Anyway, trilinear AF was never done if the application didn't request trilinear filtering; only bilinear AF was applied.
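
To illustrate that last point (a sketch against the Direct3D 9 sampler states, nothing specific to either vendor's driver; the function names are just for illustration): whether you get "trilinear AF" or only "bilinear AF" hangs on the mip filter the application sets alongside anisotropic filtering.

#include <d3d9.h>

// Anisotropic filtering with trilinear mip transitions ("trilinear AF").
void RequestTrilinearAF(IDirect3DDevice9* dev, DWORD maxAniso)
{
    dev->SetSamplerState(0, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER,     D3DTEXF_LINEAR); // blend between mip levels
    dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);
}

// If the game leaves the mip filter at point sampling, only "bilinear AF"
// is requested: hard steps between mip levels, no trilinear blend.
void RequestBilinearAF(IDirect3DDevice9* dev, DWORD maxAniso)
{
    dev->SetSamplerState(0, D3DSAMP_MINFILTER,     D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(0, D3DSAMP_MAGFILTER,     D3DTEXF_LINEAR);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER,     D3DTEXF_POINT);
    dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);
}

So a game in the second camp never asked for trilinear in the first place, and no driver switch changes that.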
 