X800 texture shimmering: FarCry video

jvd is just trying to sidetrack the argument with nvidia's 8xAA mode, an option that is a mixed mode, and it's not even a mode that's benched a lot or one you could use for fair comparisons against competitors.

The facts still remain: ATI uses reduced 'brilinear' filtering, claimed it was full trilinear, and lied to everyone about it. ATI claimed this new brilinear technique didn't reduce quality; again, we have ample proof this is not the case, with lots of games showing increased aliasing and texture shimmer from the reduced filtering. ATI refuses to give users the option to turn full trilinear on, a decision motivated by the fact that full trilinear reduces their bench scores by a considerable margin, which would probably mean losing the perceived performance edge they have at high degrees of AF.

ATI is playing a dangerous and deceptive game here. The fact is their product HAS to be faster than their competitors'; if not, the X800 series will fail, because feature-wise it is behind and last year's tech in a lot of regards.

jvd, accept the facts and deal with it, dude. God, why you show such dedication to ATI is beyond me; they stuffed up, and there's no need to make excuses, try to cover it up and pretend nothing is wrong. The sooner fanatics like you admit it and get on ATI's case, the better it will be for you in the long run, i.e. you'll get the option to disable these hacks when they affect visual quality.
 
Ok now the topic has changed to hybrid antialiasing modes. I don't see the relevance to the topic but anyway...

You're correct: changing the AA has affected image quality, and it's a reproducible in-game quality change for the worse. Trylinear has not. ;)

Still entirely apples vs. oranges. I don't see what it has to do with the issue at hand. Antialiasing belongs, IMHO, in a separate thread.

I wanted to see how those attacking ATI would respond to nvidia reducing IQ and not allowing users to change it back. This time, though, the proof can be seen in screenshots, unlike ATI's optimizations.

The result is exactly what I expected.

People brushing it off, calling me an ATI fan, saying this is nothing like what ATI is doing.

However, I agree with this move (like I do with ATI's), as now the feature has a chance to be usable in most cases. But nvidia has still reduced AA and taken away the option for us to change it back.



Why lament the loss of a useless feature?
Why lament a speed increase that 99.9% of the time does not affect IQ?



You mean a reviewer would actually have to do his job of evaluating a product... oh the shame.
Well, they aren't doing it now. They're saying "let's gimp the card as much as we can because they're optimizing, even though we can't find any IQ decrease from it. But let's do it anyway."

Wow jvd...just wow.

I agree. I figured the responses would be like this. But the people brushing it off surprised me.

jvd is just trying to sidetrack the argument with nvidia's 8xAA mode, an option that is a mixed mode, and it's not even a mode that's benched a lot or one you could use for fair comparisons against competitors.

Well, isn't the ATI trylinear just a way to sidetrack everyone from the fact that ATI's cards are much faster than nvidia's?

Yup, it's 8xAA from nvidia that now looks like 6x FSAA from ATI and has performance nowhere near its competitor's.

Just a marketing feature.

ATI claimed this new brilinear technique didn't reduce quality; again, we have ample proof this is not the case, with lots of games showing increased aliasing and texture shimmer from the reduced filtering. ATI refuses to give users the option to turn full trilinear on, a decision motivated by the fact that full trilinear reduces their bench scores by a considerable margin, which would probably mean losing the perceived performance edge they have at high degrees of AF.

Really? Link me to the lots of games showing increased aliasing and texture shimmering because of the reduced filtering.

I actually have yet to see these in articles. I look forward to reading them.

Here we have nvidia claiming to have an 8x FSAA mode which now looks like 6x FSAA from ATI, yet still calling it 8x and refusing to give us the option to pick between the higher-quality and lower-quality 8x.

ATI is playing a dangerous and deceptive game here. The fact is their product HAS to be faster than their competitors'; if not, the X800 series will fail, because feature-wise it is behind and last year's tech in a lot of regards.
I dunno, if 8x FSAA is a hint of the features nvidia has over ATI, then I guess ATI will be just fine.

jvd, accept the facts and deal with it, dude. God, why you show such dedication to ATI is beyond me; they stuffed up, and there's no need to make excuses, try to cover it up and pretend nothing is wrong.

I have accepted the facts. ATI has an optimization that 99.9 percent of the time does not degrade image quality. This is something nvidia has been unable to do.

I am not trying to cover up and pretend nothing is wrong. I will admit that nvidia is much too slow when using options that display the same IQ as ATI products. ;)




Thank you all for posting exactly how I thought you would.

For some of you, the double standard is much too great.

Some of your bias is extremely bad. So bad as to attack me for speaking up on something nvidia has actually done. I'm sorry it didn't cast them in a good light, but please do not personally attack me.
 
ninelven said:
jvd said:
Why lament a speed increase that 99.9% of the time does not affect IQ?
I don't believe I have.
Actually, that was an open question; it wasn't really directed at you. It was basically meant to ask: why can't I be upset about something that is actually proven to reduce quality, when people can get upset over something that has yet to be proven to reduce quality?
 
You never know your luck, jvd. I'm sure nVidia is watching the discussions we've been having, and they may just include the ability to choose your 8x mode in a future driver - just for you.
 
radar1200gs said:
You never know your luck, jvd. I'm sure nVidia is watching the discussions we've been having, and they may just include the ability to choose your 8x mode in a future driver - just for you.

Good, that's what I want to happen.

It's all about the choices. :oops:
 
You're correct: changing the AA has affected image quality, and it's a reproducible in-game quality change for the worse. Trylinear has not.

I can't follow your reasoning exactly.

a) Multisampling + AF is, in relative terms, a "performance alternative" to pure Supersampling. Apart from some alpha corner cases, MSAA+AF has so far been sufficient, especially since you can use much higher resolutions than with pure Supersampling.

b) Changing the combination of a hybrid MS/SSAA antialiasing mode can hardly be considered an optimisation. It's essentially a change in the sampling pattern. Considering point (a), if you balance things and factor in the performance freed up by the sampling pattern change, you may actually get higher IQ after all, since you can pick a higher resolution too. Whatever you lose from the missing 2xSSAA you will gain from the higher resolution; for instance, 1280x1024 with 2x Supersampling already costs more fill than 1600x1200 without it.

c) Hybrid bi-/trilinear texture filtering optimisations have nothing, absolutely nothing, to do with the above. And yes, said method and NVIDIA's alternative (well, they actually introduced theirs first) do degrade image quality. It's up to the end user's tolerance there, whether he wants to accept it or not, or whether he can notice it or not.
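To make point (c) a bit more concrete, here is a rough sketch of what a reduced "brilinear" LOD blend could look like next to full trilinear. The band width and exact behaviour are illustrative assumptions; the real driver heuristics have never been published.

[code]
/* Illustrative sketch only: how a "brilinear" filter narrows the trilinear
 * blend between two mip levels. The band value is a made-up example; real
 * drivers use undisclosed, possibly per-texture heuristics. */
#include <stdio.h>

/* Full trilinear: blend weight equals the fractional LOD. */
static float trilinear_weight(float lod_frac)
{
    return lod_frac;
}

/* "Brilinear": use pure bilinear (one mip level) over most of the range and
 * only blend inside a narrow window around the mip transition. */
static float brilinear_weight(float lod_frac, float band)
{
    float lo = 0.5f - band * 0.5f;
    float hi = 0.5f + band * 0.5f;
    if (lod_frac <= lo) return 0.0f;          /* nearer mip only      */
    if (lod_frac >= hi) return 1.0f;          /* farther mip only     */
    return (lod_frac - lo) / (hi - lo);       /* compressed blend     */
}

int main(void)
{
    for (float f = 0.0f; f <= 1.0f; f += 0.25f)
        printf("frac %.2f  tri %.2f  bri %.2f\n",
               f, trilinear_weight(f), brilinear_weight(f, 0.3f));
    return 0;
}
[/code]

Wherever the weight snaps to 0 or 1, only one mip level has to be fetched (4 texels instead of 8), which is where the speed comes from; the narrower the band, the closer the result sits to plain bilinear, with more visible mip transitions and, on high-frequency textures, the kind of shimmer being argued about here.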

Actually, I run into people who can't notice the stuff I do, or who get annoyed by the same things I do. I don't see why everyone has to agree with me, or I with everyone else, either.

Frankly, there are admittedly cases where I really don't bother and just pick bi/AF in combination with a higher resolution, yet there are cases where, for whatever reason, I do have to disable the texturing stage optimisations in order to get rid of nasty side-effects. And yes, that's unrelated to brilinear or whatever you guys call it lately, but I'd still like to keep pleading for various in-driver switches to shut all or some optimisations off, whenever something annoys me.


I wanted to see how those attacking ATI would respond to nvidia reducing IQ and not allowing users to change it back. This time, though, the proof can be seen in screenshots, unlike ATI's optimizations.

Frankly, I couldn't care less what each of you has as an agenda, why and for what. I care about the technology, its advancements, and what I can get as a consumer and at what price. That said, I pick whatever I consider will work best for me.

Texture aliasing is not something I'll be able to show you in screenshots either, and for me personally it's far more annoying than edge aliasing or MIP-map banding (basically anything that moves or wobbles in a scene gives me the creeps, but hey, one cannot have it all).


People brushing it off, calling me an ATI fan, saying this is nothing like what ATI is doing.

No it actually isn't; and if you can't see that one thing has nothing to do with the other then I'm lost anyway.

The bottom line is that I care about something entirely different, and that is added texture aliasing; I'm having an extremely hard time getting user feedback due to you people having your catfights, and you're not exactly innocent here either. Obviously it isn't in either side's interest to detect weaknesses or bugs, so it gets swept under the carpet either way.

However, I agree with this move (like I do with ATI's), as now the feature has a chance to be usable in most cases. But nvidia has still reduced AA and taken away the option for us to change it back.

Even in 3rd-party applications? And it's not like you're going to use 8xAA in the majority of cases anyway. At least in the official drivers you're left (ignoring the countless AA modes 3rd-party applications can unlock in the meantime) with a 2x Supersampling hybrid option for rare cases. What exactly will you do on the Radeon should you need Supersampling?

Yup, it's 8xAA from nvidia that now looks like 6x FSAA from ATI and has performance nowhere near its competitor's.

Nope, the EER of 8x is still higher. Staying in devil's advocate mode here: that's the major point of Multisampling or any other form of edge antialiasing anyway; edge equivalent resolution.

It basically turned from a nearly unusable mode into an improved "4xS variant", sustaining the 4x4 sampling grid that the former 8x sampling pattern also had.
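For anyone unfamiliar with the term, a small sketch of what EER means: it simply counts the distinct sub-pixel offsets per axis, which is what sets how many intensity steps a near-horizontal or near-vertical edge receives. The sample positions below are made up for illustration and are not the actual NV or ATI grids.

[code]
/* Sketch: edge-equivalent resolution (EER) of an AA sample pattern.
 * EER per axis = number of distinct sub-pixel offsets on that axis.
 * Positions are illustrative only, not real hardware sample grids. */
#include <stdio.h>

static int distinct(const float *v, int n)
{
    int count = 0;
    for (int i = 0; i < n; i++) {
        int dup = 0;
        for (int j = 0; j < i; j++)
            if (v[j] == v[i]) { dup = 1; break; }
        if (!dup) count++;
    }
    return count;
}

int main(void)
{
    /* 4 samples on an ordered 2x2 grid: only 2 distinct offsets per axis. */
    float ogx[] = { 0.25f, 0.75f, 0.25f, 0.75f };
    float ogy[] = { 0.25f, 0.25f, 0.75f, 0.75f };

    /* 4 samples on a rotated/sparse grid: every offset unique per axis. */
    float rgx[] = { 0.125f, 0.375f, 0.625f, 0.875f };
    float rgy[] = { 0.625f, 0.125f, 0.875f, 0.375f };

    printf("ordered grid: EER %d x %d\n", distinct(ogx, 4), distinct(ogy, 4));
    printf("sparse grid : EER %d x %d\n", distinct(rgx, 4), distinct(rgy, 4));
    return 0;
}
[/code]

Same sample count, very different edge gradient quality; that's part of why a change to the sampling pattern can matter as much as the raw sample count in these comparisons.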

Some of your bias is extremely bad. So bad as to attack me for speaking up on something nvidia has actually done. I'm sorry it didn't cast them in a good light, but please do not personally attack me.

I haven't seen you specifically being exactly impartial in this thread either.

Why don't you guys just calm down, consider that it's actually about a couple of pieces of silly silicon and that you have nothing to win or lose after all, and just go and play a couple of games? Dunno, maybe I'm just weird after all...
 
Actually, ATi started the optimizations back with R200 and QUAK.

It would be interesting to see exactly when nVidia introduced brilinear. I have a suspicion it would line up very neatly with the release of the 9600.
 
radar1200gs said:
Actually, ATi started the optimizations back with R200 and QUAK.

It would be interesting to see exactly when nVidia introduced brilinear. I have a suspicion it would line up very neatly with the release of the 9600.

In an ideal world there wouldn't be a need for ever-increasing, purely performance-oriented optimisations (or cheats, if you prefer), nor of course for people with an agenda.

It would be interesting to see at least some attempts at impartiality for a change.
 
radar1200gs said:
Actually, ATi started the optimizations back with R200 and QUAK.

It would be interesting to see exactly when nVidia introduced brilinear. I have a suspicion it would line up very neatly with the release of the 9600.

Quack. :rolleyes: Do a search on that in these forums.

As for when nvidia introduced brilinear, well, that was with the 5800 Ultras.

Oh wait, I get why you think it came out with the 9600. You think it happened because ATI did it, and you're trying to say they did it first. Too bad that is not the case.

What's more, if they had come out at the same time (which they didn't), then nvidia would need a new driver team, as theirs looked like crap while ATI's went unnoticed for over a year.
 
mozmo said:
ATI claimed this new brilinear technique didn't reduce quality; again, we have ample proof this is not the case, with lots of games showing increased aliasing and texture shimmer from the reduced filtering.

a) We don't have ample proof; we have some FarCry videos which, it has been suggested, may show an engine issue, not a card one. I'm not so sure about that, considering the difference between the 9700 and the X800, but this remains to be seen.

b) "Lots" of games?

ATI refuses to give users the option to turn full trilinear on

They have done nothing of the sort.

ATI is playing a dangerous and deceptive game here. The fact is their product HAS to be faster than their competitors'; if not, the X800 series will fail, because feature-wise it is behind and last year's tech in a lot of regards.

Oh, give me strength! It's hardly last year's tech when it's barely been used. Take it to an SM2.0 vs. SM3.0 topic.
 
So I got my X800 Pro yesterday, as retailers near my home have had a lot of them since Monday. I installed it, had the luck to get the 4.6 the same day, and loaded FarCry. I am sure I must be blind, because all I see is a wonderful game with almost zero problems. I wonder: where is the shimmering I should see?
 
PatrickL said:
So I got my X800 Pro yesterday, as retailers near my home have had a lot of them since Monday. I installed it, had the luck to get the 4.6 the same day, and loaded FarCry. I am sure I must be blind, because all I see is a wonderful game with almost zero problems. I wonder: where is the shimmering I should see?
Read the original messages in this thread; I describe in detail where and how you can see these artifacts.

And I've repeatedly written that these artifacts are not common at all but very rare.
 
Oh, I read that; that's why I used my saves to go through the game. Those artifacts must be so rare that I found no problem. Maybe try again with the 4.6; unless you are really looking for a problem you have been warned about, I doubt any player will ever see anything. Well, my personal experience with the game, not through videos, is that I saw nothing, and I tried hard all evening.
 
mozmo said:
ATI is playing a dangerous and deceptive game here. The fact is their product HAS to be faster than their competitors'; if not, the X800 series will fail, because feature-wise it is behind and last year's tech in a lot of regards.

What competitors? XGI?
Nvidia? They're selling hardware that is even behind ATI's "last year's tech": FX5900, FX5600, FX5200. Even when the 6800 does finally arrive in the stores, it's the older FXs they'll be selling.

As for nvidia's new tech: game developers have to sell games to a market that is still being flooded with FX cards that are barely even DX9-capable. Why even bother with PS3.0 now?
 
PatrickL said:
Oh, I read that; that's why I used my saves to go through the game. Those artifacts must be so rare that I found no problem. Maybe try again with the 4.6; unless you are really looking for a problem you have been warned about, I doubt any player will ever see anything. Well, my personal experience with the game, not through videos, is that I saw nothing, and I tried hard all evening.
Load the save game I specified - Fort, the save in the old bunker - turn on your flashlight, and make sure you use 4xAF (in-game setting).

Don't use any registry hacks (the ones recently published on computerbase).

I'll try 4.6, but I very much doubt that it will make any difference.
 
Since so many seem to have problems finding the spot I recorded in the demo, I've prepared a .zip file with the save game.

Instructions:
  • Unpack the zip file into your FarCry installation. It uses its own Profile, so your other save games should be safe. Anyway, it's a good idea to create a backup of your Profile directory first.
  • Start the game. From the main menu, choose "Profile" and select the profile "ShimmerCheck".
  • Then select "Campaign" and load the only save game present in that profile.
  • Turn on your flashlight, look at the left wall (also visible on the right wall), and move forward and backwards.

Here's the link: http://grestorn.webinit.de/FC_ShimmerSave.zip

And, yes, the shimmering is also visible with Cat 4.6.
 
Does anyone know how the 6800s look in that part? Do they have better IQ or the same? :?:

And comparing NV 8xAA with ATI Brilinear is just daft and totally unrelated.
 
jvd said:
Why lament a speed increase that 99.9% of the time does not affect IQ?

Umm, I'm very interested in how you calculated that it doesn't affect IQ 99.9% of the time. Care to show us the maths?
 