x800 texture shimmering: FarCry video

I think someone such as DB should run a similar series of benchmarks with the 60.72s and the optimizations off, and the ATi drivers originally used for the R420 reviews (the texture stage optimizations don't matter so long as both cards do the same thing: either all trilinear, or one stage trilinear and the rest bilinear), and see how it pans out.
 
Mirrors & Fraps Codec

Here are some mirrors for jvd's videos; I also found the Fraps codec one needs to play back the videos. Simply extract the dll and place it into your windows/system32 directory. The mirrors will last as long as my bandwidth holds out. Let me know if you need something else hosted in the meantime. :)



wowx800pro 16aniso.rar - 32 MB
x800temporal 6x.rar - 23 MB
FrapsCodecs.rar
 
radar1200gs said:
I was assuming they updated the tests (though like you I can hardly make sense of some of that article).

I don't see why they would. On that webpage they don't bench it against anything but itself, so the comparison to itself still stands: they are showing what happens with full trilinear on all stages (the bottom one).

I assume it goes:
bilinear / AF
optimized tri / AF / stage 1 tri
full quality tri / AF / stage 1
full quality tri / AF / all stages
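The difference between those modes comes down to how much the driver blends between two adjacent mip levels. Here is a rough sketch in Python; the `band` knob is made up purely for illustration and is not a real driver setting:

```python
def mip_blend_weight(lod_frac, mode="trilinear", band=0.3):
    """Weight given to the next-smaller mip level, for a fractional LOD
    position lod_frac in [0, 1) between two adjacent mip levels.

    bilinear  -- snap to the nearest mip (weight 0 or 1, visible band)
    trilinear -- blend linearly across the whole transition
    brilinear -- mostly bilinear, with a short linear ramp around the
                 mip boundary ('band' controls how much of the range
                 stays pure bilinear; hypothetical knob)
    """
    if mode == "bilinear":
        return 0.0 if lod_frac < 0.5 else 1.0
    if mode == "trilinear":
        return lod_frac
    # "brilinear": a steeper ramp, clamped so most of the range is bilinear
    w = (lod_frac - band) / (1.0 - 2.0 * band)
    return min(1.0, max(0.0, w))
```

The further the ramp is squeezed toward the boundary, the more texels get pure bilinear filtering, which is where the speed comes from and where mip transitions can start to show.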


But they changed the LOD too, and I can't understand why.

I know changing the LOD to less aggressive settings increases my score on the 9700 Pro (haven't messed with the X800 yet) quite a lot. So I'm assuming increasing the aggressiveness of the LOD will decrease performance quite a lot.

If so, when NVIDIA goes from trilinear to bilinear, do they change the LOD both ways?

If not, did this site change the LOD like they did on the Radeons?

Also note that the R9600s were at 800x600 and the X800s were at 1600x1200.


Some interesting things, though, if I read it correctly:

Using this optimization ATI gains 17 fps compared to trilinear with an aggressive LOD,

but it only loses 9 fps to bilinear.

If the net gain is so large for something that people are fighting tooth and nail over, this must be a very efficient optimization.

I would have thought it to fall in between the two, not such a difference.
 
radar1200gs said:
I think someone such as DB should run a similar series of benchmarks with the 60.72s and the optimizations off, and the ATi drivers originally used for the R420 reviews (the texture stage optimizations don't matter so long as both cards do the same thing: either all trilinear, or one stage trilinear and the rest bilinear), and see how it pans out.
Well, when asked:

You may have noticed that we haven't been lacking in things to write about recently - I've hardly had any time to myself since April. If people are willing to pay me to do this full time then I will and then I'll have more time to look at more things, until then I only have a finite quantity of time and you'll have to accept what we give.

You'll also note that the majority of articles looking at this so far have taken it from a theoretical level, having coded tools to specifically highlight the issue. For one, we don't have the coding resources to look at it (however, we generally use what people give us, hint hint) and second, as this thread bears evidence to, there are still very few actual examples of notable differences in games. I am, however, still waiting for an interview to come back from ATI (I have been chasing them) on the subject, and if I get some time I might take a closer look at some game differences then.

and a few posts down

Again, I can use the tools at my disposal - those tools are either in games or coded elsewhere. If none of the tools at my disposal highlight any issues to any great extent, what am I supposed to do?

You may remember that in the initial thread it was us who highlighted that this had occurred since the first set of drivers for the 9600.

Edited to put in link: http://www.beyond3d.com/forum/viewtopic.php?p=298809#298809
 
farcryperpixelAFproblem.jpg


Once again...

9500 Pro...

Extreme problem with surfaces in FC that combine regular bump mapping, specular per-pixel lighting, and AF... Do I literally have to release a demo so people can see this problem is on every card and with the game? That would be very sad, but if the fanboy wars don't stop I might...
 
jvd said:
I would post something showing temporal AA (dunno if it will work), but in that game I can't go past 30 fps.

It won't work, even if the captured and rendered FPS are 60Hz+ (unless everybody installs ReClock).
 
Mr. Travis said:
Once again...

9500 Pro...

Extreme problem with surfaces in FC that combine regular bump mapping, specular per-pixel lighting, and AF... Do I literally have to release a demo so people can see this problem is on every card and with the game? That would be very sad, but if the fanboy wars don't stop I might...

I don't think people here are complaining about the effect you seem to repeatedly show in that screenshot.

That of course if I haven't misunderstood things in between all the catfights and juvenile pissing contests.

What I've seen so far mentioned in articles, claimed by users, and what really worries me, is some sort of dynamic LOD flip-flopping in exchange for fewer AF samples.

If that truly should be the case, then we're talking about aliasing that comes from too aggressive LOD values and is completely irrelevant to "brilinear" or the effect you're describing.

For the record even on R3xx the MIPmap LOD default "0" value seems to be too aggressive for some cases and there I usually tune down to "quality" in order to get rid of added aliasing. In that case it works. Hence my question pages ago if on the X800PRO any fiddling with driver LOD sliders (to more positive values) helps any.
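For reference, the driver LOD slider just biases which mip level gets picked. A minimal sketch of the textbook selection rule (the function name and defaults are illustrative, not any driver's actual API):

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_mips=11):
    """Mip level selected for a pixel: lambda = log2(texture footprint) + bias,
    clamped to the available mip chain.

    A more positive bias (the "quality" end of the slider) picks a smaller,
    blurrier mip and reduces shimmering; a more aggressive (negative) bias
    picks a sharper mip at the cost of added aliasing.
    """
    lam = math.log2(texels_per_pixel) + lod_bias
    return max(0.0, min(num_mips - 1.0, lam))
```

So nudging the slider by +0.5 shifts every surface half a mip level toward the blurrier side, which is why it can trade sharpness for less texture aliasing.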

---------------------------------------------------------

And something for all: I'm tired of how, through all the years, every time an issue is mentioned on X, Y or Z accelerator it potentially turns into a flamewar. It's really for the benefit of the final user and the affected IHV himself to fine-tune his drivers and, if it's some sort of bug, to get rid of it as soon as possible.

It's really irrelevant which IHV it affects, and every time in the past I've mentioned an issue in public there have always been attempts to sweep it under the carpet; neither NVIDIA nor ATI followers are immune there.

No, I want to see it in public, I want to read as much as possible about it, and frankly no website review ever covers all of it. I want to find out as much as possible about the positive as well as the negative sides of an offering before I walk into a store and spend =/>$400 on an accelerator. And frankly I couldn't care less about anyone's personal agenda either; there are better suited boards out there, to put it a bit more elegantly.

Finally, I am willing to keep an open mind until I can reach a conclusion; until then I would like any IQ analyzing attempts to continue, and then let me or everyone else be the judge of what we can tolerate or not. I haven't noticed anyone funding my purchases so far, for the record.
 
As has been said a hundred times, the only reason at all to resist ATI putting in a toggle to turn the optimization off is b/c as a fanboy a person wants their IHV to win in a benchmark in some website review.

As for ATI's motivations, they are obvious: win, sell more :) But any reasonable person, as a consumer, should want them to put a toggle in the driver. I am sure future reviews will offer either a dual set of results, a rough percent performance loss, or just bench both boards with optimized filtering.
 
Sxotty said:
As has been said a hundred times, the only reason at all to resist ATI putting in a toggle to turn the optimization off is b/c as a <bleep> a person wants their IHV to win in a benchmark in some website review.

As for ATI's motivations, they are obvious: win, sell more :) But any reasonable person, as a consumer, should want them to put a toggle in the driver. I am sure future reviews will offer either a dual set of results, a rough percent performance loss, or just bench both boards with optimized filtering.

I want a toggle to turn on all the different modes of 8x FSAA that NVIDIA now has.

I don't like that they reduced the image quality (noticably) in the latest drivers with no toggle.

How can they do this ? :oops:
 
You have a knack of turning negative attention away from ATI and towards NV. :D

NV offers more than one 60 series forceware driver that allows the user to enable/disable trilinear optimizations.

8xAA can be had in different forms: 4xSS+2xMS, 4xMS+2xSS. From a practical standpoint, the latter mode will be more useful for most gamers.

By the way, regarding 8xAA, have you actually tested a 6800U with 8xAA enabled? Just keep in mind that we are speculating on various beta drivers. Who knows what we will see in the official WHQL version.
 
Like I said, only a select few individuals with an agenda would argue that ATI should not make this a feature in their future drivers. But everyone who does at least clearly shows their bias.
 
Sxotty said:
As has been said a hundred times, the only reason at all to resist ATI putting in a toggle to turn the optimization off is b/c as a <bleep> a person wants their IHV to win in a benchmark in some website review.
Not true. There are other good reasons to avoid putting in this toggle. If it turns out that all of the artifacts people claim to be seeing in this thread are not caused by the optimization, then the toggle's "off" position would be just degrading performance without improving image quality. Such a setting, besides being a waste of space, would worsen the user's experience when they were expecting an improvement.

Even if the optimizations do turn out to be the source of these issues, I don't see how taking the brute force approach is a good solution. I don't think anyone is arguing that the IQ differences are huge, or even visible at all in more than a tiny handful of situations. Why provide an option that potentially kills performance across the board for the sake of those situations? It would make much more sense for ATI to just improve the algorithm to handle these corner cases better, and maintain as much performance as possible.
 
jimmyjames123 said:
You have a knack of turning negative attention away from ATI and towards NV. :D

NV offers more than one 60 series forceware driver that allows the user to enable/disable trilinear optimizations.

8xAA can be had in different forms: 4xSS+2xMS, 4xMS+2xSS. From a practical standpoint, the latter mode will be more useful for most gamers.

By the way, regarding 8xAA, have you actually tested a 6800U with 8xAA enabled? Just keep in mind that we are speculating on various beta drivers. Who knows what we will see in the official WHQL version.

What? I thought people were mad because the trilinear of the X800s was reduced compared to the previous generation and there was no way to turn it off in the drivers?

Now we have NVIDIA, who have changed the 8x mode drastically within what, 3 driver sets, and have changed the image quality with no way to turn it off.

To me that is the same thing. Yet I don't see radar, you, or any of the others who are against ATI for not having a toggle complaining about this.

Personally, on the 8x mode I think it's a good thing, as the old mode (even the new one) is hardly playable, and thus as long as it's better than their 4x mode they can reduce it enough to get it to run at decent speeds.

Of course, in a thread where Ante P talks about this and posted pics, the new mode's quality is about equal to or slightly better than ATI's 6x FSAA, but drastically reduced from the old 8x.

So why can NVIDIA call it 8x? :devilish: I can play this out like everyone has with ATI :devilish: but to me, sometimes things are done because they have to be.

I would love for the X800s, 6800s and all cards to have a way to turn on supersampling FSAA. But I know it's not there because it tanks performance, and we'd barely be using 4x supersampling right now compared with 6x FSAA at 80+ frame rates.
 
Comparing different 8xAA modes to a trilinear optimization toggle makes little sense. 8xAA is an "option" already. ATI's optimized filtering method is not "optional". Talk about grasping at straws.

You are complaining about options within an option. You are also speaking of 8xAA modes on various leaked beta drivers, without any in-depth IQ analysis done between the various 8xAA modes. An 8xAA mode that is actually playable is an option that most gamers would receive very enthusiastically. I suspect that most people would prefer 8xAA with 4xMS+2xSS over 8xAA with 2xMS+4xSS, even if there is less supersampling applied, simply because the performance/image quality ratio would be higher.
 
Comparing different 8xAA modes to a trilinear optimization toggle makes little sense. 8xAA is an "option" already. ATI's optimized filtering method is not "optional". Talk about grasping at straws.

So it's okay for NVIDIA to reduce image quality?



You are complaining about options within an option

No, I'm complaining that the quality of 8x has been reduced for speed increases.

You are also speaking of 8xAA modes on various leaked beta drivers, without any in-depth IQ analysis done between the various 8xAA modes

Well, I hate to break it to you, but for 6800 users beta drivers are the only option.

Any changes in the beta driver right now will affect the users of the cards .

An 8xAA mode that is actually playable is an option that most gamers would receive very enthusiastically. I suspect that most people would prefer 8xAA with 4xMS+2xSS over 8xAA with 2xMS+4xSS, even if there is less supersampling applied, simply because the performance/image quality ratio would be higher.

Well, there are a few problems. You assume the new option is playable.

Then you say people will be happy with reduced image quality because the ratio will be higher.

I can use that same argument for ATI's trilinear optimizations, but I can also add that in 99% of cases the image quality is exactly the same.

After that I can say: if the image quality is equal to that of ATI's 6x mode, why is it allowed to be called 8x? Should it not be called 6x?



I don't understand your double standard.
 
jvd said:
No, I'm complaining that the quality of 8x has been reduced for speed increases.
You mean the 8x mode that everyone was whining about being too slow for use in current/upcoming games?

jvd said:
You assume the new option is playable
You are worried that the higher performing 8x will not perform well enough, but this seems to undermine the original point.
Evidence that the new method is inferior?

After that I can say: if the image quality is equal to that of ATI's 6x mode, why is it allowed to be called 8x? Should it not be called 6x?
No, a name should be derived from the function/work being performed. What you seem to be looking for is an adjective: excellent 6x, lackluster 8x.
 
You mean the 8x mode that everyone was whining about being too slow for use in current/upcoming games?

So where does it stop? If they keep reducing IQ to get speed, it's not going to look any better than 4x FSAA.

You are worried that the higher performing 8x will not perform well enough, but this seems to undermine the original point.
Evidence that the new method is inferior?
If it's only a few frames per second, why change the quality?

Why add a useless feature in the first place?

http://www.beyond3d.com/forum/viewtopic.php?t=13127&postdays=0&postorder=asc&start=0

Ante P seems to see reduced quality, and in the pics the new 8x is on par with 6x FSAA from ATI but is nowhere near the performance.

No, a name should be derived from the function/work being performed. What you seem to be looking for is an adjective: excellent 6x, lackluster 8x.

This I don't agree with. Otherwise every single review must post a disclaimer saying that the image from 8x is the same as 6x FSAA.



To me these issues are nothing alike. ATI has done something that 99% of the time does not reduce image quality. NVIDIA has done something that, unless changed, will now always have reduced image quality.
 
Ok, now the topic has changed to hybrid antialiasing modes. I don't see the relevance to the topic, but anyway...

It's natural that when changing the 8x mode from 4xOGSS + 2xRGMS to 4xRGMS + 2xOGSS, performance will be a lot better in the latter case, yet with understandably less texture quality due to a smaller supersampling proportion (2x SSAA on one axis at a time vs 4x SSAA on both x and y). However, the EER for both is still 4*4.

The usability of the hybrid modes is usually quite limited, or limited to low(-er) resolutions if you prefer. However, it's still better to have any kind of supersampling than none, and it's a far better solution for any kind of texture aliasing in extreme cases when combined with AF.

Still entirely apples vs oranges. I don't see what it has to do with the issue at hand. Antialiasing belongs IMHO in a separate thread.

***edit:

If it's only a few frames per second, why change the quality?

I don't think it's only about a couple of frames. 3DCenter has used both modes in its reviews and the 4xRGMS + 2xOGSS mode comes out (understandably) twice as fast.
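That roughly-2x figure follows from a back-of-the-envelope cost model: supersampling multiplies the number of pixels that actually get shaded, while multisampling mostly reuses one shaded colour per pixel. A deliberate simplification in Python (it ignores bandwidth and z/stencil costs entirely):

```python
def hybrid_aa(ss_factor, ms_samples):
    """Crude model of an SSAA+MSAA hybrid mode.

    ss_factor  -- supersampling factor (how many pixels are shaded per
                  final pixel)
    ms_samples -- multisample count applied on top of each shaded pixel
    Returns (total AA samples, relative shading cost).
    """
    return ss_factor * ms_samples, ss_factor

old_8x = hybrid_aa(4, 2)  # 4xOGSS + 2xRGMS
new_8x = hybrid_aa(2, 4)  # 2xOGSS + 4xRGMS
```

Both modes end up with 8 samples per pixel, but the new mode shades half as many pixels, which lines up with the roughly 2x speed difference 3DCenter measured.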
 
So where does it stop? If they keep reducing IQ to get speed, it's not going to look any better than 4x FSAA.
:LOL:

Why add a useless feature in the first place?
Why lament the loss of a useless feature?

This I don't agree with. Otherwise every single review must post a disclaimer saying that the image from 8x is the same as 6x FSAA.
You mean a reviewer would actually have to do his job of evaluating a product... oh the shame.

Maybe Tropicana should label their orange juice hand-squeezed and, if anyone complains, just tell them it's better quality anyway. Incorrectly representing a product is defrauding the consumer; it is unethical and against the law.
 