4x AA performance comparisons invalid?

THe_KELRaTH said:
In fact, and I expect there's no relevance here, the ShaderMark scores have a very similar relation to the 200M / 350M triangles-per-second rates in terms of percentage difference.
I don't believe ShaderMark is vertex limited in any way...
 
Hyp-X said:
I want to point out an error everyone tends to make.

Antialiasing should not be compared in still screenshots.
The most important point of anti-aliasing is (surprise) to remove the aliasing you get when the geometry is moving on screen.

For example, 2xAA on R300 will beat 4xAA on GF4/FX in still screenshots, while it will lose to it in movement (due to having fewer samples).

A supposition, of course, based on the belief that when 4x is selected in the drivers, the driver is actually taking the expected number of samples. If the number of samples varies from the norm, or the pattern of samples is ineffectual, you might not get a clear indication from movement, either. All things being equal, I'd agree with your statements--but the difficulty is in trying to define what's equal and what isn't.
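To make that concrete, here is a small sketch (the sample positions below are purely illustrative guesses, not the actual R300 or GF4/FX hardware patterns) that sweeps a vertical edge across a single pixel and counts how many distinct coverage values each pattern can actually produce:

```python
# Illustrative only: the sample positions are assumptions, not real hardware patterns.
# Sweep a vertical edge across one pixel and count the distinct coverage values
# each sub-pixel sample pattern can produce for that edge.

def coverage(samples, edge_x):
    """Fraction of sub-pixel samples lying to the left of a vertical edge at edge_x."""
    return sum(1 for sx, _sy in samples if sx < edge_x) / len(samples)

patterns = {
    "2x diagonal":     [(0.25, 0.75), (0.75, 0.25)],
    "4x ordered grid": [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)],
    "4x rotated grid": [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)],
}

for name, samples in patterns.items():
    # Step the edge across the pixel, as if the geometry were slowly moving.
    levels = {coverage(samples, step / 64.0) for step in range(65)}
    print(f"{name}: {len(levels)} distinct coverage levels -> {sorted(levels)}")
```

On an axis-aligned edge the four ordered-grid samples collapse onto only two distinct offsets, so they give no more gradations than two well-placed samples do; that's the sense in which a pattern can be "ineffectual" no matter how many samples the driver panel claims to take.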


The problem with web reviews is: how can aliasing be presented?
It could only be done with test programs - but it has to be software rendering, otherwise you cannot really do comparisons.
It also needs to have high framerates - the higher the framerate, the more obvious the aliasing.
And it obviously has to be high-polygon.
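Something like that could be mocked up fairly easily. As a rough sketch of the sort of software-rendered test being described (the 16x16 framebuffer, rotation speed, and sample counts here are arbitrary choices of mine, not anything from a real tool):

```python
# Rough sketch: rasterize a slowly rotating half-plane edge on the CPU, once with
# 1 sample per pixel and once with 4x4 supersampling, then report the largest
# single-pixel change between consecutive frames as a crude measure of edge crawl.
# All numbers here are illustrative choices, not taken from any real test program.
import math

RES = 16   # tiny framebuffer so the whole test runs instantly in pure Python
SS = 4     # 4x4 = 16 samples per pixel for the supersampled case

def render(angle, samples_per_axis):
    """Render a RES x RES grayscale frame of an edge through the centre at 'angle'."""
    nx, ny = math.cos(angle), math.sin(angle)
    frame = []
    for y in range(RES):
        row = []
        for x in range(RES):
            covered = 0
            for sy in range(samples_per_axis):
                for sx in range(samples_per_axis):
                    px = x + (sx + 0.5) / samples_per_axis - RES / 2
                    py = y + (sy + 0.5) / samples_per_axis - RES / 2
                    if px * nx + py * ny < 0:   # sample lies on the covered side
                        covered += 1
            row.append(covered / samples_per_axis ** 2)
        frame.append(row)
    return frame

def max_frame_delta(frames):
    """Largest single-pixel change between any two consecutive frames."""
    return max(
        abs(va - vb)
        for a, b in zip(frames, frames[1:])
        for ra, rb in zip(a, b)
        for va, vb in zip(ra, rb)
    )

angles = [math.radians(30 + 0.5 * i) for i in range(40)]   # half a degree per frame
print("1 sample/pixel   max frame-to-frame pixel change:", max_frame_delta([render(a, 1) for a in angles]))
print("16 samples/pixel max frame-to-frame pixel change:", max_frame_delta([render(a, SS) for a in angles]))
```

The point-sampled version should show pixels popping all the way from 0 to 1 between frames, while the supersampled version changes in much smaller steps - and that frame-to-frame popping is exactly the crawling you only ever see in motion, never in a still screenshot.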

What do you do, though, when the driver presentation is so confused and befuddled that even the people reviewing the product have a difficult time discerning what's what? It seems to me that finding out what's what in the nVidia driver platform currently concerning the GF FX is not straightforward, simple, and intuitive--but rather complex, and needlessly complex, at that. Which leads to the inevitable questions of either competence or concealment.

Were they so incompetent that they couldn't at least copy the driver GUI from ATI or some other company so as to come up with an intuitive, easy-to-use interface, or were they deliberately seeking to obscure the issue in the hope that some of their lower-quality imaging modes would be selected for performance comparisons? The information Anand quoted from the nVidia "guide," which instructed him on how to "approximate" the image quality of the ATI products by using certain nVidia driver settings, did not seem at all correct--and it seemed to me that Anand reached the same conclusion. This would tend to steer the possibilities toward concealment--or at least a deliberate effort at confusion.

Or it may simply be that nVidia's driver team isn't any good with GUIs... *chuckle*... Not knowing is a big part of the problem--we shouldn't even have to have such conversations, really.
 