Excellent article on NVidia and ATi Optimisations

g__day

http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index2_e.php

Now in English - what an interesting read; the graphs show the difference before and after the optimisations are removed.

Conclusion

The following can now be summarized:

In 3DMark2001 and 3DMark03, nVidia uses application-specific optimizations in drivers 44.03 to 44.90, which bring in approx. 16 percent in the first benchmark and approx. 62 percent (!) in the second under 8x anisotropic filtering. Whether these application-specific optimizations come at a cost in image quality was not tested, since application-specific optimizations have no business in theoretical benchmarks in the first place.

nVidia further uses an application-specific optimization for Unreal Tournament 2003 in drivers 44.03 to 44.90, which brings a performance advantage of approx. 57 percent (!) in the flybys under 8x anisotropic filtering. Determining exactly how much of these 57% was gained through loss of image quality and how much with identical screen output is beyond our possibilities and must remain an open question. It was clearly shown, however, that nVidia uses at least a pseudo-trilinear filter specifically under Unreal Tournament 2003, shifts some MipMap transitions to the rear under the anisotropic filter, and in some cases does not apply the full anisotropic filter at all. Further, it could be proven that this is not the normal state of the nVidia drivers, since they filter cleanly trilinear-anisotropic in all other applications. Thus it is considered proven that the application-specific optimization for Unreal Tournament 2003 brings definitely worse image quality with it, even if the difference is quite small.

Up to driver 03.4, ATi used an application-specific optimization for 3DMark03, which brings in approx. 2 percent performance advantage in this benchmark under 16x anisotropic filtering. In drivers 03.5 and 03.6 this optimization is no longer detectable, so we can put this case to rest.

ATi uses an application-specific optimization for 3DMark2001 in drivers 03.4 to 03.6, which brings in approx. 4 percent performance advantage under 16x anisotropic filtering. Whether this application-specific optimization degrades image quality was not tested, since we consider any optimization for a synthetic benchmark an attempt to deceive the public.

ATi further used a general optimization of the anisotropic filter in drivers 03.2 to 03.6, which yields an approximately 20% performance advantage under 16x anisotropic filtering in the flyby benchmarks of Unreal Tournament 2003, while producing no noticeable effect in some of the other benchmarks and none at all in others. When filtering anisotropically, ATi filters only the base texture trilinearly; any further texture stages are filtered only bilinearly. Disadvantages in image quality apart from this bilinear/trilinear filter mixture could not be proven. Nevertheless, the question of whether this general optimization of the anisotropic filter represents an unallowed optimization in itself has to remain unanswered, since ATi's postulated "trilinear filtering" is not achieved. nVidia, by contrast, offers here - except for the shown exception in Unreal Tournament 2003 - normal trilinear anisotropy for all texture stages.
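The stage-dependent behaviour the article measured can be sketched in a few lines. This is an illustrative model of the described policy, not actual driver code; `select_filter` is a hypothetical name:

```python
# Sketch of the filter selection the article attributes to ATi's
# control-panel AF: trilinear on the base texture (stage 0),
# bilinear on all further texture stages.

def select_filter(stage: int, aniso_requested: bool) -> str:
    """Pick a minification filter per texture stage under the described scheme."""
    if not aniso_requested:
        # Without AF forced, all stages filter normally (trilinear).
        return "trilinear"
    return "trilinear" if stage == 0 else "bilinear"

# UT2K3 layers several texture stages, so stages 1+ lose the
# mip-band blending under this scheme:
filters = [select_filter(s, aniso_requested=True) for s in range(4)]
```

A game that only ever shows its base texture would look identical under this scheme, which is why the mixture went unnoticed in most titles.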

Generally, we can recommend not using 3DMark for benchmarking because of the application-specific optimizations found on the part of ATi and nVidia. And, for the future in general: keep a keen eye on the delivered image quality - especially with new drivers - and double-check your screen output rather than leaving it all up to the GPU vendors.

So we are in a very tricky situation with Unreal Tournament 2003. nVidia surely uses an unallowed optimization - it degrades static image quality only marginally, however, "only" replacing the trilinear filter with a pseudo-trilinear one. This is an unallowed application-specific optimization, but on the other hand it is also - roughly compared - the normal condition with ATi and anisotropic filtering, as they replace the trilinear filter with a bilinear/trilinear combo by default.

The situation in Unreal Tournament 2003 is thus nearly the same for nVidia and ATi under anisotropic filtering, since neither uses a fully trilinear filter. Theoretically we could leave it at noting that this specific optimization is an unallowed one at nVidia and a partially unallowed one at ATi. Ironically, both chip manufacturers have de-optimized image quality so far that they find themselves facing each other at knee-level again. nVidia has "created" an unallowed optimization which looks no worse than ATi's normal condition ;-).

This comparison, however, raises another problem: without anisotropic filtering, ATi filters normally (purely trilinear) under Unreal Tournament 2003, while nVidia uses the pseudo-trilinear filter. Other comparisons, though, show an image quality disadvantage on ATi's side: while nVidia filters cleanly trilinear under anisotropic filtering everywhere outside of Unreal Tournament 2003, the observed bilinear/trilinear mix appears at ATi in all applications. If one regards this latter comparison as generally fair, then the application-specific optimization under Unreal Tournament 2003 is no longer unallowed either, since in the end it only does the same thing.

The whole thing resembles a Gordian knot, which you can cut but not untie. Surely the simplest solution would be to declare the bilinear/trilinear anisotropic filtering of the ATi driver an unallowed optimization and use only the pure trilinear filter offered by tools like rTool. Then we would have a level playing field (with the exception of Unreal Tournament 2003). But we also have to consider that most users enable the anisotropic filter through the control panel and not through external tools like rTool or aTuner.

If one benchmarks only with such external tools, one works around the control panel - and thereby around most users. Because of that, we cannot yet say how we will benchmark in our upcoming high-end graphics card roundup. Maybe this decision will emerge from the discussion of this article.

Surely, it would be desirable if both graphics chip developers left the quality modes in their drivers untouched and built any optimizations, whether application-specific or general, into special modes instead. Nobody would likely argue against optimizations which cause hardly any visible quality loss and yield a performance advantage between 20 and 57 percent. Should we find such extra modes in future drivers, even ones usable only for individual games, we would probably bench them in addition to the pure quality mode. Only the quality modes should not use such optimizations, if both chip developers want to assume the role of standard-bearers for image quality.
 
Generally, we can recommend not using 3DMark for benchmarking because of the application-specific optimizations found on the part of ATi and nVidia.

Utter and complete silliness... By this reasoning, we should not use any application, game or synthetic, that has any specific optimization. We should pressure the IHVs to drop specific optimizations from synthetic benchmarks, but not stop using synthetic benchmarks altogether. How hard is that to grasp?
 
CorwinB said:
Utter and complete silliness... By this reasoning, we should not use any application, game or synthetic, that has any specific optimization. We should pressure the IHVs to drop specific optimizations from synthetic benchmarks, but not stop using synthetic benchmarks altogether. How hard is that to grasp?

Err... they explicitly state "3DMark" - not "synthetic benchmarks". No need to start frothing at the mouth. 3DMark is compromised. As you say, that doesn't mean that other synthetic benchmarks necessarily are, nor that they can't be quite informative.

Entropy
 
Was it me or did they miss out the fact that if you use application-controlled Anisotropic Filtering with ATI you get full filtering?
 
DaveBaumann said:
Was it me or did they miss out the fact that if you use application-controlled Anisotropic Filtering with ATI you get full filtering?
It seems they missed it. This is what they did:
Fortunately the ATi driver leaves the user a small back door if a genuine trilinear filter is preferred. When we enabled the anisotropic filter not via ATi's control panel but through the tweak tools aTuner 1.4.32.4465 (which comes with an experimental but, in our benchmarks, nevertheless effective Radeon support) or rTool 0.9.9.6d (the previous version 0.9.9.6c wasn't able to do that!), we had no problem combining the trilinear and the anisotropic filter (compare with the original ATi screenshot on the right):
Perhaps if you sent a gentle mail pointing this out to them they would add a note about it. After all, it is not exactly obvious what you have to do to get trilinear to work, so there's no need to assume any ill will involved.

Entropy
 
Well, so am I, but it's a wide web, and the deeper into non-English-speaking regions you go... Besides, much of what is "common knowledge" here is so only because it is referred to in the forums over and over. If you don't immerse yourself in the forums, for lack of time, language skills or inclination, you'll be much more likely to miss technicalities such as this.

Entropy
 
DaveBaumann said:
Well, I'm surprised they haven't read it here if the truth be told.
We know this "application preference" setting in the driver.

But it is irrelevant for most gamers because only a handful of games let you change this setting. If you want to use AA + AF in most games, you need the control panel to enable them.
 
Well, if you knew about it then it should have been mentioned - there are games that offer application-controlled AF, and users should have the information to hand that controlling it via the application gets them the full options; i.e. if the game has it, just use the game rather than fiddling around with the control panel or any other external tools.
 
I agree that it should have been mentioned - at least you have the possibility to get full trilinear if the app supports it.
But it's true - sadly far too few games support AF ingame.
What I would like to see:

1) When not using the "application" setting: a combobox (or similar) in the control panel to adjust the number of stages for which trilinear should be applied.

2) A setting comparable to the VSync default (on/off), i.e. for example "Use trilinear aniso (at the setting specified in the control panel) unless the application specifies otherwise". Is this technically possible?
Then we would really have a "set and forget" option.
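Point 2 above amounts to a simple precedence rule. A minimal sketch of the proposed behaviour; `effective_af` is a hypothetical name, since no driver of the time exposed exactly this:

```python
from typing import Optional

def effective_af(panel_af: int, app_af: Optional[int] = None) -> int:
    """VSync-style default: apply the control-panel AF level unless the
    application specifies its own (the 'unless the application specifies
    otherwise' behaviour proposed above)."""
    return app_af if app_af is not None else panel_af

# Control panel set to 8x; a game requesting 4x wins, all others get 8x:
assert effective_af(8) == 8
assert effective_af(8, app_af=4) == 4
```

This really would be "set and forget": the panel value becomes a fallback instead of an override, so in-game sliders keep working where they exist.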

Edit: spelling errors - and I completely forgot: Congrats, a really good read!
 
3dcenter said:
that nVidia at least uses a pseudo trilinear filter particularly under Unreal Tournament 2003..
I have my own word for this, which is brilinear.

Generally, we can recommend not using 3DMark for benchmarking because of the application-specific optimizations found on the part of ATi and nVidia.
So, it is "recommended" to use every other game out there for benchmarking because they/we don't know if there are "optimizations"? Have we managed to discover if any IHV is "optimizing" (in the ways only they currently know about) in 99% of the games out there?

FM, as a developer, did Something Good. Which has led to many saying the same thing as this 3DCenter article about 3DMark03.

What about developers that keep quiet? Or media outlets that don't make similar discoveries?

What are we left with, other than to discredit/ignore applications because their developers took the time, the valour and the risk that come with standing by what they believe in - which is to reveal practices they do not agree with (practices that are taken advantage of, and gotten away with, through threats)?

Wot an incredibly silly media industry this is.
 
Exxtreme said:
We know this "application preference" setting in the driver.

But it is irrelevant for most gamers because only a handful of games let you change this setting. If you want to use AA + AF in most games, you need the control panel to enable them.


No what's irrelevant here are these comments, if I may be so bold...;)

The preferred method for instituting game settings is from within the game--not the cpanel. The cpanel override is only an affectation which has come about over the past several years to address developer shortfalls in this regard, and as such there are many settings (such as in UT2K3) which can only be set from within the UT2K3.ini--and are not adjustable from anybody's cpanel (such as LOD, for one.)

Besides, calling the function of the ATi control panel an "optimization" or cheat relative specifically to UT2K3 is completely inaccurate, since the ATi control panel employs the same texture stage treatments for all games in the same way--it does not do what it does specifically for UT2K3. The reason it doesn't quite work in UT2K3 is because of the way UT2K3 layers its textures--UT2K3 does this differently than any other game I'm familiar with. The solution is to allow the application to instruct the drivers on what texture stages to treat for full trilinear, and the ATi drivers allow this.

What nVidia is doing is completely different and there is no comparison, even indirectly. They've specifically singled out UT2K3 in their drivers not to provide full trilinear in any situation at all--whether the user commands it from their control panel OR commands it from within the game. Completely different situation. I don't think it serves any purpose to get fuzzy on what's going on here.
 
Hi,
the author just added a couple of paragraphs to address the UT2k3 thing. Translates into something like this:
Regarding ATI's anisotropic filtering optimizations, we should add that in Unreal Tournament 2003, the game we selected for our tests, these can be circumvented without resorting to third-party tools. To do so, anisotropic filtering must be disabled in ATI's control panel and set in the game's ini file instead (UT2003.ini, LevelOfAnisotropy). This approach has its caveats: first, it's common practice to control anisotropy through the control panel or tweak tools, not in games, which also lets you play all games with one AF level. For gamers wishing to play everything at 8xAF, it would be an inconvenience to turn it off in the control panel before playing Unreal Tournament 2003 and to re-enable it for other games.

Second, and more significant: ATI optimizes anisotropic filtering for all Direct3D based games. Even though there's a way around that for Unreal Tournament 2003, this won't improve the situation for the vast majority of Direct3D based games, where there simply aren't any anisotropy controls. This workaround for one game can't be regarded as a general solution for ATI anisotropy optimizations.
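For reference, the in-game route described above comes down to editing UT2003.ini. The section name below is how the stock D3D renderer settings are commonly grouped; treat it as an assumption and verify it against your own ini rather than taking it as gospel:

```ini
; UT2003.ini - request AF from the game instead of the control panel.
; Section name assumed from the stock Direct3D render device; check your file.
[D3DDrv.D3DRenderDevice]
LevelOfAnisotropy=8
```

With the control panel left at its default, this is the configuration under which the article says ATI's drivers deliver full trilinear anisotropic filtering.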

Reverend,
regarding your 3DMark concerns, I somewhat share your view of things. 3DMark isn't the source of all evil; if there's something like that, it would be the IHVs' driver departments.

But let's be pragmatic for a second. 3DMark is (ab?)used to find out which graphics card is "the best" *shudder*
For that purpose, it has lost its usefulness. That needed to be made clear, I think. You can still more or less safely use it to compare cards from the same IHV.

I'll have to check back, haven't been part of the complete translation effort, but it may be that something along these lines is in the German version and has been lost on the way.

But maybe that's just me and Leonidas (orig. author) feels differently. He's quite a strong believer in his own beliefs :)
(which are typically well founded, mind you)
 
Regarding ATI's anisotropic filtering optimizations, we should add that in Unreal Tournament 2003, the game we selected for our tests, these can be circumvented without resorting to third-party tools. To do so, anisotropic filtering must be disabled in ATI's control panel and set in the game's ini file instead (UT2003.ini, LevelOfAnisotropy). This approach has its caveats: first, it's common practice to control anisotropy through the control panel or tweak tools, not in games, which also lets you play all games with one AF level. For gamers wishing to play everything at 8xAF, it would be an inconvenience to turn it off in the control panel before playing Unreal Tournament 2003 and to re-enable it for other games.

Second, and more significant: ATI optimizes anisotropic filtering for all Direct3D based games. Even though there's a way around that for Unreal Tournament 2003, this won't improve the situation for the vast majority of Direct3D based games, where there simply aren't any anisotropy controls. This workaround for one game can't be regarded as a general solution for ATI anisotropy optimizations.

Maybe it is just me, but I find in-game settings to be much better. Firstly, you let the developer run the game as he intended, and secondly it allows you to have different profiles for each game.

For example, if a game is fill-rate bound I can turn AA up to max, but turning AF on would bring a large performance loss. Similar if memory bandwidth is the problem.

Now, if I just turn AA and AF on globally, I'll have problems playing both "hypothetical" games, while using their in-game controls I could make a different profile for each one.

I know there are scripts that act as profiles, but somehow I think it is most acceptable for each setting to be picked in-game.

Regarding ATI and nVidia and the author's stance on optimizations, I find his conclusion to be completely wrong.

With ATI you can force trilinear + aniso filtering via a registry key, or by setting the Control Panel to "application" and then selecting the appropriate setting in-game. It is Epic's fault for not having an aniso slider in their game, but fortunately changing the ini gives you full control of ATI-based cards.

With nVidia and UT2K3 the story is completely different. Not only is there no "application" setting in the drivers, there are also no reg keys (or at least nVidia doesn't want to give me any after exchanging around 10 e-mails with them) to force trilinear with anisotropic, which is a shame if you consider that they advertise FX5900 cards as an "All On" architecture. A point they emphasized in almost every e-mail exchanged.

Zvekan
 
WaltC said:
Exxtreme said:
We know this "application preference" setting in the driver.

But it is irrelevant for most gamers because only a handful of games let you change this setting. If you want to use AA + AF in most games, you need the control panel to enable them.


No what's irrelevant here are these comments, if I may be so bold...;)

The preferred method for instituting game settings is from within the game--not the cpanel. The cpanel override is only an affectation which has come about over the past several years to address developer shortfalls in this regard, and as such there are many settings (such as in UT2K3) which can only be set from within the UT2K3.ini--and are not adjustable from anybody's cpanel (such as LOD, for one.)
I know that in-game control over AA+AF should be the preferred method, but how many games do you know which allow you to control these settings? I actually know one game and one benchmark.

And tools like my rTool don't make it any better. ;)

You have absolutely no chance of enabling AA + AF in most games without the cpanel or other tools. :(
This is bad, but no one can change it.
WaltC said:
Besides, calling the function of the ATi control panel an "optimization" or cheat relative specifically to UT2K3 is completely inaccurate, since the ATi control panel employs the same texture stage treatments for all games in the same way--it does not do what it does specifically for UT2K3. The reason it doesn't quite work in UT2K3 is because of the way UT2K3 layers its textures--UT2K3 does this differently than any other game I'm familiar with. The solution is to allow the application to instruct the drivers on what texture stages to treat for full trilinear, and the ATi drivers allow this.
I think ATi and nVidia should offer new options which allow the user to change the quality level. I don't really like the angle-dependent AF in the current Radeon products. I want to have the choice between high quality, or lower quality and better performance.
WaltC said:
What nVidia is doing is completely different and there is no comparison, even indirectly. They've specifically singled out UT2K3 in their drivers not to provide full trilinear in any situation at all--whether the user commands it from their control panel OR commands it from within the game. Completely different situation. I don't think it serves any purpose to get fuzzy on what's going on here.
You're right.
 
Exxtreme said:
I know that in-game control over AA+AF should be the preferred method, but how many games do you know which allow you to control these settings? I actually know one game and one benchmark.

And tools like my rTool don't make it any better. ;)

You have absolutely no chance of enabling AA + AF in most games without the cpanel or other tools. :(
This is bad, but no one can change it.

Game developers can change it, though. Lots of games allow some degree of in-game control over various things--the UT/UT2K3-engine games, NOLF2, NWN, Q3 and its derivatives, etc.--there are actually a lot of games which provide it to varying extents. The problem is one of consistency and standards, but I don't see this changing much because all game engines are different. I do think they could standardize and all include support for FSAA and AF, certainly.

But basically, I'm just saying it's fine to use the Cpanel when a game doesn't support configuration like that, but it should be understood that to tune many games it just won't be possible to do it via the Cpanel--because the IHV's are only supporting a limited set of configuration options--and they only do that much because these options have to be forced in games which don't allow their configuration. When you go to generically forcing things in 3d games you necessarily leave the area of specifics and move toward generalities. You've got to make certain assumptions when you do it, and games which deviate from those assumptions won't respond properly from the control panel (like Splinter Cell, UT2K3, etc.) But if developers start implementing common controls for the features commonly seen in IHV control panels, then the end user can set specifics from within the game and such problems recede, and the Cpanel controls would become redundant and, depending on the game, inferior to in-game control.

I think ATi and nVidia should offer new options which allow the user to change the quality level. I don't really like the angle-dependent AF in the current Radeon products. I want to have the choice between high quality, or lower quality and better performance.

I think the idea of angle-dependency is only that planes viewed from certain angles won't show the benefits of AF, so why treat them with AF... right? I think the best criticism of it is that it's not 100% efficient--but I think the idea has merit. I mean, if we think it's advantageous not to render pixels in a frame which are occluded by other pixels from the vantage of the camera, what's wrong with not treating textures with AF when viewed at angles at which the AF isn't visible?
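That rationale can be illustrated with the textbook footprint estimate: anisotropic filtering only takes extra samples when a pixel's footprint in texture space is elongated, and surface orientation determines that elongation. A rough, hypothetical sketch of the ideal calculation--not any vendor's hardware logic; the R300-era criticism was precisely that the hardware's approximation of this ratio deviates at certain angles:

```python
import math

def aniso_degree(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Estimate the anisotropy degree from texture-space derivatives:
    the ratio of the longer to the shorter axis of the pixel footprint.
    Textbook estimate, not any specific GPU's formula."""
    len_x = math.hypot(dudx, dvdx)   # footprint extent along screen x
    len_y = math.hypot(dudy, dvdy)   # footprint extent along screen y
    longer = max(len_x, len_y)
    shorter = max(min(len_x, len_y), 1e-6)  # guard against degenerate footprints
    return min(max_aniso, longer / shorter)

# A face-on surface has a square footprint -> ratio ~1, no AF needed:
face_on = aniso_degree(1.0, 0.0, 0.0, 1.0)   # -> 1.0
# A surface tilted away stretches the footprint -> higher degree:
tilted = aniso_degree(1.0, 0.0, 0.0, 8.0)    # -> 8.0
```

So skipping AF where the ratio is ~1 loses nothing; the complaint about angle-dependent implementations is that they also under-filter surfaces where the ratio is genuinely large.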

But anyway...I was under the assumption that the Dets, and I certainly know it's true with the Cats, already offer "performance" and "quality" options via their control panels. What new options did you have in mind?

Let's imagine that IHVs start allowing for selection in the control panel of which texture stages are treated for AF, Trilinear, etc. I can see a big negative there when people start selecting all texture stages for treatment without realizing that in most games you only see one texture layer, the last one laid down relative to the camera (as noted, UT2K3 is an exception.) In most cases the users who don't know what in-game controls are and so rely on the Cpanel also won't have a clue as to which texture stages they need to select for which games so that they can get the IQ they want without paying a performance penalty. Instead, they'll just select for all of the stages to be treated, imagining this improves IQ, when the only thing it will do is lower performance. Besides, most developers never release that kind of info to the general public--which wouldn't know what to do with it if it had it...;)

I think it's far better, then, that developers provide for in-game control so that when a user selects "trilinear filtering" in a game the *game* can tell the drivers which stages to treat--which would also make things much simpler for the end user. In fact, this is exactly the situation we see in UT2K3 with the Cats. That way you always get the texture treatment you need and the maximum performance possible. You know as well as I that should an IHV start putting texture-stage treatment selections in its Cpanel that most people will automatically choose to have all of them treated--which in no case that I know of will increase IQ but will always drop performance--possibly substantially. And also--no matter what you do in the Cpanel there will always be game-engine controls which are specific to individual games--not just like LOD, but think about things like POV in UT2K3. If you've got to go into the game to set things like that--why not just set everything there in the first place?

Shoot, if developers had been doing their part in all of this, all games would be configurable internally and the only setting an IHV would need to provide in the cpanel would be Application Preference. I think we need to light fires under the developers--the IHVs have for the most part done a very good job of taking up the slack they've left for the last few years. I'd really like to see developers start implementing standards for in-game controls, and eventually see the cpanel overrides become "legacy" stuff that falls away.
 
Exxtreme said:
I think ATi and nVidia should offer new options which allow the user to change the quality level.
Huh? There are plenty of options now, which people are already misunderstanding. If people can't figure out what's offered now what chance do they have of understanding more choices?
 
OpenGL guy said:
Exxtreme said:
I think ATi and nVidia should offer new options which allow the user to change the quality level.
Huh? There are plenty of options now, which people are already misunderstanding. If people can't figure out what's offered now what chance do they have of understanding more choices?

You're right, but a simple no-compromises checkbox could be great ;) Especially with high-end boards!
 
Tridam said:
You're right, but a simple no-compromises checkbox could be great ;) Especially with high-end boards!

So how would such a checkbox guarantee the "no-compromises" you are talking about (not to mention that such a phrase could be interpreted to mean lots of different things)?

Right now, if you select the "full trilinear checkbox" (speaking figuratively) in the Detonator cpanel for UT2K3, or you select the "full trilinear checkbox" within UT2K3 itself, the Detonators still don't give it to you. What then?
 