The truth about UT2003 and nVidia

Demirug

Veteran
I have done some work for the German website 3dcenter.de on the behavior of the nVidia drivers.

Here are the results.

The optimization depends on the settings in the control panel.

Using Quality:

- Texture stage/sampler 0 uses the AF level requested by the application or the panel.
- Texture stages/samplers 1-7 use the AF level requested by the application or the panel, but never more than 2xAF.
- If the application requests a trilinear filter, it only gets a partial trilinear filter (see image below).

Using Performance:

- Texture stage/sampler 0 uses the AF level requested by the application or the panel.
- Texture stages/samplers 1-7 use the AF level requested by the application or the panel, but never more than 2xAF.
- If the application requests a trilinear filter, it only gets a bilinear filter.

Using High Performance:

- Texture stage/sampler 0 uses the AF level requested by the application or the panel. The default "High Performance" AF filter is used.
- Texture stages/samplers 1-7 get no AF.
- If the application requests a trilinear filter, it only gets a bilinear filter.
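
To make the behavior concrete, here is a minimal model of the per-stage clamping described in the three lists above. This is my own sketch, not actual driver code; every name in it is made up for illustration.

Code:
// Hypothetical model of the per-stage AF clamping described above.
// Not actual driver code -- all names are made up for illustration.
#include <algorithm>
#include <cstdio>

enum PanelSetting { Quality, Performance, HighPerformance };

// AF level the driver actually grants a texture stage, given the level
// the application (or panel) requested. 1 means no AF (isotropic).
int GrantedAfLevel(PanelSetting panel, int stage, int requestedAf)
{
    if (stage == 0)
        return requestedAf;              // stage 0 always gets the full request
    if (panel == HighPerformance)
        return 1;                        // stages 1-7: no AF at all
    return std::min(requestedAf, 2);     // stages 1-7: capped at 2xAF
}

int main()
{
    for (int stage = 0; stage < 8; ++stage)
        std::printf("Quality, 8xAF requested, stage %d -> %dxAF\n",
                    stage, GrantedAfLevel(Quality, stage, 8));
}

(The trilinear reduction -- partial trilinear under Quality, plain bilinear under Performance -- is a separate mechanism; see the filter sketch further down the thread.)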

The images below show how much trilinear filtering is used when the driver detects UT2003. The light-green color shows how many pixels use two mipmaps.


UT2003 (Quality) vs. Quality

UT2003AFCompare1.jpg


Top Left: UT2003 Stage 0 8xAF tri
Top Right: normal Stage 0-7 8xAF tri

Bottom Left: UT2003 Stage 1-7 8xAF
Bottom Right: normal Stage 0-7 2xAF tri

UT2003 (Quality) vs. Performance

UT2003AFCompare2.jpg


Top Left: UT2003 Stage 0 8xAF tri
Top Right: normal Stage 0-7 8xAF tri

Bottom Left: UT2003 Stage 1-7 8xAF
Bottom Right: normal Stage 0-7 2xAF tri
 
Are there any situations in-game where you would want to have anything but stage 0 to use high levels of anisotropic filtering? If not, then this seems like a perfectly-valid performance setting, though the game itself should be applying the settings.
 
Chalnoth said:
Are there any situations in-game where you would want to have anything but stage 0 to use high levels of anisotropic filtering? If not, then this seems like a perfectly-valid performance setting, though the game itself should be applying the settings.
Detail textures, perhaps? Lightmaps? Normal-maps?

IMHO, it is the application that should decide what AF treatment a texture should receive. If the developer didn't want a texture to receive AF they would have coded it that way. Of course, this doesn't really apply in cases where the application has no option for AF -- but it certainly does in games like UT2003, where the application does implement such an option.
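
As a sketch of the per-texture control Ostsol describes, assuming a Direct3D 9 style renderer; which stage holds which kind of map is purely hypothetical.

Code:
// Sketch of per-texture-stage filter control, in Direct3D 9 terms.
// Which stage holds which kind of map is a made-up example.
#include <d3d9.h>

void ApplyTailoredFiltering(IDirect3DDevice9* dev)
{
    // Stage 0: base texture -- full 8xAF with trilinear.
    dev->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);
    dev->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);

    // Stage 1: low-resolution lightmap -- plain trilinear is plenty.
    dev->SetSamplerState(1, D3DSAMP_MAXANISOTROPY, 1);
    dev->SetSamplerState(1, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    dev->SetSamplerState(1, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
}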
 
Chalnoth said:
Are there any situations in-game where you would want to have anything but stage 0 to use high levels of anisotropic filtering? If not, then this seems like a perfectly-valid performance setting, though the game itself should be applying the settings.

Sure, this is a good performance setting for UT2003, but with full trilinear 8xAF on all stages you still get better image quality. My problem is that nVidia does not allow me to get the best possible image quality even if I want it.
 
Ostsol said:
Chalnoth said:
Are there any situations in-game where you would want to have anything but stage 0 to use high levels of anisotropic filtering? If not, then this seems like a perfectly-valid performance setting, though the game itself should be applying the settings.
Detail textures, perhaps? Lightmaps? Normal-maps?

IMHO, it is the application that should decide what AF treatment a texture should receive. If the developer didn't want a texture to receive AF they would have coded it that way. Of course, this doesn't really apply in cases where the application has no option for AF -- but it certainly does in games like UT2003, where the application does implement such an option.

Yes, the designer should know how much AF a texture needs. But most games use AF as a global setting, and UT2003 is such a game: each texture gets the same level. This is one of the main reasons you lose so much performance when you activate AF.
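
For illustration, a global AF option typically boils down to something like this on the application side. This is a sketch in Direct3D 9 terms; UT2003's actual renderer may differ in detail.

Code:
// Sketch: a single global AF option applied identically to every stage.
// Direct3D 9 style; UT2003's actual renderer may differ.
#include <d3d9.h>

void ApplyGlobalAf(IDirect3DDevice9* dev, DWORD afLevel)
{
    for (DWORD stage = 0; stage < 8; ++stage)
    {
        // Same level everywhere -- no per-texture judgment, which is
        // why enabling AF globally costs so much performance.
        dev->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, afLevel);
        dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
}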
 
Ostsol said:
Detail textures, perhaps? Lightmaps? Normal-maps?
Why would any of these need anisotropic filtering? Detail textures (are they still used in UT2k3?) are only for looking at surfaces close-up anyway. Lightmaps are typically low-resolution, and it would be a waste of time to attempt to filter them anisotropically. Normal maps don't lend themselves well to traditional texture filtering, and so anisotropic filtering should never be applied to them.

But yes, while it is the developer who should select the anisotropic settings for each texture stage, no developer has yet bothered to optimize their games for anisotropic filtering.
 
Chalnoth said:
Ostsol said:
Detail textures, perhaps? Lightmaps? Normal-maps?
Why would any of these need anisotropic filtering? Detail textures (are they still used in UT2k3?) are only for looking at surfaces close-up anyway. Lightmaps are typically low-resolution, and it would be a waste of time to attempt to filter them anisotropically. Normal maps don't lend themselves well to traditional texture filtering, and so anisotropic filtering should never be applied to them.

But yes, while it is the developer who should select the anisotropic settings for each texture stage, no developer has yet bothered to optimize their games for anisotropic filtering.
Detail textures give "normal" textures the appearance of being higher resolution than they are. As such, the sudden loss of detail caused by bilinear filtering can be just as apparent as with a bilinear-filtered, high-resolution texture. The same goes for anisotropic filtering. I think the same really applies to lightmaps, too, which are definitely of a much higher resolution than they used to be. I'm not sure about normal maps, since I haven't gotten my hands on any games that use them yet. :)

I concede the point to Demirug regarding how aniso is applied. If it is just applied globally by the app to all textures, the implementation is simply bad. However, I cannot say it is any worse than applying it to only one texture unit/stage. Doing so only benefits performance while compromising quality.

In any case, the user should always have the option to do as (s)he pleases regarding quality settings. To be disallowed by the driver from using the maximum quality allowed by the game is quite unacceptable. This is especially true for developers.
 
Chalnoth said:
Ostsol said:
Detail textures, perhaps? Lightmaps? Normal-maps?
Why would any of these need anisotropic filtering? Detail textures (are they still used in UT2k3?) are only for looking at surfaces close-up anyway. Lightmaps are typically low-resolution, and it would be a waste of time to attempt to filter them anisotropically. Normal maps don't lend themselves well to traditional texture filtering, and so anisotropic filtering should never be applied to them.

But yes, while it is the developer who should select the anisotropic settings for each texture stage, no developer has yet bothered to optimize their games for anisotropic filtering.

Take a look at the desert shot here: http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index3.php

As is clearly noticeable, there is a difference. Full trilinear looks quite a bit better than what the driver is doing.
 
Full trilinear looks quite a bit better than what the driver is doing.

Yes it does; even on my R300, full trilinear with 16xAF looks much better than 16xAF with one stage trilinear and the rest using bilinear. If I'm not mistaken, there are 4 texture stages.
 
You guys should look at the ATI pages in that article too, some interesting information about ATI's AF optimizations there...

edit: just realized that some of this has already been covered in the 3D Graphics Companies and Industry forums
 
Gollum said:
You guys should look at the ATI pages in that article too, some interesting information about ATI's AF optimizations there...

I don't think there was any new information there.
 
StealthHawk said:
Take a look at the desert shot here: http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index3.php

As is clearly noticeable, there is a difference. Full trilinear looks quite a bit better than what the driver is doing.

There's a large difference with 8xAF, where the driver is messing with the LOD. That has nothing to do with full trilinear or not.

There's also a difference with no AF--which is the difference due only to the use of mixed bi/trilinear instead of full trilinear. But it is very subtle, to the point of being not noticeable, at least in still shots. (That is to say, I can see a difference when I flip from one pic to the other, but I doubt I could see the difference when looking at the bi/trilinear pic in isolation.)

I could notice the mixed bi/trilinear effect in the still shots Wavey posted, but it's pretty darn difficult.
 
Hehe, yeah, I only realized that a lot of this was discussed in a different forum, added an edit. Still, I wasn't aware that the control panel quality setting on ATI cards only gives trilinear on stage 0 and only does bilinear on the rest, before reading that article and the threads on the graphics companies forum, that is. I personally would like the option of full quality when I ask for it, and not only when an application asks for it, so although I've been very happy with my 9700, this has me a bit annoyed. Looking at the plain facts so far, it looks like Nvidia's driver panel is closer to delivering on that promise than ATI's, except for UT2003 of course. Doesn't really matter, I don't get to do much gaming right now anyway, so I'm sitting all this driver quality and cheating shit out and hoping for "honest" drivers from both ATI and Nvidia around the time HL2 comes about... ;)
 
DaveBaumann said:
You know, it would be nice if people read some of the reviews around here...

http://www.beyond3d.com/reviews/ati/9800_256/index.php?p=13

:!: ;)
LOL! Sorry Dave, but I only recently got myself a 9700, so I'm not reading every page of every new review these days as I'm not in the market for a new card for a while. Having said that, I actually read about half of that particular review, but skipped over the AA & AF pages as I've read dozens of those here at B3D covering R300 before and didn't really expect a whole lot of new information. Seems I was mistaken, my bad... ;)
 
Dave H said:
There's a large difference with 8xAF, where the driver is messing with the LOD. That has nothing to do with full trilinear or not.

The driver never changes the default LOD bias when it detects UT2003.exe. The maximum AF level you can get for stages 1-7 changes depending on the panel setting: 2xAF for Quality and Performance, no AF for High Performance. The first stage still supports 8xAF.

Dave H said:
There's also a difference with no AF--which is the difference due only to the use of mixed bi/trilinear instead of full trilinear. But it is very subtle, to the point of being not noticeable, at least in still shots. (That is to say, I can see a difference when I flip from one pic to the other, but I doubt I could see the difference when looking at the bi/trilinear pic in isolation.)

I could notice the mixed bi/trilinear effect in the still shots Wavey posted, but it's pretty darn difficult.

I do not like the name "mixed bi/trilinear" for the filter that is used in UT2003 and in the Performance/High Performance panel settings with other games. In any case, the chip always calculates the multiplication factor for both mipmaps. The difference from normal trilinear is that the chip does not sample from a mipmap if the factor is below a definable value. The value must be definable because the driver uses different values for UT2003 stage 0, UT2003 stages 1-7, and for the Performance setting with games other than UT2003. You can see this in the shots above. That is why I prefer the name "configurable clamped trilinear".

You are right that it is difficult to see in shots, because on the first stage the clamp value that deactivates sampling from a mipmap is not very high.
 
Demirug said:
Dave H said:
There's a large difference with 8xAF, where the driver is messing with the LOD. That has nothing to do with full trilinear or not.

The driver never changes the default LOD bias when it detects UT2003.exe. The maximum AF level you can get for stages 1-7 changes depending on the panel setting: 2xAF for Quality and Performance, no AF for High Performance. The first stage still supports 8xAF.

Ok, fine; it's doing 2xAF instead of 8x AF. This does, of course, change the LOD of the mipmaps used. Point is, it has nothing to do with, er, configurable clamped trilinear.

Dave H said:
There's also a difference with no AF--which is the difference due only to the use of mixed bi/trilinear instead of full trilinear. But it is very subtle, to the point of being not noticeable, at least in still shots. (That is to say, I can see a difference when I flip from one pic to the other, but I doubt I could see the difference when looking at the bi/trilinear pic in isolation.)

I could notice the mixed bi/trilinear effect in the still shots Wavey posted, but it's pretty darn difficult.

I do not like the name "mixed bi/trilinear" for the filter that is used in UT2003 and in the Performance/High Performance panel settings with other games. In any case, the chip always calculates the multiplication factor for both mipmaps.

I'm not sure what you mean by "the multiplication factor", but if you mean what I think you mean (texture coordinates), I rather doubt this is true. Remember, modern chips can't do trilinear in one cycle. Only using one mipmap should ideally save you fillrate as well as bandwidth. That means saving all the calculations associated with the second mipmap...

The difference from normal trilinear is that the chip does not sample from a mipmap if the factor is below a definable value. The value must be definable because the driver uses different values for UT2003 stage 0, UT2003 stages 1-7, and for the Performance setting with games other than UT2003. You can see this in the shots above. That is why I prefer the name "configurable clamped trilinear".

Still not clear on why "mixed bi/trilinear" doesn't work for ya'. In any case, both are too long for a permanent name for this (which we will eventually need). I suggest "brilinear". Or perhaps "lielinear".

You are right that it is difficult to see in shots, because on the first stage the clamp value that deactivates sampling from a mipmap is not very high.

I thought there were detail textures in use in that screenshot too. Might I suggest instead that it's difficult to see because it's difficult to see? (The lack of full trilinear. Using 2xAF for 8xAF is of course very easy to see.)
 
Dave H said:
Ok, fine; it's doing 2xAF instead of 8x AF. This does, of course, change the LOD of the mipmaps used. Point is, it has nothing to do with, er, configurable clamped trilinear.

OK, my fault. I believed you were talking about the LOD bias.

Dave H said:
I'm not sure what you mean by "the multiplication-factor", but if you mean what I think you mean (texture coordinates), I rather doubt this is true. Remember, modern chips can't do trilinear in one cycle. Only using one mipmap should ideally save you fillrate as well as bandwidth. That means saving all the calculations associated with the second mipmap...

I am not talking about texture coordinates. I am talking about the linear blend factor that is used between the samples from the two mipmaps. Only if this factor is in a defined range does the chip use both mipmaps.

Sure, it saves bandwidth and fillrate. If it did not, nobody would implement such an optimization.
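
To make "configurable clamped trilinear" concrete, here is a minimal sketch of the per-pixel decision as described above. This is my own model; the names and the example threshold are assumptions, not actual hardware values.

Code:
// Sketch of "configurable clamped trilinear": normal trilinear, except
// the mipmap blend factor is clamped to 0 or 1 when it falls outside a
// configurable band, so only one mipmap is sampled there (= bilinear).
// All names and the example threshold are illustrative, not driver code.
#include <cstdio>

struct Color { float r, g, b; };

// Stand-in for a bilinear fetch from one mipmap level.
Color SampleMip(int level) { return { static_cast<float>(level), 0.0f, 0.0f }; }

Color ClampedTrilinear(int finerMip, float frac, float threshold)
{
    // frac: linear blend factor between adjacent mipmaps
    // (0 = entirely the finer map, 1 = entirely the coarser map).
    if (frac < threshold)        return SampleMip(finerMip);     // skip coarser map
    if (frac > 1.0f - threshold) return SampleMip(finerMip + 1); // skip finer map

    // Inside the band, blend exactly as normal trilinear would.
    Color a = SampleMip(finerMip);
    Color b = SampleMip(finerMip + 1);
    return { a.r + frac * (b.r - a.r),
             a.g + frac * (b.g - a.g),
             a.b + frac * (b.b - a.b) };
}

int main()
{
    // threshold = 0 gives full trilinear; larger values widen the
    // bilinear-only zone, which is what the UT2003 path appears to do.
    Color c = ClampedTrilinear(3, 0.1f, 0.25f);
    std::printf("%f %f %f\n", c.r, c.g, c.b);
}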

Dave H said:
Still not clear on why "mixed bi/trilinear" doesn't work for ya'. In any case, both are too long for a permanent name for this (which we will eventually need). I suggest "brilinear". Or perhaps "lielinear".

Yes, we need a short name for this kind of filter.

Dave H said:
I thought there were detail textures in use in that screenshot too. Might I suggest instead that it's difficult to see because it's difficult to see? (The lack of full trilinear. Using 2xAF for 8xAF is of course very easy to see.)

I was only talking about the trilinear filtering.
 
Gollum said:
Still, I wasn't aware that the control panel quality setting on ATI cards only gives trilinear on stage 0 and only does bilinear on the rest, before reading that article and the threads on the graphics companies forum, that is. I personally would like the option of full quality when I ask for it, and not only when an application asks for it, so although I've been very happy with my 9700, this has me a bit annoyed....

Couple of misapprehensions here... First of all, in most games treating the 1st texture stage with trilinear is all you need to do, since treating the other stages would impact performance with zero impact on IQ. UT2K3 is the exception rather than the rule because of the somewhat unique way in which the game puts down textures. I have absolutely no problem with ATi or nVidia treating whatever texture stages they wish, provided the appropriate IQ is achieved in the particular game.

Second, when the term "the application asks for it" is used, this does not mean "only the application has control" over the feature's implementation. What it means is that the end-user (you or me) sets the kind of IQ we want to see *within the game* and then the game itself instructs the driver to deliver it. Now, how is it "easier" theoretically to set IQ options from the control panel rather than from within each game? It isn't.

In practice, though, it's become easier to use the IHV control panel because software developers are skimping on providing easy-to-use IQ settings from within the games they are making. Some games provide no interface for setting IQ at all--so the only thing that *can* be done in these cases is to force settings through the 3d-card driver cpanel. The difference between the Cats and the Dets relative to full trilinear in UT2K3 is that you can get it with the Cats and you can't with the Dets. Whether you get it through the cpanel or the game's own settings is irrelevant, IMO. The point is whether you can get it or not.

Now, what if ATi put an option into the cpanel to force trilinear on more than one texture stage? IMO, most people would run with this option always enabled because of the misapprehension that "filtering all of the texture stages must be better than just filtering one," even though that surely isn't the case. And so while an application like UT2K3 could then receive full trilinear without fussing over the UT2K3.ini settings to enable trilinear, *none of your other* 3D games would look any better at all, but they'd all run appreciably *slower* than they did before.

It's very much like the principle of pixel occlusion in a rendered scene. What is the purpose of rendering pixels that are invisible in a frame because they are behind other pixels along the z-axis? There is no purpose to it, and various occlusion techniques improve frame-rate performance because a number of these invisible pixels are simply not rendered at all. Likewise with treating texture stages. If a game engine lays down its textures such that only the first texture stage is visible in the scene, what is the purpose of treating more than one texture stage--regardless of the kind of filtering you are talking about? There is none from an IQ standpoint, and doing so creates wasted work which negatively impacts performance.
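
As a toy model of the occlusion principle above (purely illustrative; the names and structure are made up, and not tied to any particular driver or chip):

Code:
// Toy model of the occlusion point above: a fragment that fails the
// depth test is skipped before any expensive filtering work is done.
#include <vector>

struct Fragment { int x, y; float z; };

// Stand-in for the expensive per-pixel work (texturing, filtering, ...).
float ShadePixel(const Fragment& f) { return f.z * 0.5f; }

void RasterizeWithEarlyZ(const std::vector<Fragment>& frags,
                         std::vector<float>& depthBuf,
                         std::vector<float>& colorBuf, int width)
{
    for (const Fragment& f : frags)
    {
        const int idx = f.y * width + f.x;
        if (f.z >= depthBuf[idx])
            continue;                  // hidden pixel: no work spent at all
        depthBuf[idx] = f.z;
        colorBuf[idx] = ShadePixel(f); // only visible pixels pay the cost
    }
}

int main()
{
    std::vector<float> depth(16, 1.0f), color(16, 0.0f);
    // Second fragment lands behind the first and is never shaded.
    RasterizeWithEarlyZ({{1, 1, 0.4f}, {1, 1, 0.9f}}, depth, color, 4);
}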

This is exactly why the superior method of setting up IQ is to set it up in the application as opposed to forcing it through the cpanel. The application knows what textures stages need treatment and can properly instruct the drivers to deliver the appropriate treatments. All driver cpanel controls are dumb and set to force a pre-set which is supposed to impact all 3D games. Problem is all 3D games are not alike and so such dumb setups do not always work.

Bottom line is that the driver guys aren't idiots. Lots of things they do have specific and well-justified reasons behind them.
 