GeForce FX & UT2003: "Performance" Texturing A

Hmmm. I think this discussion is becoming a bit convoluted and starting to hypothesize down (possibly) the wrong track.

This was discussed in a number of threads and it came down to a very strange behavior in drivers with anisotropic filtering, as well as with UT2003.

What was discovered in other threads was the following:
1) If anisotropic filtering is specified in drivers (NVIDIA or ATI), bilinear filtering seems to be the result.
2) If drivers are left to "Application Default" and anisotropic filtering is used in-game, proper trilinear application is the result.

ATI came forward with a POSSIBLE reason for this in UT2003 (though not necessarily seen in other games): the way multitexturing is used. With multiple texture layers, when anisotropic filtering is forced in the drivers, only texture 0 gets anisotropy applied with trilinear filtering, while the other texture layers (usually lightmaps, overlays, etc.) get bilinear. If anisotropic filtering is controlled by the application instead, the behavior is whatever the application defines. The same behavior was noted with NVIDIA drivers/hardware in UT2003.
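To make that concrete, here's a rough sketch of what the driver-forced path would amount to if this description is right. This is purely illustrative pseudo-driver code in Direct3D 9 terms; the function name and the stage split are my assumptions based on the behavior described, not anything published by ATI or NVIDIA:

```cpp
#include <d3d9.h>

// Illustrative only: what "forced AF" would effectively do to sampler state
// if the driver assumes stage 0 is the base texture and everything else is
// lightmaps/overlays that can live with bilinear.
void ForceAnisotropicSketch(IDirect3DDevice9* device, DWORD maxAniso, DWORD stageCount)
{
    for (DWORD stage = 0; stage < stageCount; ++stage) {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAniso);
        // Only the assumed "base" stage gets linear mip filtering (trilinear);
        // all later stages get point mip filtering (bilinear).
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER,
                                stage == 0 ? D3DTEXF_LINEAR : D3DTEXF_POINT);
    }
}
```

If UT2003's "real" base texture lives on a later stage, an assumption like this is exactly what would leave it bilinear.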

So the (possibly incorrect, but possibly correct) assumption was: UT2003 wasn't necessarily using texture 0 as the base/bitmap "real" texture for its levels... and this could ALSO change based on the in-game texture quality settings, to some degree.

I always theorized that in UT2003, the difference between texture detail settings from, say, medium to cranked max was this: max actually textured the scene the same way, then simply overlaid a more detailed texture on the single closest mipmap (almost like how racing sims do for the entire road, but for a single mipmap in this case). As the "base" texture is then just a different layer, the topmost layer falls back to bilinear, since it's no longer texture 0 but a later layer.

This is a totally basic, non-technical account, so it's not entirely accurate, but rather a higher-level description of the behavior noted in other discussions concerning the visual appearance of higher-detail texture settings in UT2003 with driver-forced anisotropic filtering. The whole "texture 0" thing was an interesting find... and makes sense from an implementation point of view. After all, why would you want to calculate anisotropy for additional layers 2/3/4 holding lightmaps, overlays and the like when bilinear will do just fine?
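For anyone who wants the layering idea in concrete terms, here's a minimal sketch of classic fixed-function detail texturing in Direct3D 9. This is my guess at the general technique, not Epic's actual code, and the function and variable names are made up:

```cpp
#include <d3d9.h>

// Hypothetical detail-texture setup: stage 0 = the normal base texture,
// stage 1 = a high-frequency detail texture (tiled much more densely via
// its own UVs) that only becomes visible up close.
void SetupDetailLayerSketch(IDirect3DDevice9* device,
                            IDirect3DTexture9* baseTexture,
                            IDirect3DTexture9* detailTexture)
{
    device->SetTexture(0, baseTexture);
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);

    device->SetTexture(1, detailTexture);
    // ADDSIGNED overlays the detail layer's deviation from mid-grey onto the
    // base, a common way to add fine detail without darkening the scene.
    device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_ADDSIGNED);
    device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    device->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
}
```

Under the forced-AF behavior described above, whichever of these layers isn't on stage 0 would be the one stuck at bilinear.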
 
Let me expand a little deeper, as re-reading my last post, I don't know if what I've discovered is clear, nor the methods I've used to derive this hypothesis concerning "Sweeney Code and You." :)

Back when Unreal came out, its texture detail slider had a neat effect you didn't see in a lot of games. Between its maxed and second-from-maxed settings, the ONLY difference appeared if you walked right up and looked at a wall or floor texture from right on top of it. Suddenly, the closest mipmap was "painted" with an extremely detailed, high-resolution texture. The effect was a bit jarring: you could see the first mipmap boundary quite clearly, because that was where the "extreme" detail mipmap and the normal scene's second mipmap overlapped. The trick was most definitely done with multitexturing rather than a whole new texture carrying an extremely detailed first mipmap level.

The reason it was discovered to be a multitexturing "trick" was that the first few sets of 3dfx drivers had a pretty big (and documented) multitexturing bug: in ALL games that used multitexturing, only the base texture worked. For Unreal, this meant there was absolutely no difference between playing with the texture slider at maxed or second from maxed. All other games showed the bug in the form of missing lightmaps and other overlays, or missing "frosted peaks" on grass and ground textures that used multitexturing/layering. Once 3dfx fixed this, Unreal returned to its slightly "jarring" but very cool repaint-on-the-fly mode of having the closest mipmap be some extremely detailed texture.

Given this history, it is natural to assume that UT2003 uses the same kind of technique, where the super-detailed closest mipmap is simply a layer 2/3 overlay on the closest textures, albeit much more subtle and smooth than the same technique in Unreal. You can also "see" the repaint in a couple of maps, and if you have anisotropic filtering enabled in the drivers, the boundary of the first mipmap is plain as day. In one or two maps, if you look at the ground and walk, you can also see the effect in the form of details completely missing between the closest mipmap and the next one. The forest DM level walkways are a good example: there are little circular detail patterns if you look straight down at your feet, but if you look ahead at the next mipmap, they simply are not there. Normally, one would figure this is just a texture with very custom mipmap levels (with a big variance in detail between them)... but I'd bet my bottom dollar that if you could disable mipmapping in UT2003, you'd see a different texture in place of the high-definition close mipmap that is simply overlaid on the normal, medium-detail scene.

I could be wrong, but all my visual run-throughs completely support this theory, and the old 3dfx issue with Unreal also strengthens the case that this is a trick Epic uses to provide that extremely cool, detailed close-level texturing, without having to go in and hand-author a third or fourth set of textures with custom mipmap levels, and without the rendering overhead of brute-force mipmapping such a high-resolution texture just to produce further mipmap approximations of a super-detailed, close-only layer.
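(For what it's worth, in Direct3D 9 terms such a "disable mipmapping" test is just one sampler state away, if the game exposed it; `device` here is an assumed, already-initialized device:)

```cpp
// Hypothetical test: disable mip selection entirely so only the top-level
// texture image is sampled, revealing any layered "detail" texture trickery.
device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);
```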
 
Sharkfood said:
2) If drivers are left to "Application Default" and anisotropic filtering is used in-game, proper trilinear application is the result.
That was only the case for ATI.

-FUDie
 
Reverend said:
So... the most important thing is that basically what all this means is that it's not true when NVIDIA say that their drivers' "Quality" Image Setting equates to Trilinear if Trilinear is set in-game, but only insofar as our study of UT2003 is concerned... right? That basically, as far as UT2003 is concerned, there really is no "As Per Application Setting" filtering, correct?

Sorry if I'm a little dense... up almost all night working on my research

That's what I get out of it. In fact, I was going to ask a similar question in the last post on the previous page, but I found the answer I was looking for :)

The important thing to mention is that it is not just when AF is enabled that we are not getting full trilinear.
 
That was only the case for ATI.

So, when exactly did ATI start manufacturing my VisionTek Geforce4 Ti4600 with Detonator drivers?

It displays the exact same behavior in UT2003. It's also visible on the first page of this thread, in the screenshots of the 5900U.
 
SF: It seems to me like you are misunderstanding what he said. I read it as: only with ATI does the application setting result in trilinear, whereas with NV it seemingly does not.
 
Sharkfood said:
That was only the case for ATI.
So, when exactly did ATI start manufacturing my VisionTek Geforce4 Ti4600 with Detonator drivers?
That's funny, I didn't see any screenshots posted by you.
It displays the exact same behavior in UT2003. It's also visible on the first page of this thread, in the screenshots of the 5900U.
No, it's not.
Reverend said:
NV31 with 44.03 drivers using "Quality" Image Settings with No AA nor AF

nv31_4403.jpg


R300 with 7.90 drivers using "Custom" Image Settings ("High Quality" Texture Preference and Mipmap Detail Level), with No AA nor AF

r300_790.jpg
Note that the NV31 is not doing trilinear with default settings but the R300 is.

The other shots are with AA and AF enabled in the driver, which is not the default.

-FUDie
 
Reverend said:
So... the most important thing is that basically what all this means is that it's not true when NVIDIA say that their drivers' "Quality" Image Setting equates to Trilinear if Trilinear is set in-game, but only insofar as our study of UT2003 is concerned... right? That basically, as far as UT2003 is concerned, there really is no "As Per Application Setting" filtering, correct?

Yes. WRT UT2003 (and whatever other benchmarks out there we haven't investigated), there is no real, correct Application Trilinear filtering; there is only what NVIDIA dictate users can use. You'll note that the quality sliders in the control panel do still make some changes, but these mostly alter the level of the Bi/Trilinear mix on the default textures, as NVIDIA have deemed that users of their boards should not need correct Trilinear filtering on detail textures, regardless of whether they ask for it or not.

ATI does what the application tells it to do if you don't use the driver control panel.
 
DaveBaumann said:
ATI does what the application tells it to do if you don't use the driver control panel.

I think in most cases, people will use the driver's control panel to set the AF level.
 
cho said:
DaveBaumann said:
ATI does what the application tells it to do if you don't use the driver control panel.

I think in most cases, people will use the driver's control panel to set the AF level.

Agreed. I know I do... Or rather did, until I learned this. I'm surprised this wasn't public knowledge before now; I wouldn't have realised it if it weren't for this thread.
 
cho said:
DaveBaumann said:
ATI does what the application tells it to do if you don't use the driver control panel.

I think in most cases, people will use the driver's control panel to set the AF level.

That's because most current games do not give you the option to set AF in-game, so leaving the control panel on "application preference" wouldn't get you any AF at all. As more games arrive that correctly detect the capabilities of your card, I expect to see more in-game options that let you set the amount of AA/AF without having to mess with the control panel.

This is preferable anyway, as it allows you to set your levels of AA/AF on a per-game basis, rather than one global setting for every D3D or OGL application.
 
No, it's not.

Actually it is, but we are talking apples vs. oranges with the examples you have provided. :) No disagreement there with no AA/no AF, and I don't have a GF FX to compare with.

I stipulated specifically the behavior seen when anisotropic filtering is enabled (i.e. not the no AA/no AF examples), where the display suddenly seems to default to bilinear filtering + AF. It happens because forcing anisotropic filtering does NOT automatically apply trilinear to all texture layers; instead, the driver assumes the "base" scene textures will always be in texture 0, with additional layers assumed to be lightmaps/overlays and the like, and therefore defaults them to bilinear filtering. Both ATI and NVIDIA drivers make this assumption, and UT2003 (for whatever reason, be it my theory above about detail textures or something else) puts the "base" scene texture a layer or two up.

Back on your topic though: on my GF4 Ti4600 (and this is the reason I didn't notice it), with the 43.45 drivers the "Quality" setting does not exhibit the same behavior the 44.01s do in Dave's example. No AA/no AF actually does show smooth, trilinear gradient lines with "Quality" mode selected + colored mipmaps + no aniso (neither in the drivers nor the .ini).

I'll install the 44.01s later this weekend and see if the GF4 also reproduces these unusual, bilinear-like smaller gradients in "Quality" mode.
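For reference, the colored-mipmap test itself is easy to reproduce outside any game. Here's a minimal Direct3D 9 sketch (my own construction, not the driver's or UT2003's colored-mip mode) that fills each mip level with a distinct solid color, so bilinear shows hard mip bands and trilinear shows smooth gradients:

```cpp
#include <windows.h>
#include <d3d9.h>

// Create a 256x256 texture whose mip levels are distinct solid colors.
IDirect3DTexture9* CreateColoredMipTexture(IDirect3DDevice9* device)
{
    IDirect3DTexture9* tex = NULL;
    if (FAILED(device->CreateTexture(256, 256, 0, 0, D3DFMT_A8R8G8B8,
                                     D3DPOOL_MANAGED, &tex, NULL)))
        return NULL;

    static const D3DCOLOR colors[] = {
        0xFFFF0000, 0xFF00FF00, 0xFF0000FF, 0xFFFFFF00, 0xFFFF00FF,
        0xFF00FFFF, 0xFFFFFFFF, 0xFF808080, 0xFF000000
    };
    const DWORD numColors = sizeof(colors) / sizeof(colors[0]);

    for (DWORD level = 0; level < tex->GetLevelCount(); ++level) {
        D3DLOCKED_RECT lr;
        if (FAILED(tex->LockRect(level, &lr, NULL, 0)))
            break;
        DWORD size = 256u >> level;
        if (size == 0) size = 1;
        D3DCOLOR c = colors[level % numColors];
        for (DWORD y = 0; y < size; ++y) {
            DWORD* row = (DWORD*)((BYTE*)lr.pBits + y * lr.Pitch);
            for (DWORD x = 0; x < size; ++x)
                row[x] = c;   // one solid color per mip level
        }
        tex->UnlockRect(level);
    }
    return tex;
}
```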
 
You should note UT2003's closing splash screen with NVIDIA's logo and slogan:

"The way it's meant to be played!" ;)
 
Hanners said:
cho said:
DaveBaumann said:
ATI does what the application tells it to do if you don't use the driver control panel.

I think in most cases, people will use the driver's control panel to set the AF level.

Agreed. I know I do... Or rather did, until I learned this. I'm surprised this wasn't public knowledge before now; I wouldn't have realised it if it weren't for this thread.

This behavior has been established for quite a while, IIRC, and I think it also relates to what ExtremeTech discussed in their follow-up article to the 3DMark03 controversy (or, more particularly, their addendum to the follow-up). It does appear to be something done for benchmarking, in response to the bi/tri and aniso label futzing on the GF FX, but it does seem to behave according to what the application asks for, which makes the issue a matter of reviewer education (at least as far as ATI is concerned, and for games that support specifying trilinear and aniso behavior).

I do hope the mentioned aniso control option improvements implement some of my past suggestions (which would tend to indirectly result in a way for "non-benchmark" trilinear filtering to be offered intuitively from the control panel). By the sounds of it, it might even go quite beyond what I'd asked for (I'm quite curious about what they're doing). OTOH, I've never gotten any direct indication that my particular feedback is a factor in what they do, so it might just be Something Completely Different. :p
 
I do hope the mentioned aniso control option improvements implement some of my past suggestions (which would tend to indirectly result in a way for "non-benchmark" trilinear filtering to be offered intuitively from the control panel).

I didn't catch your past suggestions, but I could theorize a future anisotropic filtering option that also allows user selection of which texture stage/layer receives the trilinear treatment.

This has absolutely no tie to "benchmark trilinear" versus non-benchmark, but it would be a good option for users to have at their disposal for any future titles that wind up with bilinear filtering of base textures under control-panel-forced anisotropic, i.e. for applications that don't have a built-in setting.
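Purely hypothetical, but the option could be as simple as exposing a per-stage mask in whatever path the driver uses for forced AF (every name below is invented for illustration):

```cpp
#include <d3d9.h>

// Hypothetical control-panel knob: a bitmask selecting which texture stages
// get trilinear when AF is forced, instead of hardwiring "stage 0 only".
void ForceAnisoWithStageMask(IDirect3DDevice9* device, DWORD maxAniso,
                             DWORD stageCount, DWORD trilinearStageMask)
{
    for (DWORD stage = 0; stage < stageCount; ++stage) {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAniso);
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER,
                                ((trilinearStageMask >> stage) & 1)
                                    ? D3DTEXF_LINEAR : D3DTEXF_POINT);
    }
}
```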
 
Sharkfood said:
I didn't catch your past suggestions, but I could theorize a future anisotropic filtering option that also allows user selection of which texture stage/layer receives the trilinear treatment.
I doubt you'll see this. With PS 2.0, for example, you can have up to 16 textures at once... information overload for the end-user.
 
OpenGL guy said:
Sharkfood said:
I didn't catch your past suggestions, but I could theorize a future anisotropic filtering option that also allows user selection of which texture stage/layer receives the trilinear treatment.
I doubt you'll see this. With PS 2.0, for example, you can have up to 16 textures at once... information overload for the end-user.
Agreed. Frankly, I really hope general anisotropic settings will disappear from the control panel over time, remaining only for legacy applications. The developers know for sure what textures need what degree of filtering--it's their job, after all, to define what the game should look like in the end. Hopefully they make the right choices, mind . . . ;)

93,
-Sascha.rb
 
nggalai said:
The developers know for sure what textures need what degree of filtering--it's their job, after all, to define what the game should look like in the end.

Actually it's the hardware's job to find out what degree of filtering is needed where. You just set the maximum level to use.

Except for special cases like shadow maps, where aniso would cause bugs, the only reason to turn it off is performance.

That's why it should be a detail option in every game, and it should be on in high detail settings.
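In API terms that's already how it works: the application sets only a cap, and the hardware decides per-pixel how much anisotropy each sample actually needs. A minimal Direct3D 9 sketch of turning it on for one stage:

```cpp
#include <d3d9.h>

// The app only caps the level; the hardware picks the actual degree of
// anisotropy per pixel, up to this maximum.
void EnableAniso(IDirect3DDevice9* device, DWORD stage, DWORD requested)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);
    DWORD level = requested < caps.MaxAnisotropy ? requested : caps.MaxAnisotropy;

    device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR); // keep trilinear
    device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, level);
}
```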
 
Well, obviously, an option for the programmer to specify a *percentage* of the level in the driver panel would be an awful lot more efficient IMO, with an optional minimum and maximum (which the driver could change anyway if the hardware didn't support it: trying to do 4x AF on a Parhelia wouldn't work too well, and neither would 16x on a GF FX...)

I mean, a percentage for every texture stage and so on, to fix things like shadow maps, yet not force 8x AF like Doom 3 does in HQ mode.
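As a back-of-the-envelope sketch of that scheme (all names hypothetical; nothing like this exists in any driver):

```cpp
// Hypothetical: the game requests a fraction of the user's control-panel AF
// level, bounded by optional min/max values and by what the hardware can do.
unsigned EffectiveAniso(unsigned panelLevel, float percentage,
                        unsigned minLevel, unsigned maxLevel, unsigned hwMax)
{
    unsigned level = (unsigned)(panelLevel * percentage + 0.5f);
    if (level < minLevel) level = minLevel;
    if (level > maxLevel) level = maxLevel;
    if (level > hwMax)    level = hwMax;  // e.g. a Parhelia or GF FX hardware cap
    return level;
}
```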


Uttar
 