GeForce FX & UT2003: "Performance" Texturing A

OK, here's a couple of normal images.

This is the Antidetector image, which shows what looks to be full Trilinear when the mip colours are on:
anti_1_no.jpg


This is an image from the drivers as they come, with 'Quality' filtering in use:
norm_1_no.jpg


You can see that the mipmap transitions are much more gradual all over the AntiDetector image, as you would expect with full trilinear. With the normal image the base textures don't show up the mipmap transitions that much, because there is more progression between the levels - however, it is noticeable on the detail textures.

Here's a couple of highlighted mip transitions:
highlight.jpg


There are two things to consider here. First off, the detail textures are just that: detailed, and hence high resolution (in comparison to the default textures) - not properly filtering through all levels will save a lot of performance (this is like the difference between 'Application' and 'Aggressive' on the pre-44 drivers).

The other point is that NVIDIA do not appear to be rendering what they are telling people they are rendering. Their review guide actively promotes downloading SamX's texture filtering app from here to check out the filtering levels on the FX series - with SamX's app this shows full mip-map transitions when "Quality" Trilinear is enabled, however here we see that this just isn't happening in the game. This also means that there effectively is no "Application" mode anymore, at least in UT2003 - this is the best filtering that can be had, regardless of driver settings.
 
Do said cheats exist in OpenGL as well as D3D? Or, Dave, if you're busy, I'll try when I get home :p
 
Hm, all right. Might try OGL, D3D, and D3D AntiDetector and compare the IQ in all of them.
 
:oops: Well... this IMHO looks like they have some kind of clamp operation on the mipmap blend. They only seem to do the correct trilinear blend when the mipmap blend is close to 50/50 - when the blend moves away from this most obvious area they ignore the blend and assume 100/0 or 0/100 blends; in other words, they drop from trilinear to bilinear. This swap-over point from mipmap blend to single map is very clear in the colored mipmap levels and it's also visible as a cutline in the normal screenshots.

As you know, the latest NVIDIA and ATI hardware no longer supports single-cycle trilinear; in other words trilinear does not come "for free" - it's processed over 2 clocks, meaning that fillrate performance is potentially halved. By using this clamp operation they can drop down to bilinear on a lot of pixels and hence save fillrate...
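The clamp Kristof describes can be sketched in a few lines of Python. The band half-width here is an assumed illustrative value, not a measured driver constant, and the function names are hypothetical:

```python
def trilinear_weight(frac, band=0.25):
    """Blend weight between two adjacent mip levels.

    frac is the fractional part of the computed LOD (0..1).
    Full trilinear would use frac directly; the clamped variant
    described above snaps to 0 or 1 (a single bilinear fetch)
    whenever frac is outside a narrow band around the 50/50 point,
    saving the second mip level's bilinear fetch on those pixels.
    band is the assumed half-width of the blended region.
    """
    lo, hi = 0.5 - band, 0.5 + band
    if frac < lo:
        return 0.0          # only the higher-resolution mip is sampled
    if frac > hi:
        return 1.0          # only the lower-resolution mip is sampled
    # rescale the remaining band back to 0..1 so the blend still
    # reaches both endpoints at the edges of the band
    return (frac - lo) / (hi - lo)

def filtered_texel(upper, lower, frac, band=0.25):
    """Mix the two bilinear samples using the (clamped) weight."""
    w = trilinear_weight(frac, band)
    return upper * (1.0 - w) + lower * w
```

This also matches the screenshots: the colored-mip transition band shrinks to the narrow region where the blend is near 50/50, with hard bilinear cutlines on either side.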

When using this mode I don't think the driver would pass the WHQL tests ...

K-
 
Kristof said:
:oops: Well... this IMHO looks like they
"They" being NVIDIA.
have some kind of clamp operation on the mipmap blend. They only seem to do the correct trilinear blend when the mipmap blend is close to 50/50 - when the blend moves away from this most obvious area they ignore the blend and assume 100/0 or 0/100 blends; in other words, they drop from trilinear to bilinear. This swap-over point from mipmap blend to single map is very clear in the colored mipmap levels and it's also visible as a cutline in the normal screenshots.

As you know, the latest NVIDIA and ATI hardware no longer supports single-cycle trilinear; in other words trilinear does not come "for free" - it's processed over 2 clocks, meaning that fillrate performance is potentially halved.
When was trilinear ever free on ATI hardware? I recall that the original GeForce supposedly had free trilinear, but that was the only NVIDIA chip that did, I believe.
By using this clamp operation they can drop down to bilinear on a lot of pixels and hence save fillrate...
"They" being NVIDIA.
 
As far as I know, nVidia and ATI never had free trilinear; the only company that I know had it was S3, in their Savage3D and Savage4 cards.
 
FUDie said:
"They" being NVIDIA.

If you say so :LOL:

Hmm, could have sworn that at some point trilinear was free; maybe I am just confusing it with the ability to use the second TMU, when free, to do the trilinear blend in a single cycle - obviously this would fail when doing dual texturing.

Actually, KYRO also had a single-cycle trilinear mode, but only for DXT-compressed textures, where all samples were taken from a single map and combined on chip to generate the lower mip level. So effectively only one level is fetched, but the lower level is auto-generated by the hardware (which means it does not show up when using colored mipmap levels, but it does work quite well assuming the mipmap levels are sufficiently similar).
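The on-chip mip synthesis being described amounts to a box filter over the decoded texels. A minimal Python illustration (the real hardware works on DXT blocks in flight; this only shows the averaging step):

```python
def synth_lower_mip(texels):
    """Approximate the next-lower mip level by box-filtering 2x2
    blocks of the current level. texels is a 2D list with even
    dimensions; each output texel is the average of a 2x2 block.
    """
    h, w = len(texels), len(texels[0])
    return [[(texels[y][x] + texels[y][x + 1] +
              texels[y + 1][x] + texels[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]
```

Because the lower level is synthesized from the upper one rather than fetched, a colored-mipmap test would never see it, exactly as noted above.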

K-
 
DaveBaumann said:
Bambers - is that via application filtering or control panel selection?

The higher numbers come from 'Quality' AF selected via the Control Panel; the lower scores are from the in-game selection via ut2003.ini.
 
Kristof said:
FUDie said:
"They" being NVIDIA.

If you say so :LOL:

Hmm, could have sworn that at some point Trilinear was free[...]

It was.

As samX pointed out to me over at the German 3DCenter forum, the original GeForce256 and its DDR sibling both had single-cycle trilinear TMUs (one on each of its four pipelines).
 
thanks dave... pics are greatly appreciated... I can see what you mean about the details...

nice work...

any chance to post comparative pictures for ATI cards?
 
This is probably old news, but the ATI Catalyst drivers seem to do the same thing. This is evident even without using the antidetector. Set UseTrilinear=True and LevelOfAnisotropy=8 in the UT2003.ini file. Force AF 8X Quality in the ATI control panel and then run UT2003 with firstcoloredmip 1. You'll notice that the image has sharp transitions and appears to be bilinear filtering. Now set AF to Application Preference in the control panel and run UT2003 again. Filtering appears to be the expected trilinear now.
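For reference, the ini lines being toggled look like this; the section header is what the stock UT2003.ini uses for the Direct3D renderer, so treat it as an assumption if your install differs:

```
[D3DDrv.D3DRenderDevice]
UseTrilinear=True
LevelOfAnisotropy=8
```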

This has been discussed on the Rage3d forum before.
 
ATI 9800 PRO 256MB Results

OK, here are some similar shots from a 9800 PRO.

First up, this is just plain Trilinear with everything at the default settings:
9800_norm.jpg


Same again, but with colored mip levels:
9800_norm_col.jpg


This is Trilinear with Antidetector installed:
9800_anti.jpg


Again, with colored mips:
9800_anti_col.jpg


The performance with and without Antidetector is:

Code:
       Normal Anti
Flyby  120.98 120.98
Bot    69.39  69.31

So, IMO this leaves 3 options:

1.) Antidetector wasn’t installed properly
2.) ATI has detection for UT2003 but Anti doesn’t find them
3.) ATI has no detection for UT2003.

I hope that it isn't 1, since the script says it worked and I copied the altered .dll back into the system32 directory; however, I welcome anyone verifying things. [Edit] OK, I can rule this out - I've just run GT4 in 3DMark2001SE and there were differences in performance between the old and new .dll, hence Antidetect was installed correctly.

There have also been a few questions on ATI's AF implementation.

ATI has been relatively upfront (when people ask - they didn't exactly advertise this) that the "Quality" forced AF in the control panel is a mixture of Trilinear and Bilinear - only one texture layer gets full trilinear, the rest are Bilinear. ATI's view is that the control panel is only there for legacy apps anyway, and the 'Application' setting should be used for new apps, as full control is given to the developer when it's used.
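The per-stage policy being described can be sketched as follows; the function name and string labels are hypothetical, purely to illustrate the stage-0-only trilinear behaviour (the real logic lives inside ATI's driver):

```python
def control_panel_af_filters(num_stages):
    """Return the mip filter applied to each texture stage when
    'Quality' AF is forced from the control panel, per the
    behaviour described above: only the first texture layer gets
    full trilinear, while the remaining stages (e.g. UT2003's
    detail textures) fall back to bilinear.
    """
    return ['trilinear' if stage == 0 else 'bilinear'
            for stage in range(num_stages)]
```

With the 'Application' setting, by contrast, every stage simply keeps whatever filter the game requested.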

First, here's a similar shot with 8X AF forced through the control panel:
9800_control_8xdefaultlod.jpg


Likewise, here's an 8X AF shot but with application settings (altered in ut2003.ini file):
9800_app_8xdefaultlod.jpg


As you can see, in the Control Panel AF shot Trilinear is there on the default textures (the larger bar running in line with the gun), however the detail texture mip levels only show Bilinear - this is because there is another texture used in conjunction with this that does have Trilinear. When AF is used via the application, everything is Trilinear, as ATI's drivers do not force any options over the application settings.

You may have noticed, though, that the detail textures have a lower LOD. This would appear to be because the application has a default of 0.8000 set in it, and this changes what ATI's Aniso is doing. Here is what happens if you change the Detail Texture LOD to -0.5000 (same as Default):

http://www.beyond3d.com/images/reviews/nv35u/utanti/9800_app_8x-0.5lod.jpg

It appears to be the case that if you let the application select things with ATI's drivers, it will do as the application requests.
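On the LOD point: a mip LOD bias is simply added to the computed LOD before mip selection, so the application's 0.8000 detail-texture setting pushes those textures toward blurrier mips than the -0.5000 default would. A minimal sketch (the clamping to the valid mip range is an assumption of this illustration, not a statement about ATI's hardware):

```python
def effective_mip_level(base_lod, lod_bias, num_levels):
    """Add the application's LOD bias to the computed LOD, then
    clamp to the valid mip range [0, num_levels - 1]. A positive
    bias selects lower-resolution (blurrier) mips, a negative
    bias sharper ones.
    """
    lod = base_lod + lod_bias
    return min(max(lod, 0.0), num_levels - 1.0)
```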
 
Re: ATI 9800 PRO 256MB Results

DaveBaumann said:
ATI has been relatively upfront (when people ask - they didn't exactly advertise this) that the "Quality" forced AF in the control panel is a mixture of Trilinear and Bilinear - only one texture layer gets full trilinear, the rest are Bilinear. ATI's view is that the control panel is only there for legacy apps anyway, and the 'Application' setting should be used for new apps, as full control is given to the developer when it's used.

It's not a very good decision IMO. Just take UT2k3-engine games, for instance (a very popular engine). You can't set aniso through the game menu in most of them, so if you want aniso you have to force it, and unfortunately you get mipmap banding (because of ATI's aggressive optimization), very noticeable in most cases (Postal 2, U2, UT2k3). Of course you can enable aniso manually in the ini file, but it's not very comfortable, and there are games where you can't even enable aniso through a config file and are left with lower-quality aniso than is actually possible. ATI should give me a choice here: let me choose to force their optimized quality aniso (as it is since Cat 3.1) or full-quality aniso with trilinear no matter what (as it was in the pre-3.1 Cats).

There are so many games now where I get mipmap banding even with quality aniso forced.
 
Dave (or Unwinder), what exactly is the AntiDetector doing in UT2003?

Would renaming ut2003.exe have the same effect? Do you see the same observations re: NV35 if you renamed ut2003.exe to something else? I'd test this myself with an NV31, but I'd just replaced it with an R300 and I'm just too lazy to switch again; plus I'm not sure if similar things happen regardless of which NV3x board is used.
 
Sorry to barge in here, but would it be possible for someone to do a run of B3D's oilrig Splinter Cell demo with and without the anti-detector patch with a gfFX card?
 
So... the most important thing is that, basically, what all this means is that it's not true when NVIDIA say that their drivers' "Quality" Image Setting equates to Trilinear if Trilinear is set in-game - but only insofar as our study of UT2003 is concerned... right? That basically, as far as UT2003 is concerned, there really is no "As Per Application Setting" filtering, correct?

Sorry if I'm a little dense... up almost all night working on my research
 