More Texture Filtering

Myrmecophagavir said:
WaltC said:
I think the bi-tri mix is perfectly OK for any IHV when it's restricted to a "performance" setting in which the end user is aware he's running in a performance IQ mode.
Does that include ATI's "Quality" AF control panel setting, which only does trilinear filtering on the first texture stage?
WaltC said:
The thing about Det behavior in UT2K3 that I dislike is the fact that even when the application calls for full trilinear the Dets don't give it--but provide a "performance blend" instead.
So the same could be said for the Cats...

But ATI have never hidden this or lied about it. It only happens when quality mode is forced from the control panel. When UT2K3 asks for trilinear directly, it gets trilinear from the ATI card.

With NV3x, when UT2K3 directly requests trilinear, it gets a bilinear hack which Nvidia is calling quality. This only happens under the new drivers for UT2K3, so that Nvidia can trumpet an increase in performance without telling anyone that the gain comes from that one single game only, because they quietly hack down the filtering quality.

Like the original 3DMark cheat, Nvidia hacks out a faster shader that uses lower precision and produces inferior quality in order to keep the speed up, while ATI reorders instructions to make a faster shader without compromising image quality.

The two companies and their respect for their customers are worlds apart. :rolleyes:
 
DaveBaumann said:
I can't find it enabled under R300/R350 but RV350 does have this under DirectX (of course, there was also a less advanced form in R200).


What about OpenGL? My 8500 seemed to be a bit sneaky with regard to that.

If a game had colour mipmaps on, the transitions would look all smooth; if they were off (at the default 'Quality' texture preference setting for OpenGL), the mipmap bands could be seen. It seemed to be doing a bi/tri mix similar to the ones in D3D, but more extreme than the High Performance setting of D3D.

I could confirm this in both the OpenGL AF test and Q3.
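The bi/tri mix being described can be sketched as follows. This is an illustrative model only, not vendor code: full trilinear always blends between the two nearest mip levels, while a "brilinear" mix only blends inside a narrow band around each mip transition (the `band` width here is an assumed value, not a measured driver parameter). With colored mipmaps, the regions where the weight snaps to 0 or 1 show up as hard bands instead of smooth gradients.

```python
def trilinear_weight(lod):
    # Full trilinear: always blend linearly between the two nearest
    # mip levels; the blend factor is the fractional part of the LOD.
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    # Hypothetical bi/tri mix: only blend within a narrow band around
    # each mip transition; outside it, snap to pure bilinear on one
    # level (weight 0.0 or 1.0), which colored mipmaps expose as
    # visible banding.
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0                      # pure bilinear, nearer level
    if frac > hi:
        return 1.0                      # pure bilinear, farther level
    return (frac - lo) / (hi - lo)      # short trilinear ramp
```

Shrinking `band` toward zero degenerates to pure bilinear; widening it to 1.0 recovers full trilinear, which is why the mix can be tuned anywhere between the two.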
 
Myrmecophagavir said:
Does that include ATI's "Quality" AF control panel setting, which only does trilinear filtering on the first texture stage?

Certainly, because the preferred method of control is through the application, and the ATi drivers provide full trilinear in UT2K3 when it's asked for by the game. People often forget that the control panel settings for these features are there to force them in applications which don't allow them to be set from within. Ideally, all 3D games would let you set these things from within them and the control panel would never have to be set to anything other than "Application Preference."


So the same could be said for the Cats...
Nope, when Trilinear is set from within the game you get full trilinear with the Cats--can't get full trilinear with these Dets in UT2K3 from either the control panel or from within the game. That's the problem.
 
WaltC said:
Nope, when Trilinear is set from within the game you get full trilinear with the Cats--can't get full trilinear with these Dets in UT2K3 from either the control panel or from within the game. That's the problem.
No, no; it's not a problem....it's a feature... :rolleyes: ;)
 
I knew it, guys. It was an R3xx chip. :D

Dave, do I get a cookie mailed to me or something?

Anyway Dave, where in the living blazes did you find the registry entries for the setting?
 
digitalwanderer said:
WaltC said:
Nope, when Trilinear is set from within the game you get full trilinear with the Cats--can't get full trilinear with these Dets in UT2K3 from either the control panel or from within the game. That's the problem.
No, no; it's not a problem....it's a feature... :rolleyes: ;)


uh-oh, now Microsoft is gonna sue Nvidia for patent infringement! ;)
 
One member of Muropaketti (http://www.muropaketti.com) tested UT2K3 with an FX5800U, a GF4 and a Radeon 9700 and posted some screenshots. His claim is that the 9700 isn't doing real trilinear filtering even when the filtering method is set from within UT2K3. Is he right, or is there some setting in UT2003.ini which could cause this kind of behaviour? Here are the screenshots:

1. FX5800U without antidetector:
2. FX5800U with the script
3. GF4
4. Radeon 9700's trilinear

On a side note, how can one set UT2K3 to use trilinear filtering? Through the game settings or via ut2003.ini?
 
Miksu, on my 9700 Pro I did get a difference in IQ, as I have previously stated, versus Quality forced mode in the CP.

You must go into the UT ini file and set AF to 16.
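For reference, the relevant entries live in the renderer section of the ini file. This is a sketch from memory; the section and key names below (particularly `UseTrilinear` and `LevelOfAnisotropy`) are assumptions and may differ slightly between UT2003 builds, so check your own ini before editing:

```
[D3DDrv.D3DRenderDevice]
UseTrilinear=True
LevelOfAnisotropy=16
```

Back up the ini first, and make the change with the game closed, since UT2K3 rewrites the file on exit.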
 
That's a ridiculous comparison, isn't it?
The 9700 screenshot is from another viewing angle relative to the floor than the GFs. Of course the result will look a little different, but it's still trilinear.
 
DaveBaumann said:
Brent - in your upcoming 9600 review will you be investigating the texture quality slider under D3D to see if you can get a similar UT2003 quality to NVIDIA and then benchmarking them that way?

yes
 
Brent said:
DaveBaumann said:
Brent - in your upcoming 9600 review will you be investigating the texture quality slider under D3D to see if you can get a similar UT2003 quality to NVIDIA and then benchmarking them that way?

yes

Can you give me the registry keys?
 
K.I.L.E.R said:
Brent said:
DaveBaumann said:
Brent - in your upcoming 9600 review will you be investigating the texture quality slider under D3D to see if you can get a similar UT2003 quality to NVIDIA and then benchmarking them that way?

yes

Can you give me the registry keys?

There are no registry settings to mess with.

Just pull the Texture Preference slider down from High Quality to Quality, Performance or High Performance. So far it looks like High Performance matches closest to what NVIDIA is doing.
 