Excellent article on NVidia and ATi Optimisations

WaltC said:
But basically, I'm just saying it's fine to use the Cpanel when a game doesn't support configuration like that, but it should be understood that to tune many games it just won't be possible to do it via the Cpanel--because the IHV's are only supporting a limited set of configuration options--and they only do that much because these options have to be forced in games which don't allow their configuration.
That's fine. However, the problem the article addendum tries to address is that you have to disable any control panel settings to get the in-game settings to actually work. There's some fuzzy interaction going on here that can't be ironed out as long as there are "application specific optimizations" in the driver. You get more quality if you select less in the CPanel. How many end users are supposed to understand that?
Heck, speaking of Tom's in the other thread, how many reviewers can be reasonably expected to get it? "They" primarily screw the press, and by instrumenting the press "they" screw the public. It has never been doubted that those in the know can get more information and more control. Doesn't help Joe Sixpack at all, unfortunately.

WaltC said:
Exxtreme said:
I think ATi and Nvidia should offer new options which allow the user to change the quality level. I don't really like this angle-dependent AF in the current Radeon products. I want to have the choice between high quality, and lower quality with better performance.

I think the idea of angle-dependency is only that planes viewed from certain angles won't show the benefits of AF, so why treat them with AF... Right? I think the best criticism of it is that it's not 100% efficient--but I think the idea has merit. I mean, if we think it's advantageous not to render pixels in a frame which are occluded by other pixels from the vantage of the camera, what's wrong with not treating textures with AF when they're viewed at angles at which the AF isn't visible?
I believe he was referring to the flower shape of the AF mipmap selection, as visualized in SamX's AF-Tester. In a perfect world, this should be a circle (NV comes quite close with the right driver settings).
The other type of angular dependency optimization is common to all cards offering AF at all, AFAIK, and can't be turned off--turning it off would be useless, just like you already stated.
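
For reference, the ideal, angle-independent anisotropy computation goes roughly like this (this is essentially what the OpenGL EXT_texture_filter_anisotropic spec describes; treat the notation as mine, not the hardware's):

\[
P_x = \sqrt{\left(\frac{\partial u}{\partial x}\right)^2 + \left(\frac{\partial v}{\partial x}\right)^2},
\qquad
P_y = \sqrt{\left(\frac{\partial u}{\partial y}\right)^2 + \left(\frac{\partial v}{\partial y}\right)^2}
\]
\[
N = \min\left(\left\lceil \frac{\max(P_x,\,P_y)}{\min(P_x,\,P_y)} \right\rceil,\ \mathit{maxAniso}\right)
\]

N is the number of samples taken along the line of anisotropy. As far as I can tell, hardware that approximates Px and Py with something cheaper than the full square roots is where the "flower" comes from: the approximation is exact at some rotations around the view axis and poor at others, so the selected degree of AF fluctuates with angle.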

WaltC said:
But anyway...I was under the assumption that the Dets, and I certainly know it's true with the Cats, already offer "performance" and "quality" options via their control panels. What new options did you have in mind?
If I may answer for Exxtreme (we've briefly discussed this):
ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
 
WaltC said:
Tridam said:
You're right, but a simple no-compromises checkbox could be great ;) Especially with high-end boards!

So how would such a checkbox guarantee the "no-compromises" you are talking about (not to mention that such a phrase could be interpreted to mean lots of different things)?
By no-compromises I mean full quality, of course.

The checkbox would be implemented by the IHV to offer a full quality option.

WaltC said:
Right now, if you select the "full trilinear checkbox" (speaking figuratively) in the Detonator cpanel for UT2K3, or you select the "full trilinear checkbox" within UT2K3 itself, the Detonators still don't give it to you. What then?
With a full quality checkbox, that couldn't happen.
 
zeckensack said:
That's fine. However, the problem the article addendum tries to address is that you have to disable any control panel settings to get the in-game settings to actually work. There's some fuzzy interaction going on here that can't be ironed out as long as there are "application specific optimizations" in the driver. You get more quality if you select less in the CPanel. How many end users are supposed to understand that?

How do you figure that setting "Application Preference" equates to "less" quality? What that setting means to me is that whatever quality you get is determined by the application you're running, and the settings you make relative to quality within the application. The idea behind AP is that you don't need the control panel to force IQ in the application, because you're using the application to set the quality levels independently of the cpanel. The drivers are set to disable the cpanel automatically when AP is selected, expressly so there won't be a conflict.

Were all games properly configurable internally, you'd never need to budge off the AP setting in the cpanel.

Heck, speaking of Tom's in the other thread, how many reviewers can be reasonably expected to get it? "They" primarily screw the press, and by instrumenting the press "they" screw the public. It has never been doubted that those in the know can get more information and more control. Doesn't help Joe Sixpack at all, unfortunately.

Of course, these debates are constrained to people who understand them. Joe 6-pack isn't going to care about what goes on in the Cpanel, or in the application, or both...;) But if Joe hangs in there he'll eventually learn the basics.

I believe he was referring to the flower shape of the AF mipmap selection, as visualized in SamX's AF-Tester. In a perfect world, this should be a circle (NV comes quite close with the right driver settings).
The other type of angular dependency optimization is common to all cards offering AF at all, AFAIK, and can't be turned off--turning it off would be useless, just like you already stated.

Ok, in a "perfect world" one might also say that all of the pixels in a frame should be rendered regardless of whether they are visible in a frame....;) One man's "perfect world" is another man's "imperfect world," I guess.

ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.

OK, I really don't understand what you're saying. The issue relative to UT2K3 is not that the Cat Cpanel setting doesn't provide trilinear, but that it applies TF to a single texture stage--which in almost all other games is sufficient, since that one texture stage is all that is rendered visible by the camera. IE, in other games, the textures are laid so that the visible texture stage is the same one treated with trilinear--no point in treating the other stages, since they underlie the visible stage and can't be seen--hence treating those stages would not improve IQ relative to TF, but would negatively impact performance.

But in UT2K3, you have more than one texture stage exposed to the camera, and so you get the standard TF forced for the usual stage, which is visible, but you also see another stage relative to detail textures, which only receives bilinear treatment--but this only happens in UT2K3 as far as I know. So you go into the UT2K3.ini and turn trilinear on, set the Cpanel to application preference, and the game engine then instructs the driver as to which texture stages need trilinear in order for full trilinear to take place, inclusive of detail textures--and you then get full trilinear in UT2K3 with the current Cats.

So it's not a question of turning TF on, but rather of which texture stages receive TF prior to rendering. Hence, a registry setting for "full trilinear" would not be required--and indeed is not required to get full trilinear in UT2K3 with the Cats.
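
For reference--and I'm going from memory here, so treat the exact key names as approximate--the relevant UT2K3.ini entries sit in the renderer section and look something like this:

    [D3DDrv.D3DRenderDevice]
    UseTrilinear=True
    LevelOfAnisotropy=8

With those set, and the Cpanel on application preference, it's the game rather than the driver that decides which stages get trilinear.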

That's the way I understand it. Of course if this is incorrect, I'd like to know it--but it's what I've gathered is true in this case from the evidence presented.
 
Tridam said:
With a full quality checkbox, that couldn't happen.

Nvidia's control panel does not need a single extra button. All they have to do is to remove an application-specific drop in quality.
 
zeckensack said:
WaltC said:
But basically, I'm just saying it's fine to use the Cpanel when a game doesn't support configuration like that, but it should be understood that to tune many games it just won't be possible to do it via the Cpanel--because the IHV's are only supporting a limited set of configuration options--and they only do that much because these options have to be forced in games which don't allow their configuration.
That's fine. However, the problem the article addendum tries to address is that you have to disable any control panel settings to get the in-game settings to actually work. There's some fuzzy interaction going on here that can't be ironed out as long as there are "application specific optimizations" in the driver. You get more quality if you select less in the CPanel. How many end users are supposed to understand that?
Who said "Application Preference" had to be less quality than the other settings? In fact, this could be the best quality option in many cases simply because the application (should) know what pieces of the scene need what settings.
ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
As has been already noted elsewhere, this key does not work the way people seem to think.
 
Tridam said:
With a full quality checkbox, that couldn't happen.

Can't understand your logic, there. An IHV can have any checkbox in his drivers say whatever he wants it to say, and tie any function he chooses to that checkbox, regardless of what it says. Ie, if an IHV can code his drivers to ignore a request for full trilinear, either from a game or his Cpanel (when both the cpanel and the game say "full trilinear" on the respective checkboxes), he could just as easily setup his drivers to reroute the "full quality checkbox" when selected to any function, or combination of functions, he chooses.

The only defenses an end-user has against this sort of tactic are clever and observant hardware reviewers who catch it and figure it out--and expose the fallacy. The logic is very straightforward.
 
WaltC said:
Tridam said:
With a full quality checkbox, that couldn't happen.

Can't understand your logic, there. An IHV can have any checkbox in his drivers say whatever he wants it to say, and tie any function he chooses to that checkbox, regardless of what it says. Ie, if an IHV can code his drivers to ignore a request for full trilinear, either from a game or his Cpanel (when both the cpanel and the game say "full trilinear" on the respective checkboxes), he could just as easily setup his drivers to reroute the "full quality checkbox" when selected to any function, or combination of functions, he chooses.

The only defenses an end-user has against this sort of tactic are clever and observant hardware reviewers who catch it and figure it out--and expose the fallacy. The logic is very straightforward.

Ok, I wasn't clear enough.

In fact, the checkbox I'm talking about represents the ability to disable some general optimisations (which I prefer to call compromises). By general optimisations, I mean bilinear filtering on stages >0, adaptive anisotropic filtering... things which are good in many cases, but not in ALL cases.

Of course, this checkbox would also disable app specific optimisations.

This checkbox would be implemented by the IHV to disable optimisations and offer a no-compromise mode to users. I'm not talking about benchmarks and optimisations. I'm just talking about a full quality option for people who buy a high-end card. The only goal of the checkbox would be full quality. I just wanted to ask OpenGL guy why they don't implement such a checkbox.
 
zeckensack said:
ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.

It actually only forces AF, but not trilinear.
 
WaltC said:
OK, I really don't understand what you're saying. The issue relative to UT2K3 is not that the Cat Cpanel setting doesn't provide trilinear, but that it applies TF to a single texture stage--which in almost all other games is sufficient, since that one texture stage is all that is rendered visible by the camera. IE, in other games, the textures are laid so that the visible texture stage is the same one treated with trilinear--no point in treating the other stages, since they underlie the visible stage and can't be seen--hence treating those stages would not improve IQ relative to TF, but would negatively impact performance.

I'm not sure how you and others support that conclusion. Looking at 3DCenter's tests, it seems that doing trilinear on all the texture stages barely affects performance across a multitude of games.

http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index8_e.php

Sure, you lose 5% in Max Payne and ~10% in Aquanox2 by doing trilinear in all texture stages. And 3dmark2001 gains ~10%. What about other games? The scores are pretty static. Oh, except UT2003 :?

Color me clueless, because there isn't much of a palpable gain in ATI's method. Why did they wait until Catalyst 3.2 to implement this? Isn't the reasoning that older games would have lower performance doing trilinear in all the texture stages while not increasing visual quality? But apparently this isn't the case for the majority of titles?
 
StealthHawk said:
I'm not sure how you and others support that conclusion. Looking at 3DCenter's tests, it seems that doing trilinear on all the texture stages barely affects performance across a multitude of games.

http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index8_e.php

Sure, you lose 5% in Max Payne and ~10% in Aquanox2 by doing trilinear in all texture stages. And 3dmark2001 gains ~10%. What about other games? The scores are pretty static. Oh, except UT2003 :?

Color me clueless, because there isn't much of a palpable gain in ATI's method. Why did they wait until Catalyst 3.2 to implement this? Isn't the reasoning that older games would have lower performance doing trilinear in all the texture stages while not increasing visual quality? But apparently this isn't the case for the majority of titles?

I still don't know why you'd want to implement it on texture stages you can't see...? I mean, I followed your link and didn't see any screenshots explaining why someone would want to set more than a single stage if the application engine only renders the treated stage visible to the camera, because of the way the textures are laid. This was my earlier point--that people would want to enable them on the assumption they'd get better IQ, when you wouldn't be able to see the difference except in performance.

I first became familiar with doing something like this using the Detonators with one of Unwinder's RT scripts, well over a year ago with a GF4. It didn't deal with TF, but AF: RT allowed me to go in and set various stages of texture treatment for AF beyond the single stage the Dets set by default. It was kind of interesting and I toyed with it for a couple of days, but I found I could never improve on the AF IQ regardless of which combination of stages I set the drivers to treat beyond the defaults--I did, however, manage to reduce performance by quite a bit. But this was with Dets earlier than the 30.82's...;)

But you've nailed the very reason why I think the IHVs would be nuts to put texture-stage treatment settings into their cpanels--lots of people would be setting them all just on "general principle" without worrying over whether they got any real IQ improvement--it would mostly be a psychological exercise. You'd have articles written like this one in which people show framerate differences and say, "Ah-ha! This proves there's a difference!" Yep, you can see the difference in the frame rates, but not in the IQ. As I've said before, doing levels of filtering on the texture stages which are visible, and not doing the same filtering on the invisible stages, is no more of a "cheat" or an "optimization" than is failing to render occluded pixels in a frame. I think most people can understand why it's better not to render occluded pixels in a frame (pixels behind other pixels along the z axis), because you can't see them even if they are rendered. Why then might it be a good idea to treat occluded texture stages? I can't see how it would, for the same reasons.
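
To make the texture-stage talk concrete, here is a minimal sketch, in Direct3D 8 style, of what an application requests per stage. The function name, device pointer, and stage count are mine for illustration, not any game's actual code:

    #include <d3d8.h>

    // Sketch only: bilinear base filter everywhere; trilinear (a linear
    // mip filter) on stage 0 alone, or on all stages. Error handling omitted.
    void RequestStageFiltering(IDirect3DDevice8* dev, DWORD numStages,
                               bool trilinearAllStages)
    {
        for (DWORD stage = 0; stage < numStages; ++stage) {
            dev->SetTextureStageState(stage, D3DTSS_MAGFILTER, D3DTEXF_LINEAR);
            dev->SetTextureStageState(stage, D3DTSS_MINFILTER, D3DTEXF_LINEAR);
            // Trilinear filters *between* mip levels; plain bilinear
            // (D3DTEXF_POINT here) just snaps to the nearest mip level.
            bool trilinear = trilinearAllStages || stage == 0;
            dev->SetTextureStageState(stage, D3DTSS_MIPFILTER,
                                      trilinear ? D3DTEXF_LINEAR : D3DTEXF_POINT);
        }
    }

A cpanel "force" simply overrides whatever the app asks for here--and whether stages above 0 are ever visible depends entirely on how the game layers its textures, which is the whole point of contention.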
 
Tridam said:
Of course, this checkbox would also disable app specific optimisations.

This checkbox would be implemented by the IHV to disable optimisations and offer a non-compromise mode to users. I'm not talking about benchmarks and optimisations. I'm just talking about a full quality possibility for people who buy a high-end card. The only goal of the checkbox will be full quality. I just wanted to ask Opengl Guy why they don't implement such a checkbox.


I'm not trying to be flippant, but some of this isn't making much sense, at least to me. It seems like a trend is developing where, if someone can do *something* to a set of drivers to cause the framerates to drop, this is considered "proof" of "compromises" to IQ by the IHV. I see a marked tendency to want to generalize the issues rather than to examine them in the kind of detail they deserve...because you can do many things in the drivers to reduce framerate which do not increase IQ. (Like rendering occluded pixels, for starters.)

The issue is not "Do framerates drop when I do x,y, or z?" but rather "When framerates drop when I do x,y, or z, does IQ increase as a result?" IE, we don't want to get to a place where mucking up drivers simply to get reductions in framerates assumes a corresponding increase in IQ. There are many ways to drop framerates in a driver without increasing IQ--asking for them would seem a big step backwards, to me.

I mean, if you want to go purist and really talk about compromises, then we need to stop worrying about framerates altogether and talk about software ray tracing...;) IMO, 3d isn't just about framerates and it isn't just about IQ--it's about getting the best of *both* with available 3d technology. Right? I don't think you can fault an IHV for trying to deliver the best of both.
 
WaltC said:
StealthHawk said:
I'm not sure how you and others support that conclusion. Looking at 3DCenter's tests, it seems that doing trilinear on all the texture stages barely affects performance across a multitude of games.

http://www.3dcenter.org/artikel/ati_nvidia_treiberoptimierungen/index8_e.php

Sure, you lose 5% in Max Payne and ~10% in Aquanox2 by doing trilinear in all texture stages. And 3dmark2001 gains ~10%. What about other games? The scores are pretty static. Oh, except UT2003 :?

Color me clueless, because there isn't much of a palpable gain in ATI's method. Why did they wait until Catalyst 3.2 to implement this? Isn't the reasoning that older games would have lower performance doing trilinear in all the texture stages while not increasing visual quality? But apparently this isn't the case for the majority of titles?

I still don't know why you'd want to implement it on texture stages you can't see...? I mean, I followed your link and didn't see any screenshots explaining why someone would want to set more than a single stage if the application engine only renders the treated stage visible to the camera, because of the way the textures are laid. This was my earlier point--that people would want to enable them on the assumption they'd get better IQ, when you wouldn't be able to see the difference except in performance.

I first became familiar with doing something like this using the Detonators with one of Unwinder's RT scripts, well over a year ago with a GF4. It didn't deal with TF, but AF: RT allowed me to go in and set various stages of texture treatment for AF beyond the single stage the Dets set by default. It was kind of interesting and I toyed with it for a couple of days, but I found I could never improve on the AF IQ regardless of which combination of stages I set the drivers to treat beyond the defaults--I did, however, manage to reduce performance by quite a bit. But this was with Dets earlier than the 30.82's...;)

But you've nailed the very reason why I think the IHVs would be nuts to put texture-stage treatment settings into their cpanels--lots of people would be setting them all just on "general principle" without worrying over whether they got any real IQ improvement--it would mostly be a psychological exercise. You'd have articles written like this one in which people show framerate differences and say, "Ah-ha! This proves there's a difference!" Yep, you can see the difference in the frame rates, but not in the IQ. As I've said before, doing levels of filtering on the texture stages which are visible, and not doing the same filtering on the invisible stages, is no more of a "cheat" or an "optimization" than is failing to render occluded pixels in a frame. I think most people can understand why it's better not to render occluded pixels in a frame (pixels behind other pixels along the z axis), because you can't see them even if they are rendered. Why then might it be a good idea to treat occluded texture stages? I can't see how it would, for the same reasons.

Hmm, I'm not saying they should enable it on all texture stages when it would provide no IQ benefit.

Like I said, the reasoning for enabling trilinear only on the first texture stage has been given as such: it provides a performance increase in older games while IQ remains the same.

So again I ask, what performance increase? According to 3DCenter, two applications experienced decreased performance from enabling trilinear on all texture stages, and one application experienced the opposite--increased performance. One of the games in which a performance decrease is seen is Aquanox2, which is a new game.

To get to the meat of it, I am simply curious as to the point of this optimization. It seems rather that it does not affect older games much at all! Unless you have proof otherwise (a larger sample; maybe 3DCenter just happened not to test the right applications). Considering that it doesn't help much, but does hurt UT2003, doesn't it seem like the cost outweighs the "benefits"? I am sure that you have seen people in forums and such lumping ATI in with NVIDIA, even though ATI provides a way to get full quality.
 
WaltC said:
How do you figure that setting "Application Preference" equates to "less" quality? What that setting means to me is that whatever quality you get is determined by the application you're running, and the settings you make relative to quality within the application. The idea behind AP is that you don't need the control panel to force IQ in the application, because you're using the application to set the quality levels independently of the cpanel. The drivers are set to disable the cpanel automatically when AP is selected, expressly so there won't be a conflict.

Were all games properly configurable internally, you'd never need to budge off the AP setting in the cpanel.
Now you are talking about a perfect world ;)
That's sure the way it was meant to be, but it didn't happen. Looking at my ATI cp right now, at the top of the pane, there's a 'master' slider. The shipping state is "balanced", which activates the "application preference" checkboxes. If I move this to the right, "application preference" gets turned off and I get 2xAA/8xAF at "high quality" and 4xAA/16xAF at "optimum quality". So it's not really me, it's ATI ;)
And that's pragmatic. For the vast majority of games, app pref means no AA and no AF. Game developers are reluctant to implement these controls because they add complexity where there's no immediate need. Users can use the cp, and that's what they've always done. Even UT2k3 doesn't expose its AF controls in its menus.
That's of course conceptually wrong, but this is how it works today.
*shrugs*

WaltC said:
Heck, speaking of Tom's in the other thread, how many reviewers can be reasonably expected to get it? "They" primarily screw the press, and by instrumenting the press "they" screw the public. It has never been doubted that those in the know can get more information and more control. Doesn't help Joe Sixpack at all, unfortunately.

Of course, these debates are constrained to people who understand them. Joe 6-pack isn't going to care about what goes on in the Cpanel, or in the application, or both...;) But if Joe hangs in there he'll eventually learn the basics.
Joe will only be able to control things he understands ;)
Joe: What's AA? What's AF? How can I use it?
MrX: It makes your graphics look nicer. You can turn it on in the control panel.
Joe: Thx!
MrX: Oh, except for UT2k3. AA is fine, but for AF you need to fiddle with the ini thingamajig and do the opposite of what you'd do for other games when it comes to the control panel. Just like Serious Sam.

IMO this isn't exactly simplifying things. We could just as easily admit defeat and keep the 'de facto standard' to handle this stuff.
No offense intended, but you're the first one I happen to know of who complains about the lack of in-game controls :)
WaltC said:
I believe he was referring to the flower shape of the AF mipmap selection, as visualized in SamX's AF-Tester. In a perfect world, this should be a circle (NV comes quite close with the right driver settings).
The other type of angular dependency optimization is common to all cards offering AF at all, AFAIK, and can't be turned off--turning it off would be useless, just like you already stated.

Ok, in a "perfect world" one might also say that all of the pixels in a frame should be rendered regardless of whether they are visible in a frame....;) One man's "perfect world" is another man's "imperfect world," I guess.
Overwriting a pixel that's z-buffered away is a completely destructive process; the old pixel is lost. Texture filters are much more subtle.
I personally don't play these things, but e.g. the flight sim crowd seems to regularly complain about ATI's AF. A surface's texture sharpness fluctuates not only with its "forward angle" but also with its rotation around the z-axis. It does irritate some people, in some specific games. ATI's flower-shaped thingy is not a generally valid optimization, so to speak.

WaltC said:
OK, I really don't understand what you're saying.

<...>

So it's not a question of turning TF on, but rather of which texture stages receive TF prior to rendering. Hence, a registry setting for "full trilinear" would not be required--and indeed is not required to get full trilinear in UT2K3 with the Cats.

That's the way I understand it. Of course if this is incorrect, I'd like to know it--but it's what I've gathered is true in this case from the evidence presented.
To be brutally honest, I didn't read the whole article, only the parts I've translated ... and the conclusion :)
But I think that's the gist of it.

You have a point in wanting to prevent users from shooting themselves in the foot. I can second that.

What I wanted to say is that there's something wrong when only 'experts' can get something as simple as AF without getting screwed. It should improve quality, period. Stage optimizations (ie less or no AF on higher stages) are somewhat okay with me, but fiddling with the base filter is not.
If, by activating AF through the usual means, I get a drop in quality somewhere that's hopefully offset by an improvement somewhere else, that's just missing the point.
And bringing this back into context, Joe will be quite angry if he can't reproduce the performance and/or quality he has been promised by the reviews.

Semi-OT:
IMO ATI's R300+ AF controls are just wrong. "Performance" AF is an R200 AF emulator and isn't really needed at all. In yet another perfect world, I'd like to have an AF slider and an "on" knob, and the bilinear/trilinear thing would be completely under application control. Why should the driver create any artificial connection between bi/tri and AF? I guess that would actually work :)
Regarding stage optimizations, a switch to turn on "no compromises" mode would still be appreciated as long as it's clearly labelled. I'd suggest "insane" and a little "did you really mean to click here?" popup :)
 
OpenGL guy said:
Exxtreme said:
I think ATi and Nvidia should offer new options which allow the user to change the quality level.
Huh? There are plenty of options now, which people are already misunderstanding. If people can't figure out what's offered now what chance do they have of understanding more choices?
Yes, more choices would be nice. I have a high-end card here, but I cannot enable high-end image quality because the driver doesn't allow it.
 
IMO this isn't exactly simplifying things. We could just as easily admit defeat and keep the 'de facto standard' to handle this stuff.
No offense intended, but you're the first one I happen to know of who complains about the lack of in-game controls

nVidia are trying to make cheating the "de facto standard", and I see no reason to bow to that. Just as I see no reason to bow to some stupid idea that all games are the same.

If you want to start moving backwards, good for you, for I want to control settings on a per-game basis. Imagine if you had to set your game resolution via the control panel too!

I don't recall there being a cry for many of the things we see in 3D today, yet if you tried to remove some of them you'd be lynched.
 
WaltC said:
Game developers can change it, though. Lots of games allow differing degrees of in-game control over various things--the UT/UT2K3-engine games, NOLF2, NWN, Q3 and its derivatives, etc.--there are actually a lot of games which provide varying degrees of in-game control. The problem is one of consistency and standards, but I don't see this changing much because all game engines are different. I do think they could standardize and all include support for FSAA and AF, certainly.
You're right, but do you really know what many developers think about features like AA + AF?
WaltC said:
But basically, I'm just saying it's fine to use the Cpanel when a game doesn't support configuration like that, but it should be understood that to tune many games it just won't be possible to do it via the Cpanel--because the IHV's are only supporting a limited set of configuration options--and they only do that much because these options have to be forced in games which don't allow their configuration. When you go to generically forcing things in 3d games you necessarily leave the area of specifics and move toward generalities. You've got to make certain assumptions when you do it, and games which deviate from those assumptions won't respond properly from the control panel (like Splinter Cell, UT2K3, etc.) But if developers start implementing common controls for the features commonly seen in IHV control panels, then the end user can set specifics from within the game and such problems recede, and the Cpanel controls would become redundant and, depending on the game, inferior to in-game control.
This is the optimal solution, once developers start implementing the controls in their games. But there are tons of games which don't have these controls.

More options in the control panels would minimize this problem. In the current control panel you can choose between "performance" = bi-AF and "quality" = tri-AF on stage 0. This is IMHO not enough. My rTool offers a third option. This option enables tri-AF on every stage, but unfortunately it cannot force it.

WaltC said:
I think the idea of angle-dependency is only that planes viewed from certain angles won't show the benefits of AF, so why treat them with AF... Right? I think the best criticism of it is that it's not 100% efficient--but I think the idea has merit. I mean, if we think it's advantageous not to render pixels in a frame which are occluded by other pixels from the vantage of the camera, what's wrong with not treating textures with AF when they're viewed at angles at which the AF isn't visible?
Errm, every angle shows benefits from AF. The current solution is IMHO "lowering the overall image quality to get higher framerates". It is a good compromise for many games but it is not good enough for ALL games.

And here ATi's competition has the better solution... except in UT2003. ;)
WaltC said:
But anyway...I was under the assumption that the Dets, and I certainly know it's true with the Cats, already offer "performance" and "quality" options via their control panels. What new options did you have in mind?
Supersampling and angle-independent AF would be nice. These are real high-quality features.

WaltC said:
Let's imagine that IHVs start allowing for selection in the control panel of which texture stages are treated for AF, Trilinear, etc. I can see a big negative there when people start selecting all texture stages for treatment without realizing that in most games you only see one texture layer, the last one laid down relative to the camera (as noted, UT2K3 is an exception.) In most cases the users who don't know what in-game controls are and so rely on the Cpanel also won't have a clue as to which texture stages they need to select for which games so that they can get the IQ they want without paying a performance penalty. Instead, they'll just select for all of the stages to be treated, imagining this improves IQ, when the only thing it will do is lower performance. Besides, most developers never release that kind of info to the general public--which wouldn't know what to do with it if it had it...;)
Heh?

I think four settings for AF should be enough for most users:
High performance -> bi-AF
Performance -> current "quality"
Application -> tri-AF on every stage, except in games which set the texture filtering via the API
Quality -> force tri-AF on every texture stage


Additionally, a switch between angle-dependent and angle-independent AF, and I would be very happy--and some nvidia fans would have fewer arguments. :D
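
If it helps, those four modes map onto a very small policy table. A hypothetical sketch in C++ (all names invented; obviously not any real driver's code):

    enum class AfMode { HighPerformance, Performance, Application, Quality };

    struct FilterPolicy {
        bool trilinearStage0;  // linear mip filter on stage 0
        bool trilinearOthers;  // linear mip filter on stages > 0
        bool honorAppFilters;  // defer to what the game requests via the API
    };

    // Map each proposed control panel mode to a filtering policy.
    FilterPolicy PolicyFor(AfMode mode)
    {
        switch (mode) {
            case AfMode::HighPerformance: return {false, false, false}; // bi-AF everywhere
            case AfMode::Performance:     return {true,  false, false}; // today's "quality"
            case AfMode::Application:     return {true,  true,  true }; // game can override
            case AfMode::Quality:         return {true,  true,  false}; // forced tri-AF
        }
        return {true, false, false};
    }

The angle-dependence switch would be orthogonal to all of this.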
 
zeckensack said:
If I may answer for Exxtreme (we've briefly discussed this): ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
This "secret" ;) registry value only forces the AF-level. The filtering algorhytm is "application preference".
 
Exxtreme said:
zeckensack said:
If I may answer for Exxtreme (we've briefly discussed this): ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
This "secret" ;) registry value only forces the AF-level. The filtering algorhytm is "application preference".

So THAT'S how rTool sets full trilinear and max AF?
 
K.I.L.E.R said:
Exxtreme said:
zeckensack said:
If I may answer for Exxtreme (we've briefly discussed this): ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
This "secret" ;) registry value only forces the AF-level. The filtering algorhytm is "application preference".

So THAT'S how rTool sets full trilinear and max AF?
Yes, the rTool cannot force tri-AF because the driver doesn't allow it.

When you're running 3DMark01 and set this filtering method, you will get higher framerates, because 3DMark01 sets bilinear filtering via the API.
 
Exxtreme said:
K.I.L.E.R said:
Exxtreme said:
zeckensack said:
If I may answer for Exxtreme (we've briefly discussed this): ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
This "secret" ;) registry value only forces the AF-level. The filtering algorhytm is "application preference".

So THAT'S how rTool sets full trilinear and max AF?
Yes, the rTool cannot force tri-AF because the driver doesn't allow it.

When you're running 3DMark01 and set this filtering method, you will get higher framerates, because 3DMark01 sets bilinear filtering via the API.

So if I set application filtering and 16xAF under rTool and I set trilinear filtering in UT03 then I would get full trilinear filtering and 16xAF.

Those are the settings I normally play at, but I set them in the UT ini file.
 