Excellent article on NVidia and ATi Optimisations

Quitch said:
IMO this isn't exactly simplifying things. We could just as easily admit defeat and keep the 'de facto standard' to handle this stuff.
No offense intended, but you're the first one I happen to know of who complains about the lack of in-game controls

nVidia are trying to make cheating the "de facto standard", and I see no reason to bow to that. Just as I see no reason to bow to some stupid idea that all games are the same.

If you want to start moving backwards, good for you, for I want to control settings on a per-game basis. Imagine if you had to set your game resolution via the control panel too!
I really don't know why you're jumping at me this way.
AA/AF stuff through cp is the de facto thingy because users do it. They have to for most games, you see, so these cp controls can't be removed now. Because the cp stuff is here anyway, game devs can get away with being lax about app control. Vicious circle. As an IHV you can either break it and risk bags full of hate mail or you can accept it.

There's about zero resemblance to NV's blatant cheats (and ATI's perhaps-not-so-blatant cheats). Deception doesn't fall into the "users wish to be able to do this, so we'll support it" category. I hope we can agree here.

AA/AF started out as transparent enhancements for games that were released years earlier. Selectable resolutions and trilinear filtering didn't.

Quitch said:
I don't recall there being a cry for many of the things we see in 3D today, yet if you tried to remove some of them you'd be lynched.
Isn't that what I said? Did I request that applications be disallowed from controlling things? :rolleyes:
 
K.I.L.E.R said:
Exxtreme said:
K.I.L.E.R said:
Exxtreme said:
zeckensack said:
If I may answer for Exxtreme (we've briefly discussed this): ATI drivers apparently have a "force trilinear" option, controlled by some super-secret reg key. That one could be exposed.
This "secret" ;) registry value only forces the AF-level. The filtering algorhytm is "application preference".

So THAT'S how rTool sets full trilinear and max AF?
Yes, rTool cannot force trilinear AF because the driver doesn't allow it.

When you're running 3DMark01 with this filtering method set, you will get higher framerates because 3DMark01 requests bilinear filtering via the API.

So if I set application filtering and 16xAF under rTool and I set trilinear filtering in UT03 then I would get full trilinear filtering and 16xAF.
Yes, in this case you will get full trilinear AF.
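
For readers who want to see what "the application sets the filtering via the API" means in practice, here is a minimal Direct3D 8 sketch (UT2003 and 3DMark01 are both D3D8 titles) of the state calls a game issues when it wants trilinear plus anisotropic filtering on a texture stage. The function name and the assumption of an already-created device are illustrative, not code taken from either application.

#include <d3d8.h>

// Sketch only: assumes 'dev' is an already-created IDirect3DDevice8.
// These are the per-stage states a D3D8 game sets when it wants full
// trilinear + anisotropic filtering; with the control panel left on
// "application preference" the driver is expected to honour them.
void RequestTrilinearAniso(IDirect3DDevice8* dev, DWORD stage, DWORD maxAniso)
{
    dev->SetTextureStageState(stage, D3DTSS_MINFILTER, D3DTEXF_ANISOTROPIC);
    dev->SetTextureStageState(stage, D3DTSS_MAGFILTER, D3DTEXF_LINEAR);
    // MIPFILTER = LINEAR blends between mip levels; this is the "trilinear"
    // part. An app that only asks for bilinear (as 3DMark01 does by default,
    // per the post above) would set D3DTEXF_POINT here instead.
    dev->SetTextureStageState(stage, D3DTSS_MIPFILTER, D3DTEXF_LINEAR);
    dev->SetTextureStageState(stage, D3DTSS_MAXANISOTROPY, maxAniso);
}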
 
zeckensack said:
...You have a point in wanting to prevent users from shooting themselves in the foot. I can second that.

What I wanted to say is that there's something wrong when only 'experts' can get something as simple as AF without getting screwed. And bringing this back into context, Joe will be quite angry if he can't reproduce the performance and/or quality he has been promised by the reviews...

I think this is the crux of the issue: does it take an "expert" to change settings in the UT.ini files...?

My position would be: hardly...;) I've been changing them since UT shipped years ago. I was changing them in the Quake-engine games long before that--GLIDE games--you name it. There have been "guides" plastered all over the Internet for years on suggested settings for these simple text files, etc.

But the biggest fallacy here, I think, is that "Joe", presumably not being expert enough to even know what a text configuration file is, would ever notice that he wasn't getting optimal IQ. Heh...;) Who do you think nVidia is targeting with its removal of full trilinear filtering in UT2K3, if not "Joe" himself....? Only an "expert", as you put it, would be able to notice the IQ differences in the first place. And the really funny thing is that "Joe" wouldn't know what the texture-stage filtering options in the cpanel were in the first place--and wouldn't be able to tell the difference between settings in the second. Options are no less inscrutable to "Joe" simply because they sit in the cpanel rather than in a game's internal configuration text files.

About control panels in general: we can opine endlessly on how we'd like to see them configured as GUIs, but the bottom line relative to UT2K3-like situations is that the cpanel is irrelevant. What matters is whether the drivers will do what the user instructs them to do *through the application*--that has to be the bottom line and the only litmus test applicable.

The deficiency here comes from the game developers--not the IHVs (unless an IHV does something like the permanent disabling of a function for a particular game.) People are getting fixated on the cpanel when they ought to be fixated on spurring developers to implement thorough, easy-to-use control settings within their applications, IMO.
 
WaltC said:
Tridam said:
Of course, this checkbox would also disable app specific optimisations.

This checkbox would be implemented by the IHV to disable optimisations and offer a no-compromise mode to users. I'm not talking about benchmarks and optimisations. I'm just talking about a full-quality option for people who buy a high-end card. The only goal of the checkbox would be full quality. I just wanted to ask Opengl Guy why they don't implement such a checkbox.


I'm not trying to be flippant, but some of this isn't making much sense, at least to me. It seems like a trend is developing whereby, if someone can do *something* to a set of drivers to cause the framerates to drop, this is considered "proof" of "compromises" to IQ by the IHV. I see a marked tendency to want to generalize the issues rather than to examine them in the kind of detail they deserve...because you can do many things in the drivers to reduce framerate which do not increase IQ. (Like rendering occluded pixels, for starters.)

The issue is not "Do framerates drop when I do x, y, or z?" but rather "When framerates drop when I do x, y, or z, does IQ increase as a result?" I.e., we don't want to get to a place where mucking up drivers simply to reduce framerates is assumed to bring a corresponding increase in IQ. There are many ways to drop framerates in a driver without increasing IQ--asking for them would seem a big step backwards to me.

I mean, if you want to go purist and really talk about compromises, then we need to stop worrying about framerates altogether and talk about software ray tracing...;) IMO, 3d isn't just about framerates and it isn't just about IQ--it's about getting the best of *both* with available 3d technology. Right? I don't think you can fault an IHV for trying to deliver the best of both.

I think you don't understand what I mean.

I fully agree with you on this.
That's why I want to draw a distinction between optimisation and compromise.

Shader replacement for better co-issue or register usage, not rendering occluded pixels... these are optimisations.

Tri/bi special filtering, no trilinear filtering on stages >0, not applying anisotropic filtering everywhere... these are compromises.

I want a checkbox that makes it possible to disable these compromises. In some cases these compromises can decrease quality, so why can't we disable them in those cases?
 
StealthHawk said:
So again I ask, what performance increase? According to 3DCenter, two applications experienced decreased performance by enabling trilinear on all texture stages, and one application experienced the opposite - increased performance. One of the games in which a performance decrease is seen is Aquanox2, which is a new game.

To get to the meat of it, I am simply curious as to the point of this optimization. It seems rather that it does not affect older games much at all :!: Unless you have proof otherwise (a larger sample; maybe 3DCenter just happened not to test the right applications). Considering that it doesn't help much, but does hurt UT2003, doesn't it seem like the cost outweighs the "benefits"? I am sure that you have seen people in forums and such lumping ATI with NVIDIA even though ATI provides a way to get full quality.


First, what "optimization" are you talking about? An "optimization" is something you do in your code base which is application-specific--which affects only the one particular application you are talking about--and nothing else. There's nothing UT2K3-specific about the stage treatment forced by ATi's cpanel when it's used--it forces the same stage treatment for everything you run. What's different about UT2K3 is the way the game layers textures in relation to other games--the behavior of the cpanel in this case is generic.

The fact that you can transfer control of the options from the cpanel to UT2K3 by selecting "application preference" in the cpanel, and achieve full trilinear in UT2K3, *proves* there is no application-specific code in the Cat drivers which is "optimized" to only allow partial trilinear filtering in the game. It proves it. If such an optimization existed in the drivers then it would not matter whether you used the cpanel OR the internal game configuration--you'd only get trilinear treatment on one texture stage in UT2K3 (see the nVidia Dets for an example of a UT2K3 "optimization" in this regard.)
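
To make the "texture stage treatment" language concrete, here is a hedged D3D8 sketch of the generic behaviour being described; the two-stage base/detail layering and the function name are illustrative assumptions, not UT2K3's actual code and not anything taken from ATi's driver.

#include <d3d8.h>

// Illustrative only: what "trilinear on stage 0, bilinear on stages >0"
// amounts to at the API level when a control panel forces it globally,
// regardless of which game is running.
void ForcedStageTreatment(IDirect3DDevice8* dev)
{
    // Stage 0 (base texture): full trilinear.
    dev->SetTextureStageState(0, D3DTSS_MINFILTER, D3DTEXF_LINEAR);
    dev->SetTextureStageState(0, D3DTSS_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetTextureStageState(0, D3DTSS_MIPFILTER, D3DTEXF_LINEAR);

    // Stage 1 (e.g. a detail texture): bilinear only, with no blending
    // between mip levels, whatever the application asked for.
    dev->SetTextureStageState(1, D3DTSS_MINFILTER, D3DTEXF_LINEAR);
    dev->SetTextureStageState(1, D3DTSS_MAGFILTER, D3DTEXF_LINEAR);
    dev->SetTextureStageState(1, D3DTSS_MIPFILTER, D3DTEXF_POINT);
}

Selecting "application preference" instead leaves these states under the game's control, which is why the in-game trilinear switch works.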

Second, what case have you made that the status quo in regard to cpanel defaults is insufficient? All you have referenced is framerate counters going up or down in just a few cases--you haven't tied the framerates (up or down) to increases or decreases in application IQ. So I think the fact that framerates stay roughly the same overall simply supports the current default settings in the cpanel--it provides reason to suggest they are currently fine and do not need to be changed.
 
Tridam said:
...
I want a checkbox that makes it possible to disable these compromises. In some cases these compromises can decrease quality, so why can't we disable them in those cases?

OK, I think I see what you mean. I'm not arguing that such a thing isn't theoretically possible--because it is. My thinking, though, is that pragmatically the implementation of such an option is highly unlikely. What is more likely, in the case of some IHVs, is more of the same behavior: letting an end user believe that a specified function is being executed when in reality something different is being substituted (re: nVidia, UT2K3, and full trilinear.)

I think there are practical difficulties, too, such as how an IHV might offer an option to eliminate application-specific optimizations which does not also eliminate application-specific bug fixes in the driver code. Then there's the very real possibility of also eliminating performance-related optimizations in the drivers which do not negatively impact IQ when they are implemented.

Let's take for a moment the small shader reordering optimization that ATi originally did for 3dMk03. No IQ was lost, no "work" in the benchmark was cut out, but only the shader code itself was reordered to run more efficiently on R3x0. While ATi is certainly better off leaving benchmarks alone in this regard, what possible benefit do you see in leaving such things off in actual 3d games, when the effect is you run faster with no loss of IQ as a result? In that case what you'd be asking for is an option to allow you to degrade driver functionality, which is the antithesis of what driver development is all about. This is why I think it unlikely an IHV will respond to such a request.
 
WaltC said:
Then there's the very real possibility of also eliminating performance-related optimizations in the drivers which do not negatively impact IQ when they are implemented.

WaltC said:
Let's take for a moment the small shader reordering optimization that ATi originally did for 3dMk03. No IQ was lost, no "work" in the benchmark was cut out, but only the shader code itself was reordered to run more efficiently on R3x0.

WaltC said:
In that case what you'd be asking for is an option to allow you to degrade driver functionality, which is the antithesis of what driver development is all about.

This is why I draw a distinction between optimisation and compromise ;)


This checkbox could be a hidden function, like Coolbits in NVIDIA's drivers. But there's another problem... IHVs don't want to allow benchmarks to be run without the compromises they've chosen... For my part, if I buy a 500$/€ board, I want the ability to choose whether I want compromises or not. I want the ability to disable NVIDIA's tri/bi in UT2003, and I want the ability to have trilinear filtering on texture stage 1 with ATI boards.
 
Tridam, speaking for myself:

I think I agree exactly with your distinct label for ATI's implementation of tri/bi. I think the point of conflict, somewhat, is that you don't (as far as I've noticed) add a third item at the same time, for contrast with the "optimization" and "compromise" labels: "cheat". The "preventing a feature completely for a selected application (and benchmark result distortion)" that was mentioned simply isn't true for benchmarking a game like UT2k3 on ATI cards (if the person running the benchmarks is as informed as these forums). In fact, the games that don't have application-side AF control of some type are the ones most likely (though certainly not guaranteed) to have this compromise be desirable. With that distinction added, I think I'm agreeing with your recent comments pretty completely.

Why am I not complaining about it?

I do agree that ATI should definitely provide user control for turning off the "tri/bi" control panel behavior for "Quality". My response to that situation is to note a few things, and not just that the application preference checkbox intuitively and accurately allows the user to bypass this "compromise" (which, I think, is related to the disagreement you've been facing over the missing third distinction above): indications are that ATI will offer exactly the specific control you ask for (and may have already done so for some cards/drivers), and the registry behavior and control to do it is already there (I'd think that's the way rTool is working as well).

To be revisited upon a gander at the behavior of the very next driver revision, and the offered explanations, features, and information for consumers (or lack thereof) provided with it.
 
Yes, of course, you're right. The third item is cheat! I didn't talk about cheats because, in a perfect world, no cheat would find its way into drivers :D

The 3 items:
- optimisation: better performance without any quality loss
- compromise: better performance with a quality loss (the quality loss can of course be big or small)
- cheat: exploiting predictability, misleading users

Example:
- optimisation: shader reordering to help co-issue (ATI) or to reduce register usage (NVIDIA) …
- compromise: no trilinear on texture stages >0 (ATI) or false trilinear (NVIDIA) …
- cheat: static clip planes (NVIDIA) …

Are they good?
- optimisation: always, we want optimisations
- compromise: they're fine as options, but only a good compromise should be enabled by default. IHVs have to provide a way to disable every compromise.
- cheat: never

Of course, it's just what I think…

The problem is that people use the word optimisation for all three of these items.
 
Well, some compromises can be used to mislead users (i.e., "cheat"), but I think your usage is clearer. There is the possibility that nVidia's UT2k3 behavior could be purely desirable, if offered in the right way (i.e., without nVidia so thoroughly working towards such deception), so using a label different than "cheat" for it makes sense, as long as it is accurate.
 
Tridam said:
If I buy a 500$/€ board, I want the ability to choose whether I want compromises or not. I want the ability to disable NVIDIA's tri/bi in UT2003, and I want the ability to have trilinear filtering on texture stage 1 with ATI boards.

OK, fine, I can certainly understand your wanting to be able to select full trilinear in the Dets for UT2K3. I think everyone feels that way except nVidia at present.

Regarding ATi and UT2K3, you can get Tri filtering on stage 0 & 1 (assuming those are the relevant stages) by setting the cpanel to application default, and setting Trilinear=True under the D3DDrv.D3DRenderDevice section in the UT2K3.ini text configuration file installed with the game (I have a shortcut to it on my desktop.) So, you get what you want, and the cpanel default stage treatment does not prevent you from getting it. Where's the compromise? (I feel like this is ground plowed many times already.) Can you point to a game other than UT2K3 in which the appropriate cpanel settings for the Cats fails to provide full trilinear support?

IMO, the automatic application of trilinear to texture stages 0 & 1 in games in which only stage 0 is visible would be a degradation of driver function. I think this is where we might disagree. You see it as a "compromise" whereas I see it as simple efficiency.
 
WaltC said:
Regarding ATi and UT2K3, you can get Tri filtering on stage 0 & 1 (assuming those are the relevant stages) by setting the cpanel to application default, and setting Trilinear=True under the D3DDrv.D3DRenderDevice section in the UT2K3.ini text configuration file installed with the game (I have a shortcut to it on my desktop.) So, you get what you want, and the cpanel default stage treatment does not prevent you from getting it. Where's the compromise? (I feel like this is ground plowed many times already.) Can you point to a game other than UT2K3 in which the appropriate cpanel settings for the Cats fails to provide full trilinear support?

I know that I can get full trilinear in UT2003 by editing ut2003.ini. No, I can't point you to a game with a trilinear problem, but such a game could exist. The potential problem arises only with multi-texturing games where we must enable anisotropic filtering in the cpanel.

In UT2003 this is not a big problem because the detail textures don't need trilinear filtering when they are used as detail textures. But we can see artifacts when the level designers use a detail texture as the main texture. Of course we can enable AF in ut2003.ini and avoid this problem.

(This is off topic, but there's a big problem with UT2003 and ATI boards when AF is enabled: the LOD seems higher than expected and we lose all the detail of the detail textures! It looks really bad when the detail textures are used as main textures. I don't know if it's ATI's fault or Epic's fault, but this problem doesn't exist with NVIDIA boards. We need to decrease the LOD in UT2003.ini with ATI boards.)


So this example of a compromise wasn't really a good one :( But the adaptive AF is a better one. In some games it could be great to be able to enable full AF.
 
WaltC,

I think I can agree that UT2k3 is 'curable'. I was dwelling on the game mostly in the context of the article. If that's not what we were talking about here anymore, my apologies.
*cough*
The textures used in UT2k3 (and other U toolchain games) are arguably flawed - they are much larger than is justified by the amount of detail they contain. A simple proof of that claim is that every U toolchain game released to date applies a negative mipmap LOD bias by default to get them sharp. That isn't necessary, nor even possible without artifacts, if your mipmaps are right. U more or less by design oversubscribes texture memory and fetch bandwidth and kills texture caches.
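
As a hedged illustration of why that matters (a sketch of textbook mip selection, not any particular card's hardware path; the function and parameter names are mine): mip level selection follows roughly lambda = log2(texels per pixel) + LOD bias, so a negative bias drags sampling toward larger mip levels than the screen footprint calls for.

#include <algorithm>
#include <cmath>

// Hedged sketch, not any vendor's actual selection logic: nearest-mip
// selection for a given screen-space footprint and LOD bias.
// 'texelsPerPixel' is the texel-to-pixel ratio along the most compressed
// axis; level 0 is the base (largest) mip.
int selectMipLevel(float texelsPerPixel, float lodBias, int numLevels)
{
    float lambda = std::log2(texelsPerPixel) + lodBias; // standard LOD formula
    // Trilinear filtering would blend floor(lambda) and floor(lambda)+1;
    // nearest selection keeps the sketch short.
    int level = static_cast<int>(std::lround(lambda));
    return std::clamp(level, 0, numLevels - 1);
}

// With correctly built mipmaps a bias of 0 is already "sharp". A negative
// bias (as the U-engine defaults described above apply) pulls 'level'
// toward the base map, so more texels are fetched per pixel than the
// footprint requires: more bandwidth and worse texture-cache behaviour.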

That's what makes it an interesting battleground for driver tweaks. Reducing your filtering workload will make most games a little faster, but UT2k3 will get the biggest boost. On top of that, it's widely used in graphics card reviews.

Due to the overly large impact of filtering performance on UT2k3, even if it isn't specifically cheated on, it's also a very good tool for reviewers to detect general filtering compromises. In this context, for the article, it does make sense to benchmark UT2k3. And it also makes sense to benchmark it in multiple ways, not only the 'right' way, but also the way an average end user would do it.

So much for the article, which IMO isn't too terribly off the mark. As for the other issues, my apologies if I got it all mixed up.
 
Tridam said:
So this example of a compromise wasn't really a good one :( But the adaptive AF is a better one. In some games it could be great to be able to enable full AF.
This isn't something that can simply be enabled, I'm afraid. ATI's AF angle dependency is most certainly a hardware limitation. We'll have to wait for the next generation ;)
 
Tridam said:
I know that I can get full trilinear in UT2003 by editing ut2003.ini. No, I can't point you to a game with a trilinear problem, but such a game could exist. The potential problem arises only with multi-texturing games where we must enable anisotropic filtering in the cpanel.

In UT2003 this is not a big problem because the detail textures don't need trilinear filtering when they are used as detail textures. But we can see artifacts when the level designers use a detail texture as the main texture. Of course we can enable AF in ut2003.ini and avoid this problem.

(This is off topic, but there's a big problem with UT2003 and ATI boards when AF is enabled: the LOD seems higher than expected and we lose all the detail of the detail textures! It looks really bad when the detail textures are used as main textures. I don't know if it's ATI's fault or Epic's fault, but this problem doesn't exist with NVIDIA boards. We need to decrease the LOD in UT2003.ini with ATI boards.)


So this example of a compromise wasn't really a good one :( But the adaptive AF is a better one. In some games it could be great to be able to enable full AF.

I think though that we are moving here more in the direction of subjective user IQ preference as opposed to anything objective. For instance, I adjust lots of values in the UT2K3.ini besides trilinear--values which are not adjustable from any current control panel. I thought it was interesting in this case that you mentioned LOD, as I adjust it higher than default to my own tastes--I think it looks better than default in that I see more detail, but there is a slight degradation in performance which I can live with. Someone else may adjust these parameters differently (as you say, you prefer to decrease it.) I also slightly skew the POV to my tastes, and tweak a few more things to get the game to both run and render in a manner that most satisfies me. None of these settings is accessible from a driver control panel, however--regardless of which product you own. So if you want to adjust any of these parameters you have to do it from within the game anyway, since there's no other way to do it.

And yes, a game other than UT2K3 requiring multiple-stage texture treatment is of course theoretically possible, I agree. It's also just as theoretically possible that such a game will include in-game controls like UT2K3's, so that the game itself can instruct the drivers on which texture stages to treat. In that case--you've not compromised anything...just like with UT2K3 in this narrow regard. Also, within the UT2K3 .ini you can enable AF by using the LevelofAnisotropy=x setting, with x being the level you want. I get good results setting this to 16. (However, Epic's configuration documentation is non-existent, so I'm guessing at the parameters which actually work with some of their settings.) Epic could do a much better job of documenting the .ini settings so that the people who buy their software might use them more efficiently.

I'm not sure that "adaptive AF" is an accurate description of ATi's AF implementation. Is it? I was under the impression that what it does is not treat textures with AF at 45-degree angles relative to the camera, since AF won't appear at that angle--so to me this simply means they are attempting to be more efficient (using that word a lot in this thread...;)) for the sake of performance without impacting IQ. If you aren't talking about the 45-degree angle implementation, what are you talking about exactly?
 
zeckensack said:
WaltC,

I think I can agree that UT2k3 is 'curable'. I was dwelling on the game mostly in the context of the article. If that's not what we were talking about here anymore, my apologies.
*cough*

Well, I was talking about the very important info the original article left out--or if it was there I missed it--that full trilinear with the Cats in UT2K3 is not only possible, but very easy to specify using the in-game controls. (The original article, as I read it, appeared to try and paint the Dets and the Cats with the same brush as far as the lack of full trilinear support is concerned.) While it may be true to say that some users ignore the UT2K3.ini text configuration file that ships with the game, I would guess that many do not ignore this file, since there are so many other settings within it that cannot be adjusted from a driver's cpanel. The simple fact is that unlike the Cats, the current Dets do not provide full trilinear when it is selected from the cpanel OR the UT2K3.ini. I don't think that is a trivial difference, or one which should be overlooked.

So much for the article, which IMO isn't too terribly off the mark. As for the other issues, my apologies if I got it all mixed up.

Actually, with the exception above I thought the article was OK. Pretty important omission, though...

This isn't something that can simply be enabled I'm afraid. ATI's AF angle dependency is most certainly a hardware limitation. We'll have to wait for the next generation

Heh...;) Imagine that..."next-gen" ATi chips will be able to run AF all the time, even at angles at which it is invisible to the camera, and so I can enjoy a constant hit on performance the entire session! I can hardly wait!...*chuckle*

Seriously, I rather hope ATi doesn't go backwards like that "next gen"....;)
 
WaltC said:
I would guess that many do not ignore this file, since there are so many other settings within it that cannot be adjusted from a driver's cpanel.
There are a lot of settings to be adjusted through the UT03 .INI file, but not a whole lot that affect the visual quality of the game. There are some, but you make it sound as though accessing the .INI file is a necessity of some sort for people who own and enjoy playing the game.
While I will admit that I myself have been messing around with .INI files since the original Unreal came out, I still believe that it is for the hardcore players/tweakers amongst us, and not for the majority of people.
I enjoy playing UT03 a lot more than I enjoy adjusting hidden settings and worrying about what the drivers are doing in this scene or that scene.
I know that what Nvidia is doing with their AF (or lack thereof) is important from a community standpoint...especially when the game is used as a benchmark to rate cards, but I almost feel like I'm the only one here who seriously plays UT03. It's an awesome game, and I'm really good at it ;)
 
WaltC said:
I'm not sure that "adaptive AF" is an accurate description of ATi's AF implementation. Is it? I was under the impression that what it does is not treat textures with AF at 45-degree angles relative to the camera, since AF won't appear at that angle--so to me this simply means they are attempting to be more efficient (using that word a lot in this thread...;)) for the sake of performance without impacting IQ. If you aren't talking about the 45-degree angle implementation, what are you talking about exactly?

I'm talking about this. The 45-degree angle was with R200.

With R3x0 ATI has improved this algorithm.

Only angles below 5° and above 85° can get 8x and 16x AF.
Only angles below 10-12° and above 78-80° can get 4x AF.
Angles near 25° and near 65° seem to get no AF at all!
Everything else gets 2x AF at most, even if you enable 16x AF. ATI also seems to change the LOD at some angles to reduce the difference between AF levels.
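
As a purely illustrative sketch of the behaviour described above (the angle ranges follow these observations, not ATI documentation, and the width of the bands near 25° and 65° is a guess):

#include <algorithm>

// Illustrative only: a rough mapping of surface angle (in degrees) to the
// AF level the observations above suggest R3x0 actually applies, given the
// level the user requested. Not derived from ATI documentation.
int effectiveAfLevel(float angleDeg, int requestedAf)
{
    if (angleDeg < 5.0f || angleDeg > 85.0f)
        return requestedAf;                  // full 8x/16x possible
    if (angleDeg < 12.0f || angleDeg > 78.0f)
        return std::min(requestedAf, 4);     // capped at 4x
    if ((angleDeg > 20.0f && angleDeg < 30.0f) ||
        (angleDeg > 60.0f && angleDeg < 70.0f))
        return 1;                            // near 25 and 65 degrees: effectively no AF
    return std::min(requestedAf, 2);         // everything else: 2x at most
}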

In most cases the ATI AF algorithm is great. But not in all cases.

I really think that ATI could disable this algorithm if they wanted to. Of course, performance would decrease a lot. Performance decreases a lot with some other options too... Why couldn't a full AF option exist?
 
Tridam said:
I'm talking about this. The 45-degree angle was with R200.

Right...I sometimes get R3x0 and R2x0 confused, probably because I have no R2x0 experience. Thanks for the clarification.

With R3x0 ATI has improved this algorithm.

Only angles below 5° and above 85° can get 8x and 16x AF.
Only angles below 10-12° and above 78-80° can get 4x AF.
Angles near 25° and near 65° seem to get no AF at all!
Everything else gets 2x AF at most, even if you enable 16x AF. ATI also seems to change the LOD at some angles to reduce the difference between AF levels.

In most cases the ATI AF algorithm is great. But not in all cases.

Yes, I'd say they have greatly improved it. But wouldn't it be fair to say there's a reason to the rhyme...? I think what they are trying to do is simply apply the selected level of AF at the view angles, relative to the camera, at which the difference AF makes in a frame is most apparent.

But I do understand you're saying you would prefer an option to switch off angle-dependent AF--which would have to be in the cpanel, of course, since it would affect driver behavior globally. Fair enough.

I really think that ATI could disable this algorithm if they wanted to. Of course, performance would decrease a lot. Performance decreases a lot with some other options too... Why couldn't a full AF option exist?

I think it probably could, but I also think the ATi driver developers can't afford to isolate a single function like AF and separate it from FSAA, trilinear filtering, etc. Rather, they've got to look at the IQ package as a whole and make their decisions on how to approach a discrete IQ function on that basis. The bottom line is total IQ for them, I would imagine. I said it in another post, but 3d today is all about IQ & performance, and all IHVs make "compromises" so that they can deliver the best of both that it's possible for them to deliver. The differences among 3d products can be distilled into which products do that best--the degree to which various products deliver on both IQ AND performance simultaneously. If it were known, I think people might be surprised to learn just how much of a compromise between those goals many features they consider "uncompromised" actually are.

Another way to look at things is that in its cpanel nVidia does not offer an angle-dependent AF option for end users to choose, nor 16x AF. So you are probably looking at two sides of the same coin for things like this. The IQ picture regarding all current 3d cards is such that it's widely agreed current R3x0 products provide superior IQ. That's what I think ATi's after.

I'm not trying to talk you out of wanting what you want--I have no problem with your preferences...;) I've approached this as a discussion about what is likely, and why, instead.
 
micron said:
There are a lot of settings to be adjusted through the UT03 .INI file, but not a whole lot that affect the visual quality of the game. There are some, but you make it sound as though accessing the .INI file is a necessity of some sort for people who own and enjoy playing the game.
While I will admit that I myself have been messing around with .INI files since the original Unreal came out, I still believe that it is for the hardcore players/tweakers amongst us, and not for the majority of people.
I enjoy playing UT03 a lot more than I enjoy adjusting hidden settings and worrying about what the drivers are doing in this scene or that scene.
I know that what Nvidia is doing with their AF (or lack thereof) is important from a community standpoint...especially when the game is used as a benchmark to rate cards, but I almost feel like I'm the only one here who seriously plays UT03. It's an awesome game, and I'm really good at it ;)

Micron, I wasn't talking so much about the number of game-adjustable settings specifically in UT2K3--but rather about the quality and power of those in-game settings to drastically affect the way the game renders (such as LOD and camera POV angles)--and about the fact that many games routinely include such text configuration files for user adjustment (if they don't do everything through an in-game GUI.) One game I left out above that I spent some time configuring a few years ago is Half-Life--which I found needed a lot of internal tweaking to display optimally (for me) with the 3d cards I was using when the game was released.

But if you want to talk about numbers...in the UT2K3.ini in the D3DDrv.D3DRenderDevice section alone I count thirty (30) user-adjustable settings (not counting AdapterNumber=, DescFlags=, or Description=). There are more spread out at various places in the .ini configuration files for the game, as you obviously know. So, maybe there are more of them than one might think at first....;)

My point is that for many of these settings no counterpart exists in the cpanel override of any IHV, so if you want to adjust them you *have* to do so from within the game. This is true for many best-selling 3d games--not just UT2K3. I think the crux of the issue is that IHV cpanels were never meant to replace in-game settings controls. They were meant to *force* a limited range of settings in games which have no such internal settings controls (which is why they sometimes don't work on such games, or don't work very well. Overall the IHVs do a great job in forcing these things where needed, IMO.) Obviously, to the degree they are being used instead of in-game controls, cpanels are being misunderstood and misused, IMO. I think this is probably because the IHVs have generally been able to make them work so well...;)

It's time, though, for game developers to get off their duffs, start looking at driver control panels, and include in-game controls which mimic the functionality IHVs are putting into their control panels, so that the drivers will receive the correct instructions from each application. It's also time for developers to work a lot harder on the GUIs they use (if any) and on option documentation, so that using application controls is no longer a mystery to the people who buy their software. IMO, developers who are depending on IHVs to take up the slack in this regard are just being lazy.

BTW, I can guarantee you are likely far, far better a UT2K3 player than I am...;) Probably, *anybody* is....However, I do enjoy putzing around with the game and fighting bots which I configure to act as close to slugs as I possibly can so that my impaired reflexes can keep up. I personally prefer the "fish in a barrel" mode of play...;) I always win (well, mostly), and this reinforces my tattered need to feel superior at *something*....:)
 