Recent Radeon X1K Memory Controller Improvements in OpenGL with AA

Jawed said:
What's the matter with setting the slider to HQ and leaving AI on?

Jawed
I don't know, but after reading Dave's post above I felt there might be a restriction. I don't have an X1xx to verify this, though.
 
It is in your best interest to leave Cat AI enabled all the time, as it can provide quite large performance boosts depending on the app without decreasing image quality. The option to turn it off is simply there for the sake of it.
 
DegustatoR said:
What slider?
Dunno - whatever it is that you use to set texture filtering quality to HQ.

The performance hit from HQ is bound to be less than the performance hit of turning AI off.

As far as I know I can't run CCC here on my Radeon 32MB SDR, so I dunno the exact procedure.

Jawed
 
IIRC, the CP has sliders that go from 'Performance' to 'HQ' for both texture and MIP-map quality. ATT does, anyway. But does CatAI supersede other settings (namely, WRT trylinear, etc.)?

ANova, is Humus' Doom 3 fix only enabled with CatAI? Just thinking of one example of an app-specific performance boost. I would think ATI's recently-found D3 AA perf. fix would be independent of CatAI--that is, not user-selectable--b/c it's a perf. win that (ostensibly) doesn't affect IQ.

I mean, if CatAI is just for IQ-independent features, as Rys first said, it would seem stupid not to have it on all the time, as making it user-selectable would be akin to offering a "Slower and Dumber?" checkbox (like trying to enable AA with Halo or PoP). IMO, it makes sense to make it user-selectable only if it affects IQ, which is what I understood it to do (with CatAI High being [more] aggressive in toeing the optimization/IQ compromise line). End-users don't need more choices to make regarding tech they probably (don't care to) understand, especially if the choice is a no-brainer.
 
:oops:

This was a gem: "Unfortunately, ATI accidentally uploaded the wrong file to their support site initially. On Friday we contacted ATI and were able to determine that the file originally uploaded to the site, hotfix_xp-2k_dd_ccc_027483.exe, wasn’t the driver with the OpenGL performance enhancements." :rolleyes:
 
Holy crap.
ATI > NVIDIA in Quake 4.
Nice.

That's good for the XT (with its superior memory bandwidth it should be expected to win; see the rough numbers below), and even better for the XL (which doesn't hold a theoretical advantage over the 7800 GT).
With FSAA, an XL is enough to match a GTX :smile:

Now we wait for NVIDIA's reaction.
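
To put rough numbers on the bandwidth point (reference memory clocks quoted from memory, so treat them as approximate; a 256-bit bus is assumed for all four cards):

Code:
# Rough sanity check of the bandwidth claim above. Effective (DDR) memory
# clocks are approximate reference values, not figures from this thread.
cards = {
    "X1800 XT": 1500e6,  # ~750 MHz GDDR3, DDR
    "X1800 XL": 1000e6,  # ~500 MHz
    "7800 GTX": 1200e6,  # ~600 MHz
    "7800 GT":  1000e6,  # ~500 MHz
}
bus_bits = 256

for name, eff_clock in cards.items():
    gb_per_s = eff_clock * bus_bits / 8 / 1e9  # bits/s -> bytes/s -> GB/s
    print(f"{name:9s} ~{gb_per_s:.1f} GB/s")

# XT ~48 GB/s vs GTX ~38.4 GB/s, while the XL and GT are both ~32 GB/s.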
 
Correct, it appears they calculated all the percentages using the final (higher) fps number as the denominator rather than the original result. :LOL:

Edit: sent Brandon an email to suggest a correction, even though we all get the picture.
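
For what it's worth, here's a quick sketch (in Python, with made-up fps figures rather than the article's actual numbers) of the difference between the two ways of computing the gain:

Code:
old_fps = 60.0   # hypothetical result with the previous driver
new_fps = 90.0   # hypothetical result with the hotfix driver

# Correct: percentage gain relative to the original result.
gain_correct = (new_fps - old_fps) / old_fps * 100   # 50.0%

# What the article apparently did: divide by the final (higher) number,
# which understates the improvement.
gain_wrong = (new_fps - old_fps) / new_fps * 100     # ~33.3%

print(f"correct: {gain_correct:.1f}%  vs  article-style: {gain_wrong:.1f}%")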
 
Am I reading that correctly, that the Q4 results are improved (dramatically) even without AA applied?
 
Ratchet said:
Am I reading that correctly, that the Q4 results are improved (dramatically) even without AA applied?
You certainly are. I'll try and duplicate his results later on. Wavey (and anyone else with the hardware), are you seeing the same?
 
Rys said:
You certainly are. I'll try and duplicate his results later on. Wavey (and anyone else with the hardware), are you seeing the same?
I have the hardware but not the game (yet). I'll check Doom3 when I get this other stuff done though.
 
Maybe because they tested noAA/8xAF instead of noAA/noAF as other sites usually do?

Normally I hate it when sites test without AF since nobody will ever play a game that way, but this time it might be interesting to see where the performance gains come from.


On a side note, I wonder how many sites will not use the patch because they refuse to turn AI on?
 
mjtdevries said:
Maybe because they tested noAA/8xAF instead of noAA/noAF as other sites usually do?

Normally I hate it when sites test without AF since nobody will ever play a game that way, but this time it might be interesting to see where the performance gains come from.


On a side note, I wonder how many sites will not use the patch because they refuse to turn AI on?
I think Doom3 always runs with 8xAF if the high quality mode is selected.
 
Just wondering how NVIDIA is responding to all this. Is anyone interviewing them? It seems to be quite significant to NVIDIA that they just lost a major advantage in their product.
 
The list of games that are dominated by the 7800 series gets shortened by A LOT if these performance gains are reflected in other OpenGL titles as well...

 