Recent Radeon X1K Memory Controller Improvements in OpenGL with AA

Dave Baumann said:
WRT this patch, I'd suggest looking at the pre-patched 4x vs 6x scores. 6x is actually faster than 4x on the XT, suggesting that something may not quite have been right with the 4x memory mappings in the first place.
Pre-patched scores from Guru3d

Resolution    4x    6x
800x600       92    84
1024x768      74    67
1280x1024     51    47
1600x1200     37    33
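
A bit of context on why a "memory mapping" would matter at all: multi-channel memory controllers interleave addresses across channels, and an access pattern that lines up badly with the interleave (as the 4x AA layout apparently did here) can saturate one channel while the others sit idle. Here's a toy sketch of the idea; the constants and the mapping function are made up for illustration and have nothing to do with ATI's actual scheme:

Code:
/* Toy sketch of address-to-channel interleaving in a multi-channel
   memory controller. Purely illustrative: the constants and mapping
   are hypothetical, not ATI's actual scheme. */
#include <stdio.h>
#include <stdint.h>

#define NUM_CHANNELS 4    /* hypothetical channel count */
#define INTERLEAVE 256    /* hypothetical interleave granularity (bytes) */

/* Map a linear byte address to a memory channel. */
static unsigned channel_for(uint64_t addr)
{
    return (unsigned)((addr / INTERLEAVE) % NUM_CHANNELS);
}

int main(void)
{
    unsigned hits[NUM_CHANNELS] = {0};

    /* A strided walk, standing in for whatever pattern AA produces.
       1024 is exactly INTERLEAVE * NUM_CHANNELS, so every access
       lands on the same channel; try 1000 and the load spreads out. */
    const uint64_t stride = 1024;
    for (uint64_t i = 0; i < 4096; i++)
        hits[channel_for(i * stride)]++;

    for (unsigned c = 0; c < NUM_CHANNELS; c++)
        printf("channel %u: %u accesses\n", c, hits[c]);
    return 0;
}

With the stride above, all 4096 accesses hammer channel 0; a stride that isn't a multiple of the interleave times the channel count distributes them evenly, which is the kind of difference a remapped controller config can make.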
 
I just have to say that this is one of those threads that makes it clear why the Beyond3D forum is so freaking awesome.

Quality, curiosity, and a passion for understanding.

I love this place.
 
Serenity, those results are from the XL (and are very much along similar lines to those that I already posted in this thread). The telling results are with the XT.
 
Dave Baumann said:
Serenity, those results are from the XL (and are very much along similar lines to those that I already posted in this thread). The telling results are with the XT.
Ah, I see.

Thanks for the link, ferro. :)
 
Hanners is reporting over at EB that ATI has publicly released a hotfix for 5.10 that includes the OGL 4xAA performance enhancement. Can anybody confirm or deny that it's in there?
 
tEd said:
Will these optimisations be controlled via Cat. AI on a per-game basis, or will they also work if Cat. AI is disabled?
Why would Cat AI have a say in these optimisations being turned on or not? Isn't Cat AI supposed to increase performance whilst sacrificing (a bit of) image quality? This patch doesn't alter the image quality (or so I gathered), so I don't see why it would be under Cat AI's control.
 
XSBagage said:
Why would Cat AI have a say in these optimisations being turned on or not? Isn't Cat AI supposed to increase performance whilst sacrificing (a bit of) image quality? This patch doesn't alter the image quality (or so I gathered), so I don't see why it would be under Cat AI's control.

Because maybe this optimization has different parameters for different games/AA settings?
 
XSBagage said:
Why would Cat AI have a say in these optimisations being turned on or not? Isn't Cat AI supposed to increase performance whilst sacrificing (a bit of) image quality? This patch doesn't alter the image quality (or so I gathered), so I don't see why it would be under Cat AI's control.

Cat A.I. has nothing to do with sacrificing image quality for performance. Rather, it exists to apply application-specific fixes, be they memory controller adjustments or something else. So yes, it's absolutely something that comes under Cat A.I.'s control if the particular adjustment isn't a global one.

geo: Yeah, they released a couple of things: a Serious Sam II driver and a Quake IV, erm, something (haven't tried it).
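
For anyone wondering what "application-specific fixes" looks like mechanically: conceptually it's just a lookup from the running executable to a bundle of overrides, consulted only when Cat A.I. is enabled. A minimal sketch; every name, field and entry below is hypothetical, not ATI's actual profile format:

Code:
/* Minimal sketch of a per-application driver profile lookup.
   All names, fields and entries are hypothetical, invented for
   illustration; this is not ATI's actual profile system. */
#include <stdio.h>
#include <string.h>
#include <stdbool.h>

struct app_profile {
    const char *exe_name;     /* executable the profile applies to */
    int         mc_preset;    /* hypothetical memory-controller preset */
    bool        force_aa_off; /* block AA where it corrupts rendering */
};

static const struct app_profile profiles[] = {
    { "somegame.exe",  2, false }, /* e.g. an MC mapping tuned for GL + AA */
    { "brokenapp.exe", 0, true  }, /* e.g. AA causes corruption here */
};

static const struct app_profile *lookup_profile(const char *exe)
{
    for (size_t i = 0; i < sizeof profiles / sizeof profiles[0]; i++)
        if (strcmp(profiles[i].exe_name, exe) == 0)
            return &profiles[i];
    return NULL; /* no profile: global defaults apply */
}

int main(void)
{
    const struct app_profile *p = lookup_profile("somegame.exe");
    if (p)
        printf("MC preset %d, AA %s\n",
               p->mc_preset, p->force_aa_off ? "forced off" : "allowed");
    return 0;
}

The "fixes" side (like blocking AA where it corrupts output, mentioned below) would just be more fields in the same table, which is why they all ride along with the Cat A.I. toggle.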
 
Rys said:
Cat A.I. has nothing to do with sacrificing image quality for performance. Rather, it exists to apply application-specific fixes, be they memory controller adjustments or something else.
Eh? Then why are there both a "low" and a "high" setting, rather than just "on"? I, too, thought it involved IQ (for better or for worse).
 
Pete said:
Eh? Then why are there both a "low" and a "high" setting, rather than just "on"? I, too, thought it involved IQ (for better or for worse).
Bad language on my part. It doesn't solely exist to apply IQ/performance tradeoffs; rather, that's just one section of its remit. And yeah, I've often wondered about the granularity myself. The two-stage choice is a really bad UI for what's essentially a profiling system.
 
Rys said:
Cat A.I. has nothing to do with sacrificing image quality for performance. Rather, it exists to apply application-specific fixes, be they memory controller adjustments or something else. So yes, it's absolutely something that comes under Cat A.I.'s control if the particular adjustment isn't a global one.
Fixes too? Are you sure about that?
 
Absolutely. Think preventing AA from being used if it causes image corruption.
 
Catalyst AI is now a kludge.

Initially, AI's remit was to be a method of enabling the various texture optimisations that RV350+ is capable of; the default mode just uses the standard texture optimisations, while the high mode is supposed to do some texture frequency analysis on the fly and pick optimal filtering modes to improve performance without, supposedly, sacrificing IQ too much (hence the "AI" in the name). Turn it off entirely and all texture optimisations are gone.

That is what Cat AI is supposed to do; however, ATI also decided to attach the activation of any app-specific fix/optimisation to it. IMO this is a way of making sure that people bench with Cat AI on ("Why doesn't this app work?" "Oh, turn Cat AI on and it will (and, incidentally, the texture optimisations will be present and performance goes up)"). What was a secondary element of Cat AI has now sort of become the primary element of it.
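
To make the "texture frequency analysis" idea concrete: one cheap proxy for high-frequency content is how much a mip level differs from an upsample of the next one down; smooth textures tolerate cheaper filtering, detailed ones don't. The sketch below is a hypothetical illustration of such a heuristic, with made-up thresholds; it is not what the driver actually does:

Code:
/* Hypothetical sketch of frequency-based filter selection: estimate
   a texture's high-frequency content from the difference between a
   mip level and an upsample of the next smaller one, then pick a
   filtering mode. Illustrative only; not ATI's actual heuristic. */
#include <stdio.h>
#include <stdlib.h>

enum filter_mode { FILTER_BILINEAR, FILTER_BRILINEAR, FILTER_TRILINEAR };

/* Mean absolute difference between mip level 0 (w x h) and a
   nearest-neighbour upsample of mip level 1 (w/2 x h/2); low values
   mean little high-frequency detail. */
static double mip_delta(const unsigned char *mip0, const unsigned char *mip1,
                        int w, int h)
{
    double sum = 0.0;
    for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++) {
            int up = mip1[(y / 2) * (w / 2) + (x / 2)];
            sum += abs(mip0[y * w + x] - up);
        }
    return sum / (w * h);
}

static enum filter_mode pick_filter(double delta)
{
    /* Thresholds are made up for illustration. */
    if (delta < 4.0)  return FILTER_BILINEAR;   /* smooth: cheap is fine */
    if (delta < 16.0) return FILTER_BRILINEAR;  /* partial trilinear blend */
    return FILTER_TRILINEAR;                    /* detailed: full quality */
}

int main(void)
{
    enum { W = 4, H = 4 };
    /* Checkerboard: maximal high-frequency content. */
    unsigned char mip0[W * H] = {
        0, 255, 0, 255,  255, 0, 255, 0,
        0, 255, 0, 255,  255, 0, 255, 0 };
    unsigned char mip1[(W / 2) * (H / 2)] = { 128, 128, 128, 128 };

    static const char *name[] = { "bilinear", "brilinear", "trilinear" };
    double d = mip_delta(mip0, mip1, W, H);
    printf("delta %.1f -> %s filtering\n", d, name[pick_filter(d)]);
    return 0;
}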
 
Thanks for the explanations; apparently I was stuck on the optimisations part.
And I agree it is a kludge. I'd rather see the two functions split up, as I always keep AI turned off because of the optimisations part. (Never had any problems despite AI being off, though.)
 
Is there any Cat option/configuration available that maintains app-specific memory controller optimizations without forcing texture filtering optimizations?
 
Luminescent said:
Is there any Cat option/configuration available that maintains app-specific memory controller optimizations without forcing texture filtering optimizations?
What's the matter with setting the slider to HQ and leaving AI on?

Jawed
 