Catalyst AI

Dave Baumann

ATI has updated Catalyst Control Center with a new control panel called "Catalyst AI". Catalyst AI introduces more options for on-the-fly texture analysis, as well as enabling (or disabling) application-specific optimisations.

We take a short look at Catalyst AI here - please make any comments relating to it in this thread.
 
Just a note that although this post is dated 17 Sept., it is posted on 21 Sept. by me, therefore meeting the ATI embargo date/time for Catalyst AI by media outlets. The reason is that Dave prepared this post on 17 Sept. before he took his leave of absence.
 
Is it just me, or are the Cat 4.10 screenshots in the first column with AA while the 4.9 ones aren't? (On the last page)
 
1 - I hate how the moniker "AI" is applied to everything, no matter how dumb it is. :devilish:

2 - I think I just heard a *SWOOSHHHH* sound. I guess we'll have one hell of a ride off the slippery slope ... see you in Blurriistan. ;)
 
Anonymous said:
Will these options be available in the regular control panel or is it a CCC-exclusive thing?

My guess is CCC only. They've stated that they're phasing out the old control panel completely. CCC is the wave of the future, whether we like it or not.
 
My hope (and I'm poking a few people over it) is that some of the smart cookies who make ATI tweaking utilities can create their OWN interface into it, allowing its control without CCC installed.
 
Anonymous said:
Anonymous said:
Will these options be available in the regular control panel or is it a CCC-exclusive thing?

My guess is CCC only. They've stated that they're phasing out the old control panel completely. CCC is the wave of the future, whether we like it or not.
Definitely a CCC-only thing; they mentioned it in the DH & [H] articles.
 
I really dislike their decision, just as I did when NVIDIA did it with 3DMark.
But replacing a shader in a benchmark is one thing and replacing one in a real game is another, and I worry about this trend.
 
I just read on PCGames.de (Link) that there's a bug in the CCC that reduces texture filtering to plain bilinear with AI off and AF on. Could this explain the good results for 4.10/AI off vs. 4.9 in D³?
 
Kakaru said:
I really dislike their decision, just as I did when NVIDIA did it with 3DMark.
But replacing a shader in a benchmark is one thing and replacing one in a real game is another, and I worry about this trend.
I think what's good is that ATI has said that they're doing this in games instead of keeping quiet about it.

When the initial Doom3 benchmarks came out, ATI lost. We know that, they know that. What ATI has done is to allow ATI owners to have perhaps a more satisfactory Doom3 experience. That's not a bad thing. Especially when they tell you they're doing it. And after the fact that we all know they lost.
 
Reverend said:
Kakaru said:
I really dislike their decision, just as I did when NVIDIA did it with 3DMark.
But replacing a shader in a benchmark is one thing and replacing one in a real game is another, and I worry about this trend.
I think what's good is that ATI has said that they're doing this in games instead of keeping quiet about it.

When the initial Doom3 benchmarks came out, ATI lost. We know that, they know that. What ATI has done is to allow ATI owners to have perhaps a more satisfactory Doom3 experience. That's not a bad thing. Especially when they tell you they're doing it. And after the fact that we all know they lost.
But will reviewers use these optimizations when comparing cards?
 
Now things are going to start to suck.

Reviewer A tests the 6800 Ultra with no optimizations on and ATI with their optimizations on; reviewer B tests the 6800 Ultra with optimizations on and ATI with theirs off. Reviewer C tests with both ATI's and NVIDIA's off. Only thing is, you can't disable NVIDIA's shader replacements and other app-specific optimizations — only filtering optimizations.


Big can of worms here
 
jvd said:
Now things are going to start to suck.

Reviewer A tests the 6800 Ultra with no optimizations on and ATI with their optimizations on; reviewer B tests the 6800 Ultra with optimizations on and ATI with theirs off. Reviewer C tests with both ATI's and NVIDIA's off. Only thing is, you can't disable NVIDIA's shader replacements and other app-specific optimizations — only filtering optimizations.


Big can of worms here

Maybe ATI and NVIDIA should start doing "reviewer replacements" instead?
 
The solution to the problem of possible confusion is simple.

Read only Beyond3D reviews. We Never Get Things Wrong.

:)
 
Tokelil said:
Is it just me or is the cat 4.10 screenshots in the first column with AA while the 4.9 isn't? (On the last page)

Don't you mean it's the other way around?

Look at the pipes; it's almost as if 4.10 didn't have gamma-corrected AA and 4.9 did. Or 4.10 had 2xAA and 4.9 had 4x. Look just left of the soldier :oops:
 
AI could turn out well. Not sure how much the advanced texture analysis mode will do; it's probably only helpful if you are heavily texture limited.

I think the implementation as it is right now is just wrong IMHO. They really should separate the texture filtering optimizations and the app-specific optimizations, so the user can disable texture filtering optimizations without disabling app-specific optimizations and vice versa. Maybe even distinguish between mipmap optimizations and AF optimizations, like NVIDIA has in theory.
 
jvd said:
Now things are going to start to suck.

Reviewer A tests the 6800 Ultra with no optimizations on and ATI with their optimizations on; reviewer B tests the 6800 Ultra with optimizations on and ATI with theirs off. Reviewer C tests with both ATI's and NVIDIA's off. Only thing is, you can't disable NVIDIA's shader replacements and other app-specific optimizations — only filtering optimizations.


Big can of worms here

Things already suck. I find the performance of Doom3 on ATI cards awfully suspicious considering development started on the 9700.

It is almost as if it was engineered to run better or worse depending on the card vendor.
 