A look at IQ at Firingsquad...

WaltC

Veteran
http://firingsquad.gamers.com/hardware/imagequalityshootout/default.asp

I found this article to be a better one than the ET article, in that it simply does more to demonstrate "quality": it uses screenshots from actual games, compares the products at like IQ settings to make its points, and ignores framerates completely...but I found it far too abbreviated to really explore the subject in the depth it deserves.

Are 0x FSAA/0x AF and 4x FSAA/8x AF the only IQ settings possible with either product? (Rhetorical question.)

As a result, this article, too, fails to live up to its title: "Image Quality Showdown: ATi vs. nVidia." Instead, we get a semi-detailed but very incomplete and very small slice of the real IQ picture between the two.

A positive was that it included comments on the topic of IQ in general from both companies which were fully attributed--I liked that a lot! It's so much better and more professional than merely saying "nVidia/ATi told me blah, blah, blah..." Now we can hang Tony Tomasi out to dry if the next set of Dets doesn't relinquish control of Trilinear in UT2K3 back to the end user...Angelini does well to include names along with such promises.

However, Angelini loses points for this unattributed statement: "Though ATI has been guilty of the same aggressive optimizations NVIDIA has recently taken flak for, ...". I would say that nowhere has it been demonstrated in the last several months that ATi has done the "same thing" as nVidia with regard to either UT2K3 or 3dMK03. In fact, it would be difficult at best to even generally compare the two in this regard because the things the companies have done respectively are so very different.

Taken together, though, I'm wondering if the problem with these articles is that the authors decided to title them after they'd written them...;)
 
I found the difference between ATi's and nVidia's attitudes towards optimisations very interesting... I was actually surprised to see ATi seem to be going down the road of no application-specific optimisations at all.
 
Hanners said:
I found the difference between ATi's and nVidia's attitudes towards optimisations very interesting... I was actually surprised to see ATi seem to be going down the road of no application-specific optimisations at all.

This is kind of what they've said from the start, though...at least it seems so to me. I believe that within the first week of ATi's very minor optimization (which affected neither IQ nor workload) being discovered in 3dMk03, the company stated that future drivers would no longer target benchmarks for recognition and special-case optimization. I also think it's refreshing to see them looking differently at actual 3d games, as well. nVidia's view, of course, has been consistently that they will continue their version of "optimization" in any game or benchmark they choose.

Approaching this from a slightly more technical angle, I think this is something nVidia has discovered it must do in order to have its products appear competitive in applications and benchmarks they deem key to sales targets. They're having to spend a lot of time teaching developers "workarounds" for various things in order to do this--what they call "optimized code paths" and so on--which will vary depending on the API.

micron said:
It is sooo tough to be an Nvidia fan on these forums.....

I agree--but it doesn't have to be. It's the constant rationalizing and slanting of the facts that make it difficult for them (for those of them who engage in those approaches, that is). The worst example of this is the "ATi did it, too" syndrome...;) ATi has done certain things, and nVidia has done certain things, and it doesn't pay to lump them together in the same stewpot, IMO.
 
However, Angelini loses points for this unattributed statement: "Though ATI has been guilty of the same aggressive optimizations NVIDIA has recently taken flak for, ..."

It's things like that that scare me. I mean, did every reviewer come in late, miss most of the "optimisation" incidents, then shout to their editor "I got the gist of it!" before filing an article?

How can anyone claim to truly understand the issue while writing statements like that? It's a clear indication of NOT understanding the issue... or of not having followed it.
 
Greetings,

Chris Angelini here - just wanted to check in with you guys to answer the questions that have been raised. Yes - the article was particularly short. Yes - there is a follow-up planned already, as this one was to focus on comparing apples to apples, not NVIDIA's best settings to ATI's best settings. And the statement about optimizations was in regard to the 3D Mark03 optimizations ATI has admitted to making and later removing. Then, there's still the quasi-trilinear filtering used in UT2003. It may not be on the same level as NVIDIA's "optimizations," but it's an optimization nonetheless. I appreciate the feedback and thanks for keeping it constructive. Have a good weekend all,

Chris
 
Thanks for coming by to help clear up the confusion/answer questions....here's one for ya:

crazipper said:
Then, there's still the quasi-trilinear filtering used in UT2003. It may not be on the same level as NVIDIA's "optimizations," but it's an optimization nonetheless
The "quasi-trilinear filtering" used in UT2003 IS one of nVidia's optimizations...what are you talking about? :?
 
digitalwanderer said:
Thanks for coming by to help clear up the confusion/answer questions....here's one for ya:

crazipper said:
Then, there's still the quasi-trilinear filtering used in UT2003. It may not be on the same level as NVIDIA's "optimizations," but it's an optimization nonetheless

The "quasi-trilinear filtering" used in UT2003 IS one of nVidia's optimizations...what are you talking about? :?

He's talking about ATi only doing trilinear on the first texture stage when AF is forced in the ATi Control Panel. :)
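
In application-level D3D9 terms, the difference amounts to roughly the following sampler states. This is just a sketch of the concept--not ATi's actual driver code--and the device pointer is assumed to come from elsewhere:

#include <d3d9.h>

// Sketch: trilinear + AF on texture stage 0 only, bilinear + AF on the rest.
// The IDirect3DDevice9* is assumed to have been created by the caller.
void ApplyStageZeroOnlyTrilinearAF(IDirect3DDevice9* device, DWORD maxAniso)
{
    // Stage 0: anisotropic minification with a linear mip filter, i.e. full trilinear AF.
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, maxAniso);

    // Stages 1-7: same anisotropy, but a point mip filter, i.e. only bilinear
    // between mip levels--which is where the IQ difference shows up.
    for (DWORD stage = 1; stage < 8; ++stage)
    {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_POINT);
        device->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, maxAniso);
    }
}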



Thanks for joining us, Chris.... Just one question - Do you have any plans to start including these kinds of image quality tests in all future reviews, or are you going to keep it as a separate issue for now?
 
crazipper said:
Greetings,

Chris Angelini here - just wanted to check in with you guys to answer the questions that have been raised. Yes - the article was particularly short. Yes - there is a follow-up planned already, as this one was to focus on comparing apples to apples, not NVIDIA's best settings to ATI's best settings. And the statement about optimizations was in regard to the 3D Mark03 optimizations ATI has admitted to making and later removing. Then, there's still the quasi-trilinear filtering used in UT2003. It may not be on the same level as NVIDIA's "optimizations," but it's an optimization nonetheless. I appreciate the feedback and thanks for keeping it constructive. Have a good weekend all,

Chris

Chris,

The thing that struck me about the AF comparison is that, for "apples-to-apples," you used nVidia's best-quality AF setting but didn't use ATi's best setting. I had no objection to you comparing 8xAF to 8xAF--none at all--except that, again, it's nVidia's best but not ATi's, that's all. It would have been fine for you to do both 8x & 16x AF in comparison to nVidia's 8x AF, surely.

You said: "So there you have it, ATI tops NVIDIA in anti-aliasing quality while NVIDIA bests ATI in anisotropic filtering." As I mentioned, since you didn't test 16x AF for the ATi product, there's really no basis here on which you could make the above statement. Right? I'm sure you realize that no ATi user is going to refuse to run 16xAF simply because nVidia's limited to an 8xAF setting...?

I also have no objection to a direct comparison between nVidia's 8xFSAA and ATi's 6xFSAA, for the same reasons. Hopefully we'll see that in the follow-up article (as well as 2x, etc.)

In regard to the 3dMk03 optimizations, here are the major distinctions:

(1) According to FM, ATI's optimization was limited to a single test, affected the total score of the bench by < 2%, and did not eliminate benchmark workload or alter image quality. nVidia, though, optimized across several tests, eliminated portions of the benchmark workload, sacrificed IQ, and affected the total score of the bench by >26%.

(2) ATi admitted to what it did. nVidia has never done so.

(3) ATi pledged to no longer target benchmarks for optimizations. nVidia has pledged to keep on optimizing for them.

I see many more differences here than similarities, and I see nothing that would justify your statement that "Though ATI has been guilty of the same aggressive optimizations NVIDIA has recently taken flak for..."

The optimizations were not the same and the aggressiveness was not the same. Hopefully you'll clear that up in part two. Yes, there were optimizations by both companies--not nearly "the same" optimizations, however.

Then, there's still the quasi-trilinear filtering used in UT2003.

By nVidia, I'm assuming you mean, since full trilinear is supported in the Cats from within the application--but turn it on in the Dets from within the application and it's still quasi-trilinear, as the Dets don't support full trilinear in the game regardless of where you attempt to turn it on.

Looking forward to part two!...;)
 
I'd like to make it standard in the major tech releases - it can be covered comprehensively when R360/420/NV40 etc. launches. I don't think it is something that necessarily needs to be rehashed for every odd NV35/R350 card, do you? Perhaps keeping tabs on IQ as new drivers are released would be good as well?
 
WaltC said:
Chris,

The thing that struck me about the AF comparison is that, for "apples-to-apples," you used nVidia's best-quality AF setting but didn't use ATi's best setting. I had no objection to you comparing 8xAF to 8xAF--none at all--except that, again, it's nVidia's best but not ATi's, that's all. It would have been fine for you to do both 8x & 16x AF in comparison to nVidia's 8x AF, surely.

You said: "So there you have it, ATI tops NVIDIA in anti-aliasing quality while NVIDIA bests ATI in anisotropic filtering." As I mentioned, since you didn't test 16x AF for the ATi product, there's really no basis here on which you could make the above statement. Right? I'm sure you realize that no ATi user is going to refuse to run 16xAF simply because nVidia's limited to an 8xAF setting...?

I also have no objection to a direct comparison between nVidia's 8xFSAA and ATi's 6xFSAA, for the same reasons. Hopefully we'll see that in the follow-up article (as well as 2x, etc.)

In regard to the 3dMk03 optimizations, here are the major distinctions:

(1) According to FM, ATI's optimization was limited to a single test, affected the total score of the bench by < 2%, and did not eliminate benchmark workload or alter image quality. nVidia, though, optimized across several tests, eliminated portions of the benchmark workload, sacrificed IQ, and affected the total score of the bench by >26%.

(2) ATi admitted to what it did. nVidia has never done so.

(3) ATi pledged to no longer target benchmarks for optimizations. nVidia has pledged to keep on optimizing for them.

I see many more differences here than similarities, and I see nothing that would justify your statement that "Though ATI has been guilty of the same aggressive optimizations NVIDIA has recently taken flak for..."

The optimizations were not the same and the aggressiveness was not the same. Hopefully you'll clear that up in part two. Yes, there were optimizations by both companies--not nearly "the same" optimizations, however.

Looking forward to part two!...;)

Hey Walt,

Indeed, the next piece will have both companies' settings maximized and a few settings in between (I'd like to get a better idea of what ATI setting equals a comparable NVIDIA setting, and vice versa).

To be completely honest, the conclusion was added by my editor after submission. That's not a cop-out - I think adding "in these tests" after the "NVIDIA bests ATI in anisotropic filtering" would make the statement more accurate. Of course, when I play on my 9800 Pro, it's 16x all the way. We're in complete agreement there.

Further, I understand the distinctions in the 3D Mark03 optimizations, and it is commendable that ATI has taken a stance in that regard. It doesn't, however, change the fact that shaders were optimized for enhanced performance in a synthetic benchmark. There are other bugs that could have been fixed in the time it took the driver writers to make 3D Mark03 go a little faster. Of course, I know WHY it was done, but that doesn't mean I agree with it. At any rate, I'll do what I can to be more specific in the second part.

Oh, and as always, thanks for the constructive feedback. I'm off to my lab in Bakersfield - have a great weekend!

*Edit in response to Walt's edit - yes, that is what I was referring to - and I made it a point to mention the AF control issue in UT2003 to NVIDIA. Has anyone checked to see if the same thing happens in Unreal II?

Chris
 
WaltC,

The thing that struck me about the AF comparison is that, for "apples-to-apples," you used nVidia's best-quality AF setting but didn't use ATi's best setting. I had no objection to you comparing 8xAF to 8xAF--none at all--except that, again, it's nVidia's best but not ATi's, that's all. It would have been fine for you to do both 8x & 16x AF in comparison to nVidia's 8x AF, surely.

How is 8x v 16x apples-to-apples? If the point of the article was to compare each card's best rendering then sure, I would compare 8x to 16x. But Chris was trying to compare the two cards with exactly the same settings and state which card did the exact same process better. Sure, they might have different sampling techniques for their MSAA, but if one wants to compare two products, and Nv wants to call their 8x... 8x, then compare with ATI's 8x (which beats the pants off Nv...)
 
Kalbaz said:
How is 8x v 16x apples-to-apples? If the point of the article was to compare each card's best rendering then sure, I would compare 8x to 16x. But Chris was trying to compare the two cards with exactly the same settings and state which card did the exact same process better. Sure, they might have different sampling techniques for their MSAA, but if one wants to compare two products, and Nv wants to call their 8x... 8x, then compare with ATI's 8x (which beats the pants off Nv...)

It should have been included because of Chris's conclusion at the end of the article, "So there you have it, ATI tops NVIDIA in anti-aliasing quality while NVIDIA bests ATI in anisotropic filtering."

Without an examination of 16x AF included, the statement has no foundation in the article. I see that in a follow-up response Chris says his conclusion was apparently written by his editor, and that he would have qualified it to the scope of the tests he ran.

Anyway, it's no big deal--I'm sure he'll test both cards more fully in the upcoming installment.
 
WaltC, I can see what you mean now--there should have been a slight addendum on that sentence you quoted: "... at exactly the same settings, though ATI can push the filtering even further."
 
Hanners said:
digitalwanderer said:
Thanks for coming by to help clear up the confusion/answer questions....here's one for ya:

crazipper said:
Then, there's still the quasi-trilinear filtering used in UT2003. It may not be on the same level as NVIDIA's "optimizations," but it's an optimization nonetheless

The "quasi-trilinear filtering" used in UT2003 IS one of nVidia's optimizations...what are you talking about? :?

He's talking about ATi only doing trilinear on the first texture stage when AF is forced in the ATi Control Panel. :)
Doh! Thanks Hanners.

Sorry Chris, my bad. Please ignore my ignorance this morning.

Thanks again for coming by, I'll try and be quiet now and just lurk 'til the grey matter kicks into gear. ;)
 
crazipper said:
...
*Edit in response to Walt's edit - yes, that is what I was referring to - and I made it a point to mention to NVIDIA about the AF control issue in UT2003. Has anyone checked to see if the same thing happens in Unreal II?

Chris

Sorry about that...;) I was hoping I could get it in there before anyone responded...ah, well...I did think that was what you meant when I read the article, btw. I have read reports from people with GFFXs that full trilinear is alive and well in U2, and in everything else I'm aware of--so the consensus seems to be that nVidia has specifically targeted UT2K3 in this regard. (I think that's at least part of why KB over at [H] blew his top recently on this subject...nVidia had apparently assured him differently about the recent Det release.) Interestingly, as I saw in another thread here based on a 3dgpu editorial about the matter, 3dgpu says that Brian Burke has said the quasi-trilinear issue is going to stay, as nVidia doesn't see how it affects image quality...;) But Tony Tomasi--as you know--states it will be fixed in an "upcoming" driver release...so who knows?...:) Maybe in '04 when they field some hardware they think won't need the crutch...?

Good weekend to you, too, Chris...!
 
But Tony Tomasi--as you know--states it will be fixed in an "upcoming" driver release...so who knows?... Maybe in '04 when they field some hardware they think won't need the crutch...?

Walt, it's funny that you brought that up, because Rev directly questioned KB on this matter (UT filtering, and which driver release it would be resolved in) on his own forums no less than a month or so ago (in the thread where Kyle basically slams B3D and Rev personally). I wish I could say more, but let's just say that Kyle wasn't the only one talking to NVIDIA over this matter. NVIDIA's stance hasn't changed at all, and KB shouldn't have been shocked when the optimization was still there in 45.23. Obviously Rev saw this one coming and directly questioned him about it!
 
crazipper said:
Further, I understand the distinctions in the 3D Mark03 optimizations, and it is commendable that ATI has taken a stance in that regard. It doesn't, however, change the fact that shaders were optimized for enhanced performance in a synthetic benchmark.

Has anyone checked to see if the same thing happens in Unreal II?

Chris

I just wanted to point out that optimizations in shaders are OK IMO as long as the IQ remains exactly the same as the developer intended--that is a TRUE optimization.

While we agree, though, that optimizing such things in a synthetic benchmark is what's debatable, the difference is the fact that ATI's optimization in 3DMark did not change IQ, while NVIDIA's optimization (and I use that word loosely here) did.

I've looked at Unreal II and do not see the same quasi-trilinear with the 44.03's. I haven't checked with the 45.23's though.
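
For anyone who wants to eyeball this sort of thing themselves, the usual trick is a texture with each mip level painted a different solid color: full trilinear shows smooth blends between the colors, while the quasi-trilinear mode shows hard (or nearly hard) bands. Here's a rough standalone D3D9 sketch of building such a texture--my own illustration of the general idea, not anyone's actual test tool:

#include <d3d9.h>

// Build a texture whose mip levels are each a distinct solid color, so that
// mip transitions (and whether they're blended) are visible on screen.
// The IDirect3DDevice9* is assumed to have been created already.
IDirect3DTexture9* CreateColoredMipTexture(IDirect3DDevice9* device, UINT size)
{
    IDirect3DTexture9* tex = NULL;
    // Levels = 0 asks D3D to generate the full mip chain.
    if (FAILED(device->CreateTexture(size, size, 0, 0, D3DFMT_A8R8G8B8,
                                     D3DPOOL_MANAGED, &tex, NULL)))
        return NULL;

    const D3DCOLOR colors[] = { 0xFFFF0000, 0xFF00FF00, 0xFF0000FF,
                                0xFFFFFF00, 0xFFFF00FF, 0xFF00FFFF };

    for (DWORD level = 0; level < tex->GetLevelCount(); ++level)
    {
        D3DSURFACE_DESC desc;
        tex->GetLevelDesc(level, &desc);

        D3DLOCKED_RECT lr;
        if (FAILED(tex->LockRect(level, &lr, NULL, 0)))
            continue;

        // Fill every texel of this mip level with one solid color.
        for (UINT y = 0; y < desc.Height; ++y)
        {
            D3DCOLOR* row = (D3DCOLOR*)((BYTE*)lr.pBits + y * lr.Pitch);
            for (UINT x = 0; x < desc.Width; ++x)
                row[x] = colors[level % 6];
        }
        tex->UnlockRect(level);
    }
    return tex;
}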
 
crazipper said:
I'd like to make it standard in the major tech releases - it can be covered comprehensively when R360/420/NV40 etc. launches. I don't think it is something that necessarily needs to be rehashed for every odd NV35/R350 card, do you? Perhaps keeping tabs on IQ as new drivers are released would be good as well?

That sounds spot on to me. :)
 