A look at IQ at Firingsquad...

CosmoKramer said:
Well, 16X AF does not in any way improve upon the weakness of ATI's
AF, so don't expect miracles in the follow up.

How do you figure that? The filtering is pushed back much further at 16X AF, which gives better depth perception, something AF is supposed to do.

This review was a joke; the lack of 16X AF on the ATI cards was a major mistake, as the shots shown here would have shown the filtering on the track racing lines, especially the yellow 'groove', to be much more pronounced into the distance.

http://firingsquad.gamers.com/hardware/imagequalityshootout/page4.asp

You can't race from that angle BTW, and a review of IQ should always be done from the 'view' of the gamer (in-car or rear camera is most popular) :rolleyes:
 
My guess was that Cosmo was referring to ATI cards not doing filtering at all angles. I'd tend to agree with him there, in that some of the screenshots provided looked like the GFFX card was simply crisper, even relatively close to the camera.
 
Doomtrooper said:
CosmoKramer said:
Well, 16X AF does not in any way improve upon the weakness of ATI's
AF, so don't expect miracles in the follow up.
How do you figure that, the filtering is pushed back much further at 16X AF which gives better depth perception, something AF is supposed to do.

Since you are a known ATI supporter, I assume you know that ATI's AF algorithm is angle dependent? By angle dependent I don't mean in the way all AF is angle dependent, so don't bring that up.

I assume that you also have read the articles here at B3D and/or tried the aniso tester software? If not, do that. You will find that the only difference between 8X and 16X AF is an improvement very strictly in the 0, 90, 180, 270 angles. Other angles have the same (poor) AF.

Did you notice the screenshots FS used? Especially the one where you look out the window of an airplane taking off? I doubt more than 2X effective AF is taking place there and I predict that it will look exactly the same with 16X as with 8X (or even 4X).
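For readers unfamiliar with the angle dependence being described, here is a toy model of the claimed behaviour (an illustration of the idea only, not ATI's actual hardware algorithm; the fall-off curve is invented): the full requested AF degree is applied only near the axis-aligned 0/90/180/270 surface angles and degrades toward 2x in between, so 8X and 16X only differ near those axis-aligned angles.

```python
import math

def effective_af(requested_af, angle_deg):
    """Toy model of an angle-dependent AF scheme: full requested
    degree near axis-aligned angles (0/90/180/270), fading to 2x
    at 45-degree-family angles. Purely illustrative."""
    # distance (in degrees) to the nearest axis-aligned angle
    off_axis = min(angle_deg % 90, 90 - (angle_deg % 90))
    # linearly fade from full AF at 0 deg off-axis down to 2x at 45 deg
    scale = 1.0 - off_axis / 45.0
    level = max(2, round(requested_af * scale))
    # AF degrees are powers of two: snap down to the nearest one
    return 2 ** int(math.log2(level))

# Near axis-aligned angles 16x beats 8x; at 45 degrees both collapse to 2x
print(effective_af(16, 0), effective_af(8, 0))    # 16 8
print(effective_af(16, 45), effective_af(8, 45))  # 2 2
```

Under this model, a screenshot dominated by off-axis geometry (like the airplane shot mentioned above) would look nearly identical at 8X and 16X.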
 
Angle dependent, yes, but the filtering applied to the view plane at 16X does give better depth perception.

Looks fine to me where I drive :LOL:

[attached screenshots: nas2.jpg, nas3.jpg]
 
I already agreed in the other thread about finding flaws in ATI's implementation, BUT I have already proven that 8X AF, when not hacked into their old 'BALANCED MODE', takes far too much of a performance hit, which requires the user to lower the filtering to 4X AF.

So let's ensure we compare usable AF modes when doing reviews, and not ones that just make pretty pictures, as I can guarantee that at 1600x1200, 8X AF on an FX card would be a slideshow with a full field.
 
I see it ;)

Doesn't bother me, as I get much better filtering in front of me. In fact I compared a Ti4400 to a 9700 on this exact game, and I could read the labelling on the walls and on the cars from much further away on the 9700.

Something I prefer.

I don't think people here quite understand what I'm saying, so let's take BF1942 and show the frame counter.

Take a 9800 and a 5900 at 1600x1200 with 4X AA and 8X AF; right away the end user must lower the AF on the FX card to make the game playable, while the 9800 will be able to run 16X.
Now compare screenshots at playable frame rates (12 fps is not playable); let's see the filtering comparisons then. Real world.
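The "compare at playable settings" methodology being argued for could be sketched like this (the frame rates below are made-up numbers purely for illustration, not measurements of any real card):

```python
AF_LEVELS = [16, 8, 4, 2]  # highest quality first

def highest_playable_af(fps_at_level, min_fps=30):
    """Walk down the AF levels and return the highest one whose
    measured frame rate clears the playability floor."""
    for level in AF_LEVELS:
        if fps_at_level(level) >= min_fps:
            return level
    return 0  # not playable even at 2x AF

# Hypothetical numbers for two hypothetical cards at 1600x1200, 4X AA:
card_a = {16: 41, 8: 48, 4: 55, 2: 60}   # stays fast with AF cranked up
card_b = {16: 18, 8: 24, 4: 37, 2: 52}   # must drop AF to stay playable

print(highest_playable_af(card_a.get))  # 16
print(highest_playable_af(card_b.get))  # 4
```

The screenshot comparison would then be taken at each card's returned level (16X vs 4X in this invented example), rather than at a fixed setting one card can't sustain.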
 
Brent said:
I just wanted to point out that optimizations in shaders are OK IMO as long as the IQ remains the exact same as the developer intended, that is a TRUE optimization.

While we agree that optimizing such things in a synthetic benchmark is what's debatable, the difference is that ATI's optimization in 3DMark did not change IQ, while NVIDIA's optimization (and I use that word loosely here) did.

I've looked at Unreal II and do not see the same quasi-trilinear with the 44.03's. I haven't checked with the 45.23's though.

While I'd agree that hand-optimizing shaders has its merits, just like any other driver optimization ATI or NVIDIA makes (a compression algorithm, for instance), doing so on a per-application basis raises concerns about how much of that any given company can do, especially as DX9 games proliferate. Obviously, the benchmarked games will receive the most attention, so it really becomes necessary to diversify test suites so as not to paint an inaccurate picture. Would you agree? The difference I see right now is that NVIDIA is pledging to go forward with these "tough to track" optimizations while ATI is reportedly pulling away from them. Guess we'll see what happens, eh?

Thanks for the info on UII - I think that helps show that full trilinear isn't being used in order to augment scores.

Cheers!
Chris
 
Brent said:
I've looked at Unreal II and do not see the same quasi-trilinear with the 44.03's. I haven't checked with the 45.23's though.
What happens if you rename the application to "ut2003.exe"? :D
 
digitalwanderer said:
Hanners said:
digitalwanderer said:
Thanks for coming by to help clear up the confusion/answer questions....here's one for ya:
crazipper said:
Then, there's still the quasi-trilinear filtering used in UT2003. It may not be on the same level as NVIDIA's "optimizations," but they are optimizations nonetheless
The "quasi-trilinear filtering" used in UT2003 IS one of nVidia's optimizations...what are you talking about? :?
He's talking about ATi only doing trilinear on the first texture stage when AF is forced in the ATi Control Panel. :)
Doh! Thanks Hanners.

Sorry Chris, my bad. Please ignore my ignorance this morning.

Thanks again for coming by, I'll try and be quiet now and just lurk 'til the grey matter kicks into gear. ;)
But that isn't a UT2003-specific thing, it happens in all apps. In fact, since you can choose AF from UT2003 instead of using the control panel, UT2003 is exactly one of the apps that can easily be made not to exhibit this behaviour!
 
CosmoKramer said:
Since you are a known ATI supporter I assume you know that ATI's AF algorithm is angle dependant? By angle dependant I don't mean in the way all AF is angle dependant, so don't bring that up.

I would probably call it "angle deliberate", instead...as the angles of coverage are deliberate.

I assume that you also have read the articles here at B3D and/or tried the aniso tester software? If not, do that. You will find that the only difference between 8X and 16X AF is an improvement very strictly in the 0, 90, 180, 270 angles. Other angles have the same (poor) AF.

To me it's somewhat similar to the differences between rotated grid and ordered grid FSAA--ordered looks fine until you hit the near horizontals and near verticals--where it really falls apart. What ATi is doing is applying 16x AF to the angles at which AF is most noticeable in relation to the camera. I can't see criticising it for that any more than I'd criticise a rotated grid for providing superior FSAA when lines hit certain angles relative to the camera. I think criticising it as "angle dependent" equates to looking at the picture from...well, the wrong angle...;)

Did you notice the screenshots FS used? Especially the one where you look out the window of an airplane taking off? I doubt more than 2X effective AF is taking place there and I predict that it will look exactly the same with 16X as with 8X (or even 4X).

Well then, what we should do is to compare the GFFX's 4x AF to the R9800P's 8x AF, and see what develops...;) That way we'd avoid using the "best" setting for either card and be looking at "apples to apples" from another point of view.
 
WaltC said:
I would probably call it "angle deliberate", instead...as the angles of coverage are deliberate.

Whatever floats your boat... :)

What ATi is doing is applying 16x AF to the angles at which AF is most noticeable in relation to the camera.

That is a very general statement, no? I'd say it's dependent on the geometrical complexity of the game. Recently I played through the fantastic game Gothic II, and ATI's algorithm really fell apart there.

No, I'd say ATI's AF implementation was governed by "the games that clueless reviewers (i.e. the majority) are most likely to use"(TM). Like Quake 3 and Serious Sam (never show me a review that uses SS to display AF again!).

I can't see criticising it for that anymore than I'd criticize a rotated grid for providing superior FSAA when lines hit certain angles relative to the camera.

The difference is that (in the case of 4X AA) every pixel still has 4 subsamples. I doubt that ATI uses the same number of "subelements" to calculate the pixel colour at 22.5 degrees as it does at 90 degrees.

Thinking about it I may be wrong about that, though...
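The subsample point above can be made concrete with a rough cost model (the per-tap counts here are a textbook approximation, not vendor documentation): multisample AA takes a fixed number of coverage samples per pixel regardless of angle, while anisotropic filtering cost scales with the effective degree, which is exactly why reducing the degree at some angles saves real work.

```python
def msaa_samples_per_pixel(aa_level, edge_angle_deg):
    # Multisampling takes the same number of coverage samples for
    # every pixel, whatever the angle of the edge being antialiased.
    return aa_level

def aniso_texel_taps(effective_degree):
    # Rough textbook cost: `degree` bilinear probes along the line of
    # anisotropy, on each of two mip levels (trilinear), with each
    # bilinear probe reading a 2x2 block of 4 texels.
    return effective_degree * 2 * 4

# 4X AA costs the same at any angle...
print(msaa_samples_per_pixel(4, 90), msaa_samples_per_pixel(4, 22.5))  # 4 4
# ...while dropping from 16x to 2x effective AF cuts texel reads 8-fold
print(aniso_texel_taps(16), aniso_texel_taps(2))  # 128 16
```

So the two features aren't symmetric: an angle-dependent AF degree changes the amount of work (and quality) per pixel, whereas MSAA's sample count does not vary with angle.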

Well then, what we should do is to compare the GFFX's 4x AF to the R9800P's 8x AF, and see what develops...;) That way we'd avoid using the "best" setting for either card and be looking at "apples to apples" from another point of view.

Imo (owning cards with both types of AF) that is actually not such a bad idea.
 
I hope you are as critical about the angle dependency of Nvidia's AA methods; I also think a lot of people are making smoke into fire on this subject.

The people I see complaining must not have owned Nvidia products before, where you almost need a desktop hotkey for the AF slider.

I will take 'real world' filtering any day over the 'pretty screen shot' mode with 60% performance hits.

I'd like to see someone run the 'true trilinear quality mode' on a 5900 in one of my UT2003 clan matches; now that would be funny.
 
Doomtrooper said:
I hope you are as critical about the angle dependency of Nvidia's AA methods; I also think a lot of people are making smoke into fire on this subject.

Absolutely. ATI's AA is clearly superior imo.

The people I see complaining must not have owned Nvidia products before, where you almost need a desktop hotkey for the AF slider.

That used to be true, but it seems like the newer cards are much faster at AF than NV2X.

I will take 'real world' filtering any day over the 'pretty screen shot' mode with 60% performance hits.

Most of the time I'd agree.
 
CosmoKramer said:
I assume that you also have read the articles here at B3D and/or tried the aniso tester software? If not, do that. You will find that the only difference between 8X and 16X AF is an improvement very strictly in the 0, 90, 180, 270 angles. Other angles have the same (poor) AF.
You are incorrect. Please be sure to check your sources.
 
Oh dear, we've gone from one extreme to the other.

First we had the framerate, and it was good... nothing else mattered, a good framerate meant the card was good.

Then reviewers looked down on the framerate, and it was bad. So, along came IQ and it was good... nothing else mattered, a good IQ meant the card was good.

Where's the balance? What good is framerate without IQ on a high-end card? What good is IQ without framerate? One of the first things reviewers should do after concluding which AA and AF methods are better is to push them in games to see which ones are actually playable. If nVidia's 8x AF is better than ATI's 16x AF, yet isn't usable in-game, that's like having the sleekest plane ever without giving it the engines necessary for take-off.

The way the article was written, a GeForce 2 could have taken on a GeForce FX in AF and they would have scored pretty close... never mind the fact that the GeForce FX can push its AF to far higher settings; no, we want a (in such cases) meaningless so-called "apples to apples" comparison. Best setting vs. second-best setting is not "apples to apples", and you should remember that 8x AF by itself means nothing. Best setting, second-best setting... these have meaning. In the end, the true question is "Who gives me better theoretical IQ? Who gives me the best playable IQ?"

Let's not lose track of what these cards are used for: gaming. 95% of players will be seeing a game from the view of the player, and it is from THERE that these things should be compared.
 
Quitch said:
Let's not lose track of what these cards are used for: gaming. 95% of players will be seeing a game from the view of the player, and it is from THERE that these things should be compared.

Unfortunately, screenshots have been posted, and they exhibit the same angle dependency issues :!: Saying that it is really a non-issue when it clearly exists in a real game from the gamer's perspective is silly.

First we have claims of "I don't see this in any game I play." Then, "well, I admit it's there, but it doesn't detract from gameplay." Jesus, this is the same "logic" that has been thrown around by defenders of NVIDIA's optimizations. And I think we all agree that was flawed logic. It's great to see that the double standard is alive and well.
 