Excellent article on NVidia and ATi Optimisations

Exxtreme said:
Yes, but when you're using a GeForce, you always have the choice between high image quality with low performance and low image quality with high performance... except in UT2003.

If I understand you, you're saying that you consider it an nVidia advantage that nVidia's low-quality settings are of lower quality than ATi's low-quality settings, even though ATi's high-quality settings are at least as good as, and often better than, nVidia's high-quality settings...?

What's the advantage?
 
Exxtreme said:
Yes, but when you're using a GeForce, you always have the choice between high image quality with low performance and low image quality with high performance... except in UT2003.

So, it's a *good* thing that Nvidia cards give you unsatisfactory results in terms of either IQ or performance in every game? :rolleyes:

I don't think Nvidia should be applauded for producing a card that compromises everywhere and thus can't deliver good performance and IQ at the same time, especially when their competitors manage to get good IQ and speed together.

It's like going out to get lunch: at the ATI bar you get the bread and the filling. At the Nvidia bar, you can get either the bread or the filling, neither of which makes a good sandwich by itself.
 
WaltC said:
Exxtreme said:
Yes, but when you're using a GeForce, you always have the choice between high image quality with low performance and low image quality with high performance... except in UT2003.

If I understand you, you're saying that you consider it an nVidia advantage that nVidia's low-quality settings are of lower quality than ATi's low-quality settings, even though ATi's high-quality settings are at least as good as, and often better than, nVidia's high-quality settings...?

What's the advantage?
No, I mean that with a GeForce you can have really good AF quality if you want. You have the choice of how much performance or quality you will sacrifice.

With a Radeon this is not possible. You never get really good AF quality.
 
The ATI solution may not be ideal, but the point you're missing is that it's really a case of ATI doing their best with the hardware as it currently stands.

And no article can say nVidia has better AF than ATI without first comparing nVidia 8x AF vs. ATI 16x AF, followed by some comparative framerate scores (it's no good saying X is better than Y if X is only better in theory and unplayable in practice).
 
Disregarding cheats and preemptive compromises for a second, NV hardware has the ability to perform near-perfect AF up to 8x. At a cost.

Starting at a base level (=no AF), you can improve quality to your desired level. And that's it. Of course you lose performance. If it bothers you, reduce AF.

The fundamental difference with R300 (in a perfect world, again) is that there will be deficiencies in the AF results no matter how high you crank it up; you can't remove them by investing more performance. As it seems to be a hardware limitation, it's not "evil", but it's still "bad".
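
To make that concrete, here's a rough, purely illustrative Python sketch (not the actual hardware algorithm, and the numbers in it are invented) of the difference: an NV2x/NV30-style implementation applies the requested degree at essentially any surface angle, while an R300-style implementation caps the effective degree depending on how the textured surface is rotated, so some angles stay under-filtered no matter what you ask for.

def effective_aniso_nv(requested, angle_deg):
    # Idealised NV-style AF: the requested degree applies at (almost) any angle.
    return requested

def effective_aniso_r300(requested, angle_deg):
    # Hypothetical R300-style AF: the degree is capped for surfaces rotated away
    # from the 0/45/90 degree "sweet spots" (the flower pattern in AF testers).
    # The exact cap values here are made up purely for illustration.
    off_axis = min(angle_deg % 45, 45 - (angle_deg % 45))
    cap = 16 if off_axis < 5 else (4 if off_axis < 15 else 2)
    return min(requested, cap)

for angle in (0, 10, 22.5, 45, 67.5, 90):
    print(angle, effective_aniso_nv(16, angle), effective_aniso_r300(16, angle))

The point is just that with the NV-style curve you can buy your way to better quality, while with the R300-style curve some of it stays out of reach.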

PS: I'm actually an ATI nut more than an NV nut (currently 3 Radeons vs 2 Geforces, plus the Radeon is my preferred gaming solution). But you just have to hand them the AF crown. That's only fair. Just because NV stopped playing fair doesn't mean you have to imitate them :p
 
zeckensack said:
Starting at a base level (=no AF), you can improve quality to your desired level. And that's it. Of course you lose performance. If it bothers you, reduce AF.

I disagree with that comment: 'If it bothers you, reduce AF' means reducing filtering to gain frames, and reducing sampling lowers quality (4X AF appears to be the 'sweet spot', and the abrupt filtering transitions are noticeable, especially in first-person shooters).
If you are forced to lower filtering, then you are not able to use the 'superior' implementation I see thrown around a lot.

Nvidia's implementation takes far too much of a performance hit, and the company itself knows it; otherwise they wouldn't be dropping down to pseudo-trilinear/bilinear filtering when UT2003 is detected.
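
For anyone unsure what 'pseudo-trilinear' refers to: instead of blending the two nearest mip levels across the whole transition range the way full trilinear does, the blend is only applied in a narrow band around the mip boundary, with plain bilinear everywhere else. A rough sketch of the idea in Python (the band width is an invented number, not anything measured from the driver):

def trilinear_weight(lod):
    # Full trilinear: the blend between mip N and mip N+1 ramps smoothly
    # across the whole fractional LOD range.
    return lod % 1.0

def pseudo_trilinear_weight(lod, band=0.2):
    # "Brilinear" sketch: blend only within a narrow band around the point
    # where plain bilinear would switch mips; snap to one level elsewhere.
    frac = lod % 1.0
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0                  # sample mip N only (bilinear)
    if frac > hi:
        return 1.0                  # sample mip N+1 only (bilinear)
    return (frac - lo) / band       # short blend across the boundary

The narrower that band gets, the more of the abrupt transition that trilinear is supposed to hide comes back.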

There are trade-offs in any architecture, and I believe ATI's implementation is more real-world usable in 99% of the games on the market. I've never touched the slider on my 9700; it always stays at 16X.
Quite a few of my acquaintances jumped from Ti4600s to 9700s and 9800s, and that is the first thing they said: no more having to play around with the slider to get frames, no more abrupt filtering changes in front of them.
 
Bjorn said:
What does the definition of "anisotropic" tell us then?

Well, it tells us the filtering should be applied anisotropically to the 'view plane', not isotropically.
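
For what it's worth, the closest thing to a written-down definition is probably the OpenGL EXT_texture_filter_anisotropic extension, which describes the degree of anisotropy purely in terms of the pixel's footprint in texel space (its screen-space derivatives). Roughly, in Python:

import math

def aniso_degree(dudx, dvdx, dudy, dvdy, max_aniso):
    # Reference formulation along the lines of EXT_texture_filter_anisotropic.
    # Real implementations are explicitly allowed to approximate this.
    px = math.hypot(dudx, dvdx)                  # footprint extent along screen x
    py = math.hypot(dudy, dvdy)                  # footprint extent along screen y
    pmax, pmin = max(px, py), max(min(px, py), 1e-6)
    n = min(math.ceil(pmax / pmin), max_aniso)   # number of probes to take
    lod = math.log2(pmax / n)                    # mip level after spreading the probes
    return n, lod

Note it says how many probes to take and where, and leaves how closely the hardware follows that ideal up to the implementation.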

OK, so finding one game that's unplayable at, e.g., 1600*1200 with 6X FSAA on a R9800 makes the FSAA unusable on that card?

Come on now, I can link to Dungeon Siege benchmarks on this site and many others that show well over a 56% performance hit; we are talking about $500 US cards here.

1600x1200 , 4xAA/8xAF
2003-08-13 16:56:18 - BF1942
Frames: 1137 - Time: 94344ms - Avg: 12.051 - Min: 3 - Max: 23
1600x1200 , 4xAA/ no AF
2003-08-13 16:58:55 - BF1942
Frames: 2669 - Time: 93406ms - Avg: 28.574 - Min: 22 - Max: 35

Enabling 8X AF lowered the average frame rate from 28 to 12 in one of the most popular online games out there... the hit in UT2003 would be even bigger.
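
Working the hit out from those FRAPS averages, just for reference:

no_af, with_af = 28.574, 12.051                          # average fps from the two runs above
print(f"8X AF cost: {(no_af - with_af) / no_af:.1%}")    # roughly 57.8%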
 
All I'm saying is that there is no 'white paper' stating that filtering must be applied equally at all angles to count as anisotropic filtering. Ideally, yes, a perfect circle is nice, but the performance hit associated with that implementation makes high sampling levels (above 4X AF) unusable in 'real world' scenarios.

The above 5800 numbers were taken in a 'real world' test by a member of NVNews, playing online with lots of action going on in BF1942.
 
Doomtrooper said:
Come on now, I can link to Dungeon Siege benchmarks on this site and many others that show well over a 56% performance hit; we are talking about $500 US cards here.

Not being able to use 8X AF at 1600*1200 in every game out there hardly makes it unusable, though. Sure, ATI's might be more "real world usable", and some people might prefer the trade-off ATI made, but that's a completely different thing.
 
Why is it that playable 16x AF pushing the mipmap transitions into the far distance is considered irrelevant (even though having them close to you is entirely noticeable most of the time... you're always seeing what's in front of you, after all), yet the fact that nVidia's method smooths out certain angles that the ATI method doesn't is thought to be all-important... even though you really won't get away with anything beyond 4x AF, meaning you're going to have mipmap transitions in your face all the time?

More importantly, why is every shot SHOWING what a major flaw this is in the ATI method taken from a camera angle the player won't get while playing the game? Take the camera outside the player's location and it doesn't surprise me that you're going to expose some flaws, since if compromises were made, I think it's safe to say they were made with the assumption that the player would be viewing from the player's location (and there are many rules of thumb that can be applied once you have this assumption).

Really, which is worse for you? Constantly facing close mipmaps, or having a few angles somewhat less sharp?
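
To put a rough number on the 'mipmap planes in the far distance' point: with the reference formulation quoted earlier, the selected mip level works out to about log2(pmax / N), so each doubling of the applied degree N drops the mip level by one, which on a floor seen at a grazing angle pushes every transition roughly twice as far away. A quick illustrative calculation (the footprint lengths are invented and assume the footprint is about one texel wide across the view direction; only the ratios matter):

import math

# Hypothetical footprint lengths (in texels) at increasing distances down a floor.
pmax_samples = [2, 4, 8, 16, 32]

for n in (1, 2, 4, 8, 16):                              # applied AF degree
    lods = [round(max(math.log2(p / n), 0), 1) for p in pmax_samples]
    print(f"{n:2d}x AF -> mip levels {lods}")

In other words, 16x keeps the sharpest mip level all the way out to where 1x has already dropped four levels, which is the 'mipmaps in your face' difference being described.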
 