Excellent article on NVidia and ATi Optimisations

Discussion in 'Graphics and Semiconductor Industry' started by g__day, Aug 18, 2003.

  1. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    If I understand you, you're saying that you consider it an nVidia advantage that nVidia's low-quality settings are of a lower quality than ATi's low-quality settings, even though ATi's high-quality settings are at least as good as, and often better than, nVidia's high-quality settings...?

    What's the advantage?
     
  2. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    So, it's a *good* thing that Nvidia cards give you unsatisfactory results in terms of either IQ or performance in every game? :roll:

    I don't think Nvidia should be applauded for producing a card that compromises everywhere, and thus doesn't produce the required good performance and IQ at the same time. Especially when their competitors manage to get good IQ and speed together.

    It's like going out to get lunch: at the ATI bar you get the bread and the filling. At the Nvidia bar, you can get either the bread or the filling, neither of which makes a good sandwich by itself.
     
  3. Exxtreme

    Newcomer

    Joined:
    Feb 7, 2002
    Messages:
    87
    Likes Received:
    0
    Location:
    Germany
    No, I mean that with a GeForce you can have really good AF quality if you want. You can choose how much performance or quality to sacrifice.

    With a Radeon this is not possible; you can never get very good AF quality.
     
  4. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    The ATI solution may not be ideal, but the point you're missing is it's really a case of ATI doing their best with hardware as it currently stands.

    And no article can say nVidia has better AF than ATI without first doing an nVidia 8x AF vs. ATI 16x AF comparison, followed by some comparative framerate scores (no good saying X is better than Y if X is only better in theory and unplayable in practice).
     
  5. Rolf N

    Recurring Membmare
    Veteran

    Joined:
    Aug 18, 2003
    Messages:
    2,494
    Likes Received:
    55
    Location:
    yes
    Disregarding cheats and preemptive compromises for a second, NV hardware has the ability to perform near perfect AF up to 8x. At a cost.

    Starting at a base level (=no AF), you can improve quality to your desired level. And that's it. Of course you lose performance. If it bothers you, reduce AF.

    The fundamental difference from R300 (in a perfect world, again) is that on R300 there will be deficiencies in the AF filtering results no matter how high you crank it up. You can't remove them by investing more performance. As it seems to be a hardware limitation, it's not "evil", but it's still "bad".

    PS: I'm actually an ATI nut more than an NV nut (currently 3 Radeons vs 2 Geforces, plus the Radeon is my preferred gaming solution). But you just have to hand them the AF crown. That's only fair. Just because NV stopped playing fair doesn't mean you have to imitate them :p
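    As a toy model of that point (purely illustrative; this is not R300's actual selection logic), an angle-dependent cap can't be bought back with performance, because the cap is a function of surface angle, not of the setting you request:

    ```python
    def effective_max_aniso(surface_angle_deg, requested=16):
        """Toy model of angle-dependent AF (illustration only, not real
        hardware logic): full anisotropy at angles near multiples of 45
        degrees, reduced in between. Raising `requested` never helps at
        the bad angles -- the ceiling depends on angle, not on how much
        performance you are willing to spend."""
        # Distance (in degrees) to the midpoint between multiples of 45:
        # 22.5 at the "good" angles, 0 at the worst ones.
        off = abs((surface_angle_deg % 45) - 22.5)
        falloff = off / 22.5          # 1.0 at good angles, 0.0 at the worst
        return max(2, round(requested * falloff))

    effective_max_aniso(0)                    # good angle: full 16x
    effective_max_aniso(22.5)                 # worst angle: clamped to 2x
    effective_max_aniso(22.5, requested=64)   # asking for more changes nothing
    ```

    Under a model like this, "crank the slider higher" fixes nothing at the weak angles, which is exactly the sense in which the limitation is a hardware property rather than a performance trade-off.
    
    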
     
  6. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    I disagree with that comment, because 'if it bothers you, reduce AF' means reducing filtering to gain frames, and reducing sampling is lowering quality (4x AF appears to be the 'sweet spot', and the abrupt filtering changes are noticeable, especially in first-person shooters).
    If you are forced to lower filtering, then you are not able to use the 'superior' implementation I see thrown around a lot.

    Nvidia's implementation takes far too much of a performance hit, and the company itself knows it, or they wouldn't be dropping down to pseudo-trilinear/bilinear when they detect UT 2003.

    There are trade-offs in any architecture, and I believe ATI's implementation is more real-world usable in 99% of the games on the market. I've never touched the slider on my 9700; it always stays at 16x.
    Quite a few of my acquaintances jumped from a Ti4600 to 9700s and 9800s, and that is the first thing they said: no more having to play around with the slider to get frames, no more abrupt filtering changes in front of them.
     
  7. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    Well it tells us the filtering should be applied anisotropically to the 'view plane', not isotropically.

    Come on now, I can link to Dungeon Siege benchmarks on this site and many others that show well over a 56% performance hit, and we are talking about $500 US cards here.

    1600x1200, 4xAA/8xAF
    2003-08-13 16:56:18 - BF1942
    Frames: 1137 - Time: 94344ms - Avg: 12.051 - Min: 3 - Max: 23

    1600x1200, 4xAA, no AF
    2003-08-13 16:58:55 - BF1942
    Frames: 2669 - Time: 93406ms - Avg: 28.574 - Min: 22 - Max: 35

    Enabling 8x AF lowered the average frame rate from 28 to 12 in one of the most popular online games out there... UT 2003 would take an even bigger hit.
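    For what it's worth, the hit in those Fraps averages works out like this:

    ```python
    # Performance hit computed from the BF1942 averages posted above.
    no_af = 28.574   # 1600x1200, 4xAA, no AF
    af_8x = 12.051   # 1600x1200, 4xAA/8xAF
    hit = (no_af - af_8x) / no_af * 100
    print(f"{hit:.1f}% drop")   # about a 58% drop in average frame rate
    ```
    
    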
     
  8. Myrmecophagavir

    Newcomer

    Joined:
    Dec 28, 2002
    Messages:
    136
    Likes Received:
    0
    Location:
    Oxford, UK
    No, it's to do with the shape of the pixel's projection in texture space...
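    Concretely (a sketch using roughly the textbook formulation from the OpenGL EXT_texture_filter_anisotropic spec, not either vendor's actual hardware): the degree of anisotropy comes from how stretched that projected footprint is, and the hardware cap limits how many samples are spread along its major axis.

    ```python
    import math

    def anisotropy(dudx, dvdx, dudy, dvdy, max_aniso=8):
        """Degree of anisotropy from a pixel's texture-space footprint.

        (dudx, dvdx) and (dudy, dvdy) are the texture-coordinate
        derivatives along screen x and y, i.e. the two edges of the
        (approximately) parallelogram-shaped footprint."""
        px = math.hypot(dudx, dvdx)   # footprint extent along screen x
        py = math.hypot(dudy, dvdy)   # footprint extent along screen y
        p_max, p_min = max(px, py), min(px, py)
        # Number of probes along the major axis, capped by the hardware limit.
        n = min(math.ceil(p_max / p_min), max_aniso)
        # Mip level chosen for the footprint once divided among the probes.
        lod = math.log2(p_max / n)
        return n, lod

    # A surface viewed nearly edge-on: a long, thin footprint, so eight
    # probes are spread along the stretched direction.
    n, lod = anisotropy(dudx=8.0, dvdx=0.0, dudy=0.0, dvdy=1.0)
    ```

    When the footprint is roughly square (a surface viewed head-on), n collapses to 1 and this degenerates to ordinary trilinear filtering; the performance cost scales with n, which is why steep, distant surfaces are the expensive case.
    
    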
     
  9. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    All I'm saying is that there is no 'white paper' stating that filtering must be applied equally at all angles for anisotropic filtering. Ideally, yes, a perfect circle is nice, but the performance hit associated with that implementation makes high sampling (above 4x AF) unusable in 'real-world' scenarios.

    The 5800 numbers above came from a 'real-world' test by a member on Nvnews: online in BF 1942, with lots of action going on.
     
  10. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
    Not being able to use 8x AF at 1600x1200 in every game out there hardly makes it unusable, though. Sure, ATI's might be more "real-world usable", and some people might prefer the trade-off ATI made, but that's a completely different thing.
     
  11. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    Why is it that the playable 16x AF pushing the mipmap planes into the far distance is considered irrelevant (even though having them close to you is entirely noticeable most of the time... you're always seeing what's in front of you, after all), yet the fact that nVidia's method smooths out certain angles that the ATI method doesn't is thought to be all-important... even though you really won't get away with anything beyond 4x AF, meaning you're going to have mipmaps in your face all the time?

    More importantly, why is every shot showing what a major flaw this is in the ATI method taken from a camera angle the player won't get while actually playing the game? Take the camera outside the player's location and it doesn't surprise me that you're going to expose some flaws: if compromises were made, I think it's safe to say they were made on the assumption that the player would be viewing from the player's location (and there are many rules of thumb you can apply once you have that assumption).

    Really, which is worse for you? Constantly facing close mipmaps, or having a few angles somewhat less sharp?
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.