ATI GPU image quality just looks better! But why?

Discussion in '3D Hardware, Software & Output Devices' started by gongo, Apr 5, 2009.

  1. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, I meant compared to a high-end standalone Blu-Ray/DVD player. Stuff like Sony's ES series is built like a safe :)
     
  2. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    I don't want to be rude, but that's just utter bullshit. It really annoys me when people do that.

    I implemented an AF algorithm myself (with perfectly round angle independence, and up to as many samples as I want) and I can tell you that ATI IS cheating. ATI is underfiltering some LOD levels more and some less (even with A.I. disabled). It even gets worse if you switch from 8x to 16x. And what they deliver with A.I. enabled is just ridiculous (I can't test that directly, because for some reason it doesn't show any difference in my application although it certainly does in others - perhaps it's disabled for D3D10).

    You would not believe how far you can go with undersampling textures, because their frequency content is normally relatively low. I guess that they save up to 50% of the samples with A.I. enabled.

    NVIDIA's HQ filter, on the other hand, is indistinguishable from my reference, even if I oversample by a factor of 2 (which just proves that Nyquist was right ;)). The only problem I found with their TMUs is that they have slightly less sample position precision, which results in incorrectly sampled mips in extreme cases. But that is not noticeable to the user at all.
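
    For illustration, here is a minimal sketch of the footprint math a reference AF implementation has to get right (not Novum's actual tester; the derivative values and the 16x cap below are assumptions): the anisotropy ratio dictates how many trilinear taps are needed along the line of anisotropy, and taking fewer taps without compensating via LOD is exactly the underfiltering described above.

    ```cpp
    // Sketch of the footprint math behind a reference anisotropic filter.
    // Assumed/hypothetical: the derivative values and the 16x cap; this is an
    // illustration of the sampling requirement, not the tester from the post.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    struct Footprint {
        int   samples;  // trilinear taps to take along the line of anisotropy
        float lod;      // mip level for each tap
    };

    // du/dx etc. are texel-space derivatives of the texture coordinates.
    Footprint analyzeFootprint(float dudx, float dvdx, float dudy, float dvdy,
                               int maxAniso = 16) {
        // Lengths of the two footprint axes in texel space.
        float lenX = std::sqrt(dudx * dudx + dvdx * dvdx);
        float lenY = std::sqrt(dudy * dudy + dvdy * dvdy);
        float major = std::max(lenX, lenY);
        float minor = std::max(std::min(lenX, lenY), 1e-6f);

        // The anisotropy ratio tells us how many taps are needed so that each
        // tap only has to cover a roughly square region (which trilinear handles).
        float ratio = std::min(major / minor, float(maxAniso));
        int samples = std::max(1, int(std::ceil(ratio)));

        // LOD is chosen from the *minor* axis; taking fewer taps than 'samples'
        // without raising the LOD leaves part of the footprint unfiltered,
        // i.e. underfiltering -> aliasing/shimmering in motion.
        float lod = std::log2(std::max(minor, 1.0f));
        return {samples, lod};
    }

    int main() {
        // A surface seen at a grazing angle: stretched 12:1 in texel space.
        Footprint f = analyzeFootprint(12.0f, 0.0f, 0.0f, 1.0f);
        std::printf("required taps: %d, per-tap LOD: %.2f\n", f.samples, f.lod);
        // Saving ~50% of the taps here (6 instead of 12) without compensating
        // via LOD is exactly the kind of shortcut the post describes.
        return 0;
    }
    ```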
     
    #62 Novum, Apr 16, 2009
    Last edited by a moderator: Apr 16, 2009
  3. vazel

    Regular

    Joined:
    Aug 16, 2005
    Messages:
    992
    Likes Received:
    3
    I remember reading AF comparisons where ATI came out with better quality, but I haven't read anything recently. I may just have leftover bias: back then Nvidia wasn't too hot in trilinear and AF quality, so it's nice to hear they've shaped up. They were especially horrible during the GeForce 6 series, where you could see textures shimmering on the default driver quality setting - the kind of shimmering you usually only get at the lowest quality setting, which indicates lots of optimizations/cheating. Nvidia did release a fix, but you have to set the driver to the highest quality setting, incurring a performance hit.
     
    #63 vazel, Apr 16, 2009
    Last edited by a moderator: Apr 16, 2009
  4. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    From what I recall, the original GeForce 8800 was shockingly close to the reference rasterizer in AA/AF. The Radeon 2900 wasn't - still the same 'poor' AF as on the X1900:
    http://www.xbitlabs.com/articles/video/display/r600-architecture_16.html
    But I haven't really followed whether ATi changed anything in their AA/AF algorithms in the 3000/4000 series. Besides, with AI enabled it all falls apart anyway, no matter how good it can be at its best quality settings.
     
  5. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    ATi did change something, because the X1 series' "HQ AF" option was gone, which leads me to believe it was enabled by default on most things after it.

    I must say the AF quality on my HD4850 "feels" better than on my HD2900, though I have nothing to back that up with.
     
  6. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, if you look at the link I posted, they compare 'default' AF on the 2900 to the 'HQ AF' on the 1900 and 8800.
    I suppose that's because there is no 'HQ AF' anymore, and 'default' is now the same.
    But that's just a change in driver semantics. The hardware apparently performs the same filtering.

    I haven't been able to find any AF analysis images like the one in the XBitlabs article I linked to for newer cards/drivers.
    I'd like to see what both ATi and nVidia are doing now.
     
  7. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    The only in-game comparison:

    http://alienbabeltech.com/main/?p=3188&page=9


    Interesting note on AA in OpenGL:
     
  8. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well, that's interesting. I didn't expect the visual differences to be THIS apparent (most of the time the differences in those coloured mipmap images are mainly academic: you can see a slightly different algorithm drawing a slightly different pattern, but in actual games the difference may not even be noticeable to the naked eye). I suppose it comes down to personal taste then. As the article says, the nVidia one looks more blurry, so people may prefer the sharper ATi look, even if technically nVidia is the more 'correct' one.
     
  9. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    That's just nonsense. People do like the sharper look when looking at screenshots, but I doubt anybody likes the flicker in motion. In offline CG everything is supersampled like hell to avoid even the slightest flicker. It's just not natural.

    It becomes especially apparent if you play old games without shaders (shaders often add flicker on top of what the TMU delivers). You can get really, really perfect anti-aliasing with edge-detect 8x multisampling, but the textures never look that smooth.

    I think Microsoft should start measuring TMU output quality in the WHQL suite in the future. It's just silly to have full IEEE 754 double precision in the ALUs while your input signal is simply wrong.
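
    As a hedged sketch of what such a check could look like in principle (a toy illustration, not anything WHQL actually does; the test pattern and sizes are made up): synthesize the same signal once properly supersampled and once point-sampled, then score the difference, e.g. with PSNR.

    ```cpp
    // Sketch of scoring a filtered result against a supersampled reference,
    // in the spirit of the "measure TMU output in WHQL" suggestion above.
    // Both inputs here are synthetic stand-ins, not real driver output.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    double psnr(const std::vector<float>& test, const std::vector<float>& ref) {
        double mse = 0.0;
        for (size_t i = 0; i < test.size(); ++i) {
            double d = test[i] - ref[i];
            mse += d * d;
        }
        mse /= test.size();
        return 10.0 * std::log10(1.0 / mse);  // values are in [0,1]
    }

    int main() {
        const int n = 256;                 // one scanline of a test pattern
        std::vector<float> reference(n), cheap(n);
        for (int x = 0; x < n; ++x) {
            // "Reference": average 16 sub-samples of a fine stripe pattern
            // (stands in for a properly filtered / supersampled result).
            double acc = 0.0;
            for (int s = 0; s < 16; ++s)
                acc += 0.5 + 0.5 * std::sin((x + s / 16.0) * 2.5);
            reference[x] = float(acc / 16.0);
            // "Cheap": a single point sample (stands in for underfiltering).
            cheap[x] = float(0.5 + 0.5 * std::sin(x * 2.5));
        }
        std::printf("PSNR of cheap vs. reference: %.1f dB\n",
                    psnr(cheap, reference));
        return 0;
    }
    ```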

    The problem today is that games like Crysis flicker like hell anyway.

    But come on, how much performance would they lose? With today's ALU:TEX ratio and thousands of threads you can hide so much latency that I really doubt proper filtering without undersampling would cost them more than 10% of performance on average. Just give the damn users an option. That's especially true for the high-end X2 cards, where you ironically have to live with A.I. because otherwise CrossFire won't work...
     
  10. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,213
    Likes Received:
    1,883
    Location:
    Finland
    That's one thing many seem to skip completely. It just happened on another forum, where a few users think it's an "apples to apples" comparison when the nVidia settings are manually set to HQ and ATI is at HQ (the default) but with A.I. disabled, while in reality disabling A.I. also disables the application-specific fixes and optimizations on ATI, which is something you can't do on nVidia.

    It would be nice to get a view on what exactly A.I. Standard does besides the app-specific fixes/optimizations and CrossFire handling: whether there are any general, IQ-lowering optimizations (even in theory, even if not visible to the naked eye without huge zoom-ins) or not, etc.
     
  11. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    When A.I. is enabled they filter even worse in almost every game I've tried. It's especially noticeable in HL².

    I posted these GIFs here a while ago:
    http://www.mental-asylum.de/files2/aioff.gif
    http://www.mental-asylum.de/files2/aion.gif

    (The 8-bit colour depth doesn't matter; it looks the same as a 32-bit APNG.)

    Also note that ATI keeps lying about this all the time:
    BULL SHIT.

    By the way, R5xx did a better job at filtering than R6xx and R7xx, despite having worse LOD calculation (R5xx HQ was almost as good as NVIDIA's current filter).
     
    #71 Novum, Apr 16, 2009
    Last edited by a moderator: Apr 16, 2009
  12. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    Have they improved Nvidia's 'high' quality mode? It was horrible, with visible mip transitions, as opposed to 'very high' with its large performance impact. I also wonder how much the sharper out-of-box* ATI IQ affects shimmering vs. Nvidia's blurrier out-of-box* IQ. Have there been improvements for the GTX 280?


    * 7900GT/8800GTX era


    I also remember a comparison between R5xx and G7x regarding anisotropic filtering - ATI AF-HQ vs NV AF in HL2. The ATI AF-HQ was better, especially at some angles.
     
  13. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    All GeForces since the 8 series have a better default filter than anything current ATI hardware can do, even with A.I. disabled.

    Even the GeForce 6/7 set to HQ (not Q!, which is really, really bad) filtered better than RV770 with A.I. enabled. People just didn't look at the quality of R600 because the card was crappy anyway, and ATI has gotten away with it to this day.

    NVIDIA is not filtering "blurry", they filter correctly. If you underfilter, you get extra, wrong frequencies, which makes the image look sharper.
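
    A small numeric illustration of that point, using assumed values (a 1-D stripe signal, 4x sample reduction): decimating without a proper low-pass keeps high-frequency energy that folds into wrong frequencies, which reads as extra contrast in a still image but flickers in motion.

    ```cpp
    // 1-D illustration of why underfiltering looks "sharper": decimating without
    // a low-pass keeps (aliased) high-frequency energy that proper filtering removes.
    // The signal and the 4x reduction factor are arbitrary choices for the demo.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    double variance(const std::vector<double>& v) {
        double mean = 0.0, var = 0.0;
        for (double x : v) mean += x;
        mean /= v.size();
        for (double x : v) var += (x - mean) * (x - mean);
        return var / v.size();
    }

    int main() {
        const int n = 1024, factor = 4;
        std::vector<double> fine(n);
        for (int i = 0; i < n; ++i)
            fine[i] = std::sin(i * 2.9);   // frequency near the fine-grid Nyquist limit

        std::vector<double> pointSampled, boxFiltered;
        for (int i = 0; i + factor <= n; i += factor) {
            pointSampled.push_back(fine[i]);             // "underfiltered" fetch
            double acc = 0.0;                            // proper (box) prefilter
            for (int k = 0; k < factor; ++k) acc += fine[i + k];
            boxFiltered.push_back(acc / factor);
        }
        // The point-sampled version keeps far more (now aliased) contrast:
        std::printf("variance point-sampled: %.3f, box-filtered: %.3f\n",
                    variance(pointSampled), variance(boxFiltered));
        return 0;
    }
    ```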

    I'm not talking about angle dependence. That's a non-issue today. Both IHVs do a good enough job at determining the line of anisotropy (though NVIDIA is also doing that better).
     
  14. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Does AA work in UT3-engine games without Cat. A.I.?

    kthxbye.
     
  15. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    How about other games - are all games affected, or just some? I assume A.I. works differently for different games?

    I could upload some A.I. off vs. aggressive-setting images from different games later. HL2 with CM9.5 and Crysis with Rigel's texture pack?

    I was thinking about general image sharpness, as in 2D/3D. The ATI 'wrong frequencies' sharpness seems to me like when I ran NV HQ with a negative LOD bias. Dunno, I'm not really into this stuff.
     
  16. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    No.

    I assume they do a fast frequency test on each texture and set the filter "appropriately".
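
    Purely as a sketch of what such a heuristic could look like (speculative, like the assumption itself; nothing here reflects what ATI's driver actually does): estimate how much of a texture's energy sits in the top half of the frequency band, and undersample low-frequency textures more aggressively.

    ```cpp
    // Speculative sketch of a per-texture frequency heuristic: measure the
    // energy a 2x downsample would throw away relative to the total energy.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Fraction of the signal's energy above the half-band frequency.
    double highFrequencyShare(const std::vector<float>& row) {
        double total = 0.0, high = 0.0;
        for (size_t i = 0; i + 1 < row.size(); i += 2) {
            float avg  = 0.5f * (row[i] + row[i + 1]);   // what a 2x downsample keeps
            float diff = 0.5f * (row[i] - row[i + 1]);   // what it throws away
            total += avg * avg + diff * diff;
            high  += diff * diff;
        }
        return total > 0.0 ? high / total : 0.0;
    }

    int main() {
        const int n = 256;
        std::vector<float> smooth(n), detailed(n);
        for (int i = 0; i < n; ++i) {
            smooth[i]   = std::sin(i * 0.05);   // low-frequency content
            detailed[i] = std::sin(i * 3.0);    // near-Nyquist content
        }
        // A heuristic could, in principle, undersample the first texture far
        // more aggressively than the second without obvious visible damage.
        std::printf("high-frequency share: smooth %.2f, detailed %.2f\n",
                    highFrequencyShare(smooth), highFrequencyShare(detailed));
        return 0;
    }
    ```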

    Would be nice.

    It's not exactly the same, but the effect is similar.

    When you apply a negative LOD bias you get undersampling in both directions; if you do cheap AF you only undersample in one.
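
    A small sketch of that distinction with assumed footprint numbers (a 12:1 stretched footprint, 16x AF available): the per-tap LOD is derived from the minor axis, so a negative bias shrinks the filter width in both u and v at once, while dropping AF taps only leaves the major axis under-covered.

    ```cpp
    // Sketch contrasting a negative LOD bias with "cheap" AF, using assumed
    // footprint numbers: a 12x1 texel footprint (grazing angle), 16x AF available.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    int main() {
        const float major = 12.0f, minor = 1.0f;             // footprint axes in texels (assumed)
        const float lod   = std::log2(std::max(minor, 1.0f)); // per-tap LOD from the minor axis
        const int   taps  = int(std::ceil(major / minor));    // taps needed along the major axis

        // Each trilinear tap covers roughly 2^lod texels in every direction.
        auto coverage = [](float tapLod, int tapCount, float alongMajor) {
            float tapWidth = std::exp2(tapLod);
            std::printf("  tap width %.2f texels (u and v), taps cover %.1f of %.1f"
                        " texels along the major axis\n",
                        tapWidth, tapWidth * tapCount, alongMajor);
        };

        std::printf("correct filtering:\n");
        coverage(lod, taps, major);
        std::printf("negative LOD bias (-1): undersamples BOTH directions:\n");
        coverage(lod - 1.0f, taps, major);
        std::printf("cheap AF (half the taps): undersamples only the major axis:\n");
        coverage(lod, taps / 2, major);
        return 0;
    }
    ```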
     
  17. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    8,213
    Likes Received:
    1,883
    Location:
    Finland
    Are you talking about AF or something else?
    Since R5xx HQ has the exact same pattern as R6xx/7xx, and GF6/7 doesn't come anywhere near it even when set to the best possible quality.
     
  18. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    I'm talking about AF.

    No, it does not. R6xx/R7xx have better LOD calculation, but they underfilter much more than R5xx. You can't see that in those colored filter-tester screenshots - even taking only one sample would give you the same output for a fully colored mip chain.

    That's the problem. People are so focused on the "flowers" that they don't notice the real problem.
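
    A tiny sketch of why those testers are blind to this (toy textures, not any particular test application): if every texel within a mip level has the same colour, one tap and sixteen taps from that level return identical results, so the "flower" pattern only reveals which level was chosen, not how many samples were taken.

    ```cpp
    // Sketch of why colored-mip testers can't show underfiltering: with a
    // constant color per mip level, any number of taps gives the same answer.
    #include <cstdio>

    // A mip level where every texel has the same value (like a colored-mip tester).
    float flatMipTexel(int /*u*/, int /*v*/, int level) {
        return 0.1f * level;                    // color depends only on the level
    }

    // A mip level with real texture detail (value varies per texel).
    float detailedTexel(int u, int v, int level) {
        return ((u ^ v ^ level) & 7) / 7.0f;
    }

    template <typename Texel>
    float sampleAverage(Texel texel, int level, int taps) {
        float acc = 0.0f;
        for (int i = 0; i < taps; ++i)
            acc += texel(10 + i, 20, level);    // taps spread along the footprint
        return acc / taps;
    }

    int main() {
        std::printf("colored-mip tester, 1 tap: %.3f, 16 taps: %.3f  (identical)\n",
                    sampleAverage(flatMipTexel, 3, 1),
                    sampleAverage(flatMipTexel, 3, 16));
        std::printf("real texture,       1 tap: %.3f, 16 taps: %.3f  (differs)\n",
                    sampleAverage(detailedTexel, 3, 1),
                    sampleAverage(detailedTexel, 3, 16));
        return 0;
    }
    ```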

    GeForce 7 HQ filters better than RV770 at A.I. default when it comes to underfiltering. I'm not talking about angle dependence (which is clearly much worse on the GeForce 7).

    Compare it for yourself if you don't believe me. Human memory can be manipulated very easily, especially if you keep hearing that the GeForce 7 had the worst filter of all time (which it did, but only in Q mode).
     
  19. Arnold Beckenbauer

    Veteran

    Joined:
    Oct 11, 2006
    Messages:
    1,415
    Likes Received:
    348
    Location:
    Germany
    If the game offers an MSAA option, of course (GoW, or AFAIK R6 Vegas 2).
     
  20. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    Yes, but not if it doesn't (like UT3). I think that is what he wanted to know.
     