When will ATI allow us to disable filtering optimisations without disabling Cat AI?

Discussion in '3D Hardware, Software & Output Devices' started by Broken Hope, Apr 4, 2010.

  1. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    With how powerful the 5800 series is, it baffles me that we still can't disable the texture filtering optimisations for AF like Nvidia card owners can. If we want to disable texture optimisations, we have to disable Catalyst AI completely, which also disables any game-specific optimisations/bug fixes and disables Crossfire as well.

    Surely by now we shouldn't be forced into making the choice between image quality and bug fixes/optimisations?

    Especially when ATI's optimisations end up with results like those at the bottom of this page.

    Link
     
  2. Arnold Beckenbauer

    Veteran

    Joined:
    Oct 11, 2006
    Messages:
    1,424
    Likes Received:
    358
    Location:
    Germany
    I'm not sure that all filtering "optimisations" are disabled with AI off.
    It's really sad: the X1000 came with angle-independent AF, but with R600 and newer the filtering quality is worse. It's not OK.
     
  3. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    I just don't understand why we can't at least get an option in the driver to disable all filtering optimisations. It doesn't have to be the default, but having the option should be a requirement.

    I'm sure there are plenty of people who would take a performance hit if it meant removing or reducing the amount of texture crawl/shimmer that is currently present on ATI hardware when using AF.
     
  4. Ethatron

    Regular Subscriber

    Joined:
    Jan 24, 2010
    Messages:
    864
    Likes Received:
    266
    Hm, ATI Tray Tools has always had those options. Maybe ask Ray what the registry names for disabling the optimizations are.
     
  5. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    I noticed when using the filter tester found here that switching from TMU to ALU AF pretty much removes all of the shimmering. Is this using the ALUs to do AF instead of the TMUs? If so, couldn't ATI provide that as an option? Or would that reduce performance in shader-based games too much?
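
    Roughly, the idea would be something like the sketch below - just illustrative Python over an array rather than real shader code, with the tap count and placement assumed rather than taken from the filter tester:

        import numpy as np

        def bilinear(tex, u, v):
            # One bilinear tap from a (H, W) texture, u/v in [0, 1].
            h, w = tex.shape
            x, y = u * (w - 1), v * (h - 1)
            x0, y0 = int(x), int(y)
            x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
            fx, fy = x - x0, y - y0
            top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx
            bot = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx
            return top * (1 - fy) + bot * fy

        def alu_af(tex, u, v, du, dv, taps=8):
            # "Shader-style" AF: average several bilinear taps spread along the
            # (du, dv) axis of anisotropy instead of one hardware aniso fetch.
            # More taps = less shimmer, but more ALU work and bandwidth.
            offsets = np.linspace(-0.5, 0.5, taps)
            return sum(bilinear(tex, u + t * du, v + t * dv) for t in offsets) / taps

        tex = np.random.rand(256, 256)           # placeholder texture
        print(alu_af(tex, 0.4, 0.6, 0.02, 0.0))  # footprint stretched along u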
     
  6. caveman-jim

    Regular

    Joined:
    Sep 19, 2005
    Messages:
    305
    Likes Received:
    0
    Location:
    Austin, TX
    The Evergreen series has angle-independent filtering.
     
  7. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,875
    Likes Received:
    2,182
    Location:
    Germany
    Angle dependency (or independency) is far from the only factor that determines filtering quality. If you sample too few times for a given degree of anisotropy, for example, you'll end up with potentially visible artefacts (depending on the texture being filtered).
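
    To put a number on it: the degree of anisotropy of a pixel's footprint dictates how many probes along the line of anisotropy you need, and taking fewer than that is exactly what tends to show up as shimmering. A rough sketch of the relationship (illustrative only, not any vendor's actual hardware logic), assuming the screen-space derivatives of the texture coordinates are known:

        import math

        def aniso_samples(dudx, dvdx, dudy, dvdy, max_aniso=16):
            # Estimate the degree of anisotropy of a pixel's texture footprint
            # and the number of probes needed to cover it without undersampling.
            len_x = math.hypot(dudx, dvdx)         # footprint extent along screen x
            len_y = math.hypot(dudy, dvdy)         # footprint extent along screen y
            major = max(len_x, len_y)
            minor = max(min(len_x, len_y), 1e-8)
            ratio = min(major / minor, max_aniso)  # clamp to the user-selected maximum
            return ratio, math.ceil(ratio)         # one (bi/tri)linear tap per probe

        # Example: a strongly stretched footprint, e.g. a floor seen at a grazing angle
        print(aniso_samples(0.5, 0.0, 0.0, 0.05))  # -> (10.0, 10)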
     
  8. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,491
    Likes Received:
    2,668
    That's a shame. I remember ATI were famous for the quality of their AF.
     
  9. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,875
    Likes Received:
    2,182
    Location:
    Germany
    That was, IIRC, only the period of the fabulous X1K series (and maybe to some degree the 9700 generation, which was famous for almost everything).
     
  10. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,542
    Likes Received:
    623
    Location:
    WI, USA
    So, in other words, 9700 - X1950 were relative bliss but then when G80 came along with its nearly perfect filtering the "enthusiasts with an eye" suddenly became unhappy with ATI. :) NVIDIA's stuff before G80 had some horrible filtering hacks. Honestly the only time I've noticed shitty filtering was with my 6800 and 7800 notebook chips.

    But with NV you could set it to HQ mode, ditching probably 20% performance though.
     
  11. Broken Hope

    Regular

    Joined:
    Jul 13, 2004
    Messages:
    483
    Likes Received:
    1
    Location:
    England
    Yeah, the problem is ATI is still using horrible filtering hacks, and they can't be disabled by setting an HQ mode like on Nvidia hardware.
     
  12. caveman-jim

    Regular

    Joined:
    Sep 19, 2005
    Messages:
    305
    Likes Received:
    0
    Location:
    Austin, TX
    I think ultimately what we see is in response to the success of 'good enough' - perfect isn't fiscally viable, so as close as you can get without incurring massive costs is what is delivered.

    As far as ATI's AF 'problems' go, I'm on the fence (as Broken Hope knows from other discussions we've had) - I'm not sure whether the issue is ATI's driver method or whether it's something else screwing it up.
     
  13. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,875
    Likes Received:
    2,182
    Location:
    Germany
    No, it was actually really, really fast on 9700 compared to the competition at launch (GF4 Ti), so the feature became more usable. The Radeon 9- and X-series had pretty strongly angle-dependent AF, which primarily allowed for a cheaper hardware implementation (transistor-wise). Quality-wise it was inferior too, due to that strong angle dependency (see below). With X1K, ATI had a real and noticeable advantage when AI wasn't interfering (and, due to the bad default quality on 6800/7800, in almost all cases with AI enabled too!) - especially when you enabled "HQ-AF", which ditched the angle dependency. Angle-dependent AF was still the default, in this case so as not to lose too much fillrate compared to Nvidia.

    Coming to think of the glory days, somehow the launch of GTX400 reminds me a bit of the X1800 launch back in 2005. The only real criticism for the average consumer was the noise level of the cards. The then-new technology, SM3, was implemented at some cost but very thoroughly (IIRC, no chip has beaten the branch granularity of R520 to this day). And the performance delta compared to the 7800 GTX was small at first but grew massively later on, especially with the X1900 refresh. Whether it will be the same story with HD5K vs. GTX400 remains to be seen.

    Yeah, what the driver forced onto the chips in the default settings was nothing short of abysmal. Luckily, in most cases you could opt for better quality with the driver switch to high quality (but that was a bit tricky, because of several factors: you had to first choose your desired level of AF, then switch off the filter optimizations (trilinear, mip and aniso sample), and then you had to switch to HQ *doh*).

    On 6800/7800: definitely, later it was barely noticeable performance wise in most cases.
     
  14. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,542
    Likes Received:
    623
    Location:
    WI, USA
    I see 9700's AF as an evolution of what R200 and R100 did before it. Low performance impact with considerable IQ improvement, but (especially with R100/200) a very tweaked and simplified implementation. That inability to do trilinear at the same time as AF was pretty nasty.

    GF4 (and probably GF3 for that matter) had good quality AF, but the large drain on fillrate brought its usefulness into question in contemporary games. With video cards you have to "live in the now" because they have no future, and you don't buy $200+ cards for retro gaming usually :).

    I loved how G80 turned the tables and obviously pushed texture filtering quality and performance as a priority. NVIDIA really cleaned up their lawn after leaving G7x and friends behind.
     
  15. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Maybe you just worded things poorly, but the 9700 could certainly do AF and trilinear at the same time.
     
  16. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,542
    Likes Received:
    623
    Location:
    WI, USA
    Oops. I meant R100 & R200 there. R300 was the first ATI chip to do AF + trilinear.
     
  17. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,333
    Likes Received:
    290
    in fact...

    R100 had broken AF
    R200 had broken AA
    R300/400/500 was fine
    R600 had broken AA
    R700 was fine
    R800...
     
  18. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,491
    Likes Received:
    2,668
    for those like myself who don't understand codenames

    r100 = original radeon
    r200 = radeon 8500
    r300 = radeon 9500-9800 series
    r400 = radeon X series (ie: x800)
    r500 = radeon X1000
    r600 = radeon hd2000/3000 series
    r700 = radeon hd4000 series
    r800 = radeon hd5000 series
     
  19. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,542
    Likes Received:
    623
    Location:
    WI, USA
    I used AF on my original Radeon card. It was quirky-but-fast like on R200, but image quality was better with it than without.

    What was borked again? I remember that it only had 2X and 16X as options....

    Found a mip level pic. Clearly it uses bilinear filtering with AF, as expected, but it also looks like the mipmap LOD is simply shifted rather than the filtering itself changing. ATI's texture filtering was a disaster of bugs and cheats until R300, though, so who knows what's up. (There's a sketch at the end of this post of how such a colored-mipmap test texture is built.)
    http://ixbtlabs.com/articles/radeon/

    It didn't have multisampling, but I'm still foggy on what was broken. Since all it did was supersampling, the huge performance hit wasn't surprising.
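
    Going back to the mip level pic: that kind of test is easy to reproduce - you upload a texture whose mip levels are each filled with a distinct solid color, so whichever color appears on screen tells you which level the hardware actually sampled. A rough sketch of building such a chain (colors and sizes are arbitrary, purely for illustration):

        import numpy as np

        # One distinct solid color per mip level (RGB); the color visible on screen
        # reveals which mip level the filtering hardware actually sampled.
        MIP_COLORS = [
            (255, 0, 0),    # level 0: red
            (0, 255, 0),    # level 1: green
            (0, 0, 255),    # level 2: blue
            (255, 255, 0),  # level 3: yellow
            (255, 0, 255),  # level 4: magenta
        ]

        def colored_mip_chain(base_size=256):
            # Build one (size x size x 3) array per color, halving the size each level.
            chain, size = [], base_size
            for color in MIP_COLORS:
                chain.append(np.full((size, size, 3), color, dtype=np.uint8))
                size //= 2
            return chain

        for i, level in enumerate(colored_mip_chain()):
            print(f"mip {i}: {level.shape}, color {tuple(int(c) for c in level[0, 0])}")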
     
    #19 swaaye, Apr 12, 2010
    Last edited by a moderator: Apr 12, 2010
  20. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,333
    Likes Received:
    290
    As for R100... I remember [off] / [16x] options in CP. Maybe earlier CP allowed 2x and it was removed later, I don't know. Anyway, every value other than 16x created quite a mess - it can be seen in current AF testers.

    Only AF 16x worked as it should - or rather "worked"... it was angle-optimized in a very strange way. Only exact 0/90/180/270 angles were filtered at full quality. That wasn't a problem for many games of the time (>90% of surfaces fit those angles), but the quality test of 3DMark 2001 was another matter. It seems the scene was cherry-picked to hit the worst-case situation for R100's AF:

    AF off / AF 16x

    The 2x/4x/8x modes were broken. You can see it even in the Q3 sample - one stage is missing (no red mipmap - maybe they fixed that later, or it was dysfunctional only under specific circumstances, but for 4x and 8x it never worked correctly):

    no AF: http://www.abload.de/img/01xk7z.png

    2x / 4x / 8x / 16x


    As for R200 - it's quite difficult to say... ATi's engineers were more successful at masking the problems, but R600 was referred to as the buggiest part since R200 - so you can imagine it wasn't working the way ATi intended... Some sources say that R200 in fact supported multi-sampling, but that it was broken. I have no proof of that, but I noticed that ATi changed the sample patterns in the driver for the R2xx GPUs. None of the drivers I ever tried gave me the pattern shown in the official whitepaper. That doesn't prove a broken MSAA implementation, but it does prove that AA simply didn't work as it should on R200.
     