ATi is cheating in Filtering

Discussion in 'Architecture and Products' started by Bitpower, May 16, 2004.

  1. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,492
    Likes Received:
    979
    Location:
    en.gb.uk
    Oh, they screwed up. That's alright then. Move along people, nothing to see here... :roll:
     
  2. Veridian3

    Newcomer

    Joined:
    Jan 31, 2003
    Messages:
    120
    Likes Received:
    0
  3. pocketmoon_

    Newcomer

    Joined:
    Nov 15, 2002
    Messages:
    117
    Likes Received:
    0
    Now the dust is beginning to settle, I think ATI have missed a massive marketing opportunity here. They should have come out on day 1 and said 'Hey, we have this great new adaptive filtering that's better than anything you get now', followed by some graphs showing increased FPS and equivalent IQ. It would have had more impact than 3Dc.

    It's a shame that anything they do now will just be viewed as spin.
     
  4. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    I was going to ask the same thing. I think such optimizations (this one, fast trilinear, even brilinear for quite a lot of cases) are sensible approaches, it's just that filtering control is in the APIs for a reason. Of course introduction into an API takes time, so a way to use them before is fine IMO. But I have seen no indications at all that the IHVs are trying to push those optimizations into an API. There are cases where you invariably want full trilinear, and it would be nice if the application could specify that. In other cases, it may be enough to just make the mip map transitions practically invisible.

    The really fishy thing is not those optimizations by themselves, but how they are (or not) advertised.
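    (For readers new to the terms: full trilinear blends linearly between the two nearest mip levels, while "brilinear" collapses most of that blend to plain bilinear and keeps only a narrow ramp around each transition. A minimal numeric sketch; the 0.25 band width is an illustrative assumption, not any IHV's actual value:)

```python
def trilinear_weight(lod):
    """Full trilinear: the blend fraction between mip N and mip N+1 is
    simply the fractional part of the level-of-detail value."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    """'Brilinear': stay on a single mip level except within a narrow band
    around each transition, where a steep ramp hides the seam.
    band=0.25 is an illustrative assumption, not any IHV's real value."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0          # pure bilinear on mip N
    if f >= hi:
        return 1.0          # pure bilinear on mip N+1
    return (f - lo) / band  # steep blend across the transition
```

As the band shrinks toward zero, brilinear degenerates into plain bilinear with nearest-mip snapping; as it widens to 1.0, it becomes full trilinear again.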
     
  5. mikechai

    Newcomer

    Joined:
    Mar 6, 2003
    Messages:
    210
    Likes Received:
    1
    #705 mikechai, May 19, 2004
    Last edited by a moderator: Mar 12, 2011
  6. mikechai

    Newcomer

    Joined:
    Mar 6, 2003
    Messages:
    210
    Likes Received:
    1
    Great. How many hours from now?
     
    #706 mikechai, May 19, 2004
    Last edited by a moderator: Mar 12, 2011
  7. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,492
    Likes Received:
    979
    Location:
    en.gb.uk
    A few reasons I can think of why this isn't ideal.

    First off, APIs tend to get frozen! This means that any new whoppee-doo filtering schemes that are developed are not accessible to older games (where older can mean just a few months old).

    Second, it's a whole load of extra work for developers, and my (albeit limited) experience with the way games handle AA/AF is that support is piecemeal at best. Moreover, if each IHV comes up with different whizzo-filters, each game effectively has to be aware of each of these quality settings for each IHV. Nightmare!

    Personally I'd much prefer a centralised control panel approach, which can evolve with new drivers. New filtering features can be retro-fitted to older games. The new NVIDIA app-specific settings are a nice idea IMO (when they work properly!). People who don't want app-specific settings can just use a global setting.

    My €0.02.
     
  8. tEd

    tEd Casual Member
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,105
    Likes Received:
    70
    Location:
    switzerland
    And where can I reach the chat? I didn't see any link. Or is it IRC (what network, channel)?
     
  9. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    Is the consensus that this "optimization" will be an option in the next driver version?
     
  10. Kombatant

    Regular

    Joined:
    May 29, 2003
    Messages:
    639
    Likes Received:
    19
    Location:
    Milton Keynes, UK
    From what I gather, they're going to implement a web-based chat interface in that page, or link to one, anyway.
     
  11. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Just seen the email about the chat, I'll make a couple of additions:

    ;)
     
  12. mikechai

    Newcomer

    Joined:
    Mar 6, 2003
    Messages:
    210
    Likes Received:
    1
    :shock:
     
    #712 mikechai, May 19, 2004
    Last edited by a moderator: Mar 12, 2011
  13. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I think there is a simple reason for this with X800 - they forgot. Being a 130nm product, they lifted some of the code from RV360, and it inherited its texture filtering abilities almost by default. How the driver guys knew but PR / Marketing didn't is another question - however, when I mentioned to PR that 9600 has been doing this from day one, the initial reaction was "no, we don't have that capability".


    I think it might be simpler than this in the first place anyway - I think if the developer supplies the mip map levels then it will do exactly as the developer asks; if the developer relies on autogenerated mips then it's being left to the hardware/drivers, which falls into the realms of "let the IHV decide".
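    For concreteness, "autogenerated mips" conventionally means a recursive 2x2 box-filter downsample of the base texture; it is that chain, rather than artist-authored levels, that would fall under "let the IHV decide". A minimal sketch of the conventional generation (greyscale values and a square power-of-two texture assumed; real drivers may filter differently):

```python
def next_mip(level):
    """One step of the conventional 2x2 box-filter downsample -- the usual
    meaning of 'autogenerated' mips (drivers are free to filter differently)."""
    h, w = len(level), len(level[0])
    return [[(level[y * 2][x * 2] + level[y * 2][x * 2 + 1] +
              level[y * 2 + 1][x * 2] + level[y * 2 + 1][x * 2 + 1]) / 4.0
             for x in range(w // 2)]
            for y in range(h // 2)]

def mip_chain(base):
    """Full chain down to 1x1 (square power-of-two texture assumed)."""
    chain = [base]
    while len(chain[-1]) > 1:
        chain.append(next_mip(chain[-1]))
    return chain
```

A developer who authors each level by hand (e.g. with sharpening, or different detail per level) bypasses this entirely, which is why the supplied-vs-autogenerated distinction matters.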
     
  14. pocketmoon_

    Newcomer

    Joined:
    Nov 15, 2002
    Messages:
    117
    Likes Received:
    0
    ATI are stuck between a rock and a hard place.

    If they include an option in the CP then we will see FPS falling all over the place as reviewers try and compare, rightly or wrongly, "apples to apples".

    If they stick to their guns and say 'you don't need to switch it off, we guarantee equivalent or better IQ', the wrath of the community is upon them!

    It's all going to depend on how clearly ATI can show no loss of IQ vs how clearly the community CAN show a visible loss.

    At the moment it's nudging towards ATI. Time for a poll!
     
  15. mjtdevries

    Newcomer

    Joined:
    Mar 25, 2004
    Messages:
    178
    Likes Received:
    1
    Location:
    Netherlands
    @democoder:

    I hope it will not be an option in the next driver release. I hope that you cannot turn it off, and the reason for that is the following:

    Suppose you can turn it off. Then review sites like Anandtech, Tom's and most others will most likely tell their readers that this feature is something like Nvidia's brilinear and treat it as if it were exactly the same.

    They will do performance tests with ATI's and Nvidia's optimizations both turned off or both turned on. And they will ignore the fact that ATI's method doesn't give visibly lower IQ, and Nvidia's does.

    I suspect there will only be something like 2 sites that would dare to do high IQ performance tests with ATI's feature turned on and Nvidia's turned off.

    I also strongly believe that this is the reason why ATI kept it a secret.
     
  16. pocketmoon_

    Newcomer

    Joined:
    Nov 15, 2002
    Messages:
    117
    Likes Received:
    0

    If it appears as an option it should say

    'Enable High Quality Filtering'

    rather than

    'Disable Filtering Optimisations'


    ;)
     
  17. no_way

    Regular

    Joined:
    Jul 2, 2002
    Messages:
    301
    Likes Received:
    0
    Location:
    estonia
    That's why there's an extension mechanism in OGL in the first place.
    So you think that all new texture compression schemes should just "sneak in" in disguise, for instance? Would it have been OK for ATI to silently apply TruForm whenever the driver got a batch of tris that seemed suitable for it?
    Somehow, though, various vendor OGL extensions still get developed, and god forbid, some even get ratified by the ARB as official.
    Somehow MS managed to come out with PS 1.4, PS 2.0a and b etc etc... but for this amazing new filtering scheme there was simply no room, eh?

    And as for "accessible to older games"... those older games would be pretty happy with vanilla old trilinear or bilinear as it is; after all, this is a new benchmark-busting superchip which should have no trouble running them, right?

    This rant is not specific to ATI and this case; I'm generally annoyed with the way the graphics industry is moving. At some point we will be in a situation where everything is rendered as driver developers damn well please, never mind what the API and application developers specify. Oh, perhaps triple-A titles will even look and perform barely acceptably, but everyone else...?
     
  18. mjtdevries

    Newcomer

    Joined:
    Mar 25, 2004
    Messages:
    178
    Likes Received:
    1
    Location:
    Netherlands
    One problem of putting something like this in an API is that ATI will probably have to tell what exactly they are doing.

    And they might not want to do that, because this feature is giving them a performance edge over Nvidia.
     
  19. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    Well, I'd like to have benchmarks like this:

    Trilinear vs Trilinear
    Brilinear vs Brilinear (measures "quality" of the Brilinear implementation, and "image analysis" error rate. NVidia's drivers already do a sort of "poor man's" image analysis: it is done by human beings and enabled via application detection, vs ATI's "automatic" analysis)
    Bilinear vs Bilinear

    If there isn't even an option in the driver, it will be harder to analyze the exact IQ differences. It seems they can only compare it to older cards. We need the ability to take screenshots on X800 with optimizations and without, so we can go over them with a fine tooth comb.

    I just don't see the point of taking away an option from users just because some reviewers might use it in ways you don't like. The point is to deliver choice to users, not be afraid of benchmarks.
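    The fine-tooth-comb comparison usually starts with a per-pixel difference between two captures of the same frame, one with optimizations and one without. A minimal sketch, with flat lists of channel values standing in for real screenshots:

```python
def rmse(img_a, img_b):
    """Root-mean-square error between two equal-sized captures:
    0.0 means bit-identical output."""
    assert len(img_a) == len(img_b)
    total = sum((a - b) ** 2 for a, b in zip(img_a, img_b))
    return (total / len(img_a)) ** 0.5

def worst_pixels(img_a, img_b, n=3):
    """Indices of the n largest per-pixel deviations -- where to zoom in
    first when eyeballing the two screenshots."""
    diffs = sorted(range(len(img_a)),
                   key=lambda i: abs(img_a[i] - img_b[i]), reverse=True)
    return diffs[:n]
```

Aggregate error alone can hide localised artifacts such as visible mip bands, which is why the worst-pixel locations matter as much as the overall number.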
     
  20. tEd

    tEd Casual Member
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,105
    Likes Received:
    70
    Location:
    switzerland
    A fact? Hardly! From what I gather from this thread so far, nvidia's brilinear and ati's brilinear give the exact same IQ. I would like to see some testing with a couple of games to see where ati's adaptive algorithm applies bilinear and where it doesn't. I wouldn't be surprised to find that, if you took 20 games and tested them, in none of those games would the algorithm apply full trilinear.

    Funny thing is that ati is proving nvidia right with this. They have been using brilinear for over a year now with nv3x, and there were a lot of people not so happy about it (me included), and now ati practically does the same. The difference, however, when I read some people's thoughts on different forums, is that nvidia cheats (lowering IQ) while ati doesn't cheat (not lowering IQ), even though the output is exactly the same.
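    The kind of per-game testing tEd asks for is usually done with the classic coloured-mipmap trick: give each mip level a distinct colour, and the filtering mode becomes directly visible, since trilinear produces smooth gradients between levels while bilinear-with-mipmaps shows hard bands. A minimal sketch (colours and LOD values are illustrative):

```python
def sample_colored_mips(lod, colors, trilinear=True):
    """Coloured-mipmap test: one distinct colour per mip level, so the
    presence (or absence) of inter-level blending is directly visible."""
    n = int(lod)
    if trilinear:
        w = lod - n  # full trilinear blend fraction
        c0 = colors[n]
        c1 = colors[min(n + 1, len(colors) - 1)]
        return tuple((1 - w) * a + w * b for a, b in zip(c0, c1))
    # bilinear-with-mipmaps: snap to the nearest level, no blend at all
    return colors[min(round(lod), len(colors) - 1)]
```

In a real capture, hard colour bands mark the regions where an adaptive algorithm dropped to bilinear, and smooth gradients mark where it kept trilinear.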
     