Patrik Ojala's edited/revised thoughts on the fiasco

Discussion in 'Graphics and Semiconductor Industry' started by Reverend, Jun 5, 2003.

  1. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    http://www.beyond3d.com/forum/viewtopic.php?t=6230

    He had to remove the short Q&A from his original post (if you didn't read the original post, too bad), and his post is now slightly longer and, IMO, a better reflection of his personal thoughts on the matter.
     
  2. DethWraith

    Newcomer

    Joined:
    Feb 8, 2002
    Messages:
    14
    Likes Received:
    0
    Location:
    Hangar 18
    Hmmm... very informative, and it does change my view on this situation. If Futuremark does as they say and continues to test drivers for application-specific optimizations, this may return 3DMark to its position as a valid and unbiased benchmarking program. Let's hope they can do it. :)
     
  3. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    Very informative, interesting, and well-written, I agree. That doesn't change my view on the situation, though. What FM seems to say (feel free to disagree) is that Nvidia did not "cheat" per se, but applied optimizations that would be legal and welcome in other software (games, for example), but are not in a benchmark.

    As of now, the only thing that would restore my faith in FM would be if they issued a technical (as opposed to PR) paper explaining, for each and every "former cheat" by both ATI and Nvidia, why this "former cheat" is indeed a "slight optimization", as formulated by their PR person on ExtremeTech. I must say I would be pretty interested in a technical answer on the clipping planes, for example, and in how said driver-level clipping planes, taking advantage of known camera paths, would be useful for gaming purposes.
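    For illustration, here is a minimal sketch of what such a camera-path clip plane amounts to (plain C++, with made-up plane and scene values rather than anything from an actual driver): geometry behind a fixed plane is skipped, which is only safe because the benchmark's camera never leaves its rail.

    ```cpp
    // Sketch: a driver-style static clip plane keyed to a known camera path.
    // Hypothetical values; illustrates the technique, not any vendor's code.
    #include <cstdio>

    struct Vec3  { float x, y, z; };
    // Plane ax + by + cz + d = 0; points with negative signed distance get culled.
    struct Plane { float a, b, c, d; };

    float signedDistance(const Plane& p, const Vec3& v) {
        return p.a * v.x + p.b * v.y + p.c * v.z + p.d;
    }

    int main() {
        // The "optimization": a plane placed just behind the rail camera,
        // valid only because the benchmark's camera path is fixed and known.
        Plane staticClip = { 0.0f, 0.0f, 1.0f, -5.0f };  // cull everything with z < 5

        Vec3 scene[] = {
            { 0.0f, 0.0f, 20.0f },  // in front of the camera: drawn
            { 3.0f, 1.0f,  2.0f },  // behind the plane: skipped, work saved
        };

        for (const Vec3& obj : scene) {
            if (signedDistance(staticClip, obj) < 0.0f)
                std::printf("culled (%.0f, %.0f, %.0f)\n", obj.x, obj.y, obj.z);
            else
                std::printf("drawn  (%.0f, %.0f, %.0f)\n", obj.x, obj.y, obj.z);
        }
        // In a game the camera is free: the moment it turns around, the
        // "culled" geometry should be visible, and the image breaks.
        return 0;
    }
    ```

    The speedup is real, but only because the plane encodes foreknowledge of the camera path, not anything the application itself exposes.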

    As for FM considering including vendor-specific paths in their code, why not, provided:
    - the vendor, which would have to be a beta member, proposes a specific code path to FM. Driver-level stuff is a no-no, and any detection of driver-level stuff gets all scores of said driver set removed from the ORB, and an email fired off to every person having posted such a score ("We are sorry, but the manufacturer of your video card chipset cheated, which is why we removed your score. Contact your video card chipset's manufacturer for further details").
    - FM independently analyzes the output of every proposed code path, which has to be a "generic inside the app" optimization (i.e. for a shader, the vendor-specific shader has to be mathematically equivalent to the generic shader proposed by FM). This would allow both instruction reordering and partial precision to play a role, provided they produce the exact same output (see the sketch after this list).
    - all path-specific proposals are presented for all beta members to see and comment upon.
    - FM has the final say about what makes it into the code and what doesn't.
    - the benchmark would propose both "use vendor path" and "use generic path" options in all its configurations, including the free downloadable version.
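    A minimal sketch of the equivalence check in the second point (an assumed workflow in plain C++, not anything FM has published): the vendor's reorganized version of a shader expression is accepted only if its output matches the generic version exactly on a sampled input domain.

    ```cpp
    // Sketch of the proposed equivalence check: a vendor-suggested reordering
    // of a shader expression is accepted only if it matches the generic
    // version exactly. Assumed workflow, not Futuremark's actual tooling.
    #include <cstdio>

    // Generic path: the expression as FM wrote it.
    float genericShader(float a, float b, float c) {
        return a * b + a * c;
    }

    // Vendor-proposed path: one fewer multiply via factoring.
    float vendorShader(float a, float b, float c) {
        return a * (b + c);
    }

    int main() {
        bool equivalent = true;
        // Exhaustive testing is impossible, so sample the input domain.
        for (float a = -2.0f; a <= 2.0f; a += 0.25f)
            for (float b = -2.0f; b <= 2.0f; b += 0.25f)
                for (float c = -2.0f; c <= 2.0f; c += 0.25f)
                    if (genericShader(a, b, c) != vendorShader(a, b, c)) {
                        std::printf("mismatch at a=%g b=%g c=%g\n", a, b, c);
                        equivalent = false;
                    }
        // Note: floating-point refactoring is NOT bit-exact in general (here
        // it happens to be, because the sampled values are all exactly
        // representable), which is why FM would have to sign off case by case.
        std::printf(equivalent ? "paths match on the sampled domain\n"
                               : "vendor path rejected\n");
        return 0;
    }
    ```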

    Which is in direct contradiction with the PDF explaining why synthetic benchmarks are better, and can be interpreted as providing another escape route, a way to close your eyes to future cheats by any company.

    We know that. But since you say that you won't investigate all future driver releases, how are we supposed to know that those scores are not comparable in the first place? Do you realize that saying such a thing without dedicating yourselves to extensive research into future cheats is indeed invalidating 3DMark? Both statements can't go together.

    Actually, many people speculated that Nvidia would have either paid you or threatened you with legal action. I'm curious as to why you don't comment on the second part...

    Let me respectfully disagree here, Patric. Getting on better terms with NV is vital for the commercial future of FM, whether because of all the smearing they did (directly through company statements or through their "black ops PR specialists", the "guys with webpages") or because of legal threats, no doubt about it. But "continuity" relies on much more than commercial future; it also relies on public acceptance of the product as a useful tool. I am not against synthetic benchmarks, I think they are useful tools to use in addition to in-game benchmarks, but I consider that 3DMark2003 specifically will be of zero value once ATI or NV release new drivers, especially if those give a large performance increase. This is not because of some (actual or pretended) opposition to synthetic benchmarks, but because FM destroyed their own credibility by calling blatant cheats (those clipping planes are still stuck in my throat) "slight optimizations". This indicates a desire to have a better relationship with a major IHV that goes way beyond any pretension of objectivity.

    Call a cat a cat, a cheat a cheat, and you will restore this credibility.

    We want names!!! Actually, if you refer to the various "guys with webpages", they hardly qualify as "professional reviewers" in the first place... :)
     
  4. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    A racing game or a tank simulation, for example, might find clipping planes useful when drawing the sky, or the view out of the periscope.
     
  5. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    They sold themselves down the river, and there is no way that I will use a benchmark that cannot properly test DXn functionality.

    So what Futuremark will allow nVidia to do is insert their own code path that may (I think most definitely will) lower image quality for speed. nVidia has a very long HISTORY of killing IQ when they needed a speed boost.

    ATi made their hardware to DX9.0 specs and so did nVidia; unfortunately for nVidia, when they DO run at spec they get their butts handed to them by ATi...
     
  6. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    I think the only games that could truly benefit would be "rail shooters", like REZ or Panzer Dragoon.
    But anyway, the clipping planes should be included at the game level, not at the driver level... Imagine, for example, that the designers release a patch allowing for a greater field of view: the clipping planes would become invalid and mess with the rendering...
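    A rough sketch of that field-of-view scenario (plain C++, with hypothetical FOV values and geometry): a side plane fitted to the shipped FOV wrongly culls geometry once a patch widens the view.

    ```cpp
    // Sketch of the FOV-patch problem: a clip plane derived for the shipped
    // field of view culls geometry that a widened view should show.
    // All numbers are hypothetical.
    #include <cstdio>
    #include <cmath>

    // True if a point at (x, z) in camera space lies inside the horizontal
    // frustum for the given full field of view, in degrees.
    bool insideFov(float x, float z, float fovDegrees) {
        float half = fovDegrees * 0.5f * 3.14159265f / 180.0f;
        return std::fabs(x) <= z * std::tan(half);
    }

    int main() {
        const float shippedFov = 60.0f;   // what the driver's plane was fitted to
        const float patchedFov = 90.0f;   // what a patch later allows
        float x = 4.0f, z = 5.0f;         // a wall just off to the side

        bool culledByDriver    = !insideFov(x, z, shippedFov);
        bool visibleAfterPatch =  insideFov(x, z, patchedFov);

        std::printf(culledByDriver && visibleAfterPatch
                        ? "patched FOV: old plane culls visible geometry -> artifacts\n"
                        : "the plane is still safe\n");
        return 0;
    }
    ```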
     
  7. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    In this new jargon, just what would be considered a cheat then :?:
     
  8. Hyp-X

    Hyp-X Irregular
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,170
    Likes Received:
    5
    So you say that those clip-planes should be in 3dMark03 not in nVidia's drivers?
    I agree.
     
  9. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    I say that the decision to include clipping planes and other culling techniques should never, ever be taken at the driver level, but should be a designer's decision, be it for a game or for a benchmark.

    If in the next release FM chooses to include clipping planes in their benchmarks, at least the workload for all chips will be the same. The benchmark would lose some of its "game-likeness", though.
     
  10. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,267
    Likes Received:
    1,783
    Location:
    Winfield, IN USA
    From my understanding of the situation, absolutely NOTHING is considered a "cheat" anymore... too many legal implications. Now they're all "optimizations", no matter what they are. :(

    Question: how can we have any faith in a benchmark that squirrels around with semantics like that? What else are they going to bend for nVidia? :(
     
  11. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    It doesn't matter what they call them (cheats or application-specific optimizations): they're not allowed, or the benchmark numbers aren't accepted by Futuremark.
     
  12. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    What I am trying to understand is whether Futuremark is lowering the bar or just simply changing semantics. Is there any violation that would now be considered a cheat and not simply an optimization?
     
  13. Himself

    Regular

    Joined:
    Sep 29, 2002
    Messages:
    381
    Likes Received:
    2
    On the face of it, it's just semantics, but there is some talk of making a useless benchmark to shut NVIDIA up as well.
     
  14. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    What talk? (Beyond the whole 'they caved!' accusation). What makes you suggest they're going to make a useless benchmark to shut NVIDIA up?
     
  15. Himself

    Regular

    Joined:
    Sep 29, 2002
    Messages:
    381
    Likes Received:
    2
    Read the press release again; it says they will consider evaluating some form of benchmark that allows for apples-to-oranges comparisons... bleh
     
  16. Borg

    Newcomer

    Joined:
    May 26, 2003
    Messages:
    8
    Likes Received:
    0
    Location:
    Belgium
    "Bleh" seems to be the right word to describe how you read the statement...

    If you'd really read it, you would see that NVIDIA wants optimised rendering paths in 3DMark (like most games have). Futuremark's reply is that they'll consider that option.

    That doesn't mean that NVIDIA will get an optimised path while all other cards have to use a general path. It would rather be an optimised path for every graphics IHV (perhaps as a non-default option?), if that becomes a reality, that is.

    Now where did you get that "apples to oranges comparisons"? :?
     
  17. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
    Well, 3DMark 2003 is already not an exact apples-to-apples comparison, so I don't see the problem with what they're proposing there. And this will be more in line with what SA proposed in this thread: http://www.beyond3d.com/forum/viewtopic.php?t=6240&start=0.
     
  18. GreenBeret

    Newcomer

    Joined:
    Mar 2, 2003
    Messages:
    33
    Likes Received:
    0
    I"ve seen some B3D member call it Custom Hidden Extensive Application Tuning :p
     
  19. Himself

    Regular

    Joined:
    Sep 29, 2002
    Messages:
    381
    Likes Received:
    2
    Well, there is NVIDIA's not supporting 1.4 shaders in its older cards, so it's not apples to apples that way; ATI does 24-bit precision, NVIDIA does 32-bit, so there is no common ground there; and both have different ways of doing FSAA and anisotropic filtering. If you look for all the ways that the cards are different, you can't compare them at all.
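    To put numbers on the precision point, here is a crude sketch (plain C++; it truncates mantissa bits rather than rounding the way real hardware would, and the per-pixel math is made up). The mantissa widths are the published ones: FP16 keeps 10 bits, R300-style FP24 keeps 16, FP32 keeps 23.

    ```cpp
    // Sketch: emulate reduced-precision shader math by truncating the mantissa
    // of an IEEE-754 single. Crude (truncation, not round-to-nearest), but it
    // shows the three formats computing three different numbers.
    #include <cstdio>
    #include <cstring>
    #include <cstdint>

    // Keep only the top 'bits' mantissa bits of a float.
    float truncateMantissa(float v, int bits) {
        uint32_t u;
        std::memcpy(&u, &v, sizeof u);
        u &= ~((1u << (23 - bits)) - 1u);  // zero the discarded low bits
        std::memcpy(&v, &u, sizeof v);
        return v;
    }

    int main() {
        const int   widths[] = { 10, 16, 23 };           // FP16, FP24, FP32
        const char* names[]  = { "FP16", "FP24", "FP32" };

        for (int i = 0; i < 3; ++i) {
            // Made-up per-pixel math; every intermediate is held at the
            // target precision, as a hardware pipeline would hold it.
            float x = truncateMantissa(0.123456789f, widths[i]);
            float y = truncateMantissa(x * x + 0.001f, widths[i]);
            std::printf("%s result: %.9f\n", names[i], y);
        }
        // The outputs differ in the low digits: the cards are literally
        // computing different numbers, hence "no common ground".
        return 0;
    }
    ```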

    You can't help the hardware differences, but creating a benchmark where you run a Voodoo3 in 16 (22) bit vs. a TNT2 in 32-bit isn't the solution to compensate either. You either test one or the other; you don't test separate things and try to equate them as being the same. For a game, it's simply the reality of the game, and you are benchmarking the game's performance; for a synthetic app, it's totally useless.

    Borg, if all cards have their own code to run, then it's even more widely divergent than before. If it wasn't apples to apples before, then you have just ensured it isn't, and you don't even have the excuse of saying that this is how the thing will be out in users' hands, like a game.
     
  20. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    Force all the cards to run at DX9.0x levels. That means:

    nVidia runs at 32-bit FP (after all, they pushed it as being superior to 24-bit FP).
    ATi runs at 24-bit FP (after all, they knew, just like nVidia, that DX calls for a minimum precision of 24 bits).

    Now, if this means that nVidia's hardware sucks speed-wise, then so be it. Running the cards in DX9.0x-compliant mode should tell you who has the better overall design and who will deliver the best DX9.0x gaming experience.
     