Futuremark Announces Patch for 3DMark03

Discussion in 'Graphics and Semiconductor Industry' started by Nick[FM], Nov 11, 2003.

  1. Som

    Som
    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    9
    Likes Received:
    0
    Why buy a card now for DX9 games? If you're going to upgrade hardware for a specific game or games, you wait until the first is out to purchase.


    I thought that was obvious.
     
  2. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    It's definitely the best approach to take, since simply labelling a current set of drivers as "cheating" and forcing results from a few drivers back can kill many valid optimizations as well as the invalid ones. If Futuremark is confident they excised everything objectionable to their new rules, then they are free to approve the current drivers under their new patched version. It removes offending paths while keeping valid ones, as well as being the "smoothest" adoption method.

    The critical steps are the ones from here on out. They've been VERY careful in working up their guidelines and parameters so as to leave no wiggle-room, and they've "reset the counter", as it were. How would they respond if nVidia were to patch 52.16 to defeat the current 3DMark03 build? How will they respond to later infractions from nVidia, ATi, or any other source?

    They've taken a solid stand on THEIR desires alone, and made a long, careful progression to this point, but the maintenance will likely be even harder and more critical to ensure they don't have to do this again.
     
  3. Som

    Som
    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    9
    Likes Received:
    0

    Nice use of out-of-context quotation. The point is, we have real-world performance tests. If you're getting good performance in actual gameplay, why bother looking at synthetic benchmarks?


    And it's "assinine" I believe. Not sure m'self though, to be honest.
     
  4. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Did you even read what I wrote? :roll: I'll keep it short this time.

    What if I'm buying a card NOW for DX8 games? Does that mean I shouldn't consider potential DX9 performance?

    I want to know which card has better potential at running DX9 games. Because that could mean I don't have to upgrade AGAIN when DX9 games start arriving.

    Get it?
     
  5. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    How many for DX9 do we have?

    Estimating the future potential of the hardware.
     
  6. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    OK...we'll just have to agree to disagree then....and I think we're tied. :)
     
  7. Som

    Som
    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    9
    Likes Received:
    0
    Of course, the thing I'm still unclear on is: is this code-baking, or does the patch just screw up some sort of instruction ordering critical to NV3x architecture performance?

    I mean, we all know that the FX line is ridiculously fickle to code for (small changes in code can result in HUGE changes in performance), so is it just maybe possible that the patch moves some code around that causes NV3x to choke? In other words, it kind of comes off like NV3x needs app-specific code for EVERYTHING. Sounds like they're trying to pull a Microsoft without the proper monopoly.
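    To make that concrete, a hypothetical sketch (the fragments below are made up, not taken from 3DMark): NV3x was widely reported to run markedly faster with the _pp partial-precision modifier and with fewer live temporary registers, so two mathematically identical shaders could perform very differently.

        // Hypothetical ps_2_0 fragments showing the sort of rewrite NV3x was
        // widely reported to be sensitive to: the same math with fp32 temps
        // vs. the _pp (fp16 partial precision) modifier and reused registers.
        const char* kFullPrecision =
            "ps_2_0\n"
            "dcl t0\n"
            "dcl_2d s0\n"
            "texld r0, t0, s0\n"
            "mul   r1, r0, c0\n"    // extra temporaries, full precision
            "add   r2, r1, c1\n"
            "mov   oC0, r2\n";

        const char* kPartialPrecision =
            "ps_2_0\n"
            "dcl t0\n"
            "dcl_2d s0\n"
            "texld_pp r0, t0, s0\n"
            "mul_pp   r0, r0, c0\n" // fp16, temporaries reused
            "add_pp   r0, r0, c1\n"
            "mov_pp   oC0, r0\n";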

    I'd like to see a reversal from that in NV4x, as I think many others would as well.

    Of course, you Beyond3D folks, don't break any NDAs just to answer questions here. If it is some kind of questionable tactic Nvidia is using, I'm sure we can wait until the sites around the web do their own investigations.
     
  8. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    But the performance isn't there while *playing the game*, it's only in the *benchmarking modes*. Nvidia have been found not clearing buffers, replacing shaders with lower-quality versions, inserting static clip planes, reducing filtering to bilinear, etc. All of these things either lower the IQ or cannot be used while playing games - they are just techniques to create misleading benchmark scores and make the Nvidia hardware look better than it is when you play a game. Nvidia is actively trying to cheat the results because their hardware is not up to scratch.
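    As a rough illustration of what shader replacement means mechanically - a hypothetical sketch, with every name invented rather than taken from any real driver - the driver fingerprints the shader bytecode an application submits and silently swaps in a hand-written substitute when the fingerprint matches a known benchmark shader:

        #include <cstddef>
        #include <cstdint>
        #include <unordered_map>

        // FNV-1a hash used as a cheap bytecode fingerprint (illustrative choice).
        static uint32_t Fnv1a(const uint8_t* data, size_t len) {
            uint32_t h = 2166136261u;
            for (size_t i = 0; i < len; ++i) { h ^= data[i]; h *= 16777619u; }
            return h;
        }

        // Hand-grown table: fingerprint of a known benchmark shader -> the
        // hand-tuned (e.g. lower-precision) replacement bytecode.
        static std::unordered_map<uint32_t, const uint8_t*> g_replacements;

        // Called when the app creates a pixel shader: return the substitute
        // if the submitted bytecode is recognized, otherwise pass it through.
        const uint8_t* MaybeSwapShader(const uint8_t* tokens, size_t len) {
            auto it = g_replacements.find(Fnv1a(tokens, len));
            return it != g_replacements.end() ? it->second : tokens;
        }

    Note that reordering the instructions in the submitted shader changes the fingerprint, which is exactly why a patched benchmark build defeats this kind of table.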

    All this is before we even get onto the question of quality settings like AA/AF (the reason you buy a high-end card in the first place), and the extremely poor DX9 shader performance that will hurt the Nvidia cards in the newer games.

    You need to search back through the forum, because I'm sure most of the regulars simply can't face rehashing the whole "cheating in benchmarks" issue again.
     
  9. Som

    Som
    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    9
    Likes Received:
    0
    Oh, I got it the first time. Considering DX9 performance is certainly a factor, but just check out how the hardware performs in current DX9 games (Gun Metal, TRAOD, Halo, etc.).

    What about before they were available? Well, in that case you're just guessing really. 3DMark03 came out before the first DX9 games, fair enough, but since it is synthetic, there's no telling how it would tally with real-world performance, so all you're really doing is making a slightly more educated guess than without consulting 3DM03.
     
  10. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    I already explained this to you.

    If nVidia has to resort to doing things that go against FM's guidelines, this means that either

    1) They are detecting the app
    and/or
    2) They are NOT rendering what FM is asking them to.

    This matters because there is no guarantee that nVidia can do this with every game out there. It's absurd to think that nVidia can. Can nVidia detect every single app and have custom code for it in their drivers? Can nVidia render something OTHER than what the app calls for when shading, and have the image quality look similar or not break the app?
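    For illustration, a hypothetical sketch of the per-app detection being described (the executable name and the check are invented for the example): the driver reads its host process's executable name and branches to a special path, which only works for titles someone has hand-added.

        #include <windows.h>
        #include <string>

        // Returns true if the process loading the driver is a title the
        // vendor has hand-added special code for. The table has to be grown
        // one entry at a time, which is why it can never cover "every game
        // out there".
        bool IsKnownApp() {
            char path[MAX_PATH] = {};
            GetModuleFileNameA(nullptr, path, MAX_PATH);  // host .exe path
            std::string exe(path);
            return exe.find("3DMark03.exe") != std::string::npos;  // example
        }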

    I want to know if the card I'm considering buying has the REAL DISADVANTAGE of requiring "special support" for individual apps in order to get performance up to snuff. I want to know if the card I'm considering buying needs to replace what the developer ASKS FOR vs. what the IHV "decides" is acceptable.
     
  11. Som

    Som
    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    9
    Likes Received:
    0
    Oh, I'm well aware of all the previous questionable optimizations, but the reputable hardware analysis sites (Anandtech, HardOCP, etc.) have had their own game-based benchmarks scripted to undercut shenanigans like that, or they'll simply run through a scripted gameplay event (which is gaining more acceptance) with all cards being compared to get a performance tally.

    As for Nvidia's hardware, I don't know that it's not up to scratch so much as they are placing far too much work on the developer. If coded for properly, the NV3x architecture appears very capable; the problem is it takes SO much extra coding that developers simply aren't willing to do it, and I don't blame them.

    And of course, their PS2.0 performance does suck. That's undeniable.

    Like I said, I'm familiar with all that previous stuff, so please, DON'T rehash.
     
  12. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    And as soon as a game becomes a "benchmark", then the IHV can "optimize" for it.

    So what if the next game that isn't benchmarked comes out?

    I want a benchmark that does NOT ALLOW "app specific" optimizations, because whether or not my next game gets the "treatment" is just a crap shoot.

    And you're just guessing with any benchmark. Halo doesn't tell you anything about Half-Life 2.

    Again, no different than "real games." Unless there is a benchmark for the specific game that you are after, all the other "real world" tests are just guesswork as well.

    Regardless...I'll take all the additional "education" I can get.
     
  13. AzBat

    AzBat Agent of the Bat
    Legend

    Joined:
    Apr 1, 2002
    Messages:
    7,749
    Likes Received:
    4,847
    Location:
    Alma, AR
    True. Then if we go back to the press release, we see that they specifically say....

    Does anybody know what the max is for ATI and NVIDIA? Is it possible that NVIDIA was using point sprites larger than 256x256 while ATI wasn't? Doesn't the MaxPointSize CAP give us the maximum supported by the hardware?
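    For anyone who wants to check: the cap in question is D3DCAPS9::MaxPointSize, which reports the largest point sprite the device supports. A minimal query against the default HAL adapter (link with d3d9.lib):

        #include <d3d9.h>
        #include <cstdio>

        int main() {
            // Create the D3D9 object and read the HAL device caps.
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
            if (!d3d) return 1;
            D3DCAPS9 caps;
            if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                             D3DDEVTYPE_HAL, &caps)))
                printf("MaxPointSize: %.1f\n", caps.MaxPointSize);
            d3d->Release();
            return 0;
        }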

    Tommy McClain
     
  14. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Don't you get it? With some of these "real-world" benchmarks, and the "non-patched" 3DMark, nVidia's PS 2.0 performance sucking is deniable.

    It's exactly synthetic tests (like PATCHED 3DMark, ShaderMark, etc.) that TELL US that nVidia's PS 2.0 performance generally sucks.
     
  15. AzBat

    AzBat Agent of the Bat
    Legend

    Joined:
    Apr 1, 2002
    Messages:
    7,749
    Likes Received:
    4,847
    Location:
    Alma, AR
    Wussy. :p LOL

    Tommy McClain
     
  16. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    So 99% of consumers deserve to be ripped off simply because they are not among B3D readership. I have to disagree.

    The point is the performance is not there and they are attempting to make it appear that it is there.
     
  17. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    bah double post
     
  18. stefan435

    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    1
    Likes Received:
    0
  19. Som

    Som
    Newcomer

    Joined:
    Nov 11, 2003
    Messages:
    9
    Likes Received:
    0
    That's not the situation I'm discussing though - you're shifting the discussion. But we'll talk about your hypothetical for a bit.

    And as for special support - you're right, it does seem that that's what NV3x needs. I think Nvidia was trying to bully market share with NV3x - they thought developers would follow their way of thinking because of their superior position in the market, but they moved too soon. Devs simply aren't willing to optimize for hours on end for NV3x hardware (Cg) instead of the industry standard compiler (HLSL) provided by the REAL monopoly.

    So in short, I think you're right that Nvidia made the wrong move with their architecture, and is paying the price for it. But I'm not sure that them trying to create as much performance as possible for their customers by putting in app-specific code for a lot of programs is cheating so much as support. Now, doing such for a benchmark is definitely marketing, and not meant to help the user, so it's a bit more questionable.

    So then, is it only "cheating" when it's done for a benchmark, or is it just good optimization when it's done for something such as BF1942 or UT2003?
     
  20. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    What I would like to know is: were certain cheats found based on suspicion, or were they discovered from random instruction re-ordering?
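    For what it's worth, a hypothetical sketch of why the re-ordering test works: two instructions that don't depend on each other can be swapped without changing the rendered result, but any replacement keyed on an exact fingerprint of the original byte stream stops matching, so a hidden "optimization" silently turns off and the score drops.

        #include <cstdint>
        #include <cstdio>

        // FNV-1a over a shader source string, standing in for whatever exact
        // fingerprint a driver might key its replacements on (illustrative).
        static uint32_t Fingerprint(const char* src) {
            uint32_t h = 2166136261u;
            for (; *src; ++src) { h ^= (uint8_t)*src; h *= 16777619u; }
            return h;
        }

        int main() {
            // r0 and r1 are independent, so the two instructions can be
            // issued in either order with an identical result on screen.
            const char* original  = "mul r0, t0, c0\nadd r1, t1, c1\n";
            const char* reordered = "add r1, t1, c1\nmul r0, t0, c0\n";
            printf("original : %08x\n", Fingerprint(original));
            printf("reordered: %08x\n", Fingerprint(reordered));  // differs
            return 0;
        }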
     