Futuremark Announces Patch for 3DMark03

Discussion in 'Graphics and Semiconductor Industry' started by Nick[FM], Nov 11, 2003.

  1. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
I find it VERY strange that the 44.03 driver drops performance so much (since the 330 patch already disabled the optimizations that driver has for 3DMark03). Yet the FX5900 Ultra loses ANOTHER 50% (in GT2 and GT3, at least) with the 44.03 driver going from 330 to 340???
     
  2. PaulS

    Regular

    Joined:
    May 12, 2003
    Messages:
    481
    Likes Received:
    1
    Location:
    UK
I assume he's talking about 44.03 drops from the non-patched 3DMark to the 340 patch of 3DMark - not from the first patch to this patch. 52.17 doesn't drop ANOTHER 50% over and above the drops already witnessed with 44.03 ;)
     
  3. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
Well, the diagram clearly says 330, which IS the patched version - unpatched would be 320, IIRC.
     
  4. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    From the linked document:

    Those points alone tell us that the compiler cannot be bypassed. If it will work on applications written before the introduction of the compiler, it obviously does not need to be "allowed" to do its job -- it simply will. Another thing is that in order for it to be disabled there has to be an interface to such functionality. However, the DirectX spec has not changed and NVidia has not released a new OpenGL extension.

In any case, NVidia's Universal Compiler appears to be fully compliant with FutureMark's guidelines regarding optimizations. It does not change the mathematical result of a shader and can be applied automatically to any and all applications without any application-specific detection. As such, there is no rational reason to try and work against it.
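To illustrate the point Ostsol is quoting, here is a small sketch (not NVIDIA's actual compiler, just an illustrative model) of why reordering instructions with no data dependencies cannot change a shader's final output -- the kind of optimization FutureMark's guidelines permit:

```python
# Illustrative sketch: a "shader" as a list of (dest, op, src1, src2)
# instructions. Swapping two instructions that read and write disjoint
# registers cannot change the final register values.

def run_shader(instructions, regs):
    """Execute the instruction list against a copy of the register dict."""
    regs = dict(regs)
    for dest, op, a, b in instructions:
        x, y = regs[a], regs[b]
        regs[dest] = x * y if op == "mul" else x + y
    return regs

original = [
    ("r0", "mul", "c0", "v0"),  # r0 = c0 * v0
    ("r1", "add", "c1", "v1"),  # r1 = c1 + v1 (independent of r0)
    ("r2", "mul", "r0", "r1"),  # r2 = r0 * r1
]
# A compiler may swap the two independent instructions; r2 is unchanged.
reordered = [original[1], original[0], original[2]]

inputs = {"c0": 2.0, "c1": 3.0, "v0": 4.0, "v1": 5.0}
assert run_shader(original, inputs)["r2"] == run_shader(reordered, inputs)["r2"]
```

Note this only holds for dependency-preserving reordering; reassociating floating-point math, or dropping work entirely, can change the result and would fall outside the guidelines.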
     
  5. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    2
    Location:
    Canada
I am not sure about this one. All I know about GT1 is that nvidia initially was not very happy about the single-textured sky. But there was the matter of the vertex shader detection in GT1 that FutureMark was unable to clear up with the 330 patch. I am really not familiar with the compressed-textures matter you are addressing. Has the 340 patch fixed it? I honestly don't know whether they were able to detect the cheat, but if they were, there would be a decrease in frame rate from the 43.51 DETs to the 52.16 ForceWare. Instead, we see an increase. I am far from qualified to answer the question, but someone did note earlier that there was some sort of anomaly in GT1 with regard to the nose of the aircraft. That is, at best, an unqualified observation on my part. It is true, though, that the 44.03s (or the 43.51 and all drivers after them) were an improvement in performance over previous drivers in GT1.
     
  6. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
I've not seen the exact source of this, but evidently there is a statement from FM over Gainward's comments:

    http://www.dvhardware.net/article2116.html

    I assume this came from Patric. Patric?
     
  7. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
I don't mean that FutureMark could have disabled the cheat (maybe it's not possible), but it should be obvious that textures are being compressed just by comparing results to the refrast or to other vendors' results. I believe it was Xbitlabs that originally showed that the NVIDIA drivers were using texture compression on GT1.

    -FUDie
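The comparison FUDie describes can be sketched in a few lines. This is a hypothetical illustration with synthetic frame data, not anyone's actual test harness: lossy texture compression shows up as a nonzero per-pixel error against the reference rasterizer's output even when the image "looks" the same.

```python
# Hypothetical sketch: diff a driver's rendered frame against a reference
# rasterizer's frame. Frame data here is made up for illustration.

def mean_abs_error(frame_a, frame_b):
    """Mean absolute per-channel difference between two framebuffers,
    given as flat lists of 0-255 channel values of equal length."""
    assert len(frame_a) == len(frame_b)
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def quantize(frame, step=8):
    """Crude stand-in for lossy texture compression: snap channels to a grid."""
    return [(v // step) * step for v in frame]

reference = [17, 123, 200, 64, 99, 250]  # refrast output (synthetic)
suspect = quantize(reference)            # "compressed" driver output

print(mean_abs_error(reference, reference))  # 0.0 -- identical frames
print(mean_abs_error(reference, suspect))    # nonzero -- difference detected
```

In practice one would threshold the error per game test, since legitimate precision differences between vendors also produce small nonzero diffs.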
     
  8. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    2
    Location:
    Canada
Could you point me to where you read that at Xbitlabs?
     
  9. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    2
    Location:
    Canada
    It was Tero Sarkkinen according to The Tech Report.

    http://www.tech-report.com/onearticle.x/5874
     
  10. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
    My mistake, it was from Digit-life.

    -FUDie
     
  11. Neeyik

    Neeyik Homo ergaster
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,231
    Likes Received:
    45
    Location:
    Cumbria, UK
    Nope. Both cards were first tested with build 330 across the various drivers and then tested again with the new patch. The drops you see are simply between the 330 and 340 patches.
     
  12. madshi

    Regular

    Joined:
    Jul 26, 2002
    Messages:
    359
    Likes Received:
    0
    I'd find it rather funny, but I fear that a lot of people will actually believe this:

    http://www.xbitlabs.com/news/video/display/20031112181114.html

     
  13. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
    Riiight. They expect us to believe that the CPU is doing the rendering. :roll:

    -FUDie
     
  14. nggalai

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    402
    Likes Received:
    0
    Location:
    /home/rb/Switzerland
    Or, rather, that the "unified compiler" was supposed to run on the GPU but now is forced to run on the CPU. ;D

    FUD or SNAFU?

    93,
    -Sascha.rb
     
  15. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    This would be worth checking, because if it can't be patched, the driver shouldn't be approved. I also remember hearing tell of "compressed textures" back around the time after build 330.
     
  16. XForce

    Newcomer

    Joined:
    Jun 12, 2003
    Messages:
    58
    Likes Received:
    0
OK, so, according to his logic, CPU activity would go up measurably during the game tests that lost the most FPS, which could easily be monitored.
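XForce's check can be sketched with nothing but the standard library. This is an illustrative stand-in (the "benchmark" is a busy loop, not 3DMark): if shader work had moved to the CPU, process CPU time would track wall-clock time during the affected tests instead of staying near zero.

```python
# Sketch: measure how much of a test's wall-clock time is spent on CPU work
# in this process. A GPU-bound test should score near 0; CPU-side rendering
# would push the ratio toward 1.

import time

def cpu_utilization(workload):
    """Run workload(); return process-CPU-seconds per wall-clock second."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    workload()
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    return cpu / wall

def busy():  # CPU-bound stand-in for "shaders run on the CPU"
    sum(i * i for i in range(2_000_000))

def idle():  # mostly waiting, like a GPU-bound test
    time.sleep(0.2)

print(f"busy: {cpu_utilization(busy):.2f}")  # near 1.0
print(f"idle: {cpu_utilization(idle):.2f}")  # near 0.0
```

A real check would sample around each 3DMark game test (or watch overall system CPU usage externally) rather than timing an artificial workload.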
     
  17. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
Who are they kidding? Seriously, they are an FM beta member, so chances are they had ample access to the candidate builds of the patch. If this were a genuine concern, such as the application "blocking the unified compiler" (which is a no-go from a technical standpoint anyway), I'm pretty sure they would have brought it up with FM and worked out a solution together.

Which is more likely:
1) FM is, once again, out to "get Nvidia";
2) The new 340 patch breaks the Nvidia compiler, even though the compiler works at a much lower level and does not need applications to be aware of its existence;
3) Nvidia was caught once again with their hand in the cookie jar (and is trying to wiggle their way out with "but it's breaking our genuine technology" and "if the benchmark tries to emulate a game environment, then it's OK for us to optimize for it").
     
  18. Katsa

    Newcomer

    Joined:
    Feb 12, 2003
    Messages:
    11
    Likes Received:
    0
    GPU compiler == using idle CPU time during 3dmark to run some vertex shaders on the CPU?
     
  19. nggalai

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    402
    Likes Received:
    0
    Location:
    /home/rb/Switzerland
    I wonder. Previous incarnations of 3DMark03 (i.e. 320 and 330 patches) didn't scale all that much with CPU on my GFFX 5800, no matter whether I ran benchmarks on a 2.4GHz P4 or a 1GHz P3. 'twas in the 1-2% region, overall.

    93,
    -Sascha.rb
     
  20. madshi

    Regular

    Joined:
    Jul 26, 2002
    Messages:
    359
    Likes Received:
    0
    More bullshit coming from NVidia:

    http://www.theinquirer.net/?article=12657

All games run this way? Most games were written well before the Det 5x.xx drivers, so the game developers didn't even have a chance to write their games in a way that pleases NVidia's "Unified Compiler technology". If they *nevertheless* *all* run this way - then how can 3DMark behave differently, if all they did was reorder the shader instructions? That's nothing but pure bullshit, dear Derek Perez. Do you honestly think people will fall for that? Well, some probably will...
     