New 3DMark03 Patch 330

Discussion in 'Architecture and Products' started by Nick[FM], May 23, 2003.

  1. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    And what evidence leads you to assume that? I'm not saying I know something but I think it's way, way too early to make such assumptions.
     
  2. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    martrox,

    Where did I say that ATI was guilty of cheating? I said that there's a drop of more or less 8% only in GT4, and that it could be optimisation or cheating. May I quote myself?

     
  3. gkar1

    Regular

    Joined:
    Jul 20, 2002
    Messages:
    614
    Likes Received:
    7
    Please give it up with the % drop/increase math. The straws aren't that long. :roll:
     
  4. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    It's clear to me / has been proven "beyond a reasonable doubt" that SOME of the nVidia "optimizations" are in fact cheats. For example, the cheats that involve knowing the position of the camera (clip planes / some buffer clears). The fact that these cheats rely on a fixed camera path makes it a 95% certainty that they are deliberate cheats. The fact that these cheats turn on based on some sort of app / scene detection moves that to 99.44%.
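A rough sketch of why a hard-coded clip plane only "works" on a fixed camera path (the plane, scene names, and positions below are all illustrative, not taken from the actual drivers): a plane precomputed for the scripted camera correctly skips geometry that is always off-screen on the rails, but the moment the camera moves freely, the same static test culls geometry that should be visible.

```python
def side(plane, point):
    """Signed distance of a point from a plane given as (a, b, c, d)."""
    a, b, c, d = plane
    x, y, z = point
    return a * x + b * y + c * z + d

# Hypothetical plane hand-tuned for the benchmark's scripted camera:
# on the fixed path, everything with x < 0 is "known" to be off-screen.
static_clip = (1.0, 0.0, 0.0, 0.0)

scene = {
    "building_left": (-5.0, 0.0, 10.0),   # off-screen on the rails
    "building_right": (5.0, 0.0, 10.0),   # visible on the rails
}

def drawn_objects(clip):
    """Draw only objects on the positive side of the static clip plane."""
    return [name for name, pos in scene.items() if side(clip, pos) >= 0.0]

# On the scripted path the culled object really was invisible, so the
# score rises with no visible artifact.
print(drawn_objects(static_clip))  # ['building_right']
```

Rotate the camera off the scripted path (as the developer version of 3DMark03 allows) and the same static plane now culls geometry that is actually on screen, which is exactly how this class of cheat was exposed.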

    Replacing shader code with other code that does not EXACTLY reproduce results is also cheating. It is even "highly questionable" to replace shader code with different code that DOES produce the exact same results, though I could at least see an argument for that not being cheating.

    I will repeat what I said earlier. This does not mean that I assume ATI's optimization is a cheat (or not). Nor does it mean that I assume EVERY nVidia optimization is a cheat. (There can be legitimate optimizations that rely on detection, IMO.)

    But in those specific cases at least, it has now been proven to me that nVidia has cheated.
     
  5. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Joe,
    I think John was speaking of that:
     
  6. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    I guess what we are going to need here is a better idea of just what is considered a cheat vs. an optimisation. And it's up to FM to establish this and present it to the IHVs. Does Dave's supposition qualify as a cheat?

    And, IF FM decides that what ATI has done is a cheat, on what scale should we compare ATI and nVidia? Is what ATI may have done comparable to what nVidia has done? I really don't think that the two can even be compared....
     
  7. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Well, I think FM is pretty clear about it: If you need to "detect" something (application, scene, shader), then consider it a cheat.

    I think that's a valid position to take for a synthetic benchmark.
     
  8. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    optimisations/cheats...... what does that look like to you, ED? Looks like the word "cheats" to me..... And, at this point, can you admit that nVidia is cheating?......
     
  9. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    I am glad to see the policing of this benchmark is keeping a level playing field. Consumers and gamers don't need to be misled, and I thank Futuremark and its BETA partners for doing the right thing.
     
  10. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    I have to agree with it, then....my bad......
     
  11. AJ

    AJ
    Newcomer

    Joined:
    Feb 18, 2003
    Messages:
    78
    Likes Received:
    0
    Location:
    Finland
    We're very clear on this matter. In fact it's all documented in the 3DMark help files and license agreement. Benchmark-specific optimizations (i.e. ones which detect that 3DMark is running and change the way things work) are not allowed during a benchmark run.

    We have a long (4-6 month) specification phase for each benchmark that we release. Once the benchmark is released, the data sets may not be changed, in order to ensure apples-to-apples comparisons across different hardware. This is a policy we've had in place since we started working on benchmarks in 1997, so I think we've been pretty consistent on this point.

    The objective of 3DMark is not to find out who writes the most efficient shaders and can replace them in drivers during a benchmark run. Were that our goal, we'd do an open-source benchmark or construct the whole product completely differently.

    Cheers,

    AJ
     
  12. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I am disappointed if it is true that ATi did something on one of the tests to increase its score--especially since it was so unnecessary.

    Here are my '03 scores for my 9800 Pro (128MB) clocked at 445MHz, a stock-configuration card (I run the card continuously @ 455MHz/365.x without difficulty):

    3.2 = 5860

    3.3 = 5776

    These were run with the 3.4 Catalysts, with 3DMark03 in its default configuration and FSAA and AF set to Application Preference in the Control Panel. My CPU is an Athlon XP 2000+ Thoroughbred; the motherboard is a Chaintech 7NJS (nF2 chipset running the latest BIOS and the 2.41 nVidia drivers, excluding the nVidia audio drivers), with 1GB of PC2100 RAM running at 133MHz (2x512MB).

    I am not surprised to see the score some ~1100 points higher than the $500 NV35 reference-design review cards from nVidia when the benchmark is rendered correctly by both products. Moreover, it is obvious why nVidia decided to cheat--although it is certainly deplorable.

    I also find it hypocritical in the extreme that nVidia would publicly denigrate the benchmark and resign from the 3DMark beta program, yet still recognize the marketing potential of 3DMark to the degree that the company felt taking a chance on cheating it was justified in order to post inflated scores, which it felt would stimulate sales of the yet-to-ship product. Well, it flipped a coin and lost. Here's hoping nVidia will recognize that not only do its products need work, but so do its ethics.

    I think the variation in the ATi scores is so slight that there might be a simple explanation for what they did--although I cannot condone a driver "recognizing" a benchmark test and altering its behavior, even slightly. The interesting thing is that whatever ATi did wasn't enough to push the difference in score beyond a statistical norm variation of + or - 3%. Certainly nothing approaching nVidia's ~24%+ differential, which is of course well outside the possibility of normal variation.

    Kudos to FutureMark for handling this as they have done as the issue is now settled beyond a reasonable doubt.

    Edits: typos
     
  13. Nite_Hawk

    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    1,202
    Likes Received:
    35
    Location:
    Minneapolis, MN
    HardOCP's reply is on their front page. They blame futuremark.

    Nite_Hawk
     
  14. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
  15. ♪Anode

    Newcomer

    Joined:
    May 9, 2003
    Messages:
    10
    Likes Received:
    0
    I am not sure about this approach. 3DMark is made so that it approximates game performance. We all know that games normally have specific codepaths for specific hardware. Each GPU (nv/ati) does certain things a lot better than the others, so normally a game dev (with guidance from an IHV) would optimise for these things to get the maximum performance possible.
    Now 3DMark does most of these things in a fixed way (or the DX9 way) and doesn't have specific codepaths. So in effect its performance does not correspond to what a particular GPU is fully capable of. Hence you might see optimisations from IHVs to show what their GPU is truly capable of in 3DMark. They just can't say "to hell with 3DMark", since a lot of people and OEMs use it to judge a graphics product.
    The question here is where the line lies between such optimisations and cheating. Should Futuremark also use the approach of game devs and have specific codepaths to remove all doubt from everyone's mind?
     
  16. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    OK... I am definitely wrong here for saying ATI may not be guilty......
    I am now supping on a small black bird that makes "caw" sounds.....

    Edited to make sense......hehe!
     
  17. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Why am I not surprised? I'm also feeling pretty good about not having gone there to read it.
     
  18. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    Yes, I was. Thanks for pointing that out. . .I wasn't being clear enough even with the added italics.

    And, for the record, I'm not saying ATi is guilty. I'm only saying let's not assume anything yet.
     
  19. Ilfirin

    Regular

    Joined:
    Jul 29, 2002
    Messages:
    425
    Likes Received:
    0
    Location:
    NC
    What happens to the output if you make very slight changes (like making the water red) to these detected shaders and re-run?
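A sketch of why this experiment is revealing, assuming the driver recognizes known benchmark shaders by an exact fingerprint of their code (the matching mechanism and all names below are assumptions for illustration): any change at all, even a tinted constant, stops the fingerprint from matching, so the hand-tuned substitute is silently dropped and performance falls back to the generic path.

```python
import hashlib

def fingerprint(shader_source: str) -> str:
    """Exact fingerprint of a shader's source text."""
    return hashlib.sha1(shader_source.encode()).hexdigest()

# Hypothetical driver-side table: known benchmark shader -> replacement.
replacements = {}

water_shader = "ps_2_0 ... mul r0, r0, c0  ; tint constant c0 = blue"
replacements[fingerprint(water_shader)] = "hand-tuned fast path"

def compile_shader(source: str) -> str:
    """Substitute a known shader if it matches exactly, else compile normally."""
    return replacements.get(fingerprint(source), "generic compile")

print(compile_shader(water_shader))  # hand-tuned fast path

# Change one constant (blue water -> red water): the fingerprint no longer
# matches, the substitution stops, and the score drops back down.
red_water = water_shader.replace("blue", "red")
print(compile_shader(red_water))  # generic compile
```

So if a trivial, visually harmless edit to a shader produces a large score drop on one card and no change on another, that is strong evidence of shader detection rather than a general-purpose compiler optimization.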
     
  20. Ante P

    Veteran

    Joined:
    Mar 24, 2002
    Messages:
    1,448
    Likes Received:
    0
    Abit GeForce FX 5800
    build 320: 4700 marks
    build 330: 3400 marks

    28% less...
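The drops quoted in this thread check out (scores taken from the posts above):

```python
def pct_drop(before, after):
    """Percentage drop from a before-score to an after-score."""
    return 100.0 * (before - after) / before

# Ante P's GeForce FX 5800, build 320 -> build 330:
print(round(pct_drop(4700, 3400), 1))  # 27.7

# WaltC's 9800 Pro, build 320 -> build 330:
print(round(pct_drop(5860, 5776), 1))  # 1.4
```

That ~28% swing on the FX 5800 versus ~1.4% on the 9800 Pro is the gap the thread is arguing over.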
     