Futuremark Announces Patch for 3DMark03

Discussion in 'Graphics and Semiconductor Industry' started by Nick[FM], Nov 11, 2003.

  1. Magic-Sim

    Newcomer

    Joined:
    Nov 14, 2003
    Messages:
    99
    Likes Received:
    0
    Location:
    Calais (France)
    I agree with your remark: even people with good background knowledge miss the point.

    Rewriting shaders is a painful hassle, a waste of time and money, and it just creates a pale copy of the original work.

    Recompilation, when done properly, can save time for the developer and for the IHV, and should normally produce the correct...... output.

    Never mind, it is sometimes very difficult to see all the facets of a problem, especially when it is flooded with awfully written PR pieces.

    I am pleased and surprised by the one issued by ATI. Clean, neat, factual, and thoughtful.

    And they do not find it necessary to quote their Nasdaq reference, copyright information, etc., blah blah blah. This shows that ATI truly wants to communicate on this, and not just spin the PR wheel.
     
  2. Rolf N

    Rolf N Recurring Membmare
    Veteran

    Joined:
    Aug 18, 2003
    Messages:
    2,494
    Likes Received:
    55
    Location:
    yes
    Extremely OTish

    Can I pull you aside for a second? :D
    I've pondered the basic idea of treating every target as VLIW/MIMD for a while ...
    I.e., find parallel dependency chains and 'drop' independent operations into the slots of a VLIW packet until you find a best match, where "best" is defined as a combination of a) the number of source operations consumed, adjusted by b) a weight attached to the type of instruction packet you end up generating.

    Ummm ... like, several types of packets: pure texture sampling packets, MAD+MUL+use-constant packets, TEX+TEX+LRP+ADD packets, etc. Each with a predetermined 'goodness' factor attached, which could be further adjusted with respect to the last issued packet (to model scheduling issues).

    This was about CPU-based runtime stream kernel compilation. Should be pretty O(n)-ish. I unfortunately haven't gotten very far yet :oops:
    Does this sound sensible? 8)
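    A minimal Python sketch of the greedy packing idea above, just to make it concrete: the instruction model, packet templates, and weights are all invented for illustration, and this is not any IHV's actual compiler.

```python
# Hypothetical greedy VLIW packer -- instruction model, packet templates and
# weights are invented purely to illustrate the idea in the post above.
from dataclasses import dataclass, field

@dataclass
class Instr:
    op: str                                   # e.g. "TEX", "MAD", "MUL", "LRP", "ADD"
    deps: set = field(default_factory=set)    # indices of instructions this one needs

# Candidate packet shapes, each with a predetermined 'goodness' weight.
PACKET_TEMPLATES = [
    (("TEX", "TEX", "LRP", "ADD"), 4.0),
    (("MAD", "MUL"),               3.0),
    (("TEX", "TEX"),               2.5),
    (("MAD",),                     1.0),
    (("MUL",),                     1.0),
    (("ADD",),                     1.0),
]

def pack(instrs):
    """Greedily emit packets: at each step, pick the template whose score
    (ops consumed x goodness weight) is highest among ready instructions."""
    done, packets = set(), []
    while len(done) < len(instrs):
        ready = [i for i, ins in enumerate(instrs)
                 if i not in done and ins.deps <= done]
        best = None
        for shape, weight in PACKET_TEMPLATES:
            picked, pool = [], list(ready)
            for slot in shape:
                j = next((k for k in pool if instrs[k].op == slot), None)
                if j is not None:
                    picked.append(j)
                    pool.remove(j)
            if picked:
                score = weight * len(picked)   # a) ops consumed, b) packet weight
                if best is None or score > best[0]:
                    best = (score, picked)
        if best is None:                       # nothing matched a template: issue one op alone
            best = (0.0, [ready[0]])
        packets.append([instrs[i].op for i in best[1]])
        done |= set(best[1])
    return packets

# Tiny usage example: two texture fetches feeding a LRP, plus an independent MAD.
prog = [Instr("TEX"), Instr("TEX"), Instr("LRP", {0, 1}), Instr("MAD")]
print(pack(prog))   # e.g. [['TEX', 'TEX'], ['LRP'], ['MAD']] depending on the weights
```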
     
  3. Rolf N

    Rolf N Recurring Membmare
    Veteran

    Joined:
    Aug 18, 2003
    Messages:
    2,494
    Likes Received:
    55
    Location:
    yes
    Hell yeah! They actually used human vocabulary :D
    "Developers hate it."
    "<...> performance sucks?"
    "running like molasses in January" :lol:
     
  4. Magic-Sim

    Newcomer

    Joined:
    Nov 14, 2003
    Messages:
    99
    Likes Received:
    0
    Location:
    Calais (France)
    Well, I had a hard time translating the statement into French. I was laughing too much at times ;)
     
  5. XForce

    Newcomer

    Joined:
    Jun 12, 2003
    Messages:
    58
    Likes Received:
    0
    I believe you did, as would I! :D

    It's tragic that I can't share this with some of my friends, who have no clue WTF shaders are anyway.

    Which leads us to the core of what sustained NV for so long:
    they've taken this to a technical level where the general public has no chance of understanding what's actually going on.

    People love simple answers.

    Example #1
    "FM is siding with Ati to make NV look bad, don't use the evil evil performance crippling 340 patch"

    Example #2
    "Patch 340 can't disable the shader compiler, same as no other application could. Anyway, if it were possible and done, drivers which factually don't have this compiler yet wouldn't be affected, yet they are in exactly the same way"

    Take your pick.
    Ok, good choice.

    Now go and ask the kid clerk in your local computer store.
     
  6. AndY1

    Newcomer

    Joined:
    Aug 22, 2003
    Messages:
    24
    Likes Received:
    0
    http://www.xbitlabs.com/news/video/display/20031114041519.html

     
  7. Sharkfood

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    702
    Likes Received:
    11
    Location:
    Bay Area, California
    Although this was discussed a bit on the first few pages of this thread (and maybe a new thread is in order)... it has now been totally glossed over during this whole debacle- what's to become of FutureMark?

    FutureMark released a PDF on what they define to be "valid" drivers and have also included the 52.16s on their list of what they consider to be valid/verified drivers that are supposed to adhere to the guidelines set forth in their PDF.

    The problem is- the 52.16s clearly break almost every "rule" laid out in the FutureMark PDF. So they are validating/certifying drivers that show every sign of NOT conforming to the very rules they have set forth for validation.

    The recent 3DMark patch is a tool that illustrates this point exactly, but there is no way to be certain it has caught or otherwise squelched all the cheats that may exist in these drivers. By defeating *some* cheats, it has clearly demonstrated that these drivers (by FutureMark's own rules lol) need to be disqualified.

    I'm left scratching my head as to why the 52.16s hit the FutureMark "certified" list... and what that does to the (now useless) PDF. Without either completely discarding the PDF or modifying it dramatically, it becomes a big paradox by its own terms for as long as the 52.16s sit on their list of allowable drivers.
     
  8. nggalai

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    402
    Likes Received:
    0
    Location:
    /home/rb/Switzerland
    Err, LOL? :roll:

    93,
    -Sascha.rb
     
  9. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    Well, there is a very simple explanation, more or less confirmed by none other than Derek Perez himself... The compiler is doing both genuine generic optimizations (register reuse, reordering...), which are not hindered by the 340 patch, and specific shader detection/replacement based on shader fingerprints, which is against 3DMark's rules and is removed by the 340 patch...

    How someone could call specific shader replacement part of a "unified compiler technology" still boggles my mind, though. :roll:

    IMHO, Mr. Alibrandi's backpedaling is a direct result of the Internet backlash and the ATI press release regarding their ridiculous "compiler is disabled" statement... Or a realisation that saying an application had to be "compiler-aware" would, in the long term, be more devastating to their already badly shattered reputation than a few hundred 3DMark points. They are running out of options and wiggle room very quickly on this one.

    Ok, no problem so far.

    And that's where the problem is. Since the compiler is not disabled (as understood by anyone with a little technical knowledge, and admitted by NV), why on Earth would that re-ordering not take place? Especially since running the shaders from v330 and v340 through another program produces the same result? The only technical explanation, of course, is that there is no re-ordering taking place because this specific "optimization" does not rely on generic re-ordering but on app-detection, something explicitly forbidden by 3DMark.
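    A rough Python sketch of the distinction being drawn here, purely for illustration: the hashes, shader text, and replacement table are invented, and no real driver's internals are implied. The point is that a generic optimization pass transforms whatever it is given, while fingerprint-based replacement only fires on an exact match, which is why shaders reordered by patch 340 would fall back to the generic path.

```python
# Illustration only: the fingerprint scheme, shader text and replacement table
# are invented; this does not describe any actual driver's code.
import hashlib

def fingerprint(shader_text: str) -> str:
    """Identify a shader by a hash of its exact source text."""
    return hashlib.md5(shader_text.encode()).hexdigest()

# Hand-tuned replacements keyed on fingerprints of known benchmark shaders.
REPLACEMENT_TABLE = {
    "<fingerprint of the original benchmark shader>": "; hand-scheduled replacement goes here",
}

def generic_optimize(shader_text: str) -> str:
    """Stands in for legitimate, input-independent work such as register
    reuse and instruction reordering (the real transformations are omitted)."""
    return shader_text

def compile_shader(shader_text: str) -> str:
    key = fingerprint(shader_text)
    if key in REPLACEMENT_TABLE:
        # Shader detection: fires only on an exact match, so a shader that has
        # been reshuffled by the application no longer hits this path.
        return REPLACEMENT_TABLE[key]
    # Generic path: applies to any input, and is untouched by the patch.
    return generic_optimize(shader_text)
```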

    I see Gainward moving to ATI soon... Their Managing Director is sure bound to love looking like a fool to buy Nvidia some time.
     
  10. PatrickL

    Veteran

    Joined:
    Mar 3, 2003
    Messages:
    1,315
    Likes Received:
    13
    They just need to add something along these lines:

    What is a certified driver for FM?
    - A driver that follows the guidelines
    OR
    - A driver whose inaccurate behaviour is disabled by the current build.

    It would maybe be clearer for FM to write that down, if not done already :)
     
  11. Exxtreme

    Newcomer

    Joined:
    Feb 7, 2002
    Messages:
    87
    Likes Received:
    0
    Location:
    Germany
    Sorry but this is really ridiculous. :lol: :lol: :lol:
     
  12. whql

    Regular

    Joined:
    Mar 8, 2003
    Messages:
    278
    Likes Received:
    1
    no, the backpedaling is just because a number of people spoke before they had the full party line from HQ - yesterday there were two or three different, conflicting messages from different parts of the NVIDIA family, but those are now being gathered up and mashed into a single message, with the US dictating what should be said.

    Still, the backpedaling will do nothing but confuse the uninitiated and make them doubt what they are saying.
     
  13. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,560
    Likes Received:
    157
    Location:
    In the Island of Sodor, where the steam trains lie
    How about this answer -
    (DISCLAIMER This is all speculation of course...)

    #3 A hypothetical company currently has a poor quality compiler but it probably expects it to improve in future revisions. To make their system look better in the interim (until they can make the compiler more intelligent) they put in some customised, hand-coded examples. FM changed their code so it no longer matches those examples.
     
  14. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    Simon, read a few posts back - I think that is out the window since work on the compiler has stopped, i.e. it's already optimal for "mathematically equivalent" shaders. It's probable that the hand-coded shaders are now wholly mathematically equivalent.
     
  15. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    According to FM it's the latter. They have stopped Nvidia from cheating, therefore the driver is valid for their benchmarks. It seems to be a way to get a baseline so that Nvidia is included in the Futuremark programme, but it's possible that Nvidia never gets another "approved" driver while they are still cheating.

    Of course there are still doubts raised in this and other threads that FM have actually caught all the cheats.
     
  16. nggalai

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    402
    Likes Received:
    0
    Location:
    /home/rb/Switzerland
    Exactly. I work in PR in my day job (ok, we prefer to call it "marketing and communications" ;)), and the way NV has handled the past couple of months suggests to me that a number of people will be looking for new jobs come next quarter. Accidents DO happen, but everything concerning Futuremark has been handled rather ... let's just say not the way big corporations should do it.

    I wonder which of the PR blokes will still be around in 2004.

    93,
    -Sascha.rb
     
  17. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    Heheh. . . In other words, it's a bug -- just like the custom clipping planes! :wink: :lol:
     
  18. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,560
    Likes Received:
    157
    Location:
    In the Island of Sodor, where the steam trains lie
    I saw that but disregarded it. It's like saying that work on the drivers has been stopped.
    Did you instead mean to write "mathematically approximate" ?
     
  19. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    Re: Extremely OTish

    Scheduling is non-trivial and there's dozens of different ways of approaching the problem. The fundamental issue is that every instruction potentially affects every other instruction. What's right for the application depends on what your input looks like, and what your scheduling priorities are.

    O(n) scheduling algorithms often only have a fairly linear view of things, and therefore make bad guesses because they don't know what's coming later. Which isn't to say they can't give good results, but they are certainly likely to be a bit on the wobbly side (in terms of the quality of the output code) unless they're shored up carefully, perhaps using some multipass approach to derive data for tuning heuristics.

    Scheduling is the one place I've found where a human can do better than a compiler because the human can better spot fourth-order changes, 'If I move this, swap this, invert that, and invert that, then this fits in this slot here'; but humans make frequent mistakes in scheduling (because it's terribly easy to miss some dependency) and the compiler often spots things that a human misses particularly as the code gets longer.
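    For concreteness, here is a bare-bones list scheduler of the "linear view" kind described above, in Python. The instruction stream, dependency sets, latency table, and the longest-latency-first heuristic are all invented for illustration: at each cycle it only considers what is ready right now, so it cannot anticipate the kind of fourth-order rearrangement a human might spot.

```python
# Bare-bones single-issue list scheduler; the instruction stream, dependency
# sets and latency table below are invented purely for illustration.
def list_schedule(instrs, deps, latency):
    """instrs[i]: opcode string; deps[i]: set of indices i depends on;
    latency[op]: cycles before a result can be consumed."""
    finish = {}                    # index -> cycle at which its result is available
    issued, order, cycle = set(), [], 0
    while len(order) < len(instrs):
        ready = [i for i in range(len(instrs))
                 if i not in issued
                 and all(finish.get(d, float("inf")) <= cycle for d in deps[i])]
        if ready:
            # Local heuristic: issue the ready op with the longest latency first.
            i = max(ready, key=lambda k: latency[instrs[k]])
            order.append(i)
            issued.add(i)
            finish[i] = cycle + latency[instrs[i]]
        cycle += 1                 # nothing ready: the machine stalls this cycle
    return order, cycle

# Usage: a texture fetch feeding a MAD, plus two independent ALU ops.
ops     = ["TEX", "MAD", "MUL", "ADD"]
deps    = [set(), {0}, set(), set()]
latency = {"TEX": 4, "MAD": 1, "MUL": 1, "ADD": 1}
print(list_schedule(ops, deps, latency))   # issue order and total cycles, stalls included
```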

    I'm glad you didn't mention algorithmic complexity on Tuesday. Something which worked perfectly reasonably for months turned out to have an O(2^n) pathological case. Ouch. Spent four hours debugging that assuming it was an infinite loop...
     
  20. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    Where have I heard that before?
     