Doom3 benches revisited.

Discussion in 'Architecture and Products' started by jjayb, May 19, 2003.

  1. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    70
    Actually, you should stop using 3DM03 with nVidia hardware until the cheating issue is resolved one way or another.

    Otherwise, you should not stop using 3DMark 03, because the benchmark is publicly available to anyone. nVidia won't be "surprised" to see 3DMark03 benchmarks appear in any web reviews... so if they choose not to provide drivers that optimize for it, that's their decision.
     
  2. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
    How did you come to that conclusion based on what I wrote? If you're going to speak for me, I won't respond at all.

    You said:
    So you're basing your "thoughts" on things that happened last year. What else has happened in the last year regarding Doom 3? Maybe a leak that many people have attributed to ATI? You think that may have affected ATI's access to Doom 3?

    The point I was making in the original quote above was that you can't always judge current relationships between companies by the past. How closely do you think Carmack is working with 3dfx these days?

    -FUDie
     
  3. Mephisto

    Newcomer

    Joined:
    Feb 7, 2002
    Messages:
    200
    Likes Received:
    0
    Define "setting up". Did you install WinXP yourself? Was it the naked hardware NVIDIA provided to you?

    For spending their engineering time on performance optimisations for available games instead of unreleased game alphas?

    Did you really? Have you detected the Serious Sam or 3DM2003 cheats for example?

    Sorry Borsti, your 4X AA scores in your review are incorrect as stated by yourself several times.

    Did I publish a RADEON 9800 vs. GeForce FX 5900 review or did you?

    You're aware of how CPU and fillrate limits interact to determine the final fps score, aren't you?

    Understand my reaction: after recognizing such serious flaws in the way you (and probably the other DOOM3 testers too) test 3D graphics cards (benchmarking in a rush, not being aware of what driver settings affect, publishing wrong numbers, not being aware of what game settings mean), I got quite a bit scared that there are more "issues" behind all these DOOM3 tests.

    Nevertheless, at least you seem to "learn".
     
  4. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    http://www.rage3d.com/board/showthread.php?s=&threadid=33685071&perpage=20&pagenumber=2

    Terry Makedon above sums it up well: websites saw a chance to get some hits through co-operation with Nvidia and jumped at it.
    Nobody is pulling the wool over this old timer's eyes, and it shows what I've been preaching about for a while: journalistic integrity sucks... big time :!: :!:

    It's all about money.
     
  5. saf1

    Newcomer

    Joined:
    Nov 19, 2002
    Messages:
    235
    Likes Received:
    1
    Yep - just pimping traffic and hits. Not a problem though. I just do not waste my bandwidth on sites like this anymore.
     
  6. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    1
    Location:
    Canada
    http://biz.yahoo.com/rc/030512/tech_graphics_1.html

    They used the combination of DoomIII and the 3DMark2003 cheat to manipulate the perception that the GeForce FX 5900 Ultra is faster than the Radeon 9800 Pro. We all know that these benchmarks are used by OEMs to determine which video card goes in the next model... right?
     
  7. Borsti

    Newcomer

    Joined:
    Feb 14, 2003
    Messages:
    91
    Likes Received:
    0
    For HQ that's correct. I did not test minimal.

    The last one is correct. If you change the Doom III setting from Medium Quality to High Quality, it will use aniso automatically. I can't tell what level of aniso, but I'll ask JC about that. It will use aniso whether you disabled aniso in the driver or not. That's what I meant when I said that there are issues with the driver settings. And that's the reason why I did not like to post AF numbers (meaning driver-forced AF settings). I ran some tests with the NV35. With 8x AF (Quality, forced in the driver) the performance dropped from 83 to 80.8 (in Medium Quality). And it dropped from 55.0 to 54.5 in HQ (all in 10x7). So there's more going on than only aniso. That's why I did not post more results on that.

    That's correct. But I'm not sure about the reasons for that. NV says it must be a driver bug... maybe because of the uncompressed textures. But I still wanted to post those results since they show the Radeon in the lead.

    See above. A slight performance drop between "standard" AF quality (as requested by the game) and "forced" aniso.

    Let's look at the numbers:

    Medium Quality, 10x7
    NV35, no AF: 83.0
    NV35, forced 8x AF Quality in the drivers: 80.8

    High Quality, 10x7
    NV35, no AF (in the drivers): 55.0
    NV35, forced 8x AF Quality in the drivers: 54.5

    So it looks like the performance drop of the NV35 in High Quality mode has nothing to do with aniso at all. It seems to be trouble with the textures or whatever. I feel very bad that I did not run more HQ tests with the R350. That would make things much clearer now.
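
    To put those figures in perspective, here is a minimal Python sketch (an illustration only, using just the fps numbers quoted above; it is not part of the original benchmark run) comparing the relative cost of forced 8x AF with the cost of switching from Medium to High Quality:

        def pct_drop(before, after):
            # percentage performance loss going from 'before' fps to 'after' fps
            return (before - after) / before * 100

        # NV35 at 10x7, figures quoted in the post above
        print(round(pct_drop(83.0, 80.8), 1))  # forced 8x AF cost in Medium Quality: ~2.7%
        print(round(pct_drop(55.0, 54.5), 1))  # forced 8x AF cost in High Quality: ~0.9%
        print(round(pct_drop(83.0, 55.0), 1))  # Medium -> High Quality cost (no forced AF): ~33.7%

    If aniso alone were responsible, the forced-AF runs would be expected to show a drop of the same order as the Medium-to-HQ switch; a couple of percent versus roughly a third of the frame rate points at something else, such as the uncompressed textures.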

    I'll ask him about that!

    You think there's parallel development in the drivers? There are MANY games out there... this would make compatibility testing almost impossible. I know that NV already has optimized code for not-yet-released games in it. I would be surprised if ATI does this differently.

    That's right. But id said that they are ready.

    I see this a little bit differently. :)

    Well, that's what they say.

     
  8. saf1

    Newcomer

    Joined:
    Nov 19, 2002
    Messages:
    235
    Likes Received:
    1
    Sabastian - that right there is the article I was talking about earlier. Yep - people took the scores and ran with it.

    I guess the amateurs needed a startling statistic for an attention getter...
     
  9. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    70
    I know you didn't test minimal, I misspoke. ;)

    Then you don't understand how these driver settings work. These driver settings don't force OFF aniso. They either force on a certain setting, or they let the application choose the filtering method.

    The only question is, if you force on one type of setting, "performance 4x" for example, and you also "turn on aniso" in the game, what happens? That can vary depending on the driver.

    You should be safe posting "forced" AF numbers, as long as you ran Doom3 in medium quality mode. (Though you should be checking the quality of the Aniso that results.)

    I don't follow you. There IS more going on than aniso when going from medium to the HQ game setting. TC is turned off. The fact that the Radeon doesn't take nearly as large a performance hit when going from medium to high quality suggests any number of things:

    1) ATI simply handles non compressed textures and aniso with less performance impact
    2) There's a bug in nVidia's drivers causing slower than expected performance in high quality mode.
    3) There's a bug in ATI's drivers that are causing incorrect (lower) quality, and faster than expected performance in high quality mode
    4) There's a bug in nVidia's drivers causing faster than expected performance in medium quality mode.
    5) There's a bug in ATI's drivers causing slower than expected performance in medium quality mode.

    I just don't see why option 2 (the most favorable to nVidia) was laid out as "the" possibility in your article, and not option 1... or any of the other options which the data supports. In other words, your article reads like the high quality scores are an anomaly, when the truth is we don't know.

    For reference, do you have benchmarks with ATI hardware with the same settings (forced on or not)?

    Possibly....or that driver forcing on Aniso with the GeForce doesn't do anything at all in Doom3. (Did you check the image quality?)

    It is very surprising to see almost no performance drop with 8X "quality" aniso on the FX. This is unlike any other situation I know of. Look at your own UT benchmarks.

    In short....looking at your data (the medium quality numbers), I would be more suspect that nVidia has a driver bug that DOESN'T ACTUALLY TURN ON aniso with the control panel, or perhaps turns on a different setting (performance) than selected. And that the performance drop between medium and high quality doom3 settings is in fact a combination of proper aniso actually being turned on, and higher quality (more bandwidth sucking) textures.

    Again, I more suspect that "forcing on" Aniso isn't properly working with Doom3.

    Specifically, I would ask him what ATI driver build he was basing his "should be representative of performance" comments on.

    For a game/benchmark as important as Doom3? Wouldn't surprise me in the least.

    Apparently! :)

    That's right. Even if ID says that's the case. Unless you know that ID has the same drivers that are available to the public. There's a very easy retort to your argument:

    Fact 1: Latest ATI drivers are 3.4
    Fact 2: 3.4 drivers suck for Doom3

    There is no way you can reconcile those two facts, and believe that Carmack only has access to the latest public drivers, and also believes performance comparisons are fair.

    Circumstances fully outlined earlier: for example, are all parties aware that there will be a public benchmark release?

    See above.

    Ask yourself: why did you not just use the 3.4 cats then? ID said everything was cool....so there's "no reason" then to have any issue with the Cat 3.4 drivers, right?

    You are deluded by your own results, which is the problem. (EDIT: Inserted smiley here! :)) From what I can gather... the 9800 Pro is ALREADY good enough to go up against the NV35. They are pretty much equal in terms of performance. With the possible exception of Doom3.

    But the Doom3 scores are basically useless because of how the benchmarking was sponsored and done.

    So I certainly haven't concluded that ATI needs to "find something"...and the problem is, that's pretty much what every review that tested Doom3 concluded. Check the reviews that didn't bench Doom3, and it's a much different conclusion. Usually along the lines of "in some cases NV35 is faster, but not by much, and the overall image quality of the 9800 makes it an overall better deal."

    To be clear...if the Doom3 benchmarks are in fact truly representative, then there is a clear case to be made for NV35 superiority. (It comes down to a preference between performance or image quality.) Problem is, we really have no idea if they are representative or not.

    Thanks...it is appreciated! :)
     
  10. MrGaribaldi

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    611
    Likes Received:
    0
    Location:
    In transit
    Just felt like dropping my $.02 on this...

    What about people beta testing a game?
    There are quite a few people out there who are testing games that aren't anywhere near completion... How do you think they'd feel if the game they're testing was limited to 10 fps in the drivers?

    I'm quite sure that there's plenty of support for upcoming games in ATI's publicly released drivers, as sending out different driver sets to the people doing beta testing would be way too time-consuming and would play merry hell with re-integration into the public drivers...

    Wouldn't it be easier to do it all with the "unified" driver code, and do your tests on that, instead of needing to test several different code builds for different games, and re-testing it once you've gotten it all integrated into a "unified" driver?

    You might argue that having support for a game not-yet-released doesn't mean they have optimized for it... But that leaves the question, when do you start optimizing for the game?

    Most likely ATI will have had dev rel helping the game producer for quite some time making sure it'll work on ATI hardware...
    (Resolving driver issues vs game issues, suggesting ways to increase IQ etc)
    Wouldn't it be logical to optimize the drivers at the same time that the game is being beta tested, instead of just before it goes gold?

    This would also give you the benefit of having the game beta testers testing whatever optimizations you do, to make sure they don't degrade the IQ...

    The point I'm trying to make here is that having optimizations for games not-yet-released in the drivers might not be such a bad idea after all (maybe even a good idea)...

    Oh, and to whoever talked about "ATI is doing the right thing by optimizing their drivers for games currently out", I'm not so sure that's "the right thing to do"...
    Why hasn't ATI optimized their drivers for these games already?

    Yes, continuing to optimize for games after they're released is a Good Thing, but having no optimizations when the game is launched is a Bad Thing in my book...

    Does this make any sense to anyone but me?
     
  11. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    70
    Right... so how can anyone like Carmack say that the Catalyst drivers, which are limited to 10 FPS, are representative of game performance?

    The only logical answer is: Carmack could not be referring to the Cat 3.4 drivers. The question is, are the Cat 3.2s representative of what Carmack has? Given that the Cat 3.2s are a couple of months old as a PUBLIC release, who knows how far they are behind the latest dev release.

    I agree with that. Though I also agree that ID and Doom3 is a special case.

    I do understand your point... but that still doesn't reconcile with the Catalyst 3.4 release. We know that the Cat 3.4s don't represent final performance in Doom. It does not follow that the Cat 3.2s (which only utilize 128 MB of memory) should be representative of final performance.

    AGREED. The entire point in this case is: Doom3 is not launched. I would have no issues at all if ATI had been given a heads-up a month or so ago that this benchmark was coming.

    To be clear, it does make sense to have as little "parallel" driver development as possible. I do believe though, that if there is ever a reason to have a special parallel development version, it would be for Doom3.

    In any case, the fact that catalyst 3.4 is "broken" with Doom3 certainly means that Carmack was not basing his "representative" comments on that particular driver version.
     
  12. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    There are optimizations for OpenGL...that's part of writing the driver. What there might not be is specific optimizations for the unique way Doom 3 is doing things.

    Exactly how many games using ARB_fragment_program are there that are out right now? This is new territory, AFAIK.

    One hypothetical example: what if ATI was having issues in implementing F-Buffer support? OpenGL is the place they're likely to do it first, and getting it working would be more important than getting it working at high speed initially (it is primarily a development feature, not game feature). It seems quite likely that such a decision would be made for a shipping driver, with no idea that Doom 3 benchmarking would occur.

    There are plenty of other possibilities related to the uniqueness of Doom 3 and the fact that it won't be shipping for several months. None of them seems to have been considered when proposing "Shame on ATI" <- an unacceptable proposition (about ATI not optimizing enough :!:) from a reviewer, given the issues they've introduced in their own self-interest, IMO.
     
  13. MrGaribaldi

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    611
    Likes Received:
    0
    Location:
    In transit
    Agreed.
    But then again, he could be referring to the unreleased Cat 3.3.
    The Cat 3.4 seems to be hardlocked into producing 10 fps in the code, so it wouldn't be too unreasonable to think that he was thinking about the previous version of the drivers...

    To him that would logically be Cat 3.3, as they were supposed to be the previous release... That they didn't make it to the public is another matter...
    (No, I don't think Carmack thought about the fact that the 3.3 wasn't released, since ATI had trumpeted them for quite some time...)

    See above for the first part...

    As for the second, well, it should have given us some clues about the performance of the 9800 Pro 128 MB version... (If what I said above is correct)

    Fully agreed! I do not think that websites with the D3 demo did the right thing, since they included cards from IHVs other than Nvidia...


    I'm not too sure about there being a reason for having a special driver build for D3, as I think they should've incorporated the optimizations/bugfixes into the official drivers before they beta test it, but that's just my opinion.
     
  14. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    This is one possible thing I am worried about. I also mentioned it on JF_Aidan_Pryde's thread in regards to his request for suggestions on benchmarking the NV35. In light of recent events with Nvidia's Inflateonator drivers, I think it would be prudent for any reviewer to examine images to ensure that the settings requested are in fact produced. Furthermore, with respect to paths, it was mentioned in an interview with J.C. that the ARB path (which the 9800 uses) is of higher quality than the NV30 path which the NV35 uses. I know that J.C. said that the differences would be slight but, perhaps as a service to readers who may spend $500 on a video card, Kyle, Anand or Lars could elaborate. J.C. may be a great coder but I know nothing of his eyesight. :wink:
     
  15. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    I've seen 2 reviews for the NV35 that benched Splinter Cell with AA yet didn't have AA on for the FX cards. Do these reviewers not pay any attention?
     
  16. tEd

    tEd Casual Member
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,094
    Likes Received:
    58
    Location:
    switzerland
    that's just a NO. Sad but true

    For example, I know that Anand got many emails explaining the SC AA problem to him, but he didn't seem to care: he still hasn't changed his wrong Nvidia SC AA numbers in his review, and he still claims that the rendering artifacts occurring when AA is used in SC are solely an ATI driver problem -> it's sad really

    SC=SplinterCell
     
  17. Pete

    Pete Moderate Nuisance
    Moderator Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,943
    Likes Received:
    347
    I'm OK with their hardware, just not their drivers. ;)

    For what? For ruining your scoop so you couldn't show how a 9800P stacked up to a 5900U? Are you that self-centered? Because I can't think of another reason for you to say this. How can you be disappointed that ATi drivers haven't been optimized for a game when that game isn't available to be played anywhere yet? I can understand once it's been released, but not months before. I'm not sure it's appropriate for you to keep your Doom 3 scores up while you don't know whether ATi's current drivers are representative of their final performance. This seems to me something you and Anand and [H] should have clarified with ATi before, not after, benchmarking. A disappointing lack of professionalism on all your parts, IMO.
     
  18. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    One of their hardware reviewers posted in their forum last night that they are looking into it. That was a little under 24 hours ago.

    I also see that Lars still hasn't updated his review yet. Guess it takes a lot longer than I thought to change "high quality" to "medium quality" in a picture.

    Don't know what's worse: sloppy reviews or dragging your feet fixing the mistakes in sloppy reviews.
     
  19. RAnta

    Newcomer

    Joined:
    Jan 13, 2003
    Messages:
    8
    Likes Received:
    0
    HMMMMM??

    I wonder if Omega or someone competent can undo the hacks in NV drivers
    and release a no-cheat driver to check out the true performance of zee beast (NV35).

    Would sure be nice, wouldn't it??? :wink:
     
  20. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    A lot of these reviewers would change their tune if they woke up one morning, checked the newspaper and read "winning numbers in last night's lottery".

    They continue reading, check their numbers... Anand/Borsti see they got a 'winner'.
    They jump in their BMW, roar down to the lottery center to pick up their 10 million, but when they get there they find their numbers are wrong.

    There was a typo in the newspaper.
     