Poor OpenGL performance on ATI cards?

Discussion in 'Architecture and Products' started by Goragoth, Mar 12, 2003.

  1. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Re: good, bad and ugly

    Wrong. I'll add emphasis:

    Sounds to me like Carmack's use of the FX is more about convenience than better drivers. That, of course, is self-fulfilling, because:

    1) The NV30's ARB2 path sucks balls performance-wise, forcing Carmack to use the NV30 path.

    2) ATI's latest generation doesn't "need" hardware-specific paths; the industry-standard ones are just fine.
     
  2. Goragoth

    Regular

    Joined:
    Feb 4, 2003
    Messages:
    365
    Likes Received:
    7
    Location:
    NZ
    I realize that this kind of OpenGL performance test has little to do with gaming performance, especially since most games are written in D3D these days. My guess is that the test doesn't use shaders and probably not even textures but is more of a raw geometry test aimed at 3D DCC package performance.

    The test was just called OpenGL Test or similar and the magazine was just something I was browsing in a bookshop and I wouldn't particularly trust it. It just seemed interesting. I didn't think I would ever see a Ti4200 get a higher score than a 9700Pro in anything.

    As to the argument that ATI's professional drivers are different from its consumer drivers: sure, but so are Nvidia's. That's why the software patches work: the professional cards are similar to the consumer cards, and it's (mostly) the drivers that make the difference.

    I'm perfectly happy with my 9700Pro and no, it's not slow in anything, but I was intrigued by those benchmarks 8)
     
  3. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
    the thing is that OpenGL is used in the workstation industry extensively and the benchmarks are for that. however, trying to base game performance on it is like declaring the winner of a football game based on the number of fouls. :roll:
     
  4. Heathen

    Regular

    Joined:
    Jul 6, 2002
    Messages:
    380
    Likes Received:
    0
    We used to play a game that used that as a scoring system, not sure I'd define it as football though. :D

    Sorry, talkative mood tonight.
     
  5. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    It has always been my experience that OpenGL and DirectX perform pretty similarly when used in similar ways. The only time there's a significant difference is when D3D and OpenGL have very different interfaces and deal with something in very different ways, like rendering to texture. I'm not sure which performs better there, but I would guess D3D has a slight edge since it doesn't have to deal with context switches.
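    [Editor's note: the context-switch point above refers to how OpenGL of that era did render-to-texture through a pbuffer (WGL_ARB_pbuffer / WGL_ARB_render_texture), which has its own rendering context, while D3D just retargets the same device. The toy sketch below uses stub functions standing in for the real wgl and D3D calls — no actual rendering happens — purely to show where the extra switches come from.]

    ```c
    #include <stdio.h>

    static int gl_switches = 0;

    /* Stand-in for wglMakeCurrent(): on real hardware this is the expensive
       part, since the driver has to swap the entire rendering context. */
    static void make_current(const char *target)
    {
        gl_switches++;
        printf("MakeCurrent -> %s\n", target);
    }

    static void draw(const char *what)
    {
        printf("draw: %s\n", what);
    }

    /* OpenGL circa 2003: render-to-texture goes through a pbuffer with its
       own context, so every pass costs two context switches. */
    static void gl_pbuffer_pass(void)
    {
        make_current("pbuffer");                 /* switch into the pbuffer */
        draw("scene into pbuffer");
        make_current("window");                  /* switch back to the window */
        draw("quad textured from the pbuffer");  /* via WGL_ARB_render_texture */
    }

    /* D3D: changing the render target stays inside one device, so no
       context switch is involved. */
    static void d3d_rtt_pass(void)
    {
        draw("scene into texture (SetRenderTarget)");
        draw("quad textured from that texture");
    }

    int main(void)
    {
        gl_pbuffer_pass();
        d3d_rtt_pass();
        printf("context switches: GL pbuffer path = %d, D3D path = 0\n",
               gl_switches);
        return 0;
    }
    ```

    (On real hardware the cost per switch varied by driver, which is why the edge Humus guesses at was plausible but not guaranteed; the later EXT_framebuffer_object extension removed the extra context entirely.)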
     
  6. flf

    flf
    Newcomer

    Joined:
    Feb 15, 2002
    Messages:
    214
    Likes Received:
    5
    Re: good, bad and ugly

    Some emphasis in red to help your reading comprehension. (Yes, I'm being a jerk.)
     
  7. JD

    JD
    Newcomer

    Joined:
    Dec 15, 2002
    Messages:
    122
    Likes Received:
    0
    The following JC's quote is what I find most interesting and somewhat confusing:

    "Trying to keep boneheaded-ideas-that-will-haunt-us-for-years
    out of Direct-X is the primary reason I have been attending the Windows
    Graphics Summit for the past three years, even though I still code for OpenGL."

    Sounds like JC is preparing to move over to D3D. That would be interesting, to say the least. Why else would he care about D3D? If I were him, I'd be happy that MS screwed up, since it means more GL users, which means better/faster vendor support.
     
  8. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,779
    Likes Received:
    1,816
    Do you see video cards advertised as OpenGL ARB-compliant or DirectX 8-compliant? That's why JC has to concern himself with DX development.
     
  9. ZoinKs!

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    782
    Likes Received:
    13
    Location:
    Waiting for Oblivion
    No, that's not it at all. He knows that video card capabilities are largely built to meet D3D specifications. If a future version of D3D screws something up, video card hardware will get messed up as well.
     
  10. shaderman

    Newcomer

    Joined:
    Jan 3, 2003
    Messages:
    19
    Likes Received:
    0
    Re: good, bad and ugly

    Funny. He DID switch to the NV drivers because they are **significantly** more stable than the ATI variety. JC wouldn't say that, because he's a diplomat.

    But it's obvious: he switches to an FX, despite its obvious flaws, for the drivers and to exercise new rendering paths.

    He doesn't need an FX in his primary machine to exercise rendering paths; he can use a target system for that.

    When you load/unload the driver a lot, you want a solid driver. That he chose an FX for his primary system is significant ...

    - sm
     
  11. andypski

    Regular

    Joined:
    May 20, 2002
    Messages:
    584
    Likes Received:
    28
    Location:
    Santa Clara
    Re: good, bad and ugly

    "Must... control... fist... of... death..."
     
  12. THe_KELRaTH

    Regular

    Joined:
    Dec 9, 2002
    Messages:
    471
    Likes Received:
    0
    Location:
    Surrey Heath UK
    When he's working on ARB2 he can use either ATI or Nvidia cards; if he's working on a dedicated NV extension he needs an NV card, so it makes sense that he'd be using an NV30. That said, for all we know he may have finished that work and is now playing with a 9800Pro and its unlimited shader instructions.
    Also, regarding driver stability: he said that quite some time ago. If you're going to use these kinds of quotes, maybe you should add a date to them. I don't see reviewers playing "Hunt the Real Driver" where ATI is concerned!
     
  13. LeStoffer

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,262
    Likes Received:
    22
    Location:
    Land of the 25% VAT
    Re: good, bad and ugly

    :)
     
  14. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    Re: good, bad and ugly

    Right. And you know the inner workings of Carmack's mind, I suppose?

    It is obvious that he needs to make a special rendering path for the NV30 because its ARB2 performance is so terrible, yes. The other is merely an assumption on your part - or are you reading his mind again?

    What about shader length restrictions? That's a good reason to use the FX over a 9700. Or maybe he does need to do most of his "new path creation" on his primary machine. But then I guess you know more about him and how he programs for Doom3 than anyone else alive, right? After all, you can read his mind!
    What the hell are you talking about?

    The only thing significant here is your blatant fannishness.
     
  15. fresh

    Newcomer

    Joined:
    Mar 5, 2002
    Messages:
    141
    Likes Received:
    0
    Re: good, bad and ugly

    You'd be mistaken. We all use R300s here. We do have GeForces around, but only for testing.
     
  16. SpellSinger

    Newcomer

    Joined:
    Jan 10, 2003
    Messages:
    60
    Likes Received:
    0
    Any developer working with DX9 is using an R300. Think about it. Some may switch to NV once they actually ship an FX.

    ATI's drivers have been pretty stable for the last 8 months. Even Derek Smart is using a 9700 now, which definitely says a lot about ATI's support.
     
  17. Radea

    Newcomer

    Joined:
    Feb 23, 2003
    Messages:
    19
    Likes Received:
    0
    Location:
    United States
    Re: good, bad and ugly

    Weird. And there I was, certain that his .plan file said he uses an NV30 because he cant run the most amount of paths on it. I guess he must have been lying. :lol:
     
  18. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
    Carmack was probably using the FX because he was trying to get Doom III running on it. He has had an R300 for so long that Doom III is probably optimized for it. He probably wanted to try the different rendering paths available on the FX, such as fp16 and fp32. I wonder if the fan has driven him crazy yet.
     
  19. Mulciber

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    413
    Likes Received:
    0
    Location:
    Houston
    Re: good, bad and ugly

    *chuckle* :lol:
     
  20. Radea

    Newcomer

    Joined:
    Feb 23, 2003
    Messages:
    19
    Likes Received:
    0
    Location:
    United States
    rwolf:
    That's the point I was attempting to make. However, I just noticed I accidentally typo'd "can" as "cant" :oops:
     