Why the poor NWN performance for Radeons?

Discussion in 'PC Gaming' started by zsouthboy, Nov 12, 2003.

  1. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    I just don't get it. This game isn't a big poly-pusher. Is it ATI's OGL drivers? Something they're doing in the rendering of the game that is slow, for one reason or another? Would doing a FireGL softmod change performance at all (due to OGL drivers)?

    I'm a little annoyed. This game is great, but the performance on my 9500 softmod leaves much to be desired.
     
  2. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Did you get the latest patch for the game? It made a large difference for me, particularly with AA enabled.
     
  3. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    The latest couple of patches + new Catalyst drivers helped my config a lot too, as numerous rendering errors were fixed and performance increased by a nice margin. It's still not going very fast considering the workload, but at least it's playable now. :)
     
  4. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    The latest patch and Catalysts made a tremendous difference, as you said... but it's still not performing where it should be, considering the workload. Ah well.
     
  5. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    The game is using stencil shadows, which are quite expensive, especially with a lot of lights around. Also, the game does some things that are not optimal for our HW/driver. We're still working with the developer to improve things.
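
    For anyone curious why stencil shadows get so expensive with many lights, here's a minimal sketch of the classic depth-fail stencil-volume loop in OpenGL (the general technique, not necessarily exactly what NWN's engine does). The draw_* callbacks are hypothetical placeholders; the point to notice is that the shadow volumes and the lit scene get re-rasterized for every light.

```c
/* Minimal sketch of depth-fail ("Carmack's reverse") stencil shadows.
 * Cost driver: two shadow-volume passes plus a full lit re-draw of the
 * scene, repeated PER LIGHT. draw_scene_ambient(), draw_shadow_volume()
 * and draw_scene_lit() are hypothetical app-side callbacks. */
#include <GL/gl.h>

void draw_scene_ambient(void);         /* depth + ambient color */
void draw_shadow_volume(int light);    /* extruded silhouette geometry */
void draw_scene_lit(int light);        /* scene re-drawn with one light */

void render_frame(int num_lights)
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
    glEnable(GL_DEPTH_TEST);
    glEnable(GL_CULL_FACE);

    draw_scene_ambient();              /* pass 1: lay down depth + ambient */

    for (int i = 0; i < num_lights; ++i) {
        /* pass 2: mark shadowed pixels in the stencil buffer only */
        glClear(GL_STENCIL_BUFFER_BIT);
        glEnable(GL_STENCIL_TEST);
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glDepthMask(GL_FALSE);
        glStencilFunc(GL_ALWAYS, 0, ~0u);

        glCullFace(GL_FRONT);                    /* back faces...     */
        glStencilOp(GL_KEEP, GL_INCR, GL_KEEP);  /* ...INCR on z-fail */
        draw_shadow_volume(i);

        glCullFace(GL_BACK);                     /* front faces...    */
        glStencilOp(GL_KEEP, GL_DECR, GL_KEEP);  /* ...DECR on z-fail */
        draw_shadow_volume(i);

        /* pass 3: additively light wherever stencil == 0 (unshadowed) */
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glDepthMask(GL_TRUE);
        glStencilFunc(GL_EQUAL, 0, ~0u);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE);
        glDepthFunc(GL_EQUAL);
        draw_scene_lit(i);
        glDepthFunc(GL_LESS);
        glDisable(GL_BLEND);
        glDisable(GL_STENCIL_TEST);
    }
}
```

    (A real engine would use GL_INCR_WRAP/GL_DECR_WRAP from EXT_stencil_wrap to avoid stencil saturation with overlapping volumes, but the plain ops keep the sketch simple.)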
     
  6. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    Well I'm glad you guys are working on it :) I am quite addicted to it already (bought it two days ago)
     
  7. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    I've already reverse-engineered the Radeon drivers to fix that. ;)

    On a serious note:
    What are they doing that isn't optimal for R3xx hardware?
    Logic errors in the code? :lol:
     
  8. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    I've come to the conclusion that nothing is suboptimal for R3xx... Hell, I bet I could make a shader that just said shit, over and over again, and it'd run at 1000 fps anyway! ROFL XD :D R3xx rocks.
     
  9. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    The real problems would start if the game was developed entirely on nVidia hardware. Then of course the code might be sloppy, with extensive use of nVidia-only extensions to compensate. That would of course lead to lousy Radeon performance, through no fault of the card.

    Of course, I don't know how other cards handle this game, so...
     
  10. ZoinKs!

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    782
    Likes Received:
    13
    Location:
    Waiting for Oblivion
    Can you give an example or two of what the game is doing?
     
  11. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    I have something to add to that... What if they were using D3D instead?
     
  12. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    I think things would have worked better all around if D3D had been used as the API. OpenGL extensions are a good way to more directly access a chip's features, but can be problematic when porting to other vendors' extensions.
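
    For contrast, a minimal D3D9-era sketch of that point: in Direct3D, one vendor-neutral capability query replaces parsing per-vendor extension strings. This assumes an IDirect3D9 object already created with Direct3DCreate9(); error handling is pared down.

```c
/* Sketch: one capability query that behaves the same on every vendor's
 * hardware, instead of per-vendor GL extension checks.
 * Assumes `d3d` came from Direct3DCreate9(). */
#include <d3d9.h>

int supports_ps_1_4(IDirect3D9 *d3d)
{
    D3DCAPS9 caps;
    if (FAILED(IDirect3D9_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL, &caps)))
        return 0;
    /* Same test on ATI, nVidia, Matrox... no vendor-specific paths. */
    return caps.PixelShaderVersion >= D3DPS_VERSION(1, 4);
}
```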
     
  13. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    *cough* Logic errors resulting from bad coding. *cough*
    Why don't you just say it outright GL Guy?

    The NWN programmers use bad coding practices and haven't planned the game out properly from scratch.

    That is what I derived from your posts.
     
  14. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    I don't think that's the case here.
    No, that's not what I'm saying. Let's say that you have your codebase set up for a certain vendor's extensions. Now you want to port your code to another vendor's extensions. There are two approaches:
    - Have two separate engines that utilize each set of extensions fully
    - Have one engine that handles both

    The second approach is probably a lot easier to develop, but it doesn't guarantee the best results on all platforms.

    This is why I believe people should stick with industry standard extensions if they want good results on all platforms. Note I said good results. It's possible that you won't get the best results on some platforms this way, but it should guarantee "good enough" results.
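
    A rough sketch of that second approach, under the usual assumptions: one engine that probes the extension string once at startup and routes material setup through a function pointer, preferring the industry-standard extension. The bind_* backends are hypothetical placeholders, not anything from NWN's actual engine.

```c
/* One engine, multiple extension paths: pick a shading backend once at
 * startup. The bind_* functions are hypothetical stand-ins. */
#include <GL/gl.h>
#include <string.h>

static int has_extension(const char *ext)
{
    /* Substring match on the extension string: crude but era-typical.
     * Requires a current GL context. */
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, ext) != NULL;
}

static void bind_arb_fp(int id) { (void)id; /* ARB_fragment_program setup */ }
static void bind_nv_rc(int id)  { (void)id; /* NV_register_combiners setup */ }
static void bind_fixed(int id)  { (void)id; /* fixed-function fallback    */ }

typedef void (*BindMaterialFn)(int material_id);

BindMaterialFn select_shade_path(void)
{
    /* Prefer the industry-standard extension; fall back to a vendor
     * path, then fixed function, only when it's absent. */
    if (has_extension("GL_ARB_fragment_program"))
        return bind_arb_fp;
    if (has_extension("GL_NV_register_combiners"))
        return bind_nv_rc;
    return bind_fixed;
}
```

    The rest of the renderer then calls through the single pointer, which keeps one codebase but, as noted above, may leave each vendor's fastest path unused.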
     
  15. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    I see what you're saying.

    Their decision in the planning stage wasn't the best one.

    If they wanted to use pixel shaders then they should have gone for DirectX without question at the time.
    Their shadows could have been done more optimally in DirectX.

    I think that's all DirectX is good for; creating special fx. :lol:
     
  16. zsouthboy

    Regular

    Joined:
    Aug 1, 2003
    Messages:
    563
    Likes Received:
    9
    Location:
    Derry, NH
    But DX would've made it harder to do Linux ports, etc... so who knows.
     
  17. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Or the developer could have just stuck with industry standard extensions...
     
  18. K.I.L.E.R

    K.I.L.E.R Retarded moron
    Veteran

    Joined:
    Jun 17, 2002
    Messages:
    2,952
    Likes Received:
    50
    Location:
    Australia, Melbourne
    I don't believe they had industry standard extensions for pixel shaders at the time, hence they used nVidia's proprietary extensions.
     
  19. Saem

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,532
    Likes Received:
    6
    Actually, I believe by that time both Matrox and ATI were supporting the same extensions; nVidia was going it alone.
     
  20. Luminescent

    Veteran

    Joined:
    Aug 4, 2002
    Messages:
    1,036
    Likes Received:
    0
    Location:
    Miami, Fl
    In theory, that is the way it should be. Before this ideal is achievable, however, all hardware must function properly according to the specifications of the industry standard. :wink:
     