How to boost your Doom3 performance by 40% on ATI hardware

Discussion in 'PC Gaming' started by Humus, Aug 8, 2004.

  1. Milkman

    Newcomer

    Joined:
    Aug 19, 2004
    Messages:
    39
    Likes Received:
    0
    Location:
    Ottawa, Canada
    Re: Further performance increase

     
  2. Tekki

    Newcomer

    Joined:
    Aug 16, 2004
    Messages:
    11
    Likes Received:
    0
    Milkman, my only real point was that you couldn't set both AF and AA above 4x, and that Humus' tweak works very well with AF but not with AA. AA still has the same devastating impact as usual (for my card, anyway). Hope you didn't take offence to anything :wink:

    I would recommend the ARB or R200 path for performance, and ARB2 for image quality. I don't actually know which path I'm using (the machine-spec "best" setting when running High), but comparing my own Ultra screenshots against Ultra shots taken with X800/6800 cards, I should be on the ARB2 path (they look exactly the same). Since my last .cfg tweak, Ultra settings have been running very well: almost always 40+ fps when not in battle, and a 15-30 fps minimum in battles with many enemies. Only minor delays from time to time.


    If my "best" path really is ARB2, then I can only recommend it, as it still has excellent performance (High and Ultra), especially with Humus' tweak.

    So, considering you have a better card than mine, you should go for the ARB2 path, unless you have nasty bottlenecks. Try them out yourself and find the balance that works best for you.
     
  3. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    Eeew! I tried the "arb" path on my x800 and got that weird "half-dark/half-light down the middle" effect on everything.

    Back to "best" for me....worked best on my 9700 pro too.
     
  4. Milkman

    Newcomer

    Joined:
    Aug 19, 2004
    Messages:
    39
    Likes Received:
    0
    Location:
    Ottawa, Canada

    I'm using a Radeon 9800 Pro 128 MB, so I'll go with your advice on ARB2. But first I'll run timedemos to compare the Humus tweak against a normal install using only the PK4 tweak, then compare control-panel AF & AA versus application-preference AA & AF, followed by tests with ARB2. All of that in timedemo 8)
     
  5. dhrystone

    Newcomer

    Joined:
    Aug 22, 2004
    Messages:
    1
    Likes Received:
    0
    ATI Issues with Doom3

    You people with the "Why don't you tell John Carmack" have forced me to join beyond3d.com just to post a reply.

    Here is the deal:

    If anyone has been following the (hardware/software) history of id's [JC's] development of Doom3, then you'd be aware of the issues he's discussed with NVxx vs. ARB & ARB2 (i.e. NVidia vs. ATI). He chose to go NON-ARB (i.e. NON-ATI), but decided that support for ARB was a must (i.e. ATI support was a MUST). In case you're not following, kiddies, that means he decided to focus more on NVidia's graphics architecture than on ATI's (just like Valve is focusing on ATI boards rather than NVidia's with Half-Life 2).

    Now it doesn't take a genius to figure out that John probably coded the shaders the way he did because of the NVxx adoption (needless to say, there was the partnership with NVidia).

    You can gripe and b*tch all you want about your ATI board not holding its own in Doom3 - and you can even have ATI employees coming in with excuses - but like it or not, Doom3 was developed with NVidia, with ATI kept on only as second fiddle. (That means ATI wasn't their main concern.)

    Now, before you COMPLETELY flame me, hear this: *I* own an ATI card, a 9800XT, and the card owns at just about everything other than Doom3, so don't go saying I'm pro-NVidia, because ATI (NOT NVidia) got MY 500 dollars! But I do have issues playing Doom3 on my ATI card: there is a "snow" effect which forces me to toggle my SmartGart "Fastwrite" setting (ON, then OFF) to clear up the problem, until it returns a few hours later.

    ATI - You make amazing products and have caught up to NVidia these last few years - but your products aren't completely up to par with Doom3.

    Wait until Half-Life 2 comes out, and people are making these same lame-brain whines over "Why won't my GeForce 5950 run Half-Life 2 at 120 FPS like ATI!?"

    -Peace
    Dhrystone
     
  6. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    Re: ATI Issues with Doom3

    I'm on the couch for this one, but I brought LOTS of extry popcorn!

     
  7. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Re: ATI Issues with Doom3

    Horrors that you'd be "forced" to do something like this...;)

    Sorry, but Carmack's involvement and "focus" on nVidia goes back years--to the 3dfx era, really--and was concerned with OpenGL driver development, primarily. Valve's "focus" with ATi is much, much newer than that, only goes back about a year, and centers around D3d, primarily.

    Indeed...;)

    Who's griping? I've all but finished the game with my 9800P. I'm once again looking forward to HL2, with D3 behind me.

    Never had anything like that with my 9800P and D3, myself. Sorry for your troubles, but it doesn't sound like an ATi issue. I've never turned SmartGart or fast writes off on my system, for D3 or anything else; never even thought about it until reading your post.

    You're certainly speaking for yourself...;)

    I think the main attraction of HL2 will be the game itself, as opposed to shadows and specularity, and that therefore the general climate surrounding HL2 will be much different. IMO, of course...;)
     
  8. Milkman

    Newcomer

    Joined:
    Aug 19, 2004
    Messages:
    39
    Likes Received:
    0
    Location:
    Ottawa, Canada
    I have an ATI Radeon 9800 Pro 128 MB, so which rendering path do you guys suggest I take, ARB or ARB2, and what's the difference between the two? :shock: Are ARB and ARB2 only for ATI cards? :? And will doing this gain me FPS?
     
  9. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    I recommend you set it to "best". :)
     
  10. Milkman

    Newcomer

    Joined:
    Aug 19, 2004
    Messages:
    39
    Likes Received:
    0
    Location:
    Ottawa, Canada
    I'll keep it on "best", but how come people are saying that setting it to ARB or ARB2 will increase your FPS? :shock:
     
  11. fish99

    Newcomer

    Joined:
    Aug 11, 2004
    Messages:
    47
    Likes Received:
    0
    'best' just means Doom 3 will pick the best render path for your card (from arb/arb2/nv10/nv20/r200); there is no render path actually called 'best'. On a 9800 Pro, 'best' means D3 will run the 'arb2' path, so changing it to 'arb2' manually changes nothing. Also, 'arb' looks pretty terrible. Just leave it on 'best'.
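    For reference, the path can be forced explicitly through the r_renderer cvar in DoomConfig.cfg or the in-game console; a minimal sketch, assuming the path names listed above (set it back to "best" to restore auto-selection):

```
// DoomConfig.cfg (or enter these in the console)
seta r_renderer "best"    // auto-select the best path for the card
// seta r_renderer "arb2" // force the ARB2 path (what "best" picks on a 9800 Pro)
// seta r_renderer "arb"  // force the basic ARB path (poor image quality)
```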
     
  12. fish99

    Newcomer

    Joined:
    Aug 11, 2004
    Messages:
    47
    Likes Received:
    0
    Because they're just plain wrong, maybe? The guy who said selecting 'arb' would give a 15 fps increase on ALL cards: I don't know where he got that from; it's just rubbish. I get about a 40% framerate drop with 'arb' on my X800 Pro, plus it looks awful. Clearly not ALL cards, then...

    Also, why the need to ask here? It takes two minutes to try it yourself and find out :roll:
     
  13. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    What's interesting is that my X800 XT PE is faster than my 6800 Ultra when running cddoom, the shareware-Doom-in-Doom3 mod that was posted on this site.

    Can anyone confirm?
     
  14. DarthFrog

    Newcomer

    Joined:
    Aug 10, 2004
    Messages:
    52
    Likes Received:
    0
    Which of the many available mods would that be? I was running the sharewares - DooM, Heretic, Hexen - and the DooM Collector's Edition with jDOOM (DoomsDay) on a 9800XT for a while and then on a GF6800U, but I used VSync and so both cards were running at the same constant 85 fps and there was never a dip in frame rate, for obvious reasons.

    I mean, the games were designed for software rendering on 386/486 chips and so the graphics load is minuscule, even though jDOOM adds a bit of lighting (like for torches and Imp plasma balls) and other eye candy. Highly recommended.

    But shouldn't this be in a separate thread? None of the DooM mods I know of use shaders and so Humus' tweak does not apply.
     
  15. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
  16. DarthFrog

    Newcomer

    Joined:
    Aug 10, 2004
    Messages:
    52
    Likes Received:
    0
  17. pedrovsk

    Newcomer

    Joined:
    Aug 23, 2004
    Messages:
    1
    Likes Received:
    0
    Hey guys, where are you supposed to put the atioglxx.dll file: Doom 3/atioglxx.dll or Doom 3/base/atioglxx.dll? And is there any way to stop the FPS capping at 60? My screen runs at 75 Hz, and even with NoVsync the fps stays limited to 60.

    Thanks!
     
  18. DarthFrog

    Newcomer

    Joined:
    Aug 10, 2004
    Messages:
    52
    Likes Received:
    0
    Where Doom3.exe is (i.e. the Doom 3 directory). The OS looks there first when it is told to load a DLL.

    seta com_fixedtic 0

    However, I don't think you'll like the effect, and any spare horsepower your graphics card has is better invested in anisotropic filtering, anti-aliasing, and/or a higher resolution than in higher framerates (for single-player) or a higher minimum framerate (for multiplayer).

    This is how Upset Chaps quoted John Carmack on this issue:

     
  19. Utrion

    Newcomer

    Joined:
    Feb 16, 2004
    Messages:
    5
    Likes Received:
    0
    Just a question about the hack from Humus: why does it reduce the lighting quality? From the way it works it shouldn't do that, but it does :( Can someone explain, or has anyone else seen what I'm describing?
     
  20. DarthFrog

    Newcomer

    Joined:
    Aug 10, 2004
    Messages:
    52
    Likes Received:
    0
    The image quality of the latest version of the hack should be virtually indistinguishable from the game's original lookup version, but I fear that version is lost among the thousand or so posts in this thread. Your best bet is probably to search the thread for posts by Humus and Demirug, or to laud the GeForces a little to wake jvd and WaltC from their slumber... :twisted: Since I traded my Radeon for a GF6800U, I've stopped tracking the details.
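    For anyone curious why the earliest versions of the hack could look different at all: the tweak replaces a specular lookup texture with direct arithmetic, and the two only diverge by the table's quantization error. A rough Python sketch of that idea (illustrative only; this is not id's or Humus' actual shader code, and POWER and TABLE_SIZE are assumed values):

```python
import math

POWER = 16.0      # assumed specular exponent
TABLE_SIZE = 256  # assumed lookup-texture resolution

# Precomputed lookup "texture": pow(x, POWER) sampled at TABLE_SIZE points.
table = [math.pow(i / (TABLE_SIZE - 1), POWER) for i in range(TABLE_SIZE)]

def specular_lookup(x):
    """Nearest-sample fetch from the lookup table (original shader path)."""
    idx = min(TABLE_SIZE - 1, int(round(x * (TABLE_SIZE - 1))))
    return table[idx]

def specular_math(x):
    """Direct arithmetic evaluation (the tweak's path)."""
    return math.pow(x, POWER)

# Worst-case disagreement between the two paths over [0, 1].
max_err = max(abs(specular_lookup(i / 1000) - specular_math(i / 1000))
              for i in range(1001))
print(max_err)
```

    In this sketch the two paths agree to within a few percent, with the largest differences near full intensity, which is consistent with the later lookup-free versions being virtually indistinguishable from the original.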
     