SGSSAA tool for Geforce8 - GTX480

Discussion in '3D Hardware, Software & Output Devices' started by Arnold Beckenbauer, May 25, 2010.

  1. Arnold Beckenbauer

    Veteran Subscriber

    Joined:
    Oct 11, 2006
    Messages:
    1,711
    Likes Received:
    667
    Location:
    Germany
    http://nvidia.custhelp.com/cgi-bin/nvidia.cfg/php/enduser/std_adp.php?p_faqid=2624
Isn't that cool?
     
  2. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,532
    Likes Received:
    957
    That's great, but why didn't they just include it in the drivers?
     
  3. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    This is kind of a big deal for me. Hopefully it will work on my GTX260.
     
  4. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,976
    Likes Received:
    526
    Location:
    35.1415,-90.056
    Good question; I wonder if it has to do with WHQL certification?
     
  5. Arnold Beckenbauer

    Veteran Subscriber

    Joined:
    Oct 11, 2006
    Messages:
    1,711
    Likes Received:
    667
    Location:
    Germany
    Geforce 257.15 review: Better performance and higher quality - plus: Nvidia interview
    :lol:
     
  6. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
so GPUs from G80 upwards are supported, according to that article.
that was completely unexpected to me.

    well, I remember when I was waiting for an affordable geforce 7 AGP and a beta driver accidentally enabled transparency supersampling on the geforce 6.


    I hope it works under XP, and that I either gain framerate, or at least don't lose framerate compared to 8xSS mode.

    dunno why I had to wait almost a decade to get that feature back. nowadays the driver size looks totally mad.
     
  7. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    how are you supposed to set it up? it doesn't work.

I've tried it mainly on the original Half-Life and had no luck with it, whether or not I enable multisampling. No blur, no noise reduction, and no anti-aliasing of textures with 1-bit alpha.
    by the way nHancer no longer works, even after being updated, so make sure you use the nvidia control panel instead.

    hopefully the feature is real and nHancer will be updated again to just work with the new driver series, including that RGSS feature.
    I'm using a G86.
     
  8. Secessionist

    Banned

    Joined:
    Feb 10, 2010
    Messages:
    71
    Likes Received:
    0
It didn't work on my 9800GTX+--I've tried it with several games, both GL and DX. I'm upgrading to Fermi soon. Also, nHancer 2.6.0, which isn't out yet, will be the first nHancer to work with the 256 series drivers.
     
  9. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,821
    Likes Received:
    887
    Location:
    WI, USA
    Anyone else find that some games are incredibly blurry with SGSSAA? Mass Effect with 2x MSAA+2x SGSSAA, for example. It reminds me of Quincunx. 4X is fine.

    One crazy thing I've found is that my 8800GTX doing Mass Effect 4X MSAA + 4X SGSSAA is a slideshow. However, my notebook has a Juniper "5870" 700/1000 and it is very smooth at the same settings (~30 fps). What is the deal with that? Is this G80 shader overload? The 8800GTX actually outscores it in 3Dmark05. If I switch to 16X CSAA and transparency supersampling the framerate is pegged at 60.

    Is it easier to configure SGSSAA with the newer GTX4xx/5xx cards? I have to use either the SSAA Tool or NVIDIA Inspector.
     
    #9 swaaye, Dec 12, 2010
    Last edited by a moderator: Dec 12, 2010
  10. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,567
    Likes Received:
    3,508
    Location:
    Germany
  11. DennisK4

    Newcomer

    Joined:
    Oct 1, 2009
    Messages:
    85
    Likes Received:
    0
    A question for those Nvidia owners using SGSSAA - how do you enable negative LOD correction to avoid blurry textures?

Nvidia Inspector? Or is it possible with the "GeForce_SSAA_Tool.exe"?

    One of the main draws of Nvidia cards as far as I am concerned is the ability to enable SSAA in both DX9 and DX10/11 games but as I understand it the Nhancer program does not work with the newer drivers.

Any answers will be greatly appreciated.
     
  12. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,821
    Likes Received:
    887
    Location:
    WI, USA
    NVIDIA Inspector can do it. The drivers may be doing it automatically with SSAA now though. SSAA will inherently soften textures a bit however because of how it removes/averages some sharp detail.
     
  13. DennisK4

    Newcomer

    Joined:
    Oct 1, 2009
    Messages:
    85
    Likes Received:
    0
Thanks. Do you mean the Nvidia drivers automatically correct LOD now, or did you mean Inspector?

    And when you say that the textures will be inherently blurred a bit I assume you mean even when a negative LOD correction has been applied?
     
  14. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,821
    Likes Received:
    887
    Location:
    WI, USA
    NVIDIA Inspector has a configurable LOD setting.

    I think I was wrong in saying that the drivers are forcing a negative LOD because I found an interview in which NV said that they would not do this because some kinds of texturing will not react favorably to forced LOD settings. I saw this for myself with Mass Effect 1.

    Regarding softening, SSAA anti-aliases everything. So it will inherently reduce sharp details even in textures. But I've been finding that NV's SGSSAA has some compatibility problems with some games and sometimes you'll get a very blurry result for some reason. Mass Effect 1, for example. It also seems to be more demanding than ATI's approach because my 8800GTX is several times slower than my notebook's "5870" (Juniper based). Maybe the slowness is caused by the compatibility problems.
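As an aside: the usual community rule of thumb (not stated explicitly in this thread, so treat it as an assumption) for counteracting the texture softening swaaye describes is to set an LOD bias of -0.5 per doubling of the sample count, i.e. -0.5 * log2(N) for N-sample SGSSAA. A minimal sketch:

```python
import math

def sgssaa_lod_bias(samples: int) -> float:
    """Rule-of-thumb negative LOD bias to restore texture sharpness
    under N-sample sparse-grid supersampling: -0.5 * log2(N).
    This is a community heuristic, not an official NVIDIA formula."""
    if samples < 1:
        raise ValueError("sample count must be >= 1")
    return -0.5 * math.log2(samples)

for n in (2, 4, 8):
    print(f"{n}x SGSSAA -> LOD bias {sgssaa_lod_bias(n)}")
# 2x -> -0.5, 4x -> -1.0, 8x -> -1.5
```

These are the values typically dialed in via NVIDIA Inspector's LOD bias setting; as the post above notes, some content reacts badly to forced LOD, so they are a starting point rather than a guarantee.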
     
    #14 swaaye, Dec 17, 2010
    Last edited by a moderator: Dec 17, 2010
  15. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Well, a Juniper is much faster than an 8800 GTX...
     
  16. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,821
    Likes Received:
    887
    Location:
    WI, USA
    8800GTX beat my notebook Juniper in 3DMark05. :grin: I think the shader throughput is by far the most disparate factor between them. The 8800GTX actually has considerably more memory bandwidth than this 700/1000 Juniper and the fillrates aren't really hugely different.
     
  17. CarstenS

    Legend Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,567
    Likes Received:
    3,508
    Location:
    Germany
Contrary to the years right after its release, 3DMark05 nowadays relies heavily on the CPU. Don't forget to factor in the likely difference between your notebook's and your desktop's CPUs. :)
     
  18. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,821
    Likes Received:
    887
    Location:
    WI, USA
    The desktop is a Phenom II X4 3.0GHz whereas the notebook has that Core i7 720QM 1.6 GHz with max turbo of 2.8 GHz.

    But overall it seems that the Juniper is only much faster if the shader load is very significant. With SSAA the shader load shoots way up along with everything else (if the game is shader effect heavy) so the 8800's much lower shader throughput catches up to it. Juniper literally seems to be several times faster in this case. I've been comparing the two cards at the lowly resolution of my 768p TVs.

    However, I've also played with NV's older OGSSAA modes and they are much faster on the 8800 than SGSSAA. Especially in Mass Effect, which may be the result of a compatibility problem related to the blurry image with SGSSAA. OGSSAA doesn't come close to the IQ of SGSSAA but it's still an improvement for all of the shader aliasing in Mass Effect.

    The ability to force SGSSAA on my old 8800 is really great for old games. I wish ATI would allow you to enable SSAA on their older cards for the same reason. I'd rather have this 8800 than a 4890 because of that! I know it sounds crazy but when you see that shader aliasing completely gone it is addictive. I just ordered a new HD 6950 and I'm planning on trying out 8X SSAA on just about everything. I only game on a 768p TV so it should be able to run almost anything smoothly.
     
    #18 swaaye, Dec 18, 2010
    Last edited by a moderator: Dec 18, 2010
  19. Karoshi

    Newcomer

    Joined:
    Aug 31, 2005
    Messages:
    181
    Likes Received:
    0
    Location:
    Mars
    IIRC, on ATI you can "force" OGSSAA by:
    1) adding resolutions above your native LCD resolution in the CCC display panel
    2) enabling GPU scaling

    Sorry, can't test now.
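The trick above amounts to downsampling: render at a higher-than-native resolution and let GPU scaling filter it down, which behaves like ordered-grid supersampling. The effective sample factor is just the pixel-count ratio (the resolutions below are example values, not from this thread):

```python
def ogssaa_factor(render_w: int, render_h: int,
                  native_w: int, native_h: int) -> float:
    """Effective ordered-grid supersampling factor when rendering at a
    custom resolution and downscaling to the native panel resolution."""
    return (render_w * render_h) / (native_w * native_h)

# e.g. rendering at 2560x1600 and downscaling to a native 1280x800 panel
print(ogssaa_factor(2560, 1600, 1280, 800))  # 4.0, roughly comparable to 4x OGSSAA
```

Note this gives ordered-grid sample positions, so edge quality per sample is worse than the sparse-grid patterns discussed earlier in the thread.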
     
  20. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,041
    Likes Received:
    4,402
    then do you have to run the game at that added resolution?

    with NV, when I had a 24" LCD (1920x1200) and it broke, I had to go back to my 22" (1680x1050)
    most games would revert to 640x480, knowing 1920 wasn't valid, but some would just show a black screen (like an out-of-range error with CRTs) and I had to delete the cfg file, edit the ini file, or remove 1920 from the registry
     