Starcraft II GPU performance/IQ

Discussion in '3D Hardware, Software & Output Devices' started by Ancient, Jul 20, 2010.

  1. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
Dunno if you're speaking from experience or not, but even AAA titles that don't support AA see massive performance drops when AA is forced. Mass Effect 1 & 2, Borderlands, Bioshock, Batman:AA on ATI cards, well basically all UE3 games save for Gears of War which properly supports multisampling via D3D10, the first Stalker, the list goes on. And like I said, the forced MSAA in those games doesn't work very well. There are still loads of jaggies despite the nutslapping fps hit.

    I'm very surprised to see a knowledgeable programmer like yourself supporting the notion of driver forced AA. Driver forced IQ enhancements are almost unanimously condemned by the 3D programmer types on this site.

    I don't see other devs complaining about the added support costs of a D3D10 renderer. Literally I have never heard mention of that. Perhaps we could ask repi if this was a consideration in BFBC2..

    Oh I see my sig has magically appeared. Spooky.
     
  2. entity279

    Veteran Subscriber

    Joined:
    May 12, 2008
    Messages:
    1,332
    Likes Received:
    500
    Location:
    Romania
    So have I :D.
     
  3. air_ii

    Newcomer

    Joined:
    May 2, 2007
    Messages:
    134
    Likes Received:
    0
    It's as if saying "let's not fix that nasty bug in our code, let driver writers do the job for us". To me if you can blame anyone for the lack of AA, it's game developers. Releasing a game and hoping that a third party will add an essential feature to it is simply idiotic, if that's indeed their way of thinking.
     
  4. Psycho

    Regular

    Joined:
    Jun 7, 2008
    Messages:
    746
    Likes Received:
    41
    Location:
    Copenhagen
Isn't the driver-forced AA just supersampling/upscaling the right surfaces? I don't see what you can do driver-side that you couldn't do DX9 application-side anyway. The hit will be big no matter what (unless you stick to some edge smoothing alone), and the application should have a better chance of optimizing it for the actual situation.
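To make the "the hit will be big no matter what" point concrete, here is a toy sketch (my own example, not any driver's actual code) of 2x2 ordered-grid supersampling: shade every pixel at twice the resolution in each axis, then box-filter back down. The shading cost quadruples regardless of whether the driver or the application does it; all the function names and the hard-edge "shader" are invented for illustration.

```python
# Illustrative sketch of 2x2 ordered-grid supersampling (SSAA).
# shade() stands in for a pixel shader: a hard vertical edge at x = 1.75.

def shade(x, y):
    """Toy pixel shader: white left of the edge, black right of it."""
    return 1.0 if x < 1.75 else 0.0

def render(width, height, scale=1):
    """Render at `scale`x resolution; returns (image, shader invocations)."""
    img = [[shade((x + 0.5) / scale, (y + 0.5) / scale)
            for x in range(width * scale)]
           for y in range(height * scale)]
    return img, width * scale * height * scale

def downsample(img, scale):
    """Box-filter each scale x scale block back down to one pixel."""
    h, w = len(img) // scale, len(img[0]) // scale
    return [[sum(img[y * scale + j][x * scale + i]
                 for j in range(scale) for i in range(scale)) / scale ** 2
             for x in range(w)]
            for y in range(h)]

plain, cost1 = render(4, 4)           # 16 shader invocations, hard edge
hi, cost4 = render(4, 4, scale=2)     # 64 invocations: 4x the shading work
aa = downsample(hi, 2)                # edge pixel becomes an in-between value
```

The edge pixel in `aa` lands at 0.5 instead of snapping to 0 or 1, which is the smoothing you pay 4x the shading work for; an application-side implementation could at least restrict this to the surfaces that need it.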
     
  5. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    3,264
    Likes Received:
    813
Again, Blizzard's fault.
They chose to use a D3D9 deferred renderer.
    Gas Powered Games managed to make a PS2.0 based renderer that scales to vastly more units than you'll ever get in a Starcraft game: Supreme Commander.

    Or blame Microsoft for writing an API that doesn't let you do AA on that sort of renderer.

    I'd have thought Blizzard should at least start with "Hey MS, NV & ATI, how can I make AA work out of the box with this renderer, here's what we're doing".
    Certainly they shouldn't just expect the hardware providers to make it magically work & neither should users.
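On the "blame Microsoft's API" angle: D3D9 gives you no way to read the individual samples of a multisampled render target, whereas D3D10 exposes them (Texture2DMS.Load with a sample index), so a D3D9 deferred renderer can't light its G-buffer per-sample. And resolving (averaging) the G-buffer first doesn't work either, because lighting is nonlinear in the stored attributes. A toy sketch of that second point, with my own numbers and a bare Lambert term:

```python
# Toy illustration of why a deferred renderer can't just MSAA-resolve its
# G-buffer before lighting: averaging normals and lighting once gives a
# different answer than lighting each sub-sample and averaging the colors.

def normalize(v):
    n = sum(c * c for c in v) ** 0.5
    return tuple(c / n for c in v) if n else v

def lambert(normal, light_dir):
    """Clamped N.L diffuse term."""
    return max(sum(a * b for a, b in zip(normal, light_dir)), 0.0)

light = (0.0, 0.0, 1.0)
# An edge pixel whose two MSAA sub-samples hit surfaces facing opposite ways:
samples = [(0.0, 0.0, 1.0), (0.0, 0.0, -1.0)]

# Correct: light each sub-sample, then average the resulting colors.
light_then_resolve = sum(lambert(n, light) for n in samples) / len(samples)

# Wrong: average (resolve) the G-buffer normals, then light once.
resolved = normalize(tuple(sum(c) / len(samples) for c in zip(*samples)))
resolve_then_light = lambert(resolved, light)
```

The "correct" path gives 0.5 for the edge pixel, the resolve-first path gives 0.0 (the averaged normal degenerates), which is exactly the per-sample access D3D9 withholds and D3D10 provides.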
     
  6. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    So much stuff to reply to, but only going to reply to a few.

And 6600s? And X800s? And 6800s? And laptops? I think Blizzard (as they have abundantly shown in the past, over and over again) is far more interested in courting their existing userbase while expanding to new customers than in chasing after enthusiasts who are more than willing to shite on your product (see Crysis, with Crytek fans turning on them when it was released), and then pirate it claiming it was programmed by amateurs because it wouldn't run on their 2-3 year old computers.

    Yes, as an enthusiast with bleeding edge hardware I suppose people are "entitled" to feel left out or some other silly nonsense, but SC2 should come as absolutely ZERO surprise to anyone that has followed Blizzard. They NEVER put any focus whatsoever on enthusiast class machines. And yet amazingly, their games still turn out to be blockbusters time after time.

    I would say they have a good idea of exactly what they are doing.

    Sure with relatively barren low detail landscapes. And extremely simplistic unit models most of the time. 2 different game targets are going to lead to different needs.

    From what I've seen of SC2 so far, it's a rather dark game with few high contrast edges. Aliasing is obviously still there but less so than a game where you might see a mountain or building silhouetted against a bright sky or character against a bright light.

Likewise the default viewpoint of an RTS is going to de-emphasize the situations where aliasing is most prominent, as will the small unit and building scale with few long edges.

Age of Empires 3 didn't support AA either, and AA couldn't be forced at the time with HDR enabled; while there was aliasing, it wasn't bad or in your face.

    This doesn't mean I don't desperately want AA in the game, just that it's far less of a problem than it would be in a shooter for example. I already know it's going to be annoying as hell when I do see the artifacts of aliasing, but for most people? Probably won't be that big a deal.

    Either way, first thing I'm going to do when I get it is see if SSAA works. I, too, am disappointed that it doesn't support standard MSAA out of the box, but I'm also not a bit surprised.

    Regards,
    SB
     
  7. L233

    Veteran

    Joined:
    Mar 15, 2002
    Messages:
    1,031
    Likes Received:
    29
    Location:
    Germany
Most games you've listed still run fast enough with forced AA, though. It seems that ATI cards suffer a bigger FPS drop when forcing AA than nVidia cards: see here. In this case, it's 40% vs 30%.

    Why?
     
  8. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    I'm not supporting it at all. I just say that I understand Blizzard in their decision.

    And I'm just pretty sure it will not hurt their sales at all. And like I said they save a lot of possible support costs.
     
    #68 Novum, Jul 26, 2010
    Last edited by a moderator: Jul 26, 2010
  9. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    Those are PS 3.0 GPUs except the X800, which I noted in the very quote you responded to: "R420/430 does not even register." Laptops with PS2.0 GPUs will not meet the minimum specs, obviously.

    My point was that making sacrifices to retain PS2.0 support is silly. They actually list the 9800 pro as the minimum spec. Seriously: 9800 PRO. An enthusiast card for the mythical PC enthusiast who stopped upgrading in 2003 but still buys PC games.

    AA support wouldn't be just for the enthusiasts. Even mainstream to mid-low end GPUs today can rock this game 1080p style.

    I would say they got confused or did some bunk market research on this one. No way does it make sense to make sacrifices at the altar of SM2.0, but leave out DX10 support which the majority of users could take advantage of.

    I game at 1440x900 with a GTX260. Borderlands is fresh in my mind, and that went from 60fps to a wildly fluctuating 20-40fps with a measly 4xAA forced in the NVCP.
     
  10. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    nVidia spent loads on making this an "NVIDIA exclusive-sive-sive-sive-sive" Watch for GTX460's being bundled with SC2 in your local hardware shop.

Please, take note of the list of other games to show you how bad your ATI card is:

    World of Warcraft: Cataclysm
    Crysis 2
    Mafia II
    Lost Planet 2
    F.3.A.R.

    It has absolutely nothing to do with nV investing buckets of money into these games.
     
  11. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,386
    Likes Received:
    299
    Location:
    NY
    I agree with homerdog; a game being released in the 2nd half of 2010 without msaa support is ridiculous.

    I don't care who you are. I don't care what market you are targeting. And I don't care what type of engine you are using. None of these reasons are good enough, especially for a game like StarCraft 2. The Humus quote summed it all up perfectly.

    Even if you disagree with that argument, I don't see how you can fault AMD. Sure it was nice of Nvidia to provide a workaround, but why is it AMD's fault for not having a workaround? Anything AMD/Nvidia develop, Blizzard could have done (and probably better/faster too). The fault always lies with the developer.
     
  12. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
Crysis 1 runs as well, if not better, on ATI hardware, so if Crysis 2 changes this I'd be surprised.

    And like Willard said, Blizzard is to blame here, not ATI. Kudos to NVIDIA for hacking in support for a feature Blizzard should have included, but the real solution is for Blizz to support this essential feature in the engine.
     
    #72 homerdog, Jul 26, 2010
    Last edited by a moderator: Jul 26, 2010
  13. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
You are aware that Crysis 1 has a forward renderer and Crysis 2 a deferred one? That is a huge difference. I don't think you can infer anything from that.

    Also ATI is doing a very bad job at texture filtering in Crysis, but that's another topic.
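The forward/deferred split Novum is pointing at can be sketched with toy counters (my own numbers, ignoring overdraw, bandwidth, and everything else a real renderer cares about): forward shading evaluates every light inside each object's pixel shader, while deferred shading writes attributes once per pixel and then runs lighting per light, over only the pixels that light touches.

```python
# Rough cost sketch of forward vs deferred shading, counted in
# per-pixel shader evaluations. All figures below are invented.

def forward_cost(objects_pixels, num_lights):
    """Forward: every covered pixel pays for every light."""
    return sum(px * num_lights for px in objects_pixels)

def deferred_cost(objects_pixels, light_footprints):
    """Deferred: one G-buffer write per covered pixel, then one
    lighting evaluation per pixel inside each light's footprint."""
    gbuffer = sum(objects_pixels)
    lighting = sum(light_footprints)
    return gbuffer + lighting

pixels = [20_000] * 50      # 50 units covering ~20k pixels each
many_lights = 32

fwd = forward_cost(pixels, many_lights)              # 32,000,000
dfd = deferred_cost(pixels, [50_000] * many_lights)  # 1,000,000 + 1,600,000
```

With many lights the deferred path does an order of magnitude less shading work here, which is the trade Crysis 2 is making, and it's the same architecture choice that makes plain MSAA awkward, so the two games really aren't comparable.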
     
  14. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    Ah, well then maybe I will be surprised.

    Yeah I'm aware of ATI's texture filtering woes (thanks largely to your findings). But Crysis and crap ATI texture filtering are both topics for another thread. :)
     
  15. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
    I don't really see what's so surprising about PS2.0 support. Blizzard's game plan for years has been to make sure their games have at least a chance of running on everything. I'm sure there will be people playing this game at 5fps and happy. I've seen it before with every other Blizzard game.

    And hell a X800XT is faster than some DX11 cards. Some people get those old high-end cards as hand-me-down gifts.
     
  16. L233

    Veteran

    Joined:
    Mar 15, 2002
    Messages:
    1,031
    Likes Received:
    29
    Location:
    Germany
Who cares who's to blame or whose fault it is? From a gamer's point of view the simple fact is: GPU A can do something GPU B can't. It would be in ATI's best interest to come up with a solution, because "blaming Blizzard" won't put ATI users even one single step closer to being able to enjoy AA in SC2.
     
  17. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Good sig, you get extry bonus points and a medal. :yep2:
     
  18. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,455
    Location:
    Budapest, Hungary
    It'll take some time to set up my new PC, so I'll probably be able to give a little report on how the game runs. It's been shipped out today, although I probably won't get it before the weekend (got it from overseas).
     
  19. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
The install base for the X800 is tiny. There will be a minuscule # of players playing this game on an R420, and even fewer who actually enjoy it. It is a waste of time and effort to support it.
     
  20. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
All the old GMA stuff is SM 2.0. I'm pretty sure at least 10% of Blizzard's customer base still uses such hardware.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.