Starcraft II GPU performance/IQ

Discussion in '3D Hardware, Software & Output Devices' started by Ancient, Jul 20, 2010.

  1. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    The fix is adding two lines to a bloody text file, get unlaziful! :razz:
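[The two lines in question, as widely circulated at the time for the file `variables.txt` in the StarCraft II documents folder; the exact variable names are as reported in community posts, not verified here:]

```
frameratecapglue=30
frameratecap=60
```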
     
  2. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    The issue people are missing here is that this power-virus situation was found and reported during the beta. Blizzard never fixed it for the final product.

    Once again, Blizzard is lazy.
     
  3. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    That's exactly what I said (in the rest of my posting), right? At least I don't see a difference.


    Strangely though, in our testing we were (and I at home was) locked to 60-ish fps on the log-in screen and consequently didn't see anything abnormal wrt temperatures, fan speeds or overheating - on any card. I wonder if this might be happening only to people who installed/unlocked the final over the beta? Maybe some residue in the config files?

    Apparently, since this is still happening: Agreed.
     
    #263 CarstenS, Jul 31, 2010
    Last edited by a moderator: Jul 31, 2010
  4. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    The problem occurs when getting maximum utilization from the HW, obviously. That means constantly fetching data from RAM, processing it and writing it out as fast as possible... which sounds like something that'd happen with a simple scene. On a complex scene I'm likely to get breathing room while I wait for a fetch from VRAM, or wait for the rasterizer to finish working on those small tris, or I'm doing a shitload of per-pixel work and the TUs and ROPs are sitting there idling because I'm doing a trillion MADs and just working in the RF.

    Everybody's in love with Furmark, but have you looked at what it's doing? Hint: it's doing pretty simple rendering that manages to get very high utilization. Their explanation makes perfect sense. Maybe AIBs and IHVs should get more serious about their thermal work; CPU guys have already been here and learned their lessons.
     
  5. Neb

    Neb Iron "BEAST" Man
    Legend

    Joined:
    Mar 16, 2007
    Messages:
    8,391
    Likes Received:
    3
    Location:
    NGC2264
    ATI cards are like beasts, can't be tamed when they roaarrr! :wink: :grin:

    But seriously, I hope the protection system really prevents damage, because when it trips the current momentarily overshoots the set limit by several amps before it is cut.
     
  6. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    Hmmm, which makes me wonder whether the pirated version fails to lock the FPS? That might go some way toward explaining why Blizzard isn't concerned, along with the beta situation you mention, since I believe the SC2 crack leverages some code from the beta.

    It would actually be pretty hilarious if pirated versions of the game, instead of simply refusing to work, fried hardware. :D

    Regards,
    SB
     
  7. Colourless

    Colourless Monochrome wench
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,274
    Likes Received:
    30
    Location:
    Somewhere in outback South Australia
    I recently purchased a new card because the cooler on my old card was virtually impossible to clean. I would routinely get all sorts of overheating graphics card problems depending on the games I was playing. I really can't lay the blame on the software devs for managing to drive the temps on my card way up; it was just a case of dust buildup. If the card had been better able to cope with dangerously high temps, there may not have been as noticeable an issue.

    Blizzard cannot be blamed for this issue, and I would find it difficult to blame the hardware companies either. Though both should probably issue statements that people with the overheating issues really should dust the heatsinks on their cards.
     
  8. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,773
    Likes Received:
    960
    Location:
    Japan
    Whatever the case, I still don't believe it can fry your hardware. Even if for some reason only a small portion of the GPU is stressed really hard in the menus, how can that break the card? You can't tell me some part of the GPU can't work @ 100% for 5 minutes. I also don't believe it's overheating, because modern cards easily handle 100+ degrees, and I don't believe a small portion of the card can't be cooled enough - and that is without considering thermal protection, or just plain old crashing from too much heat like in the old days. I mean, everybody who has ever overclocked something too much knows that before you fry anything (unless you start playing with the voltage) you will get crashes, artifacts or shutdowns.

    The only way I ever found to fry hardware is to take the damn cooler off and wait until smoke starts coming out, and even that only works on stuff that doesn't have temp sensors. My old A64, which I gave to my brother, ran for months with the cooler only half attached. Half the heatspreader was black when I got it off, but the PC never actually fried. Hell, it could even run for hours if you didn't stress it.

    I just don't understand how I can play every game that stresses my total system at 100% for hours with no problem, but I can fry my GPU in 5 minutes in a menu just because it's stressing (only one part of) the GPU at 100%.
     
  9. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Heh...You're welcome...;) Was just preparing to post the link for you, in response to your last post asking me where it was, but found Wavey had beaten me to it.
     
  10. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Thanks for the response, and I agree that pre-rendered video can be stunning--but if it is too much better than in-game rendering all it serves to do, imo, is to make the in-game rendering look stunningly poor by comparison.

    As to your second proposition, I'm just a bit puzzled. If they use separate models for characters and environments from the ones you see in the game, so that the cut-scene characters and environments look appreciably better than they are rendered during actual game play, what makes you so certain that these scenes aren't pre-rendered as well? I mean, if you are going to use better models for characters and environments than the models you use in-game, and the cut-scenes are indeed driven by the actual game engine running scripts, then why would you not use the superior models during normal game play as well as during selected cut scenes?

    It's just my experience that cut scenes in any game which use much better models for objects, characters, and environments than are used when actually playing the game are almost always pre-rendered cut scenes. OTOH, I have been very pleasantly surprised when the cut scenes in a game not only look spectacular, but then also happen to exactly match in-game rendering (which is how you usually discern that the game engine running a script is actually creating the cut scene you're viewing, as opposed to the game playing back 2d pre-rendered cut scenes that simply serve to splice together various segments of actual game play.)
     
  11. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,455
    Location:
    Budapest, Hungary
    Gameplay has a very different perspective and requires the models and textures to be instantly readable by the player.

    The ingame cutscenes are definitely ingame, and doubting them makes you look kinda silly.
    There's plenty of video and images around the net, and you could easily check and see that it's all realtime. There's even been a paper from Blizzard detailing their implementation of SSAO (which is the highest-quality one I've seen so far).
     
  12. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,455
    Location:
    Budapest, Hungary
  13. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    As far as "games frying gpus" goes, I agree that the notion is ridiculous...;) I mean, think about all of the synthetic 3d benchmarks circulating for years that have "endless loop" settings and are designed to test virtually every aspect of a gpu's performance and IQ--depending on the year the benchmark is published, of course...;) The only one I ever heard of doing something to some gpus was "Furmark"--causing them to overheat, I think--but I don't recall reading posts claiming that "Furmark fried my gpu!" There may have been such posts, though--whether I read them or not...;)

    I've found in complaints like these that often people are doing other things to their gpus, like massively overclocking and/or overvolting them, which they seem to completely forget about when first posting their "X fried my gpu!" claims.

    Are these kinds of complaints being reported widely on the Sc2 forums?

    The above "technical suggestion" as quoted by BRiT, presumably taken from some official Blizzard SC2 forum, is really bizarre! I don't recall ever seeing its like. Basically, Blizzard is saying that on fairly static or "blank" screens, the game drives a gpu to such high frame-rates that it may overheat to the point of failure!

    It's a new twist on the "stopping to look at the wall" gimmick we've all seen, which is what some people do to inflate the frame-rate scores of various game demos they create or simply run themselves. When you stop to look at the wall, there's nothing moving on the screen, including the camera, and frame-rates on all gpus tend to soar in comparison to frame rates achievable when actually "playing" the game and doing much, much more, both in terms of your own movement and the moving universe of the game surrounding you.

    Another way of describing it is by looking at the number of pixel changes that occur between frames. Typically, the frame rates for a series of frames in which there is little to no pixel change from one frame to the next are much, much higher than in a series of frames in which a lot of pixel change is taking place between frames.

    Also, by saying "temporary workaround," Blizzard is implying a much deeper issue, as in a software bug in the game that Blizzard intends to fix in the future; otherwise the "solution" they recommend above would not be a "temporary workaround" at all--it would be the solution. Next, the fact that they suggest a minimum framerate of 30 and a maximum cap of 60 fps--and then tell you that you can "replace these numbers if you want to"--seems very curious, especially as they don't provide a hint as to whether the numbers should be higher or lower.

    I think it is true that a gpu actually "does more work" per clock cycle when it is rendering complex scenes with lots of pixel changes between frames--while also doing other things like shader work, FSAA, etc.--than when it is in the "standing still looking at a wall" mode. Even though the frame rates would be much higher in a "looking at wall/blank screen" scenario, it also seems very unlikely to me that such a scenario would put so much stress on a gpu that it might overheat to the point of failure. I would think that the scenes in which the frame rates are much lower, because the gpu is doing a lot more work per cycle, would be the ones that would tend to occasionally stress out a gpu--not the "light on detail" scenes that Blizzard points out as the likely culprit.

    And in turn this sounds very much to me as if they aren't sure exactly what the problem is and have suggested a "temporary workaround" that they think will hold until they can patch in what will hopefully be a permanent fix. I could see suggestions from Blizzard like "use lower resolutions," or "keep FSAA turned off even if your gpu driver supports it," or even "use the lowest AF setting you can tolerate"--and a few other possibilities, too--as being far more effective at reducing a gpu's overall workload when rendering the game than simply writing a 60fps cap into an SC2 configuration file. It'll be interesting to see where this goes...
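[The "temporary workaround" under discussion is simply a client-side frame-rate cap. A minimal sketch of what such a cap does - hypothetical Python for illustration, not Blizzard's code - showing why it tames cheap scenes: when a frame takes almost no time to render, the loop sleeps out the rest of the frame interval, so the GPU idles instead of redrawing the same static menu thousands of times per second:]

```python
import time

def run_capped(render_frame, target_fps, n_frames):
    """Render n_frames, never exceeding target_fps.

    When render_frame is cheap (e.g. a static menu), most of each
    frame interval is spent sleeping, so the GPU's duty cycle drops
    instead of the hardware running flat out.
    """
    frame_time = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()                        # the actual draw call(s)
        elapsed = time.perf_counter() - start
        if elapsed < frame_time:
            time.sleep(frame_time - elapsed)  # idle out the remainder

# A trivially cheap "menu" frame: uncapped, this loop would spin as
# fast as the machine allows; capped at 60 fps, 30 frames take about
# half a second of wall time, nearly all of it spent sleeping.
t0 = time.perf_counter()
run_capped(lambda: None, target_fps=60, n_frames=30)
wall = time.perf_counter() - t0
```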
     
  14. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,516
    Likes Received:
    24,424
    It's hard to tell, as Blizzard routinely deletes posts and threads which they consider to already be dealt with by their official bug-report thread(s).
     
  15. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I read it, and thanks--it's a nice PR piece from Blizzard. BTW, I didn't "doubt" whether the game engine was used to generate the cut scenes, I simply asked the person who originally responded to me on this topic what made him so sure. Now I know, and I appreciate the link. The original poster also told me that some of the cut scenes were in his opinion definitely pre-rendered. Now that I have actually read this paper I understand what you mean about "perspective" with cut scenes being 3rd-person and game play being top-down.

    But on page 4 of the report the term "dual modes" is used, in that the game consists of an "in game mode" and also a "story mode," which, again, the paper takes pains to separate. It wasn't at all clear to me that the engine that powers the "in game mode" is the same one that powers the "story mode." On page 13 of the paper, for instance, depicting a scene in the "story mode," the term "Pre-generated Ambient Occlusion maps" is used, whereas I did not see that term associated with the "in game mode."

    Thanks again for answering my original questions.
     
  16. Arnold Beckenbauer

    Veteran Subscriber

    Joined:
    Oct 11, 2006
    Messages:
    1,756
    Likes Received:
    722
    Location:
    Germany
  17. hoho

    Veteran

    Joined:
    Aug 21, 2007
    Messages:
    1,218
    Likes Received:
    0
    Location:
    Estonia
    They aren't as detailed, yes, but they definitely aren't blocky-looking or generally bad. The rendered cinematics are by far the best-looking I've seen so far, though I haven't seen many of the later games.
    For one thing, you can tune the detail up/down in the visual settings.
    In cut scenes there are cases where one guy's face fills the entire screen and is very detailed, with all the skin, eye and hair detail. It's not really wise to use the same models when you have tens of them running around, each covering just a handful of pixels.
     
  18. Laa-Yosh

    Laa-Yosh I can has custom title?
    Legend Subscriber

    Joined:
    Feb 12, 2002
    Messages:
    9,568
    Likes Received:
    1,455
    Location:
    Budapest, Hungary
    It's totally obvious to me, and IMHO it should be for everyone who's just a bit into 3D rendering and graphics...

    The game's overhead perspective with the tiny units requires simpler but also exaggerated silhouettes, more saturated colors and so on - so that you can immediately tell a marine from a reaper, a tank from a Viking craft and so on.
    The engine also shouldn't have to display small details, because they'd just turn into noise without 16xAA. The same has always been true for any such game: viewing highly detailed models from a large distance will only make them look shapeless and greyish.

    There's a scene at the end of the War3 CG cinematic where thousands of ingame units are shown, but because they're far away you can't tell what is what... I can't find a screenshot, but it's in here: http://www.youtube.com/watch?v=K09CsrMMsL8&hd=1 at 1:48 - though you'll need to find a version with better compression to see that you still can't make out anything ;)


    I also don't get all the criticism and, yes, hate that Starcraft 2 and Blizzard get here. They've been perfectly clear, honest and straight about what you're going to get ever since they announced it in 2007; there have been months of open beta testing; we've seen a lot of media released, from the actual game through the realtime cinematics to the conceptual artwork; and we've had unit descriptions and so on. This game is pretty close to a perfect sequel IMHO - bigger, better, nicer and all, with some of the best production values anyone in the industry could deliver (the CG alone has probably cost something like 10 million dollars). So I really don't get some of the comments here.
     
  19. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    I don't exactly understand it either, but I just put it down to elitist enthusiast graphics snobbery. :) If it doesn't have the latest tech checklists then it's obviously inferior. OMG, no godrays. OMG doesn't use 4 cores, despite running at high FPS on dual core and single core machines.

    Meh. At some point I'll just tune that stuff out and just ignore the thread. :D

    Regards,
    SB
     
  20. ECH

    ECH
    Regular

    Joined:
    May 24, 2007
    Messages:
    692
    Likes Received:
    30
    But is it not true that by using DX9 they forfeit some of the efficiencies gained with DX10, DX10.1 or DX11?
     