AMD: Southern Islands (7*** series) Speculation/ Rumour Thread

Discussion in 'Architecture and Products' started by UniversalTruth, Dec 17, 2010.

  1. DeF

    DeF
    Newcomer

    Joined:
    May 3, 2007
    Messages:
    162
    Likes Received:
    20
    You could use the microstutter argument if the spikes had a repeatable pattern. That way we would get a good average framerate but a crappy sense of fluidity.
     
  2. air_ii

    Newcomer

    Joined:
    May 2, 2007
    Messages:
    134
    Likes Received:
    0
    It's not exactly the same, imo: it means you can see microstutter because at some points the image stays on screen longer than it should for a smooth experience. So it's not exactly equivalent to 35 fps.
     
  3. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,247
    Likes Received:
    4,465
    Location:
    Finland
    No, it means some people can see microstutter, and even for those who can, a single spike isn't necessarily noticeable, or at least not enough to affect anything in real life.
     
  4. air_ii

    Newcomer

    Joined:
    May 2, 2007
    Messages:
    134
    Likes Received:
    0
    Yes, but that was not what I was referring to. I was merely pointing out that those two cases are not equivalent, not whether you can see it or not.
     
  5. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,247
    Likes Received:
    4,465
    Location:
    Finland
    And if we take some other game, like Skyrim, the tables are turned, and in a LOT more dramatic fashion too
    [image: Skyrim frame-time graph]

    The GTX 480 stutters like there's no tomorrow for most of the bench, and the 580 a lot more than the 7970 too, while the 7970 only has a short rough period, and even then it's less stuttery.
     
  6. ECH

    ECH
    Regular

    Joined:
    May 24, 2007
    Messages:
    692
    Likes Received:
    30
    Interesting to see the frame times in milliseconds in that review you linked, something I've not seen much of lately. Needless to say, the 7970 is doing much better than previous generations. It would have been nice to see both their results and a performance overlay for BF3.
     
  7. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,455
    Likes Received:
    471
    Hardly. AFR microstuttering means a continuously irregular distribution of frame-to-frame times, e.g. 10 ms - 90 ms - 10 ms - 90 ms etc. It results in a subjectively halved framerate, because every second frame doesn't improve subjective fluency. E.g. 50 fps with significant microstuttering can result in fluency similar to 25 fps in the worst case. (Real framerate minus out-of-sync frames per second = subjective framerate.) Nothing more, nothing less.

    The Crysis 2 benchmark doesn't show that every second frame is out of sync. It shows that one frame in tens or hundreds is out of sync (~effectively lost).

    The majority of people who "see" microstuttering are in fact describing different problems, like jumps, (macro-)stuttering, a borked distribution of frames (typical of the early SLI/CF days), etc. True microstuttering can't be seen as such; it can only be perceived as a subjectively lower framerate.

    You said that the ~1 fps (effective) loss of the HD 7970 can be seen by some people, but on the other hand you refuse to admit that the 20-30% fps advantage of the HD 7970 (over the GTX 580) can be of any use...
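The rule of thumb above (real framerate minus out-of-sync frames per second equals subjective framerate) can be sketched in a few lines of Python. This is only an illustration of the model, not a measurement tool, and the 3x ratio used to flag a frame pair as "out of sync" is an arbitrary assumption of mine:

```python
def average_fps(times_ms):
    # Nominal framerate: number of frames divided by total elapsed time.
    return 1000.0 * len(times_ms) / sum(times_ms)

def subjective_fps(times_ms, ratio=3.0):
    # Flag a frame as out of sync when the following frame takes `ratio`
    # times longer, i.e. the pair is badly bunched together. The threshold
    # is an arbitrary choice for the sake of the example.
    total_s = sum(times_ms) / 1000.0
    lost = sum(1 for a, b in zip(times_ms, times_ms[1:]) if b >= ratio * a)
    return average_fps(times_ms) - lost / total_s

# Worst-case AFR pattern from the post: 10 ms / 90 ms alternation.
trace = [10.0, 90.0] * 5
print(average_fps(trace))     # 20.0 fps nominal
print(subjective_fps(trace))  # 10.0 fps perceived, i.e. halved
```

With a perfectly even trace no frames are flagged, so subjective and nominal framerate coincide, which matches the claim that true microstuttering shows up only as a subjectively lower framerate.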
     
  8. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    3,264
    Likes Received:
    813
    Wow. I really like that method of benchmark graphing :shock:
    Tells you so much more about the performance than the normal fps bars :yes:
     
  9. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,247
    Likes Received:
    4,465
    Location:
    Finland
    Though I'd prefer it translated to "realtime fps", so to say; it's a really, really simple job in Excel, and far more user-friendly to read than frame times.
     
  10. Man from Atlantis

    Regular

    Joined:
    Jul 31, 2010
    Messages:
    961
    Likes Received:
    855
    Frames per second doesn't help much. Stuttering on a sub-second scale is annoying; that's the whole point of making the graphs in milliseconds rather than [H] style.

    The high stuttering on NV cards in those Skyrim graphs is maybe because the NV drivers need more CPU power.
     
  11. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    Hey all, haven't been here in a while, but Tahiti has rekindled my passion for 3D graphics and hardware. I'm planning on doing a report on Tahiti and modern games with the same kind of insight as my bandwidth analysis from a few years ago.

    For now though, I thought I'd make a quick remark on the TechReport review:
    You have to understand that what TechReport records is not necessarily what you see. They are using FRAPS to log individual frame times, and this works via a D3D software hook that monitors the times at which IDirect3DDevice9::Present() is called. It doesn't monitor changes on the screen like Digital Foundry does in their Eurogamer face-offs. Games usually take the past few frame steps to predict the next one, so stuttered Present() calls would still have objects move in steps that match unstuttered calls.

    If a driver is to minimize input latency, it wants to get information from the game for the next frame as fast as possible (within reason, as it only queues a frame or two). This could result in stuttered calls from the view of FRAPS, but the driver can put the frames on the screen smoothly with delays at its discretion. If, OTOH, the driver wanted to ace TechReport's test, it would have a heuristic to insert variable artificial delays somewhere so that calls to Present() were more even, and then more delays at the end of the pipeline to make sure the frames went to the display smoothly.

    You should read their Battlefield 3 Performance article. They get differing results in the same game. With their Fear no Evil test, AMD shows lots of microstutter in FRAPS, but "the Radeons just don't feel much choppier than the GeForces overall." In the Rock and a hard place test, it was the other way around, and "with a GeForce, the latency spikes were very palpable, causing animations seemingly to speed up and slow down wantonly".

    Even if the frame times reported by the driver matched those of the display, there's no easy way to summarize it all. Consider the Skyrim 99th percentile numbers you referenced where the 7970 was "worse". Now look at the actual plot Kaotik posted above. The 99th percentile is virtually the same for all DX11 cards because the slowest 1% were all within the first 600 frames, where it looks like it is CPU/PCI-E bound. After that, however, it is the GTX 580 which shows vastly more stuttering according to FRAPS, despite your claim of the opposite based on the 99th percentile time.

    Looking closely, it appears that up to 2500 frames are CPU limited. If you cut those out of the test, the 7970 is then 20% faster than the 580 instead of 10%.
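The point about a single 99th-percentile number hiding where the slow frames occur is easy to demonstrate. The traces below are made-up numbers, not TechReport data: both hypothetical cards share the same CPU-bound opening section, which supplies the entire slowest 1%, yet one stutters for the rest of the run while the other is smooth, and a nearest-rank 99th-percentile frame time cannot tell them apart:

```python
def percentile_ms(times_ms, pct=99.0):
    # Nearest-rank percentile: the frame time that pct% of frames
    # are at or below.
    s = sorted(times_ms)
    k = min(len(s) - 1, max(0, int(round(pct / 100.0 * len(s))) - 1))
    return s[k]

# Hypothetical 2000-frame traces (illustrative numbers only).
opening = [50.0] * 30                   # CPU-bound opening: the slowest 1.5%
card_a = opening + [16.7] * 1970        # smooth for the rest of the run
card_b = opening + [16.7, 25.0] * 985   # visible stutter for the rest

print(percentile_ms(card_a))       # 50.0 -- identical 99th percentiles...
print(percentile_ms(card_b))       # 50.0
print(sum(card_a) / len(card_a))   # ...but clearly different averages
print(sum(card_b) / len(card_b))
```

Cutting the opening section out of both traces before summarizing, as suggested above for the CPU-limited frames, would restore the real difference between the two cards.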
     
  12. Lightman

    Veteran Subscriber

    Joined:
    Jun 9, 2008
    Messages:
    1,969
    Likes Received:
    963
    Location:
    Torquay, UK
    I would like to see TechReport throw into the mix either a PCIe 3.0 system with a faster CPU, or simply a Sandy Bridge @ 4 GHz+, to see how higher CPU speeds and/or the PCIe bus affect each card's performance.
    It could provide solid evidence of how much graphics drivers depend on CPU power even at high resolutions, which are usually taken as GPU-bound. There is a popular belief that AMD is heavier on the CPU than nVidia in most games, so seeing frame-by-frame graphs from each vendor on machines of different speeds (mainly single-core performance matters here) is something I'm interested in :smile:.

    Any volunteers?
    If not, I can do it: just send me an HD7970 and a GTX580, and I'll throw an HD6970 into the mix as well.
     
  13. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,247
    Likes Received:
    4,465
    Location:
    Finland
    No no, I mean "realtime FPS", as in, just the frame times converted to what the FPS would be at that particular frame (assuming that frame's time held for the whole second)

    So it would look like this (the frame times above converted to an "fps equivalent", presented like an actual fps graph)
    [image: the same frame times converted to an fps-equivalent graph]
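The conversion being described is trivial; a one-line sketch of my own, not the Excel job mentioned above:

```python
def frametimes_to_fps(times_ms):
    # Each frame time (ms) becomes the framerate the run would have
    # if every frame took that long: fps = 1000 / frame_time_ms.
    return [1000.0 / t for t in times_ms]

print(frametimes_to_fps([10.0, 20.0, 40.0]))  # [100.0, 50.0, 25.0]
```

The result reads like a conventional fps graph, just sampled per frame instead of averaged per second.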
     
  14. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    And many more things to argue about :).

    Possible, though typically with completely CPU-bound settings NV cards tend to be faster, hinting that the NV driver needs less CPU power.
     
  15. Sinistar

    Sinistar I LIVE
    Regular Subscriber

    Joined:
    Aug 11, 2004
    Messages:
    660
    Likes Received:
    74
    Location:
    Indiana
    Tests done by Kyle over at [H] seem to show differently, at least in 3-way SLI and Tri-Fire.

    Original test here, with the slower AMD cards being faster than Nvidia's.

    Again with a faster CPU here, the roles are reversed.
     
  16. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    I'm not talking about SLI/CF (which brings its own set of problems, and sometimes some solutions may not scale at all with more cards).
    Look at this for example, http://ht4u.net/reviews/2011/amd_radeon_hd_7900_southern_island_test/index28.php, where the scores don't budge at all between the two lowest resolutions, and the GTX cards (both the 580 and the 560) have a higher hard fps limit than the HD7970. Granted, it's just a single game; it could certainly be different in other apps. You might find more such scores in CPU rather than GPU reviews...
     
  17. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    Exactly. People can't accept that Nvidia cannot always stay on top, and that sometimes they lose to the next-gen ATI GPU.
     
  18. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    Really, you think the "majority" of people with a GTX 580, 6970, or future 7970 users are going to be at 1920x1200 or below?

    I'd argue that the majority of those users got those cards to make games playable either at 2560x1600 (or 2560x1440 now) or to make multi-monitor games playable.

    GTX 580 users are ~1.09% of PC users with 6970 at ~0.60% of PC users according to the latest Steam survey report. Heck there's more users at 2560x1440 and higher resolutions than there are combined GTX 580 and 6970 users which suggests that some users like me are happily gaming on a 2560x1600 monitor on a lower card like a 5870.

    I'd say anyone spending money for a 580 or 6970 for playing games at 1920x1200 or lower is wasting their money.

    Regards,
    SB
     
  19. DuckThor Evil

    Legend

    Joined:
    Jul 9, 2004
    Messages:
    5,996
    Likes Received:
    1,062
    Location:
    Finland
    I'd say 1080p HDTVs are somewhat common these days as gaming displays; I know I'm using one. There are plenty of games where I can bring my OC'd GTX 570 to its knees even at that res. I won't be putting 500€ into a GPU purchase though.
     
  20. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Although all the graphs in that review say F1 2010, the top does actually say F1 2011, and the results do not appear to line up with the F1 2010 test in their 53-card roundup, which may suggest it is indeed F1 2011. In which case: we did notice late in the game that F1 2011 wasn't performing as expected (i.e. in line with F1 2010), but it was discovered after the initial driver.
     