AMD: Southern Islands (7*** series) Speculation/ Rumour Thread

Discussion in 'Architecture and Products' started by UniversalTruth, Dec 17, 2010.

  1. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
No 16x Tri-AF fill-rate benchmarks in Techreport's review? :sad:
     
  2. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
TBH, the impression you give is that your delusions are actually fed by a strong bias toward being disappointed. Or at least toward showing that disappointment.

Regarding the resolutions, people buy >500€ graphics cards for either:

    a) playing at very high resolutions (which is not what the majority of people will be using); or
    b) future-proofing, which is usually gauged by benching the current, most demanding games at very high ("extreme") resolutions/settings.


    Neither of those scenarios involves the resolutions the majority of people will be using on current games.


Yet your comments seem to show that you don't actually know what these tests mean.
    That Crysis 2 run is a 90-second test. During those 90 seconds, the HD7970 spent a total of 51ms beyond the 50ms frame-time threshold (i.e. below 20fps, FYI).
    So to sum it up, the rendering speed of the HD7970 dropped below 20fps/50ms for 0,051 seconds, or about 0,057% of the bench time.

And you don't even know what the actual rendering speed was during this time. It could be 19,99fps while the GTX580 was doing 20,01fps.

The number of "external" factors that can cause this micro-delay is so large that it's pretty much ridiculous to draw a "better or worse" conclusion from it.
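For reference, the 51ms/0,057% arithmetic above can be reproduced with a small sketch of a "time spent beyond 50ms"-style metric. This is my guess at how such a metric works, not Techreport's actual code, and the frame times below are invented for illustration:

```python
def time_beyond(frame_times_ms, threshold_ms=50.0):
    """Milliseconds spent past the threshold, and its share of total bench time."""
    # Sum only the portion of each frame that exceeds the 50 ms (20 fps) line.
    excess = sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)
    total = sum(frame_times_ms)
    return excess, 100.0 * excess / total

# A ~90-second run at ~36 fps (27.8 ms frames) with three slow frames mixed in:
frames = [27.8] * 3200 + [67.0, 60.0, 74.0]
excess_ms, pct = time_beyond(frames)
print(f"{excess_ms:.0f} ms beyond 50 ms ({pct:.3f}% of bench time)")
```

With these made-up numbers the run spends 51ms past the threshold, which works out to roughly 0.057% of the bench time, matching the arithmetic above.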



Irrelevant bench, as Skyrim is almost always CPU-bound. Nonetheless, no one would ever notice the difference between 33 and 35ms, much less find it "worse".


So now we don't use "worse" anymore? "On par", lol ok.



    As stated before, these 20-30% higher framerates usually show their difference in future titles.
Furthermore, it's pretty much a given that AMD isn't hoping for people with a GTX580 to upgrade to a HD7970 for performance reasons alone.
    It's about taking away potential buyers of the GTX580, since the card is faster and, at that price point, the difference doesn't really matter much.
     
  4. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    I'm not a fan of Hardocp's bench methodology, but it's worth noting they said in pretty much every case the 7970 allowed turning up settings vs the 580.

    http://www.hardocp.com/article/2011/12/22/amd_radeon_hd_7970_video_card_review/14

Again, I'm not a fan of [H]'s bizarre benchmarking methodology, but it seemed relevant here.
     
  5. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Maybe you should read the text too.
    Apparently in this test the number coming out at the end is purely dominated by the first few hundred frames (you can see that quite easily in the frame graph actually). I'm not sure why it's much slower there, but if I had to guess I'd suspect it's completely bound by texture uploads, shader compiles or similar things, i.e. not dependent on hardware at all.

    Admittedly, the HD7970 is not much faster in this game anyway (10% more average FPS), but for all chips the 99% percentile seems to be completely caused by spikes. Again, we don't know why those frames are so slow, but I severely doubt it's got anything to do with gpu performance (again more like texture uploads).
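To illustrate how a handful of spikes can dominate that percentile figure (toy numbers, nothing to do with the actual review data):

```python
def percentile(values, p):
    """Nearest-rank percentile, no external dependencies."""
    ordered = sorted(values)
    k = max(0, int(round(p / 100.0 * len(ordered))) - 1)
    return ordered[k]

# 1000 frames at a steady 25 ms, plus 15 spikes (e.g. texture uploads):
frames = [25.0] * 1000 + [80.0] * 15
avg = sum(frames) / len(frames)
p99 = percentile(frames, 99)
print(f"average {avg:.1f} ms, 99th percentile {p99:.1f} ms")
```

Here fewer than 1.5% of frames are spikes, yet they alone determine the 99th-percentile number (80 ms) while barely moving the average (~25.8 ms).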

For Crysis2, the HD7970 does indeed do worse, due to spikes which, unlike in Batman, the GTX580 doesn't have. You can easily see, though, that every high spike is followed by a low spike (unlike Batman), so this looks more like a vsync problem or something to me.

I agree the fan seems to be set a bit too aggressively (as are voltages, imho; a bit less OC potential with a slightly lower default voltage would let the card run with less power draw, which would also make it quieter).
     
  6. sir doris

    Regular

    Joined:
    May 9, 2002
    Messages:
    708
    Likes Received:
    165
    With regards to the AMD 'micro stuttering', one of the hardware sites (techreport?) did some analysis recently with a few AMD and Nvidia midrange cards (6870, 6950, 560, 560Ti, etc.) in (I think) Battlefield 3. The AMD cards scored worse in the first benchmark, however in the second they scored much better. The big surprise was the drop the Nvidia cards suffered in the second test, much worse than the AMD cards in the first test. Horses for courses it would seem.
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    There's this: http://www.pcper.com/news/Graphics-Cards/Battlefield-3-Frame-Rate-Drop-Issue-GeForce-GPUs
     
  8. Dooby

    Regular

    Joined:
    Jul 21, 2003
    Messages:
    478
    Likes Received:
    3
    My original line:

And FWIW, the Techreport data shows some huge slowdowns at random points. The higher overall FPS is nice, but these "spikes of slowness" will feel all the more jarring because of it.
     
  9. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
Microstuttering (also the real MGPU stuttering) seems to change with game, vendor and movement type. We did a small comparison back in summer with Crysis Warhead as a test case.

    While Nvidia was much smoother when just running almost straight ahead, as soon as we started to strafe, the results reversed and the AMD cards managed smoother gameplay, i.e. less frame-time variance.

    http://extreme.pcgameshardware.de/a...al-pc-games-hardware-05-2011-_crysisw_run.png

    http://extreme.pcgameshardware.de/a...pc-games-hardware-05-2011-_crysisw_strafe.png
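For anyone wanting to try this kind of smoothness comparison themselves, a minimal variance check over two frame-time runs might look like this (the numbers are invented for illustration, not PCGH's measurements):

```python
import statistics

# Two runs with similar averages but very different jitter:
run_forward = [20, 21, 20, 22, 20, 21, 20, 21]   # smooth: low variance
run_strafe  = [14, 30, 15, 28, 16, 29, 14, 30]   # jittery: high variance

for name, run in [("forward", run_forward), ("strafe", run_strafe)]:
    print(f"{name}: mean {statistics.mean(run):.1f} ms, "
          f"stdev {statistics.stdev(run):.1f} ms")
```

The point is that average FPS alone can't distinguish the two runs; the frame-time standard deviation (or variance) is what captures the stutter.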
     
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,062
    Likes Received:
    3,119
    Location:
    New York
    Scott's review is actually not that favorable to Tahiti, despite his high praise. An average of 15% and high of 24% faster than the 580 isn't amazing. I know we usually dismiss it but there seems to be real potential for driver improvements this time around.
     
  11. DeF

    DeF
    Newcomer

    Joined:
    May 3, 2007
    Messages:
    162
    Likes Received:
    20
Could you please link those graphs showing huge slowdowns at random points? I read Techreport's review today and I couldn't find the slowdowns on the graphs you are referring to.

    And the review sir doris was mentioning can be found here.
     
  12. Dooby

    Regular

    Joined:
    Jul 21, 2003
    Messages:
    478
    Likes Received:
    3
[IMG]

On those graphs, a lower line is "faster" and a thinner line is "smoother". The red line of the 7970 is maybe 10% faster than the green line of the 580, which is "nice" (well, 10% sucks, but that's only my opinion apparently). The red line, though, has much higher and lower spikes than the green line.

    What this means is that every now and again, and only for a frame or two each time, the FPS plummets, to as much as half what it was. Then comes a much faster frame, to compensate.

    What this does is give a yo-yo effect when playing the game. Most of the time you will be running at 36fps, then for a second it will stutter down to 18fps. I don't know about you, but I find this hugely noticeable. Sure, it happens on my 4870x2 more than it would on the 7970, but my reason for going with a single GPU this time was to have it much smoother. The Techreport review is showing this isn't the case. Sorry, but high FPS don't mean much when every few seconds there's a noticeable lag.
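The frame-time-to-FPS conversion behind that yo-yo description can be sketched with invented numbers (instantaneous FPS is just 1000 divided by the frame time in milliseconds):

```python
# A steady ~36 fps run (27.8 ms frames) where one frame doubles in length
# (~18 fps) and the next one comes back much faster:
frame_times_ms = [27.8] * 5 + [55.6, 13.9] + [27.8] * 5

inst_fps = [round(1000.0 / t, 1) for t in frame_times_ms]
avg_fps = round(1000.0 * len(frame_times_ms) / sum(frame_times_ms), 1)
print("instantaneous fps:", inst_fps)
print("average fps:", avg_fps)
```

Note the average barely moves, which is exactly why the spike is invisible in a plain FPS bar chart but obvious in a frame-time plot.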

    I know the drivers are new, but at *this moment*, the (hopefully) driver issues + ultra conservative clock speeds make the 7970 (vanilla) a miss for me.

    Le voice of reason \o/
     
  13. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    I don't see "10%" faster. HD7970 generated more than 20% more frames than GTX580.
     
  14. sir doris

    Regular

    Joined:
    May 9, 2002
    Messages:
    708
    Likes Received:
    165
    Yeah, that's the one. I had just found it myself and was about to post the link.
     
  15. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    I think, since Scott decided to throw in the classical benchmark bars as well anyway, it would have made more sense to align the graphs to the run time of the benchmark, so identical scenes would be on top of each other. IOW, time in seconds on the x-axis.
     
  16. air_ii

    Newcomer

    Joined:
    May 2, 2007
    Messages:
    134
    Likes Received:
    0
    My thoughts exactly.
     
  17. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    The resolution of the diagram isn't quite high enough, but I suspect these are actually _single_ frame spikes (so a slow frame immediately followed by a fast frame). It doesn't really make sense that it would slow down for a second then "compensate" for the next second (and the resolution of the diagram might not be enough but clearly the spikes last for less than a second). Not to say you couldn't notice single frame spikes, but it points to more of a driver issue (or something else software) rather than hardware.
     
  18. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,455
    Likes Received:
    471
Exactly. In this case a single-frame spike means that one frame isn't synchronized: the frame exists, but it doesn't improve subjective fluency. One spike per second means one lost frame per second (still in terms of subjective fluency), so even though the game runs at 36 fps, it's subjectively only as fluent as 35 fps gameplay. I doubt a ~1 fps difference can be noticed during real gameplay.
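That lost-frame arithmetic as a trivial sanity check (the numbers are the ones from the post; treating each spiked frame as contributing nothing to perceived fluency is the post's own simplifying assumption):

```python
nominal_fps = 36          # frames actually rendered per second
spikes_per_second = 1     # unsynchronized frames that add no perceived fluency
perceived_fps = nominal_fps - spikes_per_second
print(f"{nominal_fps} fps with {spikes_per_second} spike/s "
      f"feels like ~{perceived_fps} fps")
```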
     
  19. Dooby

    Regular

    Joined:
    Jul 21, 2003
    Messages:
    478
    Likes Received:
    3
    This would be bringing us back to the microstutter argument all over again. It's there. Some people see it, some don't. The fact is on the graph I linked, the 7970 is obviously more "messy" than either of the other two single GPU cards that it's compared against.
     
  20. DeF

    DeF
    Newcomer

    Joined:
    May 3, 2007
    Messages:
    162
    Likes Received:
    20
Exactly.

    You have to look at the graph again and realize you are looking at single frame times (two frames on that graph with durations between 60 and 70ms, to be exact).
    As far as the Crysis 2 "time spent beyond 50ms" graph goes, I will just paste Tottentranz's response, because he nailed it perfectly yet you seem to not get it.
    And to sum things up, I will ask you to think about one thing: how would the above-mentioned Crysis 2 graph look if the Techreport guys decided to move the threshold from 50ms to, let's say, 30ms?
    The sad thing is that the Techreport review is a great piece of work, yet you failed to understand it fully.
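To make that thought experiment concrete, here is a sketch (with invented frame times) of how sensitive a "time spent beyond X ms" figure is to the chosen threshold:

```python
def time_beyond(frame_times_ms, threshold_ms):
    """Total milliseconds spent past the given frame-time threshold."""
    return sum(t - threshold_ms for t in frame_times_ms if t > threshold_ms)

# Mostly ~28 ms frames, with some mild (35 ms) and a few severe (60 ms) slowdowns:
frames = [28.0] * 500 + [35.0] * 20 + [60.0] * 3
for threshold in (50.0, 30.0):
    print(f"beyond {threshold:.0f} ms: {time_beyond(frames, threshold):.0f} ms")
```

With these made-up numbers, only the three severe frames count at 50ms, but dropping the threshold to 30ms also catches the twenty mild slowdowns and the reported figure grows several-fold, so the ranking between two cards could easily flip.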
     