AMD: Southern Islands (7*** series) Speculation/ Rumour Thread

Yes, my delusions, based on facts and graphs from a 3rd party at resolutions the majority of people will be using. How dare I be so bold!

TBH, the impression you give off is that your delusions are actually fed by a strong will to be disappointed. Or at least to show that disappointment.

Regarding the resolutions, people buy >500€ graphics cards for either:

a) playing at very high resolutions (which is not what the majority of people will be using);
b) future-proofing, which is usually gauged by benching the current most demanding games at very high ("extreme") resolutions/settings.


None of those scenarios includes resolutions the majority of people will be using on current games.


I don't care how high a single avg fps number is, it's the slight stuttering that's most annoying:
crysis2-beyond.gif

Worse than a 580GTX.

Yet your comments seem to show that you don't actually know what these tests mean...
That Crysis 2 run is a 90-second test. During those 90 seconds, the HD7970 exceeded the 50ms response time (lower than 20fps, FYI) for a total of 51ms.
So to sum it up, the rendering speed of the HD7970 dropped below 20fps/50ms for 0.051 seconds, or about 0.056% of the bench time.

And you don't even know what the actual rendering speed was during this time. It could be 19.99fps while the GTX580 is doing 20.01fps.

The number of "external" factors that can cause this micro-delay is so large that it's pretty much ridiculous to draw a "better or worse" conclusion from it.



skyrim-99th.gif

Worse than a 580GTX.
Irrelevant bench, as Skyrim is almost always CPU-bound. Nonetheless, no one would ever notice the difference between 33 and 35ms, much less find it "worse".


batman-99th.gif

On a par with 580GTX.
So now we don't use "worse" anymore? On par, lol ok.



Yes, it has higher framerates (20%-30%), but as you can see from the Techreport review, that hardly ever takes a game from unplayable to playable.

As stated before, these 20-30% higher framerates usually show their difference in future titles.
Furthermore, it's pretty much a given that AMD isn't hoping for people with a GTX580 to upgrade to a HD7970 for performance reasons alone.
It's taking away potential buyers of the GTX580, since the card is faster and, at that price point, the difference doesn't really matter much.
 
As stated before, these 20-30% higher framerates usually show their difference in future titles.

I'm not a fan of HardOCP's bench methodology, but it's worth noting they said that in pretty much every case the 7970 allowed turning up settings vs the 580.

http://www.hardocp.com/article/2011/12/22/amd_radeon_hd_7970_video_card_review/14

While apples-to-apples are neat to show raw percentage differences, it doesn't mean much. What matters more is what does that performance let me do, how high can I now increase my game's settings to improve the gameplay experience. In that, we saw that clear across the board, these percentage differences were enough to allow us to play at higher settings on the Radeon HD 7970 in every single game, at every resolution.

Again, I'm not a fan of [H]'s bizarre benchmarking methodology, but it seemed relevant here.
 
I don't care how high a single avg fps number is, it's the slight stuttering that's most annoying:
skyrim-99th.gif

Worse than a 580GTX.
Maybe you should read the text too.
Apparently in this test the number coming out at the end is purely dominated by the first few hundred frames (you can see that quite easily in the frame graph actually). I'm not sure why it's much slower there, but if I had to guess I'd suspect it's completely bound by texture uploads, shader compiles or similar things, i.e. not dependent on hardware at all.

batman-99th.gif

On a par with 580GTX.
Admittedly, the HD7970 is not much faster in this game anyway (10% more average FPS), but for all chips the 99th percentile seems to be completely caused by spikes. Again, we don't know why those frames are so slow, but I severely doubt it has anything to do with GPU performance (again, more likely texture uploads).

For Crysis 2, the HD7970 does indeed fare worse, due to spikes which, unlike in Batman, the GTX580 doesn't have. You can easily see, though, that every high spike is followed by a low spike (unlike Batman), so this looks more like a vsync problem or something to me.
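For reference, a rough sketch of why a short burst of slow frames can dominate a 99th-percentile figure. All numbers are invented; it only shows how the statistic behaves:

```python
# Illustration of why a few hundred slow frames at the start of a run (e.g.
# shader compiles or texture uploads) can dominate a 99th-percentile figure.
# All numbers are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
steady = rng.normal(28.0, 1.5, size=5000)   # ~28 ms frames for most of the run
warmup = rng.normal(60.0, 5.0, size=200)    # a short burst of ~60 ms frames up front
run = np.concatenate([warmup, steady])

print(f"99th percentile, full run:       {np.percentile(run, 99):.1f} ms")
print(f"99th percentile, warmup removed: {np.percentile(steady, 99):.1f} ms")
```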

My other issues with the card are that it's both too conservative and pushed slightly too hard. By that I mean the fan. At full load, the fan is more than mildly annoying, but it doesn't need to be. The card actually runs quite cool. The fan could be 20% slower for 10% more heat.
I agree the fan seems to be set a bit too aggressively (as are voltages imho; a bit less OC potential with a slightly lower default voltage would let the card run with less power draw, which would translate to quieter operation).
 
With regards to the AMD 'micro stuttering', one of the hardware sites (techreport?) did some analysis recently with a few AMD and Nvidia midrange cards (6870, 6950, 560, 560Ti, etc.) in (I think) Battlefield 3. The AMD cards scored worse in the first benchmark, however in the second they scored much better. The big surprise was the drop the Nvidia cards suffered in the second test, much worse than the AMD cards in the first test. Horses for courses it would seem.
 
With regards to the AMD 'micro stuttering', one of the hardware sites (techreport?) did some analysis recently with a few AMD and Nvidia midrange cards (6870, 6950, 560, 560Ti, etc.) in (I think) Battlefield 3. The AMD cards scored worse in the first benchmark, however in the second they scored much better. The big surprise was the drop the Nvidia cards suffered in the second test, much worse than the AMD cards in the first test. Horses for courses it would seem.

There's this: http://www.pcper.com/news/Graphics-Cards/Battlefield-3-Frame-Rate-Drop-Issue-GeForce-GPUs
 
So now we don't use "worse" anymore? On par, lol ok.

My original line:

Not sure what review you just read, but the 7970 is on par with or WORSE than a GTX580 in Skyrim, Batman, Crysis2.

And FWIW, the Techreport data shows some huge slowdowns at random points. The higher overall FPS is nice, but these "spikes of slowness" will feel all the more jarring because of it.
 
With regards to the AMD 'micro stuttering', one of the hardware sites (techreport?) did some analysis recently with a few AMD and Nvidia midrange cards (6870, 6950, 560, 560Ti, etc.) in (I think) Battlefield 3. The AMD cards scored worse in the first benchmark, however in the second they scored much better. The big surprise was the drop the Nvidia cards suffered in the second test, much worse than the AMD cards in the first test. Horses for courses it would seem.

Microstuttering (also the real MGPU stuttering) seems to change with game, vendor and movement type. We did a small comparison back in the summer with Crysis Warhead as a test case.

While Nvidia was much smoother when just running almost straight ahead, as soon as we started to strafe the results reversed, and the AMD cards managed smoother gameplay, i.e. less frame-time variance.

http://extreme.pcgameshardware.de/a...al-pc-games-hardware-05-2011-_crysisw_run.png

http://extreme.pcgameshardware.de/a...pc-games-hardware-05-2011-_crysisw_strafe.png
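If anyone wants to run that kind of comparison on their own frame-time logs, one possible way to quantify "less frame-time variance" looks roughly like this. The choice of statistics is an assumption on my part, not necessarily what the linked graphs use:

```python
# Rough sketch of quantifying "smoother gameplay, i.e. less frame-time
# variance" from a per-frame capture. The choice of statistics here is an
# assumption, not the methodology behind the linked graphs.
import statistics

def smoothness_stats(frame_times_ms):
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return {
        "avg_fps": 1000.0 * len(frame_times_ms) / sum(frame_times_ms),
        "frame_time_stdev_ms": statistics.pstdev(frame_times_ms),
        "mean_frame_to_frame_delta_ms": statistics.mean(deltas),
    }

# Two runs (e.g. straight run vs. strafing) can share the same avg_fps yet
# differ a lot in the other two numbers, which is the point the graphs make.
```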
 
Scott's review is actually not that favorable to Tahiti, despite his high praise. An average of 15% and a high of 24% faster than the 580 isn't amazing. I know we usually dismiss it, but there seems to be real potential for driver improvements this time around.
 
...
And FWIW, the Techreport data shows some huge slowdowns at random points. The higher overall FPS is nice, but these "spikes of slowness" will feel all the more jarring because of it.

Could you please link those graphs showing huge slowdowns at random points? I have read Techreport's review today and I couldn't find the slowdowns on the graphs you are referring to.

And the review sir doris was mentioning can be found here.
 
Could you please link those graphs showing huge slowdowns at random points?

crysis2-nv.gif


On those graphs, a lower line is "faster" and a thinner line is "smoother". The red line of the 7970 is maybe 10% faster than the green line of the 580, which is "nice" (well, 10% sucks, but that's only my opinion apparently). The red line though has much higher and lower spikes than the green line.

What this means is that every now and again, and only for a frame or two each time, the FPS plummets, to as little as half what it was. Then comes a much faster frame, to compensate.

What this does is give a yo-yo effect when playing the game. Most of the time you will be running at 36fps, then for a second it will stutter down to 18fps. I don't know about you, but I find this hugely noticeable. Sure, it happens on my 4870X2 more than it would on the 7970, but my reason for going with a single GPU this time was to have it much smoother. The Techreport review is showing this isn't the case. Sorry, but high FPS don't mean much when every few seconds there's a noticeable lag.

I know the drivers are new, but at *this moment*, the (hopefully) driver issues + ultra conservative clock speeds make the 7970 (vanilla) a miss for me.
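To make the "yo-yo" reading concrete, here's a rough sketch of how one could flag those isolated slow frames in a frame-time log. The window and ratio are arbitrary assumptions of mine:

```python
# Rough sketch of flagging isolated "yo-yo" frames: frames that take far
# longer than the frames around them, even if average fps stays high.
# The window and ratio values are arbitrary assumptions.
import statistics

def flag_spikes(frame_times_ms, window=30, ratio=1.8):
    """Return indices of frames that are ~ratio x slower than the local median."""
    spikes = []
    for i, t in enumerate(frame_times_ms):
        lo, hi = max(0, i - window), min(len(frame_times_ms), i + window + 1)
        if t > ratio * statistics.median(frame_times_ms[lo:hi]):
            spikes.append(i)
    return spikes

# A frame that jumps from ~28 ms (36 fps) to ~55 ms (18 fps) for a frame or
# two shows up here even though the run's average fps barely moves.
```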

An average of 15% and a high of 24% faster than the 580 isn't amazing.

Le voice of reason \o/
 
crysis2-nv.gif


On those graphs, a lower line is "faster" and a thinner line is "smoother". The red line of the 7970 is maybe 10% faster than the green line of the 580, which is "nice" (well, 10% sucks, but that's only my opinion apparently). The red line though has much higher and lower spikes than the green line.
I don't see "10%" faster. HD7970 generated more than 20% more frames than GTX580.
 
Could you please link those graphs showing huge slowdowns at random points? I have read Techreport's review today and I couldn't find the slowdowns on the graphs you are referring to.

And the review sir doris was mentioning can be found here.

Yeah, that's the one. I had just found it myself and was about to post the link.
 
I think, since Scott decided to throw in the classical benchmark bars as well anyway, it would have made more sense to align the graphs to the run time of the benchmark, so identical scenes would be on top of each other. IOW, time in seconds on the x-axis.
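Something along these lines, presumably. A trivial sketch, assuming a plain list of per-frame times in milliseconds:

```python
# Trivial sketch of the suggestion above: plot frame time against elapsed
# time rather than frame number, so the same scene from different cards
# lands at the same spot on the x-axis. Assumes a plain list of ms values.
from itertools import accumulate

def to_elapsed_time_axis(frame_times_ms):
    """Return (elapsed_seconds, frame_time_ms) pairs ready for plotting."""
    return [(e / 1000.0, t) for e, t in zip(accumulate(frame_times_ms), frame_times_ms)]
```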
 
I think, since Scott decided to throw in the classical benchmark bars as well anyway, it would have made more sense to align the graphs to the run time of the benchmark, so identical scenes would be on top of each other. IOW, time in seconds on the x-axis.

My thoughts exactly.
 
What this does is give a yo yo effect when playing the game. For the most time you will be running at 36fps, then for a second, it will stutter down to 18fps. I don't know about you, but I find this hugely noticeable.
The resolution of the diagram isn't quite high enough, but I suspect these are actually _single_ frame spikes (so a slow frame immediately followed by a fast frame). It doesn't really make sense that it would slow down for a second and then "compensate" for the next second (and the resolution of the diagram might not be enough, but the spikes clearly last less than a second). Not to say you couldn't notice single-frame spikes, but it points more to a driver issue (or something else in software) than to hardware.
 
Exactly. In this case a single-frame spike means that one frame isn't synchronized, so the frame exists, but it doesn't improve subjective fluency. One spike per second means one lost frame per second (still in terms of subjective fluency), so even though the game runs at 36 fps, it's subjectively only as fluent as 35 fps gameplay. I doubt a ~1 fps difference can be noticed during real gameplay.
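Back-of-the-envelope, the arithmetic of that argument (illustrative numbers only):

```python
# Back-of-the-envelope version of the argument above: one "lost" frame per
# second out of 36 rendered frames leaves roughly 35 subjectively useful
# frames per second. Purely illustrative numbers.
rendered_fps = 36
spikes_per_second = 1
effective_fps = rendered_fps - spikes_per_second
print(f"{rendered_fps} fps with {spikes_per_second} spike/s feels like ~{effective_fps} fps")
```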
 
This brings us back to the microstutter argument all over again. It's there. Some people see it, some don't. The fact is, on the graph I linked, the 7970 is obviously more "messy" than either of the other two single-GPU cards it's compared against.
 
Exactly. In this case a single-frame spike means that one frame isn't synchronized, so the frame exists, but it doesn't improve subjective fluency. One spike per second means one lost frame per second (still in terms of subjective fluency), so even though the game runs at 36 fps, it's subjectively only as fluent as 35 fps gameplay. I doubt a ~1 fps difference can be noticed during real gameplay.
Exactly.

This brings us back to the microstutter argument all over again. It's there. Some people see it, some don't. The fact is, on the graph I linked, the 7970 is obviously more "messy" than either of the other two single-GPU cards it's compared against.
You have to look at the graph again and realize you are looking at single frame times (two frames on that graph with durations between 60 and 70ms, to be exact).
As far as the Crysis 2 "time spent beyond 50ms" graph goes, I will just paste Tottentranz's response, because he nailed it perfectly yet you still don't seem to get it.
Yet your comments seem to show that you don't actually know what these tests mean...
That Crysis 2 run is a 90-second test. During those 90 seconds, the HD7970 exceeded the 50ms response time (lower than 20fps, FYI) for a total of 51ms.
So to sum it up, the rendering speed of the HD7970 dropped below 20fps/50ms for 0.051 seconds, or about 0.056% of the bench time.

And you don't even know what the actual rendering speed was during this time. It could be 19.99fps while the GTX580 is doing 20.01fps.

The number of "external" factors that can cause this micro-delay is so large that it's pretty much ridiculous to draw a "better or worse" conclusion from it.
And to sum things up, I will ask you to think about one thing: how would the above-mentioned Crysis 2 graph look if the Techreport guys decided to move the bar from 50ms to, let's say, 30ms?
The sad thing is that the Techreport review is a great piece of work, yet you failed to understand it fully.
 