Yes, my delusions, based on facts and graphs from a third party, at resolutions the majority of people will be using. How dare I be so bold!
I don't care how high a single avg fps number is; it's the slight stuttering that's most annoying:
Worse than a 580GTX.
Irrelevant bench, as Skyrim is almost always CPU-bound. Nonetheless, no one would ever distinguish the difference between 33 and 35ms, much less find it "worse".
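To put that in perspective, here's a quick sketch converting those frame times to FPS (the 33ms/35ms figures are the ones from the post, nothing else is assumed):

```python
# Convert a per-frame render time (ms) to an instantaneous FPS figure,
# to show how small the 33 ms vs 35 ms gap from the post actually is.
def frame_time_ms_to_fps(ms):
    return 1000.0 / ms

fps_33 = frame_time_ms_to_fps(33)
fps_35 = frame_time_ms_to_fps(35)
print(round(fps_33, 1), round(fps_35, 1))  # 30.3 28.6 -- under 2 fps apart
```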
Worse than a 580GTX.
So now we don't use "worse" anymore? On par, lol ok.
On a par with 580GTX.
Yes, it has higher framerates (20%-30%), but as you can see from the Techreport review, hardly ever does this take a game from unplayable to playable.
As stated before, these 20-30% higher framerates usually show their difference in future titles.
While apples-to-apples comparisons are neat for showing raw percentage differences, they don't mean much. What matters more is what that performance lets me do: how high can I now raise my game's settings to improve the gameplay experience? In that, we saw clear across the board that these percentage differences were enough to allow us to play at higher settings on the Radeon HD 7970 in every single game, at every resolution.
Maybe you should read the text too.

I don't care how high a single avg fps number is; it's the slight stuttering that's most annoying:
Worse than a 580GTX.
Admittedly, the HD7970 is not much faster in this game anyway (10% more average FPS), but for all chips the 99th percentile seems to be completely caused by spikes. Again, we don't know why those frames are so slow, but I severely doubt it has anything to do with GPU performance (again, more likely texture uploads).
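For readers unfamiliar with the metric being discussed: Techreport's 99th-percentile figure is the frame time that 99% of frames stay under, which is why a handful of spikes dominates it. A rough sketch (the frame times below are synthetic, purely for illustration, not Techreport's data):

```python
# Sketch of a 99th-percentile frame time: sort all frame times and
# take the value that 99% of frames come in under.
def percentile_frame_time(frame_times_ms, pct=99):
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

# 99 smooth ~60fps frames plus one 70 ms spike: the average barely
# moves, but the 99th percentile exposes the spike.
times = [16.7] * 99 + [70.0]
avg = sum(times) / len(times)
print(percentile_frame_time(times))  # 70.0 -- the spike shows up
print(round(avg, 1))                 # 17.2 -- the average looks fine
```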
On a par with 580GTX.
I agree the fan seems to be set a bit too aggressively (as are voltages, imho; a slightly lower default voltage would cost a bit of OC potential but let the card run with less power draw, which would translate to quieter operation).

My other issue with the card is that it's both too conservative and pushed slightly too hard. By this I mean the fan. On full load, the fan is more than mildly annoying, but it doesn't need to be. The card actually runs quite cool. The fan should be 20% slower for 10% more heat.
With regards to the AMD 'micro stuttering', one of the hardware sites (techreport?) did some analysis recently with a few AMD and Nvidia midrange cards (6870, 6950, 560, 560Ti, etc.) in (I think) Battlefield 3. The AMD cards scored worse in the first benchmark; in the second, however, they scored much better. The big surprise was the drop the Nvidia cards suffered in the second test, much worse than the AMD cards in the first test. Horses for courses, it would seem.
So now we don't use "worse" anymore? On par, lol ok.
Not sure what review you just read, but the 7970 is on par with or WORSE than a GTX580 in Skyrim, Batman, and Crysis 2.
With regards to the AMD 'micro stuttering', one of the hardware sites (techreport?) did some analysis recently with a few AMD and Nvidia midrange cards (6870, 6950, 560, 560Ti, etc.) in (I think) Battlefield 3. The AMD cards scored worse in the first benchmark; in the second, however, they scored much better. The big surprise was the drop the Nvidia cards suffered in the second test, much worse than the AMD cards in the first test. Horses for courses, it would seem.
...
And FWIW, the Techreport data shows some huge slowdowns at random points. The higher overall FPS is nice, but these "spikes of slowness" will feel all the more jarring because of it.
Could you please link those graphs showing huge slowdowns at random points?
An average of 15% and high of 24% faster than the 580 isn't amazing.
I don't see "10%" faster. HD7970 generated more than 20% more frames than GTX580.
On those graphs, a lower line is "faster" and a thinner line is "smoother". The red line of the 7970 is maybe 10% faster than the green line of the 580, which is "nice" (well, 10% sucks, but that's only my opinion apparently). The red line though has much higher and lower spikes than the green line.
Could you please link those graphs showing huge slowdowns at random points? I have read Techreport's review today and I couldn't find slowdowns on the graphs you are referring to.
And the review sir doris was mentioning can be found here.
I think, since Scott decided to throw in the classical benchmark bars as well anyway, it would have made more sense to align the graphs to the run time of the benchmark, so identical scenes would be on top of each other. IOW, time in seconds on the x-axis.
The resolution of the diagram isn't quite high enough, but I suspect these are actually _single_ frame spikes (a slow frame immediately followed by a fast frame). It doesn't really make sense that it would slow down for a second and then "compensate" for the next second (and while the resolution of the diagram might not be enough, the spikes clearly last for less than a second). Not to say you couldn't notice single-frame spikes, but it points more to a driver issue (or something else in software) than to hardware.

What this does is give a yo-yo effect when playing the game. Most of the time you will be running at 36fps, then for a second it will stutter down to 18fps. I don't know about you, but I find this hugely noticeable.
Exactly. In this case a single-frame spike means that one frame isn't synchronized, so the frame exists, but it doesn't improve subjective fluency. One spike per second means one lost frame per second (still in terms of subjective fluency), so although the game runs at 36 fps, it's subjectively only as fluent as 35 fps gameplay. I doubt a ~1 fps difference can be noticed during real gameplay.
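That arithmetic can be sketched in two lines (assuming, as the post does, exactly one wasted frame per second):

```python
# If one frame per second arrives so badly timed that it adds nothing
# to perceived fluidity, the effective rate is simply one frame lower.
rendered_fps = 36
wasted_frames_per_second = 1  # the single spiked frame, per the post
effective_fps = rendered_fps - wasted_frames_per_second
print(effective_fps)  # 35 -- a ~2.8% drop, hard to notice in play
```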
You have to look at the graph again and realize you are looking at single frame times (two frames on that graph with durations between 60 and 70ms, to be exact).

This brings us back to the microstutter argument all over again. It's there. Some people see it, some don't. The fact is, on the graph I linked, the 7970 is obviously more "messy" than either of the other two single-GPU cards it's compared against.
And to sum things up, I will ask you to think about one thing: how would the above-mentioned graph from Crysis 2 look if the Techreport guys decided to move the bar up from 50ms to, let's say, 30ms?

Yet your comments seem to show that you don't actually know what these tests mean.
That Crysis 2 test is a 90-second run. During those 90 seconds, the HD7970 exceeded the 50ms frame time (lower than 20fps, FYI) for a total of 51ms.
So to sum it up, the rendering speed of the HD7970 dropped below 20fps/50ms for 0.051 seconds, or 0.056% of the bench time.
And you don't even know what the actual rendering speed was during that time. It could be 19.99fps while the GTX580 is doing 20.01fps.
The number of "external" factors that can cause this micro-delay is so large that it's pretty much ridiculous to draw a "better or worse" conclusion from it.
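The "time beyond 50ms" arithmetic above works out as follows (using only the 90-second and 51ms figures quoted in the thread):

```python
# Techreport-style "time spent beyond 50 ms" share, with the numbers
# quoted in the post: 51 ms over the threshold in a 90 s run.
benchmark_length_s = 90.0
time_beyond_50ms_s = 0.051  # 51 ms total over the 50 ms bar
share = time_beyond_50ms_s / benchmark_length_s * 100
print(round(share, 3))  # 0.057 -- i.e. ~0.056% of the run below 20 fps
```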