AMD: R7xx Speculation

Status
Not open for further replies.
You can't compare SLI & Crossfire. Crossfire works... :p

Heh, I wish that were the case, but apparently not (at least in the current form). Take a look at the extreme fluctuations in framerate on the X2 CrossfireX:

http://enthusiast.hardocp.com/article.html?art=MTQ4OSwzLCxoY29uc3VtZXI=

Compare that to the GTX 280, which has lower maximum (and average) frame rate but much less fluctuation in frame rate:

http://enthusiast.hardocp.com/article.html?art=MTUxOCw0LCxoZW50aHVzaWFzdA==

And this page really illustrates how much less fluctuation there is in framerates with GTX 280 vs the SLI solution:

http://enthusiast.hardocp.com/article.html?art=MTUxOCw5LCxoZW50aHVzaWFzdA==

It will be very interesting to see how framerate fluctuates on 4850 Crossfire and 4870 Crossfire, but I expect to see more of the same.
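The distinction between a high average framerate and a smooth experience is easy to make concrete. A minimal sketch with made-up per-second FPS traces (not data from the [H] review): both traces share the same average, but one swings wildly and bottoms out much lower.

```python
# Two hypothetical per-second FPS traces with the same average:
# "single" is steady, "sli" swings between high peaks and deep dips.
single = [40, 42, 38, 41, 39, 40, 41, 39]
sli    = [60, 22, 58, 24, 59, 21, 57, 19]

def stats(fps):
    """Return (average, minimum, standard deviation) for an FPS trace."""
    mean = sum(fps) / len(fps)
    variance = sum((f - mean) ** 2 for f in fps) / len(fps)
    return mean, min(fps), variance ** 0.5

for name, trace in (("single", single), ("sli", sli)):
    avg, lo, sd = stats(trace)
    print(f"{name}: avg={avg:.1f} min={lo} stddev={sd:.1f}")
```

Both traces average 40 FPS, so an average-only bar chart would call them equal; the minimum and standard deviation are what separate them.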
 
It really seems you're clutching at straws; there are points where all cards suffer. The problem with the graph is you can't actually tell the drop in frame rate on the 280 because it's too hard to see the line. If you look at the first and second major drop, the 280 is right there with them. After that it's too hard to make out where the dip is.

I don't see any proof of the point you are trying to make in those graphs. The trends are pretty consistent across the board (260, 280, GTX SLI, X2, etc.). The other problem is that once you get that low in frame rates, 10 frames a sec, 5 frames a sec, it doesn't really matter, and all the cards hit this point.
 

lol, clutching at straws? How about direct quotes from the review itself:

[H]OCP said:
In this first graph we are comparing the BFGTech GTX 280 OC with the GTX 260 and 9800 GTX SLI. Once again we see GeForce 9800 GTX [SLI] producing overall better framerates, but it is also more erratic in gameplay, swinging from low to high framerates quickly. In fact it produced the lowest minimum framerates in this test at 28 FPS while the GTX 280 hit 35 FPS.

[H]OCP said:
In this first graph we are comparing the BFGTech GTX 280 OC with the GTX 260 and 9800 GTX SLI. These results are quite interesting. First we see the GeForce 9800 GTX SLI pulling in faster framerates, until we reach the 341 second mark. At this mark is where in the game we enter a heavily grassy area, where transparency supersampling is working overtime to reduce aliasing. This part of the game requires a lot of memory capacity and bandwidth. Once we reach that section in the game we see the GTX 280 and 260 surpass GeForce 9800 GTX SLI.

[H]OCP said:
In this first graph we are comparing the BFGTech GTX 280 OC with the GTX 260 and 9800 GTX SLI. You can see that mostly everything is below 25 FPS which is unplayable. It appears as if GeForce 9800 GTX SLI is pulling higher framerates, and it is, but the gameplay felt much more choppy and laggier than the GTX 280 and 260 did. SLI performance is more erratic, and you have to feel it in-game, but it definitely felt worse despite the framerate being higher.
 
[H] is terrible. Funny how, just in the last few weeks after microstuttering was revealed by OTHER sites, [H] starts finding differences in playability with SLI/CF...

If [H]'s real-world gameplay crap was all they claim, they would have been the ones breaking that news, instead of hopping on the bandwagon after other sites revealed it. In fact, real-world gameplay is EXACTLY what should have detected the problem first, but [H] never mentioned it until recently. One wonders if they actually bother to play these games.

And I'm sure it doesn't hurt that Nvidia is suddenly selling a SINGLE high-end card they want to tell you is better than dual GPU...
 

Well, politics aside ( :) ), you can't really debate the fact that they found the SLI system much choppier in framerate compared to the GTX 280. It's easy to see why SLI-based systems score so high in average framerates because they DO hit very high framerates, but there is huge fluctuation in framerate in different games which hurts the playability at those settings. Many SLI and Crossfire owners have been reporting this on their own for some time as far as I know.
 

The very link you use shows a lot of fluctuation with the GTX280. At one point the GTX280 is even lower than the multiple-GPU solution.

Of course, the GTX280 line is conveniently covered by the multi-gpu line despite it being an article about the GTX280.

http://img110.imageshack.us/img110/7664/1213329410dkpbupba8a44ldl9.gif

Note how many of the worst spikes are from the single GPU GTX260.
 
Clearly the SLI/Crossfire setups need more memory. Other than that what was so erratic about the results?

Certainly framebuffer limitations will always be an issue when comparing lower-cost cards in SLI/Crossfire vs higher-cost single-GPU cards, but there are also other issues with multi-GPU systems, such as micro-stuttering. What I'm really trying to say is that lower-cost GPUs used in SLI/Crossfire can no doubt give very high maximum (and hence average) framerates, but that doesn't always translate to better playability or better true gaming performance vs a higher-end single GPU.
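For anyone unfamiliar with the term, micro-stuttering shows up in the raw frame times rather than in per-second FPS counters: AFR rendering tends to alternate short and long frame intervals even when the averaged framerate looks fine. A rough sketch of how you could spot it (the frame-time traces here are hypothetical, not measured data):

```python
# Hypothetical frame times in milliseconds. Both traces average ~25 ms
# (~40 FPS), but the AFR trace alternates short/long intervals, which
# is the signature of micro-stutter.
smooth = [25, 25, 26, 24, 25, 25, 26, 24]
afr    = [10, 40, 11, 39, 10, 40, 11, 39]

def alternation(frame_times_ms):
    """Mean absolute difference between consecutive frame times:
    near zero for even pacing, large when frames come in bursts."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(deltas) / len(deltas)

print("smooth pacing score:", alternation(smooth))  # small
print("afr pacing score:", alternation(afr))        # large
```

An FPS counter averaging over a whole second would report the same number for both traces, which is exactly why micro-stutter went unnoticed in average-FPS reviews.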
 
The very link you use shows a lot of fluctuation with the GTX280. At one point the GTX280 is even lower than the multiple-GPU solution.

The GTX 280 has much less fluctuation than the SLI system, meaning that swings from very high to very low framerates are much less in magnitude. Also, at ~ 215 second in Crysis is the one point where the GTX 280 appears to have a min, while there are several other points where the SLI system bottoms out at a min.

Note how many of the worst spikes are from the single GPU GTX260.

Not really, other than maybe Crysis (which is super demanding on all the systems tested, SLI included). Note that the reviewer stated:

[H]OCP said:
The GeForce GTX 260 costs less than GeForce 9800 GTX SLI, and the GTX 280 will cost more. Our gameplay testing though seems to indicate that in some games, like Call of Duty 4 and Assassin’s Creed, the GeForce GTX 260 is able to provide a better gameplay experience than GeForce 9800 GTX SLI.
 
The graph you posted doesn't show that; the issues you're talking about aren't really that graphable.

Also, if you were after a smooth gameplay experience you wouldn't use those settings on any of the cards anyway. So then, depending on the exact bottleneck, "consistency" in terms of frame rates can be quite different compared to pushing the cards helter-skelter.
 

Of course the graphs I posted show this. Take a look at the apples-to-apples graphs again, and read the summary below each graph for an explanation:

http://enthusiast.hardocp.com/article.html?art=MTUxOCw5LCxoZW50aHVzaWFzdA==
 
...but that doesn't always translate to better playability or better true gaming performance vs a higher end single GPU.
Have you ever used a dual GPU? Do you even know what "micro stutter" is?
Do you ever read what Chrisray writes? Do you like AA in games at a decent res?
I will ALWAYS take a stutter over lower settings, or no AA... Go and look up what Chrisray has written on the effect of low frames and the stutter (he's spot on). So when you can keep your lows in the 30s, the stutter is a non-issue. If you go lower than that, your kit is defective.
As for Hardcopout, they are just tools and about as informative as the Inquirer, but the Inq has less spam to get to the point.
 

You aren't following the argument here. Of course I know that as long as the frame rate doesn't drop below ~30 fps at a minimum, gameplay should be fine. But do you really think SLIing two mid-range GPUs will let you run with higher AA and/or higher resolution than a high-end GPU? That's so far from the reality of the situation it's not even funny, because of the dramatic dips in framerate on multi-GPU systems with framebuffer limitations compared to much more expensive single-GPU cards. The point is not that SLI/Crossfire won't help performance vs using just one card (i.e. a GTX 280 in SLI should always be as good as or better than a single GTX 280); the point is that two mid-range cards in SLI/Crossfire won't automatically provide a better gaming experience than a high-end GPU simply because they have a higher maximum and average framerate at any particular setting.
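One way to make "dips" comparable across cards, rather than arguing over max or average, is to measure how much of a run sits below a playability floor. A minimal sketch; the 30 FPS threshold and the sample traces are purely illustrative, not taken from any review:

```python
def time_below(fps_samples, floor=30):
    """Fraction of sampled seconds spent under the given FPS floor."""
    return sum(1 for f in fps_samples if f < floor) / len(fps_samples)

# Hypothetical per-second traces: a steadier single GPU vs a
# mid-range SLI setup with higher peaks but deeper dips.
high_end = [45, 38, 36, 41, 33, 39, 44, 37]
mid_sli  = [70, 24, 66, 21, 68, 26, 64, 23]

print(f"high-end below 30 FPS: {time_below(high_end):.0%}")
print(f"mid-range SLI below 30 FPS: {time_below(mid_sli):.0%}")
```

In this made-up example the SLI trace wins on both maximum and average FPS yet spends half the run under the floor, which is the scenario being argued about here.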
 
Also, at ~ 215 second in Crysis is the one point where the GTX 280 appears to have a min, while there are several other points where the SLI system bottoms out at a min.

Um, look at the graph and see how the yellow line covers up the blue? Maybe if HardOCP would show the blue line you'd see how far it goes down.

It's an article about the GTX280, why were they so quick to cover up the line for the GTX280?
 

You can look at the apples-to-apples comparison of Crysis, and you will instantly notice that the GTX 280 never dips lower in framerate than the SLI-based system.
 

The apples-to-apples graph has precisely two points where the SLI solution dips lower than the GTX 280. At the more severe point, even the GTX dips to 5 FPS.

So the comparison is between abysmal and horrible. I'd say it really won't matter, since with either solution I'd get a jerk at that moment.

Of course, 80% of the time, SLI gives 5 FPS more than the GTX 280.

Your whole argument is utterly flawed because the GTX 280 dips catastrophically at the SAME PLACES. If the SLI solution dips to 3 FPS while the GTX 280 dips to 6, it really won't matter that the GTX 280 didn't dip as far in terms of gameplay (the SAME gameplay you keep CROWING about).
 
Why is everyone so quick to talk about the GTX 260/280 vs GX2 in a thread about AMD R7xx Speculation?
 
Of course the graphs I posted show this. Take a look at the apples-to-apples graphs again, and read the summary below each graph for an explanation:

http://enthusiast.hardocp.com/article.html?art=MTUxOCw5LCxoZW50aHVzaWFzdA==

I don't care about the summary; I'm looking at the trending between the different cards, and while you are cherry-picking benchmarks to prove your point, there is still only one game you have shown where the trend of the dual-GPU card is quite different from the other GPUs graphed. Also, they don't graph other dual-GPU solutions (e.g. the X2) for that benchmark, so you can't tell if it's a dual-GPU issue or an SLI issue.

You also failed to recognise the problem with judging consistency of frame rate from benchmarks run at settings you wouldn't play at, because of the inconsistent frame rates on any of these cards.

Drop the settings back to a level where you meet a minimum frame rate (e.g. 30 fps) and then compare cards; that would be far more meaningful for the point you're trying to prove.
 
Your whole argument is utterly flawed because the GTX 280 dips catastrophically at the SAME PLACES. If the SLI solution dips to 3 FPS while the GTX 280 dips to 6, it really won't matter that the GTX 280 didn't dip as far in terms of gameplay (the SAME gameplay you keep CROWING about).

You don't understand my point, I think. NONE of the systems tested apples-to-apples in Crysis at Very High were playable; the point was just to illustrate that the SLI system has much worse fluctuations (and lower minimums) in frame rate vs the high-end GTX. That point is hardly debatable, especially when looking at the other games tested in the review.
 