Synchronization issues with SLI and CrossFire

But see, this is what I keep mentioning -- if you're getting a good framerate, the "effect" of both lag and stutter is inconsequential. Isn't that the entire reason you run multiple GPUs to begin with?

So now I'm back at my question: I want to know WHY I'm wrong in saying that this minuscule amount of lag is no worse than network lag. I saw someone's hypothetical scenario above, but that didn't really make sense to me. Basically, I don't understand how you can be "frames ahead" of the input system, unless the game code sucks for shit.
 
We need to talk in terms of ms instead of frames when it comes to input lag. For me, anything greater than 50ms is unacceptable. How much input lag are we talking about here?
 

But therein lies the problem. The "lag" is entirely dependent on frame rendering, so if you're behind by two frames (I STILL don't understand how this works, but whatever) then it would be:

2 frames lag at 100FPS = 2ms lag

2 frames lag at 20FPS = 100ms lag

All of this stutter and input lag we're talking about really is measured in frames, which are dependent on how FAST the frames are being rendered. Which is why I'm still completely lost as to why it's such a "huge" deal...

If doubling my video cards boosts my framerate from 60fps to 100fps, then that single extra frame worth of lag really only amounts to 1ms. Now I understand that dips in framerate will of course make that lag worse (as one frame equates to a longer time span at lower framerates), but I'm still not seeing it.
 
I haven't been following this problem. Is this lag only a random lag that crops up for one frame every now and then, or is it a persistent lag? I ask because a 20ms lag would still be a big problem to me if it were persistent.

2 frames lag at 100FPS = 2ms lag

2 frames lag at 20FPS = 100ms lag

1/5 the frame rate gives 50x the lag? Check the math: 2 frames at 100FPS is 20ms, not 2ms, and 2 frames at 20FPS is 100ms. So it's 20ms vs. 100ms - 5x the lag for 1/5 the frame rate.
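To make that correction concrete, here is a minimal sketch of the frames-to-milliseconds conversion (plain Python; the function name is my own illustration):

Code:
def frames_to_ms(frames_of_lag, fps):
    """Convert a lag measured in frames into milliseconds at a given frame rate."""
    frame_time_ms = 1000.0 / fps  # one frame lasts 1000/fps milliseconds
    return frames_of_lag * frame_time_ms

print(frames_to_ms(2, 100))  # 20.0 ms (not 2 ms)
print(frames_to_ms(2, 60))   # ~33.3 ms
print(frames_to_ms(2, 20))   # 100.0 ms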
 
i *LOVE* Micro stutter; that was MY own issue .. that the second GPU *always* makes it better overall for me; but you have to realize my 20" LCD is "only" 16x10 and i use HD2900 Crossfire AA [which is not AFR] so i NEVER see micro stutter - except with Crysis
. . . and i play Crysis on my single 8800GTX at 10x7 .. all details 'Very High' [except for Shadows, 'High'] and MS never bothers me

i don't want to say "anal", but some people [imo] are just easily annoyed .. especially the guys with 30" LCDs struggling with Tri-SLI or Quad CrossFire
 
Make sure that you "strain" the graphics system ... if you are getting 50 FPS it is not very noticeable

it is a momentary "stutter" and rather different from input lag [although most people really don't see it either]

CNC3, which runs at a capped 30FPS, doesn't show it - but that might have something to do with the framerate being capped.
 
I thought I'd go ahead here and display UT3 latency on 9800GX2 cards from 1 GPU to 4 GPUs, to give you guys a better idea of what's going on and what increasing the amount of AFR is doing. I took only 100 frames in this case and used the UT3 Deimos map with 16xAA/16xAF.
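(For anyone who wants to reproduce this kind of capture, here is a sketch of per-frame timestamp logging in Python. render_frame is a hypothetical stand-in for one iteration of a game's render loop; this is not the tooling actually used for the numbers below.)

Code:
import time

def capture_frame_times(render_frame, n_frames=100):
    """Record the time (ms since start) at which each frame completes,
    in the same cumulative format FRAPS-style frametime logs use."""
    start = time.perf_counter()
    stamps = []
    for _ in range(n_frames):
        render_frame()  # hypothetical: one iteration of the render loop
        stamps.append((time.perf_counter() - start) * 1000.0)
    return stamps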

I own an 8800GT SLI setup and I think it's worthless.

Regardless of that, I'd like to interpret the AFR results ChrisRay posted.

First of all, I want to point out that for the 100 frames Chris posted, the actual FPS of one-way AFR is 41, two-way 57, three-way 56.5, and four-way 59.9... not the FPS values he posted on the graph. I think he ran a much longer test, obtained those FPS values from it, and just displayed the first 100 frames - which give totally different FPS values.

Anyway, the stuff that matters:

Let's look at the first 16 frames of 1-way and 2-way. 1-way up to that point is generally consistent, with ~29ms of frame time difference, which corresponds to a "practical" 35FPS. And when we look at the time data, we see that the 16th frame is rendered at the 441st millisecond, which makes the "official" FPS 36.2. That's good.

For the 2-way case, we see that the 16th frame is rendered at the 276th millisecond, which makes the official framerate 58FPS up to that point. However, looking at the frame time differences, for every third frame we see a gap of roughly 21ms, which translates into 47 practical frames per second. AND IT FEELS WORSE THAN AN ACTUAL 47 FRAMES PER SECOND, because the frame times are not consistent and this causes stuttering. I'd rather play a consistent 40FPS than an inconsistent "official" 58FPS.

I'm not even bothering to talk about the 3-way and 4-way results posted, because they aren't even officially better than the 2-way result (3-way is actually worse than 2-way). However, given the frame rates of the posted 100 frames, these 100 frames were probably the more stressful part of the benchmark (the graphs showing the completed bench are much higher), and in the stressful part of the benchmark (i.e. where it matters) 3-way and 4-way didn't perform better than 2-way. Which says a lot.
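The "official vs. practical" distinction above can be computed straight from a frametime log. A sketch, assuming cumulative completion times in milliseconds as in the capture snippet earlier:

Code:
def official_vs_practical_fps(stamps_ms):
    """stamps_ms: cumulative frame completion times in ms, one per frame.
    'Official' FPS averages over the whole run; 'practical' FPS is what
    the largest single frame-to-frame gap implies you actually perceive."""
    gaps = [b - a for a, b in zip(stamps_ms, stamps_ms[1:])]
    official = len(stamps_ms) / (stamps_ms[-1] / 1000.0)
    practical = 1000.0 / max(gaps)
    return official, practical

# 1-way example: 16 frames by the 441st ms, even ~29ms gaps
#   -> ~36 FPS official, ~34 FPS practical (consistent, feels fine)
# 2-way example: 16 frames by the 276th ms, but every third gap ~21ms
#   -> ~58 FPS official, ~47 FPS practical (inconsistent, feels worse)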
 

The only thing i really have to ask you about is this part:

I own an 8800GT SLI setup and I think it's worthless.

in other words, you are SO bothered by the "stuttering" that you cannot appreciate the gain from 40 to nearly 60 FPS?

Some of us are not bothered - i see it, yet i much prefer the higher IQ settings and greater detail - but then i only play at 16x10 or 16x12.

What resolution do you play at? It appears to me that the higher the resolution you need, the more irritating the MS appears to be - at least that is an observation i have noted but not confirmed.

Now i have 2900XT Crossfire and use Crossfire to enable filtering, which is no longer AFR at all - so i never see MS unless i am playing Crysis at 0xAA; then it is irritating. So i play Crysis on my 19" CRT at 11x8 with filtering applied

Also, i have an 8800GTX which IS a bit slower than my 2900XT Crossfire; but whenever possible, i also prefer a single GPU
 

It's not 60FPS, look closer. If the frame time differences are 10,10,20,10,10 you're seeing it as a very stuttery 20ms. I can't appreciate an FPS upgrade from a normal 40FPS to what I perceive as a stuttering 48FPS.
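Putting numbers on that example (a quick sketch; "perceived" follows the post's premise that the eye keys on the longest gap):

Code:
frame_times_ms = [10, 10, 20, 10, 10]  # the sequence from the post

average_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
perceived_fps = 1000.0 / max(frame_times_ms)  # dominated by the 20ms hitch

print(round(average_fps))    # ~83 FPS on paper
print(round(perceived_fps))  # 50 FPS worth of smoothness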

The results of benchmark programs are totally irrelevant; and it's not only laughable but also so common to compare a non-AFR result to an AFR one, like "adding a second 8800GT bumped the FPS from 40 to 60!" Yes, it did, but FPS is irrelevant when AFR is involved.

I play at 16x10 and stuttering is common in most games. If higher resolution indeed makes this problem worse, then it can only be bad for SLI, because the only reason to get SLI today is if you have a 22" or better monitor. I'd dare say not even 22", but 24"; because apart from Crysis there are no games that would stress a G92 (or 8800 GTX) at 1680x1050.

Some games play nice (for example FEAR) but some others do suck. For example, in CS:Source when I enable SLI, frame rates are through the roof, 200+ most of the time, and it's smooth. But when there are some player models on the screen, the FPS naturally goes down to 150, and it feels like I'm playing at 40 or 50 FPS. The difference between 200FPS and 150FPS is that obvious, on a monitor that can support only 60Hz. When I disable SLI, frame rates can go down as low as 80 or 90, but it still doesn't bother me as much.
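By the same longest-gap logic as above, here is a hedged sketch of how a "150 FPS" reading can feel like far less; the gap values are invented for illustration, not measured from CS:Source:

Code:
# AFR-style "150 FPS": two frames arrive almost together, then a long gap.
sli_gaps_ms = [2.0, 11.3] * 50   # averages ~6.7ms/frame, i.e. ~150 FPS
single_gaps_ms = [11.8] * 85     # an even ~85 FPS without SLI

for gaps in (sli_gaps_ms, single_gaps_ms):
    average = 1000.0 / (sum(gaps) / len(gaps))
    perceived = 1000.0 / max(gaps)   # the long gap sets the perceived rate
    print(round(average), round(perceived))
# -> 150 on paper but ~88 perceived, vs. 85 on paper and 85 perceived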

I won't even talk about the monstrosity that is Assassin's Creed, which becomes practically unplayable below frame rates of 50. What the hell? With one 8800GTX I played and finished Crysis with frame rates that never went above 50 (and were mostly in the 30-35 range), with super-responsive controls and totally comfortable gameplay.

And, interestingly, my former graphics card was a G92 GTS and I played Crysis at 16x10, all High. When I sold that and got two GTs, I thought I'd be able to bump the graphics up a lot more. Looking at the FPS, I could. But the overall feel of the game sucked! And I had to bring all the details back down to High, again (I only enabled sunshafts). Bottom line: a second G92 gives you only sunshafts in Crysis?

If you're good with your multi-GPU setup, then it's good. But it isn't for most people, who don't even know "it's not OK" because the only thing they look at is the irrelevant AFR frame rate numbers.
 
Now i have 2900XT Crossfire and use Crossfire to enable filtering, which is no longer AFR at all
Hubba-wa? It isn't, what is it then?

Also, i have an 8800GTX which IS a bit slower than my 2900XT Crossfire; but whenever possible, i also prefer a single GPU
What nVidia drivers are you using in XP? I've got a bit of a similar thing right now (3870s Crossfire and a monster 8800 GTS) and I've been having trouble finding a decent XP driver that lets me force things through CP. I keep seeing a damned mipmap line using the 8800 in too many games. :???:

Sorry if it's OT, but you seem like someone to ask about that. :)
 

i have to agree with the way you describe it. i have zero experience with SLI, however, and you can see that i run all of my games - except Crysis - at 16x10 [20.1" Samsung 205BW], usually with maximum Crossfire filtering applied. However, i am not getting AFR with CrossFire filtering, so i usually love the improved IQ and details. Crysis, however, is a stuttering mess at 0xAA; so i drop my resolution down to even 8x6 [on my 19" Samsung 955DF CRT, which still looks nice!].

Now there are guys [like nVRollo] who simply refuse to acknowledge it and say it is always better with SLI; that is clearly nonsense, but some people also never seem to be much bothered by it. Just like mouse input lag kills some people and others get by. i am thinking it is all compromise.

Now, here is my question: the creator of nHancer suggested a driver fix - adding a 2ms delay to the first card [i think]. Is that a reasonable fix?
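For context, that suggestion amounts to frame metering: holding frames back slightly so they are presented at an even cadence. A toy sketch of the concept (my own illustration, not nHancer's or the driver's actual mechanism):

Code:
def meter_frames(completion_ms):
    """Delay presentation so frames appear at an even average cadence,
    trading a few ms of extra latency for consistent pacing."""
    gaps = [b - a for a, b in zip(completion_ms, completion_ms[1:])]
    pace = sum(gaps) / len(gaps)  # target interval: the average gap
    present = [completion_ms[0]]
    for done in completion_ms[1:]:
        # never show a frame before it's finished, nor ahead of the pace
        present.append(max(done, present[-1] + pace))
    return present

# Uneven AFR delivery at 0,21,28,49,56,... ms (21/7ms gaps) comes out
# at an even ~14ms cadence, at the cost of a few ms of added lag.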

Hubba-wa? It isn't, what is it then?
i was under the impression it wasn't pure AFR. There are 4 modes i can evidently choose from: AFR, SuperTile, Scissor and SuperAA. Maybe someone can better explain it to me if i am lost; when i apply maximum filtering, the MS becomes unnoticeable to me compared to without it. =P
What happens when you apply SLI-AA? It isn't AFR either, is it?

this is old, but i think it is still applicable:

http://www.xbitlabs.com/articles/video/display/ati-crossfire-x1000_3.html
 
Actually, the recorded benchmark I used was just the one I use for all my UT3 benchmarking. So yes, the frame counter goes much further, as do the rendered frames. I thought it was pretty obvious I wasn't including the whole thing, because the post would have been a mile long of frame spam. There's nothing particularly malicious about that, other than trying to save some forum space for the few people who really care to read through all that. If I included all the frames, I would have needed several posts' worth of character space. All I was doing was demonstrating the typical frame distribution in UT3 and the overall results I got, without spamming the forums with a mile-long page of data.

Like I said before, if anyone thinks they can do a better job of obtaining a 100% accurate benchmark with the exact amount of frames rendered in a given second, then by all means, be my guest. I've certainly got no problem with that. I am not dedicating that amount of time to something I consider pretty much a non-issue. I have much bigger things on my table right now to deal with, hardware-wise.

apoppin: SLIAA will behave like a single GPU in frame syncs.
 
Now, here is my question: the creator of nHancer suggested a driver fix - adding a 2ms delay to the first card [i think]. Is that a reasonable fix?


i was under the impression it wasn't pure AFR. There are 4 modes i can evidently choose from: AFR, SuperTile, Scissor and SuperAA. Maybe someone can better explain it to me if i am lost; when i apply maximum filtering, the MS becomes unnoticeable to me compared to without it. =P
What happens when you apply SLI-AA? It isn't AFR either, is it?


To the first question, no: the load-balancing process is dynamic for a reason.

SuperTiling and Scissor are no longer used in current drivers AFAIK; you either get compatible AFR, or properly profiled AFR once the game gets handled by the driver team.

SuperAA is not AFR, obviously. It's also not feasible for newer titles that don't work with CP-forced AA.

What is this Crossfire filtering thing you speak of?
 

Have you ever tried SLI-AA?

According to ChrisRay:
apoppin: SLIAA will behave like a single GPU in frame syncs.

Crossfire AA also behaves like a single GPU in the frame syncs.
 
He's right. Control-panel-forced AA modes don't work really well with games that don't support control panel AA. Lately I find 75% of games have their own in-game AA options. What I am glad about is that they are supporting CSAA.
 

And what exactly did i say? It's not AFR. It's same-frame rendering: both GPUs render the same frame with 4x AA, the results are slightly offset to prevent sample overlap, then combined, and voila, you have 8x SuperAA/SLIAA. What frame syncing do you want to have there? Why?
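A toy sketch of that combining step (illustration only; the real resolve happens in the driver/hardware). Each GPU's 4x-AA output is the same frame with a sub-pixel-offset sample grid, so a per-pixel average yields 8 effective sample positions:

Code:
def super_aa_combine(gpu_a, gpu_b):
    """gpu_a, gpu_b: the same frame rendered with 4x AA on each GPU, with
    sample grids offset so positions don't overlap; averaging resolves to
    an 8x result. Nothing has to be kept in sync across frames, which is
    why this mode paces like a single GPU."""
    return [[(a + b) / 2.0 for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(gpu_a, gpu_b)]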
 


excuse me, did you not ask me this or was it rhetorical?

What is this Crossfire filtering thing you speak of?

OK .. now, i know it is not AFR; were we misunderstanding each other?
--and have you tried SLI-AA?

i cannot; i have no SLI MB, but i will not play a game without Crossfire AA if i can help it - even if it means dropping the resolution and playing on my CRT.
 

There is no CF filtering, sigh. SuperAA is SuperAA. Or is the filtering thing like the SMIC one or the Tesla 2.0 one? In which case i'm sorry for having interfered.
 