Synchronization issues with SLI and CrossFire

To be honest I've never paid much attention to this effect. :oops: But I'm glad it was raised (yet again) as I'm one of those people who've never actually used a multi-GPU setup so it's good to know.
 
Hmmm, thanks... I had always naively assumed that dumping time was close enough to rendering time to give each rendered frame relatively equal screen time. It looks like all AFR does is let the CPU burn through frames faster so that GPU A gets a new frame slightly sooner than it would if running by itself. If the difference is as great as you indicate, shouldn't this stuttering be glaringly obvious in faster-paced games?

No, because I'm really showing the most extreme example possible. The drivers themselves do try to spread out the frame completion signals somewhat, so it's not as back-to-back as in my example; it's just that in some cases they screw up.
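The extreme example above can be sketched numerically. This is a hypothetical toy model (all timings made up for illustration): two GPUs each take 33 ms per frame, but the CPU submits to GPU B only 3 ms after GPU A, so frame completions arrive in bursts while the average FPS counter still looks healthy.

```python
# Hypothetical AFR micro-stutter model -- all numbers are illustrative.
GPU_FRAME_TIME = 33.0  # ms each GPU spends rendering one frame
CPU_SUBMIT_GAP = 3.0   # ms between the CPU submitting to GPU A and to GPU B

completions = []
for pair in range(100):
    base = pair * GPU_FRAME_TIME
    completions.append(base + GPU_FRAME_TIME)                   # GPU A's frame
    completions.append(base + CPU_SUBMIT_GAP + GPU_FRAME_TIME)  # GPU B's frame

# Frame-to-frame display intervals alternate between tiny and huge.
intervals = [b - a for a, b in zip(completions, completions[1:])]
avg_fps = 1000.0 * len(intervals) / sum(intervals)

print(intervals[:4])   # [3.0, 30.0, 3.0, 30.0] -- visibly ~30 FPS cadence
print(round(avg_fps))  # ~61 -- yet the FPS counter reports roughly double
```

The point is that an FPS counter averages away the 3 ms / 30 ms alternation: the second frame of each burst buys almost no extra visual smoothness.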
 
This is not new. (I'm surprised at anyone who'd think this was something "new".)
No one was suggesting that this is a new issue.
That this issue has existed for years and isn't publicized by hardware review sites is really shocking, because it makes such a big difference.

And there are different ways of handling AFR synchronization in the nvidia drivers.

Some games respond differently. Hence the differences needed in the Nvidia profiling system, and why some standard AFR profiles won't work.
Could you please clarify what you specifically mean by that?
A lot of games were tested and all have shown the same issue, regardless of the kind of AFR mode or whether MRTs and the like are used.
 
Nope. But if you're willing to browse and try to figure out the Nvidia profiles, the hex values do mean something.

Chris
 
No one was suggesting that this is a new issue.
That this issue has existed for years and isn't publicized by hardware review sites is really shocking, because it makes such a big difference.

Could you please clarify what you specifically mean by that?
A lot of games were tested and all have shown the same issue, regardless of the kind of AFR mode or whether MRTs and the like are used.

How big of a difference, really? Will the average dude getting his SLI/CF config notice it? If so, why isn't there any user outcry of righteous annoyance with this (the SLI/CF crowd are very good at this)?

It's something that ends up being dependent on the driver screwing up as badly as Aaron showed (which certainly is possible, but with properly profiled titles should be improbable; with generic compatibility AFR profiles the story might be different) and on people actually being sensitive to it/noticing it. Now, reading a piece on the web about this might have a billion dudes saying: "Yeah, I've got that microstuttering thing going on! Clearly! Obviously! Undoubtedly!", which is mostly a placebo effect.

There are people who will certainly notice it (if it is extreme enough). There are people who are sensitive to the input lag AFR introduces. But overall, out of the small percentage with multi-GPU solutions, they seem to be the minority.

And yes, I agree with Aaron that alternatives to AFR should be explored: a) who says this isn't the case? ;) and b) there's that small issue of who'll make the first step and risk being reamed in reviews everywhere for lower framerates than those produced by the AFR-using competition.
 
How big of a difference, really? Will the average dude getting his SLI/CF config notice it? If so, why isn't there any user outcry of righteous annoyance with this (the SLI/CF crowd are very good at this)?
Because it is not obvious and most people only look at their FPS counter.
As I already said, a part of the German enthusiast community noticed the problem and raised widespread awareness of it, and the problem was even retested and confirmed by German-speaking hardware sites like here.

I cannot understand why this immense problem isn't known internationally either, and this has to change.

It's something that ends up being dependent on the driver screwing up as badly as Aaron showed (which certainly is possible, but with properly profiled titles should be improbable; with generic compatibility AFR profiles the story might be different) and on people actually being sensitive to it/noticing it. Now, reading a piece on the web about this might have a billion dudes saying: "Yeah, I've got that microstuttering thing going on! Clearly! Obviously! Undoubtedly!", which is mostly a placebo effect.

There are people who will certainly notice it (if it is extreme enough). There are people who are sensitive to the input lag AFR introduces. But overall, out of the small percentage with multi-GPU solutions, they seem to be the minority.

And yes, I agree with Aaron that alternatives to AFR should be explored: a) who says this isn't the case? ;)
Again, this was tested thoroughly and the problem is NOT dependent on driver settings, driver versions, the type of game, or whatever.
Take a look at the links provided and you will see that everyone is making the same observations.

and b) there's that small issue of who'll make the first step and risk being reamed in reviews everywhere for lower framerates than those produced by the AFR-using competition.
I don't think a fix would reduce frame rates significantly. It is only a matter of scheduling the rendering of frames.
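A rescheduling fix of this kind is often called frame pacing. Here is a hedged sketch of the idea (not any vendor's actual implementation): instead of presenting each frame the instant a GPU finishes it, hold it until its evenly spaced time slot comes up. Throughput is unchanged; only the presentation times move.

```python
# Hypothetical frame-pacing sketch -- completion times are made up.
# Bursty AFR completions: pairs 3 ms apart, arriving every 33 ms.
raw = [33.0, 36.0, 66.0, 69.0, 99.0, 102.0]
PERIOD = 16.5  # two frames per 33 ms pair -> target spacing of 16.5 ms

paced, next_slot = [], raw[0]
for t in raw:
    present = max(t, next_slot)  # can't present a frame before it exists
    paced.append(present)
    next_slot = present + PERIOD  # reserve the next evenly spaced slot

spacing = [b - a for a, b in zip(paced, paced[1:])]
print(spacing)  # [16.5, 16.5, 16.5, 16.5, 16.5] -- even cadence, same FPS
```

The same six frames are shown in the same total time, so the FPS counter is untouched; only the 3 ms / 30 ms alternation disappears (at the cost of a few ms of added latency on the early frame of each pair).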

Anyway, this problem causes tremendously lower effective performance than the frame rate figures suggest, and thus we badly need some serious media attention on this to increase pressure on ATI/Nvidia to get it fixed.
 
Oh really Captain Obvious? ;)

Well, considering you've done all this testing (or so you say), I pretty much told you where to look to manipulate this. But I have no intention of detailing every single hex value's function for you.
 
Micro-stuttering is one of the two most devastating issues for me in SLI.
The second one is the inability to use triple buffering + vsync.

In addition, even using vsync in SLI (double buffered) introduces visible motion stutter and produces visually much less pleasing results than with a single video card (at pretty much the same FPS). This is a result of the game updating the scene according to two different time intervals.

I'm quite sure it is possible to make an AFR+vsync "friendly" game, but is it worth it? The performance hit (from 60 to 40 FPS) caused by the inability to use triple buffering is almost the same as that caused by switching off one GPU.
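The 60-to-40 FPS figure can be reproduced with a rough model (illustrative numbers, not measured data): with double-buffered vsync at 60 Hz, a frame can only be shown on a vblank, so a frame that barely misses one waits a full extra refresh. Frames alternating just under and just over 16.7 ms then average out to about 40 FPS.

```python
import math

# Rough model of double-buffered vsync quantization -- numbers illustrative.
REFRESH = 1000.0 / 60.0  # ms per vblank at a 60 Hz refresh rate

def vsync_interval(render_ms):
    # The frame's display interval rounds UP to a whole number of refreshes.
    return math.ceil(render_ms / REFRESH) * REFRESH

# Render times alternating just under / just over one refresh period:
renders = [15.0, 18.0] * 30
shown = [vsync_interval(r) for r in renders]
fps = 1000.0 * len(shown) / sum(shown)

print(round(fps, 1))  # ~40 FPS, even though raw render times suggest ~60
```

Triple buffering avoids the round-up by letting the GPU start the next frame immediately, which is why losing it in SLI hurts roughly as much as losing a GPU.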
 
Still said:
I cannot understand why this immense problem isn't known internationally either, and this has to change.
Tempest in a teapot. There are far larger problems internationally than some alleged AFR stuttering issue.

You can always force SFR to be used, if you believe that AFR doesn't work.
 
Tempest in a teapot. There are far larger problems internationally than some alleged AFR stuttering issue.

You can always force SFR to be used, if you believe that AFR doesn't work.



Yup. I see this mentioned at slizone like, uhh, never. People who know they don't like multi-GPU typically don't buy multi-GPU in the first place. We all know AFR isn't perfect all the time. But it's not like every alternated frame does this.
 
Tempest in a teapot. There are far larger problems internationally than some alleged AFR stuttering issue.
Yeah, I was absolutely saying that this is more important than Reuters headlines! :cool:

First of all, if you don't believe the "allegations", then test it on your own. It only takes 5 minutes to do so.
And yes, this issue is so important for the related market that it should be known worldwide by people who care about these products.
Or don't you agree that someone considering buying a second graphics card would like to know that most of the proclaimed performance gains are wasted in reality?

You can always force SFR to be used, if you believe that AFR doesn't work.
SFR is mostly useless because of bad performance.


About the "royal plural": there is simply more than one person behind the observations/concerns and posts here. ;)
 
Anyone else having trouble reconciling this?

Because it is not obvious ... I cannot understand why this immense problem
[emphasis added]

If it's not obvious to users (to the extent that most users don't even notice it), how can it be an immense problem?

I'll be the last to claim AFR is goodness -- single fast GPU for me, thank you very much -- but give me a break. If Still had posted a link to his website, I'd write this off as a sad attempt to get page hits.
 
Well, considering you've done all this testing (or so you say), I pretty much told you where to look to manipulate this. But I have no intention of detailing every single hex value's function for you.
I have read the nvidia control panel API paper and know what these settings are about. And no - they don't have an influence on the problem (as I already said).
 
I have read the nvidia control panel API paper and know what these settings are about. And no - they don't have an influence on the problem (as I already said).

You are wrong. You clearly don't understand what the hex values are doing. (Hint: they do much more than control the type of SLI rendering mode (AFR/SFR).) They have an effect on how the driver handles frame workload and syncing. Nvidia's control panel API paper does not go into detail on its various SLI compatibility bits, because these bits are not even accessible from the control panel.

Anyway. I'm done here.
 