The only thing I really have to ask you about is the bolded part:
In other words, you are SO bothered by the "stuttering" that you cannot appreciate the gain from 40 to nearly 60 FPS?
Some of us are not bothered - I see it, yet I much prefer the higher IQ settings and greater detail - but then I only play at 16x10 or 16x12.
What resolution do you play at? It appears to me that the higher the resolution, the more irritating the MS becomes - at least that is an observation I have noted but not confirmed.
Now I have 2900XT Crossfire and use Crossfire to enable Filtering, which is no longer AFR at all - so I never see MS unless I am playing Crysis with no AA; then it is irritating. So I play Crysis on my 19" CRT at 11x8 with Filtering applied.
Also, I have an 8800GTX, which IS a bit slower than my 2900XT Crossfire; but whenever possible, I also prefer a single GPU.
It's not 60 FPS; look closer. If the frame times are 10, 10, 20, 10, 10 ms, what you actually see is a stuttering 20 ms frame. I can't appreciate an FPS upgrade from a smooth 40 FPS to what I perceive as a stuttering 48 FPS.
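To put rough numbers on that, here is a minimal sketch using the illustrative frame times above; the "perceived" figure simply assumes the longest frame in the burst sets how smooth it feels, which is a simplification, not a real measurement tool:

```python
# Average FPS vs. what uneven frame pacing feels like.
# Illustrative only: assumes the slowest frame in the sequence
# dominates the perceived smoothness.

frame_times_ms = [10, 10, 20, 10, 10]  # example frame times from the post

average_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
perceived_fps = 1000.0 / max(frame_times_ms)

print(f"Average FPS a benchmark would report: {average_fps:.0f}")   # ~83
print(f"FPS the long frames actually feel like: {perceived_fps:.0f}")  # 50
```

The benchmark average looks great on paper, but the long frames are what your eyes track.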
The results of benchmark programs are totally irrelevant here; and it's not only laughable, but also all too common, to compare a non-AFR result to an AFR one, like "adding a second 8800GT bumped the FPS from 40 to 60!" Yes, it did, but FPS is meaningless once AFR is involved.
I play at 16x10 and stuttering is common in most games. If higher resolution indeed makes the problem more pronounced, that can only be bad for SLI, because the only reason to get SLI today is if you have a 22" or larger monitor. I'd dare say not even 22" but 24", because apart from Crysis there are no games that would stress a G92 (or 8800 GTX) at 1680x1050.
Some games play nice (FEAR, for example) but others suck. In CS:Source, when I enable SLI, frame rates go through the roof, 200+ most of the time, and it's smooth. But when a few player models appear on screen, the FPS naturally drops to 150, and it feels like I'm playing at 40 or 50 FPS. The difference between 200 FPS and 150 FPS is that obvious, on a monitor that can only do 60Hz. When I disable SLI, frame rates can drop as low as 80 or 90, but it still doesn't bother me as much.
I won't even talk about the monstrosity that is Assassin's Creed, which becomes practically unplayable below 50 FPS. What the hell? With one 8800GTX I played and finished Crysis at frame rates that never went above 50 (and were mostly in the 30-35 range), with super-responsive controls and totally comfortable gameplay.
And, interestingly, my previous graphics card was a G92 GTS, and I played Crysis at 16x10, all High. When I sold it and got two GTs, I thought I'd be able to bump the graphics up a lot more. Looking at the FPS, I could. But the overall feel of the game sucked! I had to bring all the details back down to High again (I only kept sunshafts enabled). Bottom line: a second G92 gets you nothing but sunshafts in Crysis?
If you're happy with your multi-GPU setup, then good for you. But that's not the case for most people, who don't even know "it's not OK", because the only thing they look at is the irrelevant AFR frame rate numbers.