(if you're not Chris Ray, PLEASE read and don't disregard this!)
Synchronization is a BIG issue. The more I dig into it, the worse it gets. I don't know why people keep saying "Well, it's an issue, but it doesn't hurt much! And I'm willing to sacrifice that much for such high frame rates!!!"
It's not an "issue" like, say, "some artifacts on the screen while playing" that you can choose to put up with in exchange for smoother frame rates.
It IS directly your frame rate.
We ran a lot of tests with different games in SLI:
Assassin's Creed, BioShock, Call of Juarez, Lost Planet and Crysis.
AC and BioShock aren't hampered by bad sync issues at all. (There is such an issue in AC's Kingdom area, but it's there even with a single 8800GT, and it makes the game unplayable below 40 FPS there, which sucks. BTW, a single 3870 doesn't have that issue at all. Yay NVidia!)
The rest, however... make nearly all SLI gains worthless.
Now, before I publish some of the results, I want to elaborate a bit on the "practical" frame rate concept. It's not the FPS shown in FRAPS. Let me give an example from Call of Juarez:
Code:
 #   Frame time (ms)   Difference (ms)
 7        171.979            48
 8        187.469            15
 9        229.284            42
10        244.826            16
11        287.247            42
12        303.01             16
13        346.28             43
14        361.656            15
Here exactly 8 frames have been rendered in about 0.19 seconds, which corresponds to 42 FPS. But as far as human perception goes, the general sense of fluidity naturally depends on the most delayed frame in the "vicinity". You don't see this as 42 FPS: every second frame takes about 45 ms. So, yes... you see this entire scene at a practical frame rate of 1000 / 45 = 22 FPS, with additional stuttering thrown in because every other frame arrives after a much shorter delay.
Yes. Only 22 FPS. While we were benchmarking there was just no way we could believe what we were seeing was being displayed at 40-50 FPS. It wasn't fluid at all.
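If you want to check the arithmetic yourself, here's a quick Python sketch using just the frame times from the excerpt above (the rounding comes out a hair different from the figures I quoted, but the point is the same):
Code:
# Frame times (ms) for frames 7-14 of the Call of Juarez excerpt above
frame_times = [171.979, 187.469, 229.284, 244.826, 287.247, 303.01, 346.28, 361.656]

# "Difference" column: each frame's delay vs. the previous frame (ms)
deltas = [b - a for a, b in zip(frame_times, frame_times[1:])]
print([round(d) for d in deltas])               # [15, 42, 16, 42, 16, 43, 15]

# Momentary FPS of each frame = 1000 / its delay
print([round(1000 / d) for d in deltas])        # [65, 24, 64, 24, 63, 23, 65]

# FRAPS-style average over the excerpt vs. the rate the slow frames dictate
print(round(1000 * len(deltas) / sum(deltas)))  # ~37 FPS "on paper"
print(round(1000 / max(deltas)))                # ~23 FPS is what it actually feels like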
Lost Planet sucked as well; this time it looked more like this:
Code:
 #   Frame time (ms)   Difference (ms)
48        753                10
49        769                16
50        784                15
51        790                 6
52        814                24
53        832                19
54        838                 6
55        859                21
56        877                18
57        881                 4
58        906                25
59        921                15
60        928                 7
This time the frame rate seemed to spike once every three frames, so the corresponding momentary frame rates went something like 174-50-45-195-40-50-170-45-40 and so on. Of course nobody needs to be told that the perceived frame rate in that scenario is somewhere near 40. Yet with 13 frames rendered in about 0.17 seconds, FRAPS shows us an FPS of nearly 75, because every third frame is rendered with an unnecessarily short delay.
Crysis performed like a regular game that sucks in SLI (which makes it a regular game overall): the frame rate dips (and spikes) every second frame, just like CoJ. Here's an example from Crysis:
Code:
 #   Frame time (ms)   Difference (ms)
 6         94                26
 7        107                13
 8        131                25
 9        144                13
10        168                24
11        184                15
12        207                23
13        220                14
14        244                24
15        258                14
Here FRAPS tells us we're seeing ~60 FPS, but we don't believe its lies, because we know what we actually perceive is a stuttering scene rendered at closer to 40 FPS.
So how can we "measure" this "practical" frame rate? By equating the "momentary FPS" (1000 / frame time difference) of each frame to the minimum momentary FPS in its vicinity (the vicinity being three frames). This might look like a brutal way of calculating FPS, but it's actually an accurate measure of the sense of fluidity in a scene. If two frames are rendered at 50 FPS and the frame next to them at 250 FPS, that 250 FPS frame adds absolutely nothing to the fluidity of the game. If the momentary FPS values go 50-100-50-100-50-100, you see the entire scene at near 50 FPS. Sorry, but it's the truth.
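For anyone who wants to run this on their own FRAPS frametime logs, here's a rough Python sketch of the calculation as I just described it (the function names are mine, and the handling of the very first and last frames of a log is my own guess; the rest is exactly the rule above):
Code:
def momentary_fps(frame_times_ms):
    """1000 / each frame's delay (its difference to the previous frame time)."""
    return [1000.0 / (b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

def real_fps(frame_times_ms, vicinity=3):
    """Pull every frame down to the minimum momentary FPS in its 3-frame vicinity;
    a 250 FPS frame sitting next to 50 FPS frames adds nothing you can actually see."""
    fps = momentary_fps(frame_times_ms)
    half = vicinity // 2
    return [min(fps[max(0, i - half): i + half + 1]) for i in range(len(fps))]

def summary(frame_times_ms):
    """Benchmark-style average FPS (frames / elapsed time) vs. the average "real" FPS."""
    elapsed_s = (frame_times_ms[-1] - frame_times_ms[0]) / 1000.0
    bench_avg = (len(frame_times_ms) - 1) / elapsed_s
    real = real_fps(frame_times_ms)
    return round(bench_avg), round(sum(real) / len(real))
Feeding the ten Crysis frame times from the excerpt above through summary() gives roughly 55 vs. 42, the same kind of gap as the full-run averages below.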
With each frame assigned a so-called "real" FPS this way, I couldn't help but average all the "real" FPS values we got. Here's what I got:
Crysis:
Avg. Benchmark FPS: 38
Avg. "Real" FPS: 27
Call of Juarez:
Avg. Benchmark FPS: 37
Avg. "Real" FPS: 25
Lost Planet:
Avg. Benchmark FPS: 63
Avg. "Real" FPS: 47
Notice a pattern? Yeah. A ratio of near 1.4, which is a little less than what we usually get from the second card. (The tests were performed at 1440 and 1680 res., and at those resolutions a second card normally gives you somewhere between 1.4x and 1.5x the average FPS.) So I'd conclude that in those games, a second card brings you only a little more than nothing. Which is exactly what I experienced with Crysis: I bought a second G92, and all it really got me was the ability to turn on the nice "sunshafts" IQ setting.
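Spelled out with the averages above (plain arithmetic on the numbers I just listed):
Code:
# Benchmark FPS divided by "real" FPS for each of the stuttering games
print(round(38 / 27, 2))   # 1.41 - Crysis
print(round(37 / 25, 2))   # 1.48 - Call of Juarez
print(round(63 / 47, 2))   # 1.34 - Lost Planet
# All right around the ~1.4x-1.5x scaling a second card normally buys at these resolutions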
For the other games, where the sync issue didn't come into play:
Assassin's Creed:
Avg. Benchmark FPS: 53
Avg. "Real" FPS: 53
BioShock:
Avg. Benchmark FPS: 106
Avg. "Real" FPS: 102
See, when the frame times are nicely distributed, this method of calculating isn't brutal at all.
Edit: I'm happy to give the frame time spreadsheets to anyone who wants them. They'll reach the same result.