trinibwoy said:
WaltC,
Regarding LCD's they already have enough issues when it comes to gaming without using SLI as some kind of detractor. For best IQ they need to be run at their native res and then there is considerable ghosting on all but the best LCD's. And what about single card configurations that can do 16x12 but are limited by the native res on the LCD - this restriction is not specific to SLI, it also applies to the PE's and Ultra's out there. I have a friend who bought a vanilla 6800 because his monitor is best at 10x7. You have to realize that many of your arguments apply to all levels of GPU hardware - not just high-end vs SLI.
What is a restriction specific to SLI, though, is that the justification for purchasing SLI is limited to the increase in frame rates at gaming resolutions *above* 1024x768 when contrasted with a single card running at those same resolutions.
So, for instance, if your friend's monitor doesn't operate to his satisfaction at 1152x864 or higher, and your friend does not want to replace his monitor, then you surely would not recommend that he buy SLI under any circumstances, correct? What would be the point? Likewise, if someone owns a 1024x768 LCD and has no inclination to change monitors, then recommending SLI for him, too, would also be something you'd want to avoid.
As well, I'd hope you wouldn't think that a 6800U is no faster in terms of frame rates at 1024x768 than a 6800 is at 1024x768--because I do believe it is...
As pointed out in my last post speaking to this, the justification for buying SLI in terms of frame rates is the *opposite* of the frame-rate justification one uses when buying a single card.
With a single card the peak frame rates occur at the lower resolutions and decline as resolution rises. With SLI, however, at 640x480 and 800x600, and sometimes even at 1024x768, the system is often *slower* than a single card, or else so close that SLI's frame-rate advantage disappears. So the justification for SLI over a single 3d card is found in frame rates, but *only* at resolutions above 1024x768.
Anecdotally, I installed an x800xt to replace a R9800P, and the frame-rate performance difference at *all* resolutions, including the lower ones, is obvious. But such is not the case with SLI, is it, because of the dynamics of what an SLI environment actually is in comparison with a single 3d-card environment. The advantages of SLI over a single card manifest at resolutions above 1024x768, and so that is why I said that choosing SLI may involve the necessity of changing monitors for reasons that are entirely different from single-card monitor-environment considerations.
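Just to put a back-of-the-envelope sketch on that reasoning (the per-frame costs below are made-up illustrative numbers, not benchmarks of any real card or game): when the CPU sets the frame time, as it tends to do at the lower resolutions, halving the GPU's share of the work buys nothing, and any SLI overhead actually costs a little.

# Illustrative model only -- CPU_MS, GPU_MS_PER_MPIXEL and the 0.5 ms SLI
# overhead are assumptions for the sake of the example, not measurements.
RESOLUTIONS = {
    "640x480":   640 * 480,
    "800x600":   800 * 600,
    "1024x768":  1024 * 768,
    "1280x1024": 1280 * 1024,
    "1600x1200": 1600 * 1200,
}
CPU_MS = 8.0               # per-frame CPU cost, roughly independent of resolution
GPU_MS_PER_MPIXEL = 9.0    # per-frame GPU cost per million pixels rendered

def fps(pixels, n_gpus=1, sli_overhead_ms=0.5):
    gpu_ms = GPU_MS_PER_MPIXEL * pixels / 1e6 / n_gpus
    overhead = sli_overhead_ms if n_gpus > 1 else 0.0
    # The frame is gated by whichever of the CPU or the GPU finishes last.
    return 1000.0 / (max(CPU_MS, gpu_ms) + overhead)

for name, px in RESOLUTIONS.items():
    print(f"{name:>9}: single {fps(px):6.1f} fps, SLI {fps(px, 2):6.1f} fps")

In this toy model the SLI rig is actually a hair slower than the single card up through 1024x768 (CPU-bound plus overhead) and only pulls away at 1280x1024 and 1600x1200, which is exactly the pattern being described above.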
When it comes to CRT's you must have very sensitive eyes. The only games I have ever experienced tearing on are Mafia and Freedom Fighters. I have NEVER seen it on an FPS so vsync-off is my default setting. I'm sure most others here would agree.
I think you might well be surprised to see a poll of how many people prefer to run with vsync on...
As I've stated, I certainly do. It isn't just that people's eyes are different, which is true; it's that people's IQ preferences are different, and that people's *monitors* are different, too.
For example: on a 21" CRT at 1600x1200 the pixels are larger than they are on a 19" CRT at 1600x1200, so things not noticeable on the 19" CRT might well stick out like sore thumbs on the 21" simply because it's easier to see them on the larger monitor. The same is true for 19" CRTs compared with 17", and so on. Then there are things like dot pitch/aperture grille, etc., which can differ widely among monitors. As for IQ preferences: some people run games all day on low-quality mipmap settings and don't care--I won't use anything but the highest-quality settings, etc.
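To put a number on the pixel-size point (the viewable diagonals here are typical figures I'm assuming for these nominal tube sizes, not the spec of any particular monitor):

# Rough pixel-width comparison at a fixed resolution; viewable sizes are assumed.
VIEWABLE_INCHES = {'21" CRT': 20.0, '19" CRT': 18.0, '17" CRT': 16.0}
H_PIXELS = 1600            # horizontal resolution used on every monitor

for name, diag in VIEWABLE_INCHES.items():
    width_mm = diag * 25.4 * 4 / 5   # 4:3 tube: width is 4/5 of the diagonal
    print(f"{name}: ~{width_mm / H_PIXELS:.3f} mm per pixel at 1600x1200")

That works out to roughly 0.25 mm per pixel on the 21" versus about 0.23 mm on the 19" and 0.20 mm on the 17", so the same artifact is simply physically bigger on the larger tube.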
Vsync-off tearing most certainly exists--*seeing it* however is another matter and depends on the variables mentioned...
If I couldn't see it so well in my own environment I'd have no reason to run with vsync on, would I? Indeed, if tearing wasn't a visible issue then 3d card makers would have no incentive for ever turning vsync on, would they? Let alone providing controls to switch vsync on and off, etc.
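For anyone unclear on where the tear actually comes from, here's a crude conceptual sketch (a model of the mechanism only, not real driver code): with vsync off the buffer swap can land partway through the monitor's scanout, so the top of the screen shows one frame and the bottom shows the next, with a seam wherever the swap happened.

# Conceptual model of tearing -- not how any real driver is implemented.
REFRESH_HZ = 85
SCANOUT_MS = 1000.0 / REFRESH_HZ   # time for the beam to draw one full screen
SCREEN_LINES = 1200                # scanlines at 1600x1200

def tear_line(frame_ready_ms, vsync=False):
    """Scanline at which the newly finished frame becomes visible."""
    if vsync:
        return 0   # swap waits for the vertical blank, so no seam
    # With vsync off, the swap happens wherever the beam is right now.
    beam_fraction = (frame_ready_ms % SCANOUT_MS) / SCANOUT_MS
    return int(beam_fraction * SCREEN_LINES)

print("vsync off:", tear_line(7.3))          # seam lands around scanline 744
print("vsync on :", tear_line(7.3, vsync=True))

Whether that seam is visible on a given monitor to a given pair of eyes is the separate question discussed above--but the seam itself is always there whenever the swap misses the blank.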
We also don't know how the emergence of powerful dual-GPU solutions will affect developers' high-end targets for in-game settings. Sure, the lower settings will remain with the mid-range cards. But if they start putting extreme settings in there that are much more GPU-limited than today's games, then the lower resolutions will start seeing more gains with SLI. I think that's a reasonable possibility given IHV and developer relations.
I do not think any game optimization from developers for SLI is likely for reasons already covered exhaustively in this thread...
Your evaluation of the pros and cons of SLI is relatively accurate but for one thing - that single pro outweighs all the cons for many people and they are the ones purchasing SLI systems today. I find it strange that you can rail against SLI so much when it's actually selling quite well in the market. What SLI needs is a killer app that brings single GPU's to their knees - this generation is just so powerful that all the games out there are being chewed up by a single high-end card.
Many, many more, however, like me, are *not* purchasing SLI systems at present--also covered exhaustively earlier in the thread. (The great majority of PCIe motherboards currently made and sold are single-slot, etc.) That is, the fact that some people are buying SLI has no bearing on whether a majority are buying it--and the majority clearly is not.
I know that people who buy plasma monitors, for instance, at exorbitant prices might well like to imagine that "everybody's buying what I'm buying" and that "It'll be great when developers start specifically supporting the unique features of this monitor," because it's just human nature to not want to admit to yourself when you've bought a pig in a poke which currently interests very few others (usually because the price-performance proposition is simply too poor for most people)...
It's no doubt a similar thing with SLI at the moment--and will likely remain so until the novelty wears off, I'd imagine...