Nvidia went SLI because they can't compete?

Chalnoth said:
Jawed said:
So, you're saying that both cards are sending each other the "half" of the FPB they've created, so that the "exposure" can be deduced? And there's bandwidth crunch on the link between the two cards?
This is possible, of course, but it's more likely that nVidia has simply decided to disable SLI for these parts when render to texture is used, unless a specific compatibility flag is used.

But SLI is faster when doing HDR in FC. It's just that it's hardly worth mentioning...

http://www.beyond3d.com/reviews/nvidia/sli/index.php?p=23

4-8%. In the run-up to SLI, HDR (and SSAA) were seen as great ways to use SLI power - there was quite a buzz...

I dunno, maybe FC is just pathologically useless as a platform for testing SLI.

Jawed
 
DemoCoder said:
ANova said:
And a single R600/G80/NV60 would be better still, while saving you money in the process. That's the whole basis of this argument.

Vaporware can't stand in for hardware. A card that doesn't exist and won't be out until 2008 can't fulfill your gaming needs in 2006.

Homework assignment: Look up "The time preference of money".

Here, I quote for you from Wikipedia

Time preference is the economist's assumption that a consumer will place a premium on enjoyment nearer in time over more remote enjoyment. A high time preference means a person wants to spend their money now and not save it, whereas a low time preference means a person might want to save their money as well.

This is particularly important in microeconomics. The Austrian School sees time as the root of uncertainty within economics.

A high time preference indicates that a person puts a high premium on satisfying wants in the near future.

The time preference theory of interest is an attempt to explain interest through the demand for accelerated satisfaction.

Now take that, and try to reconcile it with your notion that there is some objective argument about the economic rationality of your video card purchase. Different people have different preferences, and your preference to wait 18 months for a cheaper card is not the same as that of someone who wants the top-performing system in the very near future.
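To make the concept concrete (a textbook discounting illustration of my own, not anything DemoCoder wrote; the 50% rate is an arbitrary example): time preference can be modeled as a discount rate \rho applied to future enjoyment,

PV = \frac{FV}{(1+\rho)^t},

so for a buyer with a high \rho of, say, 50% per year, a card arriving in 18 months is worth only 1/(1.5)^{1.5} \approx 0.54 of its face value today. "Just wait for the next generation" is not a free option for that buyer.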

Great post. That has been my point from the start. If I want something good now I am going to buy it.
 
Jawed said:
But SLI is faster when doing HDR in FC. It's just that it's hardly worth mentioning...
Then it's just a matter of tweaking the SLI method. This certainly won't be the case for every HDR game.
 
ninelven said:
So the game I have either works with SLi or doesn't (i.e. yes or no), how could one live at such a speed? It's mind boggling. In the land of bandwidth, fillrate, pipelines, AA methods, AA quality, AF methods, AF quality, memory interfaces, and clockspeed, that yes-or-no question just takes the cake.

Just a guess here, but maybe, just maybe, that's why people benchmark SLi in the games it works in. Nah, I think if I want to know how SLi performed in a particular game I'll ask my magic 8 ball.

Nice job; you've proven you don't understand me at all. Yes, benchmarks help tell you which games do and don't work. So what? My point is that SLi doesn't work with all games, and that some games only get a small improvement. Those are rather large disadvantages to SLi, imo. The whole point of SLi is to improve the playability of games you like to play; if SLi offers me no advantage in a specific game that I like to play then I find absolutely no use for it, not to mention I have no idea whether or not future games will support it. I also don't consider a 20% improvement worth twice the price. It's as simple as that.
 
I would say that, after looking at that list of games supported by SLI, most recent games that stress the video subsystem are supported. The games I see absent are older and don't really need more video power.
 
if SLi offers me no advantage in a specific game that I like to play then I find absolutely no use for it
Stop the presses, it's the revelation of the century.

not to mention I have no idea whether or not future games will support it
And there isn't anyone forcing you to buy it either. One could just wait and see or maybe that is too difficult.

I also don't consider a 20% improvement worth twice the price. It's as simple as that.
If you game at a resolution / settings that only yield a 20% performance increase, that is your business. I couldn't care less what settings you play at as they have all of about 0 relevance to me.

Seriously, this thread has entered the realm of Chalnoth criticizing the install process of ATI's drivers.
 
ninelven said:
Stop the presses, it's the revelation of the century.

What is it, exactly, you are arguing?

And there isn't anyone forcing you to buy it either. One could just wait and see or maybe that is too difficult.

Completely pointless comment.

If you game at a resolution / settings that only yield a 20% performance increase, that is your business. I couldn't care less what settings you play at as they have all of about 0 relevance to me.

Seriously, this thread has entered the realm of Chalnoth criticizing the install process of ATI's drivers.

Some games cannot get larger than a 20% improvement; that's my point. Resolution and quality settings have absolutely nothing to do with it.
 
some games cannot get larger than a 20% improvement; that's my point
So your point is that an SLi setup performs differently in different games (just like every other video card available), and you can't be bothered to figure out which ones those are to make an informed purchase decision. Or are you arguing that you are not intelligent enough to make an informed decision?

if SLi offers me no advantage in a specific game that I like to play then I find absolutely no use for it
So you came to the conclusion that if SLi is worthless then SLi is worthless... that's some nice work there, Sherlock.

Completely pointless comment.
Yes, heaven forbid you accept personal responsibility for what video card you purchase.

I've got no more time to waste on you. How you feel about SLi is irrelevant; it's just that the constant bitching gets old. Anyway, if SLi causes you any problems, they are yours, not mine, so you can deal with it.
 
ninelven said:
So your point is that and SLi setup performs differently in different games (just like every other video card available), and you can't be bothered to figure out which ones these are to make an informed purchase decision. Or are you arguing that you are not intelligent enough to make an informed decision?

I'll spell it out for you: SLi is not worth the price due to all the reasons I have mentioned in this thread, including game support, price, heat, power, and variable performance increases, some worthwhile, some not. Is that clear enough for you? If you think SLi is still worthwhile despite those reasons then good for you; that's your opinion, just like what I said is mine. My purpose was not to try to persuade anyone not to buy SLi; I was stating my reasons for not liking it. You and others obviously decided to take offense to that for whatever reason. I don't care. The original topic was about whether or not SLi is a desperate attempt to one-up ATI's solution. I think it is to a certain degree, and I explained why.

So you came to the conclusion that if SLi is worthless then SLi is worthless... that's some nice work there, Sherlock.

Again, another pointless comment. Do you have anything worthwhile to say, or are you going to continue to beat around the bush with this kind of meaningless drivel?

Yes, heaven forbid you accept personal responsibility for what video card you purchase.

How you feel about SLi is irrelevant; it's just that the constant bitching gets old. Anyway, if SLi causes you any problems, they are yours, not mine, so you can deal with it.

Then why bother arguing with me? Since they are my problems and not yours, what do you care? I would also like you to point out exactly where I am bitching about anything. Please do enlighten me.

I've got no more time to waste on you.

Good, then we agree on something.
 
Chalnoth said:
Jawed said:
So, you're saying that both cards are sending each other the "half" of the FPB they've created, so that the "exposure" can be deduced? And there's bandwidth crunch on the link between the two cards?
This is possible, of course, but it's more likely that nVidia has simply decided to disable SLI for these parts when render to texture is used, unless a specific compatibility flag is used.

Search is your friend, too:

http://www.beyond3d.com/forum/viewtopic.php?p=397449#397449

DaveBaumann said:
FAQ's are your friend ;)
Are render-to-texture operations accelerated with NVIDIA's SLI technology?
Yes.
NVIDIA's SLI technology automatically shares and distributes render-to-texture operations across multiple GPUs. There may be some synchronization and communications overhead associated with those operations; that overhead, however, is less than the overall performance boost enabled by multiple GPUs.

Are texture render-targets shared between the multiple GPUs?
Yes.
NVIDIA's SLI technology automatically shares and distributes render-target data across multiple GPUs as necessary.
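To make the FAQ's point concrete, here is a minimal sketch of the D3D9 render-to-texture pattern in question (my own illustration, not NVIDIA's or the thread's code; the texture size and format are arbitrary). Under AFR, consecutive frames run on different GPUs, so a render target produced on one GPU may have to be copied to the other before it can be sampled - that's the "synchronization and communications overhead" the FAQ mentions:

[code]
#include <d3d9.h>

// Hypothetical sketch: render a scene into a texture (e.g. an HDR
// exposure probe), then sample that texture in a second pass.
void RenderWithRTT(IDirect3DDevice9* device, IDirect3DSurface9* backBuffer)
{
    IDirect3DTexture9* rtTex  = NULL;
    IDirect3DSurface9* rtSurf = NULL;

    // A texture usable as a render target (size/format arbitrary here).
    device->CreateTexture(512, 512, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &rtTex, NULL);
    rtTex->GetSurfaceLevel(0, &rtSurf);

    // Pass 1: render into the texture.
    device->SetRenderTarget(0, rtSurf);
    // ... draw calls ...

    // Pass 2: bind the result as a source texture for the final image.
    // If pass 2 lands on the other GPU, the driver must first transfer
    // rtSurf's contents across the bridge/PCIe.
    device->SetRenderTarget(0, backBuffer);
    device->SetTexture(0, rtTex);
    // ... draw full-screen quad ...

    rtSurf->Release();
    rtTex->Release();
}
[/code]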
Jawed
 
This thread came to a nice explosion 8)

Once the dust has settled and some of you have calmed down a bit, I'd like to remind some of you that one should be careful these days when criticising specific policies, and that goes for either side.

Any comment might slap back in your face sooner or later. As an example, I used to remind some folks about AF angle-dependency quite some time ago; since the release of the GF6 line the tune has obviously changed.

On to the topic at hand: if there are some users (and I'm sure there's a small portion out there) that want a highly expensive multi-board config, then let them have it. Hell, if Quantum3D or simFusion had sold some of their lower-end systems quite some time ago at affordable prices (and the driver support from each IHV had been there), some would have jumped on those too. Yes, the price/performance ratio is in relative terms out of place for a SLi system, but I doubt that any of the interested consumers actually care much about that factor. There have always been solutions that are way too expensive for what they deliver; as an example, try to convince me why I as the average user should buy an FX55 and not an AMD64 4000 or even a 3800; the last one especially will cost me right now about half as much as the FX55. Does the FX55 deliver twice as much performance?

PCI-E is simply way better suited for dual-board configs than AGP ever would have been, and I expect vendors to take advantage of it one way or another. They'll end up with higher sales figures in the end, and that's what interests them both.

Last but not least: I've never personally been much in favour of either multichip or multiboard solutions (not in the past and not today), but I don't get bent out of shape about it either, nor will I use it as a banner for a personal crusade against any IHV. Some people just can't snap out of that kind of crap, obviously. In any case, if a user today asked me whether to pick two 6600GTs in SLi or a 6800GT, the answer would be obvious. That's the price segment where the highest sales figures occur anyway when it comes to high-end solutions. The first just looks "cooler" through a funky blue case window, now doesn't it? LOL :LOL:
 
Geeforcer said:
LOL. So when three SLI configurations (6800, GT and Ultra) show no signs of excessive temperature and one (6600) does, and I point it out, you have the gall to accuse ME of using selective data to prove my point? ROFL. The fact remains: most SLI systems tested do not show excessive heat. You can twist all you want and latch onto the 6600 results as long as you like, but the data is there for everyone to see. I know you are probably incapable of answering a simple yes or no question, but let me try anyway: is a system operating at 36 degrees Celsius "hotter than hell"? Yes or no, Walt?

Right....;) And you can pretend that two 3d-cards in the case generate the same noise and heat as one, for as long as you like--but as for me, I'm not buying it...;)

It's not my fault, dear Walt, that your knowledge of acoustics is so dismal. Adding a single identical sound source would result in only a marginal increase in noise, both perceived and measured. Just for reference, it would take a 10-fold increase in the number of sources for perceived loudness to double (the scale is logarithmic, remember?). Dave's quote does not diverge from the results; the results diverge from your assertion (that an SLI configuration is excessively noisy compared to a non-SLI configuration). His comments (and data) show that a single card is already noisy. The problem is thus not limited to SLI.
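For reference, the arithmetic behind that claim (standard acoustics, worked out here; not part of the original post): n identical incoherent sources raise the sound pressure level by

\Delta L = 10 \log_{10} n \;\mathrm{dB},

so a second card adds 10 \log_{10} 2 \approx 3 dB, while the usual rule of thumb for a perceived doubling of loudness is about +10 dB, i.e. n = 10 sources.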

Thank you for pointing out my dismal knowledge of acoustics--but I still think yours is much more dismal than mine...;)

I mean--wow--according to you the single card is already "constantly noisy"--but if I put in another noisemaker just like it the environment is transformed somehow and magically becomes "acceptably quiet"--Heh...;) Ah, the essence of delusional thinking...;)

When Dave says, "constantly noisy," I really have little difficulty parsing his meaning. Gee, it seems to me that if *one card* is "constantly noisy" by itself as you state, and you add another "constantly noisy" card just like it into the case, which makes the aggregate sound 1%-5% *noisier* as measured logarithmically--then the end result is still "constantly noisy"--and that appears to be exactly what Dave said...;) You might at least consider conceding that adding the second "constantly noisy" card to the first "constantly noisy" card does not result in a reduction of sound into a *less noisy* environment.
 
trinibwoy said:
WaltC,

Regarding LCDs, they already have enough issues when it comes to gaming without using SLI as some kind of detractor. For best IQ they need to be run at their native res, and then there is considerable ghosting on all but the best LCDs. And what about single-card configurations that can do 16x12 but are limited by the native res of the LCD - this restriction is not specific to SLI; it also applies to the PE's and Ultras out there. I have a friend who bought a vanilla 6800 because his monitor is best at 10x7. You have to realize that many of your arguments apply to all levels of GPU hardware - not just high-end vs SLI.

What is a restriction specifically of SLI, though, is that the justification for purchasing SLI is limited to the increase in frame rates at gaming resolutions *above* 1024x768 when contrasted to a single card running at resolutions above 1024x768.

So, for instance, if your friend's monitor doesn't operate to his satisfaction at 1152x864 or higher, and your friend does not want to replace his monitor, then you surely would not recommend that he buy SLI under any circumstances, correct? What would be the point? Likewise, if someone owns a 1024x768 LCD and has no inclination to change monitors, then recommending SLI for him, too, would also be something you'd want to avoid.

As well, I'd hope you wouldn't think that a 6800U is no faster in terms of frame rates at 1024x768 than a 6800 is at 1024x768--because I do believe it is...;) As pointed out in my last post speaking to this, the justification for buying SLI in terms of frame rates is the *opposite* of the frame-rate justification one uses when buying a single card.

With a single card the peak frame rates occur at the lower resolutions and decline as resolution rises. However, with SLI at 640x480 and 800x600 and sometimes even at 1024x768, the SLI system is often *slower* than a single card, or else is so close that the frame-rate advantage of SLI over a single card disappears. So, the justification for SLI over a single 3d card is found in terms of frame rates, but *only* at resolutions above 1024x768.
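A toy model of why that happens (my own illustration; the numbers are made up, not benchmark data): at low resolutions the CPU sets the frame-time floor, so halving the GPU work buys nothing.

[code]
#include <algorithm>
#include <cstdio>

// Frame time is bounded below by CPU work; SLI only divides the GPU part.
static double frameTimeMs(double cpuMs, double gpuMs, int numGpus)
{
    return std::max(cpuMs, gpuMs / numGpus);
}

int main()
{
    const double cpuMs = 10.0; // per-frame CPU cost, resolution-independent
    const struct { const char* res; double gpuMs; } cases[] = {
        { "1024x768",  8.0  }, // GPU-light: CPU-bound, SLI gains ~nothing
        { "1600x1200", 20.0 }, // GPU-heavy: SLI nearly doubles throughput
    };
    for (const auto& c : cases) {
        const double one = frameTimeMs(cpuMs, c.gpuMs, 1);
        const double two = frameTimeMs(cpuMs, c.gpuMs, 2);
        std::printf("%s: single %.1f ms, SLI %.1f ms (%.0f%% faster)\n",
                    c.res, one, two, (one / two - 1.0) * 100.0);
    }
    return 0;
}
[/code]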

Anecdotally, I installed an x800xt to replace an R9800P, and the frame-rate performance difference at *all* resolutions, including the lower ones, is obvious. But such is not the case with SLI, is it, because of the dynamics of what an SLI environment actually is in comparison with a single 3d-card environment. The advantages of SLI over a single card manifest at resolutions above 1024x768, and so that is why I said that choosing SLI may involve the necessity of changing monitors for reasons that are entirely different from single-card monitor-environment considerations.

When it comes to CRTs, you must have very sensitive eyes. The only games I have ever experienced tearing in are Mafia and Freedom Fighters. I have NEVER seen it in an FPS, so vsync-off is my default setting. I'm sure most others here would agree.

I think you might well be surprised to see a poll of how many people prefer to run with vsync on...;) As I've stated, I certainly do. It isn't just that people's eyes are different, which is true; it's that people's IQ preferences are different, and that people's *monitors* are different, too.

For example: on a 21" CRT at 1600x1200 the pixels are larger than they are on a 19" CRT at 1600x1200, and so things not noticeable on a 19" CRT might well stick out like sore thumbs on a 21" CRT simply because it's easier to see them on the larger monitor. The same is true for 19" CRTs in contrast with 17", and so on. Then there are things like dot pitch/aperture grille, etc., which can differ widely among monitors. IQ preferences differ too: some people run games all day on low-quality mipmap settings and don't care--I won't use anything but the highest-quality settings, etc.
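To put rough numbers on that (assuming typical viewable diagonals of about 20" for a 21" CRT and 18" for a 19" CRT; exact figures vary by tube): at 4:3 the viewable width is 0.8 \times the diagonal, so the horizontal pixel pitch at 1600x1200 works out to

\frac{0.8 \times 20 \times 25.4}{1600} \approx 0.25 \;\mathrm{mm} \quad\text{vs.}\quad \frac{0.8 \times 18 \times 25.4}{1600} \approx 0.23 \;\mathrm{mm},

roughly 11% larger pixels on the bigger tube, which is why the same artifact is easier to spot there.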

Vsync-off tearing most certainly exists--*seeing it* however is another matter and depends on the variables mentioned...;) If I couldn't see it so well in my own environment I'd have no reason to run with vsync on, would I? Indeed, if tearing wasn't a visible issue then 3d card makers would have no incentive for ever turning vsync on, would they? Let alone providing controls to switch vsync on and off, etc.

We also don't know how the emergence of powerful dual-GPU solutions will affect developers' high-end targets for in-game settings. Sure, the lower settings will remain with the mid-range cards. But if they start putting in extreme settings that are much more GPU-limited than today's games, then the lower resolutions will start seeing more gains with SLI. I think that's a reasonable possibility given IHV and developer relations.

I do not think any game optimization from developers for SLI is likely, for reasons already covered exhaustively in this thread...;)

Your evaluation of the pros and cons of SLI is relatively accurate but for one thing - that single pro outweighs all the cons for many people, and they are the ones purchasing SLI systems today. I find it strange that you can rail against SLI so much when it's actually selling quite well in the market. What SLI needs is a killer app that brings single GPUs to their knees - this generation is just so powerful that all the games out there are being chewed up by a single high-end card.

Many, many more, however, like me, are *not* purchasing SLI systems at present--also covered exhaustively earlier in the thread. (The great majority of PCIe motherboards currently made and sold are single-slot, etc.) I.e., the fact that some people are buying SLI has no bearing on whether a majority are buying it--and the majority clearly is not.

I know that people who buy plasma monitors, for instance, at exorbitant prices might well like to imagine that "everybody's buying what I'm buying" and that "it'll be great when developers start specifically supporting the unique features of this monitor," because it's just human nature not to want to admit to yourself that you've bought a pig in a poke which currently interests very few others (usually because the price-performance profile is entirely too poor for most people)...;) It's no doubt a similar thing with SLI at the moment--and will likely remain so until the novelty wears off, I'd imagine...;)
 
WaltC said:
What is a restriction specifically of SLI, though, is that the justification for purchasing SLI is limited to the increase in frame rates at gaming resolutions *above* 1024x768 when contrasted to a single card running at resolutions above 1024x768.

I do see your point about requiring higher-quality monitors to get the best out of SLI. But honestly, the people with SLI rigs will most likely have good equipment. Even for those without the best monitor, see my earlier post referencing Dave's benches at 1024x768 8x16x, where significant gains are seen.

I think you might well be surprised to see a poll of how many people prefer to run with vsync on...;)

I would be extremely surprised if more people had it on by default than not. The only time I've seen "vsync on" recommended is to alleviate tearing in specific titles. Otherwise the usual recommendation is to leave it off to maximize performance, especially in shooters.

WaltC said:
I do not think any game optimization from developers for SLI is likely, for reasons already covered exhaustively in this thread...;)

Maybe not optimizations per se but you can't honestly believe Nvidia has not briefed developers on which implementations best leverage the power of SLI and which do not - and I expect ATI to do the same. Doom3 saw large gains with SLI out the door and I would expect any new TWIMTBP title to do the same. The upcoming Chaos Theory may answer this question.

Many, many more, however, like me, are *not* purchasing SLI systems at present--also covered exhaustively earlier in the thread. (The great majority of PCIe motherboards currently made and sold are single-slot, etc.) I.e., the fact that some people are buying SLI has no bearing on whether a majority are buying it--and the majority clearly is not.

So the majority are not buying the most exotic, expensive and powerful solution on the market... how does that support the anti-SLI argument? People who have the cash to blow on SLI and plasma TVs do not seek approval for their purchases from those with more restricted budgets. Same goes for people who purchase Ultras and PE's.
 
WaltC said:
I mean--wow--according to you the single card is already "constantly noisy"--but if I put in another noisemaker just like it the environment is transformed somehow and magically becomes "acceptably quiet"
Yup, it's true. They do.

It has to do with sound propagation and harmonics: the second card's noise creates interference waves with the original card's, and if you tune 'em just right they cancel out.
;) :p
 
WaltC said:
I mean--wow--according to you the single card is already "constantly noisy"--but if I put in another noisemaker just like it the environment is transformed somehow and magically becomes "acceptably quiet"

Why do you find it necessary to wilfully misinterpret others' words to get your point across? He did not claim that it gets quieter; his claim was that the increase in noise from adding the second card was negligible and probably inaudible, since the first card was already so loud. I'm not saying his claim is true, just that it does not remotely resemble what you say.
 
ANova said:
I'll spell it out for you: SLi is not worth the price due to all the reasons I have mentioned in this thread, including game support, price, heat, power, and variable performance increases, some worthwhile, some not. Is that clear enough for you? If you think SLi is still worthwhile despite those reasons then good for you; that's your opinion, just like what I said is mine. My purpose was not to try to persuade anyone not to buy SLi; I was stating my reasons for not liking it. You and others obviously decided to take offense to that for whatever reason. I don't care.
You do care, otherwise you wouldn't be constantly repeating yourself. Here, let me bottom-line it. SLI has one negative: price (power draw, heat, and noise are all tangential to the issue, and all can be taken care of with more money). SLI also has one benefit: performance beyond current mass-production realities (because ATi/nV aren't going to try to quadruple the previous gen's die size every 18 months).

Again, this all boils down to the educated/uneducated consumer POV. Can we agree to that? But--educated consumer or not--only SLI can offer performance beyond the current high end single cards. For that reason alone, it's potentially valuable, meaning worth the money to someone.

So, are we quibbling over SLI itself, or its marketing? I'm with you on the marketing tip, but I can't deny that SLI offers some benefits to some people. Speaking of marketing, though, just seeing how nV has gotten a few GF6-only game effects on the market already should indicate that they're capable of getting devs to code with SLI in mind.

The original topic was about whether or not SLi is a desperate attempt to one up ATI's solution. I think it is to a certain degree, and I explained why.
It may have been born of desperation in the FX era, but what exactly is desperate about the GF6 series?

I would also like you to point out exactly where I am bitching about anything.
Look, I tend to agree with you in other threads, but your constant harping on SLI's "shortcomings"--which are such only when compared to an ideal performance increase, not to the reality that Intel charges a premium out of line with performance for a few hundred more MHz, Mercedes charges tens of thousands more for AMG versions that most people don't use while stuck in traffic, and the Concorde charged an extra $10k to shave a few hours off your commute--comes across as bitching. SLI isn't perfect, and probably won't be, but the fact remains that it is--yes, "only" in certain situations--faster than any single card. For that fact alone nV can justifiably charge a premium.

It just seems like you and Walt are being contrary for its own sake. As much as you two hate SLI's shortcomings as they relate to anything but the high end and anything but the most demanding titles, all your arguments can't deny that neither ATi nor anyone else offers comparable performance in the cases where SLI works. It is for those cases that SLI proves valuable. The precise monetary figure is almost beside the point, but don't think nV is inept enough to price it out of the market (witness people paying $700+ for the XTPE at launch). People don't buy $500+ video cards expecting price/performance, just performance.

Walt, a glance at the HL2 and D3 10x7 8xS numbers should suffice as a rebuttal to your high res argument.

Any way we look at it, SLI is a niche product. It does offer certain advantages, though, and I'm not sure how playing up its limitations reduces the attractiveness of those advantages.

Oh well, this post probably comes across as ruder or more patronizing than I intend, but I'm not going to let all this typing go to waste.
 
Pete said:
ANova said:
I'll spell it out for you: SLi is not worth the price due to all the reasons I have mentioned in this thread, including game support, price, heat, power, and variable performance increases, some worthwhile, some not. Is that clear enough for you? If you think SLi is still worthwhile despite those reasons then good for you; that's your opinion, just like what I said is mine. My purpose was not to try to persuade anyone not to buy SLi; I was stating my reasons for not liking it. You and others obviously decided to take offense to that for whatever reason. I don't care.
You do care, otherwise you wouldn't be constantly repeating yourself. Here, let me bottom-line it. SLI has one negative: price (power draw, heat, and noise are all tangential to the issue, and all can be taken care of with more money). SLI also has one benefit: performance beyond current mass-production realities (because ATi/nV aren't going to try to quadruple the previous gen's die size every 18 months).

I disagree that the "tangential" drawbacks can be taken care of with more money. It's damn near impossible to make the kind of total power draw we are talking about particularly quiet. And heat is heat; the laws of thermodynamics don't bend easily.

That said, you summed up the benefit nicely - higher performance.
(Unless, of course, you are limited by host system performance, or run an application where SLI is not applicable or applied, some shader-limited scenarios, et cetera.)
At best, running two 6800s SLI can allow you to go one step up in resolution. That's it. Period.

Is it worth it? Obviously not, unless you are a reviewer, extremely tech-happy, or desperate to get some attention among your peers. If you fit any of these categories, or some other I can't see myself - fine!

Choice isn't particularly bad even if the value isn't there for 99.xx percent of the market - and neither nVidia nor ATI is likely to believe that SLI will generate much in the way of graphics-card sales. It does generate buzz, though, and it does give competitive benefits as far as core logic goes, and it does give nVidia the top spot in benchmarks at this point in time. The value of these probably justifies the cost involved for them, and more, and justifies ATI getting in on the game.

For reviewers though, SLI presents a thorny question.
Just how small a market presence can an alternative have and still be a relevant data point to include in comparisons and evaluations?
 
Entropy said:
[...]

I disagree that the "tangential" drawbacks can be taken care of with more money. It's damn near impossible to make the kind of total power draw we are talking about particularly quiet. And heat is heat; the laws of thermodynamics don't bend easily.

That said, you summed up the benefit nicely - higher performance.
(Unless, of course, you are limited by host system performance, or run an application where SLI is not applicable or applied, some shader-limited scenarios, et cetera.)
At best, running two 6800s SLI can allow you to go one step up in resolution. That's it. Period.

Is it worth it? Obviously not, unless you are a reviewer, extremely tech-happy, or desperate to get some attention among your peers. If you fit any of these categories, or some other I can't see myself - fine!

Choice isn't particularly bad even if the value isn't there for 99.xx percent of the market - and neither nVidia nor ATI is likely to believe that SLI will generate much in the way of graphics-card sales. It does generate buzz, though, and it does give competitive benefits as far as core logic goes, and it does give nVidia the top spot in benchmarks at this point in time. The value of these probably justifies the cost involved for them, and more, and justifies ATI getting in on the game.

For reviewers though, SLI presents a thorny question.
Just how small a market presence can an alternative have and still be a relevant data point to include in comparisons and evaluations?

I totally disagree with your statement. When you say reviewers, are you referring to Beyond3D or Consumer Reports? The people who visit these forums ARE interested in high-end performance. SLI definitely fits that category. So the question is, why wouldn't they include SLI?

<"Is it worth it? Obviously not, unless you are a reviewer, extremely tech happy, or you're desperate to get some attention among your peers. If you fit either of these categories, or some other I can't see myself - fine!">

You think that because someone is interested in SLI they fit those categories? Well, let's see. I am not a reviewer; I am not what you would call tech-happy, since I usually upgrade my PC only once every two or three years; and my friends wouldn't even give a rat's rear end if I had an SLI setup. Perhaps you should add a few more categories? ;)
 