Nvidia went SLI because they can't compete?

Jawed said:
Has anyone figured out why SLI doesn't provide any useful performance boost for the HDR rendering style in Far Cry?
Any HDR rendering requires render to texture, which is always going to be a sticky point for SLI. This means that nVidia has to produce a proper compatibility mode for this game when HDR rendering is used. All that you're seeing is that they haven't done this yet.
 
zeckensack,

In discussing SLI I think it's important to keep in mind that Nvidia is more than a GPU manufacturer. If you look at the quick adoption of SLI-capable mainboards by the major players (Asus, Gigabyte and MSI), it becomes clear that there is more to this than simply selling video cards. Nvidia is poised to introduce Intel chipsets featuring SLI, and I think I am right when I say that enthusiasts are excited about seeing this technology on the Intel platform. It's a very hand-in-glove relationship where one helps sell the other, and that is, ultimately, what Nvidia is looking to do.

Your point about price pressure from SLI in future generations is an interesting one. There is obviously great pressure to keep prices down, or at least to ramp them up gradually so as not to completely shock the consumer. With a doubling of onboard memory capacity due, this will be somewhat tricky. Because SLI doesn't effectively double your video RAM, it may very well be a difficult proposition to sell two 512MB cards at the premium they will demand on a per-unit basis. The logical choice would seem to be a single 512MB card rather than two 256MB cards.

One last point. I am not sure why you think the "high-end nuts" won't buy a new card just because a single card doesn't outperform their SLI configuration. Surely they will buy two no matter what? :p
 
Trinibwoy said:
Your evaluation of the pros and cons of SLI is relatively accurate but for one thing - that single pro outweighs all the cons for many people, and they are the ones purchasing SLI systems today. I find it strange that you can rail against SLI so much when it's actually selling quite well in the market. What SLI needs is a killer app that brings single GPUs to their knees - this generation is just so powerful that all the games out there are being chewed up by a single high-end card.

This is exactly why the argument against SLi only seems logical. The reality of the situation is that there are no apps that require SLi that a single card solution like an X800 XT can't handle, nor will there be one for some time (UE3 probably being the first). By the time UE3 comes out we'll have the R520 and possibly G70, if not beyond, which effectively makes SLi's single best advantage, its performance, useless. And of course that doesn't even include the fact that SLi simply doesn't work with all games, or doesn't work 100% in some of the games it does work with, whereas all games work fine with the alternative.

Furthermore, just because people are buying into SLi hype and marketing doesn't automatically make it an effective platform, as I already mentioned. So that's a rather minute argument. People get duped into buying things all the time; they then try to justify their purchase to make themselves feel better. That's human nature.

wireframe said:
Nvidia is poised to introduce Intel chipsets featuring SLI, and I think I am right when I say that enthusiasts are excited about seeing this technology on the Intel platform. It's a very hand-in-glove relationship where one helps sell the other, and that is, ultimately, what Nvidia is looking to do.

Our exclusive benchmarks run today tell us that the top-of-the-range Intel system running nForce 4 with two 6800 Ultras scores fully 4,000 fewer 3DMark05 points than the top-of-the-range AMD rig outfitted similarly. An Nvidia representative yelled to us that we 'Shouldn't compare Intel and AMD!!' Why, pray tell?

At the press conference, Drew Henry from Nvidia stood up and showed the world 55 FPS on Doom 3 using the platform. He was in rapture, because he said that a single GeForce 6800 would only score 30FPS at these settings. We learned, however, that the scores are quite different and that Nvidia's original plan was to demonstrate the prowess of the platform by showing the frame rate with SLi turned off in the driver then turned on. However, with SLi off, the system scored 45FPS and with SLi on, the system scored 55FPS - hardly the 90% performance increase that Nvidia claim.

Source
 
Chalnoth said:
Jawed said:
Has anyone figured out why SLI doesn't provide any useful performance boost for the HDR rendering style in Far Cry?
Any HDR rendering requires render to texture, which is always going to be a sticky point for SLI.

It does? I thought the whole point of NVidia's implementation of a floating point buffer was that there was no render to texture? :? Or are you saying that the FPB is just a glorified FP16 texture?

Still, the question remains, why does that get in the way of SLI?

This means that nVidia has to produce a proper compatibility mode for this game when HDR rendering is used. All that you're seeing is that they haven't done this yet.

According to DB, http://www.beyond3d.com/reviews/nvidia/sli/index.php?p=30

Far Cry is one title that appears to have a custom mode predefined in the nvapps.xml configuration file

Nvidia has already created a "proper compatibility mode" (admittedly we have no idea whether they even considered HDR) for this game. Well, obviously it isn't optimal. Will it ever be? What's the hold-up? Will this problem recur with all games that use HDR? If HDR pans out to include most games, then SLI's kinda useless.

Maybe this is all tied up with the architectural limitations of NV4x that mean that AA is inoperative with HDR. If NVidia fixes this limitation, maybe it'll also mean that SLI will work with HDR.

Jawed
 
ANova said:
This is exactly why the argument against SLi only seems logical. The reality of the situation is that there are no apps that require SLi that a single card solution like an X800 XT can't handle

That depends on what resolution you want to run at. Can I run all my games at 1600x1200 with 4x-6xAA? For users with LCD monitors, running at native resolution is imperative. I don't like to run below 1600x1200 *ever*. And high resolution alone doesn't get rid of aliasing issues.

Will the R520/G70 be able to run UE3 in 1600x1200x4xAA @ a solid 60fps? What about 8xS? What if I want supersampling? That's something that will bring any single card to its knees, but provides superior AA when games are using alpha tested textures.

However well an R520 or G70 performs, two R520s or G70s are going to perform even better. There are even people who like to run at 1920x1440 or higher.
 
Jawed said:
It does? I thought the whole point of NVidia's implementation of a floating point buffer was that there was no render to texture? :? Or are you saying that the FPB is just a glorified FP16 texture?
It's not "glorified." Blending allows developers to make use of a floating-point render target in the exact same way that they use normal framebuffers. This makes HDR rendering much easier to implement, and faster as well.

But the problem still remains that the FP16 color format cannot be directly interpreted by the DAC, and thus a tone mapping pass must be performed to map the FP16 colors to FX8 colors for normal framebuffer output (side comment: I'd really like to see support for a 10-10-10-2 framebuffer format for the result of this tone mapping pass).
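
To make that concrete, here is a minimal CPU-side sketch of what a tone mapping pass boils down to: one linear HDR value in, one 8-bit value out. The exposure curve and the function name are my own illustrative assumptions, not Far Cry's actual shader.

Code:
#include <algorithm>
#include <cmath>
#include <cstdint>

// Illustrative only: compresses one HDR channel value (as stored in an
// FP16 render target) into an 8-bit framebuffer value via a simple
// exposure curve and a rough display gamma.
std::uint8_t toneMapChannel(float hdr, float exposure)
{
    float ldr = 1.0f - std::exp(-hdr * exposure);  // squash [0, inf) into [0, 1)
    ldr = std::pow(ldr, 1.0f / 2.2f);              // approximate gamma for display
    return static_cast<std::uint8_t>(std::clamp(ldr * 255.0f + 0.5f, 0.0f, 255.0f));
}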

The problem with SLI and this tone mapping pass is that when doing render to texture in general, you need to have both GPUs know the entire texture, since it's not always clear how this texture will translate to the screen when rendered to the framebuffer. There's just no real way in the API currently to tell the driver that this is indeed a tone mapping pass, and thus that each graphics card only needs the data that it has rendered. Thus the need for compatibility switches in the drivers.
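
A rough way to see why the driver has to be conservative, assuming a hypothetical 50/50 split-frame setup (the function name and the split are made up purely for illustration):

Code:
// Hypothetical split-frame rendering: GPU 0 renders rows [0, splitRow),
// GPU 1 renders rows [splitRow, height).
bool texelIsLocal(int gpu, int texelRow, int splitRow)
{
    // A pure 1:1 tone mapping pass only reads the texel at the position it
    // is about to write, so this check would always pass for it...
    return (gpu == 0) ? (texelRow < splitRow) : (texelRow >= splitRow);
}
// ...but a general render-to-texture pass (reflections, blur kernels,
// downsampling) can read any row from anywhere on screen. Since the API
// gives the driver no way to know which case it is dealing with, it has to
// assume the worst and make the whole texture available to both GPUs.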

Right. The problem is that a different compatibility switch would be needed for HDR rendering. The drivers in their current incarnation may not be capable of using two different compatibility switches for different rendering types within the same game. Or the compatibility switch for HDR may just not be implemented (since it's not an official feature, it's probably a fairly low priority).

But no, there's nothing that separates HDR rendering from any other render to texture issues. So there won't be anything that really prevents SLI from working with HDR in the long run. In fact, it may be possible right now with the latest Forceware drivers from nVidia, which allow the user to select SLI modes (though I don't yet know how extensive this customizability is). The big issues here aren't HDR, but rather that there are two different rendering modes available, and that said rendering mode isn't official, and thus may be low-priority to nVidia's SLI driver team.
 
digitalwanderer said:
I wouldn't. I see tearing all the time on my CRT at 1024x768 @ 100Hz if I don't have v-sync enabled. :?

That's not surprising - considering how much you defend old Walt - you guys must have similar DNA too :LOL: Just kidding!!
 
DemoCoder said:
ANova said:
This is exactly why the argument against SLi only seems logical. The reality of the situation is that there are no apps that require SLi that a single card solution like an X800 XT can't handle

That depends on what resolution you want to run at. Can I run all my games at 1600x1200 with 4x-6xAA? For users with LCD monitors, running at native resolution is imperative. I don't like to run below 1600x1200 *ever*. And high resolution alone doesn't get rid of aliasing issues.

Now you're just being picky. Running 12x10 on a 16x12 LCD doesn't look that bad, considering that the higher the resolution, the finer the pixels and the less noticeable the distortion. At any rate, I would say yes, an X800 XT is capable of good framerates at 16x12; maybe not with 4x or 6x AA and 16x AF, but with slightly more conservative settings. What do you do about those games that do not support SLi, btw?

Will the R520/G70 be able to run UE3 in 1600x1200x4xAA @ a solid 60fps? What about 8xS? What if I want supersampling? That's something that will bring any single card to its knees, but provides superior AA when games are using alpha tested textures.

No less than a dual 6800U setup, methinks.

However well an R520 or G70 performs, two R520s or G70s are going to perform even better. There are even people who like to run at 1920x1440 or higher.

And a single R600/G80/NV60 would be better still, while saving you money in the process. That's the whole basis of this argument.
 
ANova said:
And a single R600/G80/NV60 would be better still, while saving you money in the process. That's the whole basis of this argument.

Vaporware can't stand in for hardware. A card that doesn't exist, and won't be out until 2008, can't fulfill your gaming needs in 2006.

Homework assignment: Look up "The time preference of money".

Here, I quote for you from Wikipedia

Time preference is the economist's assumption that a consumer will place a premium on enjoyment nearer in time over more remote enjoyment. A high time preference means a person wants to spend their money now and not save it, whereas a low time preference means a person might want to save their money as well.

This is particularly important in microeconomics. The Austrian School sees time as the root of uncertainty within economics.

A high time preference indicates that a person puts a high premium on satisfying wants in the near future.

The time preference theory of interest is an attempt to explain interest through the demand for accelerated satisfaction.

Now take that, and try to reconcile it with your notion that there is some objective argument about the economic rationality of your video card purchase. Different people have different preferences, and your preference to wait 18 months for a cheaper card is not the same as that of someone who wants the top-performing system in the very near future.
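
To put some entirely made-up numbers on it, here's a toy calculation showing how the same benefit shrinks with the wait, depending on the buyer's discount rate. The amounts, dates and rates are all invented for illustration; the point is only that "buy SLI now" versus "wait for next gen" can't be compared without some time preference assumption.

Code:
#include <cmath>
#include <cstdio>

int main()
{
    const double benefit = 1000.0;  // arbitrary units of "enjoyment" from top-end performance
    const double years   = 1.5;     // how long the buyer would otherwise wait

    // Low, medium and high time preference (the discount rates are invented).
    for (double r : {0.05, 0.25, 0.60}) {
        double valueLater = benefit / std::pow(1.0 + r, years);
        std::printf("discount rate %.2f: worth %.0f today, only %.0f if delivered in 18 months\n",
                    r, benefit, valueLater);
    }
    return 0;
}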
 
ANova said:
This is exactly why the argument against SLi only seems logical. The reality of the situation is that there are no apps that require SLi that a single card solution like an X800 XT can't handle.

The argument for SLI is not that there are apps that require it. Name one game that requires an Ultra or PE.

WaltC made the point that SLI is best applied at very high resolutions, shutting out LCD users in the process. That's not entirely accurate. The LCD user is one great case for SLI. Look at Dave's benches for UT2004, HL2, and Doom3 at 1024x768 (1280x960 for UT) with 8xAA, 16xAF. In all those titles SLI brings the GT and Ultra from unplayable (<50fps) to very playable (>80fps) for the LCD user.

You can argue that instead of buying a second card, the consumer can simply buy a better monitor, but that's for them to decide, isn't it? We can create our own versions of the future or selectively choose real-world examples to support our views, but ultimately only time will tell.
 
ANova, your argument is totally subjective. You say an XT is fast enough at high res, and that an LCD looks good enough upscaling to native res. The person who's buying SLI doesn't want good enough for you, but for himself, rendering your opinion moot. And saying an R520/600/hojillion will do what SLI does with merely a single card is as irrelevant to this argument as the price of cheese, as you can't buy an R520/NV50 right now. The whole point of SLI is to offer next-gen performance today. What's wrong with that?

I just don't understand your insistence on forcing a blanket dismissal on everyone else. SLI has its advantages for a certain few, and your explaining its price/performance and compatibility foibles doesn't make it any less attractive for those for whom it'll work like nothing else on the market.

If price/performance were the only criteria, we'd probably all be running 6600GTs at 800x600 with the AA cranked. But one man's "good enough" may not be so for everyone.

Eh, we're arguing from different POVs. I'm arguing from the POV of an educated consumer who accepts SLI's current limits for the one way in which it breaks them. You're arguing from the POV of the uneducated consumer who sees SLI on the shelf or in the ads and assumes 2x the cards means 2x the performance. If that's the case, then we're both "right." But your comments about fast enough and good enough are just not going to stand up beyond you. The next guy may be more demanding, and SLI is his only option at this point.

zeck, SLI may well be an act of desperation, at least initially and from the engineering POV (it was kickstarted during the FX years, right?). But once it's up and running, it seems to be a nice ultra-high-end feature (luxury for gamers, utility for developers) that can probably be maintained across generations.
 
Chalnoth said:
The problem with SLI and this tone mapping pass is that when doing render to texture in general, you need to have both GPUs know the entire texture, since it's not always clear how this texture will translate to the screen when rendered to the framebuffer. There's just no real way in the API currently to tell the driver that this is indeed a tone mapping pass, and thus that each graphics card only needs the data that it has rendered. Thus the need for compatibility switches in the drivers.

So, you're saying that both cards are sending each other the "half" of the FPB they've created, so that the "exposure" can be deduced? And there's a bandwidth crunch on the link between the two cards?
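
A quick back-of-envelope, assuming a full-screen FP16 RGBA target at 1600x1200 and 60fps (both assumptions, not measurements), suggests the amount of data involved isn't trivial - whether it would actually saturate the inter-card link I honestly don't know:

Code:
#include <cstdio>

int main()
{
    const double width = 1600, height = 1200;         // assumed resolution
    const double bytesPerPixel = 4 * 2;               // RGBA, FP16 = 2 bytes per channel
    const double bufferMB = width * height * bytesPerPixel / (1024 * 1024);
    const double halfPerFrameMB = bufferMB / 2.0;     // the "half" the other GPU is missing
    const double perSecondMB = halfPerFrameMB * 60.0; // at an assumed 60fps

    std::printf("FP16 target: %.1f MB; half per frame: %.1f MB; ~%.0f MB/s at 60fps\n",
                bufferMB, halfPerFrameMB, perSecondMB);
    return 0;
}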

The big issues here aren't HDR, but rather that there are two different rendering modes available, and that said rendering mode isn't official, and thus may be low-priority to nVidia's SLI driver team.

I agree that HDR in FC isn't "official", but it still seems strange to me that the only released game that specifically uses a unique selling-point feature of NV4x isn't speeding up appreciably with SLI. Well, it's no skin off my nose. Just a huge marketing glitch, really.

Jawed
 
SLi's effectiveness is a matter of opinion; its price-to-performance ratio and advantages/disadvantages are not.

That's all I have to say on the matter.
 
ANova said:
SLi's effectiveness is a matter of opinion; its price-to-performance ratio and advantages/disadvantages are not.

SLI's effectiveness in improving performance is not opinion - it is cold, hard FPS numbers, splattered over every single SLI review - including Dave's.

Its price/performance ratio and balance of pros and cons can be objectively determined, but the effect of those factors on an individual's purchasing decision is highly subjective.

ANova, based on your position, are you saying that there is absolutely no advantage of having two R520's in AMR over a single one?
 
SLi's effectiveness is a matter of opinion

I guess all those benchmarks in Dave's (or any for that matter) article were just his opinion...

Effectiveness is directly related to the latter two topics as well... oh, the logic.


EDIT: The irony in this thread is that 12+ pages of bs exist due to the persistent citing of non-issues. Had any legitimate reason been given, the thread would have ended.

EX: I wouldn't use SLi because:
A) None of the games I play work with it
B) I don't need the performance
C) I don't want to bother with the hassle
D) I don't want to spend that much
E) I don't want to replace my mobo/cpu right now
F) I don't like Nvidia and wouldn't buy any of their products

But no, we have a thread where it's more important to be right than to discuss anything worthwhile (because, frankly, there is nothing here).
 
Jawed said:
So, you're saying that both cards are sending each other the "half" of the FPB they've created, so that the "exposure" can be deduced? And there's a bandwidth crunch on the link between the two cards?
This is possible, of course, but it's more likely that nVidia has simply decided to disable SLI for these parts when render to texture is used, unless a specific compatibility flag is used.

I agree that HDR in FC isn't "official", but it still seems strange to me that the only released game that specifically uses a unique selling-point feature of NV4x isn't speeding up appreciably with SLI. Well, it's no skin off my nose. Just a huge marketing glitch, really.

Jawed
Well, I don't have an SLI setup, so I can't test it myself, but I suspect it may be possible to enable SLI with the latest Forceware drivers under this game.
 
trinibwoy said:
ANova said:
SLi's effectiveness is a matter of opinion; its price-to-performance ratio and advantages/disadvantages are not.

SLI's effectiveness in improving performance is not opinion - it is cold, hard FPS numbers, splattered over every single SLI review - including Dave's.

No, it is not, because SLi doesn't work with all games, and the games it does work with produce varying degrees of improvement. It's not as clear-cut as you guys are making it out to be.

ANova, based on your position, are you saying that there is absolutely no advantage of having two R520's in AMR over a single one?

That depends on how effective AMR is. If it's like SLi then it would have an advantage in some games, but not all. I don't think that's worth the price tag; yes, that's my opinion.

ninelven said:
I guess all those benchmarks in Dave's (or any for that matter) article were just his opinion...

Effectiveness is directly related to the latter two topics as well... oh, the logic.

See above. Oh the logic indeed. :LOL:
 
ANova said:
No, it is not, because SLi doesn't work with all games, and the games it does work with produce varying degrees of improvement. It's not as clear-cut as you guys are making it out to be.

Yes, you are correct, SLI has poor compatibility at the moment. But nobody plays all games, and it seems like the top titles at least are getting significant attention with regard to SLI support. Have you seen the latest compatibility list? The Doom3, Source (and I think UE3) engines all have SLI support, along with big titles like BF2, EQ2, WoW and Chaos Theory. I wish someone would do a more thorough analysis of performance improvements in supported titles. Dave? :)

ANova said:
That depends on how effective AMR is. If it's like SLi then it would have an advantage in some games, but not all. I don't think that's worth the price tag; yes, that's my opinion.

Ok.
 
No, it is not, because SLi doesn't work with all games.
So the game I have works with SLi or it doesn't (i.e. yes or no) - how could one live at such a speed? It's mind-boggling. In the land of bandwidth, fillrate, pipelines, AA methods, AA quality, AF methods, AF quality, memory interfaces, and clockspeed, that yes-or-no question just takes the cake.

and the games it does work with produce varying degrees of improvement. It's not as clear-cut as you guys are making it out to be

Just a guess here, but maybe, just maybe, that's why people benchmark SLi in the games it works in. Nah, I think if I want to know how SLi performed in a particular game I'll ask my magic 8 ball.
 