Quad SLI

So SLI AA 32xS is a "mix" of four frames, each rendered with the 8xS (4x multisampling + 2x supersampling) FSAA mode.
I knew it! :D
There is probably no other way to achieve a 32x mode, I think. :)

Anyway, it's absolute overkill even for year-old games, and it truly stretches the limits of bandwidth and blend/fill-rate capacity, even for a quad-GPU setup.
Maybe a discrete solution for managing the AA sampling and blending would be more viable at such a high degree, given the current hardware limitations on MSAA.
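
For what it's worth, here is a minimal sketch of how that "mix" could work, assuming each GPU renders the same frame at 8 samples per pixel with a different sub-pixel jitter, so averaging the four frames yields 4 * 8 = 32 effective samples. The jitter values, the ordered sample grid, and every name below are illustrative stand-ins, not NVIDIA's actual algorithm:

    import numpy as np

    # One sub-cell jitter offset per GPU in the quad setup (illustrative values).
    JITTERS = [(0.0625, 0.125), (0.1875, 0.125), (0.0625, 0.375), (0.1875, 0.375)]

    def render_8xs(shade, w, h, jitter, spp=8):
        """Stand-in for one GPU's 8xS pass: average spp shaded sub-samples per
        pixel, with the whole sample grid shifted by jitter. Real 8xS mixes
        MSAA and SSAA; this crude ordered 4x2 grid is just a placeholder."""
        ys, xs = np.mgrid[0:h, 0:w].astype(float)
        acc = np.zeros((h, w))
        for s in range(spp):
            sx = (s % 4) / 4.0 + jitter[0]
            sy = (s // 4) / 2.0 + jitter[1]
            acc += shade(xs + sx, ys + sy)
        return acc / spp

    def sli_aa_32xs(shade, w=64, h=64):
        """Equal-weight blend of four jittered 8xS frames, one per GPU."""
        frames = [render_8xs(shade, w, h, j) for j in JITTERS]
        return np.mean(frames, axis=0)

    # Example: a high-frequency checker pattern that aliases badly without AA.
    checker = lambda x, y: (np.floor(x / 1.5) + np.floor(y / 1.5)) % 2
    img = sli_aa_32xs(checker)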
 
fallguy said:
Totally not worth it to me. Too often it gets beaten by X1900s in CF, and far too many problems.
Well, according to the review it is pretty much utter trash, but that seems to be down to driver cock-ups. So it just means don't buy it now, not don't buy it ever, if you happen to have more money than sense :)
 
fallguy said:
Totally not worth it to me. Too often it gets beaten by X1900s in CF, and far too many problems.

The same thing could have been said about CrossFire and regular SLI when they came out; we just have to wait and see ;). I don't like these mega-card setups myself, but there is a market for 'em.
 
I understand that it has problems due to being very new tech. SLI has matured leaps and bounds compared to when it first came out, almost a year and a half ago. Even CF has matured a lot since the X850 debut, with the new chipset and video cards.

Even without all the lock-ups, crashes, graphical problems, etc., it is still not worth it to me. At least not yet. According to that review, an X1900 CF setup beats it at 2560x1600 on several occasions, and at my res of 1920x1200 it's pretty darn close. With 8xAA and above, CF generally has better frame rates at my res.

I suppose a lot of it is due to the clock speeds; aren't they 500/600? That's a lot slower than GTX speeds, yet you pay GTX prices: $500x4, for $2000. Hopefully the 7950X2 (right name?) will improve things, and a few driver updates will do the same. I say I'd never spend $2000 on video cards, but I never thought I'd spend $1000 either, and I have with SLI, and close to it with CF now. This dang 1920x1200 res keeps *forcing* me to upgrade to get playable frames in newer games. If I get the 30"... who knows.
 
I've always wondered what ATI or NV engineers could do if given a retail (i.e. board) price point of $1,200 for a single-GPU solution. An 8-quad R580 w/1024MB and a 512-bit bus? Then for a mere $900 you could buy one of the 6-quad fall-outs. :LOL:

But then that doesn't sell mobos, does it? And that seems to be part of the point as well. I'm expecting that within 3-5 years there will be only Intel, ATI, and NV left in the mid and higher end of the mobo market, with a bit player or two trying to sell chipsets for no-name whitebox mobos.
 
Inane_Dork said:
SLI and CF are good arguments for stereoscopic gaming. That would be interesting.

Very true, and the method is quite easy to implement in current engines too: render the scene from two slightly offset eye positions, put one view in the red channel and the other in the green/blue channels, and use those cheap 3D glasses we used to get at 3D movies ;). No need for special monitors at all.
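
As a rough sketch of that red-cyan anaglyph trick (the render function, the camera helper, and the eye-separation value are all hypothetical stand-ins, not any real engine's API):

    import numpy as np

    def anaglyph(render, camera, eye_sep=0.065):
        """Composite two horizontally offset views into one red-cyan image.
        render(camera) is assumed to return an HxWx3 float array."""
        left = render(camera.offset_x(-eye_sep / 2))    # left-eye view
        right = render(camera.offset_x(+eye_sep / 2))   # right-eye view
        out = np.empty_like(left)
        out[..., 0] = left[..., 0]     # red channel: left eye
        out[..., 1] = right[..., 1]    # green channel: right eye
        out[..., 2] = right[..., 2]    # blue channel: right eye
        return out

And the tie-in to this thread: with SLI or CF, each GPU could presumably render one eye's view, so the second viewpoint comes almost for free.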
 
geo said:
I've always wondered what ATI or NV engineers could do if given a retail (i.e. board) price point of $1,200 for a single-GPU solution. An 8-quad R580 w/1024MB and a 512-bit bus? Then for a mere $900 you could buy one of the 6-quad fall-outs. :LOL:

But then that doesn't sell mobos, does it? And that seems to be part of the point as well. I'm expecting that within 3-5 years there will be only Intel, ATI, and NV left in the mid and higher end of the mobo market, with a bit player or two trying to sell chipsets for no-name whitebox mobos.

Ya know, if cards get that expensive, I think ATi and nV will just pocket the money and give us the regular updates :LOL:.

I agree about the mobos; right now and in the future, the chipsets are also helping to sell cards.
 
Razor1 said:
Ya know, if cards get that expensive, I think ATi and nV will just pocket the money and give us the regular updates :LOL:.

I agree about the mobos; right now and in the future, the chipsets are also helping to sell cards.

They've proven there's a market there at that price point, and they've shown they are willing to spend engineering resources to craft bits and pieces to serve it. I wonder if the question at least got asked at ATI when they were still in their "not gonna do it!" phase, before deciding to go with CrossFire. And if it was asked, what engineering responded with. I have a feeling there must be an interesting memo or two floating around on it.
 
geo said:
But then that doesn't sell mobos, does it? And that seems to be part of the point as well. I'm expecting that within 3-5 years there will be only Intel, ATI, and NV left in the mid and higher end of the mobo market, with a bit player or two trying to sell chipsets for no-name whitebox mobos.
That would be extremely disappointing. I hope you are incorrect. With AMD putting the memory controller on the die, it seems it would be easier now to make a good mobo, though. Maybe my next mobo will have to be a SiS, VIA, or something to support them :) I don't want competition to cease.

Holy crap, the power consumption bits in that Tech Report article are crazy. I hate to say it, but it is pretty amazing that 2x X1900 XTX > 4x GX2 (or whatever you want to call them) on power draw.
 
No wonder Nvidia wants to re-re-launch quad-SLI. Seems that efficiency is in the dumps.

Right now ATI has the better solution for gaming at maximum detail at ultra-high wide-screen resolutions.

Right into Nvidia's marketing nuts!

Sxotty said:
Holy crap, the power consumption bits in that Tech Report article are crazy. I hate to say it, but it is pretty amazing that 2x X1900 XTX > 4x GX2 (or whatever you want to call them) on power draw.

It surprised me that CrossFire consumes such enormous amounts of power. After a quick phone call with ATI, I was told that the X1900 has a bigger die, so more heat output. Also, the new ring-bus memory controller takes up a lot of space and generates "unnecessary" heat because it is not fully utilized yet. However, as I am told, the design was an investment in the future and will pay off in the coming years.

:?:
 
I suspect "used to its full potential" is closer to the intent of that statement. Kind of like a Chevy Corvette plodding along at 55mph --it's still going to get worse mileage and put off more heat than the Honda Civic in the lane next to it also going 55mph.
 
Quad SLI is not that bad as a concept (in a geeky kind of way) for advancing processing speed in the medium term.
Even CPUs will reach quad cores soon, right?

I remember when the original nVidia SLI tech was launched, back in 2004.
At the time, it was criticized for excessive power draw, bad drivers, and lacking performance and game support, but two years later even ATI jumped on the desktop multi-GPU bandwagon.
And lots of people bought the marketing speak (I'm not included, but still...). There was in fact better performance than a single card, and in the high end that's all that matters; it's the "halo effect" that shines through the rest of the line from the IHV that's selling them.

I'd give it a year (tops) until Quad CrossFire, and by that time a set of G80/G81 cards in Quad SLI should be in a better position as the new king of the $1000 price point (at least as far as nVidia is concerned).

I mean, if a single Athlon FX-60 or an Intel Pentium XE 965 can cost that much money today, then a GPU (or a dual/quad/whatever setup) has the right to conquer that space in the market; it has been the most exciting and fastest-paced area of technology evolution in the PC scene for the last 10 years.


INKster.
 
My only question is how many of the performance issues are driver-related? Some of the benches from Xbit show incredible potential, but in some the quad setup would lose to the dual... Is it reasonable to believe that the reason these new rigs aren't dominating every benchmark is premature drivers, or are there other factors besides software holding them back as well?
 
INKster said:
Quad SLI is not that bad as a concept (in a geeky kind of way) for advancing processing speed in the medium term.
Even CPUs will reach quad cores soon, right?
Not really the same thing, since GPUs inherently run in a parallel fashion on a single monolithic chip. CPUs, however, do not.
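
A toy illustration of the difference, assuming nothing beyond numpy: per-pixel graphics work has no dependencies between pixels, so the same shading function can be applied to the whole frame as one data-parallel operation (the GPU way) or ground through one pixel at a time (the serial CPU way).

    import numpy as np

    def shade(x, y):
        """Toy per-pixel shader: depends only on its own pixel's coordinates."""
        return 0.5 + 0.5 * np.sin(x * 0.1) * np.cos(y * 0.1)

    h, w = 480, 640
    ys, xs = np.mgrid[0:h, 0:w]

    gpu_style = shade(xs, ys)        # whole frame as one data-parallel op

    cpu_style = np.empty((h, w))     # the same work, expressed serially
    for y in range(h):
        for x in range(w):
            cpu_style[y, x] = shade(x, y)

    assert np.allclose(gpu_style, cpu_style)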
 
What I don't understand is why NVidia is letting the old-style boards be benchmarked when there's a replacement on its way. Is it simply because of all the clamour for some kind of results? (It's been released all these months, but no benchmarks?!)

This is such a huge own-goal that I don't think ATI even needs to make one of those silly PDFs.

Jawed
 