Quad SLI

Kanyamagufa said:
My only question is how many of the performance issues are driver related? Some of the benches from xbit show incredible potential. But then in some the quad setup would lose to the dual...is it reasonable to believe that the reason these new rigs aren't dominating every benchmark is because of premature drivers, or are there other factors beside software holding them back as well?

I think it's "teething" problems due to premature drivers. If one looks not too far back, you'll see CrossFire had issues when it was first released, but it worked through them as the drivers matured.

Pharma
 
INKster said:
Presler, Kentsfield ?

I think what bloodbob means is that you cannot compare CPU to GPU period, whether it be multi-core/SMP, NUMA or otherwise.

On topic, I'll be interested to see how UT2007 looks @ some silly resolution :cool:. Not to mention next-next-gen titles such as the great Crysis, if this niche market (however very cool) tech is still knocking around by then...
 
I'm a fan of multi-GPU systems, but I had the feeling NVIDIA were still beta-testing this before its time was due, and now I am even more convinced. That's not just the stability either: they have not got the hardware to fully do it justice. What they need for Quad SLI is, firstly, a much more powerful CPU (Conroe XE?); secondly, GDDR4; and thirdly, cheaper (and less power-hungry?) 80 nm chips. Hopefully, by the time they have all that, the drivers will be stable. To be honest, I'd much rather have 2x2 than 4x1 as well, even if it makes the PCB more complex.

Interestingly, the only two games that stood out for me with Quad SLI were the ones using HDR, Far Cry and Oblivion. Not being able to use AA in these titles means they look horrible unless you can push big resolutions to compensate.

Funniest comment in the Xbit Labs review? -

"Apparently, neither water, nor earth, nor skies could be displayed."

That would definitely reduce your gaming experience, even on a nice 30-inch Dell.
 
I'm very surprised that Xbit couldn't get Quad SLI working in BF2, because it worked just fine for us. The resolution changing issue in Call of Duty 2 is also an interesting one - we didn't see that either. I'm thinking that those two could be patch related, because (as far as I can see) they haven't listed the patches used.

Benchmarking Oblivion on this hardware was, without doubt, one of the most painful and frustrating benchmarking experiences I've ever had. :|
 
bigz said:
I'm very surprised that Xbit couldn't get Quad SLI working in BF2, because it worked just fine for us. The resolution changing issue in Call of Duty 2 is also an interesting one - we didn't see that either. I'm thinking that those two could be patch related, because (as far as I can see) they haven't listed the patches used.

Benchmarking Oblivion on this hardware was, without doubt, one of the most painful and frustrating benchmarking experiences I've ever had. :|

There was some indication from the TechPowerUp article that they got better stability with the Asus BIOS PEG Link setting at Normal/Disabled rather than Auto, so maybe Xbit did not have this set up the same way as they (or you) did?
 
Same with Hilbert (although Hilbert doesn't seem to know what the setting does). Definitely seems like it's not ready yet, which is surprising on a couple of levels.
 
dizietsma said:
There was some indication from the TechPowerUp article that they got better stability with the Asus BIOS PEG Link setting at Normal/Disabled rather than Auto, so maybe Xbit did not have this set up the same way as they (or you) did?
It's plausible, I guess. That's one of the first things I disable in the BIOS because, AFAIK, it overclocks part of the architecture. I've known consumers to have problems getting 7900 GTXs working with PEG Link mode enabled too, so it's not something I haven't seen before. IMHO, PEG Link mode is one of those settings designed to help the board score better in reviews, but it doesn't help with system stability.

Because NVIDIA's Quad SLI drivers are unstable in their current incarnation, anything that further decreases stability is liable to take the system over the edge.
 
PEG Link mode on Auto = overclocking of the GPU and the video memory. We've been asking Asus for months, maybe years, to disable it. I think that's now the case with new retail boards. Different story for review BIOSes, of course ;)

Regarding Quad SLI, there is something wrong with the current drivers and boards. On our system with the provided drivers (87.25), most of the games did not scale at all unless SLI AA was enabled. FEAR and Doom 3 were the only games that scaled correctly.

BTW, I published some AA sampling patterns with SLI and Quad SLI:
http://www.hardware.fr/articles/621-2/preview-quad-sli-pratique.html
There's no translation yet, but you can see that these patterns are far from optimal. In games, Quad SLI AA 32x mostly looks like SLI AA 16x while using twice the number of samples; the same goes for Quad SLI AA 16x vs SLI AA 8x. I hope NVIDIA can improve that (an offset SLI AA 8x pattern for Quad SLI AA 8x, and maybe 2x 8xS instead of 4x 4x for Quad SLI 16x).
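For what it's worth, the clustering complaint can be sketched numerically. The sample positions below are purely illustrative (a textbook-style 4x rotated grid, not NVIDIA's actual patterns), but they show the point: unioning a pattern with a barely-offset copy of itself leaves pairs of near-duplicate samples, while a deliberate offset spreads the extra samples out.

```python
import itertools
import math

# Illustrative sample positions within one pixel ([0,1) x [0,1)).
# NOT measured hardware patterns - just a 4x rotated-grid-like example.
BASE_4X = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def combine(base, offset):
    """Union of a base pattern and the same pattern shifted by `offset`."""
    ox, oy = offset
    shifted = [((x + ox) % 1.0, (y + oy) % 1.0) for x, y in base]
    return base + shifted

def min_pairwise_distance(samples):
    """A tiny minimum distance means clustered samples, i.e. wasted sample rate."""
    return min(math.dist(a, b) for a, b in itertools.combinations(samples, 2))

tiny_jitter = combine(BASE_4X, (0.02, 0.02))    # jittered near-duplicate, Quad-SLI-AA-style
good_offset = combine(BASE_4X, (0.125, 0.375))  # a deliberately offset second pattern

print(min_pairwise_distance(tiny_jitter))  # ~0.028: samples nearly on top of each other
print(min_pairwise_distance(good_offset))  # ~0.177: much better spread for the same cost
```

Under this toy metric, 8 samples with a tiny jitter buy almost nothing over the original 4, which matches the "32x looks like 16x" observation above.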
 
OMFG!
SLI AA - The Land of Jittering :D

Such a waste of sampling rate.


At first sight, comparing 16x SLI vs. 16x Quad SLI shots, it's obvious that the narrow jittered Quad sampling outputs worse quality. Honestly, 16x Quad SLI is hardly (if at all) better than the plain boring 4x MS mode.
 
This is probably the most effective solution that can be achieved using a fixed RG grid... ...there could be a better way: OGSS 1x2 (x RGMS 4x) / OGSS 2x1 (x RGMS 4x), blending the results together, but SLI probably doesn't support unequal framebuffer resolutions.
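As a purely hypothetical sketch of that blending idea (the coordinates are made up, and real SLI may not allow unequal framebuffer resolutions at all): one half renders with ordered-grid supersampling along the vertical axis, the other along the horizontal axis, and averaging the two frames yields four effective taps per pixel.

```python
# Illustrative only: effective per-pixel sample positions if one GPU pair
# rendered with 1x2 OGSS, the other with 2x1 OGSS, and the two frames were
# averaged. These coordinates are assumptions, not a real driver mode.
OGSS_1x2 = [(0.5, 0.25), (0.5, 0.75)]  # two taps stacked vertically
OGSS_2x1 = [(0.25, 0.5), (0.75, 0.5)]  # two taps side by side
blended = OGSS_1x2 + OGSS_2x1          # a diamond of 4 taps, each weighted 1/4
print(blended)
```

The appeal is that the blended diamond offers three distinct offsets per axis, where a plain ordered 2x2 grid would give only two; the framebuffer-resolution mismatch between the two halves is what likely makes it impractical on real SLI.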
 
It's a shame that NV didn't add programmable sampling patterns to GeForce 7; ATI has had them since R300, and 3dfx had them too (both in VSA-100 and Rampage).

I've seen benchmarks where the Voodoo5 at 4x performs exactly like the Voodoo4 at 2x, the same for the V5 6000 at 4x versus the V5 5500 at 2x, and so on (there's also that monster Quantum3D card with eight VSA-100 chips).

So NVIDIA's Quad SLI could have had 4x RGSS "for free" whenever SLI gives no scaling because of missing profile support or incompatible rendering techniques. And I know how nice 4x RGSS looks :). Instead, I'm waiting for someone to force 4x Quad SLI AA only, so we can have a good laugh.
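A quick sketch of why the rotated grid is the nicer freebie (the coordinates below are typical textbook patterns, not measured hardware): on a near-vertical or near-horizontal edge, only the distinct sample offsets along one axis contribute gradient steps, and 4x RGSS has four per axis where 4x OGSS has only two.

```python
# Illustrative 4x patterns within one pixel - assumed textbook positions,
# not vendor-measured ones.
OGSS_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]  # ordered grid
RGSS_4X = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]  # rotated grid

def distinct_axis_offsets(pattern):
    """How many distinct offsets the pattern has along each axis."""
    xs = {x for x, _ in pattern}
    ys = {y for _, y in pattern}
    return len(xs), len(ys)

print(distinct_axis_offsets(OGSS_4X))  # (2, 2): only 2 shades on an axis-aligned edge
print(distinct_axis_offsets(RGSS_4X))  # (4, 4): 4 shades, visibly smoother edges
```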
 
*sighs about SLI AA and SuperAA* Programmable or not, those sample densities are just silly for the quality improvements they yield (for both vendors) beyond current single-card modes such as 8xS or 6x, in my opinion.
 
If thought #2 is the standard by which thought #1 is proven, then the vast majority of computer users think that X1900 and G7900 are gimmicks.
 