NVIDIA's "SLI" solution

Wow

I can't afford one new video card, let alone two. How about making an affordable card before suggesting we buy two? I guess the pros and fanatics will eat this up, but it's pretty much useless to 99.9% of the PC market.
 
I find it amusing that so many people have reacted negatively to this. It's an option for those with the money. If you don't have the money...well sucks for you. Haters!!! :p

The Inq did report that Nforce4 may have dual PCIe 16x support. Guess we'll have to wait to see the merit in that rumor.

I don't think that a dual PCIe 16x mobo and two 6800NU's will cost that much more, roughly $800 compared to ~$650 for a regular mobo and 6800U/X800XT now. The power supply will be an issue, but it's a chance for all those people with until-now-completely-overkill 550W PSUs to strut their stuff :D
 
Nforce 4 supports dual PEG :oops: So the cost of Nvidia SLI suddenly got a lot cheaper. All it takes is for someone to build a consumer board for AMD with an extra PEG 16x slot in it. Given that the likes of Asus, Epox, etc. will probably be doing Nforce 4 boards, making a slightly more exotic variation looks pretty likely.

Sure, you've still got double the cost of the cards, but if you were going to upgrade to a new PCIe system and be buying a new card anyway, it's really only the cost of one additional card. Buying a consumer level AMD board with an extra card is going to be much cheaper than having to go to a workstation level Xeon system with an extra card.
 
I just hope Nforce4 comes with built-in WiFi and SoundStorm2 is better than my Audigy2, 'cause there won't be any room left for anything PCI with an SLI solution installed :?
 
Uttar said:
One small number that might interest you guys: AFAIK, this solution is approximately 35% faster than Alienware's :)
Thanks Uttar, where did you get that number from? (If you don't mind me asking)

LeStoffer said:
What is their target group with this expensive beast anyway? Show off or actually demand?
The reviewer crowd. ;)
 
trinibwoy said:
Why wouldn't you be able to overclock it? As long as you can get the same stable overclock on both cards it should be fine.

Such systems in the past have often been very timing-critical. Overclocking them enough can screw up the timing, making them slower than when clocked normally. IIRC, this was often the case for Voodoo 2 SLI, V5500/6000 and ATI MAXX. Multiple chip/board solutions have to work very tightly together.
 
The only good thing about it is that in, let's say, 2 years when your 6800 isn't fast enough anymore you can just buy a second one. Still, I'd prefer to just buy a new card then instead of having 2 monsters in my box, but anyway it's something you can consider.
 
thop said:
The only good thing about it is that in, let's say, 2 years when your 6800 isn't fast enough anymore you can just buy a second one. Still, I'd prefer to just buy a new card then instead of having 2 monsters in my box, but anyway it's something you can consider.

Nahh, 'cos in two years you'll be looking for a DX10 SM4.0 card.
 
Bouncing Zabaglione Bros. said:
Such systems in the past have often been very timing-critical. Overclocking them enough can screw up the timing, making them slower than when clocked normally. IIRC, this was often the case for Voodoo 2 SLI, V5500/6000 and ATI MAXX. Multiple chip/board solutions have to work very tightly together.

Well, I definitely don't have any understanding of the technical limitations, but intuitively it shouldn't be an issue. As long as each card is stable and able to render its workload, it should work fine. According to this, I may be right.

V-Sync: It turns out that the fear that V-Sync would need to be enabled during parallel GPU operation is unfounded. NVIDIA has solved this problem by using a buffer to merge the two cards' signals. The danger of seeing tearing in a scene is therefore no greater than when using a single graphics card with V-sync deactivated. This way, the cards can run at the highest possible frame rate without being limited by the monitor's refresh rate.

Overclocking: Overclocking the card will still be possible through the driver. In this case, both cards will be overclocked to the same level.
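To make the split-frame idea from that article concrete, here's a rough Python sketch. This is purely my own illustration, not NVIDIA's actual driver logic; every name in it is made up. Each GPU renders a horizontal slice of the frame, the slices get merged into a single output buffer, and the split ratio is rebalanced based on how long each card took, so faster hardware gets more scanlines next frame.

```python
def render_sli_frame(height, split, render_top, render_bottom):
    """Render one frame across two GPUs and merge the halves.

    `split` is the fraction of scanlines given to GPU 0. The two render
    callables stand in for each card: they take a scanline range and
    return (pixels, seconds_taken).
    """
    cut = int(height * split)
    top, t0 = render_top(0, cut)              # GPU 0: scanlines [0, cut)
    bottom, t1 = render_bottom(cut, height)   # GPU 1: scanlines [cut, height)
    merged = top + bottom                     # "merge buffer": one full frame

    # Rebalance for the next frame: give each GPU scanlines in
    # proportion to the speed (lines/sec) it just demonstrated.
    speed0 = cut / t0
    speed1 = (height - cut) / t1
    new_split = speed0 / (speed0 + speed1)
    return merged, new_split
```

So if GPU 0 took twice as long as GPU 1 on an even split, it would get roughly a third of the frame next time around.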
 
trinibwoy said:
Well, I definitely don't have any understanding of the technical limitations, but intuitively it shouldn't be an issue. As long as each card is stable and able to render its workload, it should work fine. According to this, I may be right.

Well I'm not saying it's impossible with Nvidia's SLI solution, just that it's always been problematic with past multi-chip/board solutions.
 
thop said:
The only good thing about it is that in let's say 2 years when your 6800 isn't fast enough

Personally if I were to entertain this tech it would be to buy a 6800GT or 6800NU now and buy another in a year or so if/when they begin to get taxed. Given SM3.0 support I think that a dual GT/NU setup will still be overkill a year from now.

Imagine if you could double up on your 18 month old 9700PRO now for ~ $150. Sure it won't be as fast as a X800XT but it will still be hella fast and won't cost you $500 either. :LOL:
 
nelg said:
IMHO alternate frame rendering is a better solution. Imagine temporal AA on a dual R420. :oops:

What are the advantages of alternate frame rendering over an interlaced/split-screen approach? Guess you wouldn't need the 'merge buffer' anymore.
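For the curious, a toy sketch of the difference (purely illustrative, nothing to do with either vendor's actual implementation): alternate frame rendering hands whole frames to alternating GPUs, so there's no merge step at all, while a split-screen approach divides every frame between both GPUs, so the pieces have to be recombined.

```python
def afr_assignment(frame_index):
    """Alternate frame rendering: even frames to GPU 0, odd to GPU 1.
    Each GPU produces complete frames, so nothing needs merging."""
    return frame_index % 2

def sfr_assignment(scanline, height, split=0.5):
    """Split-frame rendering: top portion of every frame to GPU 0,
    the rest to GPU 1. Both halves must be merged before display."""
    return 0 if scanline < height * split else 1
```

The flip side is that AFR only helps when frames are independent; anything that reads the previous frame's output forces the two GPUs to synchronize.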
 
nelg said:
IMHO alternate frame rendering is a better solution. Imagine temporal AA on a dual R420. :oops:

The multichip solutions in the E&S and SGI systems make better use of the sparse sampling from ATI. ATI's method doesn't necessarily stick just to a tiled method; each chip can render the same pixel with a different AA subsample pattern, and these can be rejoined and merged into a single pixel. (I'm wondering if a similar thing couldn't be achieved on a single chip - have each of the 4 quads in R420 rendering the same group of pixels to give 24X AA)
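Roughly what I mean, as a hypothetical sketch (not ATI's actual resolve hardware): each chip shades the same pixel at its own sparse jittered sample positions, and the final resolve just averages all subsamples from both chips, doubling the effective sample count per pixel.

```python
def resolve_pixel(samples_chip_a, samples_chip_b):
    """Merge the AA subsamples from two chips into one final pixel.

    Each argument is a list of (r, g, b) subsamples taken at that chip's
    own sparse sample positions for the *same* screen pixel. Averaging
    the combined set behaves like one chip with twice the samples.
    """
    combined = samples_chip_a + samples_chip_b
    n = len(combined)
    return tuple(sum(c[i] for c in combined) / n for i in range(3))
```

E.g. 6x sparse patterns on two chips would resolve like a single 12x pattern, as long as the two sample grids don't overlap.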
 
From Extreme Tech:

Extreme Tech said:
However, Nvidia's solution requires two PCI Express x16 slots. This is similar in concept to having two AGP slots in a system. Currently, the only known motherboards slated to ship with two PCI Express slots are based on the Tumwater chipset, which supports Intel's x86-64 CPU code-named Nocona.

Now, stop and think about that for a moment. The original SLI was something that almost any Voodoo 2 buyer could aspire to. Relatively few people did, mind you, but you always had the comfort of knowing you could do it in your current system.

But if you've got a high end PC, with a PCI Express graphics card, you'll have to upgrade to a pricey motherboard using one or two expensive CPUs on top of plopping down several hundred dollars on a new graphics card.

Now think of the power requirements. We'll have two GPUs that require around a hundred watts each, plus a couple of CPUs that need a hundred watts each. Hooray, we're up to 400W and we haven't installed memory or hard drives!

The boutique OEMs like Falcon Northwest and Voodoo PC will certainly offer this as an option, but you can be sure that any system that supports two PCI Express x16 cards will not be cheap. Of course, people who buy the system will have terrific bragging rights. "Dude, I got a 3D Mark score of 30,000!"
Emphasis mine, 400 watts for just GPUs & CPUs?!?! :oops:
 
Am I the only one who thinks it's extremely funny that the GeForce 6800 is still rarer than a clean prostitute, and NVIDIA is already giving us the option to buy two of them at the same time? :LOL: :LOL: :LOL:
 
digitalwanderer said:
Emphasis mine, 400 watts for just GPUs & CPUs?!?! :oops:

Do Xeon boards require you to populate all CPU sockets? I thought that you could run a dual board with only one CPU.

In any case, the best-case scenario of two 6800NUs and an A64 will still require some serious juice, so a PSU upgrade seems to be in order.
 
joe emo said:
Am I the only one who thinks it's extremely funny that the GeForce 6800 is still rarer than a clean prostitute, and NVIDIA is already giving us the option to buy two of them at the same time? :LOL: :LOL: :LOL:

According to Pricewatch they aren't that rare anymore. If you want to pick up a 6800NU or GT, you can (at least in the US). Doesn't matter to me anyway, since I'm going to be waiting for all this BTX, PCIe, 939, Nforce4, SLI mess to clear up before upgrading.
 
joe emo said:
Am I the only one who thinks it's extremely funny that the GeForce 6800 is still rarer than a clean prostitute, and NVIDIA is already giving us the option to buy two of them at the same time? :LOL: :LOL: :LOL:

LOL!

How will this ratio method hold up to the old scanline method? Better? Worse? Sounds inferior to me, but the IP for SLI is held by someone else, right?
 