nVIDIA's "SLI" solution

trinibwoy said:
According to pricewatch they aren't that rare anymore. If you wanted to pick up a 6800NU or GT you can (at least in the US). Doesn't matter to me anyway since I'm going to be waiting for all this BTX, PCIe, 939, Nforce4, SLI mess to clear up before upgrading.

Best Buy, Newegg, and CompUSA only have them on pre-order. IMHO, market availability shouldn't be determined by Pricewatch listings; the GF6800 is still quite a rare product.
 
trinibwoy said:
According to pricewatch they aren't that rare anymore. If you wanted to pick up a 6800NU or GT you can (at least in the US).
No, you can order one and they are in stock so they'll ship one out right away...but you still won't get it until July. :devilish:

Not that I'm saying "I told you so", at least not in quite those exact words.... ;)
 
Power

I try my best to use as little power in my house as possible. I turn lights off when I leave the room and I don't use air-conditioning, even though I have it. This trend towards monster power computers is really not my bag. It's pretty neat though. I can see why pros would want to use this sort of setup, but it's really wasteful for the average person. Is a single 6800 not powerful enough?
 
Props to nVidia for bringing back dual card solutions! The more choices the better.

On the other hand (as has been said), I can't really see the practicality (even for most "hard core" gamers) any time soon. Perhaps in a year, if dual PEG 16x slot boards become more than a rarity and work with more processors and consumer-friendly components. So I don't see consumer SLI NV40-based cards doing much of anything... maybe when the NV5x generation comes along "the market" will be able to accommodate it. I hope nVidia continues to offer SLI capability in their future designs.

Until then, NV40 solutions could be interesting for the pro / workstation market.
 
I think the method noted earlier, where you "upgrade" by buying a second card, is the most likely course for most users, and quite a tempting one at that.
 
Quitch said:
How will this ratio method hold up to the old scanline method? Better? Worse? Sounds inferior to me, but the IP for SLI is held by someone else, right?

If that load-balancing logic works, this approach might actually be faster. I would like to know how Nvidia plans to handle shaders that might require information from the other GPU's piece of the screen. I guess any SLI-type approach would have the same problem.
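
Pure speculation on my part about how that balancing could work: measure how long each GPU took on the previous frame and nudge the split line toward the slower chip's half. A minimal Python sketch; the function name, the proportional rule, and the gain factor are all my own invention, not anything Nvidia has described:

```python
# Speculative sketch of dynamic split-frame load balancing.
# Nothing here reflects Nvidia's actual driver internals.

SCREEN_HEIGHT = 1024  # scanlines to divide between the two GPUs

def rebalance(split, time_top_ms, time_bottom_ms, gain=0.25):
    """Shift the split line so the faster GPU takes on more scanlines.

    split          -- scanline where the top GPU's region currently ends
    time_top_ms    -- time the top GPU spent on the last frame
    time_bottom_ms -- time the bottom GPU spent on the last frame
    gain           -- damping factor so the split doesn't oscillate
    """
    total = time_top_ms + time_bottom_ms
    if total == 0:
        return split
    # Positive when the top GPU is the slower one (it used more than
    # half the combined frame time), so its region should shrink.
    imbalance = time_top_ms / total - 0.5
    new_split = split - int(gain * imbalance * SCREEN_HEIGHT)
    return max(1, min(SCREEN_HEIGHT - 1, new_split))

# Example: a geometry-heavy bottom half makes GPU 1 the bottleneck,
# so the split drifts downward, handing GPU 0 more of the screen.
split = SCREEN_HEIGHT // 2
split = rebalance(split, time_top_ms=8.0, time_bottom_ms=12.0)
print(split)  # 537
```

The cross-boundary shader problem wouldn't go away under any ratio, of course; pixels near the split that need data from the other half would still have to be duplicated or transferred somehow.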
 
SLI, then... damn, I couldn't care less about this - in fact, unless they decide to disable it in the mainstream / power mainstream parts (not likely, given the already existing Alienware PCX5750 setups), I can see this only hurting my sort of people (=cheap bastards). You see, if you can just link two mainstream parts together, there won't be enough of a performance delta to warrant the sheer existence of the Ultras and GTs - so, you have to make sure the lower cards STAY below through major feature cuts.

Not good. :?
 
Quitch said:
How will this ratio method hold up to the old scanline method? Better? Worse? Sounds inferior to me, but the IP for SLI is held by someone else, right?

Nvidia's SLI renders two halves of the screen separately, whereas 3DFX's SLI actually renders alternate scanlines. The two methods of splitting the load across two cards are fundamentally different, so I doubt the 3DFX IP is being infringed.
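
To make the difference concrete, here's a toy illustration of which GPU owns which scanline under each scheme (my own simplification, obviously not either company's actual code):

```python
# Toy comparison of the two work-partitioning schemes.
# Purely illustrative; not actual 3DFX or Nvidia code.

HEIGHT = 8  # tiny "screen" for demonstration

def scanline_interleave(height):
    """3DFX-style SLI: the two GPUs alternate scanlines."""
    return {y: y % 2 for y in range(height)}

def split_frame(height, split):
    """Nvidia-style split: GPU 0 owns the top region, GPU 1 the rest."""
    return {y: 0 if y < split else 1 for y in range(height)}

print(scanline_interleave(HEIGHT))   # {0: 0, 1: 1, 2: 0, 3: 1, ...}
print(split_frame(HEIGHT, split=5))  # {0: 0, ..., 4: 0, 5: 1, 6: 1, 7: 1}
```

Interleaving balances the load almost perfectly by construction, but contiguous regions are what make an adjustable ratio possible in the first place, hence the balancing logic discussed above.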

However, Nvidia and Alienware may go head to head over their respective SLI technologies, as they sound (at least superficially) similar.
 
This is just terrific... absolutely terr..rrific!!! :D 8) I can't wait. I really do enjoy seeing some people whine about the power requirements too. It will make it that much easier to obtain a setup while those who wait for ATi to do something similar will be missing out on the fun. :LOL:

I asked Hercules last year about them ever bringing out a production R350 MAXX solution. I basically got a "don't hold your breath." Of course, we all saw that one graphics board floating around too, so who knows?

Are two 16X PCI Express slots really necessary, though? ATi's Rage Fury MAXX was a single-slot solution with dual cores. Would one 16X PCI Express slot allow enough bandwidth for a single card with dual cores?

Doesn't look too good for ATi at the moment. I mean, with all these new goodies becoming available, the R500 had better be as groundbreaking as the R300 ever was (if not more so)... imho. :oops:
 
digitalwanderer said:
No, you can order one and they are in stock so they'll ship one out right away...but you still won't get it until July. :devilish:

Not that I'm saying "I told you so", at least not in quite those exact words.... ;)

Not if I choose FedEx super express overnight shipping for $300 :p
 
trinibwoy said:
Quitch said:
How will this ratio method hold up to the old scanline method? Better? Worse? Sounds inferior to me, but the IP for SLI is held by someone else, right?

If that load-balancing logic works, this approach might actually be faster. I would like to know how Nvidia plans to handle shaders that might require information from the other GPU's piece of the screen. I guess any SLI-type approach would have the same problem.

Is that software load-balancing? I'd have thought that would lead to inferior performance... is this being handled by the GPU or CPU? Hopefully not the latter; it's doing enough at the moment as it is.
 
From what I've read so far, the load-balancing logic is in the driver. As for the performance hit... that remains to be seen.
 
Quitch said:
I think the method noted earlier, where you "upgrade" by buying a second card, is the most likely course for most users, and quite a tempting one at that.

Sort of....but that still means your system must be "dual PEG ready"...or you'll have to upgrade your system down the line in order to take advantage of dual cards anyway. You are limiting your future options by tying yourself to a specific card / architecture and smaller variety of motherboards.

And as we know, "next generation" cards tend to not only bring new performance, but new features. So, if you buy one card "this year", and then buy a second card and new mobo "next year", your SLI rig would be competing with whatever new technology is on the market.

The situation with Voodoo2 SLI, in contrast, was much different: every mobo, chipset, and CPU on the market supported more than two PCI slots at the introduction of V2 SLI. It was a much less risky option to "buy one now and a second one later" as an upgrade path.
 
trinibwoy said:
digitalwanderer said:
No, you can order one and they are in stock so they'll ship one out right away...but you still won't get it until July. :devilish:

Not that I'm saying "I told you so", at least not in quite those exact words.... ;)

Not if I choose FedEx super express overnight shipping for $300 :p
No way, not even FedEx could prove me wrong at this point. 8)
 
You are limiting your future options by tying yourself to a specific card / architecture and smaller variety of motherboards.
How is buying one of these cards any different from buying any other card (other than the motherboard requirement)? The PCI-E slots are the same, so you can always choose a next-generation card if you feel the features are worth it.
 
Malfunction said:
Are two 16X PCI Express slots really necessary, though? ATi's Rage Fury MAXX was a single-slot solution with dual cores. Would one 16X PCI Express slot allow enough bandwidth for a single card with dual cores?

Yes, two cards are needed. This is not two cores on one card; it's two cards, each with one core, working together.
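
As for the bandwidth question: back-of-the-envelope numbers for first-generation PCI Express, using the published 2.5 Gbit/s per-lane signaling rate and 8b/10b encoding (my arithmetic, not anything from Nvidia):

```python
# Rough usable bandwidth of a first-generation PCI Express x16 slot.
LANE_RATE_GBIT = 2.5   # raw signaling rate per lane, per direction
ENCODING = 8 / 10      # 8b/10b line coding overhead
LANES = 16

gbytes_per_s = LANE_RATE_GBIT * ENCODING * LANES / 8  # bits -> bytes
print(f"x16 usable bandwidth: {gbytes_per_s:.1f} GB/s per direction")  # 4.0
```

That's roughly 4 GB/s each way, about double AGP 8x's ~2.1 GB/s, so a single 16x slot shouldn't be starved for bandwidth. The second slot is needed because these are two physical cards, not for bandwidth reasons.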

Malfunction said:
Doesn't look too good for ATi at the moment. I mean, with all these new goodies becoming available, the R500 had better be as groundbreaking as the R300 ever was (if not more so)... imho. :oops:

Don't be silly, this is going to be extremely niche. Even allowing for a hypothetical cheaper NForce 4 motherboard with dual PEG, this is going to need a lot of power, and a lot of cooling. It's also going to be anywhere from very expensive to insanely expensive.

Nvidia may sell some of these come Q3 when the motherboards start to arrive, but right now they can barely ship 6800s. Meanwhile ATI is raking in the cash with the R420, still selling R350-based cards, and killing Nvidia in PCIe OEM wins with the R423.

Nvidia could very spectacularly win the battle with SLI, but lose the war quietly to ATI on the OEM front.
 
Joe DeFuria said:
The situation with Voodoo2 SLI, in contrast, was much different: every mobo, chipset, and CPU on the market supported more than two PCI slots at the introduction of V2 SLI. It was a much less risky option to "buy one now and a second one later" as an upgrade path.

Well, considering that this news just broke today, I wouldn't be so hasty to pass judgement on its viability. If Nvidia decides it wants to push this to consumer boards, it can do so easily. I would be miffed if they didn't, though, as most of the hype is generated in the consumer sector.

If we see an Nforce4 in six months that supports dual PEG at a reasonable premium, then I'm sure lots of people will jump on this. An additional 6800NU will probably cost ~$150-$175 this time next year for those looking to upgrade, and the pair would provide performance similar to the top-end cards of the day.
 
ninelven said:
...The PCI-E slots are the same...
Wrong; there are PCI-E 1x and PCI-E 16x slots, and most motherboards most likely won't have more than one 16x PCI-E slot. All GFX cards will use the 16x PCI-E slot.
 
Nappe1 said:
ninelven said:
...The PCI-E slots are the same...
Wrong; there are PCI-E 1x and PCI-E 16x slots, and most motherboards most likely won't have more than one 16x PCI-E slot. All GFX cards will use the 16x PCI-E slot.

Huh? He meant that the PCIe 16x slots on all mobos are the same. How does buying one 6800 and a dual PEG mobo limit your upgrade options? It only expands them.....
 
Wrong; there are PCI-E 1x and PCI-E 16x slots, and most motherboards most likely won't have more than one 16x PCI-E slot. All GFX cards will use the 16x PCI-E slot.

That's what I hate about this board sometimes..... Obviously, if you were planning for an SLI setup you would have at least one card from NV using a PCI-E 16x slot. Thus, if you wanted to switch to a single next-generation card you would have a 16x slot. Furthermore, if you had already bought a dual 16x slot motherboard, I don't think there would be any requirement to fill both slots. Again, you would have no problem.
 