Dual PCIE from VIA

All,

I have tested out a solid-state HD, but it is on PCI. I would love to see one on 16x PCI-E. Dual 16x PCI-E slots can have more uses than just video cards. =)
 
Some people here seem to see SLI as a cost-worthy upgrade for mini enthusiasts. I don't think the performance of a very good SLI setup with mid-range cards can match that of the top-of-the-line card alone. Let's say the top-of-the-line card is $500, and the mid-range cards are $200-250 × 2, plus a more expensive mainboard for the dual SLI config. Or we can skip that and keep it as plain as possible.
The performance of the SLI config won't double compared to a single-card config. My guess is a boost of at most around 75% from SLI.
Isn't SLI more of a fun thing for the richest, who can afford to buy all the stuff?
I don't really think it's a solution for any other market.

And dang does BTX have any solution for SLI?
Imagine the heat and power requirements! :oops:
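The cost argument above can be sketched with a few lines of arithmetic. All numbers here (card prices, the ~55% relative mid-range performance, the motherboard premium, and the ~75% SLI scaling) are the poster's guesses or illustrative assumptions, not measured figures:

```python
# Rough perf-per-dollar sketch for the single high-end card vs.
# mid-range SLI comparison. Every number here is an assumption.

def perf_per_dollar(perf, cost):
    return perf / cost

# Single top-of-the-line card: baseline performance 1.0 at $500.
single = perf_per_dollar(1.0, 500)

# Two mid-range cards: assume each is ~0.55x the top card, that SLI
# adds ~75% over one card, and a ~$50 premium for an SLI board.
mid_card_perf = 0.55
sli_perf = mid_card_perf * 1.75          # ~0.96x of the top card
sli = perf_per_dollar(sli_perf, 2 * 225 + 50)

print(f"single: {single:.5f} perf/$")   # 0.00200
print(f"SLI:    {sli:.5f} perf/$")      # 0.00193
```

Under these assumed numbers the single card narrowly wins on perf-per-dollar, which is the poster's point; shift the assumed prices or scaling a little and the conclusion flips.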
 
Unit01 said:
Some people here seem to see SLI as a cost-worthy upgrade for mini enthusiasts. I don't think the performance of a very good SLI setup with mid-range cards can match that of the top-of-the-line card alone. Let's say the top-of-the-line card is $500, and the mid-range cards are $200-250 × 2, plus a more expensive mainboard for the dual SLI config. Or we can skip that and keep it as plain as possible.
The performance of the SLI config won't double compared to a single-card config. My guess is a boost of at most around 75% from SLI.
Isn't SLI more of a fun thing for the richest, who can afford to buy all the stuff?
I don't really think it's a solution for any other market.

And dang does BTX have any solution for SLI?
Imagine the heat and power requirements! :oops:

I agree, the only point of using dual PEG is to achieve otherwise impossible performance levels with two top-of-the-line video cards. The mid-range SLI solution does not make sense.

However, since only nVidia is offering SLI now, I wonder how you will fit two dual-slot cards with two power connectors each, for a total of four power connectors, into a single system without having major stability problems. Not to mention you will lose four slots.

To tell the truth, I really don't see much of a point; it's more of a marketing tool, I guess.
 
Looks like most people forgot about Voodoo2 SLI. That SLI only gave a 10% to 20% boost and had major problems with many games. One Voodoo2 was more than what was needed, and SLI is only really going to be used for workstation CAD/animation work, not games. It's like a gamer using RAID 5 in a gaming system. :rolleyes:
 
Um, from what I remember, V2 SLI offered one hell of a lot more than a 10% to 20% boost. It offered a higher resolution (1024x768) and close to twice the performance in real games.

nVidia's SLI should offer a similar doubling of performance, and may be cost-effective for some people. Depending on how the benchmarks of the 6600 shape up, it could be more cost-effective to purchase a 6600 SLI system (if one is upgrading the motherboard at that time) than a GeForce 6800 GT or Ultra. After all, with a 500 MHz core, the 6600 Ultra in SLI could, in cases where fillrate is the primary limitation, outdo a single GeForce 6800 Ultra.
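The fillrate claim above is simple back-of-the-envelope arithmetic. The pipeline counts and clocks below are assumed from contemporary spec sheets (a 500 MHz 6600-class part with 8 pixel pipelines vs. a 400 MHz 6800 Ultra with 16), so treat this as a sketch rather than a benchmark:

```python
# Theoretical pixel fillrate: core clock (MHz) x pixel pipelines.
# Specs below are assumptions based on period spec sheets.

def fillrate_mpix(core_mhz, pipelines):
    """Theoretical fillrate in Mpixels/s."""
    return core_mhz * pipelines

gf6600_sli = 2 * fillrate_mpix(500, 8)   # two cards combined
gf6800u = fillrate_mpix(400, 16)         # single 6800 Ultra

print(gf6600_sli, gf6800u)  # 8000 6400
```

So in a purely fillrate-limited case, the assumed SLI pair has roughly 25% more theoretical fillrate, though real games rarely scale that cleanly.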
 
We are nearing the situation we were in with the V5. The bus can't easily be doubled again (to 512 bits), improvements in memory speed are tapering off, and both Nvidia and ATI are bumping up against process limits. So how do you scale bandwidth and performance to fill the gap until .09u and .065u are ready? Will a 512-bit bus, 2x-speed RAM, or .09u be ready by Christmas '04? What's the near-term solution?

Nvidia gambled on DDR in the V5 days, time horizons played out well, and they won. Had DDR not been there? Hmm.. What can Nvidia and ATI gamble on today to double bandwidth?

SLI has the potential to alleviate this pressure. The extra bandwidth will allow higher levels of AA, and super-sampling may even be an option. How effectively it runs depends on how the game is written (e.g., how it uses render targets).
 
trinibwoy said:
Based on that argument the X800XT and 6800 Ultra shouldn't exist either since they fall within that 1% bracket.

Exactly. What's up with all this snotty-nosed negativity all of a sudden, huh?

"Regular users don't need it", blah blah. Regular users don't visit sites like this, or play Doom3 either. They use Excel and Word and such, and when they come home from work they surf porn on the internet; they do just fine with Intel Extreme Graphics. Why worry about what regular users need or don't need?

Worry about your own needs instead.
 
Aside from the other potential benefits, one thing that seems ignored in this discussion so far is that the cost won't be double for incremental upgrading: the cost of the second card should be significantly less if it is purchased later. SLI keeps a door open for moving from "mid-end" to high-end performance as bargains and market forces play a role, without depending on consumer-awareness lag for cost savings (i.e., selling the old card to buy a new one).

Also, the mid and low end seem likely to remain more prone to price decreases than the high end, due to aggressive price competition (at least when the high-end parts are strong, as is the case with this generation), making each card's price more likely to be significantly lower the longer after the initial offering it appears.

Some things that could counteract this are a significant price premium for SLI-capable motherboards, and/or rapid high-end turnover with price reductions on the prior high end and performance "mid end". Except that the latter actually lends itself to SLI again (the 6800 Ultra seems problematic for SLI due to power usage, but the GT is more reasonable, performs well, and seems more prone to a lowered price in that scenario to boot).
 
PatrickL said:
There is no negativity towards SLI. There was towards guys that claimed it will be a mass market feature.

I don't think anyone ever claimed that. It was claimed that if you plan well and predict lowered prices on mid-range parts, it may prove to be a viable upgrade path. While some are being optimistic, others are going out of their way to find scenarios where SLI is not cost-effective. I don't know why, since if it is cost-effective, that's just one more option we have and a plus for the tech-savvy consumer.
 
Well, saying that it won't ever be mass-market is having a closed mind. Perhaps it won't, and there are many arguments against it; but perhaps it will, and there are many arguments for it.
 
nForce4 SLI will not even offer 2 × 16 lanes, nor 16 + 8 lanes; if you want to use both PCIe slots, it can only provide 2 × 8 lanes.
 
And since bus bandwidth typically isn't a large limiter of performance for video cards, I don't see how this is a problem.
 
PatrickL said:
nForce4 SLI will not even offer 2 × 16 lanes, nor 16 + 8 lanes; if you want to use both PCIe slots, it can only provide 2 × 8 lanes.

I'm pretty sure the MCP will provide more than 16 lanes total in nForce4.

Rys
 
PatrickL said:
Maybe i misread: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2175

NVIDIA's elegant graphic solution runs 16 lanes into what appears to be a separate switching bridge chip. This bridge can be electrically configured to either run all 16 lanes to one PEG interface, or 8 lanes to two PEG interfaces

Interesting. Hopefully, as Chalnoth mentioned, an 8x PCIe interface should be more than sufficient for today's cards.
 
An 8x PEG with two 6800 Ultras would run much faster than a 16x PEG with a single 6800 Ultra. The bus interface is not a dominant performance factor, especially with the SLI traffic happening over a private bus. 8x PEG still maintains the bidirectionality of PCIe plus the bandwidth of AGP 8x.
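The bandwidth claim above checks out with first-generation PCIe numbers. As a sketch, assuming ~250 MB/s per lane per direction for PCIe 1.x and ~2133 MB/s peak for AGP 8x (which is a single shared direction):

```python
# PCIe lane math behind the 8x-PEG-vs-AGP-8x comparison.
# Figures assumed: PCIe 1.x = 250 MB/s per lane per direction;
# AGP 8x = 2133 MB/s peak, not full-duplex.

PCIE_LANE_MBPS = 250
AGP8X_MBPS = 2133

def pcie_bw_mbps(lanes):
    """Per-direction bandwidth of a PCIe 1.x link."""
    return lanes * PCIE_LANE_MBPS

x8 = pcie_bw_mbps(8)     # 2000 MB/s each way, 4000 MB/s aggregate
x16 = pcie_bw_mbps(16)   # 4000 MB/s each way

print(x8, x16, AGP8X_MBPS)  # 2000 4000 2133
```

So an x8 link is a hair below AGP 8x in one direction, but its aggregate two-way bandwidth is nearly double, which is the bidirectionality point being made.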
 
Chalnoth said:
And since bus bandwidth typically isn't a large limiter of performance for video cards, I don't see how this is a problem.
I guess his point was that, given the way nVidia "implemented" SLI with nForce4, every other chipset supporting a single PEG link would be just as capable, which kinda takes away CK8-04's unique selling point.

cu

incurable
 
I think true dual-16x PEG boards will be more expensive. System RAM can't even handle the full bandwidth of two 16x slots operating at saturation, so it's wasted bandwidth. I think the solution where a single card gets 16x, and two cards get 8x each, is quite elegant.
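The saturation argument can be put in rough numbers. Assuming dual-channel DDR400 system RAM (typical for the era, ~6.4 GB/s peak) and first-gen PCIe at 0.25 GB/s per lane per direction:

```python
# Sketch of the RAM-vs-slots saturation argument. All figures are
# assumed era-typical peaks, not measurements.

DDR400_DUAL_CHANNEL_GBPS = 2 * 3.2   # 6.4 GB/s peak system RAM
PCIE_X16_GBPS = 16 * 0.25            # 4 GB/s per direction per slot

# Two x16 slots both streaming from RAM at full rate would demand:
two_x16_demand = 2 * PCIE_X16_GBPS   # 8 GB/s

print(two_x16_demand, DDR400_DUAL_CHANNEL_GBPS)  # 8.0 6.4
```

Under these assumptions, two saturated x16 links would indeed outrun the memory subsystem, so provisioning 2 × 8x loses little in practice.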
 