More "baseless" talk from the Inq?

dksuiko said:
What about somebody who is going to build a new computer from scratch and plans to buy it around November or December? In that case, going with SLI would only cost about $400-$500 more vs a non-SLI version.

But is the Inquirer article talking about November?

The article is talking about right now, as the article is published right now.


In 5 years I'm sure I can get an SLI 6800 Ultra rig for 10 bucks. What does that have to do with SLI being a tech demo right now, when this article is published? Nothing, except some Nvidia fans can't see Nvidia cast in a bad light.
 
jvd said:
Nothing, except some Nvidia fans can't see Nvidia cast in a bad light.

You are laughable. So anyone considering upgrading to an SLI system is an Nvidia fan now? Pathetic. And exactly how is providing a high-end dual-GPU solution equivalent to being cast in a bad light? You should've taken the blue pill ;)
 
WaltC said:
Well, to be synchronized they have to take the same amount of time...;) That wasn't what I was talking about specifically. I was talking about the penalty incurred by the synchronization.
Synchronization means that the card that's first to finish has to wait for the other one. Load balancing reduces this wait time to almost nothing. There is no other significant penalty for the sync.

As well, how is it distinguished as to which is better, AFR or split-frame rendering?
The user can choose which one to use, or leave the decision to the driver, which I guess would prefer AFR until it recognizes some operations that make AFR inefficient.
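
Just to illustrate the difference between the two modes (a rough sketch only -- this is not Nvidia's driver logic, and the numbers are invented):

Code:
# Sketch of the two multi-GPU distribution modes discussed above.
# AFR = Alternate Frame Rendering, SFR = split-frame rendering.

def afr_schedule(num_frames, num_gpus=2):
    """AFR: each GPU renders whole frames in turn."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

def sfr_split(height, split_fraction=0.5):
    """SFR: each frame is cut into horizontal bands, one band per GPU.
    split_fraction sets where the screen is divided."""
    split_row = int(height * split_fraction)
    return [(0, split_row), (split_row, height)]  # (top, bottom) rows per GPU

print(afr_schedule(6))       # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
print(sfr_split(1200, 0.7))  # [(0, 840), (840, 1200)] -- a 70/30 split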

It's been so long that I cannot recall precisely, but it seems to me a single V2 could do 1024x768, max. I'm pretty sure that V2 SLI could do up to 16x12, but can't recall precisely. My V3 could do 2048x1536 with 16MB of RAM right out of the box, although of course it was very slow...;)
I have a pretty detailed document from 3dfx here stating that the max resolution for a single V2 is 800x600. The reason is that one buffer wasn't allowed to straddle a 1MiB boundary, and 800x600 at 16 bits per pixel is just slightly less than 1MiB.

With 3dfx SLI two odd-even scanline fields of 800x600 combined is certainly a 1600x1200 final resolution frame. In 3dfx SLI each GPU renders every other scanline. They never render the same ones, and so obviously I don't know what you mean about "2x 800x600."
1600x1200 is four times 800x600, not two times.
V2 SLI running 1024x768 meant each card does 1024x384.
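
The arithmetic here is easy to check. A quick sketch (the 1MiB-per-buffer constraint is from the 3dfx document mentioned above; the rest is just the numbers from this exchange):

Code:
# Checking the Voodoo2 numbers above: a 16-bit frame buffer must fit
# within a 1 MiB boundary (per the 3dfx document cited).

MIB = 1 << 20  # 1,048,576 bytes

def buffer_bytes(width, height, bytes_per_pixel=2):
    return width * height * bytes_per_pixel

# Single V2: the whole frame lives in one buffer.
print(buffer_bytes(800, 600) < MIB)    # True  (960,000 bytes)
print(buffer_bytes(1024, 768) < MIB)   # False (1,572,864 bytes)

# V2 SLI: each card renders every other scanline, i.e. half the rows,
# so at 1024x768 each card's buffer is only 1024x384.
print(buffer_bytes(1024, 384) < MIB)   # True  (786,432 bytes)

# And the resolution point: 1600x1200 is 4x the pixels of 800x600.
print((1600 * 1200) // (800 * 600))    # 4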

But IIRC "NvSLI" allows you to drive a dual (or even quad) display system rendering one 3D scene, each display at its resolution limit. There you have the increased resolution.

Which is not like 3dfx SLI at all, which was my point.
Effectively, it is the same.

The point was it's redundant on your nVSLI purchase, unlike with 3dfx SLI, where it not only wasn't redundant, it didn't exist.
That redundancy is pretty insignificant. Using multiple displays, there is no redundancy at all.
 
jvd said:
dksuiko said:
What about somebody who is going to build a new computer from scratch and plans to buy it around November or December? In that case, going with SLI would only cost about $400-$500 more vs a non-SLI version.

But is the Inquirer article talking about November?

The article is talking about right now, as the article is published right now.


In 5 years I'm sure I can get an SLI 6800 Ultra rig for 10 bucks. What does that have to do with SLI being a tech demo right now, when this article is published? Nothing, except some Nvidia fans can't see Nvidia cast in a bad light.

nVidia fan? Ah, no wonder I have an ATI card sitting in my computer right now. Is everything nVidia vs. ATI with you? Grow up. In any case, if you want to nitpick - fine, it is a technology demo right now. But writing an article proclaiming that point is a bit silly, isn't it, since it will come out within the next few months? I don't think anybody expected or speaks of SLI as being available today. nVidia might, but that's obviously PR speak.

Anyway, I find your willingness to discredit SLI ridiculous, given your primary reason - that it is not available today. When the X800 Pros and 6800 GTs were announced and previews were released, you didn't see any articles saying: "Pfft, please. Get this ridiculous technology out of my face. They're just technology demos and are at least two months away. Don't even consider them as an upgrade." Such an article would be incredibly short-sighted - just like the Inq article.

As for your "In 5 years blah blah blah" comment, it pretty much illustrates your unwillingness to understand my point and those of the others disagreeing with you. You're obviously bent on downplaying SLI and won't be convinced otherwise, nor will you objectively discuss the subject. I guess since it isn't out yet, you have a window of opportunity to bash it. But what am I talking about? This is video cards here, we must be zealous. :rolleyes:
 
trinibwoy said:
jvd said:
Nothing, except some Nvidia fans can't see Nvidia cast in a bad light.

You are laughable. So anyone considering upgrading to an SLI system is an Nvidia fan now? Pathetic. And exactly how is providing a high-end dual-GPU solution equivalent to being cast in a bad light? You should've taken the blue pill ;)

My father is putting together an SLI 6800 Ultra rig for himself. He is looking right now. Nvidia has been advertising SLI for about the last 2 or 3 months.

You're the pathetic one who knows he is wrong, so he has to say "in the future". Well, in the future a lot of things will come to pass. One day real 3D displays will be cheap. But what does that have to do with today?



Sorry to say, but you are all missing the more interesting posts by WaltC and Xmas about how SLI actually works. But then, with your lack of reading comprehension, and the fact that the article was talking about SLI right now, I'm sure you wouldn't get half of what they are saying.

Perhaps you should have taken that other pill.
 
jvd said:
You're the pathetic one who knows he is wrong, so he has to say "in the future".

Please indicate what I am wrong about. And "the future" is a couple of months away. We aren't talking 2010 here.

Sorry to say, but you are all missing the more interesting posts by WaltC and Xmas about how SLI actually works. But then, with your lack of reading comprehension, and the fact that the article was talking about SLI right now, I'm sure you wouldn't get half of what they are saying.

Haha, you're funny. My verbal SAT score is probably higher than what you got overall. :LOL: :LOL: Even WaltC said he was willing to wait and see if Nvidia's claims pan out before embarking on the full-blown anti-SLI crusade that you seem to be on.
 
Please indicate what I am wrong about. And "the future" is a couple of months away. We aren't talking 2010 here.

A couple of months is the future. Especially considering when SLI was announced.

But of course that doesn't help your point, so you try to say that a couple of months is not the future.

Haha, you're funny. My verbal SAT score is probably higher than what you got overall.
Doubtful. I only got a 1310, which is not great, but it's not horrible.

Even WaltC said he was willing to wait and see if Nvidia's claims pan out before embarking on the full-blown anti-SLI crusade that you seem to be on.

Crusade ?

Heh hardly.

The article calls SLI a tech demo. The article was talking about now. You prove my point that right now it's only a tech demo. If you can prove that it is not a tech demo right now, then the article and I are wrong.

But according to you, we have to wait for the future. Which means you can't prove it wrong.

It's very simple. Sorry you can't understand simple things. But whatever.

I'm tired of listening to your pro-Nvidia crap no matter what's going on.


I wish there was an ignore feature on this board. Though I'm just going to do it myself and ignore you.
 
jvd said:
A couple of months is the future. Especially considering when SLI was announced.

Its announcement has nothing to do with its availability coinciding with my upgrade schedule. If it was announced last year, it would make no difference to me as long as it's available when I'm ready.

jvd said:
I'm tired of listening to your pro-Nvidia crap no matter what's going on.

Pro-Nvidia? HAHAHAHAHA!!! If refusing to be an ignoramus and denounce SLI as a sham until I see some solid data brands me as an nvidiot, then so be it. But coming from Mr. ATI himself, that is friggin' hilarious. :LOL: :LOL: :LOL: :LOL:

Anyway, you're just a hard-headed zealot whose opinion will count for nought in my upgrade decision if SLI does prove to be useful, so I'm done with you.
 
How about starting this talk again when SLI motherboards hit retail (not reviewers' offices), so we have a better idea about the costs?
 
dksuiko said:
I wouldn't roll your eyes so soon - while the nForce4 chipset may have support for socket 754, whether or not motherboard makers will build one is a whole other story. Coupled with the fact that AMD seems to be phasing out 754 (or at least has no real plans to release faster A64s for it), I'd be pleasantly surprised if there actually does happen to be a 754 version.

I know it was a bit premature 8) . But socket 754 just got a new lease on life with the Semprons. Socket 754 is almost like 939 without the second memory channel, so it's not a biggie from the chipset's point of view if the chipset is modular enough.
 
SLI will be similar to dual-CPU motherboards, IMO. The boards will be expensive and very few people overall will be willing to go that route. The vast majority of people are pretty cheap (budget-minded) when it comes to computers and really don't buy more than what they can get by on or need. Unless an SLI setup can be had for little more than a non-SLI setup, it won't come close to having a chance of going mainstream. An SLI setup is going to be strictly high-end enthusiast for a while. Maybe it can push down to the mainstream in a year or 2 -- but that is a big maybe.

If I were setting up a system now, I would probably go for a good Enermax 400W PSU. On a potential SLI setup I would probably have to jump up to the next-level PSU plus a more expensive motherboard. Rather than spend extra money on an SLI setup with a 6600GT, I think you would be better off spending the extra money on a 6800GT now with a non-SLI setup.

If an SLI option had been available 2 years ago and I had a 4600, I don't think adding another 4600 today would be a very good option -- the technology is too outdated, i.e. lousy AA, slow AF, slow/old shaders. That being said, I think the current cards will hold out longer than the 4600s did, because of the paradigm shift in graphics to shaders. But even with a 9500 Pro system -- well, would I have to hunt around on eBay for another 9500 Pro? Probably better to sell the 9500 Pro and get a new card.
 
Xmas said:
WaltC said:
Well, to be synchronized they have to take the same amount of time...;) That wasn't what I was talking about specifically. I was talking about the penalty incurred by the synchronization.
Synchronization means that the card that's first to finish has to wait for the other one. Load balancing reduces this wait time to almost nothing. There is no other significant penalty for the sync.

As well, how is it distinguished as to which is better, AFR or split-frame rendering?
The user can choose which one to use, or leave the decision to the driver, which I guess would prefer AFR until it recognizes some operations that make AFR inefficient.

It's been so long that I cannot recall precisely, but it seems to me a single V2 could do 1024x768, max. I'm pretty sure that V2 SLI could do up to 16x12, but can't recall precisely. My V3 could do 2048x1536 with 16MB of RAM right out of the box, although of course it was very slow...;)
I have a pretty detailed document from 3dfx here stating that the max resolution for a single V2 is 800x600. The reason is that one buffer wasn't allowed to straddle a 1MiB boundary, and 800x600 at 16 bits per pixel is just slightly less than 1MiB.

With 3dfx SLI two odd-even scanline fields of 800x600 combined is certainly a 1600x1200 final resolution frame. In 3dfx SLI each GPU renders every other scanline. They never render the same ones, and so obviously I don't know what you mean about "2x 800x600."
1600x1200 is four times 800x600, not two times.
V2 SLI running 1024x768 meant each card does 1024x384.

But IIRC "NvSLI" allows you to drive a dual (or even quad) display system rendering one 3D scene, each display at its resolution limit. There you have the increased resolution.

Which is not like 3dfx SLI at all, which was my point.
Effectively, it is the same.

The point was it's redundant on your nVSLI purchase, unlike with 3dfx SLI, where it not only wasn't redundant, it didn't exist.
That redundancy is pretty insignificant. Using multiple displays, there is no redundancy at all.

Voodoo2 could do 1024x768 if you disabled the Z buffer ;) Otherwise all it could do was 800x600. Even in SLI setups the most it could do was 1024x768 (with Z, though).
 
SLI to me is an option which many mainstream people might appreciate. Not everyone puts money down on a completely new rig every couple of months. There are many of us who perform incremental upgrades on the slowest bottleneck every couple of months.

Soon, I would be seeing an upgrade path that goes like this.
Oct - 6600GT (This will probably be the most viable/affordable SLI-capable card.)
Jan 2005 - New 939 motherboard and CPU.
May 2005 - Another 6600GT.

I'm not referring to the high-end enthusiast here. I'm referring to the Joe Average who wants acceptable performance. Even if Joe Average doesn't buy the 2nd 6600GT in the future, the very thought of having that as an upgrade option might very well sway his initial purchasing decision.
 
Comparisons between SLI and dual-CPU mobos don't make much sense to me. Why?

Having two video cards is beneficial to the mainstream gamer. Having two CPUs is beneficial to a much smaller segment.

nVidia wants you to buy two video cards. They're trying to make SLI a relatively mainstream option, so we're not going to have to jump through hoops to get it when nForce4 hits the streets.
 
Smurfie said:
Soon, I would be seeing an upgrade path that goes like this.
Oct - 6600GT (This will probably be the most viable/affordable SLI-capable card.)
Jan 2005 - New 939 motherboard and CPU.
May 2005 - Another 6600GT.

I don't think that would work, since the person would need to have bought a new PCIe board in October as well. I think it's more like -

Jan 2005 - New 939 motherboard, CPU and 6600GT.
July-Sep 2005 - Another 6600GT.

Admittedly, the 6600GT upgrade path is a much harder sell than the 6800GT, since there are single cards that can compete with 2x 6600GT. No single card would touch 2x 6800GT if Nvidia rolls out SLI as it claims.
 
WaltC said:
In the example nVidia provides, the Master card renders 70% of the frame and the Slave renders 30%, which means the Slave is not running at full speed but is constrained to match the rendering speed of the Master card rendering 70% of the frame. Theoretically, I suppose, the Master card should generally run ~30% faster than it would if it were rendering 100% of the frame; but OTOH the Slave, tasked with rendering the remaining 30% of the frame, would necessarily be rendering at far below its top theoretical speed, since its rendering output would be capped to exactly synchronize with the output of the Master card, which is tasked with 70% of the workload for every frame.

Er, I have no idea how nv's SLI is really going to work, or what the real performance benefit will be, but you have missed the point of the claims nv has made about load balancing - if one of the two cards is rendering 70% of the scene, that's because the remaining 30% of the scene rendered by the other card generates the same (or very similar) rendering demand. It's all in the name - "load balancing". If the load-balancing layer were perfect (which of course it won't be, but who knows how good it actually is), then both cards would do a precisely equal amount of work, and neither would be "waiting" for the other and hence not working at full capacity. It's pretty simple really.
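
To put a toy example behind that: a perfect balancer would pick the split line so the estimated rendering cost above and below it is equal, even if that means a lopsided split in screen lines. The per-line costs below are entirely invented for illustration:

Code:
# Toy illustration: a 70/30 split in *lines* can still be a ~50/50 split
# in *work*, if part of the screen is cheaper to render.

def balanced_split(line_costs):
    """Return the last row of the top band such that the top band's
    cost is roughly half the total."""
    total = sum(line_costs)
    running = 0
    for row, cost in enumerate(line_costs):
        running += cost
        if running >= total / 2:
            return row
    return len(line_costs) - 1

# A 10-line "screen" where the top (sky + geometry) is expensive and
# the bottom (plain floor) is cheap -- costs invented for the example:
costs = [9, 9, 9, 8, 8, 2, 2, 1, 1, 1]
row = balanced_split(costs)
print(row)                   # 2: one card gets rows 0-2 (cost 27),
print(sum(costs[row + 1:]))  # 23: the other gets rows 3-9 -- near-equal work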
 
trinibwoy said:
Smurfie said:
Soon, I would be seeing an upgrade path that goes like this.
Oct - 6600GT (This will probably be the most viable/affordable SLI-capable card.)
Jan 2005 - New 939 motherboard and CPU.
May 2005 - Another 6600GT.

I don't think that would work, since the person would need to have bought a new PCIe board in October as well. I think it's more like -

Jan 2005 - New 939 motherboard, CPU and 6600GT.
July-Sep 2005 - Another 6600GT.

Admittedly, the 6600GT upgrade path is a much harder sell than the 6800GT, since there are single cards that can compete with 2x 6600GT. No single card would touch 2x 6800GT if Nvidia rolls out SLI as it claims.

I agree, 2x 6600GT doesn't make sense, especially since I'll betcha SLI performance will be some way off 2x (whatever card you are using). Although pricing for mobos etc. isn't known, whatever happens an SLI setup ain't gonna be cheap, so I can't see too many taking the 6600GT route. I reckon the 6800GT will be the weapon of choice for SLI.
 
caboosemoose said:
If the load-balancing layer were perfect (which of course it won't be, but who knows how good it actually is), then both cards would do a precisely equal amount of work, and neither would be "waiting" for the other and hence not working at full capacity. It's pretty simple really.

I would imagine it won't be quite as simple as that.

Assuming we're talking about the screen-split solution, the load balancing is probably going to be judged according to the prior frame - i.e. on frame 1 it measures that board 1 finished xx milliseconds before board 2, hence on the next frame it biases the split to give xx more lines to board 1 and xx fewer to board 2. A rough sketch of that feedback idea follows below.

It'll almost certainly never hit the best case - it will get close, but in "twitch" games you can move suddenly, completely redistributing the rendering requirements of a frame and throwing the bias out of whack. Even if it only alters the bias after measuring a number of frames, it's still working from an average.

Of course, it's going to be a hell of a lot better than nothing.
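
Here's a minimal sketch of that prior-frame feedback loop (the gain constant and the frame timings are made up; a real driver would surely be more sophisticated):

Code:
# Rough sketch of the feedback scheme described above: after each frame,
# nudge the split line toward the board that finished first. The gain
# value and timings are invented for illustration.

def adjust_split(split_row, t_board1_ms, t_board2_ms, height, gain=4.0):
    """If board 1 finished earlier, give it more lines next frame (and
    vice versa). 'gain' converts milliseconds of slack into rows."""
    delta_ms = t_board2_ms - t_board1_ms   # > 0 means board 1 was faster
    split_row += int(gain * delta_ms)
    return max(1, min(height - 1, split_row))  # keep the split on-screen

split = 600  # start with an even split of a 1200-line frame
for t1, t2 in [(14.0, 18.0), (15.5, 16.5), (16.1, 16.0)]:  # fake timings
    split = adjust_split(split, t1, t2, height=1200)
    print(split)  # 616, 620, 620 -- nudging toward balance until the scene changes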
 