More "baseless" talk from the Inq?

Discussion in 'Architecture and Products' started by Blade, Aug 26, 2004.

  1. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Precisely. Even on a low-end A64, many new and upcoming games will be mostly GPU-limited at high resolution and IQ, and this is where SLI may show its worth. I for one will be getting a mid-range A64 when I upgrade, whether I go SLI or not.
     
  2. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Someone wake me when the first hands-on reviews of nV4x "SLI" appear. nV4x SLI isn't remotely similar to 3dfx SLI; the only things the two have in common are that the IHV is asking you to buy two cards instead of one and that the "SLI" acronym is the same, while the linking processes are entirely different. I'm amazed that anyone would make the mistake of thinking the two terms, Scalable Link Interface for nV4x and Scan Line Interleaving for 3dfx, were in any way similar beyond the most superficial perception.

    Very much unlike 3dfx's 3d-only V2 (you needed a separate 2d display card to use with it), nV4x "SLI" does not attempt to balance the workload perfectly between the pair of cards by having each card render every other scan line and then combining the frames in the display. An overview of the process can be found here:

    http://techreport.com/etc/2004q2/nvidia-sli/index.x?pg=1

    According to the pre-shipping publicity info nVidia provided TR for this article, the two cards make no attempt to split the scanline rendering evenly; rather, they split the screen into discrete segments, and the ratio of work between the cards can be as divergent as 70/30, meaning the Master card does 70% of the rendering and the Slave card does 30%. In fact, the only graphical example of load-sharing nVidia provided TR for that report is the 70/30 split, which leads me to believe it will turn out to be far more typical than a 50/50 split:

    [image: nVidia's split-frame load-balancing diagram showing the 70/30 split]

    It of course wouldn't do to have one card rendering 70% of the screen at full speed while the other renders 30% at full speed, since the disparity in workloads would produce disparities in frame-rate between the discrete screen segments. It is therefore necessary to synchronize, and thus constrain, the performance of both cards to the Master card.

    In the example nVidia provides, the Master card renders 70% of the frame and the Slave renders 30%, which means the Slave is not running at full speed but is constrained to match the rendering speed of the Master. Theoretically, I suppose, the Master should run roughly ~30% faster than it would rendering 100% of the frame; the Slave, otoh, tasked with the remaining 30%, would necessarily run far below its top theoretical speed, since its output is capped to synchronize exactly with that of the Master, which carries 70% of the workload for every frame. Thus the prospect of greatly diminished returns comes into play when you consider how much performance the second card adds to the first, imo. This will be especially apparent where the majority of the "action" (pixel changes between frames) takes place within the 70% of the frame rendered by the Master card.
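    WaltC's diminished-returns worry can be put in toy numbers. A minimal sketch, assuming both GPUs render pixels at the same rate and the split ratio is fixed for the frame (both are simplifying assumptions, not nVidia's actual behavior):

```python
# Toy model: frame time under a fixed split-frame ratio.
# Assumes both GPUs render pixels at the same rate and the frame
# is only done when the slower share finishes (synchronization).

def speedup(split_master: float) -> float:
    """Speedup over a single card for a given master share (0..1)."""
    single_card_time = 1.0                     # one card renders 100% of the frame
    master_time = split_master                 # master renders its share
    slave_time = 1.0 - split_master            # slave renders the rest
    frame_time = max(master_time, slave_time)  # wait for the slower one
    return single_card_time / frame_time

print(speedup(0.5))  # ideal 50/50 split -> 2.0x
print(speedup(0.7))  # 70/30 split -> ~1.43x, well short of 2x
```

Under this model a persistent 70/30 split caps the gain from the second card at roughly 43%, which is the diminished-returns point made above.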

    As well, these load-sharing examples apply only to 3d games and applications which are *not already cpu-limited* on a single 3d card. Where the cpu is the limit on frame-rate performance, the addition of the second nV4x SLI card will make no performance difference at all. When and if these units actually make it into reviewers' hands for testing and evaluation, it will be interesting to see what this arrangement does with FSAA and AF applied, as even in cpu-limited games I would *think* FSAA and AF performance should improve dramatically over a single card--but I won't hazard a guess at this time.

    The main obstacle to this kind of arrangement has always been synchronizing the load-sharing between the cards. Although some competitive schemes similar to nV4x SLI were offered at the V2's height of popularity, none of them, to my recollection, ever came close to doing what V2 SLI did; the uneven division of the workload between the two cards and the necessity of synchronizing them turned out to be the bugaboo for these competing schemes. Nobody ever figured out how to share the workload better, or even as efficiently, as 3dfx SLI did, although a few screen-segment rendering approaches very similar to nV4x SLI were tried. I guess we'll know pretty shortly whether nVidia will succeed where the others failed...;)

    Some other differences between V2 SLI and nV4x SLI:

    *By splitting the scanline rendering between cards, a pair of V2's could render in much higher 3d resolution than a single V2 (doing 1600x1200 3dfx SLI simply amounted to each card rendering at 800x600 and combining the result in the display.) From what I've read to date, nV4x "SLI" will be constrained to the 3d resolutions possible through the Master card--there being no combination effect for resolution as with V2 SLI.

    *There was no 2d-display capability in the V2. There is in nV4x; however, from what I understand, 2d display in terms of resolution and performance will always be handled exclusively by the Master card in nV4x SLI--ie, the Slave will not be used for 2d display at all.

    *The general hardware environment today is much different than it was for V2 SLI. Whereas PCI slots for V2s and 2d-display cards were ubiquitous at the V2's introduction, nV4x SLI is not PCI but requires dual-slot PCIe x16 motherboards, which are currently about as rare as hen's teeth...;) Ie, if nVidia were shipping nV4x SLI today there'd be few who could use it, the expense of implementing it would be far greater than just the cost of the additional 3d card, and the market ramp-up for nV4x SLI is going to be slow going at best--really a pronounced difference between what 3dfx did then and what nVidia purports to be preparing to do today for its own version of "SLI."

    In the V2 era, 3d cards from all IHVs were much slower, cpus were much slower, as was ram, and 3d games were not nearly as demanding. Whereas a single V2 did two pixels per clock at ~100MHz (?), a single 6800GT does 16 per clock at 350MHz, etc. The emphasis in all 3d hardware development for the last four years has been almost exclusively on single-chip 3d, with the concept of V2 SLI fading away completely. Considering all of this, nVidia's announcement of its upcoming SLI really has me scratching my head, as it doesn't seem a likely money-maker for them for quite some time--at least until dual PCIe x16-slot motherboards become fairly ubiquitous among enthusiasts. I await hands-on reviews and evaluations with keen interest.
     
  3. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Why would you need a socket 939 board or a new cpu? You can get the top-of-the-line cpu for your current motherboard--in my case a 3700+, in most cases a P4 3.6GHz--without having to change your board. Most don't even need a new chip.

    Right now, though, the only board with dual PEG is a server board, and thus the cost of the board and the new chip is extremely high. If you want to dispute it, please post prices of a dual-PEG board and the chips that go along with it.

    Then nVidia should take the SLI branding off all their PCIe products and not advertise it.

    Because right now the only setup you can get for SLI is going to cost you $1k before the actual video cards.

    If you want to talk about the future, that's all great and dandy. How does that help a PCIe buyer right now?

    Claims? Please show me where I can build a 6800 Ultra PCIe SLI rig for under $1.8k (give or take due to sales). Not even the whole rig--just the video cards, cpu and mobo. And please don't gimp it with a 2.2GHz processor; put in a chip that will at least attempt to drive the SLI.

    With what chip?

    6800GT $400, extra 6800GT $400, SLI-capable board $100??? So for $900 you get everything but the chip you need.

    I would love to see what mobo is going to have dual PEG and cost $100. You'll have a hard time finding an AGP board with almost no features on it for $100.



    Lots of people like to talk performance when it comes down to this. No one wants to look into the future.

    Say in November you buy dual 6600GTs for $400, the board for around $150 and the chip for around $300.

    Now, what is dual 6600GTs going to offer you? It's 8x1x500x2, so it's going to offer more fillrate than the 6800 Ultra. It's also going to offer twice the vertex shaders of the 6800 Ultra. Bandwidth is the only problem.
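    jvd's fillrate comparison as back-of-the-envelope arithmetic, using only the figures quoted in the post (pipes × core clock; it deliberately ignores the memory-bandwidth caveat he raises):

```python
# Raw pixel fillrate from the numbers quoted above:
# 6600GT = 8 pipes @ 500 MHz, 6800 Ultra = 16 pipes @ 400 MHz.

def mpix_per_sec(pipes: int, mhz: int) -> int:
    """Theoretical fillrate in Mpixels/s (pipes x clock)."""
    return pipes * mhz

dual_6600gt = 2 * mpix_per_sec(8, 500)  # 8000 Mpix/s combined
ultra_6800 = mpix_per_sec(16, 400)      # 6400 Mpix/s

print(dual_6600gt, ultra_6800)          # more raw fillrate on paper for the pair
```

On these numbers the pair wins on paper, which is exactly why the bandwidth (and, as discussed later in the thread, memory size) caveats matter.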

    How is it going to perform? If it performs the same as a 6800GT, why would anyone buy the 6800GT? If it performs better, why would anyone buy the 6800 Ultra?

    If it performs worse, why wouldn't that person spend the same money and get a 6800GT?

    How about when the NV50 comes out? Will it be faster than the 6800 Ultra? Will it be faster than two 6800 Ultras? If you can get a second 6800 Ultra for $200 (as you can get 9700 Pros for $100 or less right now) and it performed around the NV50, why would anyone buy the NV50?

    What about the 6800 Ultra's refresh? What if the SLI performs better than that?

    All that's going to happen is nVidia eats into its own sales.

    It could be good for them in the short term but come back and bite them in the butt in the long run.
     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Why do you keep talking about the chip? I currently have an XP on an Nforce2 board--I will be buying a 939 board and a new chip regardless of SLI!!! The chip is not part of the SLI decision. What part of that simple fact don't you understand?

    WaltC, I do agree with some of your points--as unnecessarily verbose as they may be :wink:--but please point out one modern title that isn't gpu-limited at 1600x1200 with 4xAA/8xAF.

    Also, I don't particularly care if SLI makes money for Nvidia--don't know why some of you are so concerned. For the consumer it's just another option to consider.
     
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Seems we have very different interpretations of that report. My take was that Nvidia implemented some load-balancing logic that determines where most of the 'action' on the screen is occurring and splits the workload appropriately. So in a scene such as in Far Cry, where the top half of the screen contains a lot of sky, a 70/30 split may result in equivalent workload, given that object and detail density in the lower 30% is much higher. Please let me know if there is something in that article contrary to my analysis.
     
  6. jimmyjames123

    Regular

    Joined:
    Apr 14, 2004
    Messages:
    810
    Likes Received:
    3
    The PCIe 6800GT will have the ability to increase performance via SLI too. So the potential for higher performance is always there. For now, the 6xxx SLI config is 2 cards in SLI, and no more. Also, obviously there will always be a market for single gpu setups, as not everyone will be willing and able to invest in SLI setups.

    If it performs worse, then any rational person would buy a single 6800GT in the beginning. However, realistically, some people are going to invest in a single 6600GT now, and later in the future they can enhance their performance at any time for a marginal cost.
     
  7. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    I said $100 EXTRA!! Let me break it down for you -

    1) SLI vs non-SLI Socket 939 board ~ $100 extra
    2) 2x6800GT PCIe vs 1x6800GT PCIe - $400 extra

    $100+$400 = $500 - so SLI only costs $500 extra in this scenario. Capiche?

    And yes, the PSU that I intend to buy is capable of handling 2x PCIe GT's.
     
  8. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    The split is dynamically adjusted to account for the different workloads in the top and bottom of the screen.
    This adjustment of course lags at least one frame behind, but the load will rarely completely turn around.

    So in the best case both cards take exactly the same time to finish their share. But the best case isn't always going to happen, that's why NVidia doesn't claim 2x the performance. btw, split screen is only one of the two modes NVidia's SLI can use (the other being AFR).
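    The kind of frame-lagged adjustment Xmas describes can be sketched as a simple feedback loop. This is hypothetical logic for illustration only, not NVIDIA's actual driver algorithm:

```python
# Hypothetical split-frame load balancer: nudge the split line each
# frame based on which card finished the PREVIOUS frame sooner
# (the one-frame lag Xmas mentions). Not NVIDIA's actual code.

def adjust_split(split: float, t_top: float, t_bottom: float,
                 gain: float = 0.05) -> float:
    """Move the split toward the card that finished its share faster."""
    if t_top > t_bottom:  # top card was overloaded last frame
        split -= gain * (t_top - t_bottom) / (t_top + t_bottom)
    else:                 # bottom card was overloaded (or they tied)
        split += gain * (t_bottom - t_top) / (t_top + t_bottom)
    return min(0.9, max(0.1, split))  # keep both cards doing some work

split = 0.5
# Scene where the top half (sky) is cheap and the bottom half is dense:
for _ in range(100):
    t_top, t_bottom = split * 0.5, (1.0 - split) * 1.5
    split = adjust_split(split, t_top, t_bottom)
print(round(split, 2))  # settles near 0.75: top card takes ~3/4 of the lines
```

The loop converges to the ratio where both shares take equal time, which is the "best case" described above; a sudden scene change makes last frame's split wrong for exactly one frame.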


    Apart from the fact that two times 800x600 isn't 1600x1200 (and the max resolution for V2 SLI was 1024x768), maximum resolution today is limited by the output device/DVI rather than by the graphics card.

    But IIRC "NvSLI" allows you to drive a dual (or even quad) display system rendering one 3D scene, both at their resolution limit. There you have the increased resolution.

    Multiple displays. There's not much need for 2D acceleration.
     
  9. Drak

    Newcomer

    Joined:
    May 16, 2004
    Messages:
    71
    Likes Received:
    0
    Enthusiasts will buy pairs of nv50 to run in SLI mode of course :twisted:
     
  10. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    OK, it'll be interesting to see if the theory actually works...;)

    Well, to be synchronized they have to take the same amount of time...;) That wasn't what I was talking about specifically. I was talking about the penalty incurred by the synchronization.

    As well, how is it determined which is better, AFR or split-frame rendering?


    It's been so long that I cannot recall precisely, but it seems to me a single V2 could do 1024x768 max. I'm pretty sure that V2 SLI could do up to 16x12, but can't recall precisely. My V3 could do 2048x1536 with 16MB of ram ROOB, although of course it was very slow...;)

    With 3dfx SLI, two odd-even scanline fields of 800x600 combined certainly make a 1600x1200 final frame. In 3dfx SLI each gpu renders every other scanline; they never render the same ones, so obviously I don't know what you mean about "2x 800x600."

    Which is not like 3dfx SLI at all, which was my point.

    The point was it's redundant on your nVSLI purchase, unlike with 3dfx SLI, where it not only wasn't redundant, it didn't exist.
     
  11. dan2097

    Regular

    Joined:
    May 23, 2003
    Messages:
    323
    Likes Received:
    0
    Can someone confirm something: is it right that SLIing two cards theoretically doubles performance, but that memory remains as for one card?

    i.e. you SLI two 6600GTs and, taking inefficiency into account, you get, say, 6800 Ultra performance--but would it still effectively be only 128MB, and thus still nowhere near as good a card as the 6800 Ultra at very high settings?

    I mean, do all the textures have to be in both cards' ram?
     
  12. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Correct. Buying 2x 6600GTs at once would be a waste of time (and money)--you would be better off with a single 6800U.
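    The reason memory doesn't add up, in toy numbers (illustrative figures only; in split-frame or AFR modes either card may need any texture, so assets end up mirrored in both memory pools):

```python
# Why SLI memory doesn't add up: textures are duplicated in both
# cards' memory pools, so usable capacity stays at one card's VRAM.
# Illustrative numbers only.

cards = 2
vram_per_card_mb = 128               # e.g. a 128MB 6600GT

total_physical = cards * vram_per_card_mb  # 256 MB on paper
effective = vram_per_card_mb               # mirrored assets -> 128 MB usable

print(total_physical, effective)
```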
     
  13. Blade

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,168
    Likes Received:
    3
    Ugh, this thread turned ugly. Pretty typical, but eh..

    WaltC: Thanks for the in-depth post. I just learned a few things. One thing I would like to point out is that nVidia is claiming that SLI could give in the range of an 80-100% speed increase over a single card. This is what people like me are going on.

    Frankly, I don't care if nVidia's solution doesn't work because I'm going to read reviews before I make such a move anyway. If their NV4X SLI sucks, then I'll just upgrade with single video cards. I'm not holding my breath for V2-like SLI..

    JVD: Just skimming over your posts, I can see that we're not on the same page. The cheapest 939 CPU, the 3500+, is relatively affordable. SLI mobos probably won't be more than $200-250.

    I've just never heard cutting-edge retail products called "demos" before. I was looking for opinions on this. Obviously, an enthusiast would be willing to pay $400 more for 90% more 3D performance in an existing system like yours or mine..
     
  14. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    1600x1200 is 4x 800x600, not 2x, Walt. Scan Line Interleave means one V2 renders odd lines, and another, even lines. I don't think it meant you can double the vertical (scan line) res.
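    Pete's correction is easy to check with pixel counts. A quick sketch (taking Xmas's recollection above of 1024x768 as V2 SLI's maximum):

```python
# Doubling both dimensions quadruples the pixel count, so two cards
# at 800x600 cannot add up to 1600x1200.
assert 1600 * 1200 == 4 * (800 * 600)

# What interleaving actually gives you: each V2 renders every other
# scan line of the SAME target resolution, i.e. half the lines each.
target = (1024, 768)             # V2 SLI's max, per Xmas's recollection above
lines_per_card = target[1] // 2  # 384 scan lines per card
print(lines_per_card)
```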

    And I don't see why nV can't load-balance with their native solution if Alienware's equally elusive device can do the same with any cards.

    jvd, I was looking at high-end SLI from the POV of buying a whole new system, in which case the main point of interest is the premium a dual-PEG MB will command over a single-PEG one. Adding in the cost of a CPU for a currently unavailable board seems a bit unfair, as prices may drop by SLI's release. If you're going to factor in the cost of a CPU, why not also that of new RAM and a beefier PSU? I think the only fair way to compare SLI's cost is as a total system upgrade against a single-PEG, single-GPU system. That way, the extra costs are the dual-PEG MB, the bigger PSU, and the extra GPU. It's still an obscenely expensive option, further soured by mind-blowing depreciation, but at least it's an option.

    And, as some ATers have pointed out in the flame-fest on the same subject over there (where I was singled out as the ATi fanboy for not labelling SLI as heaven-sent or relevant to everyone :)), SLI is very interesting for professional workstations, where the extra cost may be well worth it.

    I don't think you can argue with its fit in the high end, as it's really the only option we'll ever have to double the performance of the fastest cards of the day, which can't broach the manufacturing limits. It may be hella expensive, but it's the only option to surpass current process and engineering limits, and you can't knock it for people who have the cash. If nV can sustain SLI by catering to a few professionals and high-end enthusiasts, then more power to 'em. (But I, like you, can't see the value in SLI for anything but the fastest cards.) I do think that the only reason we're seeing it is because of nV's trouble with manufacturing NV30s, trouble that doesn't seem about to disappear with 90nm and below.

    Sam brings up a very intriguing point about dual-display 3D acceleration, but splitting a display in the middle still seems less than ideal to me. Driving a single, ultra-high-res display sounds more exciting to me. Heck, if nV can finagle three-display acceleration out of SLI's four outputs (as Matrox did with Parhelia's two), that'd be awesome.
     
  15. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Because you don't currently need to buy a new chip or board to get a 6800 Ultra. So you have to factor in the price of a chip and a new board.

    After all, not everyone is you.

    Why can't you understand that simple fact?

    That's great. But as you keep saying, it's not for right now. So I guess it's really not an option, huh?


    I guess you missed my point. Yes, the 6800GT will have the ability to increase performance. But if you had $400 to spend, why would you buy one 6800GT--which is $350, right?--with 16 pipelines and 6 vertex shaders, when you can get two 6600GTs, which are 500x8x2 with 6 vertex shaders each? The performance could actually be greater for your $400 (we do not know yet, though).

    Or vice versa: why buy a dual 6600GT system if for the same price you get a 6800GT which is faster?

    You can say, well, you can get a 6600GT for $200 now and then in 8-12 months get a second 6600GT for $100. But on the other side I can say, well, get a 6600GT now, sell it in 12 months and get the 6800GT for $150, which offers more performance (if that's the way it breaks down).


    You can do that at any point in time, though, with any card. You could have gotten a 5700 Ultra for $200 a year ago and then get a 6600GT, which will most likely offer twice the performance of the 5700 Ultra. Is that what you're suggesting above? Except in that case you only take up one slot and you get a new feature set.



    I said $100 EXTRA!! Let me break it down for you -

    1) SLI vs non-SLI Socket 939 board ~ $100 extra
    2) 2x6800GT PCIe vs 1x6800GT PCIe - $400 extra

    And what about that chip you're going to buy?

    Or do you not run your board with a chip?

    1) socket 754 board: $75
    2) 3200+ Athlon 64: $250
    3) 6800GT: $400

    SLI:
    1) socket 939 board: $175
    2) 3500+ Athlon 64 (lowest you can buy): $400
    3) 6800GT x 2 = $800
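    Tallying the two builds quoted above (jvd's approximate 2004 street prices, taken as-is from the post):

```python
# Totalling the two builds quoted above (jvd's 2004 street prices).

single_card = 75 + 250 + 400   # socket 754 board + 3200+ A64 + one 6800GT
sli_rig     = 175 + 400 + 800  # SLI board + 3500+ A64 + two 6800GTs

print(single_card)             # 725
print(sli_rig)                 # 1375
print(sli_rig - single_card)   # 650 premium, on these numbers
```

So on jvd's own figures the SLI build carries a $650 premium over the single-card one, which is the crux of his cost argument.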


    Great for your power supply, btw.


    As for those who believe they can use a Sempron: the Sempron will only be for socket 754, and currently there is no Nforce4 platform for the 754.

    Well, the cheapest I can find the 3500+ is $400 (retail); the socket 754 3500+ is only $300, but the cheapest socket 754 chip is only $150 retail (2800+). Someone with an Athlon 64 system now would have to upgrade the mobo and get a new chip just to get SLI. Same thing with the P4s: if someone has a 3.2GHz P4, you'd have to buy a top-of-the-line chip for dual PEG.


    Besides, the article was clearly talking about right now. nVidia is advertising this as a feature, and yet fans here are telling me, oh well, wait till the Nforce4. Well, how does this help someone who is upgrading now? What makes SLI something to look at now? Nothing. As a matter of fact, it's just a kick in the face.

    Well, the reason I'm not adding in ram is that since the Athlon XP 2500+, AMD has been using DDR400, and that carries over even up to the 3900+ Athlon 64 socket 939.

    As for a power supply: well, on the high end I'm certain you will need it.

    Two 6800 Ultras, a 3900+ Athlon 64 and a gig of ram is going to use a lot of power.

    But I was just using an example of the three things you're definitely going to need if you're making the move to SLI.

    Besides, the article is talking about now, not the future. In the future a dual-core cpu is going to be cheap, but right now a dual-cpu setup is very expensive.
     
  16. dksuiko

    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    196
    Likes Received:
    0
    What about somebody who is going to build a new computer from scratch and plans to buy it around November or December? In that case, going with SLI would only cost about $400-$500 more vs a non-SLI version.
     
  17. Drak

    Newcomer

    Joined:
    May 16, 2004
    Messages:
    71
    Likes Received:
    0
    Nforce4 supports socket 754/939/940 :roll:
     
  18. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Do you look into everyone else's bank account before you make a purchase? Why should I care about others' needs when building my system? You make no sense.

    There are people like me who will not buy an el-cheapo Socket 754 board anyway, SLI or not, so your constant factoring of the CPU into the SLI equation is downright idiotic. Tell me, why do people currently buy Socket 939 boards when 754 is available? Damn, does it really bother you that much that people are considering the SLI option? Geez.

    Stop trying to find the situations where SLI will not make sense and realize that there are people (like me) who are still on Socket 462 and will be making an upgrade to 939 in the near future anyway. Why are you so passionate about this anyway? If you can't afford SLI, simply don't buy it. And if it turns out that SLI is crap, then just get a single card. Where is the problem? You act as though people are going to be spending your money.
     
  19. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Thank you.
     
  20. dksuiko

    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    196
    Likes Received:
    0
    I wouldn't roll your eyes so soon. While the nForce4 chipset may have support for 754, whether motherboard makers will actually build one is a whole other story. Coupled with the fact that AMD seems to be phasing out the 754 (or at least has no real plans to release faster A64s for it), I'd be pleasantly surprised if a 754 version actually does happen.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.