nVIDIA's "SLI" solution

Discussion in 'Architecture and Products' started by 991060, Jun 28, 2004.

  1. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
I'm not sure this is what you were referring to earlier, but NVIDIA does seem to claim that their SLI implementation can scale geometry performance.
     
  2. g__day

    Regular

    Joined:
    Jun 22, 2002
    Messages:
    580
    Likes Received:
    2
    Location:
    Sydney Australia
I note that when I first raised the NVIDIA / Alienware dual GPU / CPU solution last week http://www.beyond3d.com/forum/viewtopic.php?t=13402 it quickly disappeared from here (in my mind absolutely the correct forum to discuss it and its implications) and was moved to the 3D graphics board section (I initially thought WTF??? why put it there?), done, I might add, without any reason or explanation as to why a moderator felt that was a more appropriate forum.

This, coupled with 20 pages about 2 lines of 3 dots, just says to me things seem a bit surreal lately, guys...

    But well done NVidia btw!
     
  3. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Actually, I don't think so (not that I like it). Remember that one video card still has to do all of the displaying. If the video cards are run in the mode where the downsampling is done at scanout, Quincunx could still very well be used.
     
  4. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    That's possible, and may be related to their load-balancing technology. Here's an algorithm that nVidia might be using:

    1. Cache one whole frame on the system side.
    2. Examine the transform matrix in order to compare the screen-space "y" values for each vertex to be rendered (which wouldn't require a full transform).
    3. Execute an algorithm that finds out where to place a clipping plane that would place half of the vertices above and half below. This would be fairly challenging to do efficiently, but is certainly not impossible.
    4. Send all triangles that have vertices in the lower portion of the screen to one graphics card, and all triangles with vertices in the upper portion of the screen to the other.

    Now, there are many performance issues associated with the above algorithm, particularly in the amount of CPU processing that would need to be done. There are also potential state change issues on the GPU side. These will definitely limit possible performance gains, and may be one reason why 3DMark2003 was used for benchmarking (since it's not very CPU-limited).

    It would therefore be beneficial if nVidia allowed users to disable the load-balancing, which would eliminate the geometry acceleration, and limit fillrate efficiency. It will be interesting to see exactly what the CPU impact of the load balancing is, though there is always the potential that nVidia has found a way of doing this efficiently on the GPU (though doing so may require sending the entire scene once, reading back a "coverage mask," and then sending it again).
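The speculated balancing scheme above can be sketched in code. This is purely illustrative, assuming a median split on post-transform screen-space "y" with straddling triangles submitted to both cards; none of the names here come from NVIDIA's actual driver, and a real implementation would avoid transforming every vertex on the CPU.

```python
from statistics import median

def project_y(vertex, mvp):
    """Return the NDC y of a homogeneous (x, y, z, w) vertex.

    Only clip-space y and w are computed, matching step 2's point
    that a full transform is not required just to compare heights.
    """
    clip_y = sum(m * v for m, v in zip(mvp[1], vertex))
    clip_w = sum(m * v for m, v in zip(mvp[3], vertex))
    return clip_y / clip_w

def split_for_sfr(triangles, mvp):
    """Partition triangles between two GPUs by screen-space y.

    triangles: list of 3-vertex tuples in homogeneous model space.
    mvp: 4x4 model-view-projection matrix as a list of rows.

    Returns (split_y, top_ids, bottom_ids). The split is placed at
    the median vertex height so roughly half the vertices land on
    each side (steps 2-3). A triangle whose vertices straddle the
    split is sent to both GPUs, since each half of the screen must
    rasterize its visible portion (step 4).
    """
    ys = [[project_y(v, mvp) for v in tri] for tri in triangles]
    split_y = median(y for tri in ys for y in tri)

    top_ids, bottom_ids = [], []
    for i, tri in enumerate(ys):
        if max(tri) > split_y:
            top_ids.append(i)
        if min(tri) <= split_y:
            bottom_ids.append(i)
    return split_y, top_ids, bottom_ids
```

Note that the median pass over all vertices is exactly the CPU cost the post worries about: it touches the whole frame's geometry before anything is submitted, which is why caching a frame (step 1) is needed and why a GPU-side coverage-mask approach might be attractive despite requiring the scene to be sent twice.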
     
  5. Lecram25

    Newcomer

    Joined:
    Dec 3, 2003
    Messages:
    103
    Likes Received:
    0
    lol, what exactly does this have to do with Tarolli? :roll:
     
  6. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    False, BTX presents a variety of cooling benefits.

    Huh? It should be cheaper and easier to build than what DELL currently does with their Thermal Module.

False again. The BTX form factor will work fine with an IMC. You just need to shift the DIMMs slightly forward from the preferred embodiment.

Hmm, can't think of an add-in card that needs airflow besides the video card.

Drives are just fine. Most drives are cooled primarily via thermal coupling to the case anyway.

Why not? A single 120mm fan can easily provide enough airflow for both while making almost no noise. In addition, the thermal difference between the air provided to the video card and the dead air provided in an ATX case is at worst minimal.

BTX is designed to make the system thermal solution cheaper overall, by reducing the number of fans needed and simplifying attachment. When BTX takes off, even things like an X800 Pro will be sold fanless with a simple aluminum heatsink that will end up with much better thermal performance than the current designs.

    Aaron Spink
    speaking for myself inc.
     
  7. dizietsma

    Banned

    Joined:
    Mar 1, 2004
    Messages:
    1,172
    Likes Received:
    13
BTX will probably not take off if the sounds coming from Taiwan are to be believed. AMD don't like it either. Even Intel only like it because it was a last-ditch solution for Prescott and Tejas, and now the last ditch has arrived, with BTX in it....

It's a smart move by nVidia because even if you never ever buy that second card, you buy the first one because it gives you the potential to buy the second. Another tick for the GT over the X800 Pro. Also, nVidia get to sell a chipset as well. Those crafty beggars!

I don't see the problem with the money; there are a lot of young car enthusiasts spending an awful lot more on dump valves and carbon fibre air boxes and extra shiny injectors and other ridiculous things that I have no interest in, so I don't see why we should sound like our own parents about what is "reasonable".

I have an AMD 754 system at present and have ordered a 6800U, but if there is something that will give me a noticeable increase in performance (i.e. 20-30%) come Christmas with an AMD64 San Diego and maybe this option, then I am prepared to pay.
     
  8. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    83
    BTX makes only Intel's life easier because of the large heat output of upcoming Intel chips. Far Eastern PC makers hate it. It means new cases, PSUs and motherboards for Intel chips, and Intel keep changing the spec on them. How long have we been talking about BTX, and it's still not been finalised?

    If Intel really wanted to make it work, they should have had BTX ready for PCIe, so that those doing a major upgrade to PCIe could also make the jump to BTX.
     
  9. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,492
    Likes Received:
    979
    Location:
    en.gb.uk
That's completely false. BTX will make life worse for graphics cards. BTX is a solution to Intel's problem, and Intel don't give a flying fudge about the rest of the industry.
     
  10. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    That depends on whether you planned ahead or not, doesn't it? Which works out as more cost-effective? Actually, it's probably going to depend on the final benches.

    It is quite likely this has more of a future as a very niche, top end product. After all, no matter how good your PC you're limited by what's available at the time. Being able to link two cards together compensates somewhat.

I wonder if cards could end up competing with themselves as people opt to buy a second old one rather than move on to the new one. What happened in the era of the Voodoo 2 / Voodoo 3?
     
  11. Nick Spolec

    Newcomer

    Joined:
    Apr 7, 2004
    Messages:
    199
    Likes Received:
    0
    I have to say, the way the cards connect together seems all fine and dandy... But what about the inevitable heat issue?

Most video cards emit heat from behind the core (on the opposite side from the HSF). The heat generated by the bottom card in NVIDIA's SLI arrangement will undoubtedly affect the top card, because usually you want COOL air to be sucked into a HSF, not HOT.

I am guessing people who actually have this setup would be well advised to have a fan blowing in between the cards.
     
  12. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,563
    Likes Received:
    171
    Location:
    In the Island of Sodor, where the steam trains lie
    Nahh you take advantage of it. The card on top could have a thermocouple to extract electricity from the waste heat of the bottom card and so save power. :p
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    This is probably why they elected for the blower design that they did.
     
  14. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    Heh, an interesting take for the revisionists :)
     
  15. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Huh? There's no way a normal fan design would work with two closely-packed video cards.
     
  16. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    Neither would a blower. Air intake is severely reduced by the back of the other card in both designs.
     
  17. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    Depends on your case too. My current project is to drop a 120mm right on top of my gpu so the gpu/cpu share the air. In this config two blowers won't be starved for cool air.
     
  18. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    I was under the impression we were talking about the situation in generic BTX cases, not custom built rigs. :?:
     
  19. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    I don't own a DELL :p
     
  20. MeltedRabbit

    Newcomer

    Joined:
    Jun 4, 2003
    Messages:
    13
    Likes Received:
    0
Remember how, a couple of years ago, with hard drives in RAID configurations, even if you had the same size and model of hard drive, or with dual-CPU systems with same-speed CPUs, sometimes the system did not work? Or how, when one installs dual-channel RAM, it works best to get a dual-channel kit or two DIMMs at the same time? The problem in many cases is that there has been a minor revision change and something has changed, making the two items incompatible. Does anyone seriously expect to buy one 6800 from one AIB manufacturer and another 6800 from a different AIB manufacturer six months later and expect SLI to work? For that matter, even if you wait six months and buy another 6800 from the first AIB manufacturer, the odds are that it still won't work.

    [conspiracy mode=on]
I am imagining various board makers changing the pinout of the SLI connector on their cards to break compatibility with other manufacturers' cards. Anyway, how will you get the SLI bridge board if you don't buy the two 6800s at once? Retailers are not going to stock something they may only sell 30 of that costs maybe $30. For that matter, I think nVidia only announced SLI as a marketing gimmick to get people to buy PCIe 6800s with the SLI connector, cards which AIB manufacturers will certainly charge a premium for, sold to people who think they will just get another 6800 in six to twelve months and expect it to work. For that matter, in six to twelve months the 6800 will be old news and will probably have been replaced by an even faster card.
    [conspiracy mode=off]
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.