Nvidia went SLI because they can't compete?

Discussion in 'Architecture and Products' started by Redeemer, Mar 4, 2005.

Thread Status:
Not open for further replies.
  1. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Any HDR rendering requires render to texture, which is always going to be a sticky point for HDR. This means that nVidia has to produce a proper compatibility mode for this game when HDR rendering is used. All that you're seeing is that they haven't done this yet.
     
  2. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    zeckensack,

    In discussing SLI I think it's important to keep in mind that Nvidia is more than a GPU manufacturer. If you look at the quick adoption of SLI-capable mainboards by the major players (Asus, Gigabyte and MSI), it becomes clear that there is more to this than simply selling video cards. Nvidia is poised to introduce Intel chipsets featuring SLI, and I think I am right when I say that enthusiasts are excited about seeing this technology on the Intel platform. It's a very hand-in-glove relationship where one helps sell the other, and that is, ultimately, what Nvidia is looking to do.

    Your point about price pressure from SLI in future generations is an interesting one. There is obviously great pressure to keep prices down, or at least to ramp them up gradually so as not to completely shock the consumer. With a doubling in onboard memory capacity due, this will make things somewhat tricky. Because SLI doesn't effectively double your video RAM, it may very well be a difficult proposition to sell two 512MB cards at the premium they will demand on a per-unit basis. The logical choice would seem to be a single card with 512MB rather than two 256MB cards.

    One last point. I am not sure why you think the "high-end nuts" won't buy a new card just because a single card doesn't outperform their SLI configurations. Surely they will buy two no matter what? :p
     
  3. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    This is exactly why the argument against SLi only seems logical. The reality of the situation is that there are no apps requiring SLi that a single-card solution like an X800 XT can't handle, nor will there be one for some time (UE3 probably being the first). By the time UE3 comes out we'll have the R520 and possibly G70 if not beyond, which effectively makes SLi's single best advantage, its performance, useless. And of course that doesn't even include the fact that SLi simply doesn't work with all games, or doesn't work 100% in some of the games it does work with, whereas all games work fine with the alternative.

    Furthermore, just because people are buying into SLi hype and marketing doesn't automatically make it an effective platform, as I already mentioned. So that's a rather minute argument. People get duped into buying things all the time, and then try to justify their purchase to make themselves feel better; that's a human condition.

    Source
     
  4. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    It does? I thought the whole point of NVidia's implementation of a floating point buffer was that there was no render to texture? :? Or are you saying that the FPB is just a glorified FP16 texture?

    Still, the question remains, why does that get in the way of SLI?

    According to DB, http://www.beyond3d.com/reviews/nvidia/sli/index.php?p=30

    Nvidia has already created a "proper compatibility mode" (admittedly we have no idea whether they even considered HDR) for this game. Well, obviously it isn't optimal. Will it ever be? What's the hold-up? Will this problem recur with all games that use HDR? If HDR pans out to include most games, then SLI's kinda useless.

    Maybe this is all tied up with the architectural limitations of NV4x that mean that AA is inoperative with HDR. If NVidia fixes this limitation, maybe it'll also mean that SLI will work with HDR.

    Jawed
     
  5. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
     
  6. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    That depends on what resolution you want to run at. Can I run all my games at 1600x1200 with 4x-6xAA? For users with LCD monitors, running native resolution is imperative. I don't like to run below 1600x1200 *ever*. And high resolution alone doesn't get rid of aliasing issues.

    Will the R520/G70 be able to run UE3 in 1600x1200x4xAA @ a solid 60fps? What about 8xS? What if I want supersampling? That's something that will bring any single card to its knees, but provides superior AA when games are using alpha tested textures.

    However well an R520 or G70 performs, two R520s or G70s are going to perform even better. There are even people who like to run at 1920x1440 or higher.
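
    To give a sense of why supersampling at these resolutions brings single cards to their knees, here's a back-of-envelope fill-rate sketch. The numbers are illustrative assumptions, not benchmarks of any specific card:

    ```python
    # Back-of-envelope estimate of shading work at a given display mode.
    # All figures are illustrative; real workloads depend on overdraw,
    # shader length, bandwidth, and so on.

    def samples_per_second(width, height, fps, ss_factor=1):
        """Samples the GPU must produce per second; ss_factor is the
        supersampling multiplier (4 for 2x2 ordered-grid SSAA)."""
        return width * height * fps * ss_factor

    base = samples_per_second(1600, 1200, 60)      # 115,200,000 samples/s
    ssaa = samples_per_second(1600, 1200, 60, 4)   # 460,800,000 samples/s
    print(base, ssaa)
    ```

    In other words, 4x supersampling at 1600x1200 quadruples the per-frame shading work, which is roughly the deficit a second card in SLI is meant to cover.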
     
  7. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    It's not "glorified." Blending allows developers to make use of a floating-point render target in the exact same way that they use normal framebuffers. This makes HDR rendering much easier to implement, and faster as well.

    But the problem still remains that the FP16 color format cannot be directly interpreted by the DAC, and thus a tone mapping pass must be performed to map the FP16 colors to FX8 colors for normal framebuffer output (side comment: I'd really like to see a 10-10-10-2 framebuffer format support for the result of this tone mapping pass).
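
    To make that tone mapping pass concrete, here's a minimal CPU-side sketch using the simple Reinhard operator L/(1+L). Both the operator choice and the code are illustrative assumptions on my part, not how any particular game or driver does it; on real hardware this runs per pixel in a shader:

    ```python
    # Sketch of a tone mapping pass: linear HDR intensities (as an FP16
    # render target would hold) compressed into the FX8 (8-bit) range.

    def tone_map(hdr):
        """hdr: list of linear float intensities >= 0; returns 0-255 ints."""
        out = []
        for L in hdr:
            mapped = L / (1.0 + L)            # Reinhard: [0, inf) -> [0, 1)
            out.append(int(mapped * 255 + 0.5))  # quantize to 8 bits
        return out

    # Bright values compress smoothly instead of clipping at white:
    print(tone_map([0.0, 1.0, 4.0, 100.0]))  # [0, 128, 204, 252]
    ```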

    The problem with SLI and this tone mapping pass is that when doing render to texture in general, you need to have both GPUs know the entire texture, since it's not always clear how this texture will translate to the screen when rendered to the framebuffer. There's just no real way in the API currently to tell the driver that this is indeed a tone mapping pass, and thus that each graphics card only needs the data that it has rendered. Hence the need for compatibility switches in the drivers.

    Right. The problem is that a different compatibility switch would be needed for HDR rendering. The drivers in their current incarnation may not be capable of using two different compatibility switches for different rendering types within the same game. Or the compatibility switch for HDR may just not be implemented (since it's not an official feature, it's probably a fairly low priority).

    But no, there's nothing that separates HDR rendering from any other render to texture issues. So there won't be anything that really prevents SLI from working with HDR in the long run. In fact, it may be possible right now with the latest Forceware drivers from nVidia, which allow the user to select SLI modes (though I don't yet know how extensive this customizability is). The big issues here aren't HDR, but rather that there are two different rendering modes available, and that said rendering mode isn't official, and thus may be low-priority for nVidia's SLI driver team.
     
  8. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    That's not surprising - considering how much you defend old Walt - you guys must have similar DNA too :lol: Just kidding!!
     
  9. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    Now you're just being picky. Running 12x10 on a 16x12 LCD doesn't look that bad; the higher the resolution, the finer the pixels and the less the distortion. At any rate, I would say yes, an X800 XT is capable of good framerates at 16x12; maybe not with 4x or 6x AA and 16x AF, but with slightly more conservative settings. What do you do about those games that do not support SLi, btw?

    No less than a dual 6800U setup, methinks.

    And a single R600/G80/NV60 would be better still, while saving you money in the process. That's the whole basis of this argument.
     
  10. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    Vaporware can't stand in for hardware. A card that doesn't exist, and won't be out until 2008, can't fulfill your gaming needs in 2006.

    Homework assignment: Look up "The time preference of money".

    Here, I quote for you from Wikipedia

    Now take that, and try to reconcile it with your notion that there is some objective argument about the economic rationality of your video card purchase. Different people have different preferences, and your preference to wait 18 months to get a cheaper card is not the same as someone's who wants the top performing system in the very near future.
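
    As a toy illustration of time preference (all dollar figures and the discount rate here are made-up assumptions): discounting makes the "wait 18 months for a cheaper card" option cheaper still in today's money, but it says nothing about the 18 months of top-end performance the impatient buyer is paying for now.

    ```python
    # Toy present-value comparison for the "buy now vs. wait" decision.
    # The 10% annual discount rate stands in for an individual's time
    # preference; different people plug in very different rates.

    def present_value(amount, annual_rate, months):
        """Discount a future cost back to today's dollars."""
        return amount / (1 + annual_rate) ** (months / 12)

    pv_future_card = present_value(600, 0.10, 18)
    print(round(pv_future_card, 2))  # ~520: a $600 card in 18 months
                                     # costs ~$520 in today's terms
    ```

    The point of the exercise: whether ~$520 later beats, say, $1000 for two cards now depends entirely on how much the buyer values performance in the intervening 18 months, which is subjective.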
     
  11. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    The argument for SLI is not that there are apps that require it. Name one game that requires an Ultra or PE.

    WaltC made the point that SLI is best applied to very high resolutions, shutting out LCD users in the process. That's not entirely accurate. The LCD user is one great case for SLI. Look at Dave's benches for UT2004, HL2, and Doom3 at 1024x768 (1280x960 for UT) with 8xAA, 16xAF. In all those titles SLI brings the GT and Ultra from the realm of unplayable (< 50fps) to very playable (> 80fps) for the LCD user.

    You can argue that instead of buying a second card, the consumer can simply buy a better monitor, but that's for them to decide, isn't it? We can create our own versions of the future or selectively choose real-world examples to support our views, but ultimately only time will tell.
     
  12. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    ANova, your argument is totally subjective. You say an XT is fast enough at high res, and that an LCD looks good enough upscaling to native res. The person who's buying SLI doesn't want good enough for you, but for himself, rendering your opinion moot. And saying an R520/600/hojillion will do what SLI does with merely a single card is as irrelevant to this argument as the price of cheese, as you can't buy an R520/NV50 right now. The whole point of SLI is to offer next-gen performance today. What's wrong with that?

    I just don't understand your insistence on forcing a blanket dismissal on everyone else. SLI has its advantages for a certain few, and your explaining its price/performance and compatibility foibles doesn't make it any less attractive for those for whom it'll work like nothing else on the market.

    If price/performance were the only criteria, we'd probably all be running 6600GTs at 800x600 with the AA cranked. But one man's "good enough" may not be so for everyone.

    Eh, we're arguing from different POVs. I'm arguing from the POV of an educated consumer who accepts SLI's current limits for the one way in which it breaks them. You're arguing from the POV of the uneducated consumer who sees SLI on the shelf or in the ads and assumes 2x the cards means 2x the performance. If that's the case, then we're both "right." But your comments about fast enough and good enough are just not going to stand up beyond you. The next guy may be more demanding, and SLI is his only option at this point.

    zeck, SLI may well be an act of desperation, at least initially and from the engineering POV (it was kickstarted during the FX years, right?). But once it's up and running, it seems to be a nice ultra-high-end feature (luxury for gamers, utility for developers) that can probably be maintained across generations.
     
  13. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    So, you're saying that both cards are sending each other the "half" of the FPB they've created, so that the "exposure" can be deduced? And there's a bandwidth crunch on the link between the two cards?
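
    For a sense of scale on that crunch, here's a back-of-envelope estimate of the traffic if each card had to ship its half of a full-screen FP16 target to the other every frame. Resolution, format, and framerate are assumptions picked for illustration:

    ```python
    # Rough per-direction transfer rate if both GPUs must see the whole
    # FP16 render target. FP16 RGBA = 8 bytes per pixel.

    def half_buffer_mb_per_sec(width, height, bytes_per_pixel, fps):
        """MB/s each card must send (its half of the buffer, every frame)."""
        half_bytes = (width * height // 2) * bytes_per_pixel
        return half_bytes * fps / 1e6

    # 1600x1200, FP16 RGBA, 60 fps:
    print(half_buffer_mb_per_sec(1600, 1200, 8, 60))  # 460.8 MB/s each way
    ```

    Hundreds of MB/s each way, sustained, for a single render target per frame; more targets or higher resolutions scale it up linearly, which is one plausible reason a naive HDR split wouldn't speed up much.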

    I agree that HDR in FC isn't "official", but it still seems strange to me that the only released game that specifically uses a unique selling point feature of NV4x isn't speeding-up appreciably with SLI. Well it's no skin off my nose. Just a huge marketing glitch, really.

    Jawed
     
  14. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    SLi's effectiveness is a matter of opinion; its price-to-performance ratio and advantages/disadvantages are not.

    That's all I have to say on the matter.
     
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    SLI's effectiveness in improving performance is not opinion - it is cold, hard FPS numbers, splattered over every single SLI review - including Dave's.

    Its price/performance ratio and balance of pros and cons can be objectively determined, but the effect of those factors on an individual's purchasing decision is highly subjective.

    ANova, based on your position, are you saying that there is absolutely no advantage to having two R520s in AMR over a single one?
     
  16. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
    I guess all those benchmarks in Dave's (or any for that matter) article were just his opinion...

    Effectiveness is directly related to the latter two topics as well... oh, the logic.


    EDIT: The irony in this thread is that 12+ pages of bs exist due to the persistent citing of non-issues. Had any legitimate reason been given, the thread would have ended.

    EX: I wouldn't use SLi because:
    A) None of the games I play work with it
    B) I don't need the performance
    C) I don't want to bother with the hassle
    D) I don't want to spend that much
    E) I don't want to replace my mobo/cpu right now
    F) I don't like Nvidia and wouldn't buy any of their products

    But no, we have a thread where it's more important to be right than to discuss anything worthwhile (because, frankly, there is nothing here).
     
  17. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    This is possible, of course, but it's more likely that nVidia has simply decided to disable SLI for these parts when render to texture is used, unless a specific compatibility flag is used.

    Well, I don't have an SLI setup, so I can't test it myself, but I suspect it may be possible to enable SLI with the latest Forceware drivers under this game.
     
  18. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    No, it is not, because SLi doesn't work with all games, and the games it does work with show varying degrees of improvement. It's not as clear-cut as you guys are making it out to be.

    That depends on how effective AMR is. If it's like SLi then it would have an advantage in some games, but not all. I don't think that's worth the price tag, yes that's my opinion.

    See above. Oh the logic indeed. :lol:
     
  19. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,055
    Likes Received:
    3,109
    Location:
    New York
    Yes, you are correct, SLI has poor compatibility at the moment. But nobody plays all games, and it seems like the top titles at least are getting significant attention with regard to SLI support. Have you seen the latest compatibility list? The Doom3, Source (and I think UE3) engines all have SLI support, along with big titles like BF2, EQ2, WoW and Chaos Theory. I wish someone would do a more thorough analysis of performance improvements in supported titles. Dave? :)

    Ok.
     
  20. ninelven

    Veteran

    Joined:
    Dec 27, 2002
    Messages:
    1,742
    Likes Received:
    152
    So the game I have either works with SLi or it doesn't (ie yes or no); how could one live at such a speed? It's mind-boggling. In the land of bandwidth, fillrate, pipelines, AA methods, AA quality, AF methods, AF quality, memory interfaces, and clockspeed, that yes-or-no question just takes the cake.

    Just a guess here, but maybe, just maybe, that's why people benchmark SLi in the games it works in. Nah, I think if I want to know how SLi performed in a particular game I'll ask my magic 8 ball.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.