R700 Inter-GPU Connection Discussion

Discussion in 'Architecture and Products' started by Arty, Jun 28, 2008.

  1. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
Huh? I'm just saying that, to me, all the problems of multi-GPU mean I would stay away from it. I've never liked multi-GPU.

But there are people who don't care about all that stuff. As long as the graph shows they get 150 FPS and the next card gets 120... never mind problems like micro-stuttering, input delay, etc.
     
  2. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    All HD4870s used Qimonda GDDR5 memory.
     
  3. cbone

    Newcomer

    Joined:
    Aug 13, 2002
    Messages:
    14
    Likes Received:
    0
    Location:
    Columbia, MO
    That came across as a little bit snobbish. I assume from your tone that purchase decisions should be made based on criteria other than performance.

    With both mainstream graphics providers not offering anything substantial in their feature sets to differentiate their cards, what are you supposed to go on? PCB color?
     
  4. Wesker

    Regular

    Joined:
    May 3, 2008
    Messages:
    299
    Likes Received:
    186
    Location:
    Oxford, UK
    Erm, how about IQ, price, driver reliability, warranty considerations, future upgradability and power efficiency?

I wouldn't write off DX10.1 as "nothing substantial", nor would I ignore the great after-sales support on the Nvidia side of the fence either (i.e. XFX, EVGA and BFG have far better support than the likes of Sapphire, HIS or Diamond).
     
  5. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
I definitely wouldn't count out monthly drivers and continuous support without INF modding, though. ;)
     
  6. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    Wow! Performance really is staggering in some situations!

Crysis is a bit disappointing, although aside from the TR tests (which I actually found to be the most relevant in terms of settings), R700 is still beating the GTX 280, albeit by a smaller margin.

Overall, if MS is sorted and this comes in within $50 of the GTX 280's price, then I think ATI have a serious winner on their hands!
     
  7. cbone

    Newcomer

    Joined:
    Aug 13, 2002
    Messages:
    14
    Likes Received:
    0
    Location:
    Columbia, MO

    IQ is all but equal these days.

    Price doesn't matter except among comparable performance groups.

If driver reliability is a major purchase factor, you shouldn't be looking at brand-new gaming architectures.

    Warranty considerations? Again, assuming that you will buy from one of your chosen vendors, how do you decide between EVGA's 8800 Ultra, 9800gx2, and the gtx280 (all for $549.99)? Or between the 260 FTW ($379.99) and the 9800 gtx ($369.99)?

    Since when have you been able to upgrade video cards?

    If you are that interested in power efficiency over performance, the newest dual-chip, dual slot cards are the wrong place for you to even consider giving a look.

    10.1 currently offers exactly squat and would be a stupid reason to purchase one card over another.

    On-Topic:
    I would have liked a sneak peek into what it takes to really push the inter-GPU connection for a better comparison between the new connection and plain-old Crossfire.
     
  8. Wesker

    Regular

    Joined:
    May 3, 2008
    Messages:
    299
    Likes Received:
    186
    Location:
    Oxford, UK
    ...which was my point, unless there's a typo here or I'm pretty drunk.

    ...but it does differentiate cards in the eyes of the consumer, right?

    That's bollocks, especially in this day and age.

In recent memory, the only set of cards to experience serious driver issues (like crashes or hangs) is Nvidia's G80 lineup [which, again, is a way to differentiate between cards and IHVs], particularly with the poor Vista support.

All other cards and drivers have been rock solid, and have been for the last couple of years.

    Between the 8800 U, GX2 and 280, I would go for the 280 without question.

Why? I prefer the nature of single GPUs over multi-GPU solutions like the GX2--there, I differentiated and picked a card.

    Also, I tend to ignore factory overclocked models as I find that they take some of the fun and joy out of owning the card. They also happen to take extra money out of the wallet for no good reason too.

    I was referring to CrossFire/SLI.

For example, someone who currently owns a Radeon HD4800 card could easily consider a second card if they continue to use Intel chipsets and processors. Someone else with a GeForce will be locked to nForce-equipped motherboards (either by the nForce 200 bridge chip or nForce SLI chipsets) if they want to purchase another card.

    Where did I even mention that I was talking exclusively about high-end, top tier performance cards?

    Also, why should someone with a high end card be forced to put up with higher power consumption? Surely you don't think that PowerPlay or Hybrid Power is a waste of time do you?

    Most users purchase graphics cards for the long run.

    DX10.1 has the potential to be just as useful as any other post-distribution, add on feature set.

    In fact many are excited at some features of DX10.1 such as the single render MSAA pass and global illumination.

We haven't even seen DX10 being utilised properly yet, so don't you think it's a little too early to write off DX10.1?
     
  9. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    It looks pretty CPU limited to me. In many reviews the GTX 280 is much faster than the 9800 GTX, but their SLI counterparts seem to hit the same framerate wall. The same is true of the 4870 vs. 4850 series.

    Seems like ATI has higher CPU/PCIe overhead in their drivers for this game.
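Mintmaster's reasoning above — if pairing up the GPUs barely moves the framerate, the bottleneck is the CPU (or driver overhead) rather than the GPUs — can be sketched as a simple check over benchmark numbers. The figures and the threshold below are hypothetical, not taken from any review:

```python
def scaling_factor(single_gpu_fps, dual_gpu_fps):
    """Ratio of dual-GPU to single-GPU framerate; ~2.0 is ideal AFR scaling."""
    return dual_gpu_fps / single_gpu_fps

def looks_cpu_limited(single_gpu_fps, dual_gpu_fps, threshold=1.1):
    """Heuristic: if adding a second GPU barely raises the framerate,
    the game is likely bound by the CPU, not the GPUs."""
    return scaling_factor(single_gpu_fps, dual_gpu_fps) < threshold

# Hypothetical numbers: one game scales well, another hits a framerate wall.
print(looks_cpu_limited(60.0, 115.0))   # ~1.92x scaling -> False
print(looks_cpu_limited(110.0, 118.0))  # ~1.07x scaling -> True
```

The same check explains why the GTX 280 and 9800 GTX SLI pairs converging on one framerate points at the CPU: both pairs have headroom to spare, yet neither can exceed the wall.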
     
  10. cbone

    Newcomer

    Joined:
    Aug 13, 2002
    Messages:
    14
    Likes Received:
    0
    Location:
    Columbia, MO
Which eliminates IQ as a selection criterion.

    If the consumer incorrectly assumes that higher price always equals higher performance and doesn't bother to look at benches.

    I don't know about all that. Are you saying that you would consider anecdotal evidence of comparative driver stability more important than evidence of comparative performance?

    Why would you go with the 280 over the Ultra unless you knew their relative performance through benches?


    I see where you are coming from here. Semantics confusion.

Not worth basing a purchase decision on unless you are seriously strapped for cash to pay for the energy you use. If you are that concerned with energy efficiency, you need to be looking at performance/watt anyway, which requires performance numbers.
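The performance/watt comparison suggested above is a trivial division, but it can reorder a ranking based on raw framerate alone. The card figures below are invented purely for illustration:

```python
def perf_per_watt(avg_fps, board_power_watts):
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_watts

# Hypothetical cards: raw fps alone would favour card B,
# but per-watt efficiency favours card A.
card_a = perf_per_watt(80.0, 160.0)   # 0.5 fps/W
card_b = perf_per_watt(90.0, 225.0)   # 0.4 fps/W
print(card_a > card_b)  # True
```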

    Nope. DX11 is on the way, and DX10 hasn't even been used to any useful end. Unless developers get on board with 10.1 PDQ and use it to increase performance or increase image quality, it will just be another worthless advertising bullet point.
     
  11. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Right, that was something like a typo :). But the point remains, it can't hurt that obviously more than one supplier now seems ready.
     
  12. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,107
    Location:
    35.1415,-90.056
  13. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    That looks nice, but I'd like to see a few more tests on other titles.
     
  14. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,107
    Location:
    35.1415,-90.056
    That was the only sample that Sampsa felt like throwing out to the public -- the entire piece will be published in the very near future to his website. I believe that Sampsa and our Sampsa may be one and the same, so I bet he'll link us to it when he publishes the whole article.

However, given that he decided to make such a proclamation and has already mentioned that multiple apps were tested, I assume they all showed a similar result.
     
  15. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
Yes, they're the same guy from Muropaketti, but sadly for most of you, the articles at Muropaketti are in Finnish :wink:
Hopefully he'll do a summary in English at XtremeSystems or somewhere if he writes a proper article on it; the R700 review itself is already out.
     
  16. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,470
    Likes Received:
    1,686
    Location:
    Treading Water
    Google translate and the international language of graph should let us derive most of what we need from it.
     
  17. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Indeed. In fact the title could just be cpu-limited at 15ms/frame, in which case microstuttering should just disappear on its own, so this graph doesn't really prove anything yet.
    I can't really see why the HD4870x2 should be any different wrt microstuttering than any other AFR solution. Unless AMD figured it's so fast they no longer care about the small performance hit they'd probably get by implementing some synchronization...
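The AFR microstuttering mczak describes shows up directly in the frame-time sequence: the average framerate can look fine while alternate frames arrive in a short/long pattern. A minimal sketch of how one might quantify that from presentation timestamps (the timestamps below are invented for illustration):

```python
def frame_intervals(timestamps_ms):
    """Deltas between consecutive frame-presentation timestamps."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

def microstutter_ratio(timestamps_ms):
    """Max/min frame interval: near 1.0 means evenly paced frames;
    a large value means alternating short/long AFR frames."""
    deltas = frame_intervals(timestamps_ms)
    return max(deltas) / min(deltas)

# Evenly paced ~60 fps vs. an AFR pattern averaging the same ~60 fps
even = [0, 16.7, 33.4, 50.1, 66.8]
afr  = [0, 5.0, 33.4, 38.4, 66.8]   # 5 ms / 28.4 ms alternation
print(microstutter_ratio(even))  # ~1.0
print(microstutter_ratio(afr))   # ~5.7
```

Both sequences report the same average FPS, which is exactly why FPS graphs hide the problem; the frame-pacing synchronization mczak mentions would pull the ratio back toward 1.0 at a small throughput cost.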
     
  18. Wesker

    Regular

    Joined:
    May 3, 2008
    Messages:
    299
    Likes Received:
    186
    Location:
    Oxford, UK
    What!?

If IQ is all but different these days [amongst IHVs], then it's a completely valid selection criterion!

    You're starting to stray off point.

Your point was that you couldn't differentiate between IHVs and their SKUs (using a metric other than performance); however, I mentioned price as a [very effective] way to differentiate between products.

    Let me put it this way: Nvidia and ATI/AMD don't offer the same features in their respective driver sets. Thus, someone could again make a decision to purchase different cards from one IHV from another (e.g. Someone who fancies Edge Detect would probably look much more favourably at the HD4800 series compared to a GTX 260/280).


    Why do you assume that one can't research for themselves?

    If I were to be dropping $500+ on anything, I would do some kind of research to find the best product available for my money.

    ...and yes there's the raw performance which I base decisions on, but it's not the ONLY feature I look for. In fact, if I did care purely about performance, I would go for the 9800 GX2. :wink:

So people with high-end cards shouldn't bother with becoming more energy efficient? I suppose everyone who earns over $100,000 p/a should go back to 150 W light bulbs too, then?

    Also, some people may be looking for products that feature low power consumption, low heat output and low noise for an office setup or HTPC.

Interesting point about those looking for a graphics card to put into an HTPC: ATI's HD lineup features DVI-to-HDMI dongles that carry 5.1 or 7.1 channel sound over the connection too. It's just another way to make a purchase decision without looking at performance.

We don't even know what DX11 will address. Further, there's speculation that DX11 is exclusive to Windows 7, which probably won't retail any earlier than late 2009.

As I've said before, DX10.1 isn't the first add-on to a mainstream 3D API. Need I remind you that DX9 is currently in its third revision: DX9c. Give DX10.1, as well as DX10, some time to mature.
     
  19. Wesker

    Regular

    Joined:
    May 3, 2008
    Messages:
    299
    Likes Received:
    186
    Location:
    Oxford, UK
    Was microstuttering ever a game specific problem or did it affect all games out there?

    In any case, this looks very promising, and may actually make me pull the trigger on a second HD4870. :D
     
  20. ChrisRay

ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
It's always varied from game to game.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.