R700 Inter-GPU Connection Discussion

Because the people who want a comparable/superior card for cheaper are nonexistent?

Huh? I'm just saying that, to me, all the problems of multi-GPU mean I would stay away from it. I've never liked multi-GPU.

But there are people who don't care about all that stuff. As long as the graph shows they get 150 FPS and the next card gets 120... never mind problems like micro-stuttering, input delay, etc.
 
Still, it'll probably sell well to the [H] forum types who just want the biggest numbers on a graph.

That came across as a little bit snobbish. I assume from your tone that purchase decisions should be made based on criteria other than performance.

With both mainstream graphics providers not offering anything substantial in their feature sets to differentiate their cards, what are you supposed to go on? PCB color?
 
That came across as a little bit snobbish. I assume from your tone that purchase decisions should be made based on criteria other than performance.

Erm, how about IQ, price, driver reliability, warranty considerations, future upgradability and power efficiency?

With both mainstream graphics providers not offering anything substantial in their feature sets to differentiate their cards, what are you supposed to go on? PCB color?

I wouldn't write off DX10.1 as "nothing substantial" nor would I ignore the great after sales support on the Nvidia side of the fence either (i.e. XFX, EVGA and BFG have far better support than the likes of Sapphire, HIS or Diamond).
 
Wow! Performance really is staggering in some situations!

Crysis is a bit disappointing, although aside from the TR tests (which in fact I found to be the most relevant in terms of settings), the R700 is still beating the GTX 280, albeit by a smaller margin.

Overall, if microstuttering is sorted and this comes in within $50 of the GTX 280's price, then I think ATI have a serious winner on their hands!
 
Erm, how about IQ, price, driver reliability, warranty considerations, future upgradability and power efficiency?



I wouldn't write off DX10.1 as "nothing substantial" nor would I ignore the great after sales support on the Nvidia side of the fence either (i.e. XFX, EVGA and BFG have far better support than the likes of Sapphire, HIS or Diamond).


IQ is all but equal these days.

Price doesn't matter except among comparable performance groups.

If driver reliability is a major purchase factor, you shouldn't be looking at brand new gaming architectures.

Warranty considerations? Again, assuming that you will buy from one of your chosen vendors, how do you decide between EVGA's 8800 Ultra, 9800 GX2, and GTX 280 (all for $549.99)? Or between the 260 FTW ($379.99) and the 9800 GTX ($369.99)?

Since when have you been able to upgrade video cards?

If you are that interested in power efficiency over performance, the newest dual-chip, dual-slot cards are the wrong place to even be looking.

10.1 currently offers exactly squat and would be a stupid reason to purchase one card over another.

On-Topic:
I would have liked a sneak peek into what it takes to really push the inter-GPU connection for a better comparison between the new connection and plain-old Crossfire.
 
IQ is all but equal these days.
...which was my point, unless there's a typo here or I'm pretty drunk.

Price doesn't matter except among comparable performance groups.

...but it does differentiate cards in the eyes of the consumer, right?

If driver reliability is a major purchase factor, you shouldn't be looking at brand new gaming architectures.

That's bollocks, especially in this day and age.

In recent memory, the only set of cards to experience serious driver issues (like crashes or hangs) is Nvidia's G80 line-up [which, again, is a way to differentiate between cards and IHVs], particularly with the poor Vista support.

All other cards and drivers have been rock solid, and have been for the last couple of years.

Warranty considerations? Again, assuming that you will buy from one of your chosen vendors, how do you decide between EVGA's 8800 Ultra, 9800 GX2, and GTX 280 (all for $549.99)? Or between the 260 FTW ($379.99) and the 9800 GTX ($369.99)?

Between the 8800 U, GX2 and 280, I would go for the 280 without question.

Why? I prefer the nature of single GPUs over multi-GPU solutions like the GX2--there, I differentiated and picked a card.

Also, I tend to ignore factory-overclocked models, as I find that they take some of the fun and joy out of owning the card. They also happen to take extra money out of the wallet for no good reason.

Since when have you been able to upgrade video cards?

I was referring to CrossFire/SLI.

For example, someone who currently owns a Radeon HD4800 card could easily consider a second card if they continue to use Intel chipsets and processors. Someone else with a GeForce will be locked to nForce-equipped motherboards (either by the nForce 200 bridge chip or nForce SLI chipsets) if they want to purchase another card.

If you are that interested in power efficiency over performance, the newest dual-chip, dual slot cards are the wrong place for you to even consider giving a look.

Where did I even mention that I was talking exclusively about high-end, top tier performance cards?

Also, why should someone with a high-end card be forced to put up with higher power consumption? Surely you don't think that PowerPlay or Hybrid Power is a waste of time, do you?

10.1 currently offers exactly squat and would be a stupid reason to purchase one card over another.

Most users purchase graphics cards for the long run.

DX10.1 has the potential to be just as useful as any other post-release, add-on feature set.

In fact, many are excited about some features of DX10.1, such as the single render pass for MSAA and global illumination.

We haven't even seen DX10 being utilised properly yet, so don't you think it's a little too early to write off DX10.1?
 
Crysis is a bit disappointing, although aside from the TR tests (which in fact I found to be the most relevant in terms of settings), the R700 is still beating the GTX 280, albeit by a smaller margin.
It looks pretty CPU-limited to me. In many reviews the GTX 280 is much faster than the 9800 GTX, but their SLI counterparts seem to hit the same framerate wall. The same is true of the 4870 vs. 4850 series.

Seems like ATI has higher CPU/PCIe overhead in their drivers for this game.
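
As a toy illustration of that reading (all numbers below are made up, not benchmark data): if per-frame cost is roughly whichever of the CPU side or GPU side is slower, then two very different GPU pairs can slam into the same CPU wall:

```python
# Toy model of a CPU-limited game: frame time is bounded by whichever
# side is slower. All numbers below are hypothetical, not benchmark data.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Effective FPS when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 15.0  # assumed per-frame CPU/driver cost for this game

for name, gpu_ms in [("slower card", 20.0),
                     ("faster card", 16.5),
                     ("either card in SLI/CrossFire", 10.0)]:
    print(f"{name}: {fps(CPU_MS, gpu_ms):.0f} FPS")

# slower card: 50 FPS                    (GPU-bound)
# faster card: 61 FPS                    (GPU-bound, barely)
# either card in SLI/CrossFire: 67 FPS   (CPU-bound: same wall for both)
```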
 
...which was my point, unless there's a typo here or I'm pretty drunk.

Which eliminates IQ as a selection criterion.

...but it does differentiate cards in the eyes of the consumer, right?
If the consumer incorrectly assumes that higher price always equals higher performance and doesn't bother to look at benches.

That's bollocks, especially in this day and age.

In recent memory, the only set of cards to experience serious driver issues (like crashes or hangs) is Nvidia's G80 line-up [which, again, is a way to differentiate between cards and IHVs], particularly with the poor Vista support.

All other cards and drivers have been rock solid, and have been for the last couple of years.

I don't know about all that. Are you saying that you would consider anecdotal evidence of comparative driver stability more important than evidence of comparative performance?

Between the 8800 U, GX2 and 280, I would go for the 280 without question.

Why? I prefer the nature of single GPUs over multi-GPU solutions like the GX2--there, I differentiated and picked a card.

Also, I tend to ignore factory-overclocked models, as I find that they take some of the fun and joy out of owning the card. They also happen to take extra money out of the wallet for no good reason.

Why would you go with the 280 over the Ultra unless you knew their relative performance through benches?


I was referring to CrossFire/SLI.

For example, someone who currently owns a Radeon HD4800 card could easily consider a second card if they continue to use Intel chipsets and processors. Someone else with a GeForce will be locked to nForce-equipped motherboards (either by the nForce 200 bridge chip or nForce SLI chipsets) if they want to purchase another card.
I see where you are coming from here. Semantics confusion.

Where did I even mention that I was talking exclusively about high-end, top tier performance cards?

Also, why should someone with a high-end card be forced to put up with higher power consumption? Surely you don't think that PowerPlay or Hybrid Power is a waste of time, do you?

Not worth basing a purchase decision on unless you are seriously strapped for cash to pay your energy bill. If you are that concerned with energy efficiency, you need to be looking at performance per watt anyway, which requires performance numbers.
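
To make that concrete (a minimal sketch with made-up FPS and wattage figures, purely to show that the metric needs a performance number in the numerator):

```python
# Minimal performance-per-watt comparison. The FPS and wattage figures
# are invented for illustration, not taken from any review.

cards = {
    "Card A": {"avg_fps": 120, "board_power_w": 160},
    "Card B": {"avg_fps": 150, "board_power_w": 270},
}

for name, d in cards.items():
    perf_per_watt = d["avg_fps"] / d["board_power_w"]
    print(f"{name}: {perf_per_watt:.2f} FPS/W")

# Card A: 0.75 FPS/W  <- slower card, but more efficient
# Card B: 0.56 FPS/W  <- faster card, worse per watt
```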

Most users purchase graphics cards for the long run.

DX10.1 has the potential to be just as useful as any other post-release, add-on feature set.

In fact, many are excited about some features of DX10.1, such as the single render pass for MSAA and global illumination.

We haven't even seen DX10 being utilised properly yet, so don't you think it's a little too early to write off DX10.1?

Nope. DX11 is on the way, and DX10 hasn't even been used to any useful end. Unless developers get on board with 10.1 PDQ and use it to increase performance or increase image quality, it will just be another worthless advertising bullet point.
 
Testing over at XtremeSystems indicates that the 4000 series of video cards appears to have "fixed" microstuttering...
[attached image: grid_graph2.png]
 
That looks nice, but I'd like to see a few more tests on other titles.

That was the only sample that Sampsa felt like throwing out to the public -- the entire piece will be published in the very near future on his website. I believe that Sampsa and our Sampsa may be one and the same, so I bet he'll link us to it when he publishes the whole article.

However, given that he decided to make such a proclamation and has already mentioned that multiple apps were tested, I have to assume they all showed a similar pattern.
 
That was the only sample that Sampsa felt like throwing out to the public -- the entire piece will be published in the very near future on his website. I believe that Sampsa and our Sampsa may be one and the same, so I bet he'll link us to it when he publishes the whole article.

However, given that he decided to make such a proclamation and has already mentioned that multiple apps were tested, I have to assume they all showed a similar pattern.

Yes, they're the same guy from Muropaketti, but sadly for most of you, the articles at Muropaketti are in Finnish ;)
Hopefully he'll do a summary in English at XtremeSystems or something if he writes a proper article on it; the R700 review itself is out already.
 
Yes, they're the same guy from Muropaketti, but sadly for most of you, the articles at Muropaketti are in Finnish ;)
Hopefully he'll do a summary in English at XtremeSystems or something if he writes a proper article on it; the R700 review itself is out already.

Google Translate and the international language of graphs should let us derive most of what we need from it.
 
That looks nice, but I'd like to see a few more tests on other titles.
Indeed. In fact, the title could just be CPU-limited at 15 ms/frame, in which case microstuttering would disappear on its own, so this graph doesn't really prove anything yet.
I can't really see why the HD4870 X2 should be any different with respect to microstuttering than any other AFR solution. Unless AMD figured it's so fast they no longer care about the small performance hit they'd probably get by implementing some synchronization...
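
For what it's worth, here's one way a frame-time graph like that gets turned into a microstuttering verdict (a minimal sketch with illustrative frame times, not real data): evenly paced frames give a small frame-to-frame delta, while classic AFR microstutter shows alternating short/long frames at the same average FPS.

```python
# Sketch: quantify AFR microstutter from per-frame times in milliseconds.
# The input lists are illustrative; a real test would log frame times
# with a tool like FRAPS and compare cards on the same scene.

def stutter_ratio(frame_ms: list) -> float:
    """Mean absolute frame-to-frame delta divided by mean frame time.
    Near 0 = evenly paced; near or above 1 = alternating short/long
    frames, i.e. classic AFR microstuttering."""
    deltas = [abs(b - a) for a, b in zip(frame_ms, frame_ms[1:])]
    return (sum(deltas) / len(deltas)) / (sum(frame_ms) / len(frame_ms))

smooth  = [15, 15, 16, 15, 15, 16]   # ~65 FPS, evenly paced
stutter = [5, 25, 6, 24, 5, 26]      # ~65 FPS average, badly paced

print(f"smooth:  {stutter_ratio(smooth):.2f}")   # ~0.04
print(f"stutter: {stutter_ratio(stutter):.2f}")  # ~1.28
```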
 
Which eliminates IQ as a selection criterion.

What!?

If IQ is at all different these days [amongst IHVs], then it's a completely valid selection criterion!

If the consumer incorrectly assumes that higher price always equals higher performance and doesn't bother to look at benches.

You're starting to stray off point.

Your point was that you couldn't differentiate between IHVs and their SKUs (using a metric other than performance); however, I mentioned price as a [very effective] way to differentiate between products.

I don't know about all that. Are you saying that you would consider anecdotal evidence of comparative driver stability more important than evidence of comparative performance?

Let me put it this way: Nvidia and ATI/AMD don't offer the same features in their respective driver sets. Thus, someone could again make a decision to purchase cards from one IHV over another (e.g. someone who fancies Edge Detect would probably look much more favourably at the HD4800 series than at a GTX 260/280).


Why would you go with the 280 over the Ultra unless you knew their relative performance through benches?

Why do you assume that one can't research for themselves?

If I were to be dropping $500+ on anything, I would do some kind of research to find the best product available for my money.

...and yes, there's raw performance, which I base decisions on, but it's not the ONLY feature I look for. In fact, if I cared purely about performance, I would go for the 9800 GX2. ;)

Not worth basing a purchase decision on unless you are seriously strapped for cash to pay your energy bill. If you are that concerned with energy efficiency, you need to be looking at performance per watt anyway, which requires performance numbers.

So people with high-end cards shouldn't bother with becoming more energy efficient? I suppose everyone who earns over $100,000 p/a should revert to 150 W light bulbs too, then?

Also, some people may be looking for products that feature low power consumption, low heat output and low noise for an office setup or HTPC.

Interesting point about those looking for a graphics card to put into an HTPC: ATI's HD line-up features DVI-to-HDMI dongles that carry 5.1- or 7.1-channel sound over the connection too. It's just another way to make a purchase decision without looking at performance.

Nope. DX11 is on the way, and DX10 hasn't even been used to any useful end. Unless developers get on board with 10.1 PDQ and use it to increase performance or increase image quality, it will just be another worthless advertising bullet point.

We don't even know what DX11 will address. Further, there's speculation that DX11 is exclusive to Windows 7, which probably won't retail any earlier than late 2009.

As I've said before, DX10.1 isn't the first add-on to a mainstream 3D API. Need I remind you that DX9 is currently in its third revision, DX9c? Give DX10.1, as well as DX10, some time to mature.
 
That looks nice, but I'd like to see a few more tests on other titles.

Was microstuttering ever a game specific problem or did it affect all games out there?

In any case, this looks very promising, and may actually make me pull the trigger on a second HD4870. :D
 
Was microstuttering ever a game specific problem or did it affect all games out there?

In any case, this looks very promising, and may actually make me pull the trigger on a second HD4870. :D

It's always varied from game to game.
 