R700 Inter-GPU Connection Discussion

That was the only sample that Sampsa felt like throwing out to the public -- the entire piece will be published in the very near future on his website. I believe that Sampsa and our Sampsa may be one and the same, so I bet he'll link us to it when he publishes the whole article.

However, if he decided to make such a proclamation and has already mentioned that multiple apps were tested, I have to assume they all showed a similar pattern.

So far I've been able to pinpoint this behavior with the ATI Radeon HD 3870 X2 in two games: Crysis and Race Driver: GRID. With the ATI Radeon HD 4870 X2 and 4870 CrossFireX there is no such variation between frame rendering times, so something has clearly been fixed in the 4800 series.

I'm currently waiting for answers from AMD.

Here are graphs showing the difference between frametimes with ATI Radeon HD 4870 X2, 4870 CrossFireX and single 4870 in Race Driver: GRID:

grid_difference_ati.png


And the GeForce 9800 GX2 versus the GeForce 9800 GX2 in single-GPU mode:

grid_difference_nvidia.png
 
Microstuttering isn't caused by specific games. It happens in cases where you have a strongly GPU limited workload. This also means that faster GPUs will show less microstuttering.
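For anyone who wants to check their own logs, here's a minimal sketch of how uneven AFR frame pacing can be quantified from a frametime dump. The helper names and the synthetic timestamps are mine and purely illustrative, not Sampsa's (or any reviewer's) actual methodology:

```python
# A minimal sketch, assuming you have a list of frame timestamps in seconds
# (e.g. from a frametime dump). Names and data are illustrative only.

def frame_intervals(timestamps):
    """Per-frame render intervals in milliseconds."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]

def microstutter_ratio(timestamps):
    """Ratio of average even-indexed to odd-indexed frame intervals.

    Under alternate-frame rendering each GPU delivers every second frame,
    so a big imbalance between the two averages is the short/long/short/long
    pacing people perceive as microstutter. A ratio near 1.0 is even pacing.
    """
    intervals = frame_intervals(timestamps)
    even, odd = intervals[0::2], intervals[1::2]
    if not even or not odd:
        return 1.0
    avg_even = sum(even) / len(even)
    avg_odd = sum(odd) / len(odd)
    return max(avg_even, avg_odd) / min(avg_even, avg_odd)

# Synthetic example: ~60 fps on average, but frames arrive 10 ms / 23 ms apart.
stamps, t = [0.0], 0.0
for i in range(100):
    t += 0.010 if i % 2 == 0 else 0.023
    stamps.append(t)
print(microstutter_ratio(stamps))  # ~2.3 -> heavily uneven pacing
```

A ratio near 1.0 means both GPUs are delivering frames evenly; the further it climbs above 1.0, the more pronounced the stutter, even though the average framerate looks identical.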

Which eliminates IQ as a selection criterion.
I disagree. You may be able to choose IQ settings so most people won't be able to tell a difference, but that doesn't mean other IQ options don't exist.
 
It looks pretty CPU limited to me. In many reviews the GTX 280 is much faster than the 9800 GTX, but their SLI counterparts seem to hit the same framerate wall. The same is true of the 4870 vs. 4850 series.

Seems like ATI has higher CPU/PCIe overhead in their drivers for this game.

I've come to realize Crysis is such a terrible engine. I mean, these cards doubled their power and Crysis runs like 2 FPS faster. It's awful.

One of the R700 review sites even said they were thinking of canning Crysis from their benchmark suite because "it runs like crap on all high end cards" and that's pretty much true.
 
I've come to realize Crysis is such a terrible engine. I mean, these cards doubled their power and Crysis runs like 2 FPS faster. It's awful.

One of the R700 review sites even said they were thinking of canning Crysis from their benchmark suite because "it runs like crap on all high end cards" and that's pretty much true.

Really? Because I'm going to test that tonight when I get home (2 x 3870 vs 2 x 4850), and looking at the results of quite a few other people in forums, Crysis got a noticeable and very measurable bump. Not in the peak framerate, but in the minimum framerate. Further, quite a few sites I saw measured framerate at some middling resolution with medium details. Why the hell would someone play at medium details at 1280x1024 with one of these cards?

I am more suspicious of the testing methodology of some of these sites rather than Crytek's engine at this moment. I'll be posting up my scores at some point in the near future so we can all either see how bad the engine is, or how bad some of these sites are.

EDIT:

Let's give some examples to back up our respective claims:

Guru3D tested at medium details at four resolutions from 1280x1024 to 2560x1600. Results seem to be better than Ranger remembers, but it was at crappy medium details. WTF?

Legit Reviews tested at 1920x1200 VH, 1920x1200 Med, and 1280x1024 High. Weird selection, but a single 4850 performed better than a 3870x2 card in all but the 1920x1200 VH test. I don't think that's a bad result for a single card...

AnandTech tested at a multitude of resolutions using high settings (shaders at very high). The 4850 did about 30% better than a single 3870, and matched the 3870x2 until the rez started to increase where it dropped to about 90% of the x2's score.

I'm not going to go looking for these all day, but it looks like extra GPU horsepower is being used generally. The trick is, we still can't see the minimum scores for any of these results, and the minimum is where you can really make or break the game.
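To make the minimum-versus-average point concrete, here's a rough sketch with made-up frametimes (not data from any of those reviews) showing how an average can look healthy while the minimum tells the real story:

```python
# A rough sketch with made-up frame times (milliseconds): the average can
# look fine while the minimum and the 1% lows reveal the hitching that
# actually makes or breaks the game.

def fps_summary(frame_times_ms):
    """Return (average fps, worst-frame fps, 1% low fps)."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    worst_fps = 1000.0 / max(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_1pct_fps = 1000.0 / (sum(worst_1pct) / len(worst_1pct))
    return avg_fps, worst_fps, low_1pct_fps

# Mostly smooth ~60 fps run with ten 80 ms hitches sprinkled in.
frames = [16.7] * 990 + [80.0] * 10
avg, worst, low1 = fps_summary(frames)
print(f"avg {avg:.0f} fps, min {worst:.1f} fps, 1% low {low1:.1f} fps")
```

In that example the average comes out near 58 fps while the worst frames sit at 12.5 fps, which is exactly the kind of gap the published averages hide.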

I think you're being a little overcritical of the Crysis engine in the face of some bad journalism practices.
 
Google translate and the international language of graph should let us derive most of what we need from it.

Quote of the Week! "International language of graph"! LOL!

What!?

If IQ is at all different these days [amongst IHVs] then it's a completely valid selection criterion!
All but the same. Indistinguishable under most circumstances. Not useful as a selection criterion.

You're starting to stray off point.

Your point was that you couldn't differentiate between IHVs and their SKUs (using a metric other than performance); however, I mentioned price as a [very effective] way to differentiate between products.

You need the relative performance to make sense of the options available and give them a proper perspective. Otherwise, how do you decide whether an option is useful or worthwhile?



Why do you assume that one can't research for themselves?

If I were to be dropping $500+ on anything, I would do some kind of research to find the best product available for my money.

...and yes there's the raw performance which I base decisions on, but it's not the ONLY feature I look for. In fact, if I did care purely about performance, I would go for the 9800 GX2. ;)

Why should I buy the cards and do all the testing myself when someone (many someones) has been gracious enough to do all the hard work for me many times over?

The point is that you need to know how the features impact performance; otherwise they are worthless bullet points. Edge Detect that makes most games a slideshow is worthless. Unless you look at the data and see what card has what frame rate and when, the other criteria aren't as important.

So people with high-end cards shouldn't bother with becoming more energy efficient? I suppose everyone who earns over $100,000 p/a should go back to 150 W light bulbs too, then?

Also, some people may be looking for products that feature low power consumption, low heat output and low noise for an office setup or HTPC.

Interesting point about those looking for a graphics card to put into an HTPC: ATI's HD lineup features DVI-to-HDMI dongles that carry 5.1 or 7.1-channel sound over the connection too. It's just another way to make a purchase decision without looking at performance.

You need to make compromises. Who is going to buy a gaming card with no performance just because it's efficient?

If I was running an HTPC, I would go ATI. But which ATI VIVO card? How to choose? Maybe I would look at their relative performance for the price? ;)
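Just to illustrate what I mean by relative performance for the price, a throwaway sketch with placeholder cards and numbers (not real review data):

```python
# Throwaway sketch: ranking cards by fps per dollar. The names, prices and
# benchmark numbers are placeholders, not real review data; the point is only
# that the performance numbers are what give the prices any perspective.

cards = {
    "Card A": (45.0, 199.0),  # (average fps in some benchmark, price in USD)
    "Card B": (60.0, 299.0),
    "Card C": (70.0, 449.0),
}

for name, (fps, price) in sorted(
    cards.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True
):
    print(f"{name}: {fps / price:.3f} fps per dollar")
```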

We don't even know what DX11 will address. Further, there's speculation that DX11 is exclusive to Windows 7, which probably won't retail any earlier than late 2009.

As I've said before, DX10.1 isn't the first add-on to a mainstream 3D API. Need I remind you that DX9 is currently in its third revision, DX9c? Give DX10.1, as well as DX10, some time to mature.

With the vast majority of the cards in systems not supporting 10.1, I wouldn't hold my breath on it getting into many games.
 
No offence, cbone, but honestly, I feel like I'm talking to a brick wall here.

I suggest you take a look back at my posts, as I'm tired of repeating myself. I've clearly demonstrated (as have you, believe it or not) that people make purchase decisions on factors other than performance alone.

The desktop 3D video card market is far from being perfectly competitive. There are still many different exclusive features offered from either IHV.
 
No offence, cbone, but honestly, I feel like I'm talking to a brick wall here.

I suggest you take a look back at my posts, as I'm tired of repeating myself. I've clearly demonstrated (as have you, believe it or not) that people make purchase decisions on factors other than performance alone.

The desktop 3D video card market is far from being perfectly competitive. There are still many different exclusive features offered from either IHV.

We can agree to disagree even though I'm right. ;) I never said performance was the only factor, just the most important one, as it gives the other factors perspective and determines their worth. Without knowing how it performs, who cares about the other criteria?

Unfortunately most of the exclusive features available between the two are throwaways or PR fluff and not make-or-break gotta haves. I think VIVO capability is the big difference today and that's mostly forgotten.

Speaking of that, what happened to AVIVO?
 
One of the R700 review sites even said they were thinking of canning Crysis from their benchmark suite because "it runs like crap on all high end cards" and that's pretty much true.

I remember that. I thought it was a pretty ridiculous thing to say actually.

So they don't like to test games that stress high-end GPUs anymore? They prefer testing games which don't utilise their potential? :rolleyes:

I don't see why people are so down on Crysis. There are now several GPUs which are capable of playing it at very high settings as long as you keep the resolution to 720p, and some that can go higher.

I would much rather have a game that can bring a high-end GPU to its knees at the highest settings than have a game which I need to run at stupidly high resolutions with piles of AA just to make my GPU sweat a little. That's as long as the visuals match the performance requirement, of course, which in Crysis's case they do.
 
That's as long as the visuals match the performance requirement, of course, which in Crysis's case they do.

I think you'll find a lot of argument with that particular statement.

Crysis looks nice; however, with some competing titles you can throw on AA and still double the resolution.
 
I gather Tech Report didn't find any special sauce with R700 scaling on their preview this weekend.
 
I would much rather have a game that can bring a high-end GPU to its knees at the highest settings than have a game which I need to run at stupidly high resolutions with piles of AA just to make my GPU sweat a little. That's as long as the visuals match the performance requirement, of course, which in Crysis's case they do.

That's rather bass-ackwards, IMHO. Think about it. You're saying it's better not to have enough performance, than to have too much.
 
That's rather bass-ackwards, IMHO. Think about it. You're saying it's better not to have enough performance, than to have too much.

Well, that depends on what you think might drive future improvements in GPUs, of course, and why one might be motivated to buy one. It speaks to the ambitions of the game devs too. Game engines typically last longer than GPUs, it seems to me.
 
That's rather bass-ackwards, IMHO. Think about it. You're saying it's better not to have enough performance, than to have too much.

No, I'm saying it's better to use the performance that we do have on core graphics rather than waste it on insane framerates and resolution/AA.

We have a choice: settle for Bioshock/CoD4 (console) level of graphics at 1920x1200/8xAA/16xAF, or get something that looks like Crysis at a straight-up 1680x1050. Gameplay aside, I'll take the second one any day of the week. Especially when you can pump up the res/AA in that same game when more powerful GPUs get released. What do you do with Bioshock/CoD4? Buy a new monitor and run at 2560x1600?
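For perspective, here's a back-of-the-envelope sketch of the raw pixel counts behind those resolutions (AA and shader cost ignored, so it's only a rough lower bound on the extra load):

```python
# Back-of-the-envelope pixel counts for the resolutions mentioned above.

resolutions = {
    "1680x1050": 1680 * 1050,
    "1920x1200": 1920 * 1200,
    "2560x1600": 2560 * 1600,
}

base = resolutions["1680x1050"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels / 1e6:.2f} Mpix, "
          f"{pixels / base:.2f}x the pixels of 1680x1050")
```

2560x1600 pushes roughly 2.3x the pixels of 1680x1050 before you even add AA, which is the trade-off I'm talking about.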

People who have a problem with Crysis's performance should ask themselves the following question...

Would it have been a better game if it didn't have the Very High quality setting, i.e. if the highest you could select in-game was High?

Because that's exactly what it amounts to. Crysis is still easily on par with the best-looking games out there when set at High, and its performance is much more reasonable. But would it have made for a better game to take away the higher options? If Bioware were to release an optional patch for Bioshock tomorrow that vastly improved its graphics but also dragged its performance down to Crysis Very High levels, would people complain to Bioware for inflicting that option upon the public? Would having the optional patch available make Bioshock a worse game?
 
We don't even know what DX11 will address. Further, there's speculation that DX11 is exclusive to Windows 7, which probably won't retail any earlier than late 2009.

As I've said before, DX10.1 isn't the first add-on to a mainstream 3D API. Need I remind you that DX9 is currently in its third revision, DX9c? Give DX10.1, as well as DX10, some time to mature.

Incorrect: Vista will support DX11, and Windows 7 runs on the same code base as Vista SP1. Below is a quote from an MS developer working on Windows 7.

Oh ye of little faith.

DX11 will be coming to Vista (and Win7). DX11 will never appear on XP. From our perspective, NEW development on XP is not happening. DX11 is... new development.

And Win7 is based off the same codebase as Vista SP1 (with plenty of additions, changes and fixes). If you have a program which works in Vista, it will work in Win7. Same applies to drivers (mostly). Current Win7 build is running all my applications (excluding DaemonTools, although we're aware of that problem, as are the DT guys, or the guys behind SPTD anyhow). All games I've been playing recently (TQ, Serious Sam 2, FEAR, Painkiller, Q4, AoE3, RoN, Civ4) work fine right now.

US
 
No, I'm saying it's better to use the performance that we do have on core graphics rather than waste it on insane framerates and resolution/AA.

I disagree with this assertion as well. High framerates and high levels of AA are good options to have on the table. You seem to imply a preference for a more console-centric approach to graphics, yet it is clear you are not speaking about consoles.

We have a choice: settle for Bioshock/CoD4 (console) level of graphics at 1920x1200/8xAA/16xAF, or get something that looks like Crysis at a straight-up 1680x1050. Gameplay aside, I'll take the second one any day of the week. Especially when you can pump up the res/AA in that same game when more powerful GPUs get released. What do you do with Bioshock/CoD4? Buy a new monitor and run at 2560x1600?

Crysis has fantastic art assets and an excellent lighting/shading model. It also has horrible framerates unless you turn down all that eye candy, which makes the whole exercise rather pointless: to play the game you have to turn the graphics down to a level that may not be as aesthetically pleasing as the other scenarios you mentioned. Personally, I would take the high-res + high-AA + high-fps action if given the option.

People who have a problem with Crysis's performance should ask themselves the following question...

Would it have been a better game if it didn't have the Very High quality setting, i.e. if the highest you could select in-game was High?

Because that's exactly what it amounts to. Crysis is still easily on par with the best-looking games out there when set at High, and its performance is much more reasonable. But would it have made for a better game to take away the higher options? If Bioware were to release an optional patch for Bioshock tomorrow that vastly improved its graphics but also dragged its performance down to Crysis Very High levels, would people complain to Bioware for inflicting that option upon the public? Would having the optional patch available make Bioshock a worse game?

I'm not arguing against user-selectable graphical options. Far from it.

Here's the crux of my argument: there's no such thing as too much performance. This is why we have IQ-enhancing features and options like high resolutions, AA and AF.
 