R700 Inter-GPU Connection Discussion

And if you remember my comments in that thread, I said I agree with kyleb that 1080p offers no advantage over 768p under your viewing conditions, provided that the image isn't aliased (e.g., a TV show or movie).

I must've missed your aliasing comments earlier. Thanks for clarifying.

16xCSAA isn't perfect, and according to some of the IQ comparisons we've seen, some edges can actually be worse.

Yes, I've seen this in Oblivion especially on large buildings.

For the most part, though, you're probably seeing shader aliasing, so your experience is not necessarily a justification for higher AA levels.

Definitely edge-aliasing. Shader/specular aliasing is minimal in most games I play w/16xCSAA.

Edit: except for D3-engine games, which are notorious for specular aliasing; I do observe quite a bit in D3 and Prey, although the former is much worse. Q4 isn't too bad, though.
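
The claim earlier that 1080p offers no advantage over 768p comes down to pixels per degree of visual angle at the seating distance. A minimal Python sketch of that geometry; the 1.0 m screen width and 3.0 m viewing distance are hypothetical numbers, not anything from this thread:

Code:
import math

def pixels_per_degree(h_pixels, screen_width_m, distance_m):
    # Horizontal field of view subtended by the screen, in degrees,
    # and how many pixels land in each degree of it.
    h_fov_deg = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
    return h_pixels / h_fov_deg

# Hypothetical living-room setup: a 1.0 m wide screen viewed from 3.0 m.
for h_res in (1366, 1920):
    print(h_res, "px wide:", round(pixels_per_degree(h_res, 1.0, 3.0), 1), "px/deg")

# 1366 px -> ~72 px/deg, 1920 px -> ~102 px/deg. 20/20 vision resolves
# roughly 60 px/deg, so once both exceed that, the extra pixels of 1080p
# stop being visible on non-aliased content like film or TV.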
 
??? He compared Crysis to COD4 and Bioshock and said that the latter looked better at the same performance levels, which has nothing to do with old HW/new SW mismatching. You threw out his whole post to focus on one out-of-context sentence, and only part of that sentence.

It still doesn't dismiss the fact that his main point of contention was that he couldn't run Crysis on his 7600 GT.

Or that he was replying to another post regarding Crysis at lowered settings looking better than Bioshock or COD4 at higher settings.

And the implication that everything should continue to play and look good with 7600 GT-level hardware.

While it may be debatable depending on personal taste and preference whether Crysis at lowered settings looks better than Bioshock or COD4 at higher settings, the point remains that as technology progresses, older hardware won't be able to run it.

Crysis by all accounts (similar to Far Cry) sought to target hardware not yet out (CPU/GPU) while having the ability to customize settings for current and past hardware.

I suppose they could have just left out the Very High settings and made High the highest selectable. Then you'd hear far fewer complaints about how badly the engine runs, even though it's still exactly the same engine.

Or maybe bump up Medium, call it High, and have that as the highest selectable setting when it was released. Then it would have run well on most everyone's machines and no one would complain.

And we're getting to the point where hardware capable of VH settings at higher res is slowly starting to arrive (the R700 being able to bump up settings even more than the 4870), although it may be another generation before that's fully apparent. I think it'll require another generation of both GPUs and CPUs.

And that, I think, is great. Game devs shouldn't ONLY push the GPU; they should also push what the CPU is capable of.

And in that case, Crysis is hurt by the fact that CPUs aren't progressing as fast as GPUs.

Regards,
SB
 

Again, this just seems like another disingenuous post that is trying to totally and utterly ignore the quote:

Crysis looks much worse and plays worse than Far Cry on my comp

Regardless of anything, that is a DX9 card playing recent DX9 games (one even being a port, so it should run or look worse), so unless the rest of his computer is somehow making Crysis "look" worse, his contention IS that Crysis not only runs worse, but looks worse as well at the same performance levels.

I won't comment on that comparison between games because I have not personally played Crysis, much less seen it on lower detail levels.

And I would say that:
thankfully I've got Bioshock to have some good-looking and playable recent gaming.
is in fact a direct comparison to the sentence immediately preceding it.
 
that leaves people that don't upgrade very regularly in the cold.
Crysis looks much worse and plays worse than Far Cry on my comp (the 7600GT is a bit of a weak link here; should have got better, but that was a sidegrade in a hurry from a 6800GT AGP). Thankfully I've got Bioshock to have some good-looking and playable recent gaming. You can play it in 1920, I can play it in 1024; but I can't play Crysis at all.

I never mind older games with maxed-out IQ as well (waiting for Quake Live, 'cos these damn Q3 bots suck!).
I also hope id's Tech 5 (and that interesting Rage game) will run easily on lowly machines, as did the Q3 and Doom 3 engines (and even UE3, it seems).

Well, on the one hand you could argue that Crysis should have scaled down better, but that's no reason why it shouldn't scale up as well. If, however, reaching the graphical heights of Crysis (including things like density of objects, view distance, physics) fundamentally requires a minimum amount of power, and the engine therefore simply cannot scale down further than it has, then I'm of the belief that people on low-end hardware should simply lose out.

After all, why should people who pay the money for high-end setups not have the opportunity to use all that power, just because people who don't spend much money on graphics wouldn't be able to run the game?
 
That is Not True. Crysis definitely DOES NOT look as good as it should for its processing power requirements!!!

I'm not the slightest bit interested in opinions that are presented as facts.

Just a BAD ENGINE; anyone trying to defend otherwise is just slow, sorry.

Great start here at B3D. With an attitude like that, hopefully you won't be around for long.

Compare two new games (GRID, COD4) and play them at highest playable quality; does Crysis then look best?

Yes without any doubt what so ever (coming from someone who has all 3 games).

Hell NO! Why? Because you can't go higher than medium settings at medium resolution with medium antialiasing...

Look at all the benchmarks you want; they will prove my point.

I can't be bothered posting links, for the simple fact that everyone who reads this knows what you just wrote is complete rubbish. Crysis is more than playable at high settings on pretty modest PCs by today's standards, and at modest resolutions the higher-end setups can handle Very High with little problem. Take a look at the thread you're in, then go and look at the benchmarks for that GPU and the GPUs it's compared with. V.High at 720p is perfectly playable for most of them.

And yes, Crysis looks much better than either of those games at Very High. It looks a fair bit better at High as well, and it's arguable whether it looks worse even on Medium.

Why do you think the new Crysis has to make all that better, with claims that it will run better by needing less processing power??

Because optimising an engine is always better, of course :rolleyes:
 
I'm not the slightest bit interested in opinions that are presented as facts.

but you have no problem doing it yourself?

I think there's a case for quite a number of games at 19x12 with 4xAA vs. high Crysis at 1024x768 with no AA, but there's really no point; it's just opinion and not all that relevant to the thread.

Perhaps a mod can spin off a "Crysis as a graphics benchmark" thread.
 
Quote:
Eventually, of course, older hardware just won't be able to run newer games well or with any sort of IQ.
--------------------------------------------------------------

Crysis can't be run at any high resolution with the latest video card, but sure, it's the hardware that's "older"!

:D

Crysis at maximum settings and a resolution of >720p....

Even the 9800GTX+ ($230) is fast enough to run the game at greater than 30fps at these settings.

http://www.driverheaven.net/reviews.php?reviewid=588&pageid=6

If you're going to make sarcastic comments, then at least check that you're correct first.

And yes, Crysis can still be run at higher resolutions as well as long as the detail settings are reduced.

Hell, you can play the game on a single 3870 at 1920x1200 if you want to (medium settings):

http://www.legitreviews.com/article/745/6/
 
I think there's a case for quite a number of games at 19x12 with 4xAA vs. high Crysis at 1024x768 with no AA,
Perhaps a mod can spin off a "Crysis as a graphics benchmark" thread.

That's not even remotely an accurate comparison. A GPU that can run CoD4 at 1920x1200/4xAA should be able to handle Crysis at V.High 720p no problem. That's the comparison you should be making.

In my opinion, as someone who has completed CoD4 and played the Crysis demo dozens of times, Crysis wins that comparison hands down.
 
That's not even remotely an accurate comparison. A GPU that can run CoD4 at 1920x1200/4xAA should be able to handle Crysis at V.High 720p no problem. That's the comparison you should be making.

um not from what I've seen.

In my opinion, as someone who has completed CoD4 and played the Crysis demo dozens of times, Crysis wins that comparison hands down.

I think it's pretty well established where your opinion lies. It's still fair to say we don't agree.
 
um not from what I've seen.

Well my GTS640 runs Crysis like butter at 1024x768/High but no way can I pull off 1920x1200/4xAA in CoD4. In fact I play CoD4 at 1280x800/16xAA/16xAF. Any higher than that and it becomes choppy.

Meanwhile I'm playing Crysis at 1280x800/16xAF/High.
 
http://www.anandtech.com/video/showdoc.aspx?i=3354

When AMD began talking about no longer building high end hardware using single monolithic GPUs a few weeks back, we let them know that improving CrossFire support would be incredibly important going forward. AMD told us that they are putting a lot into that but also that they have some exciting technology up their sleeves with R700 to help out as well. Unfortunately, we haven't gotten as much detailed information on how it works, but the new technology is GPU to GPU communication.

Until now, CrossFire has done zero GPU to GPU or framebuffer to framebuffer communication. As with the first iteration, each card fully renders the parts of the screen for which it is responsible (be it a whole frame in AFR, the top or bottom half of a screen, or alternating tiles). These results are sent to a combiner where the digital signals are merged and output to the screen. This is the only communication that takes place in CrossFire at the moment. R700 will change that by allowing GPUs to communicate.


RV770 has a CrossFire X Sideport... we assume that the two RV770s on a single R700 board somehow connect their Sideports for fast communication. AMD hasn't told us how yet.

It is not clear how extensive this communication will be, what information will be shared, or how much bandwidth requirements are increased because of this feature. And while it is a step in the right direction, the holy grail of single-card multi-GPU solutions will be a shared framebuffer. Currently both GPUs need a copy of all textures, geometry, etc., and this is a huge waste of resources. While the R700 has 2GB of RAM on board, it will still be limited in many of the same ways a 1GB RV770 would be as each GPU only has access to half the RAM on the card. Of course, since we don't have a 1GB RV770 yet, this card could show some advantages over the single 4870 regardless of CrossFire.
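
The memory point in that quote is easy to put in numbers: with no shared framebuffer, every GPU holds its own copy of textures and geometry, so a board's usable capacity for assets is only its per-GPU slice. A tiny sketch of that arithmetic (the function name is ours, purely illustrative):

Code:
def effective_vram_gb(board_vram_gb, num_gpus, shared_framebuffer):
    # Without a shared framebuffer, each GPU duplicates all assets,
    # so only a per-GPU slice of the board's memory is usable.
    return board_vram_gb if shared_framebuffer else board_vram_gb / num_gpus

# R700-style board per the quote: 2GB total across 2 GPUs.
print(effective_vram_gb(2.0, 2, False))  # 1.0 -> same limit as a 1GB RV770
print(effective_vram_gb(2.0, 2, True))   # 2.0 -> the "holy grail" shared case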

I think possibly the entire graphics community is just waiting to jump on AMD/ATI for world-class multi-GPU support as soon as R700 is official. He who says A must say B. ATI has said A... the graphics community will hold them responsible for saying B as well.

Y'hear me, Terry? We're coming for ya, man --be ready.
 
Okay, I think I threw the whole thread off with my Crysis comment, but my thing is more that it just doesn't seem to scale. It seemingly scales very little.

The new GPUs have nearly double or more of most functional units (consider 3870 vs 4870), yet they seemingly get only a small increase in Crysis frames; from, oh, 25 to 28 or something, a lot of the time it seems. And yes, I know my figures aren't exact, so don't quibble with that part.

I also read something to the effect that Crysis does an extreme number of rendering passes (7?) and therefore AA is a tremendous hit.

I think Crysis Warhead will address all this and be very interesting to benchmark. Crysis Warhead appears to be the Crysis engine "fixed".
 
Alright, my apologies for the tone in my post(s).

@Rangers, I agree (where 'fixed' to me means corrected faults in the engine).

My point was that Crysis cannot (yet) be run at high resolutions.
720p/768p is about 1 megapixel, which is very low of course...
(and that does not look good imho, very edgy)

I have a 7950GT 512MB and a 3870X2. The 3870X2 can run any game (except Crysis) maxed out at 19x12. (Wouldn't matter if I added another X2.)

I have ordered two 4870s, but seeing the benchmarks, even then my screen will not show a crisp Crysis. It does, however, show very, very nice quality in almost all other games. That just makes me feel a bit angry, making me blame the engine...
----------------------------------
@pjbliverpool
quote:
(coming from someone who has all 3 games).
-------------------------------------
To assume makes an ass out of you and me, my friend!

When I look at the maximum quality I can get out of Crysis with both setups, other games look better; that was just my point.
 
@pjbliverpool;
quote:
Even the 9800GTX+ ($230) is fast enough to run the game at greater than 30fps at these settings.
-----------------------------------------
Min. 23 FPS is not fast enough for me, haha. Most people like about 60 frames in FPS games (your link).
And 768p is still too edgy for me; about 2 MP is alright (1080p minimum, settings maxed).
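
For reference, the megapixel figures being thrown around check out, assuming 768p here means 1366x768 (a quick sketch):

Code:
for name, (w, h) in {
    "720p":      (1280, 720),
    "768p":      (1366, 768),
    "1080p":     (1920, 1080),
    "1920x1200": (1920, 1200),
}.items():
    print(name, ":", round(w * h / 1e6, 2), "MP")

# 720p ~0.92 MP and 768p ~1.05 MP ("about 1 megapixel");
# 1080p ~2.07 MP and 1920x1200 ~2.30 MP ("about 2 MP").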
 
Could you at least learn how to use the quote button?

It saves you typing all the minuses (-) and makes it readable for everyone else here.
 
Okay, I think I threw the whole thread off with my Crysis comment, but my thing is more that it just doesn't seem to scale...

I think at lower settings it's pretty CPU-bound; however, once you hit very high settings it seems to scale as expected with GPU power. This is a pretty good example of that:

http://www.legitreviews.com/article/745/6/

It doesn't scale massively, however; certainly not as much as some other games out there, so it's still at least partially being held up by other elements of the system.
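
A crude Amdahl-style model shows why a near-doubled GPU can move Crysis from roughly 25 to 28 fps when the CPU dominates the frame, and why scaling reappears once the GPU share dominates. The millisecond splits below are hypothetical, picked only to illustrate the shape of the effect:

Code:
def fps(cpu_ms, gpu_ms, gpu_speedup):
    # Serial model: frame time = CPU work + GPU work / speedup.
    # Real pipelines overlap the two, but the CPU-bound ceiling is the same.
    return 1000.0 / (cpu_ms + gpu_ms / gpu_speedup)

# Hypothetical CPU-heavy split at lower settings:
print(round(fps(30, 10, 1), 1))  # 25.0 fps baseline
print(round(fps(30, 10, 2), 1))  # 28.6 fps with a 2x faster GPU

# GPU-heavy split at very high settings, where scaling returns:
print(round(fps(10, 60, 1), 1))  # 14.3 fps baseline
print(round(fps(10, 60, 2), 1))  # 25.0 fps with a 2x faster GPU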
 
When I look at the maximum quality I can get out of Crysis with both setups, other games look better; that was just my point.

Well, you're certainly entitled to that opinion. But I question whether you're limiting yourself to High or even Medium settings for performance reasons and then making a judgement based on that.

I would imagine a 3870X2 can handle 720p (console resolution, so not all that bad) at very high settings, but perhaps not. If you think other games look better at very high res/AA than Crysis at V.High/720p, then I guess we'll just have to disagree and leave it at that.

Min. 23 FPS is not fast enough for me, haha. Most people like about 60 frames in FPS games (your link).

But a 60fps average will dip as well. Probably not into the 20s, but either way, a 34fps average is plenty playable for a non-twitch shooter like Crysis. In fact most console FPSes run at a similar speed, and 23 as a minimum isn't too bad as long as it's temporary and as low as it ever goes. Despite what that site says, other sites have called 25fps (average) playable for Crysis because of its excellent motion blur. I wouldn't want 25fps in an ideal world, obviously, but I can attest to the fact that Crysis does feel smoother than most games at a lower framerate, and a steady 25fps can certainly feel like 30fps does in other games. Hence 23fps as a minimum should be fine. Calling it unplayable would be to condemn a very large number of console games as unplayable, and the majority of gamers seem to get on fine with them.
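
Converting the contested framerates into per-frame time budgets makes the gap concrete (a trivial sketch):

Code:
def frame_time_ms(fps):
    return 1000.0 / fps

for f in (23, 25, 30, 34, 60):
    print(f, "fps =", round(frame_time_ms(f), 1), "ms/frame")

# 23 fps = 43.5 ms vs 30 fps = 33.3 ms: a ~10 ms/frame gap that good
# motion blur can mask; 60 fps = 16.7 ms is the twitch-shooter budget.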

And 768p is still too edgy for me; about 2 MP is alright (1080p minimum, settings maxed).

I agree, higher is most definitely better, and 720p with no AA is far from ideal. But we have a finite amount of power available to us, and I would rather see games that give us the option of using that power to increase the core graphics details rather than forcing us to spend it on res/AA. Remember, Crytek aren't forcing you to play Crysis at low resolutions; it's simply an option they have provided if you wish to take it. Crysis is more than playable at 1920x1200 on decent hardware, and it will still stand up to the other great-looking games at the lower settings required to make that possible.

The really disturbing thing about this whole discussion is how Crytek are getting flamed for providing graphics options which push the boundaries. How do PC gamers ever expect to get games which utilise their hardware and go beyond console graphics if this is how they react when a developer does so? id have already abandoned pushing graphics on the PC. It seems as though Crysis/Warhead will be Crytek's last push as well.

This thread is about R700. We should enjoy it while we can, because if this is the attitude towards boundary-pushing graphics, then the accelerators that allow them will be short-lived.
 
But a 60fps average will dip as well. Probably not into the 20s, but either way, a 34fps average is plenty playable for a non-twitch shooter like Crysis...


Hey, but it says on the websites I need at least 90fps for my Counter-Strike 1.6?!

I can't think of any console game up until this generation that didn't run at a fixed 25 or 30 fps. There's simply no NEED for more FPS. The same goes for all PC games, UNLESS the network code has a higher data sampling rate.
So basically, the FPS of a game would be sufficient at 23/25/30, and it's the network code that would define "smoothness" for most... ahem... "players".
 
Well, you're certainly entitled to that opinion. But I question whether you're limiting yourself to High or even Medium settings for performance reasons and then making a judgement based on that.

That's the point though. You can't dismiss performance here; what good is IQ if the game is unplayable?
 
I can't think of any console game up until this generation that didn't run at a fixed 25 or 30 fps. There's simply no NEED for more FPS.

Console shooters could never expose the advantage of 60fps because they can never deliver the accuracy or twitchiness a mouse can bring and utilize.
 