R700 Inter-GPU Connection Discussion

I thought the point was that Crysis is inefficient and slower per "IQ metric" than other games, which is why the next Crysis engine is being "fixed" to run better at the same visuals.

We're not discussing the "why" of Crysis' apparently low performance (or lack of increase in performance with increase in GPU capability). That would be even more off-topic than we already are :p
 
I don't see why people are so down on Crysis. There are now several GPUs capable of playing it at very high settings as long as you keep the resolution to 720p, and some that can go higher.

I would much rather have a game that can bring a high-end GPU to its knees at the highest settings than a game I need to run at stupidly high resolutions with piles of AA just to make my GPU sweat a little. That's as long as the visuals match the performance requirement, of course, which in Crysis's case they do.

+10
Totally agree, IQ for the win. Maybe the engine scales a bit below average on multi-GPU setups, but the possibility to raise the IQ to those insane levels, hell yeah.
This game stands miles above others in IQ IMHO. And remember, you don't have to play it on VH to have the best visuals either.
 
An average of 30 fps means MIN fps in the teens or lower, which is not acceptable for a shooter where slowdowns occur during intense moments, IMO.
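
Just to spell out the average-vs-minimum point with a quick sketch (the frame-time numbers below are made up purely for illustration, not measured Crysis data): a run can average 30 fps while its worst frames still drop well into the teens.

# Hypothetical frame times in milliseconds for a short stretch of gameplay:
# mostly ~28-29 ms, with a few spikes during an "intense moment".
frame_times_ms = [28, 28, 27, 29, 28, 28, 27, 29, 28, 28, 28, 50, 52, 57]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000.0 / max(frame_times_ms)  # slowest frame -> lowest instantaneous fps

print(f"average fps: {avg_fps:.1f}")  # ~30.0
print(f"minimum fps: {min_fps:.1f}")  # ~17.5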

To YOU it might be, but please don't try to generalize and make it seem like a fact that it's unacceptable.

Sure if I was still competing professionally in FPS leagues and tournaments, I'd find min fps in the teens to be unacceptable.

However, "I" am perfectly happy dipping into the teens in a single player game if the graphics are nice enough. "I" don't have to feel like I'm trying to prove anything. I can freely just play and enjoy the scenery, graphics, and any physics that might be employed. Likewise, I can also be quick to find fault with bad implementations of things.

I have friends that are happy even with fps dipping into single digits. Hell, they sometimes never finish a game, yet are very happy to have played it for however long they could.

I also have friends that are quite unhappy if FPS dips below 30. They are quite happy to play ALL modern games at the lowest settings possible in order to have the smoothest gameplay possible.

Either way, while I may not like Crysis as a "game", I certainly do hold them up higher than most other companies for giving me the option of cranking the visuals up far past what current-generation hardware (at the time of launch) could handle.

And if I wanted a smooth gaming experience I also had the option of cranking it down.

However, now that R700-level performance is nearly here and GTX 280-level performance is already available, even those ultra-high settings are starting to become playable. Now if only CPUs were keeping up with GPU advances.

Regards,
SB
 
One additional note on Crysis vs. CoD4: please keep in mind the overall structure of the levels and how much is dynamically calculated in one versus scripted in the other. That all takes time, because you cannot go with pre-baked stuff most of the time.

So, IMO it's not only graphics that need to be taken into account but the whole concept of the games.
 
There are diminishing returns with graphics engines. One gen ago, Doom 3 and Far Cry were very much worth playing on my already outdated Ti 4200 (800x600, 2x AA, good enough on a CRT).
You got entirely new features: unseen lighting, level design, normal mapping, etc., and the satisfaction of getting the very best the old card was capable of. Now that everything has been done, you need high settings and high res to make such a heavy game worth playing.

Not that it's wrong, but it messes with low-end buyers and my schedule (a used former high-end card every two years; the 6800GT was a year ago, so on that schedule I'd get an 8800GTS in a year).

Not sure why I'm debating that in a thread about dual GPUs, unless it's to agree that big GPUs are increasingly needed ;)
 
[attached image: leaked R700 slide showing the inter-GPU link bandwidth]


What could be done with this significant boost in inter-chip bandwidth?
 
Maybe they'll get a reduction in latency whenever sharing render target results in either direction.
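
To get a rough sense of scale, here's a back-of-the-envelope sketch; the 5 GB/s link figure and the 1920x1200 RGBA8 buffer are just assumed numbers for illustration, and real transfers carry extra overhead.

# Back-of-the-envelope: time to move one render target over an inter-GPU link.
# Assumed numbers: 1920x1200 RGBA8 colour buffer, ~5 GB/s of link bandwidth.
width, height, bytes_per_pixel = 1920, 1200, 4
rt_bytes = width * height * bytes_per_pixel   # ~9.2 MB
link_bandwidth = 5e9                          # bytes per second

transfer_ms = rt_bytes / link_bandwidth * 1000.0
print(f"render target size: {rt_bytes / 1e6:.1f} MB")
print(f"transfer time at 5 GB/s: {transfer_ms:.2f} ms")  # ~1.8 ms of a 16.7 ms frame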

Jawed
 
Btw, where did you find those slides? Has the whole presentation been leaked yet?
 
5Gb/s full-duplex... Yay! I was right that there is a second PCIe (2.0) port in that chip! :D
Sorta.

http://www.techreport.com/articles.x/15293

AMD says the sideport is electrically similar to PCI Express but is simpler because it's only intended as a peer-to-peer link between GPUs.
Also, something that patent documents have long previewed:

On top of that, AMD says the bridge chip supports a broadcast write function that can relay data to both of the X2's graphics processors simultaneously, conserving bandwidth.
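
Purely as an illustration of the bandwidth argument (a toy model of my own, not AMD's actual mechanism): the win is that data both GPUs need only crosses the upstream link once, with the bridge fanning it out.

# Toy model: bytes crossing the host-to-bridge link when the same data
# (e.g. per-frame constants/textures in AFR) must reach both GPUs.
def unicast_bytes(payload_bytes, num_gpus=2):
    # Without broadcast, a separate copy is sent to each GPU.
    return payload_bytes * num_gpus

def broadcast_bytes(payload_bytes):
    # With broadcast, one copy reaches the bridge, which relays it to both GPUs.
    return payload_bytes

payload = 8 * 1024 * 1024  # 8 MB of shared data, a hypothetical figure
print(f"unicast:   {unicast_bytes(payload) / 2**20:.0f} MB over the host link")
print(f"broadcast: {broadcast_bytes(payload) / 2**20:.0f} MB over the host link")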
Jawed
 
According to AnandTech, the CrossFire sideport will be disabled by default due to power limits... also, hardware manufacturers are free not to wire it up?!
So much for the hype of "next gen multi chip" ...
 
On top of that, AMD says the bridge chip supports a broadcast write function that can relay data to both of the X2's graphics processors simultaneously, conserving bandwidth.
The sort of thing NV's BR04 has had for ages? :D

About the whole XSP bonanza -- the question is, does that thing really work out there, or is the issue just a matter of pure driver implementation?
 
According to AnandTech, the CrossFire sideport will be disabled by default due to power limits... also, hardware manufacturers are free not to wire it up?!
The latter makes the former irrelevant ...

They can't very well come out later saying "well, stealthily there were actually two different versions of the card, and you bought the sucky one ... sorry". Even if there are performance gains to be had, if it's optional it might as well not be there. If this is really true, I think there has been a lack of communication at ATI ... it seems unlikely that the same people getting ready to talk up a feature would turn around and let the card manufacturers make it impossible for them to ever support it.
 
What a letdown. It seems, according to AnandTech (via AMD), that the much ballyhooed chip didn't improve X2 performance in any appreciable way, which is why they disabled it in the first place.

Seems AMD's mistake was talking up the thing.
 
Isn't someone meant to be donning a pink tutu as a result of this?

Jawed

Nope, because that bet involved the R700 being released, not the R700 having a working interconnect upon release :)
 