Silent_Buddha
Legend
It's not that AFR "sucks" per se, just that it's rather limited. It works well only when you have two symmetric GPUs. Consider that forthcoming AMD CPUs are going to have integrated graphics, and they'd love to use that for a little extra boost to graphics performance when you're using a discrete GPU. You certainly can't do AFR when you have a 4x or 8x or greater performance difference between the two GPUs.
It wouldn't surprise me to see ATI invest quite a bit in new forms of multi-GPU rendering over the coming year.
No, AFR does well and truly suck, IMO.
AFR still has the control latency of whatever the single-GPU frame rate would be, even while rendering more frames per second. So if a single GPU would do 20 fps but AFR gets you 35, your input response still feels like 20 fps. Totally unacceptable for me.
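To put rough numbers on that (just back-of-the-envelope arithmetic with the 20/35 fps figures above, nothing more):

```cpp
// Back-of-the-envelope AFR latency arithmetic (figures from the post).
#include <cstdio>

int main() {
    const double single_gpu_fps = 20.0; // one GPU on its own
    const double afr_fps        = 35.0; // two GPUs with AFR
    // Each individual frame still takes the single-GPU frame time to finish,
    // so the input sampled for a frame is not shown on screen any sooner.
    const double input_to_display_ms = 1000.0 / single_gpu_fps; // ~50 ms
    const double frame_pacing_ms     = 1000.0 / afr_fps;        // ~28.6 ms apart
    std::printf("frames arrive ~%.1f ms apart, but input-to-display stays ~%.0f ms\n",
                frame_pacing_ms, input_to_display_ms);
    return 0;
}
```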
On top of that, some devs have even commented on having to avoid AFR-unfriendly rendering techniques, since those would prevent any multi-GPU scaling. Basically, that's any situation where a following frame relies on the results of the previous one. Since you're essentially rendering 2 frames nearly simultaneously, if one frame depends on the previous one, you can't start rendering it until that frame is done. I'm trying to remember which game it was in reference to, but one dev had to cut back graphical effects so that AFR could work.
Not a programmer, so that's just my layman's understanding of what they were saying.
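If it helps, here's a minimal sketch of the kind of dependency being described; the Image type and renderFrame function are made-up stand-ins for illustration, not any real engine's code:

```cpp
// Hypothetical sketch of an AFR-unfriendly technique: frame N reads
// the output of frame N-1 (e.g. blending in a history buffer).
#include <vector>

struct Image { std::vector<float> pixels; };

// Because the previous frame's result is an input here, a second GPU
// cannot start frame N+1 until frame N has finished: the chain is serial.
Image renderFrame(const Image& previousFrame) {
    Image out = previousFrame;              // stand-in for real GPU work
    for (float& p : out.pixels) p *= 0.9f;  // e.g. feeding history into the new frame
    return out;
}

int main() {
    Image frame{std::vector<float>(1024, 1.0f)};
    for (int n = 0; n < 4; ++n)
        frame = renderFrame(frame); // each iteration must wait for the last
    return 0;
}
```

Remove that frame-to-frame dependency and the two GPUs can genuinely work on their frames in parallel again, which is presumably what the dev had to do.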
Regards,
SB