So in short: is the ATI MLAA the same as GOW3 MLAA?
Is the MLAA available for PS3 devs the same as the GOW3 MLAA?
I am a bit surprised to see that MLAA blurs the image a bit compared to no AA. GOW3 looks super crisp to me, and I wonder if their MLAA had the same slight blur as seen in the ATI shots?
I am only aware of this comparison here showing "Sony MLAA":
http://www.neogaf.com/forum/showpost.php?p=22661479&postcount=29
but the pics are rather small to judge blur - what do you guys think?
PS: I am astounded by how much detail the 1080p mode packs in!!
Higher resolution for the win?!
Probably not the same. Since...
Developer-applied MLAA can be significantly better because the developer can choose where and when to apply it; it's a shader pass, after all. Driver-implemented MLAA, by contrast, is "all or nothing": ATI's solution operates on the final framebuffer. The game renders the scene as normal, the driver applies MLAA to the finished frame, and the result is forwarded to the monitor.
Therefore everything gets MLAA applied, regardless of whether it would benefit or not, and in the case of things like text it can actually make them worse. That's where developer-applied MLAA has its most obvious advantage: in the simplest case, text can be overlaid on the image after MLAA has been applied. I would imagine you could do similar things with, say, detail textures. As said, developers can choose where and when to apply it.
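A rough way to picture the difference is just the ordering of the passes. This is only a toy sketch, not any real engine or driver API; all of the function names here (render_scene, apply_mlaa, draw_hud, present) are made up for illustration:

```python
# Hypothetical sketch: developer-applied vs driver-applied MLAA ordering.

def render_scene():
    return "scene pixels"          # 3D geometry with aliased edges

def apply_mlaa(image):
    return f"MLAA({image})"        # edge-detect + blend pass over the whole image

def draw_hud(image):
    return f"{image} + HUD text"   # UI/text composited on top

def present(image):
    print("to display:", image)

# Developer-applied: filter only the 3D scene, then overlay the text,
# so the HUD never goes through the MLAA pass and stays crisp.
present(draw_hud(apply_mlaa(render_scene())))
# -> to display: MLAA(scene pixels) + HUD text

# Driver-applied: the driver only ever sees the finished frame,
# so the filter runs over everything, text included.
present(apply_mlaa(draw_hud(render_scene())))
# -> to display: MLAA(scene pixels + HUD text)
```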
And that doesn't even get into possible differences in what kind of MLAA is being used.
So ATI's MLAA is sometimes nice, sometimes not so nice. And as pointed out by some others, it's not always comparable to or better than standard box MSAA + transparency AA. But it's another option we can use, and as such, it's nice when it does work well.
Heh, higher resolution is nice, but in your case you're actually commenting on higher pixel density. For example, take a 46" TV: a 720p image has fewer pixels per inch than a 1080p image, so it will look more granular and coarse. Same details in both, but the 1080p image has more pixels per inch, so those details can be rendered more sharply.
Now take my home situation, where a 1920x1200 image on a 24" monitor looks visually the same as a 2560x1600 image on a 30" monitor, other than the obvious size difference. The 30" is very slightly sharper, since it has a small advantage in PPI (pixels per inch), but for all practical purposes details are equally sharp on both.
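To put rough numbers on it, this is just the standard diagonal PPI calculation, assuming the exact panel sizes mentioned above:

```python
import math

def ppi(h_pixels, v_pixels, diagonal_inches):
    """Pixels per inch along the panel diagonal."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

# 46" TV at 720p vs 1080p
print(round(ppi(1280,  720, 46), 1))   # ~31.9 PPI
print(round(ppi(1920, 1080, 46), 1))   # ~47.9 PPI

# 24" 1920x1200 vs 30" 2560x1600
print(round(ppi(1920, 1200, 24), 1))   # ~94.3 PPI
print(round(ppi(2560, 1600, 30), 1))   # ~100.6 PPI
```

Same screen size, more pixels per inch at 1080p on the TV; and the 24"/30" pair land within a few PPI of each other, which is why they look about equally sharp.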
Regards,
SB