The question posed was ridiculously broad and made no mention of theoretical limits. So given a choice between any amount of AA and any amount of resolution, I'd pick resolution. It's the better option, since it effectively provides AA at a distance and clarity up close, where low resolution is just a blur.
Now if the question is presented in a fashion that's technically answerable, with some hardware reference ("given this system and this context"), then we can go into the pros and cons of the different IQ techniques, their performance impacts, and the best overall compromises. But that's not what the OP was asking.
Well, if we're going to keep with the ridiculous, then obviously playing something at "infinite" x "infinite" resolution trumps everything.
However, back in the real world, there just is no case on any machine existing today where you can play a detailed 3D game at playable resolutions on a display dense enough that you can still resolve key details, like whether that's my character or just a black dot... In any real-world case, AA still trumps max resolution. Although I suppose a 2560x1600 resolution 5" (yes, five-inch) display may mask most aliasing artifacts.
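For anyone curious, here's a quick back-of-the-envelope sketch of why a panel that dense could hide most aliasing (my own numbers, including the assumed 12" handheld viewing distance): it works out to well past the roughly 60 pixels per degree usually quoted for 20/20 vision.

    import math

    def pixels_per_degree(h_px, v_px, diagonal_in, distance_in):
        # linear pixel density from the diagonal, times the inches subtended by one degree
        ppi = math.hypot(h_px, v_px) / diagonal_in
        return ppi * 2 * distance_in * math.tan(math.radians(0.5))

    # hypothetical 5" 2560x1600 panel (~604 PPI) held at ~12"
    print(round(pixels_per_degree(2560, 1600, 5, 12)))   # ~126 px/degree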
And to the person above who mentioned CRTs: yes, I have used high-end 17" CRTs at 1800x1440 and 20" CRTs at 2048x1536, and aliasing artifacts were still a problem.
I even had the pleasure of using the IBM 24" LCD with 3840x2400 resolution. And guess what? Aliasing and aliasing artifacts were still an issue.
Added to that, now performance was an even worse issue (ignoring the horrible pixel response of the IBM display).
And yes, I realize that if I sat a football field away from my 46" TV, I wouldn't notice aliasing even at 720p; however, I also wouldn't be able to discern much beyond a blob of light that changed in intensity every once in a while.
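To put a rough number on that (again my own arithmetic, with ~100 yards = 3600" as the assumed "football field"): from that distance a 46" 720p screen covers only about two thirds of a degree of your visual field, and its effective density works out to roughly 2000 pixels per degree, so of course there's no visible aliasing, or much of anything else.

    import math

    def pixels_per_degree(h_px, v_px, diagonal_in, distance_in):
        ppi = math.hypot(h_px, v_px) / diagonal_in
        return ppi * 2 * distance_in * math.tan(math.radians(0.5))

    distance_in = 3600                                    # ~100 yards (my assumption)
    width_in = 46 * 1280 / math.hypot(1280, 720)          # screen is ~40" wide
    angle = math.degrees(2 * math.atan(width_in / 2 / distance_in))
    print(round(angle, 2))                                # ~0.64 degrees - basically a blob
    print(round(pixels_per_degree(1280, 720, 46, distance_in)))   # ~2000 px/degree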
Anyways, as I said before, some people aren't bothered by aliasing, don't notice it, or just aren't sensitive to it.
But it drives the rest of us absolutely batty.
Regards,
SB