IIRC, it was just taking advantage of the ROPs being able to spit out multiple samples per clock and then reconstructing a larger frame from them, but a few games got funny artefacts on edges as a result (like a crenellation) or a blur, can't remember which (it might have depended on how they went about the reconstruction).
Otherwise, the idea was to perform per-sample shading/lighting so the AA was correct for the original frame (alias the buffer to 2560x720, do the shading, then go back down to 1280x720).
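Roughly what I mean, as a toy sketch (NumPy standing in for the HW; treating the two samples as horizontal neighbours is an assumption on my part, real 2xMSAA patterns are offset diagonally):

```python
import numpy as np

# Hypothetical 1280x720 target with 2x MSAA: two samples per pixel.
H, W, S = 720, 1280, 2
samples = np.random.rand(H, W, S).astype(np.float32)  # stand-in for shaded samples

# "Alias" the 2xMSAA buffer as a 2560x720 single-sample surface:
# the two samples of each pixel become two adjacent columns.
# (Assumes the samples sit at horizontal offsets, which real 2x patterns
# don't quite do -- presumably where the crenellated edges came from.)
wide = samples.reshape(H, W * S)  # 720x2560 view, no copy

# ...per-sample shading/lighting work would happen on `wide` here...

# Resolve back down to 1280x720 by averaging each horizontal pair
# (a plain box filter; an actual resolve might weight samples differently,
# which would be the blurry variant).
resolved = wide.reshape(H, W, S).mean(axis=2)  # 720x1280

print(wide.shape, resolved.shape)  # (720, 2560) (720, 1280)
```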
Just a silly thought probably.
edit:
Ah right... Good ol' Quaz
http://forum.beyond3d.com/showpost.php?p=1088796&postcount=673
Looks a lot worse than I remember. :s
But he did seem to imply that 4xMSAA wouldn't have that sort of problem:
http://forum.beyond3d.com/showpost.php?p=1098917&postcount=850
I suppose it'd end up being blurred anyhow. :|
---
Anyways, I suppose it'd have to be compared against a direct upscale by the HW instead of mucking about via SW (to get to 4K, or mashing about with non-integer buffer scaling); rough sketch below. Was just a passing thought experiment... thing... idea... onions. Not like 4K support is meaningful yet.
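For the comparison, the "direct upscale by the HW" path is basically just a bilinear stretch of the finished frame, e.g. a non-integer 1.5x from 720p (sketch only; scipy's zoom is standing in for the scaler here):

```python
import numpy as np
from scipy.ndimage import zoom

frame = np.random.rand(720, 1280).astype(np.float32)  # stand-in 720p frame

# Non-integer 1.5x scale, roughly what a HW scaler's bilinear pass does
# (order=1 means linear interpolation).
upscaled = zoom(frame, 1.5, order=1)
print(upscaled.shape)  # (1080, 1920)
```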