The 'free' AA in Xenos is good for two things:
1) It alleviates bandwidth demands on the main video memory (which, in the Xbox 360's case, is shared with the CPU) and therefore makes a lower-bandwidth memory interface acceptable (128-bit vs. 256-bit) — see the rough sketch after this list.
2) AA improves visual quality at any resolution (I don't think anyone can argue against that), and it helps developers design games that use it: they don't have to think about AA at all. Just design the game so it runs properly without AA, and enabling AA should have zero performance impact (i.e. be free).
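To put rough numbers on point 1, here is a minimal back-of-envelope sketch of the colour/depth traffic a 4x MSAA 720p framebuffer generates. The resolution, bytes per sample, overdraw factor and frame rate are illustrative assumptions on my part, not measured Xbox 360 figures; the point is only the order of magnitude the eDRAM keeps off the shared bus.

```python
# Rough estimate of the framebuffer traffic that the eDRAM absorbs.
# All constants below are illustrative assumptions, not measured figures.

WIDTH, HEIGHT = 1280, 720        # 720p render target
MSAA_SAMPLES = 4                 # 4x multisampling
BYTES_COLOR = 4                  # 32-bit colour per sample
BYTES_DEPTH = 4                  # 32-bit depth/stencil per sample
OVERDRAW = 3.0                   # average writes per pixel (assumed)
FPS = 60

samples = WIDTH * HEIGHT * MSAA_SAMPLES
framebuffer_bytes = samples * (BYTES_COLOR + BYTES_DEPTH)

# Each overdrawn sample roughly implies a read-modify-write of colour and depth.
traffic_per_frame = framebuffer_bytes * OVERDRAW * 2   # read + write
traffic_per_second = traffic_per_frame * FPS

print(f"4x MSAA framebuffer size: {framebuffer_bytes / 2**20:.1f} MiB")
print(f"Approx. colour/depth traffic: {traffic_per_second / 2**30:.1f} GiB/s")

# If that traffic stays inside the eDRAM, the shared external bus only has to
# carry texture reads and the final resolved 1280x720 image, which is why a
# narrower (128-bit rather than 256-bit) interface becomes workable.
```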
However, "nothing is free" comes to mind and there is at least one drawback to designing your hardware this way.
By locking AA in, you are spending transistors (i.e. resources/cost) in a very specific way, and those transistors cannot be used to scale general performance. Xenos/C1 is the bottleneck, and its main die is roughly 100M transistors lighter than it could have been without the eDRAM expenditure. I really doubt this will be a problem, but one can imagine situations where a developer would want to push the hardware further and sacrifice AA to do it. That trade is impossible with Xenos. Thinking about it, I came to the conclusion that this may become an issue late in the Xbox 360's lifecycle: whereas the PS3 can switch off AA for more raw performance, Xenos cannot, and trying to would just leave roughly a third of the GPU's transistors (the ~100M in the eDRAM) idle (well, not really... hyperbole).
I believe eDRAM has typically been very helpful in devices like these. The PS2 can produce some nice visuals with what otherwise seems like very inferior hardware. However, times change, and I'm sure it's best to judge on final hardware and, more importantly, software. It may also be that eDRAM helped most with things like particle effects and 2D overlays/sprites, but those days are numbered. You don't want fast, good-looking 2D fog these days; you want volumetric fog calculated on the GPU.