If an HDTV really follows the HDTV standard, it will recognize a 1080p input signal, because 1080p is specified in the standard. Of course, it may or may not actually show that resolution on the screen. To be compliant, it only needs to be able to read the signal, convert it, and display "something" on the screen. What actually appears is determined by the native output of the particular device. So any "true" HDTV will read a 1080p24/30 signal just fine; it will then reformat the material to either 720p or 1080i for screen output, whichever the case may be (not forgetting those rare HDTVs that really do display 1080p natively, though there are additional caveats involved there as well).
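Just to make the "read the signal, then reformat it" point concrete, here's a toy Python sketch of the idea. The format list and function are purely made up for illustration, no real set works anything like this internally:

```python
# Toy sketch: a "compliant" set accepts any format in the standard,
# then rescales it to whatever its panel natively does.
SUPPORTED_INPUTS = {"480i", "480p", "720p", "1080i", "1080p24", "1080p30"}

def display(signal_format: str, native_format: str = "720p") -> str:
    """Accept the input signal, then convert it to the panel's native mode."""
    if signal_format not in SUPPORTED_INPUTS:
        raise ValueError(f"Non-compliant set: can't read {signal_format}")
    if signal_format.startswith(native_format):
        return f"Displaying {signal_format} natively"
    # Compliance only requires reading the signal and showing *something*;
    # what you actually see is the scaler's output at the panel's native mode.
    return f"Rescaling {signal_format} -> {native_format}"

print(display("1080p24", native_format="720p"))   # Rescaling 1080p24 -> 720p
print(display("1080p24", native_format="1080p"))  # native, on the rare true-1080p sets
```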
I have to agree that this whole 720p+AA vs. 1080i/p debate may well turn out to be pretty irrelevant, given all the smoothing, blending, and processing built into an HDTV by default. You will essentially get the "AA look" either way, just by virtue of what an HDTV must do to the input signal to display it on the screen in the first place.
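Here's a tiny sketch of why I say the scaler basically hands you the "AA look" for free: any downscaling filter averages neighboring pixels, and those in-between values are exactly the kind of softened edge AA produces. This is just a toy box-filter example, real scalers use fancier filters:

```python
# Toy example: averaging neighboring samples softens a hard edge,
# which is essentially what antialiasing does at polygon edges too.
def box_downscale(pixels, factor=2):
    """Average each group of `factor` samples into one output sample."""
    return [sum(pixels[i:i + factor]) / factor
            for i in range(0, len(pixels) - factor + 1, factor)]

hard_edge = [0, 0, 0, 255, 255, 255]   # perfectly sharp (aliased) edge
print(box_downscale(hard_edge))        # [0.0, 127.5, 255.0] -- the edge gets a smoothed midpoint
```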