The current generation of consoles only uses 2x antialiasing, mainly to reduce interlace flicker in games. Some early PS2 games didn't even use that, and one or two games used very good-looking 4x or greater AA (Baldur's Gate).
I really think the minimum or standard for next-gen should be 4xAA, but to get the most out of our TVs and projectors/HDTVs we'd want 6-9x or even 16xAA! Or 4x combined with 8-16x temporal AA (motion blur), though I don't think we'll get that with next gen yet. In theory, if fillrate increases like processing power does (with Cell), it could be done: 4xAA with 16x motion blur requires something like 32 times the fillrate (16-32x, depending on how the AA is done). That sounds reasonable compared to the 100-1000x increase in processing speed that is expected (or should I say advertised), but in reality I think the majority of the graphics power will go to beautifying: bump maps, high-res textures, etc.
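To make that rough arithmetic concrete, here is a minimal back-of-the-envelope sketch (Python); the cost model and the 0.25x per-sample multisampling overhead are my own assumptions, not numbers from any real hardware:

```python
# Back-of-the-envelope fillrate multiplier for combined spatial + temporal AA.
# Assumed cost model (illustrative only):
#   - supersampling: shades and fills every sub-sample, so cost scales with sample count
#   - multisampling: shades once per pixel, extra samples only cost cheap z/colour writes
#   - temporal AA / motion blur: the whole frame is rendered once per temporal sample

def fillrate_multiplier(spatial_samples, temporal_samples, mode="supersampling",
                        msaa_overhead_per_sample=0.25):
    """Return how many times more fillrate is needed vs. 1 sample, 1 pass."""
    if mode == "supersampling":
        spatial_cost = spatial_samples
    elif mode == "multisampling":
        # one full shade plus an assumed 0.25x write cost per extra sample
        spatial_cost = 1 + (spatial_samples - 1) * msaa_overhead_per_sample
    else:
        raise ValueError("unknown AA mode")
    return spatial_cost * temporal_samples

# The cases discussed above:
print(fillrate_multiplier(4, 16, "supersampling"))   # ~64x (worst case, pure supersampling)
print(fillrate_multiplier(4, 16, "multisampling"))   # ~28x (near the ~32x guess)
print(fillrate_multiplier(4, 8,  "multisampling"))   # ~14x (near the ~16x low end)
```

Depending on whether the spatial AA is full supersampling or cheaper multisampling, the combined multiplier lands roughly in the 16-64x range, which is where the 16-32x figure above comes from.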
What do you think is coming? Are we going to get this, or just a small step up in image quality compared to the Xbox? (Quantity will increase, no question about that.)