The Dreamcast in theory supported AA, but afaik hardly any released game used it; it would have consumed too many resources in any complex game. What most people saw was the standard 640x480p. People assume the Dreamcast used AA when comparing it with the PS2, but that's because the PS2 ran a lot of games at odd, lower resolutions; AA has nothing to do with it.
There was at least one game that used 2x supersample AA (Omicron?). Resolving the supersampling as you copied out of the tile buffer should have meant this didn't require any more memory for the frame buffer, just double the fill rate, which at 30 fps with opaque geometry shouldn't have been much of a problem for the Dreamcast. Pity more games didn't use it.
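For the curious, here's a minimal sketch in C of what that resolve-on-copy-out could look like: the tile is rendered at twice the horizontal resolution and sample pairs are averaged as they leave the tile buffer. The layout and averaging step are illustrative assumptions, not the actual PowerVR datapath.

#include <stdint.h>

/* Hypothetical 2x horizontal SSAA resolve: the tile is rendered at
 * twice the output width, and each pair of adjacent samples is
 * averaged as it is copied out to the frame buffer. The extra memory
 * cost stays inside the (small) tile buffer; the frame buffer remains
 * at output resolution, but fill rate doubles. */
void resolve_2x_ssaa(const uint8_t *tile,   /* tile_w*2 x tile_h, RGB */
                     uint8_t *fb,           /* tile_w   x tile_h, RGB */
                     int tile_w, int tile_h)
{
    for (int y = 0; y < tile_h; y++) {
        for (int x = 0; x < tile_w; x++) {
            for (int c = 0; c < 3; c++) {
                int a = tile[(y * tile_w * 2 + x * 2)     * 3 + c];
                int b = tile[(y * tile_w * 2 + x * 2 + 1) * 3 + c];
                fb[(y * tile_w + x) * 3 + c] = (uint8_t)((a + b + 1) / 2);
            }
        }
    }
}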
Games usually just blended alternate scanlines to create the interlaced image. I think it was referred to as a 'flicker filter'.
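Something like this, roughly: each field line gets blended with its vertical neighbours so single-scanline detail doesn't strobe at field rate. The 1/4-1/2-1/4 weights here are an assumption; the real hardware's kernel may have differed.

#include <stdint.h>

/* Hypothetical flicker filter: when emitting one interlaced field,
 * blend each selected scanline with the lines above and below it. */
void emit_field(const uint8_t *frame, uint8_t *field,
                int w, int h, int odd_field)
{
    for (int y = odd_field; y < h; y += 2) {
        const uint8_t *cur   = frame + y * w * 3;
        const uint8_t *above = frame + (y > 0     ? y - 1 : y) * w * 3;
        const uint8_t *below = frame + (y < h - 1 ? y + 1 : y) * w * 3;
        uint8_t *out = field + (y / 2) * w * 3;
        for (int i = 0; i < w * 3; i++)
            out[i] = (uint8_t)((above[i] + 2 * cur[i] + below[i] + 2) / 4);
    }
}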
The Dreamcast rendered everything at 24-bit RGB into its tile buffer, iirc, and had a number of copy-out modes from there: copy out at 24-bit, dither down to 16-bit (humongously better than rendering natively at 16-bit), or blend the 24-bit output down into interlaced 24-bit.
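The dither-on-copy-out idea is basically ordered dithering to RGB565: add a small position-dependent bias before truncating, trading banding for fine noise, which is why it looked so much better than rendering in 16-bit. A sketch, assuming a 2x2 Bayer matrix (the hardware's actual pattern is not something I'm claiming here):

#include <stdint.h>

static const int bayer2[2][2] = { {0, 2}, {3, 1} };

/* Hypothetical ordered dither from 24-bit RGB down to RGB565. The
 * bias is scaled to the number of bits being discarded per channel. */
uint16_t dither_565(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    int t = bayer2[y & 1][x & 1];
    int r5 = (r + t * 2) >> 3; if (r5 > 31) r5 = 31;
    int g6 = (g + t)     >> 2; if (g6 > 63) g6 = 63;
    int b5 = (b + t * 2) >> 3; if (b5 > 31) b5 = 31;
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}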
A full-res 24-bit buffer took up too much of the VRAM needed for polys and textures, so 16-bit dithered was common for VGA, and presumably most games used a variant of the interlaced, flicker-filtered output.
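The back-of-envelope numbers at 640x480 make the trade-off clear, given the Dreamcast's 8 MB of video RAM shared with textures and display lists (and double that per frame if you double-buffer):

#include <stdio.h>

/* Frame buffer sizes at 640x480, before double buffering. */
int main(void)
{
    const int w = 640, h = 480;
    printf("24-bit full frame: %d KB\n", w * h * 3 / 1024);     /* 900 KB */
    printf("16-bit full frame: %d KB\n", w * h * 2 / 1024);     /* 600 KB */
    printf("16-bit one field:  %d KB\n", w * (h / 2) * 2 / 1024); /* 300 KB */
    return 0;
}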
In a sense you could perhaps stretch to saying that, combined with 2x SSAA, the flicker-filtered interlaced output was effectively 4x SSAA.
Dreamcast could also do aniso filtering, by taking a 2x2 supersample of the texels used to texture a pixel/fragment. Test Drive Le Mans showed it could be used selectively, and in a fast, stable 30 fps game with a hell of a lot of polygons and an enormous draw distance. The white line staying clearly visible hundreds of metres into the distance along the straights was pretty impressive at the time.
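Conceptually that's just four taps spread across the pixel's texture footprint, averaged. A sketch of the idea, with a stand-in procedural texture fetch (everything here is illustrative, not the PowerVR implementation); the 4x texture bandwidth cost per fragment is exactly why a game would enable it selectively, e.g. on the road surface only:

/* Hypothetical 2x2 texel supersample per fragment. */
typedef struct { float r, g, b; } color;

/* Stand-in texture fetch: a procedural checkerboard, point-sampled
 * just to keep the sketch short and self-contained. */
static color tex_fetch(float u, float v)
{
    int checker = ((int)(u * 8) + (int)(v * 8)) & 1;
    color c = { (float)checker, (float)checker, (float)checker };
    return c;
}

/* du, dv: half-steps across the fragment's footprint in texture
 * space (derived from the texture-coordinate derivatives). */
color sample_aniso_2x2(float u, float v, float du, float dv)
{
    color c00 = tex_fetch(u - du, v - dv);
    color c10 = tex_fetch(u + du, v - dv);
    color c01 = tex_fetch(u - du, v + dv);
    color c11 = tex_fetch(u + du, v + dv);
    color out = {
        (c00.r + c10.r + c01.r + c11.r) * 0.25f,
        (c00.g + c10.g + c01.g + c11.g) * 0.25f,
        (c00.b + c10.b + c01.b + c11.b) * 0.25f
    };
    return out;
}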