Here's another explanation of AA to help understand what the problem is and how you fix it. Aliasing is where the individual pixels of a pixelated display can be observed as discrete changes. That's what gives you the stair-step look along edges and shimmer on textures. Aliasing appears in computer graphics because each pixel gets its colour by looking into the 3D scene at a single point and just taking the colour of the polygon underneath. If you have a black background and a white triangle, you get white pixels wherever the triangle covers that point, and jaggies along its edges. That is, you only sample one point per pixel to determine that whole pixel's colour.

Metalmurphy said:
I'm aware that AA is done by rendering an image at a higher resolution and then displaying it at a lower one. That alone proves it can't be done in movies? How can you "render" a movie at a higher resolution? oO You can't. The closest thing you can do is upscale it, but when you show it again @ the lower resolution it will just look the same as the original, right?
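To make that one-sample-per-pixel idea from the white-triangle example concrete, here's a minimal Python sketch (purely illustrative, not any real renderer): each pixel gets its colour from a single test at its centre, which is exactly what produces the hard stair-step edges.

# Minimal single-sample rasteriser sketch (hypothetical, for illustration only).
# One coverage test at each pixel centre: the output is pure black or white,
# so edges come out as hard stair-steps (jaggies).

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: which side of the edge A->B the point P lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def inside_triangle(tri, px, py):
    # Assumes the three vertices are wound consistently; flip the >= tests
    # if your winding goes the other way.
    (ax, ay), (bx, by), (cx, cy) = tri
    return (edge(ax, ay, bx, by, px, py) >= 0 and
            edge(bx, by, cx, cy, px, py) >= 0 and
            edge(cx, cy, ax, ay, px, py) >= 0)

def render_one_sample(tri, width, height):
    # White (255) where the pixel centre lands inside the triangle, black (0) elsewhere.
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            row.append(255 if inside_triangle(tri, x + 0.5, y + 0.5) else 0)
        image.append(row)
    return image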
Antialiasing increases the number of samples per pixel, increasing the 'information density'. If you subdivide a pixel into four quarters and take a sample from each of them, you can average those samples to portray an approximation of the information held within the pixel. In the case of the white triangle on the black background, the pixels along the edge end up as one of a few shades of grey between black and white. This decreases the jump in intensity between adjacent pixels and makes the jaggies less noticeable. It's worth pointing out that the step is still there, because the pixels are still discrete, but when you decrease the difference between adjacent pixels, that step becomes less noticeable.
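Here's that four-quarters idea bolted onto the sketch above (still hypothetical, and it reuses inside_triangle from the earlier snippet): four sub-samples per pixel, averaged, so edge pixels land on intermediate greys instead of a hard black/white step.

# 2x2 supersampling sketch (reuses inside_triangle from the snippet above).
# Four sub-samples per pixel, one at the centre of each quarter, averaged
# into a single value: 0..4 covered sub-samples map to black, three greys, or white.

def render_2x2_supersampled(tri, width, height):
    offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            covered = sum(inside_triangle(tri, x + ox, y + oy) for ox, oy in offsets)
            row.append(covered * 255 // 4)  # average the four on/off sub-samples
        image.append(row)
    return image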
You can increase the sampling of pixel information in various ways. One is supersampling, where you take a larger image and shrink it to a smaller resolution, averaging adjacent source pixels into each output pixel. Going from 1080p to 720p you're only getting about 2.25 samples per pixel (1920x1080 versus 1280x720): an improvement, but not great. It does apply to textures too, though, and can have a good effect there. Going from 1080p to SDTV you've got about 6 samples per pixel, and the image quality will be very good, but of course the fidelity of the TV is less...
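The downscaling version of the same idea looks something like this: a sketch only, limited to whole-number scale factors (so it matches the 1080p-to-SD case better than the 1.5x 1080p-to-720p case), where each output pixel is the average of a block of source pixels. That block is where the extra samples per pixel come from.

# Box-filter downsampling sketch (illustrative; real scalers use better filters
# and handle non-integer ratios). Each output pixel averages a factor x factor
# block of source pixels, i.e. factor^2 samples per output pixel.

def downsample_box(image, factor):
    src_h, src_w = len(image), len(image[0])
    out = []
    for oy in range(src_h // factor):
        row = []
        for ox in range(src_w // factor):
            block = [image[oy * factor + dy][ox * factor + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out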
When recording a film, you get a very large number of samples per pixel. The colour information is carried by photons, and each pixel in the image represents an awful lot of them. When scanning film, you likewise get very high resolution sources. The quality of sampling is so good that even on an SD display you can make out individual hairs on a person's head. This is because the human brain does a lot of filling in: instead of seeing a collection of coloured squares, it sees averages and approximations of things it can assign a real image to. When producing a CGI movie, it is rendered with heavy antialiasing. Laa-yosh is the guy to comment here on the typical levels and methods used. You're rendering with multiple samples per pixel to eliminate the visible jaggies. When played back, both filmed and CGI movies will look fantastic, as both have this very high level of sampling per pixel.
It is possible to get aliasing on filmed sources. Car programmes are a good example, where you have filming inside a car on a bright day. The contrast between the window frame and the brightness outside often leads to jaggies, especially on interlaced SD feeds. This is because the brightness contrast results in adjacent pixels that are discretely light or dark.
Now potentially, if you have a high-res source, like 1080p, of a filmed subject with discrete pixels like that car example, then downsampling to an SDTV display would average out pixels and give you an extra level of AA (because the data is in low dynamic range; if the source were high dynamic range you wouldn't be any better off, but we're not there yet so no worries!). Thus it can be said that the XB360 can add AA to a movie, because downsampling 1080p to 720p increases the number of samples per pixel and helps reduce the visibility of discrete pixels. This will only work if the jaggies are visible in the source, though. Adding AA onto an already non-aliased scene isn't going to add anything. Are there any 1080p movies with jaggies that would benefit? I don't know.
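A tiny made-up example of that car-window case, run through the downsample_box sketch above: a hard bright/dark diagonal edge in a 4x4 high-res patch comes out with intermediate greys along the boundary after a 2:1 downsample, which is the "extra level of AA" being described.

# Hypothetical values: a hard diagonal edge between dark (0) and bright (255).
high_res_patch = [
    [  0,   0,   0, 255],
    [  0,   0, 255, 255],
    [  0, 255, 255, 255],
    [255, 255, 255, 255],
]

print(downsample_box(high_res_patch, 2))
# -> [[0, 191], [191, 255]]: the blocks straddling the edge average out to grey,
#    so the intensity step between adjacent pixels is smaller than in the source.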
But where downsampling increases AA, it decreases fidelity: you have fewer pixels and less detail. Your friend has the option to play 720p games on an SDTV, right? That will add lots of AA (well, it should do, but some games seem to render at SD resolutions from what I hear). Does he choose to do that, or does he prefer the HD output, jaggies and all? On your PC, do you prefer playing games at 800x600 with 16xAA, or at higher resolutions with less AA?
Aliasing is a side effect of pixelated displays and capture methods. As you increase pixel resolution and decrease the information per pixel (which isn't an issue on film), you trade fidelity for potential aliasing. Downsampling will help reduce jaggies from a higher resolution source, but it produces a softer, blurrier image. Ideally we want high resolution and antialiasing. You can't get better AA than film, which effectively takes a crazy number of light samples per pixel, and the idea of deliberately trying to AA films by downsampling is quite frankly ridiculous. That's like choosing to watch HD movies downsampled to an SD set to add antialiasing. Films don't need AA. The source material is already antialiased, and you want to benefit from the fidelity of the higher resolution display! The only situation where that argument has a case is if aliasing appears in the movie itself. But then you're back to the case for watching HD movies on SD sets, and if that's what people want, why are we wasting our time with HD?