Alternative AA methods and their comparison with traditional MSAA

It would be cool to see the following IMO:

Use a sub-HD resolution so that 2xMSAA fits into EDRAM.
Then you can use 2xMSAA basically for free (right?).
On top of that, you could then use MLAA to smooth out the remaining edges :cool:
 
It would be cool to see the following IMO:

Use a sub-HD resolution so that 2xMSAA fits into EDRAM.
Then you can use 2xMSAA basically for free (right?).
On top of that, you could then use MLAA to smooth out the remaining edges :cool:
MLAA doesn't work that well on an already antialiased image, as the edges are not that easy to find.
You could render a higher resolution image that barely fits into the 10 MB, use MLAA on that, then downscale to get some SSAA on it as well. ;)
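A rough sketch of the last step of that idea, the downscale, in plain NumPy (a stand-in for what would really be a GPU resolve; the 2x factor and box filter are assumptions for illustration):

```python
# Sketch of the downscale step above: after rendering at a higher resolution
# and running MLAA, box-filter down to the target resolution (2x2 average)
# to pick up some SSAA on top. NumPy stand-in for a GPU resolve pass.
import numpy as np

def downscale_2x(img):
    """Average each 2x2 block of pixels (simple box filter)."""
    h, w = img.shape[:2]
    return img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

hi = np.zeros((4, 4, 3))
hi[:, 1:] = 1.0            # hard vertical edge in the high-res image
lo = downscale_2x(hi)      # 2x2 result: the edge column averages to 0.5
```

The averaged edge column is exactly the supersampling benefit sebbi is describing: detail from the higher resolution survives as intermediate values in the final frame.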
 
This thread has been moved from the Console Tech forum. Post #728 links to a short paper briefly describing a GPU implementation of MLAA, making the subject more platform agnostic than the original Cell-centered technique.
 
I don't find the 360 render times to be of much practical use unless optimisation on a proper devkit yields much better results. Looks like you'll still be better off sticking with 2xMSAA or 4xMSAA on that platform, but on the PC side? Wow! The 9800GTX is mid-low end these days, it's essentially 4-year-old technology, and their algorithm absolutely flies on it.

I wonder how AMD and Nvidia feel about this? Offering a similar MLAA algorithm as a driver override would be a huge competitive advantage and something that could tip the scales one way or the other when I come to make my next GPU purchase. On the other hand, it does kinda nullify one of the primary reasons for buying one of their high end cards. A low end GPU (like that <$100 9800GTX!) can already run any game on the market fantastically without MSAA enabled, and if it can now run all those games with "better than 4xMSAA" levels of antialiasing at the same level of performance, then why spend the extra cash? I know both have been pimping stuff like 3D Vision and Eyefinity as ways to sell their high end cards, but I have serious doubts as to whether either of those is ever going to be mainstream.

Hopefully we won't have to rely on Nvidia/AMD to provide us with the solution though. It looks like the authors of this work have got it up and running in a number of game engines without any privileged access to them, so perhaps a generic app is possible. Boy, I'd love to get my hands on that! :p
 
They added FAQ...

How were the times measured for the Xbox version?

We captured frames from the games listed in the table, and processed them running an XNA version of our technique on a real Xbox. Using a proper dev-kit would probably lead to better times.

How does the quality of your implementation compare with MSAA?

In the article we are conservative and say that quality is between 4x and 8x. However, it is difficult to compare: when the technique works it is as good as 16x; when it breaks it is as bad as 1x. As the performance of our approach is one order of magnitude faster than MSAA, there is still room for quality enhancements, at the expense of worse running times.

On which platforms does it run?

We have implementations in DirectX 10, for PC, and XNA, for Xbox 360. It could run on DirectX 9 without problems, as we are not using any DirectX 10-specific features. It is coded as a regular pixel shader that runs as a post-process over a color image (with depth optionally as input).
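To give a feel for what such a post-process pass does, here is a hedged sketch of MLAA's first stage, edge detection from luma differences, written in NumPy rather than as a pixel shader; the luma weights are the standard Rec. 601 values and the threshold is an assumed tuning value, not taken from the authors' code:

```python
# Illustrative sketch of MLAA's first pass: find discontinuities by comparing
# each pixel's luma with its left and top neighbours. NumPy stand-in for the
# pixel shader; the 0.1 threshold is an assumed tuning value.
import numpy as np

LUMA = np.array([0.299, 0.587, 0.114])  # Rec. 601 luma weights

def detect_edges(rgb, threshold=0.1):
    """Return boolean masks marking left-edge and top-edge discontinuities."""
    luma = rgb @ LUMA
    left = np.zeros(luma.shape, dtype=bool)
    top = np.zeros(luma.shape, dtype=bool)
    left[:, 1:] = np.abs(luma[:, 1:] - luma[:, :-1]) > threshold
    top[1:, :] = np.abs(luma[1:, :] - luma[:-1, :]) > threshold
    return left, top

img = np.zeros((4, 4, 3))
img[:, 2:] = 1.0                      # black/white vertical boundary
left, top = detect_edges(img)         # left-edge mask fires along column 2
```

The later stages (pattern classification and blend-weight computation) work from these masks, which is also why, as noted earlier in the thread, an already antialiased image is harder for MLAA: the softened edges fall under the threshold.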

What are the advantages of MLAA over MSAA?
For similar quality, it is considerably faster, one order of magnitude in the PC case.
For consoles, where the MSAA sample count is limited due to hardware restrictions, it allows better quality.
It enables the usage of anti-aliasing in conjunction with Multiple Render Targets, which is especially useful for deferred engines.
It can provide better quality when HDR is used, as anti-aliasing is performed after tone mapping.
Anti-aliasing quality can be dynamically adjusted on the fly, depending on the available resources at each frame.
MLAA is not incompatible with MSAA: this leads to hybrid MSAA-MLAA approaches.
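The HDR point in the list above can be made concrete with a toy example using the simple Reinhard operator x/(1+x); the intensity values here are made up for illustration. Averaging HDR samples before tone mapping, as an MSAA resolve does, gives a different edge pixel than blending the tone-mapped colors, which is effectively what a post-tonemap MLAA pass produces:

```python
# Toy illustration of the HDR bullet above, using the simple Reinhard
# tone mapping operator x/(1+x). Intensities are made-up example values.
def reinhard(x):
    return x / (1.0 + x)

bright, dark = 8.0, 0.0                  # HDR values on either side of an edge

pre_tonemap = reinhard((bright + dark) / 2)             # MSAA: resolve, then tonemap
post_tonemap = (reinhard(bright) + reinhard(dark)) / 2  # MLAA: tonemap, then blend

# pre_tonemap lands near the bright side (0.8 vs ~0.89), so the resolved
# gradient is compressed and the edge stays visibly aliased after tonemapping.
```

In other words, tone mapping squashes the MSAA resolve's intermediate values toward the bright side of the edge, while an AA pass run after tone mapping blends in display space, where the gradient is still visible.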
 
Thanks Ruskie for the info, much appreciated.

As the performance of our approach is one order of magnitude faster than MSAA, there is still room for quality enhancements, at the expense of worse running times.

I like this part, which says they can improve quality; considering the cost, that would allow quite a lot more. An average of ~0.44 ms on the 9800GTX+ is very little. IIRC the edge AA method 2 in Crysis had a cost of 0.2 ms and their chromatic aberration 2 ms (commented in a shader) on an 8800GTX GPU. :D

Also, that it can run with DirectX 9 and up sounds great, considering most multiplatform titles are DX9 only, and they tend to be the ones lacking an AA solution besides a blur filter (sometimes the option of brute-forcing MSAA is available, but it costs more than if the devs had implemented it themselves). [strike]Too bad it is not compatible with MSAA, as that would be of interest to me and many others, but MLAA is better than no AA.[/strike]
 
I like this part, which says they can improve quality; considering the cost, that would allow quite a lot more.
The two areas that need the most work are AA on longer edges (by my understanding they sample a section of long edges rather than the whole thing, so AA doesn't extend the full length as it would in an ideal solution) and subpixel sampling, which can't be dealt with simply in post; they'd need to find a way to get more samples from trouble spots. Oh, and the animation artefacts (strobing): as MLAA works in final pixel-space, it cannot accurately resolve subpixel movements.

Too bad it is not compatible with MSAA, as that would be of interest to me and many others, but MLAA is better than no AA.
I think you're misreading, thanks to an awkward double negative: MLAA is compatible with MSAA. You render exactly as you would with MSAA and then, as MLAA is a post effect, apply it to the final MSAA'd framebuffer to eliminate any outstanding jaggies, such as those from HDR edges.

I think the ideal compromise at present would be selective MSAA on small polygons, MLAA on large edges, with some kind of subsample hints to MLAA to address the strobing. Perhaps some temporal sampling jitter as Sebbi suggested?
 
Right you are!

Somehow I read it as 'not compatible'. Well then, this sounds perfect and would do a good job covering up where MSAA isn't applied. A game like BFBC2 would greatly benefit from this, as it still has quite a few jaggies due to their HDR implementation. BFBC2 is a ridiculous case where I'd rather just use the wide-tent filter with MSAA to do a prettier "Quincunx".

I think the ideal compromise at present would be selective MSAA on small polygons, MLAA on large edges, with some kind of subsample hints to MLAA to address the strobing. Perhaps some temporal sampling jitter as Sebbi suggested?

Seems like the optimal solution to keep good performance and artifacts to a minimum.
 
I don't find the 360 render times to be of much practical use unless optimisation on a proper devkit yields much better results. Looks like you'll still be better off sticking with 2xMSAA or 4xMSAA on that platform
Deferred rendering is getting really popular on Xbox 360 as well. Deferred rendering and MSAA do not mix that well together. MLAA is much faster than 2xMSAA on a deferred renderer, and the quality is better as well (in the majority of cases). If it's doable in 3.5 ms in XNA, using a bad compiler and unoptimized shader code, it's definitely going to be useful when real developers get their hands on the algorithm and optimize it for the GPU.
 
An example of MLAA's shortcomings: in LBP2 there's a music sequencer where you piece together lots of pale-grey bordered blocks. There's a great deal of strobing and edge blurring when looking at a screen of these.
 
Any of the smarter ones among us want to make some comments on the DLAA in The Force Unleashed demo? To my untrained eye it looks to work very well; certain things break it, of course, but when it works it easily produces the cleanest edges I've ever seen in an Xbox 360 game. Subjectively speaking, it definitely seems to be a much better solution than 2xMSAA at least, which is what I'm used to in 360 games.

Anyone able to offer an in-depth explanation of what exactly it is that they're doing? I understand at a broad level how traditional MSAA, temporal AA and MLAA all work, so how does DLAA compare to these approaches? The result seems closer to MLAA than anything else.
 
This is what I could find... it was an interview with DF...

Digital Foundry: Crytek seem to be using a variant of this re-projection principle with its temporal AA in CryEngine 3. Does your DLAA anti-aliasing system in The Force Unleashed II tie in with your work here? If not, how is it different? Can the velocity buffer be re-used for other purposes?

Dmitry Andreev: No, our anti-aliasing solution doesn't use the re-projection, but it is very similar to the interpolation technique in terms of simplicity. It's all about what you can do by not going around and "googling" things, but by looking at a problem from a different perspective. I can't say more about it at this point.

One thing though related to the interpolation is to make sure it works with the anti-aliasing, because very often in games I see that people don't do anything about it, and once you start moving the anti-aliasing is gone, especially with motion blur, so one should take it into consideration.

The use of the velocity buffer is only limited by your imagination. I know it sounds a little blurry, but that's how it is. Even though we don't use the interpolation in The Force Unleashed II, most of the stuff described in the presentation is used in production in some way or another. Most of it. I really want people to understand things before or when using them. Knowing something and understanding it are two different things. This is what I have learned.

And yeah, it's definitely some of the best AA I have seen on consoles, bar GOW III, though it has some small artifacts.
Take this for example
http://images.eurogamer.net/assets/articles//a/1/2/4/4/4/7/2/Saber2.jpg.jpg
 
I really need to see it in daylight conditions, because Force Unleashed 2 looks really blurry on PS3, like a sub-HD game.

BTW, GOW III had similar contrast and colours but looked much sharper.
 
Some more MLAA talk.

http://forum.beyond3d.com/showthread.php?p=1484043#post1484043


Also, it seems ATI is putting MLAA in the AA options tab... soon.

 
I'll bet Nvidia will counter with MLAA support in their drivers. And then their rivalry will push up the MLAA quality to new heights.
 
Yeah, especially since AMD's quality looks a little blurry, with really slow performance compared to what was implemented by the guys we saw previously.
 
You can't expect a driver-forced technique to offer the same quality and performance as proper developer-implemented AA, although "MLAA" should theoretically be easier to hack in than MSAA.
 