Alternative AA methods and their comparison with traditional MSAA

I see some (E)VSM bleeding there, but it's not bad considering :)... you're using 16-bit EVSM + SDSM z-ranges? Sure would be nice if the consoles could do 32-bit filtering.
The slight bleeding is not noticeable in most game scenarios. We have to keep the exponent multiplier pretty low, since the 16-bit precision would otherwise cause bad graining in shadowed pixels. Also, considering it's only four 512x512 cascades for a 2 kilometer view range, it looks pretty good. The only "blur" we have in the shadow maps is the 4xMSAA (it's basically free for 512x512 resolution maps). Unfortunately some compromises are needed to get it to run at 60 fps all the time. Heavy commitment to user-created content doesn't allow us to bake any lighting at all. Everything is dynamic.
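For readers who haven't implemented (E)VSM: a minimal C++ sketch of the exponentially warped Chebyshev test it's built on (illustrative only, not sebbbi's actual shader; the function and constant names are mine).

```cpp
#include <algorithm>
#include <cmath>

// Warp depth d in [0,1] with an exponential; the shadow map stores the
// filtered mean of this value and of its square (the two moments).
float evsmWarp(float d, float c) { return std::exp(c * d); }

// Chebyshev's inequality gives an upper bound on the fraction of texels
// closer than the receiver -- the standard VSM test applied to the
// warped depth. The bound being loose is exactly what causes bleeding.
float evsmVisibility(float mean, float meanSq, float receiverDepth, float c)
{
    float t = evsmWarp(receiverDepth, c);
    if (t <= mean) return 1.0f;  // receiver in front of the mean: fully lit
    float variance = std::max(meanSq - mean * mean, 1e-6f);
    float delta = t - mean;
    return variance / (variance + delta * delta);
}
// With 16-bit storage a large exponent c overflows and grains quickly,
// which is why the multiplier is kept low at the cost of more bleeding.
```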

AA looks pretty good close up, but some of the far stuff doesn't seem to be AA'd at all. I can understand the fence and wheel spokes and stuff that needs subsampling, but the pipe in the background - is it just too close to horizontal to be picked up by the filter?
If you zoom in, you'll notice that for each step there are 3 gradient pixels. If the edge gets more horizontal/vertical than that, the FXAA2 filter size just isn't large enough to find the edge from further away. It's not that bad, because in motion the 3 gradient pixels are enough to completely hide the annoying edge noise that aliasing causes. In motion the image looks very stable and polished compared to the usual aliased mess we have been used to in our past games (we have used deferred rendering since 2007). Very narrow (less than pixel wide) geometry of course still annoyingly pops in and out when you move, because the polygons are not hitting any pixels every frame, but that's something you cannot solve with any post-AA filter. You need subpixel information.
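As a rough reference for why the filter has limited reach: a simplified C++ sketch of the FXAA-style local-contrast test, with typical thresholds from Lottes' public FXAA description (the function name is mine). Pixels that pass this test trigger an edge walk of a fixed number of steps, so a near-horizontal edge whose stair "step" is longer than that search radius is never resolved.

```cpp
#include <algorithm>

// Local-contrast early-out in the style of FXAA (simplified sketch).
// lumaC is the center pixel's luma; N/S/E/W are its cross neighbors.
bool fxaaNeedsFiltering(float lumaC, float lumaN, float lumaS,
                        float lumaE, float lumaW)
{
    const float EDGE_THRESHOLD     = 1.0f / 8.0f;  // relative contrast
    const float EDGE_THRESHOLD_MIN = 1.0f / 16.0f; // absolute floor in darks
    float lumaMin = std::min({lumaC, lumaN, lumaS, lumaE, lumaW});
    float lumaMax = std::max({lumaC, lumaN, lumaS, lumaE, lumaW});
    return (lumaMax - lumaMin) >= std::max(EDGE_THRESHOLD_MIN,
                                           lumaMax * EDGE_THRESHOLD);
}
```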

We now also have a bit more shader aliasing compared to our previous technology, since we now have parallax mapping on all surfaces and our own custom anisotropic filtering (from the virtual texture) causes some oversampling (our artists really like sharp textures/normalmaps instead of a blurry mess). You can see the parallax mapping best in the first screenshot (in the road). It looks very good on the terrain, especially on rough terrain. The good thing about post-AA solutions compared to MSAA is that they also kill the annoying shader aliasing. A mix of both would likely be the best thing on next generation systems (since the performance is not yet there for supersampling all the pixels).
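For context, the textbook single-sample parallax offset looks something like this C++ sketch (not Trials' actual shader; sign and offset-limiting conventions vary between engines):

```cpp
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Shift the texture coordinate along the tangent-space view direction
// in proportion to the sampled height. Sharp heightfields like this are
// a classic source of the shader aliasing mentioned above.
Vec2 parallaxUV(Vec2 uv, Vec3 viewTS, float height, float scale)
{
    // viewTS: normalized tangent-space view vector, z away from surface.
    float h = height * scale;  // scale is an artist-tuned constant
    return { uv.x - viewTS.x / viewTS.z * h,
             uv.y - viewTS.y / viewTS.z * h };
}
```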
 
The slight bleeding is not noticeable in most game scenarios.
For sure, it was not a criticism at all - I'm amazed you get such reasonable results with 16-bit shadows and fairly low resolutions!

In motion the image looks very stable and polished compared to the usual aliased mess we have been used to in our past games (we have used deferred rendering since 2007).
Cool good to hear - I'm looking forward to seeing it in motion.

We now also have a bit more shader aliasing compared to our previous technology, since we now have parallax mapping on all surfaces and our own custom anisotropic filtering (from the virtual texture) causes some oversampling (our artists really like sharp textures/normalmaps instead of a blurry mess).
Yeah for sure - also the two shots have the light pointing roughly at the camera, which shows off the shadows nicely but highlights (no pun intended :)) specular aliasing at silhouettes.
 
Thanks for the tip!

Here is the result: NoAA - 4xMSAA - FXAA

p.s.: The filter works in Photoshop too, with GPU acceleration enabled. ;)

Really weird things in the FXAA pic. First, the object is brighter and the FPS is much higher.

Looking at the grid on the bottom, two things are obvious. The boundary lines are barely affected and seem to have a little blur to them. The inner grid lines are aliased and less visible than in the other two pics.

Anyone have an explanation?
 
Really weird things in the FXAA pic. First, the object is brighter and the FPS is much higher.

Looking at the grid on the bottom, two things are obvious. The boundary lines are barely affected and seem to have a little blur to them. The inner grid lines are aliased and less visible than in the other two pics.

Anyone have an explanation?

Already been covered in this thread I believe. Felix replied about the brightness and fps.
 
I know some people get all upset when I say I love SSAA, because it's so wasteful, but it looks awesome. Especially ATI's implementation. Could games support SGSSAA directly or is that not an option because NV doesn't officially support it?
 
NVIDIA hardware should be perfectly capable of SGSSAA.
Oh it is, and I have used it on even my 8800GTX (and now my GTX 560), but you have to use NVIDIA Inspector and set a negative LOD bias manually, and it is definitely slower than ATI's SGSSAA. It is not available in the NV Control Panel. I have both a 6950 and that 560 and I compare the cards frequently. Sometimes a game won't work with SSAA on one or the other, so it's handy to have both vendors around.
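For reference, the manual bias mentioned above usually follows the community rule of thumb of -0.5 * log2(sampleCount); a trivial C++ helper to illustrate (the function name is mine):

```cpp
#include <cmath>

// N samples per pixel raise the linear sampling rate by sqrt(N), so the
// texture LOD is biased by -log2(sqrt(N)) = -0.5 * log2(N) to match:
// -0.5 for 2x, -1.0 for 4x, -1.5 for 8x SGSSAA.
float sgssaaLodBias(int sampleCount)
{
    return -0.5f * std::log2(static_cast<float>(sampleCount));
}
```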

ATI - DX9 only
NV - OpenGL and everything else except some games that use complex deferred shading techniques like Dead Space 1/2. Sometimes there are workarounds.
 
I don't understand why it wouldn't work regardless of what shading techniques are in use.
 
http://iryoku.com/aacourse/#schedule

There will be a big presentation at SIGGRAPH 2011 about post-process AA techniques with a great set of developers and really interesting insights like:
pdf said:
Tobias Berghoff (SCE WWS ATG) will for the first time reveal the inner workings of this method and share recent improvements and lessons learned. Cedric Perthuis (SCE WWS Santa Monica) will discuss the integration into God of War III, showing the benefits and pitfalls of the technique live on a special build of the game.

It would be great if we got video from this course, instead of only papers/PPTs.
 
I don't understand why it wouldn't work regardless of what shading techniques are in use.
I don't understand why either, but what do I know. SSAA seems just as picky as MSAA about which games it'll work with. Both Dead Space and its sequel need one of NV's AA workaround modes, and with DS2 you have to choose between missed edges and reduced shadowing quality.

Recently I also couldn't get NV MSAA or SSAA working with DIRT 3 or FUEL. ATI SSAA works with FUEL though and looks superb.

A curious aspect with ATI that I've noticed is that 4X SSAA is often smoother than 4X MSAA+AAA. AAA seems to cause stuttering.
 
Could anyone here with some technical insight explain why SSAA doesn't always work?
 
I am surprised that SSAA doesn't work all the time. As far as AA goes, it is the only physically correct method, so I would naively expect it to work the best.
 
Could anyone here with some technical insight explain why SSAA doesn't always work?

I am surprised that SSAA doesn't work all the time. As far as AA goes, it is the only physically correct method, so I would naively expect it to work the best.
SSAA works all the time if the game developer has programmed support for it. However, driver-controlled antialiasing overrides work only for simple cases. If the game does anything more advanced, like deferred shading, fancy post processing effects or virtual texturing, forcing antialiasing on from the drivers is likely going to cause severe graphics corruption instead of antialiasing of any kind.

Old games had just a single back buffer that contained color data. All rendering was done to the back buffer, and the back buffer was directly sent to the display. For this kind of rendering pipeline it was easy for the driver to just modify the backbuffer format behind the scenes; the application didn't notice it at all, and everything worked fine. New games tend to create dozens of different offscreen render targets for different purposes. Some of these render targets might contain color data, some might contain other data (such as shadow map depth, material properties, surface normals, virtual texture coordinates, particle animation data, etc). However, to the driver all these render targets look pretty much the same, so the driver alone cannot detect which buffer is used for which purpose. Simply force-scaling all these render targets up by 2x2 would cause severe graphics corruption (and even crashes). And even if the driver could correctly detect the buffers that are used for color data, the shaders used in later stages of the rendering process would need to be patched as well, since the input data resolution/format would now differ from what the shader expects.
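To make that concrete, here is a hypothetical deferred G-buffer layout in C++ (names invented for illustration). Every target looks like "just a texture" to the driver, yet only one of them can safely be supersampled and resolved:

```cpp
#include <cstdint>

// Placeholder for an API texture handle.
struct RenderTarget { uint32_t width, height, format; };

// Hypothetical deferred-renderer G-buffer. The driver sees four equally
// anonymous textures; only the engine knows what each one means.
struct GBuffer {
    RenderTarget albedo;    // color data: MSAA resolve/averaging is fine
    RenderTarget normals;   // packed unit normals: averaging denormalizes them
    RenderTarget depth;     // nonlinear depth: averaged depth is meaningless
    RenderTarget velocity;  // motion vectors for post-process passes
};
```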

It's not always a straightforward task to patch a game for MSAA/SSAA, and automatic driver (checkbox) patching is even more difficult. That's why we nowadays have driver profiles for different games. The driver team has identified the patches needed for the game rendering pipeline in order to enable all supported driver-forced AA modes. Of course the most efficient and guaranteed-to-work antialiasing will be the one programmed by the game programmers. Game programmers are now responsible for providing good antialiasing modes in their games. Sadly, not all developers take this seriously enough, so we will likely need forced driver AA hacks in the future as well.
 
Most distressing is that SSAA cannot be forced in DX10+ at all on ATI, and it rarely works with NV, so it's going to be on the way out if nothing happens to help it stay. I've never seen game devs support SSAA*.


*except maybe back with NV's old pre-G80 8X mode that was 4X MSAA + 2x SSAA.
 
It's not always a straightforward task to patch a game for MSAA/SSAA, and automatic driver (checkbox) patching is even more difficult. That's why we nowadays have driver profiles for different games. The driver team has identified the patches needed for the game rendering pipeline in order to enable all supported driver-forced AA modes.
It's a bit sad that a driver team without access to a game's source code has to do the programmers' work for them by taking their dimwitted shaders and turning them into something which can work well on a PC ... sadly pathetic.
 
It's a bit sad that a driver team without access to a game's source code has to do the programmers' work for them by taking their dimwitted shaders and turning them into something which can work well on a PC ... sadly pathetic.
If virtual texturing gets more popular (something I personally believe will happen), we are going to lose driver-overridden anisotropic filtering as well. Programmers will have to code separate support for that too.

PC gamers are used to being able to bump up a game's visual quality with driver settings if they have extra GPU power to spare. But in the future there will be fewer and fewer opportunities for this. Most games will start to use GPU computing APIs for their lighting and post processing (it's more efficient that way). And it would be really awkward if the manufacturers modified our GPU computing kernels in order to hack in some extra driver-forced effects. This could cause some serious problems, for example if a graphics driver messed up some financial calculation software because it tried to force antialiasing on its calculations :)

I personally believe the best way to solve this issue would be to add official API support for user-configurable graphics settings. The game could ask DirectX for the selected quality settings for filtering, antialiasing and other properties, and then react to these settings by creating render targets differently and by loading a different set of shaders. Currently the game has no knowledge of the driver-overridden parameters, and this causes lots of problems.
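Nothing like this exists in DirectX today, but as a C++ sketch of the idea (all names invented), it could be as simple as:

```cpp
// Hypothetical interface in the spirit of the proposal above.
struct UserQualitySettings {
    int  msaaSampleCount;    // AA level requested in the driver control panel
    int  maxAnisotropy;      // AF level requested in the driver control panel
    bool wantsSupersampling; // user asked for SSAA
};

// Stub: a real implementation would query the driver. The game calls
// this at startup and sizes its render targets and picks its shader
// permutations accordingly -- no behind-the-scenes patching needed.
UserQualitySettings QueryUserQualitySettings()
{
    return { 4, 16, false };
}
```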
 
I personally believe the best way to solve this issue would be to add official API support for user-configurable graphics settings.

That would be good, but the best way would be for developers not to be lazy and to code support for good filtering and AA :smile:
 
To take this to its logical conclusion: imagine a world where GPUs did all go Larrabee-like and the entire rendering pipeline was programmed by the ISV*: there effectively couldn't be any driver overrides left at all. Everything would have to be provisioned by the software developer.

* I don't think that actually is going to happen in the foreseeable future, but that's a flame war for another thread.
 
No need for a flame war over that. It will happen eventually but not any time soon.
 