SMAA = magic bullet for next gen AA?

Inuhanyou

Veteran
Yes, I keep bringing up SMAA, but this is the last time. And if possible I'd like it discussed properly, if anybody has any thoughts or opinions on it either way.

In terms of flexibility in adapting to all manner of specific game types, genres, hardware configurations, engines, etc., is there ANY reason, as far as any of you can tell, why SMAA would not be the standard for all upcoming PS4/XB1 titles?

Meaning, not in terms of simply wanting a different solution (such as The Order: 1886's 4xMSAA just because it looks better than 2xMSAA), but in terms of being held back by hardware or software limitations.

Because from where I'm standing, the only reason SMAA is not widely used is that it's not really familiar to console developers in general.
 
I think everything so far has been FXAA or MSAA, right? I guess FXAA was still being used because launch window titles are challenging enough already and they didn't want to rock the boat. I kind of hope people don't catch on with SMAA because then my next game will look better relative to the crowd when it comes out.

It could be that light prepass renderers using MSAA become even more popular in the coming years. As it stands I'm kind of surprised how popular they are right at this moment.
 
SMAA is used by Infamous and also Ryse to great effect... everything besides that has been either custom, FXAA, or MSAA.
 
I don't think any post-processing effect is a magic bullet; at best it's a compromise.

It's the same for fxaa vs smaa. It's all a compromise. If there's extra performance to spare, perhaps smaa is a better choice. If not, well you have to make a compromise somewhere...
 
Instead of "magic bullet", I should have said "should it be the standard".

SMAA is the best-looking option for the least performance cost outside of FXAA. The gains from using FXAA relative to the lowest form of SMAA (which is still superior to FXAA and doesn't introduce any blurring) aren't even noticeable: 1 or 2 fps for most games.
 
But couldn't you argue that the differences between the lowest level of smaa and a quality implementation of fxaa aren't noticeable in practice either? Why pay any fps?
 
Because FXAA by definition is blurry. The developer actually has to put in effort to calibrate a correct FXAA implementation so the blurriness isn't overly obtrusive. Look at BF4: they just didn't put in the work, so the image is completely smudged with sub-pixel artifacts everywhere on both XB1 and PS4, in addition to blurring the image, same as a lot of FXAA games on console. That should not be the default outcome.

This is in direct contrast to SMAA 1x, which requires almost no effort to calibrate. You slap it on, there's no blurriness, and it's still crisper than FXAA even when you're making the effort to adopt a decent FXAA implementation. It would save a lot of time and look better as a result, for a marginal cost.
 
Like I said it's a compromise. Some developers would rather spend the time implementing a quality version of fxaa instead of giving up a few fps. Perhaps others would prefer the opposite. It depends where they want to compromise.
 
Maybe it is like you said, developers are just trying to get used to the architecture of both consoles before they explore different types of PPAA. FXAA has been a feature since last-gen and moving it over to next-gen consoles shouldn't have been much of a problem. SMAA is a rather new technique and the only game last gen that I could think of that used it was Crysis 3.

Here's what sebbbi said in his recent interview with Digital Foundry

On next-generation consoles we also use FXAA at launch, because we had to prioritise resolution and frame-rate over antialiasing quality.
More advanced algorithms such as SMAA and CMAA provide a minor quality improvement over (properly configured) FXAA at a minor performance cost. We have been evaluating various algorithms, and it is likely that we will switch to a better algorithm in a forthcoming patch. There are multiple feature updates planned for the game after launch, so we still have plenty of time to do small improvements to the rendering pipeline.
 
Of the current console AA solutions, I'll give SMAA my vote. More so, ISS's implementation of it.

http://www.eurogamer.net/articles/digitalfoundry-2014-infamous-second-son-performance-analysis
Also worthy of note is the implementation of state-of-the-art anti-aliasing, believed to be a variant of SMAA T2X, as found in Crysis 3. This is one of the best post-process anti-aliasing techniques we've seen, combining a new take on MLAA with a temporal element. Edge-smoothing is phenomenal, and while there is some ghosting, it is not any kind of real distraction during gameplay.
 
SMAA 1x is one of the best pure PPAA algorithms (if not the best one). It's easy to integrate and offers pretty much as high quality as a pure PPAA can (without needing any extra data in addition to the final color buffer). It's better than FXAA and MLAA, but not a "magic bullet" by any means.

Other SMAA algorithms (s2x, t2x, 4x) need additionally either subpixel data or temporal data (or both). This makes the integration of these better algorithms harder (and/or less efficient) to deferred rendering pipelines. I'd expect the temporal algorithms to gain more popularity, since the extra cost of reprojection is much smaller compared to rendering everything at subpixel precision. However the temporal algorithms suffer from occasional ghosting, so it might not be the best choice for all games.

I also expect to see some algorithms in the future that exploit the coverage sampling hardware on modern GPUs (for better subsample edge determination), but not actually save multiple color samples per pixel. This would be a cheap way to get subsample edge information to a PPAA algorithm.
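To make the re-projection idea concrete, here's a minimal single-channel sketch (my own simplification, not anyone's shipping code; `temporal_blend` and its parameters are made-up names, not SMAA's actual API). Clamping the re-projected history sample to the current pixel's neighborhood min/max is one common way to limit the ghosting mentioned above:

```python
def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def temporal_blend(current, history, neighborhood, alpha=0.1):
    """Blend a re-projected history color toward the current frame.

    history     -- color fetched from last frame via the motion vector
    neighborhood -- colors of the current pixel's immediate neighbors
    alpha       -- how much of the new frame leaks in each update
    """
    # Clamp history to the current neighborhood's range: a stale sample
    # that no longer matches the scene gets pulled back in bounds,
    # which is what suppresses most visible ghosting.
    clamped = clamp(history, min(neighborhood), max(neighborhood))
    # Exponential moving average: mostly history, a little current frame.
    return alpha * current + (1.0 - alpha) * clamped
```

At 60 fps this average is refreshed twice as often as at 30 fps, which is the intuition behind ghosting being roughly halved at the higher frame rate.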
 
You'd know more about this than anyone here without hands-on experience, sebbbi. You'd agree that SMAA 1x, at the very least, is the most balanced solution in terms of IQ and performance impact, right?

And if so, would you personally say that, in general terms, there's nothing specifically holding back widespread adoption of SMAA1x besides political factors?

I've also heard that SMAA 1x is an improved form of MLAA. Is this the same MLAA that powered many PS3 games through the SPUs?
 
Until 7 months ago the SMAA license was a bit tricky. They changed to an MIT license to encourage adoption.

Also, it's more attractive to use SMAA now (GPU time/IQ ratio) with these 1+ TFLOPS GPUs.

And finally, SMAA would still need hours of additional work compared to just using the well-known FXAA. And it would need even more effort than that if you use temporal AA. Ubisoft spent weeks correctly implementing their customized version in AC4.

Anyway I do agree that SMAA (even 1x) is the best solution in regard to IQ and performance, even with 60fps games.

- SMAA 2TX is the best balance with 30fps games
- SMAA 1x is the best balance for 60fps games
 
you'd agree that SMAA1x at the very least is the most balanced solution in regards to IQ and performance impact right?
Yes, 1x is among the best (cheap cost + good quality + easy to integrate). The temporal ones (1TX/2TX) are harder to integrate, and the re-projection might be hard to get right (especially with alpha layers), but when it is integrated properly and it works for your pipeline, I think it's very hard to beat. Unfortunately it seems that even on next gen consoles, locked 60 fps still isn't the "standard" frame rate (most games are either locked 30 fps or fluctuate somewhere between 40-50 fps). Locked 60 fps halves the ghosting compared to 30 fps, and thus makes re-projection based techniques much more usable.
I've also heard that SMAA1x is an improved form of MLAA. Is this the same MLAA that powered many PS3 games through the SPU's?
There are many flavors of MLAA, all based on the same principles (finding edge shapes and blending pixels together based on the pixel location in the shape). Intel did the first algorithm (it was CPU based). The SPU algorithm is similar to this, and the GPU-based algorithms are often based on shape lookup tables. Results are similar. SMAA is based on the same ideas, but is more advanced (and can be combined with multisampling and temporal re-projection).
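For the curious, the MLAA principle described above can be sketched in a few lines. This is a deliberately simplified 1-D illustration with hypothetical names of my own; real implementations classify whole edge shapes (L, Z, U patterns) and derive per-pixel coverage weights, often from a precomputed lookup table:

```python
def detect_edges(lum, threshold=0.1):
    """Step 1 of MLAA-style filtering: find luminance discontinuities
    between neighboring pixels (a single scanline here for brevity)."""
    return [abs(lum[i + 1] - lum[i]) > threshold for i in range(len(lum) - 1)]

def blend_edges(lum, edges, weight=0.25):
    """Steps 2-3, greatly simplified: blend the two pixels straddling
    each detected edge. Real MLAA/SMAA vary this weight per pixel based
    on where the pixel sits along the reconstructed edge shape."""
    out = list(lum)
    for i, is_edge in enumerate(edges):
        if is_edge:
            out[i] = (1 - weight) * lum[i] + weight * lum[i + 1]
            out[i + 1] = (1 - weight) * lum[i + 1] + weight * lum[i]
    return out
```

A hard step like `[0.0, 0.0, 1.0, 1.0]` gets softened into a gradient, which is the whole game: estimate the edge, then blend as if the edge had been rendered with coverage.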
 
Thank you for the information! :)

So the MLAA variant used by SMAA is more advanced than the vanilla types of MLAA. I'm assuming that with the next-gen GPUs, it is easier to implement SMAA 1x now through GPU-based algorithms.

Well, i hope it takes off.
 
I always like to point out that post-processing and temporal solutions are not mutually exclusive with MSAA. Even within MSAA itself there's lots of room for variations that let you make various trade-offs between performance, aliasing, and high-frequency detail. This is especially true if you're working on consoles, since you have raw access to the advanced MSAA capabilities present in those GPUs. Personally I just look at MSAA as a framework for getting sub-pixel coverage and color information, which gives me additional data for performing a filtering step.
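As a toy illustration of that framing (my own sketch, with hypothetical function names), a resolved pixel is just a weighted sum over its sub-pixel samples; coverage-sampling schemes store fewer distinct colors but extra coverage counts, approximating the same sum more cheaply:

```python
def resolve_pixel(samples):
    """Standard box resolve: average all color samples in the pixel.
    Interior pixels have identical samples; only pixels partially
    covered by a triangle edge actually differ."""
    return sum(samples) / len(samples)

def coverage_resolve(colors, coverage):
    """Coverage-weighted resolve: a few stored colors, each with a count
    of how many sub-sample positions it covers (CSAA/EQAA-style idea:
    more coverage information than stored color samples)."""
    total = sum(coverage)
    return sum(c * n for c, n in zip(colors, coverage)) / total
```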
 
That was the post processing AA solution applied along with the 1080p patch right?

It looked almost a world apart from the 900p FXAA original version. I wonder how much improvement was gained from SMAA and how much from the resolution boost.

If Ubisoft is using SMAA in AC4 already, I wonder if it's possible they'll use it in Watch Dogs for the PS4 version?
 

They are indeed using SMAA in Watchdogs. They confirmed it in an interview.

They are using it with a temporal AA component (like AC4 patched). You could say they are using a customized SMAA 2TX (maybe 1TX, not really sure here...).

For the AC4 patch I would say 75% of the improvement came from replacing FXAA by SMAA + temporal AA and 25% from 900p -> 1080p.
 
and the re-projection might be hard to get right (especially with alpha layers)
Multiple things happening at a single pixel seems like a substantial challenge in games nowadays. Motion buffers are used for several things, and they don't play nicely with details moving relative to each other in the same position.

Even stuff like shadows in a motion-blurred scene is something that regularly winds up looking quite incorrect.
 