Anti-Aliasing types for Next Gen Consoles

QAA should be gone next gen with no more nVidia GPU in the box. I think an MSAA + SMAA combo might be the best one to stay with. 2xMSAA + SMAA should be a very good combo, no? That's assuming all games aim at higher than 1080p, eventually dropping down to just SMAA later in the console life cycle due to performance.

I would say 2xMLAA (Sony's implementation) + SMAA would be a great combo. The main complaint I've heard against Sony's version of MLAA is sub-pixel aliasing. Maybe Sony's implementation of MLAA + SMAA alone might be good enough.

That should be pretty light on compute and memory resources, right?
 
I would say 2xMLAA (Sony's implementation) + SMAA would be a great combo. The main complaint I've heard against Sony's version of MLAA is sub-pixel aliasing. Maybe Sony's implementation of MLAA + SMAA alone might be good enough.

That should be pretty light on compute and memory resources, right?

Isn't PS3 MLAA specifically designed to work on the SPUs? And isn't SMAA just MLAA with sub-pixel smoothing?
 
Isn't PS3 MLAA specifically designed to work on the SPUs? And isn't SMAA just MLAA with sub-pixel smoothing?
No.

SMAA has better edge shapes, an optional temporal component, and it can use MSAA sample locations to fit the edges, which gives sub-pixel accuracy.
The injected 1xSMAA can only use the better edge shapes and thus is not a good representation of the full solution. (It doesn't have any sub-pixel properties.)
http://www.iryoku.com/smaa/#downloads
I would say 2xMLAA (Sony's implementation) + SMAA would be great combo. The only main compliant I've heard against Sony's version of MLAA is sub-pixel aliasing. Maybe just Sony's implemention of MLAA + SMAA might be good enough.
I have never heard that Sony's MLAA uses any sub-samples, so 2xMLAA is the wrong way to describe it.
They do some very nice things in detecting edges, though.
http://iryoku.com/aacourse/downloads/06-MLAA-on-PS3.pdf

MLAA + SMAA used on top of each other would cause the latter to find very few edges, and what it did find would be false positives, which would reduce quality.
MSAA-sample-based post-AA methods like 4xSMAA are a better way.
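To make the edge-detection point concrete, here's a minimal sketch (Python/NumPy, names and details mine) of the first pass that both MLAA and SMAA run: flagging pixels whose luma differs from a neighbour by more than a threshold. Run this on output that another AA pass has already smoothed and the contrast deltas shrink, so fewer edges clear the threshold and the ones that do are less reliable.

```python
import numpy as np

def luma(rgb):
    # Rec. 709 luma weights
    return rgb @ np.array([0.2126, 0.7152, 0.0722])

def detect_edges(img, threshold=0.1):
    """Sketch of an MLAA/SMAA-style first pass: boolean masks marking
    pixels whose left/top neighbour differs in luma by more than the
    threshold. (0.1 mirrors SMAA's default; the rest is illustrative.)"""
    l = luma(img)
    left = np.zeros(l.shape, dtype=bool)
    top = np.zeros(l.shape, dtype=bool)
    left[:, 1:] = np.abs(l[:, 1:] - l[:, :-1]) > threshold
    top[1:, :] = np.abs(l[1:, :] - l[:-1, :]) > threshold
    return left, top
```

A hard black/white boundary produces luma deltas of 1.0 and is trivially caught; once a prior AA pass has blended that boundary over several pixels, the per-pixel deltas drop toward the threshold and the second filter starts missing or mislocating edges.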
 
This "Compressed AA" on Durango doesn't sound very nice. Anything that introduces artifacts reduces image quality a lot.

And while 1080p is a good resolution, 1080p with no AA still looks as awful as 720p with no AA.

Good AA is an ABSOLUTE must for next generation.

Even the Luminous Engine demo, which ran with 8xMSAA plus FXAA for the hair and DOF, still had temporal aliasing such as specular shimmering.


AA MATTERS, regardless of viewing distance. You'd have to sit a very long way back not to see temporal aliasing in a game.

No.

SMAA has better edge shapes, an optional temporal component, and it can use MSAA sample locations to fit the edges, which gives sub-pixel accuracy.
The injected 1xSMAA can only use the better edge shapes and thus is not a good representation of the full solution. (It doesn't have any sub-pixel properties.)
http://www.iryoku.com/smaa/#downloads

I have never heard that Sony's MLAA uses any sub-samples, so 2xMLAA is the wrong way to describe it.
They do some very nice things in detecting edges, though.
http://iryoku.com/aacourse/downloads/06-MLAA-on-PS3.pdf

MLAA + SMAA used on top of each other would cause the latter to find very few edges, and what it did find would be false positives, which would reduce quality.
MSAA-sample-based post-AA methods like 4xSMAA are a better way.
FXAA works well at high resolutions. It doesn't resolve temporal aliasing, but it still works OK, and the higher you go above 1920x1080 the better FXAA works. (In my experience at least, depending on the game.)

Binary Domain on PC at 2560x1440 with FXAA, downsampled (OGSSAA), produces an amazingly good result for the cost. (It helps that the game doesn't have the issues many other games have, such as severe specular aliasing.)

Which brings us back to OGSSAA, the "stupid" SSAA. OGSSAA by itself can just outright suck at times.

The perfect example of this is Bad Company 2. You can apply OGSSAA by a factor of 4+ and the IQ is still basically the same.

Crysis 1 as well: OGSSAA by itself looks terrible both in stills and in motion.
https://i6.minus.com/iyAD3TGcJdttP.png (Crysis OGSSAA'd from 4K resolution down to 1080p. It looks far worse in motion.)

But if you use OGSSAA+another form of AA the results can be absolutely wonderful, depending on the game.

Crysis 1 again, OGSSAA'd from 1536p with 4xMSAA+FXAA:
https://i3.minus.com/i0tWRDLYWdTUl.png
vs 4xSGSSAA at 1080p: https://i4.minus.com/ibmwNVEQ4vposn.png (The performance hit on my system was similar with both, IIRC.)

Using Binary Domain again as an example:
2732x1536 (a factor of 4.5 in pixel count) OGSSAA+FXAA downsampled to 720p, vs the native 720p image on the PS3.
https://i2.minus.com/iCQtpSkXdMZVC.png AA'd
PS3
https://i3.minus.com/ibvni6LHfHFYUx.png
The leap in IQ is enormous. (Although at this point you become limited by the quality of the game assets for certain things. No amount of filtering will make some of those low-res textures look any better.)

Another shot but down to 1600x900 instead and the original full resolution file
https://i3.minus.com/ibzWJKMslPkOlr.png
https://i4.minus.com/iCF4iXf5HtIJb.png
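For reference, the OGSSAA resolve being described above is nothing more than a box-filter downsample: render at a multiple of the target resolution, then average each block of rendered samples into one output pixel. A minimal NumPy sketch (integer factors only; function name is mine):

```python
import numpy as np

def ogssaa_resolve(img, factor):
    """Box-filter resolve of ordered-grid supersampling:
    average each (factor x factor) block of rendered pixels
    into a single output pixel."""
    h, w, c = img.shape
    assert h % factor == 0 and w % factor == 0
    return img.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))
```

The non-integer cases above (e.g. 2732x1536 down to 720p) need a proper resampling filter rather than exact block averaging, but the principle is the same.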

MLAA + SMAA would most likely conflict, but it depends on which goes first, just as with anything else.



MSAA + FSSGSSAA + SMAA 1x (in that order), for example, produces an EXCELLENT result in Resident Evil 5.
Here's a comparison of plain 2xMSAA + 2xFSSGSSAA (with a special AA bit that resolves some MSAA issues, which is why you don't see the white outlines in the first picture but do in some places in the second; this comes down to what performance you can afford on PC, so ignore the outlines) vs the same (with no AA bit) plus SMAA at the end:
http://screenshotcomparison.com/comparison/4821

It produces very good results with next to no temporal aliasing or shimmering.


Honestly, I'd be happy if we got TXAA with these consoles. It's not limited in hardware to the 6xx series; Timothy Lottes has pretty much confirmed as much himself (though the posts have since been deleted for some reason). He said we could get it on Fermi (4xx and 5xx) if enough people pestered Nvidia.

And yeah, no NV GPUs in these consoles (which I think is a mistake on one hand, because on PC at least NV GPUs far outclass AMD GPUs in terms of AA and the number of AA solutions available), but then FXAA is an "Nvidia" product and it was used on the 360.


Hell, the dream is SGSSAA. But I doubt we'll see that, even at 30 FPS (which is fine in my book for cinematic-style games; action games like DMC and ALL multiplayer games absolutely REQUIRE 60 FPS, IMO).

I doubt we'll even see anything near the Luminous Engine and UE4 demos with decent AA.

I'm sure this gen will start out like the last.

At the beginning, everyone uses MSAA. Then they all start using post-process AA, or use it straight from the beginning, which at 720p absolutely sucks. (It's one of the things I despised about Dragon's Dogma: the move to deferred rendering makes the game look worse because of the lack of AA, IMO.)



Even MSAA + TrSGSSAA (TrSSAA in Nvidia drivers), maybe plus another post-process AA solution like SMAA/FXAA, would be sufficient and produce excellent results if the game doesn't have severe temporal and specular aliasing issues.

Like in Syndicate for example
https://i3.minus.com/iotQsJids9ewg.png
This is 4xMSAA + 4xTrSSAA + in-game FXAA. It looks glorious in motion and any temporal aliasing is basically nonexistent. (Despite being on the soft side, texture quality doesn't take a huge hit. And on next-gen consoles, SMAA T2x or another method could easily be used instead.)

Crysis 1 had a built-in post-AA method that, when combined with FXAA at 1080p, produces a good result as well, despite some temporal aliasing and unresolvable stuff.
https://i3.minus.com/ibzqOusci61rU5.jpg (Sorry for the picture quality; these were taken on my last playthrough using the built-in screen capture from the game/Steam, with poor JPG compression.)

If neither of these consoles can produce decent AA at a minimum of 1080p, I'll probably just buy the few exclusives that don't get ported to PC and that's about it, because 3D games without AA today are, IMO, just not a great thing.

Especially if you've played practically any PC game released in the last four years.

AA is just as important as any other advance in real-time rendering for making things look "realistic", or for realizing your vision without compromise from the medium's innate artifact: aliasing.

At least IMO. And I'm probably one of the few who think this.
 
Really?? Are things looking that bad for next gen?

The jump from PS2 to PS3 was quite enormous: we got higher-res graphics and better IQ at the same time as a level of detail that was many times better. Think of any game really, but Uncharted comes to mind.

Surely a PS4 will be able to increase the IQ as well as increase the level of detail, without having to choose between the two?

I would be very, very disappointed if that were not the case. They're many years apart after all.

Resolution, frame rate, graphical quality: the holy triad of game development (I just made that up). Anyway, in any 3D game you make, one will always suffer relative to the other two, and in comparison to other games.

When I look at how many games today don't even run at 720p, hoping for 1080p60 just seems out of the question. Even 1080p30 seems like it would be a waste, since I doubt most people would really notice on your average-size television. I'd be happy with just 720p60 and a real push on graphical quality.
 
Isn't PS3 MLAA specifically designed to work on the SPUs? And isn't SMAA just MLAA with sub-pixel smoothing?

Yes, SMAA is essentially a more effective implementation of MLAA's principles. It blurs less because it ignores low-contrast discontinuities, and it's also better at handling diagonals and reconstructing geometric coverage.

On top of that it adds a temporal component and the ability to use 2xMSAA for subsample info. It's currently the best post-AA method out there as far as I can tell.

SMAA 4x in Crysis 3 combines 2 samples from temporal re-projection (SMAA T2X) between frames with two samples from regular ole MSAA (SMAA S2X).
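The naming works out multiplicatively: each variant's effective sample count is its spatial (MSAA) samples times its temporal frames. A trivial sketch of that bookkeeping (the tuples and labels are my own):

```python
# (spatial MSAA samples, temporal frames) per SMAA variant
variants = {
    "SMAA 1x": (1, 1),   # post-process only
    "SMAA S2x": (2, 1),  # two MSAA samples, no reprojection
    "SMAA T2x": (1, 2),  # temporal reprojection across two frames
    "SMAA 4x": (2, 2),   # S2x and T2x combined, as in Crysis 3
}

def effective_samples(spatial, temporal):
    """Effective sample count: spatial samples times temporal frames."""
    return spatial * temporal
```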

MSAA is no longer the holy grail anyway. Shader aliasing in Crysis 2 is atrocious and MSAA won't fix that.
 
MSAA is no longer the holy grail anyway. Shader aliasing in Crysis 2 is atrocious and MSAA won't fix that.

Yup. It's no longer the holy grail (it never has been, IMO), but it's still required for quality edge AA.

Basically we're at the point where MSAA is required (IMO) for good edge AA (polygon edges), but we still need something to address shader aliasing, specular aliasing, alpha-texture aliasing (there are well-implemented solutions for that as well, at various performance costs), etc.

The holy grail is getting something close to the IQ of high levels of RGSSAA without the associated high performance penalty. Or, as Shifty mentioned, SSAA using stochastic (randomly generated every frame) sample patterns.
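A quick sketch of what "stochastic sample patterns" means in practice: instead of the same fixed sub-pixel grid every frame, jitter one sample per stratum and reseed each frame, so the aliasing error decorrelates over time rather than locking into a fixed pattern. (Function and names are illustrative, not from any particular renderer.)

```python
import random

def stochastic_samples(n, frame_seed):
    """One jittered sub-pixel sample per cell of an n x n grid,
    regenerated each frame from frame_seed (stratified stochastic
    sampling). Returns (x, y) offsets in [0, 1) x [0, 1)."""
    rng = random.Random(frame_seed)
    return [((i + rng.random()) / n, (j + rng.random()) / n)
            for i in range(n) for j in range(n)]
```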

Regards,
SB
 
Really?? Are things looking that bad for next gen?

The jump from PS2 to PS3 was quite enormous: we got higher-res graphics and better IQ at the same time as a level of detail that was many times better. Think of any game really, but Uncharted comes to mind.

Surely a PS4 will be able to increase the IQ as well as increase the level of detail, without having to choose between the two?

I would be very, very disappointed if that were not the case. They're many years apart after all.

Going to 1080p means 2.25 times the pixels, and the rumors for the next-gen consoles put them at 5-6 times current gen in terms of general processing power. 5-6 divided by 2.25 is 2.22 to 2.67 times the compute per pixel compared to current gen. That's not all that much.

Not enough to run the Samaritan demo Epic made a year or so ago, which required 2.5 TFLOPS at 1080p and 30 fps, unless you go down to 720p, which needs only 1.1 TFLOPS.
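The arithmetic above, spelled out (the 5-6x figure is the rumor being quoted, not a spec):

```python
def pixels(w, h):
    return w * h

px_720p = pixels(1280, 720)
px_1080p = pixels(1920, 1080)

pixel_scale = px_1080p / px_720p      # 2.25x the pixels of 720p
per_pixel_low = 5 / pixel_scale       # ~2.22x compute per pixel
per_pixel_high = 6 / pixel_scale      # ~2.67x compute per pixel

# Samaritan: 2.5 TFLOPS at 1080p30; scaling the cost by pixel count
samaritan_720p_tflops = 2.5 / pixel_scale  # ~1.11 TFLOPS at 720p30
```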
 
The Samaritan demo was also running on UE3, not UE4, and it wasn't tailor-made for a certain specification either.
 