Anti-Aliasing types for Next Gen Consoles

[Images: tweets from Timothy Lottes about the new AA technique]

Timothy Lottes (creator of FXAA) is developing a new cross-platform AA type!

Which AA types will we see in next-gen consoles?

For reference: Durango and Orbis (rumoured specs)

[Image: table of rumoured Durango and Orbis specs]
 
Post-processing "AA" that's applied before the HUD is composited.

On PC, SMAA applied at the very last stage of rendering ruined text and the HUD :(
 
On PC, SMAA applied at the very last stage of rendering ruined text and the HUD :(
Not necessarily. If the HUD itself has jaggies, then you'll want the AA to apply to it as well. Hard Reset was a good example of this; the HUD looked much better with post-AA, IMO.

Besides, if it's implemented correctly and the HUD is already smooth, then the AA should ignore it. I get really good results personally with injected SMAA, simply because it doesn't automatically smear the entire image; it really does isolate itself quite well to the hard edges. It works amazingly well on Borderlands, which otherwise suffers pretty badly from jaggies because of the art style.
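
To make the ordering issue concrete, here's a toy numpy sketch. The function names and the box-blur stand-in are made up for illustration; this is not any real engine's pipeline or the actual SMAA filter.

import numpy as np

# Toy illustration of where a post-process AA pass can sit relative to the HUD.
# The "AA" here is just a 3x3 box blur stand-in; only the ordering matters.

def box_blur(img):
    out = img.copy()
    out[1:-1, 1:-1] = sum(img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return out

def composite_hud(img, hud_rgba):
    a = hud_rgba[..., 3:4]                       # HUD alpha channel
    return img * (1.0 - a) + hud_rgba[..., :3] * a

scene = np.random.rand(8, 8, 3)                  # stand-in for the rendered 3D image
hud = np.zeros((8, 8, 4))
hud[2, 2:6] = [1, 1, 1, 1]                       # one row of opaque "text" pixels

aa_before_hud = composite_hud(box_blur(scene), hud)   # HUD stays perfectly sharp
aa_after_hud = box_blur(composite_hud(scene, hud))    # HUD edges get filtered too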
 
Which AA types will we see in next-gen consoles?
Same AA as this gen (multiple sampling at some level, and 'data reconstruction' techniques). Unless someone invents something new (don't see any new options meself other than some new reconstruction algo), which of course we can't talk about because it doesn't exist yet.
 
Am I the only one not very bothered by aliasing? I would rather have a better frame rate than anti-aliasing.
 
Am I the only one not very bothered by aliasing? I would rather have a better frame rate than anti-aliasing.

It's gotten better, but any time things shimmer or crawl it bothers me. Especially on the simpler games, where you wonder if the low-budget coders know what AA is and how much better it would make their game look. It's especially a problem with low-budget Japanese games.
 
SMAA would be fine with me. It is superior to FXAA with very little additional impact on performance (relative to FXAA).

I think MSAA, even if doable, will be sacrificed because it can be a resource hog.

But if games are 1080p then we have less aliasing to deal with anyhow, so maybe we will get a mix of SMAA and low-level MSAA?
 
Am I the only one not very bothered by aliasing? I would rather have a better frame rate than anti-aliasing.

In my opinion, AA is something that has to be improved at all costs with the next gen. Take a look at Uncharted 2 (720p + 2xMSAA) and Uncharted 3 (720p + FXAA) for example. The latter has absolutely outstanding graphics, but its image quality is heavily inferior to Uncharted 2's. I played both games in a row and I was shocked at how much aliasing U3 has compared to its predecessor.

Finally having a 1:1 ratio between rendering resolution and screen resolution would have a very positive effect on image quality anyway. 1920x1080 + 2xMSAA + post-AA is what I badly want for next-gen games. I'm really, really tired of a low rendering resolution in combination with post-AA and I don't want to see it in the new games. In my eyes, developers need to find a good balance between graphics, image quality and frame rate, but in the last couple of years most devs ranked graphics first and ignored the other two aspects. "Why bother with image quality and frame rate if YouTube is going to downsample it to 30 FPS anyway?" - That's probably what they thought. PD's GT5 is probably the best example of how to balance graphics, image quality and frame rate, even though one has to keep in mind the insane resources behind their game.
 
I'm not worried about AA next gen, as I figure 100% of games will feature at least some cheap post-process solution. The better-looking ones will go a little fancier with it and maybe use more samples per pixel; low-budget ones will just use the most popular open-source algo out there, and we will be fine.
If most games run at 1080p, I really don't care too much about a little bit of blur on my screen.
 
I'm not worried about AA next gen, as I figure 100% of games will feature at least some cheap post-process solution. The better-looking ones will go a little fancier with it and maybe use more samples per pixel; low-budget ones will just use the most popular open-source algo out there, and we will be fine.
If most games run at 1080p, I really don't care too much about a little bit of blur on my screen.

Well, I am worried about AA next gen! Even on my super duper PC, aliasing is a big issue...
 
At 1080p I am finding AA to be pretty unimportant, and stuff like good depth of field and motion blur becomes more important.
 
The new AA Timothy Lottes is creating!
Sounds like he's just supersampling on a rotated grid (as opposed to an ordered grid in line with your display grid, which you want to avoid; he's doing what he's doing for roughly the same reason that SGSSAA exists).

As he noted, that's an old idea.

Unless I'm missing something.

Not that I'd mind next-gen devs taking a drop in the amount of on-screen junk they can push to give us HD imagery with full-screen supersampling. Smash those shimmery thin thingies for good!
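
For concreteness, here is a minimal numpy sketch of 4x rotated-grid supersampling as described above. The "scene" is a made-up analytic edge, not anything from Lottes' actual technique; only the sample placement and the averaging step are the point.

import numpy as np

def shade(x, y):
    # Stand-in "scene": a hard diagonal edge, which aliases badly at 1 sample/pixel.
    return 1.0 if y > 0.37 * x else 0.0

# Classic 4x rotated-grid (RGSS) offsets inside a pixel, not aligned with the display axes.
RGSS_OFFSETS = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

def render(width, height, offsets):
    img = np.zeros((height, width))
    for py in range(height):
        for px in range(width):
            samples = [shade(px + ox, py + oy) for ox, oy in offsets]
            img[py, px] = sum(samples) / len(samples)   # box-filter downsample
    return img

aliased = render(32, 32, [(0.5, 0.5)])    # one ordered sample per pixel
smoothed = render(32, 32, RGSS_OFFSETS)   # four rotated-grid samples per pixel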
 
Sounds like he's just supersampling on a rotated grid (as opposed to an ordered grid in line with your display grid, which you want to avoid; he's doing what he's doing for roughly the same reason that SGSSAA exists).

As he noted, that's an old idea.

Unless I'm missing something.

Not that I'd mind next-gen devs taking a drop in the amount of on-screen junk they can push to give us HD imagery with full-screen supersampling. Smash those shimmery thin thingies for good!

Nope, not anything new. The first consumer video card to do this was the Voodoo 5, well over a decade ago. It had superb AA quality at the expense of a high performance and memory cost.

As with any SSAA method the performance impact scales mostly linearly. Hence 2x AA costs twice the graphics rendering performance while 4x AA would cost 4x the graphics rendering performance.
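
As a quick back-of-the-envelope check of that scaling claim (assuming a 1920x1080 output and a simple box downsample; the numbers are just arithmetic, no real renderer involved):

target_w, target_h = 1920, 1080
for factor in (1, 2, 4):
    shaded = target_w * target_h * factor    # samples shaded per frame
    # one common layout: scale each axis by sqrt(factor), e.g. 4x -> 3840x2160
    internal = (round(target_w * factor ** 0.5), round(target_h * factor ** 0.5))
    print(f"{factor}x SSAA: {shaded:,} samples/frame, internal ~{internal[0]}x{internal[1]}")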

The benefit is that it provides, IMO, the best AA quality possible. Depending on the game, and people's tastes, however, it can be argued that the increase in AA quality isn't worth the associated rendering performance impact.

MSAA + some form of SSAA for transparencies is still the best compromise between quality and performance.

Shader-based AA will always be the worst and, IMO, rarely gives a satisfying final image. It'll always either leave too many edges unsatisfactorily AA'd or blur too many textures. MLAA, TXAA, SMAA, etc. all provide subpar image quality, although occasionally it can be better than an image without them. But often I find the final image quality to be below that of an untouched image, for my tastes. And anyone who's been on Beyond3D for a while knows how much I absolutely HATE HATE HATE HATE HATE (ad nauseam) aliasing artifacts.

High pixel density displays may be able to breathe new life into shader/compute-based AA algorithms, however, as the texture blurring at 4x pixel density may not be as noticeable as it is with current displays. The only problem is that gaming at the resolutions required for high pixel density displays will be virtually impossible for anything but enthusiast-class graphics cards on PC for quite a few years.

Regards,
SB
 
Oh, well that's cool. FXAA and similar algos are interesting in that they don't inflate bandwidth and memory requirements like MSAA does, but unfortunately they're not really antialiasing as much as simply an advanced form of blurring, and their effectiveness is a bit limited because of it. Maybe you can't really tell as much of a difference at HD resolutions or higher as you could back when good ol' quincunx was used, and common gaming rez was 1024*768 or even less.
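
A toy single-pass filter in the spirit of such post-process AA (luma edge detect, then blend) is sketched below. This is not the actual FXAA algorithm, just an illustration of why this class of filter only needs the final resolved color buffer, unlike MSAA:

import numpy as np

LUMA = np.array([0.299, 0.587, 0.114])

def toy_post_aa(img, threshold=0.1):
    # img: float RGB array of shape (H, W, 3)
    luma = img @ LUMA                              # perceived brightness per pixel
    out = img.copy()
    c = luma[1:-1, 1:-1]                           # interior pixels only
    # contrast against the four direct neighbours
    contrast = np.maximum.reduce([
        np.abs(c - luma[:-2, 1:-1]), np.abs(c - luma[2:, 1:-1]),
        np.abs(c - luma[1:-1, :-2]), np.abs(c - luma[1:-1, 2:]),
    ])
    edge = contrast > threshold                    # only touch high-contrast edges
    blended = (img[:-2, 1:-1] + img[2:, 1:-1] +
               img[1:-1, :-2] + img[1:-1, 2:] + img[1:-1, 1:-1]) / 5.0
    out[1:-1, 1:-1][edge] = blended[edge]
    return out

A real filter estimates the edge direction and blends along it with sub-pixel weights rather than using a plain neighbour average, which is why it can come across as "advanced blurring" on textures it misclassifies as edges.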
 
Oh, well that's cool. FXAA and similar algos are interesting in that they don't inflate bandwidth and memory requirements like MSAA does, but unfortunately they're not really antialiasing as much as simply an advanced form of blurring, and their effectiveness is a bit limited because of it. Maybe you can't really tell as much of a difference at HD resolutions or higher as you could back when good ol' quincunx was used, and common gaming rez was 1024*768 or even less.

High resolution has nothing to do with it on its own.

Aliasing looks almost exactly the same on a typical 1024x768 display of the time (14-15") as it does on a 1920x1080 display (24") or a 2560x1600 display (30") when viewed from a similar distance. The PPI for all of those is roughly similar, although most 15" monitors of the time could do 1280x1024, which is closer to the average modern display PPI. In other words, the jaggies would look pretty much the same.

I really wish people would stop just saying "high resolution", as that is relatively meaningless by itself. It's a personal pet peeve of mine, as aliasing and various rendering artifacts are my biggest annoyance with 3D gaming. If people can't even get the terminology right, there's no hope that the problem will ever be addressed.

Sure, aliasing will be less noticeable on a 5" 1920x1080 display than it would be on a 24" 1920x1080 display. That has nothing at all to do with the resolution and everything to do with the pixels per inch.

Hell, aliasing would be less noticeable on a 5" 1920x1080 display than it would be on a 30" 2560x1600 display. Oh damn, higher resolution = more noticeable aliasing. :p
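
Putting rough numbers on that, with PPI computed from the diagonal (display sizes taken from the examples in this post; treat them as ballpark figures):

import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, d in [
    ('15" 1024x768',  1024,  768, 15),
    ('24" 1920x1080', 1920, 1080, 24),
    ('30" 2560x1600', 2560, 1600, 30),
    ('5" 1920x1080',  1920, 1080,  5),
]:
    print(f'{name}: ~{ppi(w, h, d):.0f} PPI')

That works out to roughly 85, 92, 101 and 441 PPI respectively: the first three are in the same ballpark, which is the point being made, while the phone-sized panel is several times denser.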

BTW, don't think I'm picking on you for this, Grall (I'm not :)). A LOT of people have this misconception that higher resolution = less noticeable aliasing, when that is not the case at all.

Regards,
SB
 