Will next gen games be fully anti-aliased?

Will next gen games be able to be run natively at 1080p with hi-res textures (no blotchy textures when you get close to the wall) AS WELL as having like 16x anti-aliasing to remove jaggies?

A lot of games this generation had no AA, and even 4x isn't really enough for a bigger HDTV. 8x-16x should be the target next gen?
 
As we don't know what hardware they'll have, it's impossible to say, although it's extremely unlikely 8x MSAA or better will be common at 1080p. Look at what the best PC rigs can do now with current-gen graphics complexity and you can see that improving everything across the board as well as AA just isn't going to happen. Given the explosion of alternative AA techniques, I imagine the future will be things like FXAA/TXAA, combining MSAA with post effects or complex resolves to reduce edge jagginess without the sample costs. IQ should be greatly improved over this gen (although devs will still have the option to cut back IQ in favour of pixel complexity, just as they did this gen), as it has already improved over this generation thanks to new AA techniques, but don't expect perfectly clean games.
 
Higher resolution is better than anti-aliasing (in my opinion, and I am always right), so I hope devs concentrate on that next gen. With higher resolution the jaggies get smaller, too.
 
16xAA is just (WAY) too much of a performance hit for too little gain. Also, isn't MSAA dying out anyway?

It will likely be pretty much the same as this gen: visuals first, AA as you can. It should be better, though, because presumably we'll be at 1080p instead of 720p.
 

Will next gen games be able to run natively at 1080p?
Absolutely yes for the majority of next gen games, except the earliest ones (from the first year of next gen, when developers are still learning the hardware). It would be a huge marketing tool for companies to advertise that their games run natively at full HD 1080p. I won't be surprised if Sony advertises quad-HD 2560*1440 resolutions the way it did 1080p this generation.

Will next gen games be hugely anti-aliased?
Absolutely yes, a lot more than this gen. But don't expect great sub-pixel anti-aliasing; that is very costly for hardware, and it would take time to innovate and implement such techniques with limited processing power. But greatly improved pixel-level anti-aliasing? YES.


Will next gen games be able to have hi-res textures?
It depends on what you mean by hi-res textures, but if I understand you correctly, then absolutely NOT, at least for the majority of next gen games.

Unfortunately: you can expect more polygons, more and higher-precision visual effects, better lighting, better shadowing, better animation, physics... but hi-res textures on consoles? Those require huge quantities of RAM and bandwidth, which historically aren't abundant in console hardware.

If we get 1024*1024 textures (a la Metro 2033, Crysis, Battlefield 3) next gen instead of the 512*512 (and less) of this gen, we should consider ourselves very fortunate and be very happy.

But next gen games that exceed the 1024*1024 texture bar will be very rare.

Battlefield 3, Crysis, and Metro 2033 levels of texture detail should be the benchmark for next gen games; expecting higher res than that is not realistic. Again, we should expect next gen games to deliver better visual effects than those games, but unfortunately not higher-res textures.
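Some rough numbers on why texture resolution is so expensive (a sketch assuming DXT1 block compression at 0.5 bytes per texel and a full mip chain; real games mix many formats):

```python
# Rough texture memory comparison. Assumes DXT1 block compression
# (0.5 bytes per texel) and a full mip chain (~1.33x the base level).
def texture_mib(size, bytes_per_texel=0.5, mip_factor=4 / 3):
    return size * size * bytes_per_texel * mip_factor / 2**20

for size in (512, 1024, 2048):
    print(f"{size}x{size}: {texture_mib(size):.2f} MiB")
# 512x512:   0.17 MiB
# 1024x1024: 0.67 MiB  (4x the memory of 512x512)
# 2048x2048: 2.67 MiB
```

Each step up quadruples the memory; multiply that by the thousands of textures in a game and the RAM/bandwidth budget disappears fast.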
 
Higher resolution is better than anti-aliasing.
Higher resolution costs more resources, hence the invention of MSAA. The purpose of AA techniques is to remove edge aliasing, texture/shader aliasing, and potentially temporal aliasing. Where we can't actually remove them with higher sample rates, we can suppress the artefacts somewhat. MSAA does an okay job on edge aliasing, but we need a toolbox of tricks to solve all the issues.
 
1080p is expected, and most games will likely use at least 2xAA with FXAA or some other post-process AA on top of that. That should be pretty good, really. You definitely don't need 16xAA; 8xAA is the most anyone should aim for, and even then 4x MSAA + FXAA will probably be the more common choice.
 
Current high-end PC GPUs don't even support actual 16x MSAA; they cap out at 8 subsamples. Even if you had hardware support, it would be a pretty terrible idea for deferred rendering (600MB+ just for G-buffers?), and wouldn't be enough on its own to combat all forms of aliasing.
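For the curious, the back-of-the-envelope math behind that figure (a sketch assuming a fairly typical deferred layout of four 32-bit colour targets plus 32-bit depth/stencil; actual layouts vary):

```python
# G-buffer memory at 1080p with 16 subsamples per pixel.
# Assumes four 32-bit render targets plus a 32-bit depth/stencil
# buffer: 5 * 4 = 20 bytes per sample.
width, height = 1920, 1080
samples = 16
bytes_per_sample = 5 * 4

total_bytes = width * height * samples * bytes_per_sample
print(f"{total_bytes / 2**20:.0f} MiB")  # ~633 MiB, before anything else
```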
 
I hope that they will achieve that pretty well next gen, even if it's at the cost of higher resolution.
A blend of existing techniques like MSAA, MLAA, SRAA, XFAA, and temporal AA should give great results.
Still, for some games I can see the benefit of pursuing the highest possible resolution.
 
Wasn't deferred shading a big reason for the lack of AA in some games? Though some people got around that.
MLAA plays better with deferred shading, generally?
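Part of why the post-process approaches play nicely with deferred is that they only touch the final colour buffer. A toy sketch of the shared core idea (this is not real MLAA or FXAA, which classify edge shapes and compute blend weights far more carefully):

```python
import numpy as np

def post_aa(rgb, threshold=0.1):
    """Toy post-process AA: find luma edges, blend across them.

    rgb: float array of shape (H, W, 3), values in [0, 1].
    """
    luma = rgb @ np.array([0.299, 0.587, 0.114])
    # Mark pixels whose luma contrast with a neighbour is high.
    edge = np.zeros(luma.shape, dtype=bool)
    edge[:, :-1] |= np.abs(luma[:, 1:] - luma[:, :-1]) > threshold
    edge[:-1, :] |= np.abs(luma[1:, :] - luma[:-1, :]) > threshold
    # 3x3 box blur, applied only where edges were detected.
    blurred = sum(np.roll(np.roll(rgb, dy, axis=0), dx, axis=1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    out = rgb.copy()
    out[edge] = blurred[edge]
    return out
```

Because it works on the resolved colour alone, it doesn't care whether the renderer was forward or deferred, which is exactly why MLAA/FXAA took off this gen.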

This might sound a bit brute-force, but remember the old 3dfx T-buffer idea:
render multiple frames at different times and spatial offsets, then combine them for a mix of temporal and spatial AA. They were trying to use it for depth of field too, but they only ever had 4 samples, which didn't work well at all.
Has anything like that been tried recently with today's uber-GPUs on ports of console content, I wonder? I guess you'd get way more benefit from the memory traversals with smarter techniques.

Imagine having a non-AA game for 3D, but then just blending the two frames together for mono vision, getting a bit of AA with a similar code path and the same balancing decisions.
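The accumulation part is trivial to express; the hard part is affording N full renders per output frame. A sketch, where render(t, jitter) is a hypothetical stand-in for the engine drawing the scene at time t with a sub-pixel offset:

```python
import numpy as np

def tbuffer_frame(render, t, shutter=1 / 120, samples=4):
    """T-buffer style AA: average several renders spread over time
    (motion blur) and over sub-pixel offsets (spatial AA).

    render(t, (dx, dy)) -> (H, W, 3) float frame is a hypothetical
    engine hook, not a real API.
    """
    rng = np.random.default_rng(0)
    acc = None
    for i in range(samples):
        jitter = rng.uniform(-0.5, 0.5, size=2)  # sub-pixel offset
        frame = render(t + (i / samples) * shutter, jitter)
        acc = frame if acc is None else acc + frame
    return acc / samples
# With only 4 samples (all the T-buffer had), the result is noisy and
# ghosty, which is part of why the depth-of-field use never worked well.
```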
 
And on a similar note, if you have a 720p TV as I do (it's a Panasonic with great IQ, don't really plan on replacing it anytime soon) does the downscale from a native 1080p game to 720p result in a sort of anti-aliasing effect, or would it increase jaggies?
 
If your TV's doing it right, it will yield a sort of AA, similar to supersampling. Not 100% effective at that resolution, but not bad, either. Combined with whatever in-game AA they're using, you'll probably be fine (at the obvious expense of the sharper image that the higher resolution would otherwise give you).
 
Granted, depending on how far you are sitting from a given size display and how good your eyes are, you may not be able to distinguish a difference in sharpness between a 1080p and a 768p display anyway.
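For a rough sense of where that cutoff lands, here's the standard 1-arcminute acuity math (assuming a 40" 16:9 panel purely as an example):

```python
import math

# 20/20 vision resolves about 1 arcminute. Beyond the distance where
# one pixel subtends less than that, extra resolution is invisible.
diagonal_in = 40.0
width_in = diagonal_in * 16 / math.hypot(16, 9)
pixel_pitch_in = width_in / 1920           # 1080p pixel width
one_arcmin = math.radians(1 / 60)
distance_in = pixel_pitch_in / math.tan(one_arcmin)
print(f"{distance_in / 12:.1f} ft")        # ~5.2 ft for this panel
```

Past roughly that distance, 1080p and 768p images of the same size start to look alike, and the distance scales with the screen size.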
 
"Will next gen games be fully anti-aliased?"

Yes! Blur filters ala MLAA and FXAA ensure that the screen will be, errr, fully anti-aliased. The remaining edges will be called pixel tears caused by all the washed out non-edge detail being surpressed. Ps- For $10 I can anti-alias all your current gen games. I take no responsibility for any damage Vaseline will do to your screen though.
 
"Will next gen games be fully anti-aliased?"

Yes! Blur filters ala MLAA and FXAA ensure that the screen will be, errr, fully anti-aliased. The remaining edges will be called pixel tears caused by all the washed out non-edge detail being surpressed. Ps- For $10 I can anti-alias all your current gen games. I take no responsibility for any damage Vaseline will do to your screen though.
Honestly, that's pushing it a bit ;) There are a lot of improvements being made at the moment; I believe the main obstacle to a good solution emerging is indeed our 7+ year old sucky hardware.
 
Jokes aside, I am not confident that even with powerful hardware (let's say GCN Pitcairn class to Kepler GK104 class) we will see aliasing get addressed consistently. There is always some sort of new eye candy that takes precedence. And even if they do flick on MSAA, I don't have much confidence that they will do so with a proper resolve order to address tone-mapping aliasing from HDR approaches. And that doesn't even begin to touch the aliasing from specular, normals, texture filtering, or just plain "we used a really low precision shader because the better shader was too slow, but ouch my eyes, the shader aliasing is horrible!!! Call 911!"

And then you have things like transparencies (fences and whatnot) that always get forgotten, or the lol-worthy 2D sprite grass and shrubs that not only look flat but also have jagged edges and poorly defined borders. At least the lack of anti-aliasing isn't only killing me: half the time a bad guy dies next to an in-game explosion, the bot's death is unrelated to the explosion and is instead due to the spray of sharp-edged pixels. And don't even get me going on shadows. Shadows are dark and soft, and yet this gen has made them look like stealth-ninja blades of pain and suffering. My eyes have both bled this generation from sharp pixel cuts and grown weaker from squinting at poorly filtered, fuzzy textures.

And I have little confidence that (a) next gen hardware will be robust enough to really address all of these issues AND make games look "next gen" as well, and (b) even if the hardware disproves (a), that many developers will have the time to address these things. Unless the new consoles are so crazy fast that "everyday devs" can use simple, robust solutions that cost an arm and a leg but are easy to implement, I think we are back to "I have this crazy renderer that addresses all these hardware shortcomings, and one of the trade-offs is you cannot use MSAA without a 4x increase in your shadow rendering" or whatnot.

Call me a pessimist, but I don't think it is the hardware as much as priorities and various resources. I think there will always be that guy who says, "We take a 15-20% fps hit from proper 4xMSAA and resolve order with 8xAF; let's drop MSAA, go with FXAA and 2xAF, and add whiz-bang lighting features instead." Sometimes this will be a good decision... other times, well, said feature just adds to a messy picture.
 
If your TV's doing it right, it will yield a sort of AA, similar to supersampling.
It'll be roughly 2x (ordered grid) SSAA. No TV is going to do anything less than a bilinear resampling. That won't have a dramatic effect on edge aliasing, but it's good for shader aliasing and works nicely on specular highlights.
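The arithmetic, for the curious; it works out to 2.25 source pixels per output pixel, i.e. roughly 2x:

```python
# Source pixels per output pixel when 1080p is scaled to a 720p panel.
src = 1920 * 1080
dst = 1280 * 720
print(src / dst)          # 2.25, i.e. a 1.5x scale on each axis
# A bilinear tap only weights the nearest 2x2 source pixels, so the
# effective supersampling is around 2x rather than a clean 2.25x.
```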
 
Jokes aside, I am not confident that even with powerful hardware (let's say GCN Pitcairn class to Kepler GK104 class) we will see aliasing get addressed consistently.
You're right, but it won't be anything like as bad as this gen, IMO. The wealth of new AA techniques is going to offer a range of solutions. Post effects that took 5ms this gen will take 1ms next, because the buffers to work on will be only ~2x larger while the GPUs will be many times faster. We've gone from MLAA to FXAA to TXAA in one year, and with new programmable hardware we'll have better options. The days of zero AA should be gone.
 
What I would love, personally, is support for OGSSAA in next-gen titles on PC. Many post-AA techniques are great but not good enough; with some downsampling, say 30-50%, they look marvellous.
Something that used a 2xMSAA resolve and heavy post-AA like FXAA 4 or SMAA on an image downsampled from a 50% higher resolution should be enough to eliminate all edge jaggies and most shader and sub-pixel aliasing.

It could be dynamic, like Timothy suggested on the FXAA 4 blog: when you have spare time over 60/30 fps (depending on what you're aiming for), you get 1.5x1.5 OGSSAA + SMAA T2x/FXAA 4, but when your GPU can't keep up it automatically decreases the OGSSAA to a lower value, down to your native resolution, and then finally drops AA completely or leaves just the post-AA, which would be almost free. A sketch of that feedback loop follows below.
In very GPU-heavy, fast scenes you don't care that much about AA, so it could be quite a good solution.
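A toy version of that controller; the timings, thresholds, and scale steps here are all made up for illustration:

```python
# Toy dynamic-OGSSAA controller: raise the render scale when the GPU
# has headroom, lower it (towards native + post-AA only) when it
# can't keep up.
TARGET_MS = 16.6              # 60 fps budget (use 33.3 for 30 fps)
SCALES = [1.0, 1.25, 1.5]     # per-axis render scale; 1.5 = 1.5x1.5 OGSSAA

def update_level(level, last_gpu_ms):
    if last_gpu_ms > TARGET_MS * 0.95 and level > 0:
        return level - 1      # over budget: step the SSAA down
    if last_gpu_ms < TARGET_MS * 0.75 and level < len(SCALES) - 1:
        return level + 1      # lots of headroom: step the SSAA up
    return level

level = 0
for gpu_ms in (10.0, 11.0, 17.5, 12.0):   # fake per-frame GPU timings
    level = update_level(level, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> render scale {SCALES[level]}x")
```

The post-AA pass (SMAA/FXAA) would stay on at every level, since it's nearly free compared to the supersampling.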
 