Anti-Aliasing types for Next Gen Consoles

If you have a higher resolution but the same screen size, you will see less aliasing. If you move from 720p to 1080p on the same TV, you will see less aliasing.
 
And don't higher resolutions on the same screen have a huge impact on texture quality as well? Look at the Dark Souls PC version on a 1920x1080 screen with and without Durante's resolution fix. The same textures look ultra-blurred at 1280x720, but at 1920x1080 the very same textures look high-res compared to the 720p situation. In my eyes it is absolutely worth aspiring to a ratio of 1:1 between screen resolution and render resolution; 1:2, 1:4 or 1:10 would be even better. In most current gen games we have a ratio of 2.25:1, which is just the opposite of how it should be!
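To put quick numbers on that 2.25:1 figure, here's a back-of-the-envelope sketch (plain Python, nothing console-specific assumed):

```python
# Pixel counts for the two common resolutions.
pixels_720p = 1280 * 720     # 921,600 rendered pixels
pixels_1080p = 1920 * 1080   # 2,073,600 screen pixels

# A 720p render shown on a 1080p screen: the screen has 2.25x
# the pixels of the render, i.e. the 2.25:1 ratio above.
print(pixels_1080p / pixels_720p)  # 2.25
```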

I have a 24 inch 120Hz display for my PC that has a feature called "Smart Scaling", which allows a 1:1 ratio between screen resolution and render resolution. I connected my PlayStation to it and used the 1:1 ratio, and it looked absolutely great! The image was razor-sharp; it looked just as if rendered on a high-end PC. The downside: the image was very tiny, with huge black bars at the sides... :D

I want that for next gen (the razor-sharpness, not the tiny image). I'm not going to buy a console if it doesn't play games in 1080p, and I'm not going to buy a single game that is not rendered in 1080p (except maybe for dynamic resolution rendering techniques). It's 2013; I've had 720p for 8 years...
 
And don't higher resolutions on the same screen have a huge impact on texture quality as well?
Heh, fire up good ol' Duke Nukem at 640x480 versus 1024x768... DN3D probably uses textures that are something like 64x64 pixels (at most) and blows them up across an entire wall sometimes, and it still makes a heck of a difference in-game. Of course, it doesn't mipmap, so everything beyond a certain distance turns into a mess of crawly ants; increasing res pushes that distance back...
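For the curious, the rule DN3D skips is the standard mip selection: pick roughly log2 of how many texels land on one pixel. A tiny illustrative sketch (the function name is just made up here):

```python
import math

def mip_level(texels_per_pixel: float) -> float:
    # Standard mip selection: level 0 when one texel covers one
    # pixel; each further level halves the texture resolution.
    # Without mipmaps (as in DN3D), many distant texels fight over
    # a single pixel and you get the "crawly ants" shimmer.
    return max(0.0, math.log2(texels_per_pixel))

print(mip_level(1.0))  # 0.0 -> full-resolution texture
print(mip_level(8.0))  # 3.0 -> the 1/8-resolution mip
```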
 
If you have a higher resolution but the same screen size, you will see less aliasing. If you move from 720p to 1080p on the same TV, you will see less aliasing.

Yes, that's the whole point of my post. Just saying "high resolution" is meaningless and rather misleading. Is a modern-day, top-of-the-line 80" 1080p TV going to have less noticeable aliasing than a top-of-the-line 40" 720p TV from 2003? It has more resolution, no? But how far you sit from the TV will actually determine whether it appears more aliased or not.
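Putting rough numbers on that question (a quick Python sketch; the 16:9 panels and the 2.5 m couch distance are assumptions for illustration, not anyone's actual setup):

```python
import math

def pixels_per_degree(diag_inches, h_res, distance_m):
    # Width of a 16:9 panel: diagonal * 16 / sqrt(16^2 + 9^2).
    width_m = diag_inches * 0.0254 * 16 / math.sqrt(337)
    pixel_m = width_m / h_res
    # Angle one pixel subtends at the given viewing distance.
    deg = math.degrees(2 * math.atan(pixel_m / (2 * distance_m)))
    return 1 / deg

print(pixels_per_degree(80, 1920, 2.5))  # 80" 1080p: ~47 px/deg
print(pixels_per_degree(40, 1280, 2.5))  # 40" 720p:  ~63 px/deg
```

At the same couch distance the smaller 720p set actually packs more pixels per degree, so the big 1080p screen can indeed look more aliased, which is exactly the point about viewing distance and screen size.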

I have a 24 inch 120Hz display for my PC that has a feature called "Smart Scaling", which allows a 1:1 ratio between screen resolution and render resolution. I connected my PlayStation to it and used the 1:1 ratio, and it looked absolutely great! The image was razor-sharp; it looked just as if rendered on a high-end PC. The downside: the image was very tiny, with huge black bars at the sides... :D

I want that for next gen (the razor-sharpness, not the tiny image). I'm not going to buy a console if it doesn't play games in 1080p, and I'm not going to buy a single game that is not rendered in 1080p (except maybe for dynamic resolution rendering techniques). It's 2013; I've had 720p for 8 years...

There's a difference between sitting 1-2 feet away from a desktop monitor and 3-4 meters from a living room TV. In the first case, I can quite easily notice whether I'm running at native resolution or not. On my 55" TV that is 3-4 meters away in my living room? Absolutely no visible difference between 720p and 1080p (and this is with PC games where I can crank up the detail).

As to your second paragraph: I'd be prepared to not play a lot of games on Orbis and Durango then. :) Especially not the ones that try to push graphics to the next generation.

Basically pick one.

Current gen level of graphics at 1080p. Or next gen graphics at 720p.

Regards,
SB
 
Yes, that's the whole point of my post. Just saying "high resolution" is meaningless and rather misleading. Is a modern-day, top-of-the-line 80" 1080p TV going to have less noticeable aliasing than a top-of-the-line 40" 720p TV from 2003? It has more resolution, no? But how far you sit from the TV will actually determine whether it appears more aliased or not.

There's a difference between sitting 1-2 feet away from a desktop monitor and 3-4 meters from a living room TV. In the first case, I can quite easily notice whether I'm running at native resolution or not. On my 55" TV that is 3-4 meters away in my living room? Absolutely no visible difference between 720p and 1080p (and this is with PC games where I can crank up the detail).

As to your second paragraph: I'd be prepared to not play a lot of games on Orbis and Durango then. :) Especially not the ones that try to push graphics to the next generation.

Basically pick one.

Current gen level of graphics at 1080p. Or next gen graphics at 720p.

Regards,
SB

So basically next gen is gonna be just God of War Ascension/Halo 4 at 1080p/30fps? I guess we'll see about that in a few months.

Also, many guys here say that PPAA is going to be very popular again next gen. I guess games going for resolutions closer to 1080p will probably use FXAA or other cheap AA solutions, while 720p titles will use PPAA/MSAA combos a la Horizon/American Nightmare for better IQ.
 
Basically pick one.

Current gen level of graphics at 1080p. Or next gen graphics at 720p.

The difference between 1080p and 720p is a factor of 2.25. The rumored Durango is about five times more powerful on paper than the current gen consoles, Orbis about eight times, and that is without taking the architectural advantages of the HSA design into consideration. The new AMD Temash HSA SoC was presented a few weeks ago, and it managed to render Dirt: Showdown, a current gen DirectX game, at 1080p with a laughable 5W! For this reason I think it is highly doubtful that a 150W custom HSA SoC with a super-thin console API won't be able to achieve more than current gen level of graphics at 1080p.
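Rough budget math behind that claim (all the multipliers here are the rumored figures quoted above, not confirmed specs):

```python
# Extra pixels needed to go from 720p to 1080p.
pixel_factor = (1920 * 1080) / (1280 * 720)  # 2.25

# Rumored raw-power multipliers over current gen (unconfirmed).
for name, power in [("Durango", 5.0), ("Orbis", 8.0)]:
    headroom = power / pixel_factor
    print(f"{name}: ~{headroom:.1f}x current-gen power per pixel at 1080p")
# Durango: ~2.2x, Orbis: ~3.6x left over after paying for 1080p.
```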

For me it's totally clear: I will pick the system that delivers 1080p, and if neither system can deliver 1080p, then I won't pick either. It's always graphics, graphics, graphics, since that is what you can easily exploit in (downsampled) screenshots or (downsampled) YouTube videos, but the image quality and framerates of today's games are absolutely lousy. I want a balanced game that spares one or two effects in order to deliver more image quality and better playability. Why put so much effort into designing a game if nobody can see the little details anyway because of the blurry image?

And as for the viewing distance of the gamer: it's impossible to give a general statement on this, since you have to keep the size of the screen in mind. Bigger screens require higher resolutions. You can't expect console gamers to play at a distance of 4, 5 or 6 metres from the screen only because you don't want to give them a proper 1080p image. 4K is already in the starting blocks. You can't bring a 720p console to the market in 2013, unless you want to fight with Nintendo and Ouya over the casual gamers.

(...)while 720p titles will use PPAA/MSAA combos a la Horizon/American Nightmare for better IQ.

Image quality is not only anti-aliasing. A 720p game with a more potent AA algorithm may have less aliasing, but it will have a blurry image with blurry textures compared to the 1080p game.
 
Image quality is not only anti-aliasing. A 720p game with a more potent AA algorithm may have less aliasing, but it will have a blurry image with blurry textures compared to the 1080p game.
AA doesn't have to add blur. MSAA and SSAA don't introduce any blur. In the case of 1080p 2xMSAA vs 720p 8xMSAA+16xAF, 720p will have the better IQ, but if viewed with a wide FOV, the higher resolution may be preferable.

Oh, that's what you mean. 720p on a larger display is blurrier.
 
Yeah, let's say we have a 37 inch 1080p TV and we can choose between GameXY being rendered at 720p + 8xMSAA or at 1080p + 2xMSAA. The game will have less aliasing at 720p + 8xMSAA, but 1080p + 2xMSAA will be less blurry and will show more details; especially the textures will look much better.
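The raw sample counts behind that trade-off (a sketch; MSAA shades once per pixel, so the shaded-pixel counts are what drive texture sharpness):

```python
# Geometry coverage samples per frame for the two options above.
samples_720p_8x = 1280 * 720 * 8    # 7,372,800 edge samples
samples_1080p_2x = 1920 * 1080 * 2  # 4,147,200 edge samples

# Uniquely shaded pixels per frame (what texture detail hangs on).
shaded_720p = 1280 * 720     # 921,600
shaded_1080p = 1920 * 1080   # 2,073,600 -> 2.25x more texture detail
```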

We would have the best image quality in terms of both aliasing and sharpness if the game were rendered at (for example) 2880x1620 and output on the 1920x1080 screen. That would be downsampling. So we have one extreme on one side (low res + high AA) and one on the other (downsampling). The 1920x1080 + 2xMSAA solution sits in the middle between these extremes and delivers the best of both worlds: proper anti-aliasing without a blurry image and blurry textures. In my eyes 1080p is going to be the most balanced approach for next gen games.
 
So we have one extreme on one side (low res + high AA) and one on the other (downsampling).
Downsampling is anti-aliasing, known technically as supersampling. This is the highest quality AA because it samples all values, so it includes texture filtering too, but it comes at considerable cost. MSAA uses selective supersampling on geometry edges, but doesn't filter textures/shaders. However, high levels of MSAA will reduce the number of visible steps in an aliased edge by more than a 2x supersampled image will. Similarly, high levels of texture filtering will reduce texture aliasing far better than 2 samples per pixel. That's where development of more targeted, more efficient techniques is important, and supersampling is left to PCs with too much power on their hands and nothing else to do except render everything 5x bigger. ;)
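A minimal sketch of what that downsampling amounts to, assuming an integer scale factor and a plain box filter (real resolve filters can be fancier):

```python
import numpy as np

def supersample_resolve(img: np.ndarray, k: int) -> np.ndarray:
    # img: (H*k, W*k, 3) frame rendered at k x k the target size.
    # Averaging each k x k block is ordered-grid supersampling
    # resolved with a box filter: every value (edges, textures,
    # shaders) gets k*k samples per output pixel.
    h, w, c = img.shape
    return img.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))

# e.g. a 2x2 supersampled 1080p frame:
frame = np.random.rand(2160, 3840, 3).astype(np.float32)
print(supersample_resolve(frame, 2).shape)  # (1080, 1920, 3)
```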

There's a lot of value in reconstruction AA. It's the basis of vector line and font rendering, which we consider perfectly smooth. Incorporating multisampling and temporal sampling into our renderers should deliver the best solution we can for the given level of technology. PS3's MLAA examples have produced some incredibly smooth visuals at times, with only the shimmer being a hindrance, and that can be solved. The choice of rendering resolution is more down to the end user, depending on the TV and viewing distance. If devs provide two solutions (like Under Siege on PS3 offering options for 1080p or 720p+MLAA), users can pick their preferred poison.
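For flavour, here is a toy sketch of the post-process idea (luma edge detection plus blending); this is not the PS3 MLAA algorithm, which classifies edge shapes and computes coverage-based blend weights rather than blurring:

```python
import numpy as np

def toy_post_aa(img: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    # Find strong luma discontinuities...
    luma = img @ np.array([0.299, 0.587, 0.114])
    gx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    gy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edge = (np.maximum(gx, gy) > threshold)[..., None]

    # ...then soften only those pixels. A 3x3 box blur stands in
    # for the directional, coverage-weighted blend real MLAA uses.
    blur = sum(np.roll(np.roll(img, dy, 0), dx, 1)
               for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    return np.where(edge, blur, img)
```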
 
I have a silly question; I've just thought of it and realise I don't really know.
How do 2/2.5D XBLA/PSN/PC type games do AA?

I'm playing the excellent Don't Starve on my Mac; it's running in Chrome and the IQ is excellent. Does it render using vectors or something?
 
I have a silly question; I've just thought of it and realise I don't really know.
How do 2/2.5D XBLA/PSN/PC type games do AA?

I'm playing the excellent Don't Starve on my Mac; it's running in Chrome and the IQ is excellent. Does it render using vectors or something?
Most 2D games are sprite-based and hence don't have polygons, which happen to be the main source of aliasing in 3D games. The ones that do (Street Fighter 4, Marvel vs Capcom 2) have obvious aliased edges.

However, I'd like to know what they do in the case of games like Rayman Origins, where they have something like a fluid that moves (the water) but looks sharp with no aliasing. It's still a shader, and I suppose it's made of polygons too, considering it reacts and deforms to the character's movements.
 
MSAA uses selective supersampling on geometry edges, but doesn't filter textures/shaders. However, high levels of MSAA will reduce the number of visible steps in an aliased edge by more than a 2x supersampled image will. Similarly, high levels of texture filtering will reduce texture aliasing far better than 2 samples per pixel. That's where development of more targeted, more efficient techniques is important, and supersampling is left to PCs with too much power on their hands and nothing else to do except render everything 5x bigger.

But the image will still be blurrier. ;)

I want the razor-sharpness of an image rendered with a 1:1 ratio of screen resolution to rendering resolution. Rendering a game at 1080p on a 1080p screen is superior to rendering it at 720p in every way, at least in terms of image quality. You can't deny that. If you had a 1080p screen, would you render the image at 720p + 8xMSAA + 16:1 AF, or at 1080p + 4xMSAA + 8:1 AF? I would do the latter without question. You can activate all the filter algorithms you like for your 720p image, 8xSGSSAA if you want, but you will never get one thing: the sharpness of a 1080p image.

In order to increase the sharpness of your 720p image you have two choices: either you buy a smaller screen, or you increase the distance between yourself and the screen. Both choices are ridiculous; nobody will downgrade their TV only to have better IQ in games, and most gamers have a finite amount of space in their living room, which means the 1080p screen is already adapted to the size of the room. And even if you have the possibility to sit 5 metres away from your screen, you will barely see anything. Video games are not movies. Instead of having close-ups on faces all the time, you need to detect enemies hiding in the distance. I'm playing on a 37 inch 1080p screen, and while playing a game I sit about 1.5-2 metres away from it due to the bad IQ. I could play at a distance of 3-4 metres (the distance I watch movies from), but then I'm not able to recognize the details, enemies, etc. It's impossible.
 
Most 2D games are sprite-based and hence don't have polygons, which happen to be the main source of aliasing in 3D games. The ones that do (Street Fighter 4, Marvel vs Capcom 2) have obvious aliased edges.
The reason games with 2D art usually do not have aliasing is the pre-filtering of the art.
If pre-filtering is not used, the games can have the same aliasing hell as any polygonal game.
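Pre-filtering in practice just means paying the filtering cost once at asset-build time instead of per frame; a minimal sketch with Pillow (the filenames are hypothetical):

```python
from PIL import Image

# Author the sprite at 4x resolution, downscale once with a good
# filter, and ship the filtered result. Every in-game draw of the
# sprite then starts from already-band-limited pixels.
hi = Image.open("sprite_4x.png")  # hypothetical 4x master asset
lo = hi.resize((hi.width // 4, hi.height // 4), Image.LANCZOS)
lo.save("sprite.png")
```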

However, I'd like to know what they do in the case of games like Rayman Origins, where they have something like a fluid that moves (the water) but looks sharp with no aliasing. It's still a shader, and I suppose it's made of polygons too, considering it reacts and deforms to the character's movements.
Considering that the game is a 2D game, you can easily use whatever method you like for it (including many which require no polygons, like distance fields).
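A minimal distance-field example of the kind hinted at, assuming nothing beyond NumPy: compute a signed distance per pixel and turn the one-pixel band around the edge into smooth coverage, so the shape comes out anti-aliased with no polygons and no extra samples:

```python
import numpy as np

def sdf_circle_coverage(w, h, cx, cy, r):
    # Signed distance from each pixel centre to the circle edge
    # (negative inside, positive outside).
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(xs + 0.5 - cx, ys + 0.5 - cy) - r
    # Map the one-pixel band around the edge onto a 1..0 ramp:
    # analytic anti-aliasing, no polygons, no extra samples.
    return np.clip(0.5 - dist, 0.0, 1.0)

alpha = sdf_circle_coverage(64, 64, 32.0, 32.0, 20.0)  # per-pixel coverage
```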
 
Basically pick one.

Current gen level of graphics at 1080p. Or next gen graphics at 720p.

Really?? Are things looking that bad for next gen?

The jump from PS2 to PS3 was quite enormous: we got higher-res graphics and IQ at the same time as a level of detail that was many times better - think of any game really, but Uncharted comes to mind.

Surely a PS4 will be able to increase the IQ as well as increase the level of detail, without having to choose between the two?

I would be very, very disappointed if that were not the case. They're many years apart after all.
 
They should go with the GoW:A/Beyond/The Last of Us post-processing AA solution or something similar. Those Sony games have fantastic IQ.
 
Really?? Are things looking that bad for next gen?

The jump from PS2 to PS3 was quite enormous: we got higher-res graphics and IQ at the same time as a level of detail that was many times better - think of any game really, but Uncharted comes to mind.

Surely a PS4 will be able to increase the IQ as well as increase the level of detail, without having to choose between the two?

I would be very, very disappointed if that were not the case. They're many years apart after all.

It may not be that bad, but that's what my expectations are. If it manages to improve on that, then that'll be a nice surprise for me. But given the specs released for each console thus far, that's about what I'm expecting.

Consider that on PC it takes a rather significant increase in rendering power to drive both increased graphics rendering technology and increased resolution. Usually with a generation-to-generation upgrade you expect minor increases in one or the other, but not both. It takes multiple generations to see a significant increase in one or the other, again not both, and potentially more than an order of magnitude increase to get both better graphics rendering technology and an increase in playable rendering resolution.

The benefit on PC, however, is that if the developer offers comprehensive control over the rendering tech used, you can choose your own compromises between resolution, rendering speed, and the level of graphics technology used.

For consoles, the developer offers one setting that everyone uses. Hence you'll likely get either high resolution at similar to slightly better quality than current gen, or current gen resolution (maybe slightly higher, i.e. no sub-720p) with greatly enhanced graphics technology/performance.

Regards,
SB
 
It may not be that bad, but that's what my expectations are. If it manages to improve on that, then that'll be a nice surprise for me. But given the specs released for each console thus far, that's about what I'm expecting.

Consider that on PC it takes a rather significant increase in rendering power to drive both increased graphics rendering technology and increased resolution. Usually with a generation-to-generation upgrade you expect minor increases in one or the other, but not both. It takes multiple generations to see a significant increase in one or the other, again not both, and potentially more than an order of magnitude increase to get both better graphics rendering technology and an increase in playable rendering resolution.

The benefit on PC, however, is that if the developer offers comprehensive control over the rendering tech used, you can choose your own compromises between resolution, rendering speed, and the level of graphics technology used.

For consoles, the developer offers one setting that everyone uses. Hence you'll likely get either high resolution at similar to slightly better quality than current gen, or current gen resolution (maybe slightly higher, i.e. no sub-720p) with greatly enhanced graphics technology/performance.

Regards,
SB

Well, by that philosophy, the new GPUs are several generations beyond what's in PS3/360 today, so that leaves me more optimistic. Add to that much more capable CPUs and something like 8 to 16 times more RAM. 7 years is a long time to merely get the same graphics at double the res, or keep the same res just to guarantee more detail. I want, I want, I want! :D
 
They should go with the GoW:A/Beyond/The Last of Us post-processing AA solution or something similar. Those Sony games have fantastic IQ.
They use MLAA with more refined edge searching.
SMAA and similar subpixel-capable solutions should do well for most surfaces, and then they would need their own solution for hair.
 