Antialiasing vs. HD

This.
Jaggies are only one artefact of aliasing, and far from the most unpleasant one. Shimmering, texture crawling, and small-geometry "sizzling" are much worse to me.
Yes, this is also part of the reason I pick higher res (texture-mapped polygons, like text, also look better). FSAA only works on polygon edges; I've also noticed that stuff like specular highlights can be duller with FSAA.
 

Yes, but higher res does almost nothing to help with those other problems. They're still there and still distracting. At least with AA you can remove one of the major eyesores.

Regards,
SB
 
It all depends on your tastes. Like I say, I prefer higher res since details stand out more. Ideally, of course, we'd like to render scenes with no AA plus super-high DPI (higher than 300 dpi; I forget exactly what the human eye can resolve down to), but that's still years away.

Same scene, 1280x720 @ 4xAA vs 1920x1080 @ no AA:
[attached image: AA.png]
That. :mrgreen: That's why I prefer... the image on the left.
 
Yes, this is also part of the reason I pick higher res (texture-mapped polygons, like text, also look better). FSAA only works on polygon edges; I've also noticed that stuff like specular highlights can be duller with FSAA.
Please don't keep using the term FSAA, it's a marketing term which can mean pretty much anything. The anti-aliasing methods commonly used at the moment are super-sampling, multi-sampling, coverage masks and mipmapping.
 
Yes, but higher res does almost nothing to help with those other problems. They're still there and still distracting. At least with AA you can remove one of the major eyesores.

Regards,
SB

Yes they do, if you have a REEEALLLLLY high resolution with a very large number of pixels packed into each arcminute of your field of view.
If the eye cannot differentiate between 4 pixels and sees them as a single blur, you have natural AA in the works. The human eye also has a certain density of receptors, and detail that exceeds their resolving power WILL effectively be antialiased.

Example: you don't notice jaggies or crawling stair-steps on your television set if you watch it from 50 meters away. Your eye (if it's within the range of normal human eyesight) simply cannot resolve those features.
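A quick back-of-the-envelope sketch of that argument (the panel size, resolution, viewing distances, and the ~1 arcminute acuity figure below are illustrative assumptions, not numbers from this thread):

Code:
import math

# At what viewing distance does one pixel shrink below ~1 arcminute,
# the figure usually quoted for normal (20/20) visual acuity?
# Assumed: a 50-inch 16:9 panel at 1920x1080.
diagonal_m = 50 * 0.0254                          # 50" diagonal in metres
width_m = diagonal_m * 16 / math.hypot(16, 9)     # panel width
pixel_pitch_m = width_m / 1920                    # width of one pixel
acuity_rad = math.radians(1.0 / 60.0)             # ~1 arcminute

for distance_m in (1.0, 2.0, 5.0, 10.0, 50.0):
    pixel_rad = 2 * math.atan(pixel_pitch_m / (2 * distance_m))
    arcmin = math.degrees(pixel_rad) * 60
    verdict = "resolvable" if pixel_rad > acuity_rad else "below acuity (free 'AA')"
    print(f"{distance_m:5.1f} m: one pixel = {arcmin:5.2f} arcmin -> {verdict}")

At roughly 2 m the pixels of that hypothetical set dip under 1 arcminute, and at 50 m they are far below it, which is exactly the "can't see it from across the room" effect described above.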
 
Yes they do, if you have a REEEALLLLLY high resolution with a very large number of pixels packed into each arcminute of your field of view.

Which would be a monumental waste of processing power for no additional benefit over AA. I just played the inFAMOUS demo and my eyes started to bleed. Developers are adding more "stuff" on screen, which just adds a lot more aliased edges, resulting in a creepy, crawly, noisy, undefined mess.
 
Resolution is what I prefer. The image quality is more crisp.

Although I think the overall point is moot. You can have a really powerful GPU with fancy filtering, but the bottleneck is the display. The biggest problem is that most flat-panel technologies are inferior to CRT with regard to fast motion. Video games normally have a lot of fast movement, so to maintain the illusion of liquid-smooth movement the display needs short hold times for each pixel.

LCD (and future OLED) is flawed in that it is an active matrix technology. The pixels will be held for many milliseconds each frame to help achieve a bright image. The longer the hold time, the more the image of each frame blurs on the retina of the human eye.

The flat-panel display technology that has a short hold time for each pixel is SED from Canon. Sadly, though, it doesn't look like it will be mass-produced.
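To put rough numbers on the hold-time blur being described, here is a minimal sketch; the 60 Hz refresh, the ~2 ms pulsed persistence standing in for a CRT/strobed backlight, and the panning speeds are illustrative assumptions:

Code:
# Sample-and-hold motion blur: while the eye tracks a moving object, the image
# smears across the retina by roughly (on-screen velocity) x (pixel hold time).
refresh_hz = 60
full_hold_s = 1.0 / refresh_hz     # ~16.7 ms: LCD/OLED holding the whole frame
pulsed_hold_s = 0.002              # ~2 ms: CRT phosphor / strobed backlight

for speed_px_per_s in (480, 960, 1920):            # panning speed on screen
    for name, hold_s in (("full hold", full_hold_s), ("pulsed", pulsed_hold_s)):
        blur_px = speed_px_per_s * hold_s
        print(f"{speed_px_per_s:5d} px/s, {name:9s}: ~{blur_px:4.1f} px of smear")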
 
A lot of the Japanese console developers kept saying that same thing for a while about mipmapping (which is also anti-aliasing), but in the end so few people agreed that even they had to cut that shit out. Do you turn off mipmapping? Up to 4x MSAA is close to free on modern hardware; the hardware cost necessary for it really isn't very big ... nothing compared to what rendering at 4x the resolution takes. Just use it, be glad for it, and stop being a Luddite :p
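A rough cost comparison behind the "close to free" claim, assuming shading cost scales with shaded fragments and framebuffer traffic with stored samples (this ignores colour/Z compression, which helps MSAA even further; the numbers are illustrative only):

Code:
# 4xMSAA shades once per pixel but stores/depth-tests 4 samples;
# 4x supersampling (or rendering at 4x the pixel count) shades AND stores 4x.
base_w, base_h = 1280, 720
pixels = base_w * base_h

configs = {
    "1280x720, no AA":       {"shaded": pixels,     "samples": pixels},
    "1280x720, 4xMSAA":      {"shaded": pixels,     "samples": pixels * 4},
    "1280x720, 4xSSAA":      {"shaded": pixels * 4, "samples": pixels * 4},
    "2560x1440 (4x pixels)": {"shaded": pixels * 4, "samples": pixels * 4},
}

base = configs["1280x720, no AA"]
for name, c in configs.items():
    print(f"{name:23s} shading x{c['shaded'] / base['shaded']:.0f}, "
          f"framebuffer samples x{c['samples'] / base['samples']:.0f}")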
 

Yup, except on consoles you don't have the option to force AA as you do on PC, so you're still at the mercy of devs.

It would be interesting if next-gen consoles had a forced AA mode. I doubt it will ever happen, so we'll just have to keep hoping that devs implement AA.

So until then, I'll just continue voting with my wallet and not buying any game that doesn't have AA, unless it happens to be an exceptionally good game. Playing them over at friends' places will have to do if they don't.

Regards,
SB
 
Please don't keep using the term FSAA, it's a marketing term which can mean pretty much anything. The anti-aliasing methods commonly used at the moment are super-sampling, multi-sampling, coverage masks and mipmapping.

Agreed. And I'll take 720p with 4xMSAA and 8xAF over 1080p without, any day of the week, for anything other than RTS-style games where resolution is relevant to the viewable game area (not to mention most games will run better at 720p with those settings as well).
 
It would be interesting if next-gen consoles had a forced AA mode. I doubt it will ever happen, so we'll just have to keep hoping that devs implement AA.

You can't just "force" an MSAA mode. I don't understand why people think it's so simple. OK, maybe in a single-pass, forward-lit, LDR engine, but good luck shipping a "next gen" looking game with that.
Please read up on how MSAA works before blaming the lack of MSAA on developers.
Maybe if console manufacturers were to actually provide enough EDRAM and programmable MSAA resolve logic, then we could talk about "no brainer" MSAA.
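To illustrate one piece of that, namely why a fixed-function resolve stops being enough once you render in HDR, here is a toy sketch with plain Python standing in for shader code; the Reinhard operator and the sample values are assumptions for illustration:

Code:
# Averaging linear HDR samples *before* tone mapping lets one very bright
# sample dominate the pixel, so a high-contrast edge stays effectively aliased.
# A programmable resolve can tone map each sample first, then average.
def tonemap(x):
    return x / (1.0 + x)          # simple Reinhard stand-in

# 4 MSAA samples on an edge: 3 dim background samples, 1 very bright specular hit
samples = [0.05, 0.05, 0.05, 50.0]

fixed_resolve = tonemap(sum(samples) / len(samples))            # resolve, then tone map
per_sample = sum(tonemap(s) for s in samples) / len(samples)    # tone map, then resolve

print(f"resolve-then-tonemap: {fixed_resolve:.3f}  (edge pixel ends up nearly white)")
print(f"tonemap-then-resolve: {per_sample:.3f}  (close to the 25% coverage blend you'd want)")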
 
Which would be a monumental waste of processing power for no additional benefit over AA.
The question posed was ridiculously broad and made no mention of theoretical limits. So, given a choice between any amount of AA and any amount of resolution, I'd pick resolution. It is better, as it provides AA at a distance and clarity up close, where low resolution is just a blur.

Now, if the question is presented in a fashion that's technically answerable, with some hardware reference ('given this system and this context'), then we can go into the pros and cons of different IQ techniques, their performance impacts, and the best overall compromises. But that's not what the OP was asking.
 
In my experience, 720p + 4xMSAA (or even 8xCSAA) yields about the same performance as 1080p with no AA in most games. This is with a GTX 260. So in this example you can have either 720p + 4xMSAA or 1080p with no AA. I think the OP was asking, in a situation like this, which do you prefer? For me it's 720p + AA, hands down.
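For reference, the raw pixel and sample counts behind that trade-off (standard 16:9 resolutions assumed):

Code:
# 1080p pushes 2.25x the shaded pixels of 720p, while 720p + 4xMSAA keeps
# shading at 1x per pixel and only multiplies the stored samples.
res_720p = 1280 * 720      #   921,600 pixels
res_1080p = 1920 * 1080    # 2,073,600 pixels

print(f"1080p has {res_1080p / res_720p:.2f}x the pixels of 720p")
print(f"720p with 4xMSAA stores {res_720p * 4 / res_1080p:.2f}x the samples of 1080p")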
 
Please don't keep using the term FSAA, it's a marketing term which can mean pretty much anything
Sorry, you're right, guilty as charged.
I've said it before: this forum needs a glossary.
 
I currently prefer to use SSAA as much as possible in all games, even at the risk of lowering the resolution to 800x600.
I just do not like any surface aliasing; things also tend to look a lot better and more natural with enough samples.

What we really need is pre-filtered geometry and a way to calculate lighting information for an area, not just a single point within the pixel. ;)
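As a toy illustration of what shading an area rather than a point buys you, here is a sketch with an assumed 1-D specular lobe: with one sample per pixel the thin highlight is essentially missed at this alignment (and would shimmer as it moves), while more samples per pixel converge to a stable area average:

Code:
import math

def highlight(x, centre=0.5, width=0.02):
    # a thin, bright specular lobe across a 1-D row of pixels (illustrative)
    return math.exp(-((x - centre) / width) ** 2)

pixels = 8
for subsamples in (1, 4, 16, 64):
    row = []
    for p in range(pixels):
        total = 0.0
        for s in range(subsamples):
            x = (p + (s + 0.5) / subsamples) / pixels   # sub-sample position in [0, 1)
            total += highlight(x)
        row.append(total / subsamples)
    print(f"{subsamples:3d} sample(s)/pixel:", " ".join(f"{v:.2f}" for v in row))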
 
...
Example: you don't notice jaggies or crawling stair-steps on your television set if you watch it from 50 meters away. Your eye (if it's within the range of normal human eyesight) simply cannot resolve those features.

You won't see anything...
 
Many people use fixed-resolution displays like LCDs for their PC monitors, which means they have a native resolution, so I can't really fathom how they would be able to compare different resolutions with any accuracy.


I use a 21" Sony CRT with my PC and when I crank the resolution higher in games, the image quality is incredible. No amount of AA and filtering options at lower resolutions improves image quality to the same degree.
 
LCD (and future OLED) is flawed in that it is an active matrix technology. The pixels will be held for many milliseconds each frame to help achieve a bright image. The longer the hold time, the more the image of each frame blurs on the retina of the human eye.
Please don't confuse the term active matrix with hold-type display. And both LCD backlights and OLED displays can be pulsed (though it may reduce the lifetime of the backlight/display).
 

Yes, both DLP and plasma have hold times even though they aren't active-matrix.

I doubt passive-matrix OLED is happening any time soon. Maybe by 2030. OLED seems to have lots of issues to overcome.
 