Why AA doesn't get the respect it probably deserves

There's no reason why any game should ever be incompatible with AA. Is deferred rendering really needed?

It's a design decision: make a higher-quality deferred renderer, or make a mediocre renderer (with the same amount of resources/time/etc.)? Deferred rendering just simplifies things...
 
First, some 'facts'.

Bloom, HDR, and various blur effects use render-to-texture, and even when the actual rendering is still compatible with MSAA, control-panel AA won't work. Getting control-panel AA to work with games that use render-to-texture targets isn't something that happens automatically (it can be done automagically with the Chuck patch ;-) ). API limitations in Direct3D 9 mean a developer can't automatically get AA when rendering to a texture; it doesn't take that much extra code to make it work, but earlier versions of D3D9 didn't support doing it at all. Older engines may not have been designed to support it, and getting things to work may not be straightforward for developers working with older engines.
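To make the D3D9 point concrete, here's a rough sketch of the extra work involved (not anyone's production code; the function name, sizes, and format are placeholders, error checking is omitted, and a matching multisampled depth-stencil surface would also be needed): render into a separate multisampled surface, then resolve it into the plain texture with StretchRect.

```cpp
#include <d3d9.h>

// Sketch: MSAA + render-to-texture in Direct3D 9. You can't create a
// multisampled texture directly, so render into a multisampled surface
// and resolve it into an ordinary texture afterwards.
// Returns a resolved, non-multisampled texture containing the scene.
IDirect3DTexture9* RenderSceneToTextureWithMSAA(IDirect3DDevice9* device,
                                                UINT width, UINT height)
{
    IDirect3DTexture9* sceneTex   = NULL;  // texture the post-processing will sample
    IDirect3DSurface9* texSurface = NULL;  // its top-level surface
    IDirect3DSurface9* msaaRT     = NULL;  // multisampled render target

    device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                          D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &sceneTex, NULL);
    sceneTex->GetSurfaceLevel(0, &texSurface);

    // Assumes 4x MSAA was reported as available via CheckDeviceMultiSampleType.
    device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                               D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &msaaRT, NULL);

    device->SetRenderTarget(0, msaaRT);
    // ... draw the scene here ...

    // Resolve the samples into the plain texture for the bloom/HDR/blur passes.
    device->StretchRect(msaaRT, NULL, texSurface, NULL, D3DTEXF_NONE);

    msaaRT->Release();
    texSurface->Release();
    return sceneTex;
}
```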

Now some opinion.

The attitude of some people annoys me. The whole 'if I don't like it, no one should have it' attitude just holds everyone back. Next year a card will be released that can play game 'x' with all effects on with AA, no sweat, even if no card today can do it. The industry is always advancing, so I find arguing against something like AA offensive. If your technique is not compatible with AA, fine, I'll excuse you for now; but if your technique does work with AA and you don't want to support it because of the speed hit on current and even previous generation hardware, that's not cool.

If you are so concerned about the artistic integrity of your game and require that it run at ludicrously high speed all the time, then get out of the PC games industry. You have to support everything from the GMA900 to the GeForce 8800 and everything in between. If you are concerned about the 'speed' of the game on an 8800, then the GMA900 is not going to be able to play it at all. If the GMA900 can play it at the right speed, then the 8800 is probably going to play it with 16x FSAA, no sweat. Way to be short-sighted. On the PC platform you're always screwed; if you want constant, reliable speed, go work on a console.

Now comes a stupid situation. By not supporting AA controls in games themselves, developers have forced gamers to use the AA controls in the driver control panels. This in turn prevents developers from disabling AA on devices that are too slow to support it in their title, which causes support problems when gamers turn AA on in the control panel and get slow performance in game. Then of course there's the whole 'no need to support AA controls in game because the user can do it in the control panel' argument. This situation is of course unresolvable, because the driver writers aren't going to remove the AA controls from the drivers, and there are too many devs out there who don't appreciate AA for what it does (it makes things look better!) and don't include AA controls in game.

The control panel AA setting was something that 3dfx and Nvidia needed almost 10 years ago, when there was no API support for selecting FSAA modes to render in. Ten years later and devs still don't get it...
 
I wish game profiles were used more; people seem not to be aware they exist (at least on the Nvidia side). On my crappy GF4 I default to 2x AA, with 'application preference' for some games, 6xS for another (I have to use RivaTuner for that mode), etc.
 
I'm rather stunned that some people are so shallow that they'd point-blank refuse to play a game if there's no chance of using anti-aliasing; I can only imagine that if they played console games before the likes of the XB360, or PC games more than six years ago, they must have sat through the unholy torture of being 'forced' to 'endure' the aliasing. The horror!
Actually, I started out with Pong way back when; I thought the Atari 2600 was amazing at one time too.

But ever since I tried out a 9500 Pro on Unreal 2 and saw usable AA for the first time, I have been bloody hooked; it's just a whole 'nother level of immersion into a game for me.

Jaggies and shimmer remind me I'm playing a game. I got a 360 to use for a few weeks recently, along with "Gears of War" yesterday, and my very first thought when the game fired up was, "F-CK! No AA!"

Seriously, it was.
 
The AA revelation for me was Counter-Strike 1.5 on a Voodoo5: 800x600, 4x AA, LOD -1.5 (lots of numbers ending in 5 in that sentence). Totally usable as well. Really awesome at 1024, but it ran a bit slow, so 800x600 at 60fps did the job.

That seriously was a big step up, similar to the Voodoo1, and no game had felt so immersive since the original Doom (first seen on a 486 DX33 with a sound card, so it rocked!).

I wish the Voodoo5 6000 had been released so I could play CS on it. Of course, if I win the lottery I'll go do some eBaying, but I never play the lottery.
 
Let me preface this with the fact that I'm a huge proponent of generalized anti-aliasing even at the expense of high resolutions. I agree that excessive aliasing is completely unacceptable, and gamers should certainly vote with their wallets on this issue.

HOWEVER, MSAA is not the final solution IMHO. It can certainly help, but only with one type of aliasing, and it makes assumptions that simply do not hold for a lot of styles of rendering.

I admittedly have a personal beef with "unsafe" hardware optimizations and options, because I've been bitten too many times by MSAA, or gamma correction, or "optimized filtering" being applied to situations where it simply isn't correct. For example, "gamma correcting" pre-tone-mapped colours? "Gamma-correcting" depths in a variance shadow map? MSAAing arbitrary data in deferred geometry buffers? Those are all nonsensical.

There's no reason why any game should ever be incompatible with AA.
No offense intended, but that really shows how little you know about modern graphics techniques.

Is deferred rendering really needed?
Yes; it's arguably a better technique than forward rendering on modern video cards and the "smarter" devs have started to switch at least portions of their engines over.

Anyway, I don't want to come off as harsh, but there are *tons* of reasons why MSAA can be inapplicable to games, the most important of which is deferred shading. Does this mean I don't want anti-aliasing? Of course not. It simply means that I don't think MSAA is the final answer, even in D3D10.
 
MSAA + deferred shading should be perfectly fine on D3D10 hardware; I know for a fact that it's doable on some DX9-class hardware too (no tricks, no compromises).
 
MSAA + deferred shading should be perfectly fine on D3D10 hardware; I know for a fact that it's doable on some DX9-class hardware too (no tricks, no compromises).
True, but MSAA + deferred shading is something that the application must do... it cannot simply be "forced on" automatically in the control panel. As an aside, I haven't yet seen an implementation or performance numbers - do you have a link?
 
As an aside, I haven't yet seen an implementation or performance numbers - do you have a link?
Only in my mind at the moment :)
[edit] Theoretically it should be quite fast: supersampling would be applied only to pixels whose fragments don't all belong to a single primitive.
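To illustrate the idea nAo is describing (purely as a CPU-side sketch under my own assumptions - the Sample struct, primitiveID field and Shade() function are invented for the example, and a real implementation would live in shaders): pixels whose samples all come from one primitive get shaded once, and only the edge pixels pay the supersampling cost.

```cpp
#include <vector>

// Hypothetical per-sample G-buffer entry: just enough data for the example.
struct Sample {
    int   primitiveID;   // which triangle the sample came from
    float normal[3];
    float depth;
};

// Placeholder lighting: a single directional light, just so the sketch is complete.
float Shade(const Sample& s)
{
    const float L[3] = { 0.0f, 0.7071f, 0.7071f };
    float nDotL = s.normal[0]*L[0] + s.normal[1]*L[1] + s.normal[2]*L[2];
    return nDotL > 0.0f ? nDotL : 0.0f;
}

// Resolve one pixel of an N-sample deferred buffer.
float ShadePixel(const std::vector<Sample>& samples)
{
    bool singlePrimitive = true;
    for (size_t i = 1; i < samples.size(); ++i)
        if (samples[i].primitiveID != samples[0].primitiveID)
            singlePrimitive = false;

    if (singlePrimitive)                 // interior pixel: shade once, as MSAA would
        return Shade(samples[0]);

    float sum = 0.0f;                    // edge pixel: shade every sample
    for (size_t i = 0; i < samples.size(); ++i)  // (the expensive, supersampled path)
        sum += Shade(samples[i]);
    return sum / float(samples.size());
}
```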
 
The reason I'd prefer having AA control be through the driver is that it ensures you get to use all the available modes of your card, and not just 2x/4x/6x (if you have an ATI card).

I'd love nothing more than to have better AF; I agree with you.

Well, every PC game I've seen that's using a deferred renderer is not even close to high quality and actually looks very mediocre (STALKER, GRAW... are there any others right now?). Doesn't Crysis use forward rendering? I wouldn't call that mediocre.

Andy, until I see something that shows deferred rendering as clearly superior from a product standpoint, I still see no reason why games shouldn't support AA. It's 2007; we've had the ability to do AA alongside these features for some time, and we still have games being released in 2007 that don't support it.

Not sure how I'm being annoying or a broken record, but this is how I feel.
 
The reason I'd prefer having AA control be through the driver is that it ensures you get to use all the available modes of your card, and not just 2x/4x/6x (if you have an ATI card).
The available modes can be queried through the API, so there's no real reason (except laziness, as you note) that games shouldn't support all of the modes via the in-game controls.
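As a minimal sketch of what that query looks like in D3D9 (the back-buffer format and windowed flag here are just assumptions for the example):

```cpp
#include <d3d9.h>
#include <cstdio>

// Sketch: enumerate the MSAA modes the installed card supports for a given
// surface format, which is roughly what an in-game AA dropdown needs.
void ListMultisampleModes(IDirect3D9* d3d, UINT adapter)
{
    for (int samples = 2; samples <= 16; ++samples) {
        DWORD qualityLevels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            adapter, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8, TRUE /* windowed */,
            (D3DMULTISAMPLE_TYPE)samples, &qualityLevels);
        if (SUCCEEDED(hr) && qualityLevels > 0)
            std::printf("%dx MSAA supported (%lu quality level(s))\n",
                        samples, (unsigned long)qualityLevels);
    }
}
```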

Well, every PC game I've seen that's using a deferred renderer is not even close to high quality and actually looks very mediocre (STALKER, GRAW... are there any others right now?).
There aren't many right now due to reluctance to target fairly high end hardware. That will change. I also wouldn't judge the "quality" based on current games... what deferred shading can do is allow many more light sources and complex lighting environments without destroying the frame rate.

Andy, until I see something that shows deferred rendering as clearly superior from a product standpoint, I still see no reason why games shouldn't support AA.
There are other cases in which MSAA doesn't work; deferred shading is just one example. Note that several UE3-based games do not support AA because of doing semi-deferred shadowing.
 
I wasn't aware the software could just query the API to display all the modes the card supports, so if devs do this it's fine then. I thought the only modes available were the ones the devs specifically added.

If devs aren't going to target high-end hardware anyway, isn't the use of deferred rendering wasted? And we just miss out on AA for nothing.
 
Th available modes can be queried through the API, so there's no real reason (except laziness as you note) that games shouldn't support all of the modes via the ingame controls.

For Direct3D, Nvidia doesn't report/support all modes through the API. Even the modes that are reported need some additional "magic" code to translate the API results into human-readable names.

There are other cases in which MSAA doesn't work; deferred shading is just one example. Note that several UE3-based games do not support AA because of doing semi-deferred shadowing.

That was one of the reasons why I voted against using deferred shading algorithms in our engine. At the same time I voted for including at least 2xAA in all our performance requirements.
 
The AA debate began in the Voodoo 5 / Geforce 2 era. Back then, the typical playable AA resolution was 1024 x 768 on a 17" CRT.

I think it should be realised that AA is not as desperately needed as it was back then. The pixel density of today's good LCD panels is about three times that of a 17" CRT at 1024x768. Jaggies are much smaller now - not the fist-sized chunky bits that were around in that era.

If pixel density continues to go up (and I don't see why not), then aliasing artifacts will become so small as to be essentially a non-issue.
 
Actually, for most people pixel density is much, much LOWER now than in the 3dfx V5 era.

Back then, typical gaming displays were 15" and 17" monitors, with some people affording the luxury of a 19" or 21" monitor.

Translated to LCD sizes, those would be roughly 14", 16", 18" and 20" respectively. I can't think of many games that I played at less than 1024 or 1280 res at the time. That makes only the 21" monitors, and in some cases the 19" monitors, higher in pixel density when gaming if you compare today's resolutions of 1600 or 1920. However, 1920x1200 on a 24" monitor still has similar or worse pixel density than the more common 1024 or 1280 on a 15" or 17" monitor back in the day.
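For what it's worth, the back-of-the-envelope numbers (the viewable CRT diagonals are my own rough assumption) come out something like this:

```cpp
#include <cmath>
#include <cstdio>

// Pixels per inch for a given resolution and (viewable) diagonal size.
double ppi(int w, int h, double diagonalInches)
{
    return std::sqrt(double(w) * w + double(h) * h) / diagonalInches;
}

int main()
{
    std::printf("1024x768  on ~14\" viewable 15\" CRT: %.0f ppi\n", ppi(1024, 768, 14.0));   // ~91
    std::printf("1280x1024 on ~16\" viewable 17\" CRT: %.0f ppi\n", ppi(1280, 1024, 16.0));  // ~102
    std::printf("1920x1200 on a 24\" LCD:             %.0f ppi\n",  ppi(1920, 1200, 24.0));  // ~94
    return 0;
}
```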

Add in the sharpness of LCDs compared to the vast majority of mainstream CRTs, and the need for AA is even greater today than it was back in the V5 era.

And to whoever noted that we must have been screaming about AA on earlier-gen consoles: yes, it was unbearable. While I owned a PlayStation and a PlayStation 2, I never played games on either after the first month of owning them. I played more games on my old Atari 2600 than I ever did on either PlayStation. And while I owned an Xbox, it was used more as a media center device than as a gaming device.

Great, deferred rendering can allow for more effects. But if it does not allow for some form of AA, then it's not worth 5 dollars to me. Hell, I wouldn't even play it for free.

If a game cannot do the most basic image quality feature of AA, then, irrational as it may be, it's a backwards-thinking game, IMO.

Regards,
SB

PS - /cheer Demirug for voting to at least test performance with 2xAA in whatever engine he's working on.
 
For Direct3D, Nvidia doesn't report/support all modes through the API. Even the modes that are reported need some additional "magic" code to translate the API results into human-readable names.
There is certainly some "magic" for the CSAA mode names, but the N-sample MSAA modes (and "qualities") are reported normally. For example, Source-based games seem to provide a good list of AA modes.

That was one of the reasons why I voted against using deferred shading algorithms in our engine. At the same time I voted for including at least 2xAA in all our performance requirements.
MSAA just isn't important enough to me to drop something as algorithmically important as deferred rendering. There are other good ways to do AA on modern hardware that cover a lot more cases than just edge aliasing. That said, it will certainly depend on the application being developed, so I can understand people making different choices. However, IMHO people have no right to demand MSAA support and trash games/code/algorithms that do not support it. MSAA is really the "hack" if you get right down to it, one that works in a few common - but becoming less common - rendering pipelines.

And of course you can do "some form of" AA with deferred rendering. I'm personally a proponent of super-sampling both the spatial and temporal domains... hardware is getting fast enough and the resulting quality is unsurpassed. As mentioned by nAo, even MSAA can be made to work in D3D10 though.
 
Aye, from an end user's perspective it doesn't matter "much" whether it's MSAA or SSAA or any other form of AA that removes the jagged edges that crawl around, distracting them from the visuals a developer is trying to present.

As long as it supports some form of AA that removes that distraction then that's a good thing.

For example, plain MSAA is "good enough" since it removes edge aliasing. However, it doesn't do much about some of the other jaggies that come from alpha textures or shaders, which also drive me batty.

But having at least some form of AA unconsciously makes me think the industry isn't moving backwards.

And with the advent of HDR and high-contrast scenes, jagged edges become even more pronounced and irritating.

If I'm running around in a world created by someone, the last thing I want is to be thinking to myself, "Oooh, neat water. Ugh, jaggies. Oooooh, I like the way the ambient light works. Ugh, more jaggies there. Oooooh, look at the beautiful sky. OMG, the jaggies on those mountains are horrible. Wow, neat sword, and I love the particle effects on it. Ugh, but it's so jaggy, and the jaggies crawl as the idle animation of the guy moves it around."

Right there would be the point where I've seen enough and I put the game away never to be played again.

Yes, I realize that developers want to include faster yet more complex (simpler) ways of rendering. However, I'm someone who is then expected to pay to play that game; if I spend my time being inevitably drawn to the most distracting things, has it really paid off? I know for quite a few people it wouldn't matter much - just look around this forum. But for some people, like myself, it's a distraction we'd rather not have to deal with in order to enjoy the otherwise great visuals someone may have created.

After all, isn't "most" of the purpose of great 3D visuals to draw a person into a scene and hopefully make them forget it's artificial, at least for a little while? And what does the most to destroy that illusion? Jaggies, crawling jaggies, texture shimmering, shader aliasing and shimmering, etc.

A developer could make the most fabulously realistic and graphically advanced engine in the world. But if the distractions end up outweighing the otherwise wonderful visuals, then where do we end up?

Regards,
SB
 
There is certainly some "magic" for the CSAA mode names, but the N-sample MSAA modes (and "qualities") are reported normally. For example, Source-based games seem to provide a good list of AA modes.

Yes, but there are some more modes (with super sampling) that are not reported.

MSAA just isn't important enough to me to drop something as algorithmically important as deferred rendering. There are other good ways to do AA on modern hardware that cover a lot more cases than just edge aliasing. That said, it will certainly depend on the application being developed, so I can understand people making different choices. However, IMHO people have no right to demand MSAA support and trash games/code/algorithms that do not support it. MSAA is really the "hack" if you get right down to it, one that works in a few common - but becoming less common - rendering pipelines.

Yes, MSAA is a hack, but every other AA method is too, more or less. Maybe the A-buffer is the only real AA solution.

Customers may have no right to demand MSAA support, but they have the right to know when it is not supported, and can then decide not to buy.

Anyway, MSAA was not the only reason against deferred shading in our case, but it was a point on the list, as there is a high chance that we'd get some nasty edge aliasing.

And of course you can do "some form of" AA with deferred rendering. I'm personally a proponent of super-sampling both the spatial and temporal domains... hardware is getting fast enough and the resulting quality is unsurpassed. As mentioned by nAo, even MSAA can be made to work in D3D10 though.

I would be happy if I could force a D3D10 card with enough power for supersampling as the minimum requirement. But that is not even an option, and we have to live with much less.
 
Yes, but there are some more modes (with super sampling) that are not reported.
True, but you can do supersampling yourself for the same performance hit and it'll work on all hardware.
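As a rough illustration of what "doing it yourself" means (a CPU-side sketch of just the resolve step, assuming an ordered 2x2 grid; in practice the render target would simply be oversized and the filtering done on the GPU):

```cpp
#include <vector>

// Sketch: box-filter a scene rendered at twice the width and height (stored
// here as greyscale floats) down to the display resolution - i.e. 2x2
// ordered-grid supersampling resolved in software.
std::vector<float> ResolveSupersampled2x2(const std::vector<float>& hi,
                                          int outWidth, int outHeight)
{
    std::vector<float> out(outWidth * outHeight);
    const int hiWidth = outWidth * 2;
    for (int y = 0; y < outHeight; ++y) {
        for (int x = 0; x < outWidth; ++x) {
            const int sx = x * 2, sy = y * 2;
            out[y * outWidth + x] = 0.25f *
                (hi[sy * hiWidth + sx]       + hi[sy * hiWidth + sx + 1] +
                 hi[(sy + 1) * hiWidth + sx] + hi[(sy + 1) * hiWidth + sx + 1]);
        }
    }
    return out;
}
```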

Yes, MSAA is a hack, but every other AA method is too, more or less. Maybe the A-buffer is the only real AA solution.
Granted, but I consider super-sampling a lot more robust than multisampling.

Customers may have no right to demand MSAA support, but they have the right to know when it is not supported, and can then decide not to buy.
Agreed, and people should 100% vote with their wallets.
 
Actually, for most people pixel density is much, much LOWER now than in the 3dfx V5 era.

Well, I was inspired by this thread to see the impact of AA and AF at 1920x1200 on a 17" screen. The test scene was a frame from GT1 in 3DMark06.

AF had a much, much greater impact, as expected. However, the jaggies were still noticeable, and 4xAA did a good job of cleaning up the screen. So even at very high pixel densities, AA can still make a considerable improvement.
 