Why AA doesn't get the respect it probably deserves

I know this is slightly OT, but the content of this thread moves me to comment. I mourn the days of the GAMEPLAY being the immersive element of a game. Of the days when the graphics may have been crude, but they conveyed enough that it allowed the pure enjoyment inherent in the game to outweigh what they lacked.

Reading comments of people saying that if a game doesn't have support for AA, they won't play it, really just brings home to me that gaming no longer seems to be as much about playing games, as it is about watching them.

I guess it just makes me sad that devs seem to be so much less willing to take a risk on a game that brings something new or revolutionary in terms of gameplay elements, but they'll spend spend spend on pretty shiny things... What makes me sadder is that I see gamers as the cause of this. More and more I find myself impressed by the graphics in a new game - and then completely underwhelmed by the game itself.

To get a little bit more back on topic, I'll admit that AA is the first thing to go if I'm in search of more frames. AF is next, and then I'll drop resolution. However, if I do drop the res, I will then try the highest details I can run at that res, including AA.

Usually tho, I have to say I wouldn't drop the res just for the purpose of running with AA. I can't say that I'm overly distracted by jaggies when I'm playing a game. Maybe because I'm playing games where I need to concentrate on what I'm doing, not how pretty the scenery is...
 
Reading comments of people saying that if a game doesn't have support for AA, they won't play it, really just brings home to me that gaming no longer seems to be as much about playing games, as it is about watching them.

Well on one hand I do agree that many people who are into 3D hardware are not really into games per se. It's just the thrill of the technology and new toys that draw them in.

But you have to draw the line somewhere. We've definitely come to expect more sophisticated visuals from our games regardless of the quality of gameplay. So there is some minimum standard that has to be met for a given genre. Visuals are an integral aspect of gameplay as well.
 
I guess it just makes me sad that devs seem to be so much less willing to take a risk on a game that brings something new or revolutionary in terms of gameplay elements, but they'll spend spend spend on pretty shiny things... What makes me sadder is that I see gamers as the cause of this.
Ok how did we get from devs being pussies to it being the gamer's fault? :???:
 
Dig - yeah, I guess I could've explained my reasoning on that a bit better... I guess my point to an extent was bemoaning what I think of as the "magpie effect" that gamers (I'm including myself here, just in case you're wondering) go thru on release, or discussion of a new game.

How often do you hear people talking about how the game plays, or being excited about what new gameplay features the game might bring? Maybe I'm not reading the right forums, but to me, for PC games, it's basically never. All that gets discussed is how pretty the game is going to be. Or how many shiny things there will be for us magpies. How the new engine is revolutionary, not for what it lets you do in the game, but for the way it renders shadows in real time, or whatever.

With people focusing on these things, it's easier to understand a dev's reluctance to try something innovative that may not have top-of-the-market graphics, but might have something truly revolutionary in terms of gameplay.

The biggest exception to this in recent times has been the Wii, and I guess we're still waiting to see how that pans out in the long term...

If I'm still not making sense, I'm sorry - I'm sposed to be working right now...
 
icecold and trini, I'm not disagreeing in any way with your comments either, well, with the possible exception of graphics being equally as important as gameplay. Sure, graphics are important, and as the tech develops there is no reason we shouldn't expect better looking games. But I'd be happy to sacrifice "new visual shiny things" for an "amazing gameplay experience".

I guess my point was more that, while I do agree that graphics and IQ are important, it's barely logical, in my mind, to write off a game because it's not pretty enough for you... I can understand that this is something of a subjective issue tho.

I still have fond memories of early games. I wish someone would release a game that is as absorbing to me now, as Deathtrack, Stunts and Doom2 were back in the day.
 
The AA debate began in the Voodoo 5 / Geforce 2 era. Back then, the typical playable AA resolution was 1024 x 768 on a 17" CRT.

I think it should be realised that AA is not as desperately needed as it was back then. The pixel density of today's good LCD panels is about 3 times that of a 17" CRT at 1024 x 768. Jaggies are much smaller now, not the fist-sized chunky bits that were around in that era.

If pixel density continues to go up (and I don't see why not), then aliasing artifacts will become so small as to be essentially a non-issue.

Ironically, if I go as high as 2048*1536 on the current 21" CRT, I might reach ~130dpi on one axis and get some form of oversampling (due to stretched content) on the other, yet it doesn't cure a number of cases like shader or alpha test aliasing, amongst others.
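
For reference, here's a rough back-of-the-envelope check of those dpi figures (a minimal Python sketch, assuming 4:3 geometry and roughly 16" / 20" of viewable diagonal for a 17" / 21" CRT; those viewable sizes are my own guesses):

```python
import math

def dpi(h_res, v_res, viewable_diag_in, aspect=(4, 3)):
    """Approximate horizontal and vertical dpi for a given mode and viewable diagonal."""
    aw, ah = aspect
    diag_units = math.hypot(aw, ah)                # 5 units for 4:3
    width_in = viewable_diag_in * aw / diag_units
    height_in = viewable_diag_in * ah / diag_units
    return h_res / width_in, v_res / height_in

print(dpi(1024, 768, 16.0))   # 17" CRT era: roughly (80, 80) dpi
print(dpi(2048, 1536, 20.0))  # 21" CRT at 2048*1536: roughly (128, 128) dpi
```

Which lines up with the ~130dpi figure, and shows why jaggies at 1024 x 768 on a 17" were so much chunkier.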

If you really want to get rid of dancing meanders across the screen, there's no escape for me other than to antialias. I'm not aware of what kind of resolutions large 3D animation studios render at, yet despite the fact that movies are a totally different story, they still use sophisticated antialiasing and temporal antialiasing algorithms instead of insanely high resolutions.
 
True, but you can do supersampling yourself for the same performance hit and it'll work on all hardware.

Do you mean supersampling in the form of selective shader AA? By all means yes to that and where's that bow down smiley again?

As for supersampling being a more robust solution I don't disagree, but if we're talking about full scene SS I don't think I could afford the fillrate hit to target an acceptable resolution. (I bought the 21" CRT on purpose a couple of years ago and hope for it to last until better display technology than LCD arrives. I don't want to drag this into another senseless CRT vs. LCD debate, but for anything that moves I cannot stand LCDs. For office use they're outstanding though.) Given the former parenthesis I have to target at least 1280*960 and no less if I am to use full scene SSAA. Now considering that I'd be happy to reach that resolution in coming games with even just 4xMSAA, I have severe doubts that there's any headroom left for added fillrate/SSAA scenarios.

To widen that one a bit, most of today's users have LCDs; lowering resolution there is even less optimal than in my purely subjective case.

I've absolutely nothing against supersampling, I'd be a moron not to recognize its benefits, but the overall cost of using it full screen is way too high to consider even for today's games. The ideal solution in my mind is a combination of MSAA+AF and selective/adaptive SSAA wherever MSAA fails. If the gap between a usable resolution with MSAA and a usable resolution with SSAA is too large (something that's pretty easy to play around with on a CRT), then supersampling can in no way replace the ultimately high dpi value of the much higher resolution.
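
To put very rough numbers on that fillrate argument (a hedged sketch that only counts shaded pixels, ignoring bandwidth, shader cost and the fact that MSAA isn't entirely free either):

```python
def pixels_shaded(width, height, ss_factor=1):
    """Pixels shaded per frame: full-scene supersampling multiplies the count,
    while multisampling (to a first approximation) does not."""
    return width * height * ss_factor

base = pixels_shaded(1280, 960)            # 1,228,800 shaded pixels
full_4x_ss = pixels_shaded(1280, 960, 4)   # 4,915,200 shaded pixels

# 4x full-scene SSAA at 1280*960 shades as many pixels as rendering at 2560*1920
print(full_4x_ss == pixels_shaded(2560, 1920))   # True
```

So if 4xMSAA at 1280*960 is already borderline, full-scene SSAA at that resolution is roughly the shading load of a 2560*1920 frame.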
 
The attitude of some people annoys me. The whole 'If I don't like it, no one should have it' attitude just holds everyone back.
I think this works both ways, depending on your stance.

The industry is always advancing so I find arguing against something like AA offensive.
I don't think any developer I know would tell me "AA does not improve IQ" or, perhaps more importantly, "AA has very little positive impact on IQ". No one is arguing about the positive impact AA has on IQ.

If your technique is not compatible with AA, fine, I'll excuse you for now, but if your technique does work with AA and you don't want to support it because of the speed hit on current and even previous generation hardware, then that's not cool.
I don't think this is the real problem with developers. I am more inclined to think this is a support issue they and their publishers don't want to wish upon themselves.

This may seem crazy but I don't find it unbelievable when a few of them tell me not a lot of folks RTFM (which usually has an entry that says "Disable AA" when the question has to do with slow performance), which means, generally speaking and excluding folks here, not a lot of folks understand how expensive AA is. <shrug>

If you want constant reliable speed, go work on a console.
This really doesn't help solve anything, Ryan. All you're really saying is, if PC games developers are way too concerned about the negative performance impact of AA they may support in their games, then they should not be making PC games. This would be a loss to everyone.

I repeat, I don't think the cost of AA is the main reason why AA isn't as widely supported by PC game developers.

Control panel AA settings were something that 3dfx and Nvidia needed almost 10 years ago when there was no API support for selecting FSAA modes to render in. 10 years later and devs still don't get it...
I think they get it. But if we still have many members of the public still preferring ever higher resolutions to any kind of AA and we do not object to such a preference, then perhaps we should consider not objecting to developers having the same preference...?
 
Dig - yeah, I guess I could've explained my reasoning on that a bit better... I guess my point to an extent was bemoaning what I think of as the "magpie effect" that gamers (I'm including myself here, just in case you're wondering) go thru on release, or discussion of a new game.

How often do you hear people talking about how the game plays, or being excited about what new gameplay features the game might bring? Maybe I'm not reading the right forums, but to me, for PC games, it's basically never. All that gets discussed is how pretty the game is going to be. Or how many shiny things there will be for us magpies. How the new engine is revolutionary, not for what it lets you do in the game, but for the way it renders shadows in real time, or whatever.

Well, the depth and quality of the game play and the depth and quality of the graphics in a game are two different subjects, aren't they? A site like B3d has traditionally focused on the subject of 3d-rendering. Still, there's a "games" forum within the greater B3d forums for you to visit if you grow weary of reading all the minutiae about rendering and you just want to talk about the games themselves. You shouldn't expect a forum dedicated to opining about 3d rendering to delve very much into whether or not a particular game is any good, as such expectations lead directly into the misapprehensions you've expressed here.

It's a bit like wading through the engineering manuals for a Gulfstream corporate jet when what you really want to read about is whether or not "Coffee, Tea, or Me?" is something you might realistically expect to hear from a stewardess when taking a commercial flight...;)
 
Wow, talk about timely and relevant articles.....
http://www.anandtech.com/video/showdoc.aspx?i=2947
Wow, seriously... maybe they read the thread ;)

Regarding SSAA, I'm not so convinced that it's unreasonable overall. With some games hitting several hundred frames per second, why not use those (otherwise wasted) frames on super-sampling? Furthermore as polygon sizes get smaller, the current rasterization method only remains efficient as long as triangles are a certain minimum size, which can be accomplished nicely by uniform grid supersampling.
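
For what it's worth, ordered-grid supersampling is conceptually trivial; here's a minimal numpy sketch of the idea (toy_render is just a hypothetical stand-in for a real renderer):

```python
import numpy as np

def render_supersampled(render_fn, width, height, grid=2):
    """Uniform-grid supersampling: render at grid x grid times the target
    resolution, then box-filter each grid x grid block down to one pixel."""
    hi = render_fn(width * grid, height * grid)       # (H*grid, W*grid, 3) image
    hi = hi.reshape(height, grid, width, grid, 3)
    return hi.mean(axis=(1, 3))                       # averaged down to (H, W, 3)

def toy_render(w, h):
    """Toy 'renderer': a hard-edged diagonal that aliases badly at 1x."""
    y, x = np.mgrid[0:h, 0:w]
    edge = (x / w + y / h) < 1.0
    return np.repeat(edge[..., None], 3, axis=-1).astype(np.float32)

aliased = toy_render(320, 240)
smoothed = render_supersampled(toy_render, 320, 240, grid=4)   # those "wasted" frames at work
```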

Still, I'm the happiest with jittered super-sampling - even stuff like quincunx. With increasingly complex and aliased shaders, and techniques like raytracing for secondary rays coming down the pipes, more or less everything will benefit from some super-sampling. I've had some good tech demo experiences with jittered SS and I'm dying to try sampling the time domain as well to get some free motion blur! :)
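
Here's a small sketch of what I mean by jittered (stratified) sub-pixel samples, with a per-sample time value thrown in for the free-motion-blur idea; the function and parameter names are just illustrative:

```python
import random

def jittered_samples(n_x, n_y, shutter_time=1.0, seed=0):
    """Stratified ('jittered') sub-pixel samples: one random offset inside each
    cell of an n_x by n_y grid, plus a random time within the shutter interval
    so the same samples can double as motion-blur samples."""
    rng = random.Random(seed)
    samples = []
    for j in range(n_y):
        for i in range(n_x):
            x = (i + rng.random()) / n_x      # sub-pixel x offset in [0, 1)
            y = (j + rng.random()) / n_y      # sub-pixel y offset in [0, 1)
            t = rng.random() * shutter_time   # time at which this sample is taken
            samples.append((x, y, t))
    return samples

# e.g. 2x2 stratified samples per pixel, spread across a 1/60 s shutter
for x, y, t in jittered_samples(2, 2, shutter_time=1 / 60):
    print(f"offset=({x:.3f}, {y:.3f})  t={t * 1000:.2f} ms")
```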
 
I don't buy into the temporal antialiasing, except for people on those pesky 60Hz monitors :D.
is your jittered SS equivalent to RGSS?

/edit : except, not expect, damn.
 
Don't want to sound like a broken record, but for the foreseeable future (at least until we get some real hw supersampling support..) dx10 + deferred rendering + AA makes a lot of sense. One might even resolve all the fragments belonging to a pixel computing lighting terms at different frequencies: e.g. a per-pixel diffuse + ambient lighting term and a per-fragment specular term, and so on..
We don't even need to go fully deferred: the shadows/occlusion term can be easily deferred, but we could also decide to defer only part of the lighting computations, not all of them.
[edit] to make it clearer: I'm talking about those pixels whose fragments all belong to a single primitive; otherwise we need to supersample those pixels to avoid artifacts
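
To illustrate that edit, here's a crude, hedged sketch of such a resolve pass (in a real engine this would of course be a shader; light_fn and the G-buffer layout are purely hypothetical):

```python
def resolve_deferred_msaa(gbuffer, light_fn):
    """Resolve for a deferred renderer with multisampled G-buffers.
    gbuffer[y][x] is a list of per-sample records, each carrying a primitive id
    plus whatever attributes light_fn needs (normal, albedo, depth, ...).
    Pixels whose samples all belong to one primitive are lit once; 'edge'
    pixels spanning several primitives are lit per sample and averaged."""
    image = []
    for row in gbuffer:
        out_row = []
        for samples in row:
            prim_ids = {s["prim_id"] for s in samples}
            if len(prim_ids) == 1:
                # interior pixel: one lighting evaluation is enough
                out_row.append(light_fn(samples[0]))
            else:
                # edge pixel: supersample the lighting to avoid artifacts
                lit = [light_fn(s) for s in samples]
                out_row.append(sum(lit) / len(lit))
        image.append(out_row)
    return image
```

One could also split light_fn into per-pixel and per-sample terms (diffuse+ambient once per pixel, specular per sample) along the lines described above.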
 
Regarding SSAA, I'm not so convinced that it's unreasonable overall. With some games hitting several hundred frames per second, why not use those (otherwise wasted) frames on super-sampling? Furthermore as polygon sizes get smaller, the current rasterization method only remains efficient as long as triangles are a certain minimum size, which can be accomplished nicely by uniform grid supersampling.

You'd have to go back way further than, say, UT2k4, to something extremely CPU bound with a resolution threshold no higher than 1280*something. Unfortunately I don't have any SS available anymore to test with on the G80, but the only game that was fully playable on the former 7800GTX with 16xS and 16xAF at its maximum available resolution (1280*960) was Unreal1.

Still, I'm the happiest with jittered super-sampling - even stuff like quincunx. With increasingly complex and aliased shaders, and techniques like raytracing for secondary rays coming down the pipes, more or less everything will benefit from some super-sampling. I've had some good tech demo experiences with jittered SS and I'm dying to try sampling the time domain as well to get some free motion blur! :)

If you'd promise me that I wouldn't have to drop too much in resolution I'd be all game. Somehow I'm not convinced any of it is feasible in upcoming games. I'll be happy if I'm able to play something like Crysis with 4xMSAA at 1280.

It's not the quality of supersampling I'm questioning here, rather its usability, due to its inevitable fillrate penalty.
 
That's nice and all Ailuros, but it misses the point that we don't even have the option to enable or disable a "quality" mode of AA that would involve some form of SSAA.

If you noticed, I mentioned previously that it would be a feature most likely used for past games and for current and future games that don't go "balls to the wall" on 3D bells and whistles.

For games such as those there would always be the fallback option of enabling a "performance" mode of AA in the control panel that would involve some form of MSAA and Transparency AA.

Speaking for myself I would gladly lower resolution and texture quality in order to experience the wonderful goodness that was 3dfx's RGSS AA. And if that was unplayable even at lower resolutions/texture details/etc, then I still have the option of trying MSAA.

And if RGSS AA proved too slow and MSAA didn't work with said game engine, I could then put it away and decide not to play it until faster hardware came out that could effectively use SSAA on said game.

Whereas currently if MSAA doesn't work with said game engine, it just goes in the trash for me. Never to be played.

I think I and most proponents of SSAA would argue that it's most likely to be used with older games and current and future games that aren't bleeding edge. And we're fine with that.

Regards,
SB
 
A very brief off-topic reply to graphics vs. gameplay.

I never said I wouldn't play a game that didn't have cutting edge 3D bells and whistles. However, if said game did not allow me to enable any form of AA or AF whatsoever, then I would most certainly not play that game.

Considering it's "generally" (meaning not always) only cutting edge games currently that will not work with current forms of AA, I don't quite fit your generalization. Thus it's usually those games considered to be at the cutting edge of "pretty" graphics that I will not play regardless of whether the gameplay is good or not. Games with "bad" or "outdated" graphics are just fine as long as I can enable AA and AF.

Regards,
SB
 
AA is the last thing on my list to enable, after everything else. Mind you, I use it a lot more than I used to, but in my pre-7900 GTX days I didn't use it because I preferred higher resolution, AF and detail levels first, with playable framerates. Now that I have an 8800 GTS, I can use AA all of the time without really worrying too much about a performance hit. I do like high amounts of AA, but for me it is last on the list.

I played Oblivion to death last year on a 7900 GTX and still have not fired it up under my 8800 GTS to see it in action with both HDR and AA. The absence of AA didn't hamper my enjoyment of it at all; I'm totally burned out on the game at present because I played it so much last year.

Shadow quality, texture quality, HDR and all of the other nifty innovations will always come first for me, then high resolutions (though this has ceased to be a problem - the only games I do not play in native resolution are ones that do not go that high) and then I will apply the highest level of AA that I can while achieving acceptable framerates.

AA is last on the list, highest detail level and features first.
 