Why AA doesn't get the respect it probably deserves

And again I bring up Oblivion, TD Unlimited, Far Cry, etc.
When did these games start development (note: I'm not talking about their release to the public)? When did we get hardware that made HDR + AA possible?

I don't know your posting history. Can someone enlighten me?
 
I see AA as a feature for use in older titles, really. I get really nostalgic about gaming since I don't play often or play many games at all, so it's always nice to get some new hardware and then enable the highest AA and AF modes for an older game that I really like (e.g. Max Payne 2 or Doom 3 at this point).

If a game cannot use AA due to some graphical feature, I won't mind. I think I'd notice the lack of something like HDRR and tone mapping more than the lack of AA during gameplay anyway. Edge aliasing is something I only really notice if I'm not moving around. Sure, there's a difference with MSAA, but I get used to it when it's not there.

As for the performance deficit, I don't mind, depending on the game itself. I have a pretty high tolerance for low framerates (in the 20s), and I didn't mind playing a game such as F.E.A.R. or Quake IV when the performance tanked in certain spots. For games that require more responsiveness and reaction speed, I'll disable AA. Again, I see it as a luxury more than an absolute requirement.

I take a somewhat different stance with consoles and large HDTVs, however. The relatively large pixel sizes make the aliasing too noticeable at times. It's not like I'd toss the game if it didn't have it; if AA weren't enabled, I'd hope the devs would implement something to make the aliasing less noticeable (DOF effects, subtle bloom, an adjusted colour palette, motion blur, or something).

It was interesting to find out that Lost Planet dynamically adjusted AA levels depending on framerate. I mean, if the game is mostly at a decent framerate with AA, then I don't mind it losing the AA during the heavy rendering scenes because it seems likely to me that there'd be so much going on that I wouldn't care about or notice the aliasing.
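For what it's worth, that kind of scheme is simple to sketch: measure the frame time and step the MSAA level down when the game falls under its target, back up when there is headroom. Here is a minimal, hypothetical C++ sketch - the SetMsaaSamples() hook and the thresholds are made up for illustration, not anything from Lost Planet's actual code:

    #include <algorithm>
    #include <cstdio>

    // Hypothetical engine hook - stands in for whatever Lost Planet really does.
    static void SetMsaaSamples(int samples) { std::printf("MSAA -> %dx\n", samples); }

    static const int   kAaLevels[] = { 1, 2, 4, 8 };  // worst to best
    static int         g_level     = 3;               // start at 8x
    static const float kTargetMs   = 33.3f;           // ~30 fps frame budget

    // Call once per frame with the previous frame's time in milliseconds.
    // Drops one AA level when clearly over budget, raises it when there is headroom;
    // the dead zone between the thresholds avoids flip-flopping every frame.
    void UpdateAdaptiveAa(float frameMs)
    {
        int newLevel = g_level;
        if (frameMs > kTargetMs * 1.15f)
            newLevel = std::max(g_level - 1, 0);
        else if (frameMs < kTargetMs * 0.75f)
            newLevel = std::min(g_level + 1, 3);

        if (newLevel != g_level)   // switching MSAA usually means recreating render
        {                          // targets, so only do it when the level changes
            g_level = newLevel;
            SetMsaaSamples(kAaLevels[g_level]);
        }
    }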
 
They can control it. Some games support AA, but only the AA selectable through the in-game menu; control panel AA has no effect.
And that's the way it should be. BECAUSE AA IS A FEATURE DEVELOPERS CANNOT CONTROL WHEN IT COMES TO MILLIONS AND MILLIONS OF MACHINES.

Sorry for the caps. In other words, I'm looking at this from the POV of developers, and all I can say is that the purpose and appeal of things like control panel AA is to sell more video cards.

There is a reason why we have DirectX as a standard.

TrackMania United, for example. The problem with this is that you only get AA on the edges the developers choose, which sucks. I want to enable FULL SCENE AA as I deem necessary.
What do you mean by "FULL SCENE aa"?

Look, I know what you mean, but I don't think you know what it means to release a game to billions of folks on the planet with possibly no more than 50 support staff handling enquiries on one game title. You are greedy, you want it all, and so do I. But the reality is we won't get it, dude :)
 
You've edited your post.

IMO devs should just design their games so that control panel AA works.
So what you're really saying is that you agree with what my dev friend recommended: a sticker on game jewel cases saying "Don't fucking use AA".

Why aren't all these different "control panel AA" options by various IHVs available as in-game menu AA options? Because developers don't "design their games" with this in mind? Do you know how ridiculous this sounds?!

Sorry, I don't mean to sound aggressive but to say that we aren't exactly looking at the same big picture would be an understatement.
 
Mods, I never like recommending moving a thread I started here to another forum but, again, things haven't worked out the way I wished. 3D Graphics Companies/Industries/etc may be a better forum....
 
Why aren't all these different "control panel AA" options by various IHVs available as in-game menu AA options? Because developers don't "design their games" with this in mind? Do you know how ridiculous this sounds?!
Um, actually, that's pretty much exactly why. They could if they chose to; they just don't.

I understand that you feel it's not worth their time/effort/cost looking at it from the dev's perspective, but looking at it from the gamer's perspective I think that attitude sort of sucks.

I don't play games that I can't use AA in, I just don't. It annoys me too much.
 
I'm rather stunned that some people are so shallow that they'd point-blank refuse to play a game if there's no chance of using anti-aliasing; I can only imagine that if they played console games before the likes of the XB360 or PC games over 6 years ago, they must have sat through the unholy torture of being 'forced' to 'endure' the aliasing. The horror!
 
Note - passionate rant about AA coming up. The short version: if a game company cannot get AA to work with their game, they will never get any money from me, and the game will immediately be put in the running for my top 10 worst games ever made.

Every time rumors of a new version of DirectX come out, I keep hoping and praying that Microsoft makes it mandatory to support at least 2x or 4x AA, along with at least 8x anisotropic filtering. And every time I'm left disappointed.

I'm a graphics whore. I love graphics; there's absolutely no other reason to get a cutting-edge video card.

However, anything that takes me away from enjoying the graphics immediately makes me want to "fix" it in some way.

In the vast majority of games, I'll look at the shadows for 5 or 10 minutes and then immediately look for how to turn them off, as they invariably look badly done, hacked into the game, or just plain bad.

I'll look at nice HDR effects and think "cool lighting", then immediately turn them off if I see jaggies somewhere and cannot enable AA. I had Oblivion for months before I actually bothered to play it because of this. A few times I came close to trashing it, but rumors of HDR + AA on ATI cards made me keep it, and I was glad I did.

Ghost Recon Advanced Warfighter looked nice, until I found I couldn't enable AA. The graphics were nice enough that I tried to play for about 30 minutes before just uninstalling it, because the jaggies were driving me absolutely nuts at 1920x1200. I still have no clue whether the game even had good gameplay or not.

Nowadays, I wait to see a review. If the review mentions that a game cannot use or enable AA, I don't even bother to consider buying it. And chances are that if it's added in a patch after the fact, I STILL won't buy it.

Yeah, they can do all kinds of things to make the graphics more realistic. But the moment I see jaggies crawling around, that puts it right in there with all the other ho-hum games that look unrealistic to me.

I think it's because I rely so much on peripheral vision over what's directly in front of me. As such, even if the jaggies aren't where my vision is focused, I'll still see them, and they'll literally drag my eyes over. At which point my peripheral vision will catch jaggies in other places and my attention will immediately be dragged over there.

No amount of good gameplay, well done textures, or anything else can hide them.

Pant, pant, pant, pant...

If you couldn't tell, it's the single most important thing in 3D graphics for me. Without it, nothing else can impress.

Regards,
SB
 
I don't really think there will be a PC game in the near future that ships with any AA setting turned on. Developers are just uncomfortable with anything that cuts your pixel fillrate in half, requires a whole lot of additional work to get around transparency and performance issues, and yet is not regarded as important by the majority of gamers. They will probably still leave the decision to the user - and as a game player I almost never have a chance to use AA in my favorite games, either because they don't support AA at all or because the performance is unacceptable on my (always mid-level) 3D card.


Things might change if 3D chip vendors come up with a "free" AA implementation on the whole range of their cards, one which is also fully orthogonal with all the features AND requires no "interesting" tricks like additional rendering passes. However, most PC users want their games to auto-configure their settings, and the current API mess might require developers to use some dirty tricks like getting the PCI IDs of the video card and setting the performance/quality settings accordingly - a solution that probably creates more problems than it avoids.
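To make the "dirty trick" concrete: under Direct3D 9, that kind of auto-configuration usually means reading the vendor and device IDs from the adapter identifier and mapping them onto a hand-maintained preset table. A rough C++ sketch of the idea - the preset values are invented for illustration, and keeping such a table current is exactly the maintenance problem described above:

    #include <d3d9.h>

    struct QualityPreset { int msaaSamples; int aniso; };

    // Picks default quality settings from the PCI vendor/device ID.
    // Error handling omitted for brevity.
    QualityPreset PickPresetFromAdapter(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 id = {};
        d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

        if (id.VendorId == 0x10DE)        // NVIDIA; id.DeviceId would then be
            return QualityPreset{ 4, 8 }; // matched against known chips (G7x, G80, ...)
        if (id.VendorId == 0x1002)        // ATI/AMD
            return QualityPreset{ 4, 16 };

        return QualityPreset{ 0, 0 };     // unknown hardware: play it safe
    }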

If developers had more direct control over antialiasing patterns and filtering settings (still provided that AA is practically "free" and requires little to no additional effort), so they could scale AA settings like they scale shaders or geometry detail, they would probably start to link AA settings with their own in-game detail and performance profiles. Letting the user choose the AA pattern could be a good start - even now, many DX9.0c games only offer a simple checkmark for enabling AA, which I cannot attribute to anything besides clear apathy on the part of developers...
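On the simple-checkmark point: even on DX9-class hardware a game can enumerate which multisample modes the device actually supports and present them as a proper list instead of a single on/off toggle. A minimal C++ sketch using IDirect3D9::CheckDeviceMultiSampleType, assuming a plain X8R8G8B8 back buffer (a real game would also check its depth-stencil format the same way):

    #include <d3d9.h>
    #include <vector>

    // Returns the multisample modes the default adapter supports for an
    // X8R8G8B8 back buffer - the list an in-game AA menu could be built from.
    std::vector<D3DMULTISAMPLE_TYPE> EnumerateMsaaModes(IDirect3D9* d3d, BOOL windowed)
    {
        std::vector<D3DMULTISAMPLE_TYPE> modes;
        for (int samples = 2; samples <= 16; ++samples)
        {
            D3DMULTISAMPLE_TYPE type = static_cast<D3DMULTISAMPLE_TYPE>(samples);
            DWORD qualityLevels = 0;
            if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(
                    D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
                    windowed, type, &qualityLevels)))
            {
                modes.push_back(type);    // e.g. D3DMULTISAMPLE_2_SAMPLES, 4x, 8x...
            }
        }
        return modes;
    }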


I believe Microsoft realizes this situation as well... judging from their GDC 2007 presentation The Future of DirectX, they do want to give more control over AA to the developer starting with Direct3D 10.1, and at the same time they are thinking about eliminating obscure sorting requirements in future versions. They have gone as far as suggesting control over sample locations in an MSAA mask, along with full order-independent transparency through hardware sorting employing tiling and even an accumulation buffer!
 
Reverend, I really don't see what the big deal is. DEVS SHOULD SIMPLY CODE GAMES THAT WORK WITH AA AND LEAVE IT UP TO THE USER TO DECIDE IF HE WANTS TO ENABLE IT THROUGH THE CONTROL PANEL.

If devs want to be lazy and release total garbage with half-broken game engines that don't support AA, they have no right to complain when people bootleg their crappy games, because those games aren't worth the plastic they're wrapped in.

When I say FULL SCENE AA, I mean AA applied to the entire scene, every edge, not only the edges the developers deem important. Who is paying for the game? We are, so if they want us to support their work by spending OUR money, they should stop releasing half-finished, rushed crap.
 
That's encouraging to hear, DmitryKo. I have no problem with developers not specifically coding for AA as long as they don't do something that prevents me from enabling it.

I'd obviously prefer it if they did code with AA in mind. However, it's enough to be able to enable it in a control panel and have it work in game.

Considering how long AA has been a basic feature of video cards now, I just find it completely and utterly inexcusable for a title not to work with AA.

I can live with bugs. I can live with really, really bad bugs. I can live with crashes to desktop. I can even live with a game occasionally locking up and rebooting my computer. Thankfully, that appears to have gone the way of the dodo with Vista 64, at least with ATI drivers.

But I simply cannot live without the ability to enable AA, either through an in-game menu or through the control panel.

Regards,
SB
 
Gah, I can't wait until I can edit.

I was going to put in an addendum that I realize the last generation of games with HDR didn't have AA due to Nvidia (bad Nvidia, me hates you for taking away my precioussssss), but as that is no longer the case, I'm not so forgiving of game companies that continue to code engines that are incompatible with AA.

Regards,
SB
 
Couldn't we also say something similar for garden variety bi- or trilinear then?
Absolutely. It's just a tradeoff of complexity vs. quality. In the Lance Williams paper that introduced MIP mapping and "trilinear" (amongst other things), Williams does point out the compromises made to quality in order to achieve speed.

I'm trying to understand what you mean here: if a signal (texture) has been correctly prefiltered and if the mip map selection scheme is also correctly implemented then AF shouldn't be regarded as an AA process, right?
IMHO, AA is either preventing high-frequency data, which cannot be represented in a sampled representation of a signal, from causing incorrect results in that sampled representation, or at least masking those results so that they are not distracting :). I would therefore feel that filtering of the input signal counts as a valid (in fact, the best) technique. However, completely correct signal filtering is exceedingly difficult. AF is of course implementation dependent, but let's assume we are talking about a "fast" method that builds on top of (predetermined) pre-filtering techniques.

The "correct" pre-filtering for a particular MIP level will assume a fixed projection. Generally this will have equal, power-of-two compression in X and Y. Furthermore, it might only use a simple box filter, which does not correctly pre-filter the data. Of course, Fourier analysis can be used to do correct generation of the smaller maps, but it is more expensive.
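As an aside, that simple box filter is literally just a 2x2 average per texel of the next level down, which is why it is cheap and why it does not band-limit the signal properly. A minimal sketch of the reduction being discussed, assuming a single-channel, power-of-two source:

    #include <cstddef>
    #include <vector>

    // Generates the next MIP level by 2x2 box filtering - the cheap pre-filter
    // discussed above. It assumes equal power-of-two reduction in X and Y and
    // does NOT correctly band-limit the data, which is the quality compromise.
    std::vector<float> BoxFilterNextMip(const std::vector<float>& src,
                                        std::size_t width, std::size_t height)
    {
        const std::size_t dw = width / 2, dh = height / 2;
        std::vector<float> dst(dw * dh);
        for (std::size_t y = 0; y < dh; ++y)
            for (std::size_t x = 0; x < dw; ++x)
                dst[y * dw + x] = 0.25f * (src[(2 * y)     * width + (2 * x)]     +
                                           src[(2 * y)     * width + (2 * x + 1)] +
                                           src[(2 * y + 1) * width + (2 * x)]     +
                                           src[(2 * y + 1) * width + (2 * x + 1)]);
        return dst;
    }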

The problem then lies in the fact that the sample region of the texture that corresponds to the pixel is going to be vaguely elliptical or, better, a distorted Gaussian. This means that we have to approximate this sample region by constructing it from the smaller building blocks available in the MIP levels. If you don't do this correctly, you will still end up with illegally high frequencies creeping in and ruining the result. In this sense, I feel completely justified in saying it is AA.
 
Icecold, were you ever told how much of an annoying broken record you could be? First the Unreal Tournament discussions, and now this - damn! :p Anyway, there are three primary reasons behind the lack of MSAA in games. The first is lack of AA+HDR on G7x. The second is render targets on previous APIs. The third is deferred rendering, at least pre-DirectX10.

99% of the time, not having MSAA support is not being "lazy". It's just that there are hardware or API restrictions that prevent it. Now, what *is* lazy is not having options to disable the effects or HDR that prevent MSAA from working. The exception to this is deferred rendering, because I really don't think you should expect developers to write two completely different engines. Not that there aren't other very creative forms of antialiasing you can develop (better than edge blurring, I promise!) for deferred renderers, though...
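To make the "hardware or API restrictions" point concrete: under D3D9, whether MSAA can be combined with an FP16 HDR render target is a simple capability query, and on G7x-class hardware it comes back negative - hence the "HDR or AA, pick one" situation in that generation of games. A sketch of the check a game could run before deciding whether to even offer HDR+AA:

    #include <d3d9.h>

    // Returns true if the device can multisample an FP16 (A16B16G16R16F) render
    // target - the combination G7x-class cards could not do.
    bool SupportsHdrMsaa(IDirect3D9* d3d, D3DMULTISAMPLE_TYPE samples, BOOL windowed)
    {
        DWORD qualityLevels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
            windowed, samples, &qualityLevels);
        return SUCCEEDED(hr) && qualityLevels > 0;
    }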
 
There's no reason why any game should ever be incompatible with AA. Is deferred rendering really needed? I've yet to see any games using it accomplish anything special. Just because a certain range of video cards doesn't support it doesn't mean you need to prevent EVERYONE from using it.

BTW, how about that G80 performance piece? :smile:
 
I'd much rather devs just leave AA out of the in-game settings and let us add it through the control panel.

Where's your problem, exactly, if the game ships with AA switched off in its options (as usual up to now)? The more in-game options, the merrier.

In-game settings have a better chance of higher compatibility, and games should, IMO, actually support all the important features found in the driver control panels and not the other way around.

By shimmering I meant edge/polygon shimmering, not texture shimmering.

I thought it's called edge/polygon aliasing; shimmering describes a totally different side effect.

When I say FULL SCENE AA, I mean AA applied to the entire scene, every edge, not only the edges the developers deem important. Who is paying for the game? We are, so if they want us to support their work by spending OUR money, they should stop releasing half-finished, rushed crap.

Well, if it were possible I'd go even further than that and require future games to force off all the filtering-related optimisations found in just about every driver set today. Either the game would dump you back to the desktop or crawl along like a slideshow; that way I'd at least get the never-ending filtering-optimisation crap out of my eyesight.

Ironically, they are allowing one thing and shouldn't allow the other? Why on God's green earth not, anyway? All of today's AF algorithms are purely adaptive, meaning each texture gets as many AF samples as its "steepness" requires. I don't see why developers shouldn't use adaptive shader AA in future games, since it's the only way to cure shader aliasing. And if that should be your problem, then they'll simply add an in-game switch to enable/disable shader AA, if that's all that bothers you.

The bitter reality I think Reverend has in mind is that developers have for years now been treating AA as some sort of redundant luxury feature. I don't think that fact will change even a bit in the future, but what puzzles me is what you specifically are supporting or "defending" here.
 
There's no reason why any game should ever be incompatible with AA. Is deferred rendering really needed? I've yet to see any games using it accomplish anything special. Just because a certain range of video cards doesn't support it doesn't mean you need to prevent EVERYONE from using it.

BTW, how about that G80 performance piece? :smile:

Try implementing a shitload of lights in a game without killing an IMR. Just for the record, Uttar is right, you do sound like a broken record; and for the second record, MRTs wouldn't work with garden-variety MSAA under D3D9.0 on genuine tile-based deferred renderers either, methinks.
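The arithmetic behind that jab is easy to put in a few lines: with classic multi-pass forward lighting, every lit object gets re-shaded once per light that touches it, while a deferred renderer pays the geometry cost once and then shades each light only over the pixels it actually covers. A toy back-of-envelope comparison in C++ - the scene numbers are made up purely for illustration:

    #include <cstdio>

    int main()
    {
        const long long objects         = 500;
        const long long lights          = 100;
        const long long pixelsPerObject = 20000;  // average rasterised pixels per object
        const long long pixelsPerLight  = 50000;  // average screen pixels a light touches
        const long long screenPixels    = objects * pixelsPerObject;

        // Classic multi-pass forward: each object re-shaded once per light affecting it.
        const long long forwardWork  = objects * lights * pixelsPerObject;

        // Deferred: one geometry pass fills the G-buffer, then each light shades
        // only the pixels it covers.
        const long long deferredWork = screenPixels + lights * pixelsPerLight;

        std::printf("forward  ~ %lld shading operations\n", forwardWork);
        std::printf("deferred ~ %lld shading operations\n", deferredWork);
        return 0;
    }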
 